Science.gov

Sample records for quantitative performance evaluation

  1. Quantitative evaluation of wrist posture and typing performance: A comparative study of 4 computer keyboards

    SciTech Connect

    Burastero, S.

    1994-05-01

    The present study focuses on an ergonomic evaluation of 4 computer keyboards, based on subjective analyses of operator comfort and on a quantitative analysis of typing performance and wrist posture during typing. The objectives of this study are (1) to quantify differences in the wrist posture and in typing performance when the four different keyboards are used, and (2) to analyze the subjective preferences of the subjects for alternative keyboards compared to the standard flat keyboard with respect to the quantitative measurements.

  2. Quantitative evaluation on the performance and feature enhancement of stochastic resonance for bearing fault diagnosis

    NASA Astrophysics Data System (ADS)

    Li, Guoying; Li, Jimeng; Wang, Shibin; Chen, Xuefeng

    2016-12-01

    Stochastic resonance (SR) has been widely applied in the field of weak signal detection by virtue of its characteristic of utilizing noise to amplify the useful signal, instead of eliminating noise, in nonlinear dynamical systems. How to quantitatively evaluate the performance of SR, including the enhancement effect and the degree of waveform distortion, and how to accurately extract signal amplitude have become two important issues in SR research. In this paper, the signal-to-noise ratio (SNR) of the main component to the residual in the SR output is constructed to quantitatively measure the enhancement effect of the SR method. Two further indices are constructed to quantitatively measure the degree of waveform distortion of the SR output: the correlation coefficient between the main component in the SR output and the original signal, and the zero-crossing ratio. These quantitative indices are combined into a comprehensive index for adaptive parameter selection of the SR method, so that the adaptive SR method can effectively enhance the weak component hidden in the original signal. Fast Fourier Transform and Fourier Transform (FFT+FT) spectrum correction technology can extract the signal amplitude from the original signal and effectively reduce the difficulty of extracting it from the distorted resonance output. Application to vibration analysis for bearing fault diagnosis verifies that the proposed quantitative evaluation method for adaptive SR can effectively detect weak fault features of the vibration signal during the incipient stage of a bearing fault.
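    The three indices described above can be sketched numerically. In this illustrative Python snippet (function and variable names are ours, not the paper's), a clean tone stands in for the extracted main component of an SR output:

```python
import numpy as np

def sr_indices(sr_output, main_component, original_signal):
    """Illustrative sketch of the three quantitative indices: enhancement
    SNR, correlation coefficient, and zero-crossing ratio."""
    residual = sr_output - main_component
    # SNR of the main component relative to the residual (enhancement effect)
    snr = 10.0 * np.log10(np.sum(main_component**2) / np.sum(residual**2))
    # Correlation between the main component and the original signal
    rho = np.corrcoef(main_component, original_signal)[0, 1]
    # Zero-crossing ratio: crossings of the main component vs. the original
    zc = lambda x: np.sum(np.signbit(x[:-1]) != np.signbit(x[1:]))
    zcr = zc(main_component) / max(zc(original_signal), 1)
    return snr, rho, zcr

# Toy usage: a 5 Hz tone plus noise stands in for an SR output
t = np.linspace(0, 1, 1000, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * np.random.default_rng(0).normal(size=t.size)
snr, rho, zcr = sr_indices(noisy, clean, clean)
```

    A distortion-free extraction gives a correlation of 1 and a zero-crossing ratio of 1; lower values of either flag waveform distortion.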

  3. Quantitative performance evaluation of the EM algorithm applied to radiographic images

    NASA Astrophysics Data System (ADS)

    Brailean, James C.; Giger, Maryellen L.; Chen, Chin-Tu; Sullivan, Barry J.

    1991-07-01

    In this study, the authors quantitatively evaluate the performance of the Expectation Maximization (EM) algorithm as a restoration technique for radiographic images. The 'perceived' signal-to-noise ratios (SNRs) of simple radiographic patterns processed by the EM algorithm are calculated on the basis of a statistical decision theory model that includes both the observer's visual response function and a noise component internal to the eye-brain system. The relative SNR (ratio of the processed SNR to the original SNR) is calculated and used as a metric to quantitatively compare the effects of the EM algorithm to two popular image enhancement techniques: contrast enhancement (windowing) and unsharp mask filtering.
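    As a rough numerical illustration of the relative-SNR metric (deliberately ignoring the observer visual-response model the authors include), one might compute:

```python
import numpy as np

def relative_snr(processed, original, ideal):
    """Relative SNR sketch: ratio of processed-image SNR to original-image
    SNR against a known noise-free pattern. The study's 'perceived' SNR also
    folds in an eye-brain observer model, which is omitted here."""
    def snr(img):
        noise = img - ideal
        return np.sqrt(np.sum(ideal**2) / np.mean(noise**2))
    return snr(processed) / snr(original)

# Toy images: processing that halves the noise doubles the SNR
rng = np.random.default_rng(1)
ideal = np.ones(10_000)
original = ideal + rng.normal(0.0, 0.2, ideal.size)
processed = ideal + rng.normal(0.0, 0.1, ideal.size)
rsnr = relative_snr(processed, original, ideal)
```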

  4. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of what an integrated curriculum for the reasonable and cost-effective assessment of the key competences of an interventional neuroradiologist could look like. In addition to the traditional assessment of theoretical knowledge, practical skills are measured by the use of endovascular simulators, yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time. PMID:26848840

  5. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused their work on the quantitative performance of this technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization provides a guarantee of quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even if UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC).

  6. Evaluation of fourier transform profilometry performance: quantitative waste volume determination under simulated Hanford waste tank conditions

    SciTech Connect

    Jang, Ping-Rey; Leone, Teresa; Long, Zhiling; Mott, Melissa A.; Perry Norton, O.; Okhuysen, Walter P.; Monts, David L.

    2007-07-01

    The Hanford Site is currently in the process of an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the chemical makeup of the residue. The objective of Mississippi State University's Institute for Clean Energy Technology's (ICET) efforts is to develop, fabricate, and deploy inspection tools for the Hanford waste tanks that will (1) be remotely operable; (2) provide quantitative information on the amount of wastes remaining; and (3) provide information on the spatial distribution of chemical and radioactive species of interest. A collaborative arrangement has been established with the Hanford Site to develop probe-based inspection systems for deployment in the waste tanks. ICET is currently developing an in-tank inspection system based on Fourier Transform Profilometry, FTP. FTP is a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP is capable of determining the height (depth) distribution (and hence volume distribution) of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. Hence FTP has the potential to be utilized for quantitative determination of residual wastes within Hanford waste tanks. We have completed a preliminary performance evaluation of FTP in order to document the accuracy, precision, and operator dependence (minimal) of FTP under conditions similar to those that can be expected to pertain within Hanford waste tanks. Based on a Hanford C-200 series tank with camera access through a riser with significant offset relative to the centerline, we devised a testing methodology that encompassed a range of obstacles likely to be encountered 'in tank'. These test objects were inspected by use
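    The FTP principle described above (project a fringe pattern, band-pass the carrier in the spectrum, convert the phase difference to height) can be sketched in one dimension. The function below is an illustrative reconstruction under our own assumptions, not ICET's implementation; the optical scale factors relating phase to physical height are omitted:

```python
import numpy as np

def ftp_phase(deformed, reference, carrier_bin, half_width):
    """Single-row Fourier Transform Profilometry sketch: isolate the fringe
    carrier in the FFT, then take the unwrapped phase difference between the
    deformed and reference patterns. Height is proportional to this phase."""
    def carrier_phase(row):
        spec = np.fft.fft(row)
        filt = np.zeros_like(spec)
        lo, hi = carrier_bin - half_width, carrier_bin + half_width
        filt[lo:hi + 1] = spec[lo:hi + 1]   # keep the fundamental lobe only
        return np.unwrap(np.angle(np.fft.ifft(filt)))
    return carrier_phase(deformed) - carrier_phase(reference)

# Toy fringes: a flat reference and a Gaussian bump that modulates the phase
x = np.arange(512)
f0 = 32                                      # carrier: 32 fringes per row
bump = 0.5 * np.exp(-((x - 256) / 40.0) ** 2)
reference = 1 + np.cos(2 * np.pi * f0 * x / 512)
deformed = 1 + np.cos(2 * np.pi * f0 * x / 512 + bump)
dphi = ftp_phase(deformed, reference, f0, 10)
```

    Integrating the recovered height map over the tank floor would then give the residual-waste volume estimate.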

  7. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
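    One plausible reading of an IDT-style metric, by analogy with PCR efficiency, is the slope of time-to-threshold against log2 of input copies over a dilution series. This sketch is our interpretation for illustration, not the paper's exact formulation:

```python
import numpy as np

def isothermal_doubling_time(copies, time_to_threshold):
    """IDT sketch: under exponential amplification, time-to-threshold falls
    linearly with log2(input copies); the slope magnitude of that line is
    the doubling time (minutes per doubling)."""
    slope, _ = np.polyfit(np.log2(copies), time_to_threshold, 1)
    return -slope

# Toy 10-fold dilution series generated with an IDT of 1.5 min
copies = np.array([1e6, 1e5, 1e4, 1e3])
tt = 10 + 1.5 * (np.log2(1e6) - np.log2(copies))   # minutes
idt = isothermal_doubling_time(copies, tt)
```

    A matrix that inhibits amplification would lengthen the recovered doubling time, which is one way such a metric can quantify matrix interference.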

  8. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems

    PubMed Central

    Junker, Astrid; Muraya, Moses M.; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E.; Meyer, Rhonda C.; Riewe, David; Altmann, Thomas

    2015-01-01

    Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite to conduct reproducible experiments with precisely defined treatments. Setting up appropriate and well defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify performance dynamics of several hundreds of plants at a time. Compared to small scale plant cultivations, HT systems have much higher demands, from a conceptual and a logistic point of view, on experimental design, as well as the actual plant cultivation conditions, and the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed that elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested, and optimized with regard to growth substrate, soil coverage, watering regime, experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system did match well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants. It thereby provides guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data of phenotypic variation for a broad range of applications. 

  9. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications

    PubMed Central

    Wei, Wentao; Huang, Qiu; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for the treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarities with their ground truths when the lesion size is larger than the system's spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach has a 93.4% accuracy in localizing recovered regions. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring. PMID:28251150
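    The core CRM computation, the voxelwise percent change in rCBF between registered longitudinal scans, can be sketched as follows. This is an illustrative reduction of the idea; the CAA pipeline also performs registration and intensity normalization, which are omitted here:

```python
import numpy as np

def change_rate_map(baseline, followup, mask_threshold=0.1):
    """Illustrative change-rate map: percent change per voxel between two
    registered scans, with low-intensity background voxels masked out."""
    base = baseline.astype(float)
    crm = np.zeros_like(base)
    valid = base > mask_threshold * base.max()   # ignore background voxels
    crm[valid] = 100.0 * (followup[valid] - base[valid]) / base[valid]
    return crm

# Toy 2-D "scan": a recovered region with a 25% rCBF increase
baseline = np.full((4, 4), 100.0)
followup = baseline.copy()
followup[1:3, 1:3] = 125.0
crm = change_rate_map(baseline, followup)
```

    Thresholding such a map (e.g., at the +20% change rate the simulations identify as reliable) is one way to localize recovered regions.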

  10. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    PubMed

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for the treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarities with their ground truths when the lesion size is larger than the system's spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach has a 93.4% accuracy in localizing recovered regions. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring.

  11. Three-year randomised clinical trial to evaluate the clinical performance, quantitative and qualitative wear patterns of hybrid composite restorations.

    PubMed

    Palaniappan, Senthamaraiselvi; Elsen, Liesbeth; Lijnen, Inge; Peumans, Marleen; Van Meerbeek, Bart; Lambrechts, Paul

    2010-08-01

    The aim of the study was to compare the clinical performance, quantitative and qualitative wear patterns of conventional hybrid (Tetric Ceram), micro-filled hybrid (Gradia Direct Posterior) and nano-hybrid (Tetric EvoCeram, TEC) posterior composite restorations in a 3-year randomised clinical trial. Sixteen Tetric Ceram, 17 TEC and 16 Gradia Direct Posterior restorations were placed in human molars and evaluated at baseline, 6, 12, 24 and 36 months of clinical service according to US Public Health Service criteria. The gypsum replicas at each recall were used for 3D laser scanning to quantify wear, and the epoxy resin replicas were observed under a scanning electron microscope to study the qualitative wear patterns. After 3 years of clinical service, the three hybrid restorative materials performed clinically well in posterior cavities. Within the observation period, the nano-hybrid and micro-hybrid restorations showed better polishability, with improved surface gloss retention, than the conventional hybrid counterpart. The three hybrid composites showed enamel-like vertical wear and a cavity-size-dependent magnitude of volume loss. Qualitatively, while the micro-filled and nano-hybrid composite restorations exhibited signs of fatigue similar to the conventional hybrid composite restorations at heavy occlusal contact areas, their light occlusal contact areas showed less surface pitting after 3 years of clinical service.

  12. Three-year randomised clinical trial to evaluate the clinical performance, quantitative and qualitative wear patterns of hybrid composite restorations

    PubMed Central

    Palaniappan, Senthamaraiselvi; Elsen, Liesbeth; Lijnen, Inge; Peumans, Marleen; Van Meerbeek, Bart

    2009-01-01

    The aim of the study was to compare the clinical performance, quantitative and qualitative wear patterns of conventional hybrid (Tetric Ceram), micro-filled hybrid (Gradia Direct Posterior) and nano-hybrid (Tetric EvoCeram, TEC) posterior composite restorations in a 3-year randomised clinical trial. Sixteen Tetric Ceram, 17 TEC and 16 Gradia Direct Posterior restorations were placed in human molars and evaluated at baseline, 6, 12, 24 and 36 months of clinical service according to US Public Health Service criteria. The gypsum replicas at each recall were used for 3D laser scanning to quantify wear, and the epoxy resin replicas were observed under a scanning electron microscope to study the qualitative wear patterns. After 3 years of clinical service, the three hybrid restorative materials performed clinically well in posterior cavities. Within the observation period, the nano-hybrid and micro-hybrid restorations showed better polishability, with improved surface gloss retention, than the conventional hybrid counterpart. The three hybrid composites showed enamel-like vertical wear and a cavity-size-dependent magnitude of volume loss. Qualitatively, while the micro-filled and nano-hybrid composite restorations exhibited signs of fatigue similar to the conventional hybrid composite restorations at heavy occlusal contact areas, their light occlusal contact areas showed less surface pitting after 3 years of clinical service. PMID:19669176

  13. Evaluating quantitative research reports.

    PubMed

    Russell, Cynthia L

    2005-01-01

    As a novice reviewer, it is often difficult to trust your evaluation of a research report. You may feel uncertain in your interpretations. These are common concerns and can be remedied by reading and discussing research reports on research listservs, through journal clubs, or with other nephrology nurses. Practice using the criteria for research report evaluation and you too can perfect critiquing a research report!

  14. Performance evaluation of fourier transform profilometry for quantitative waste volume determination under simulated hanford waste tank conditions

    SciTech Connect

    Jang, P.R.; Leone, T.; Long, Z.; Mott, M.A.; Norton, O.P.; Okhuysen, W.P.; Monts, D.L.

    2007-07-01

    The Hanford Site is currently in the process of an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the chemical makeup of the residue. The objective of Mississippi State University's Institute for Clean Energy Technology's (ICET) efforts is to develop, fabricate, and deploy inspection tools for the Hanford waste tanks that will (1) be remotely operable; (2) provide quantitative information on the amount of wastes remaining; and (3) provide information on the spatial distribution of the residual waste. A collaborative arrangement has been established with the Hanford Site to develop probe-based inspection systems for deployment in the waste tanks. ICET is currently developing an in-tank inspection system based on Fourier Transform Profilometry, FTP. FTP is a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP is capable of determining the height (depth) distribution (and hence volume distribution) of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. Hence FTP has the potential to be utilized for quantitative determination of residual wastes within Hanford waste tanks. We have completed a preliminary performance evaluation of FTP in order to document the accuracy, precision, and operator dependence (minimal) of FTP under conditions similar to those that can be expected to pertain within Hanford waste tanks. Based on a Hanford C-200 series tank with camera access through a riser with significant offset relative to the centerline, we devised a testing methodology that encompassed a range of obstacles likely to be encountered 'in-tank'. These test objects were inspected by use of FTP and the volume of

  15. Quantitative Methods for Software Selection and Evaluation

    DTIC Science & Technology

    2006-09-01

    Bandor, Michael S. Acquisition Support Program, September 2006. When performing a "buy" analysis and selecting a product as part of a software acquisition strategy, most organizations will consider primarily

  16. A longitudinal evaluation of performance of automated BCR-ABL1 quantitation using cartridge-based detection system

    PubMed Central

    Enjeti, Anoop; Granter, Neil; Ashraf, Asma; Fletcher, Linda; Branford, Susan; Rowlings, Philip; Dooley, Susan

    2015-01-01

    Summary: An automated cartridge-based detection system (GeneXpert; Cepheid) is being widely adopted in low-throughput laboratories for monitoring BCR-ABL1 transcripts in chronic myelogenous leukaemia. This Australian study evaluated the longitudinal performance-specific characteristics of the automated system. The automated cartridge-based system was compared prospectively with the manual qRT-PCR-based reference method at SA Pathology, Adelaide, over a period of 2.5 years. A conversion factor determination was followed by four re-validations. Peripheral blood samples (n = 129) with international scale (IS) values within the detectable range were selected for assessment. The mean bias, the proportion of results within a specified fold difference (2-, 3- and 5-fold), the concordance rate of major molecular remission (MMR) and the concordance across a range of IS values on paired samples were evaluated. The initial conversion factor for the automated system was determined as 0.43. Except for the second re-validation, where a negative bias of 1.9-fold was detected, all other biases fell within desirable limits. A cartridge-specific conversion factor and efficiency value was introduced, and the conversion factor was confirmed to be stable in subsequent re-validation cycles. Concordance with the reference method/laboratory at >0.1–≤10 IS was 78.2% and at ≤0.001 was 80%, compared to 86.8% in the >0.01–≤0.1 IS range. The overall and MMR concordance were 85.7% and 94% respectively, for samples that fell within ±5-fold of the reference laboratory value over the entire period of study. The conversion factor and performance-specific characteristics of the automated system were longitudinally stable in the clinically relevant range, following the manufacturer's introduction of lot-specific efficiency values. PMID:26166664
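    Two of the agreement measures used in such method comparisons, the mean fold bias and the proportion of paired results within a given fold difference, can be sketched as follows (the function name, toy numbers, and thresholds are illustrative, not the study's data):

```python
import numpy as np

def longitudinal_agreement(test_is, ref_is, fold=2.0):
    """Sketch of two agreement measures for paired IS results: the mean bias
    as a geometric-mean fold ratio, and the fraction of pairs whose ratio
    lies within the given fold difference."""
    ratio = np.asarray(test_is, float) / np.asarray(ref_is, float)
    mean_bias = 2.0 ** np.mean(np.log2(ratio))          # geometric-mean bias
    within = np.mean(np.abs(np.log2(ratio)) <= np.log2(fold))
    return mean_bias, within

# Toy paired IS results: cartridge assay vs. reference qRT-PCR
ref = np.array([10.0, 1.0, 0.1, 0.01])
test = np.array([12.0, 0.9, 0.12, 0.008])
bias, frac_within_2fold = longitudinal_agreement(test, ref)
```

    Working on the log scale keeps over- and under-estimation symmetric, which is why fold differences rather than absolute differences are the natural unit here.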

  17. Blind Analysis of Fortified Pesticide Residues in Carrot Extracts using GC-MS to Evaluate Qualitative and Quantitative Performance

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Unlike quantitative analysis, the quality of the qualitative results in the analysis of pesticide residues in food are generally ignored in practice. Instead, chemists tend to rely on advanced mass spectrometric techniques and general subjective guidelines or fixed acceptability criteria when makin...

  18. Comparison of two real-time quantitative polymerase chain reaction strategies for minimal residual disease evaluation in lymphoproliferative disorders: correlation between immunoglobulin gene mutation load and real-time quantitative polymerase chain reaction performance.

    PubMed

    Della Starza, Irene; Cavalli, Marzia; Del Giudice, Ilaria; Barbero, Daniela; Mantoan, Barbara; Genuardi, Elisa; Urbano, Marina; Mannu, Claudia; Gazzola, Anna; Ciabatti, Elena; Guarini, Anna; Foà, Robin; Galimberti, Sara; Piccaluga, Pierpaolo; Gaidano, Gianluca; Ladetto, Marco; Monitillo, Luigia

    2014-09-01

    We compared two strategies for minimal residual disease evaluation of B-cell lymphoproliferative disorders characterized by a variable immunoglobulin heavy chain (IGH) gene mutation load. Twenty-five samples from chronic lymphocytic leukaemia (n = 18) or mantle cell lymphoma (n = 7) patients were analyzed. Based on the IGH variable region genes, 22/25 samples carried > 2% mutations and 20/25 > 5%. In the IGH joining region genes, 23/25 samples carried > 2% mutations and 18/25 > 5%. Real-time quantitative polymerase chain reaction was performed on IGH genes using two strategies: method A utilizes two patient-specific primers, whereas method B employs one patient-specific and one germline primer, with different positions on the variable, diversity and joining regions. Twenty-three samples (92%) were evaluable using method A, but only six (24%) by method B. Method B's poor performance was specifically evident among mutated IGH variable/joining region cases, although no specific mutation load above which real-time quantitative polymerase chain reaction failed was found. The molecular strategies for minimal residual disease evaluation should be adapted to the B-cell receptor features of the disease investigated.

  19. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, into which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance between a given subject's eigencoordinates and the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
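    A DEF-like score, a weighted distance of a subject's eigen-coordinates from the control-group mean along salient principal components, might be sketched as below. The weights, the number of components, and the toy data are our assumptions for illustration:

```python
import numpy as np

def def_score(x, ctrl_mean, components, weights):
    """Sketch of a DEF-like score: project a subject onto the salient
    principal components and take a weighted distance from the control
    group's mean in that eigenspace."""
    coords = components @ (x - ctrl_mean)
    return np.sqrt(np.sum(weights * coords**2))

# Toy data: controls cluster at the origin, patients shifted along one axis
rng = np.random.default_rng(2)
ctrl = rng.normal(0, 1, (50, 5))
ad = rng.normal(0, 1, (50, 5)) + np.array([4, 0, 0, 0, 0])
ctrl_mean = ctrl.mean(axis=0)
# Principal components of the pooled, centered data via SVD
pooled = np.vstack([ctrl, ad])
pooled = pooled - pooled.mean(axis=0)
_, _, vt = np.linalg.svd(pooled, full_matrices=False)
pcs, w = vt[:2], np.array([1.0, 0.5])      # illustrative weights
scores_ctrl = [def_score(x, ctrl_mean, pcs, w) for x in ctrl]
scores_ad = [def_score(x, ctrl_mean, pcs, w) for x in ad]
```

    A threshold on such a score (or a linear discriminant on it, as in the study) then separates the two groups.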

  20. Reliability and validity of a quantitative color scale to evaluate masticatory performance using color-changeable chewing gum.

    PubMed

    Hama, Yohei; Kanazawa, Manabu; Minakuchi, Shunsuke; Uchida, Tatsuro; Sasaki, Yoshiyuki

    2014-03-19

    In the present study, we developed a novel color scale for visual assessment, conforming to the theoretical color changes of a gum, to evaluate masticatory performance; moreover, we investigated the reliability and validity of this evaluation method using the color scale. Ten participants (aged 26-30 years) with natural dentition chewed the gum for several numbers of chewing strokes. Changes in color were measured using a colorimeter, and linear regression expressions that represented the changes in gum color were derived. The color scale was developed using these regression expressions. Thirty-two chewed gums were evaluated using the colorimeter and were assessed three times using the color scale by six dentists aged 25-27 (mean, 25.8) years, six preclinical dental students aged 21-23 (mean, 22.2) years, and six elderly individuals aged 68-84 (mean, 74.0) years. The intrarater and interrater reliability of the evaluations was assessed using intraclass correlation coefficients. The validity of the method compared with the colorimeter was assessed using Spearman's rank correlation coefficient. All intraclass correlation coefficients were > 0.90, and Spearman's rank correlation coefficients were > 0.95 in all groups. These results indicate that the evaluation method of the color-changeable chewing gum using the newly developed color scale is reliable and valid.
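    The validity check against the colorimeter rests on Spearman's rank correlation, which is simply the Pearson correlation of the ranks. A minimal sketch (with no tie handling, which a tie-aware average-rank scheme would add; the data are invented):

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rank correlation via Pearson correlation of ranks.
    Assumes no ties (simple argsort ranking)."""
    rank = lambda v: np.argsort(np.argsort(v))
    return np.corrcoef(rank(x), rank(y))[0, 1]

# Toy validity check: colorimeter readings vs. color-scale scores that are
# monotonically related but on different scales
colorimeter = np.array([2.1, 5.4, 8.8, 12.9, 17.5, 23.0])
scale_score = np.array([1, 2, 3, 4, 5, 6])
rho = spearman_rho(colorimeter, scale_score)
```

    Because only ranks matter, a perfectly monotone relationship yields rho = 1 even when the two instruments use entirely different units.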

  1. Quantitative evaluation of signal integrity for magnetocardiography.

    PubMed

    Zhang, Shulin; Wang, Yongliang; Wang, Huiwu; Jiang, Shiqin; Xie, Xiaoming

    2009-08-07

    Magnetocardiography (MCG) is a non-invasive diagnostic tool used to investigate the activity of the heart. For applications in an unshielded environment, dedicated hardware configurations and sophisticated signal processing techniques have been developed over the last few decades to extract the very weak signal of interest from the much higher background noise. Although powerful for noise rejection, the signal processing may introduce signal distortions if not properly designed and applied. At present, however, there is a lack of effective tools to quantitatively evaluate signal integrity for MCG. In this paper, we introduce a very simple method that uses a small coil driven by a human ECG signal to generate a simulated MCG signal. Three key performance indexes are proposed to evaluate MCG system performance quantitatively: correlation in the time domain, the relative heights of different peaks, and correlation in the frequency domain. This evaluation method was applied to a synthetic gradiometer consisting of a second-order axial gradiometer and three orthogonal reference magnetometers. The evaluation turned out to be very effective in optimizing the parameters for signal processing. In addition, the method can serve as a useful tool for hardware improvement.
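    The three proposed indexes can be sketched directly. In this toy example (invented trace and peak positions, not the paper's data), a processing step that merely rescales the trace preserves all three indexes perfectly:

```python
import numpy as np

def signal_integrity(processed, reference, peak_idx):
    """Sketch of the three indexes: time-domain correlation, relative peak
    heights, and frequency-domain (magnitude-spectrum) correlation."""
    r_time = np.corrcoef(processed, reference)[0, 1]
    rel_peaks = processed[peak_idx] / reference[peak_idx]
    r_freq = np.corrcoef(np.abs(np.fft.rfft(processed)),
                         np.abs(np.fft.rfft(reference)))[0, 1]
    return r_time, rel_peaks, r_freq

# A simulated reference trace and a distortion-free (scaled-only) output
t = np.linspace(0, 1, 500, endpoint=False)
ref = np.sin(2 * np.pi * 1.2 * t) + 0.4 * np.sin(2 * np.pi * 15 * t)
proc = 0.8 * ref
r_t, peaks, r_f = signal_integrity(proc, ref, np.array([100, 250, 400]))
```

    A filter that distorted the waveform would lower the two correlations or make the relative peak heights unequal, flagging the loss of integrity.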

  2. An approach for relating the results of quantitative nondestructive evaluation to intrinsic properties of high-performance materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1990-01-01

    One of the most difficult problems the manufacturing community has faced during recent years has been to accurately assess the physical state of anisotropic high-performance materials by nondestructive means. In order to advance the design of ultrasonic nondestructive testing systems, a more fundamental understanding of how ultrasonic waves travel and interact within the anisotropic material is needed. The relationship between the ultrasonic and engineering parameters needs to be explored to understand their mutual dependence. One common denominator is provided by the elastic constants. The preparation of specific graphite/epoxy samples to be used in the experimental investigation of the anisotropic properties (through the measurement of the elastic stiffness constants) is discussed. Accurate measurements of these constants will depend upon knowledge of refraction effects as well as the direction of group velocity propagation. The continuing effort to develop improved visualization techniques for physical parameters is discussed. Group velocity images are presented and discussed. In order to fully understand the relationship between the ultrasonic and the common engineering parameters, the physical interpretation of the linear elastic coefficients (the quantities that relate applied stresses to resulting strains) is discussed. This discussion builds a more intuitive understanding of how the ultrasonic parameters are related to the traditional engineering parameters.
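
    The common denominator mentioned above can be made concrete for the simplest (isotropic) case: the stiffness constants that govern ultrasonic wave speeds follow directly from the engineering constants E and ν. The graphite/epoxy samples in the study are anisotropic and require more constants, so this is only an illustrative sketch with assumed aluminum-like values:

```python
import math

def isotropic_stiffness(E, nu):
    """Map engineering constants (Young's modulus E, Poisson's ratio nu)
    to the elastic stiffnesses C11 and C44 of an isotropic solid."""
    C11 = E * (1 - nu) / ((1 + nu) * (1 - 2 * nu))
    C44 = E / (2 * (1 + nu))          # equals the shear modulus G
    return C11, C44

def longitudinal_speed(C11, rho):
    """Ultrasonic longitudinal wave speed from C11 and density rho."""
    return math.sqrt(C11 / rho)

# Assumed values: E = 70 GPa, nu = 0.33, rho = 2700 kg/m^3
C11, C44 = isotropic_stiffness(70e9, 0.33)
v_l = longitudinal_speed(C11, 2700.0)  # a few km/s, as expected for a metal
```

    Measuring v_l (and the shear speed, which depends on C44) thus recovers the same constants that engineering stress-strain tests determine, which is the link the abstract describes.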

  3. Quantitative evaluation of yeast's requirement for glycerol formation in very high ethanol performance fed-batch process

    PubMed Central

    2010-01-01

    Background Glycerol is the major by-product accounting for up to 5% of the carbon in Saccharomyces cerevisiae ethanolic fermentation. Decreasing glycerol formation may redirect part of the carbon toward ethanol production. However, abolishing glycerol formation strongly affects the yeast's robustness towards different types of stress occurring in an industrial process. In order to assess whether glycerol production can be reduced to a certain extent without jeopardising growth and stress tolerance, the yeast's capacity to synthesize glycerol was adjusted by fine-tuning the activity of the rate-controlling enzyme glycerol 3-phosphate dehydrogenase (GPDH). Two engineered strains whose specific GPDH activity was reduced to two different degrees were comprehensively characterized in a previously developed Very High Ethanol Performance (VHEP) fed-batch process. Results The prototrophic strain CEN.PK113-7D was chosen for decreasing glycerol formation capacity. The fine-tuned reduction of specific GPDH activity was achieved by replacing the native GPD1 promoter in the yeast genome with previously generated, well-characterized TEF promoter mutant versions in a gpd2Δ background. Two TEF promoter mutant versions were selected for this study, resulting in residual GPDH activities of 55 and 6%; the corresponding strains are referred to here as TEFmut7 and TEFmut2. The genetic modifications were accompanied by a strong reduction in glycerol yield on glucose; the reduction compared to the wild type was 61% in TEFmut7 and 88% in TEFmut2. The overall ethanol production yield on glucose was improved from 0.43 g g-1 in the wild type to 0.44 g g-1 in TEFmut7 and 0.45 g g-1 in TEFmut2. Although the maximal growth rate in the engineered strains was reduced by 20 and 30%, for TEFmut7 and TEFmut2 respectively, the strains' ethanol stress robustness was hardly affected; i.e. values for final ethanol concentration (117 ± 4 g L-1), growth
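
    The reported yields translate into modest relative gains, which is worth making explicit: redirecting even most of the glycerol carbon moves the ethanol yield by only a few percent. A one-line check using the numbers from the abstract:

```python
def relative_change(new, old):
    """Percent change of a yield relative to the wild type."""
    return 100.0 * (new - old) / old

# Ethanol yields on glucose (g/g) reported in the abstract
wild_type, tefmut7, tefmut2 = 0.43, 0.44, 0.45
gain7 = relative_change(tefmut7, wild_type)  # roughly a 2% gain
gain2 = relative_change(tefmut2, wild_type)  # roughly a 5% gain
```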

  4. Development of a combined in vitro cell culture--quantitative PCR assay for evaluating the disinfection performance of pulsed light for treating the waterborne enteroparasite Giardia lamblia.

    PubMed

    Garvey, Mary; Stocca, Alessia; Rowan, Neil

    2014-09-01

    Giardia lamblia is a flagellated protozoan parasite that is recognised as a frequent cause of water-borne disease in humans and animals. We report for the first time on the use of a combined in vitro HCT-8 cell culture-quantitative PCR assay for evaluating the efficacy of pulsed UV light for treating G. lamblia parasites. Findings showed that current methods, which are limited to using vital stains before and after cyst excystation, are not appropriate for monitoring or evaluating cyst destruction after PUV treatment. Use of the human ileocecal HCT-8 cell line was superior to the human colon Caco-2 cell line for in vitro culture and for determining the PUV sensitivity of treated cysts. G. lamblia cysts were also shown to be more resistant to PUV irradiation than similar numbers of Cryptosporidium parvum oocysts. These observations show that this HCT-8 cell culture assay may replace the use of animal models for determining the disinfection performance of PUV for treating both C. parvum and G. lamblia.

  5. Quantitative framework for prospective motion correction evaluation

    PubMed Central

    Pannetier, Nicolas; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert

    2014-01-01

    Purpose Establishing a framework to evaluate the performance of prospective motion correction (PMC) in MRI, considering motion variability between MRI scans. Method A framework was developed to obtain quantitative comparisons between different motion correction setups, considering that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was accounted for by replaying, in a phantom experiment, the motion trajectories recorded from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge) were used to test the framework. Two metrics were investigated to quantify the improvement in image quality with PMC. Results Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance when comparing marker fixations. As an example, the two marker fixations were evaluated in terms of their effectiveness for PMC; the mouth guard achieved better PMC performance. Conclusion Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. PMID:25761550

  6. Quantitative and chemical fingerprint analysis for the quality evaluation of Isatis indigotica based on ultra-performance liquid chromatography with photodiode array detector combined with chemometric methods.

    PubMed

    Shi, Yan-Hong; Xie, Zhi-Yong; Wang, Rui; Huang, Shan-Jun; Li, Yi-Ming; Wang, Zheng-Tao

    2012-01-01

    A simple and reliable method using ultra-performance liquid chromatography with photodiode array detection (UPLC-PDA) was developed to control the quality of Radix Isatidis (dried root of Isatis indigotica) through chemical fingerprint analysis and quantitative analysis of eight bioactive constituents: R,S-goitrin, progoitrin, epiprogoitrin, gluconapin, adenosine, uridine, guanosine, and hypoxanthine. In quantitative analysis, the eight components showed good linearity (R > 0.9997) within the test ranges, and recoveries ranged from 99.5% to 103.0%. The UPLC fingerprints of the Radix Isatidis samples were compared by chemometric procedures, including similarity analysis, hierarchical clustering analysis, and principal component analysis. The chemometric procedures classified Radix Isatidis and its finished products such that all samples could be successfully grouped into crude herbs, prepared slices, and the adulterant Baphicacanthis cusiae Rhizoma et Radix. The combination of quantitative and chromatographic fingerprint analysis can be used for the quality assessment of Radix Isatidis and its finished products.

  7. Performance of phalangeal quantitative ultrasound parameters in the evaluation of reduced bone mineral density assessed by DX in patients with 21 hydroxylase deficiency.

    PubMed

    Gonçalves, Ezequiel M; Sewaybricker, Leticia E; Baptista, Fatima; Silva, Analiza M; Carvalho, Wellington R G; Santos, Allan O; de Mello, Maricilda P; Lemos-Marini, Sofia H V; Guerra, Gil

    2014-07-01

    The purpose of this study was to verify the performance of quantitative ultrasound (QUS) parameters of the proximal phalanges in the evaluation of reduced bone mineral density (BMD) in patients with congenital adrenal hyperplasia due to 21-hydroxylase deficiency (21 OHD). Seventy patients with 21 OHD (41 females and 29 males), aged 6-27 years, were assessed. The QUS measurements, amplitude-dependent speed of sound (AD-SoS), bone transmission time (BTT), and ultrasound bone profile index (UBPI), were obtained using the BMD Sonic device (IGEA, Carpi, Italy) on the last four proximal phalanges of the non-dominant hand. BMD was determined by dual-energy X-ray absorptiometry (DXA) for the total body and lumbar spine (LS). Total body and LS BMD were positively correlated with UBPI, BTT, and AD-SoS (correlation coefficients ranged from 0.59-0.72, p < 0.001). In contrast, when comparing patients with normal and low (Z-score < -2) BMD, no differences were found in the QUS parameters. Furthermore, UBPI, BTT, and AD-SoS measurements were not effective for diagnosing reduced BMD according to receiver operating characteristic curve parameters. Although AD-SoS, BTT, and UBPI showed significant correlations with the data obtained by DXA, they were not effective for diagnosing reduced bone mass in patients with 21 OHD.
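
    The receiver operating characteristic analysis used here can be sketched with a rank-based AUC, which equals the probability that a randomly chosen patient from one group is separated from the other group by the QUS reading. The values below are made-up illustrations, not study data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: the probability that a
    positive case scores higher than a negative one (ties count half)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical AD-SoS-like readings (m/s): normal-BMD vs low-BMD patients
normal_bmd = [1980, 2050, 2100]
low_bmd = [1900, 1950, 2000]
auc = roc_auc(normal_bmd, low_bmd)  # an AUC near 0.5 means no diagnostic value
```

    An AUC close to 0.5 is exactly the situation the study reports for the QUS parameters: correlated with DXA on average, yet unable to discriminate individual low-BMD patients.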

  8. C-arm cone beam CT guidance of sinus and skull base surgery: quantitative surgical performance evaluation and development of a novel high-fidelity phantom

    NASA Astrophysics Data System (ADS)

    Vescan, A. D.; Chan, H.; Daly, M. J.; Witterick, I.; Irish, J. C.; Siewerdsen, J. H.

    2009-02-01

    Surgical simulation has become a critical component of surgical practice and training in the era of high-precision image-guided surgery. While the ability to simulate surgery of the paranasal sinuses and skull base has been conventionally limited to 3D digital simulation or cadaveric dissection, we have developed novel methods employing rapid prototyping technology and 3D printing to create high-fidelity models from real patient images (CT or MR). Such advances allow creation of patient-specific models for preparation, simulation, and training before embarking on the actual surgery. A major challenge included the development of novel material formulations compatible with the rapid prototyping process while presenting anatomically realistic flexibility, cut-ability, drilling purchase, and density (CT number). Initial studies have yielded realistic models of the paranasal sinuses and skull base for simulation and training in image-guided surgery. The process of model development and material selection is reviewed along with the application of the phantoms in studies of high-precision surgery guided by C-arm cone-beam CT (CBCT). Surgical performance is quantitatively evaluated under CBCT guidance, with the high-fidelity phantoms providing an excellent test-bed for reproducible studies across a broad spectrum of challenging surgical tasks. Future work will broaden the atlas of models to include normal anatomical variations as well as a broad spectrum of benign and malignant disease. The role of high-fidelity models produced by rapid prototyping is discussed in the context of patient-specific case simulation, novel technology development (specifically CBCT guidance), and training of future generations of sinus and skull base surgeons.

  9. Influence of sulphur-fumigation on the quality of white ginseng: a quantitative evaluation of major ginsenosides by high performance liquid chromatography.

    PubMed

    Jin, Xin; Zhu, Ling-Ying; Shen, Hong; Xu, Jun; Li, Song-Lin; Jia, Xiao-Bin; Cai, Hao; Cai, Bao-Chang; Yan, Ru

    2012-12-01

    White ginseng was reported to be sulphur-fumigated during post-harvest handling. In the present study, the influence of sulphur-fumigation on the quality of white ginseng and its decoction was quantitatively evaluated through simultaneous quantification of 14 major ginsenosides by a validated high performance liquid chromatography method. A Poroshell 120 EC-C18 (100 mm × 3.0 mm, 2.7 μm) column was chosen for the separation of the major ginsenosides, which were eluted with a water-acetonitrile gradient as the mobile phase. The analytes were monitored by UV at 203 nm. The method was validated in terms of linearity, sensitivity, precision, accuracy, and stability. Sulphur-fumigated and non-fumigated white ginseng samples, as well as their respective decoctions, were comparatively analysed with the newly validated method. The contents of the nine ginsenosides detected in the raw materials decreased by about 3-85%, and their total content decreased by almost 54% after sulphur-fumigation. Likewise, the contents of the 10 ginsenosides detected in decoctions of sulphur-fumigated white ginseng decreased by about 33-83%, and the total ginsenoside content decreased by up to 64%, compared with those of non-fumigated white ginseng. In addition, ginsenosides Rh(2) and Rg(5) could be detected in the decoctions of sulphur-fumigated white ginseng but not in those of non-fumigated white ginseng. These results suggest that sulphur-fumigation significantly influences not only the contents of the original ginsenosides, but also the decocting-induced chemical transformation of ginsenosides in white ginseng.

  10. Evaluating Mandibular Cortical Index Quantitatively

    PubMed Central

    Yasar, Fusun; Akgunlu, Faruk

    2008-01-01

    Objectives The aim was to assess whether Fractal Dimension and Lacunarity analysis can discriminate patients having different mandibular cortical shapes. Methods Panoramic radiographs of 52 patients were evaluated for mandibular cortical index. Weighted kappa values between the observations ranged from 0.718 to 0.805. The radiographs were scanned and converted to binary images, and Fractal Dimension and Lacunarity were calculated from the regions that best represent the cortical morphology. Results There were statistically significant differences in Fractal Dimension and Lacunarity between radiographs classified as Cl 1 and Cl 2 (Fractal Dimension P = 0.000; Lacunarity P = 0.003) and between Cl 1 and Cl 3 cortical morphology (Fractal Dimension P = 0.008; Lacunarity P = 0.001), but no statistically significant difference between radiographs classified as Cl 2 and Cl 3 (Fractal Dimension P = 1.000; Lacunarity P = 0.758). Conclusions FD and L can differentiate Cl 1 mandibular cortical shape from both Cl 2 and Cl 3, but cannot differentiate Cl 2 from Cl 3 on panoramic radiographs. PMID:19212535
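
    Box-counting fractal dimension, the usual estimator behind analyses like this one, is simple to implement: count occupied boxes N(s) at several box sizes s and fit the slope of log N(s) against log(1/s). A self-contained sketch on a toy binary image (a real analysis would run on the scanned cortical region; whether the study used exactly this estimator is an assumption):

```python
import math

def box_count(image, box):
    """Number of box x box cells containing at least one foreground pixel."""
    rows, cols = len(image), len(image[0])
    count = 0
    for r in range(0, rows, box):
        for c in range(0, cols, box):
            if any(image[i][j]
                   for i in range(r, min(r + box, rows))
                   for j in range(c, min(c + box, cols))):
                count += 1
    return count

def fractal_dimension(image, box_sizes=(1, 2, 4)):
    """Least-squares slope of log N(s) versus log(1/s)."""
    xs = [math.log(1.0 / s) for s in box_sizes]
    ys = [math.log(box_count(image, s)) for s in box_sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check: a filled 8x8 square should have dimension 2
filled = [[1] * 8 for _ in range(8)]
d = fractal_dimension(filled)
```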

  11. Combination of quantitative analysis and chemometric analysis for the quality evaluation of three different frankincenses by ultra high performance liquid chromatography and quadrupole time of flight mass spectrometry.

    PubMed

    Zhang, Chao; Sun, Lei; Tian, Run-tao; Jin, Hong-yu; Ma, Shuang-Cheng; Gu, Bing-ren

    2015-10-01

    Frankincense has gained increasing attention in the pharmaceutical industry because of its pharmacologically active components such as boswellic acids. However, the identification and overall quality evaluation of the three frankincense species recognized in different pharmacopoeias and the literature have seldom been reported. In this paper, quantitative analysis and chemometric evaluation were established and applied for the quality control of frankincense, with both conducted under the same analytical conditions. In total, 55 samples from four habitats (three species) of frankincense were collected, and six boswellic acids were chosen for quantitative analysis. Chemometric analyses such as similarity analysis, hierarchical cluster analysis, and principal component analysis were used to distinguish the three frankincense species and to reveal the correlation between components and species. In addition, 12 chromatographic peaks were tentatively identified using reference substances and quadrupole time-of-flight mass spectrometry. The results indicated that the total boswellic acid profiles of the three species are similar, and that their fingerprints can be used to differentiate between them.

  12. Apprentice Performance Evaluation.

    ERIC Educational Resources Information Center

    Gast, Clyde W.

    The Granite City (Illinois) Steel apprentices are under a performance evaluation from entry to graduation. Federally approved, the program is guided by joint apprenticeship committees whose monthly meetings include performance evaluation from three information sources: journeymen, supervisors, and instructors. Journeymen's evaluations are made…

  13. Influence of processing procedure on the quality of Radix Scrophulariae: a quantitative evaluation of the main compounds obtained by accelerated solvent extraction and high-performance liquid chromatography.

    PubMed

    Cao, Gang; Wu, Xin; Li, Qinglin; Cai, Hao; Cai, Baochang; Zhu, Xuemei

    2015-02-01

    An improved high-performance liquid chromatography with diode array detection combined with accelerated solvent extraction method was used to simultaneously determine six compounds in crude and processed Radix Scrophulariae samples. Accelerated solvent extraction parameters such as extraction solvent, temperature, number of cycles, and analysis procedure were systematically optimized. The results indicated that compared with crude Radix Scrophulariae samples, the processed samples had lower contents of harpagide and harpagoside but higher contents of catalpol, acteoside, angoroside C, and cinnamic acid. The established method was sufficiently rapid and reliable for the global quality evaluation of crude and processed herbal medicines.

  14. Chemical fingerprint and quantitative analysis for the quality evaluation of Vitex negundo seeds by reversed-phase high-performance liquid chromatography coupled with hierarchical clustering analysis.

    PubMed

    Shu, Zhiheng; Li, Xiuqing; Rahman, Khalid; Qin, Luping; Zheng, Chengjian

    2016-01-01

    A simple and efficient method was developed for the chemical fingerprint analysis and simultaneous determination of four phenylnaphthalene-type lignans in Vitex negundo seeds using high-performance liquid chromatography with diode array detection. For fingerprint analysis, 13 V. negundo seed samples were collected from different regions in China, and the fingerprint chromatograms were matched by the computer-aided Similarity Evaluation System for Chromatographic Fingerprint of TCM (Version 2004A). A total of 21 common peaks found in all the chromatograms were used for evaluating the similarity between these samples. Additionally, simultaneous quantification of four major bioactive ingredients was conducted to assess the quality of V. negundo seeds. Our results indicated that the contents of the four lignans in V. negundo seeds varied remarkably between herbal samples collected from different regions. Moreover, hierarchical clustering analysis grouped these 13 samples into three categories, consistent with the chemotypes revealed by their chromatograms. The method developed in this study provides a substantial foundation for the establishment of reasonable quality control standards for V. negundo seeds.

  15. Evaluating steam trap performance

    SciTech Connect

    Fuller, N.Y.

    1985-08-08

    This paper presents a method for evaluating the performance of steam traps by preparing an economic analysis of several types to determine the equivalent uniform annual cost. A series of tests on steam traps supplied by six manufacturers provided data for determining the relative efficiencies of each unit. The comparison was made using a program developed for the Texas Instruments TI-59 programmable calculator to evaluate overall steam trap economics.
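
    Equivalent uniform annual cost spreads a trap's installed cost over its life at a given interest rate (via the capital recovery factor) and adds yearly operating losses, so traps with different prices and lifetimes can be compared on one number. A sketch with hypothetical figures, not the paper's test data:

```python
def euac(capital_cost, interest, years, annual_operating_cost):
    """Equivalent uniform annual cost: capital recovery plus yearly costs."""
    crf = interest * (1 + interest) ** years / ((1 + interest) ** years - 1)
    return capital_cost * crf + annual_operating_cost

# Hypothetical trap: $500 installed, 10% interest, 5-year life,
# $120/yr in steam losses through the trap
cost = euac(500.0, 0.10, 5, 120.0)  # annualized cost in $/yr
```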

  16. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  17. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  18. Evaluating Performance of Components

    NASA Technical Reports Server (NTRS)

    Katz, Daniel; Tisdale, Edwin; Norton, Charles

    2004-01-01

    Parallel Component Performance Benchmarks is a computer program developed to aid the evaluation of the Common Component Architecture (CCA) - a software architecture, based on a component model, that was conceived to foster high-performance computing, including parallel computing. More specifically, this program compares the performance (principally by measuring computing times) of componentized versus conventional versions of the Parallel Pyramid 2D Adaptive Mesh Refinement library - a software library that is used to generate computational meshes for solving physical problems and that is typical of software libraries in use at NASA's Jet Propulsion Laboratory.

  19. Performance Evaluation Process.

    ERIC Educational Resources Information Center

    1998

    This document contains four papers from a symposium on the performance evaluation process and human resource development (HRD). "Assessing the Effectiveness of OJT (On the Job Training): A Case Study Approach" (Julie Furst-Bowe, Debra Gates) is a case study of the effectiveness of OJT in one of a high-tech manufacturing company's product…

  20. A QUANTITATIVE TECHNIQUE FOR PERFORMING PLASMAPHERESIS

    PubMed Central

    Melnick, Daniel; Cowgill, George R.

    1936-01-01

    1. A special apparatus and technique are described which permit one to conduct plasmapheresis quantitatively. 2. The validity of the methods employed, for determining serum protein concentration and blood volume as prerequisites for the calculation of the amount of blood to be withdrawn, are discussed. PMID:19870575

  1. Quantitative performance assessments for neuromagnetic imaging systems.

    PubMed

    Koga, Ryo; Hiyama, Ei; Matsumoto, Takuya; Sekihara, Kensuke

    2013-01-01

    We have developed a Monte-Carlo simulation method to assess the performance of neuromagnetic imaging systems using two kinds of performance metrics: the A-prime metric and spatial resolution. We compute these performance metrics for virtual sensor systems having 80, 160, 320, and 640 sensors, and discuss how the system performance improves with the number of sensors. We also compute these metrics for existing whole-head MEG systems: MEGvision™ (Yokogawa Electric Corporation, Tokyo, Japan), which uses axial-gradiometer sensors, and TRIUX™ (Elekta, Stockholm, Sweden), which uses planar-gradiometer and magnetometer sensors. We discuss performance comparisons between these significantly different systems.
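
    The A-prime metric is commonly computed from a hit rate H and a false-alarm rate F; whether the authors use exactly this standard nonparametric form is an assumption, but it illustrates the idea:

```python
def a_prime(hit_rate, fa_rate):
    """Nonparametric detection-sensitivity estimate from a hit rate H
    and a false-alarm rate F; 0.5 is chance, 1.0 is a perfect detector."""
    H, F = hit_rate, fa_rate
    if H >= F:
        return 0.5 + (H - F) * (1 + H - F) / (4 * H * (1 - F))
    return 0.5 - (F - H) * (1 + F - H) / (4 * F * (1 - H))

# A detector with 90% hits and 10% false alarms scores well above chance
score = a_prime(0.9, 0.1)
```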

  2. A Quantitative Evaluation of Dissolved Oxygen Instrumentation

    NASA Technical Reports Server (NTRS)

    Pijanowski, Barbara S.

    1971-01-01

    The implications of the presence of dissolved oxygen in water are discussed in terms of its deleterious or beneficial effects, depending on the functional consequences to those affected, e.g., the industrialist, the oceanographer, and the ecologist. The paper is devoted primarily to an examination of the performance of five commercially available dissolved oxygen meters. The design of each is briefly reviewed and ease or difficulty of use in the field described. Specifically, the evaluation program treated a number of parameters and user considerations including an initial check and trial calibration for each instrument and a discussion of the measurement methodology employed. Detailed test results are given relating to the effects of primary power variation, water-flow sensitivity, response time, relative accuracy of dissolved-oxygen readout, temperature accuracy (for those instruments which included this feature), error and repeatability, stability, pressure and other environmental effects, and test results obtained in the field. Overall instrument performance is summarized comparatively by chart.

  3. A Novel Assessment Tool for Quantitative Evaluation of Science Literature Search Performance: Application to First-Year and Senior Undergraduate Biology Majors

    ERIC Educational Resources Information Center

    Blank, Jason M.; McGaughey, Karen J.; Keeling, Elena L.; Thorp, Kristen L.; Shannon, Conor C.; Scaramozzino, Jeanine M.

    2016-01-01

    Expertise in searching and evaluating scientific literature is a requisite skill of trained scientists and science students, yet information literacy instruction varies greatly among institutions and programs. To ensure that science students acquire information literacy skills, robust methods of assessment are needed. Here, we describe a novel…

  4. Quantitative evaluation of chemisorption processes on semiconductors

    NASA Astrophysics Data System (ADS)

    Rothschild, A.; Komem, Y.; Ashkenasy, N.

    2002-12-01

    This article presents a method for numerical computation of the degree of coverage of chemisorbates and the resultant surface band bending as a function of the ambient gas pressure, temperature, and semiconductor doping level. This method enables quantitative evaluation of the effect of chemisorption on the electronic properties of semiconductor surfaces, such as the work function and surface conductivity, which is of great importance for many applications such as solid-state chemical sensors and electro-optical devices. The method is applied for simulating the chemisorption behavior of oxygen on n-type CdS, a process that has been investigated extensively due to its impact on the photoconductive properties of CdS photodetectors. The simulation demonstrates that the chemisorption of adions saturates when the Fermi level becomes aligned with the chemisorption-induced surface states, limiting their coverage to a small fraction of a monolayer. The degree of coverage of chemisorbed adions is proportional to the square root of the doping level, while that of neutral adsorbates is independent of the doping level. It is shown that the chemisorption of neutral adsorbates behaves according to the well-known Langmuir model, regardless of the existence of charged species on the surface, while charged adions do not obey Langmuir's isotherm. In addition, it is found that in depletive chemisorption processes the resultant surface band bending increases by 2.3 kT (where k is the Boltzmann constant and T is the temperature) when the gas pressure increases by one order of magnitude or when the doping level increases by two orders of magnitude.
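
    Two of the quantitative statements lend themselves to a short numerical check: the Langmuir isotherm for neutral adsorbates, and the 2.3 kT-per-decade rise in band bending for a depletive process (2.3 ≈ ln 10). A sketch with illustrative values:

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
EV = 1.602176634e-19  # J per eV

def langmuir_coverage(K, p):
    """Langmuir isotherm: fractional coverage of neutral adsorbates
    at equilibrium constant K and pressure p."""
    return K * p / (1 + K * p)

def band_bending_increase_ev(T, pressure_ratio):
    """Band-bending increase (eV) for a depletive process when the gas
    pressure rises by `pressure_ratio`: kT * ln(ratio)."""
    return K_B * T * math.log(pressure_ratio) / EV

theta = langmuir_coverage(1.0, 1.0)         # coverage is 0.5 at K*p = 1
dV = band_bending_increase_ev(300.0, 10.0)  # one pressure decade at 300 K
```

    At room temperature one decade of pressure gives kT ln 10 ≈ 0.06 eV, consistent with the 2.3 kT figure quoted in the abstract.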

  5. Sample metallization for performance improvement in desorption/ionization of kilodalton molecules: quantitative evaluation, imaging secondary ion MS, and laser ablation.

    PubMed

    Delcorte, A; Bour, J; Aubriet, F; Muller, J-F; Bertrand, P

    2003-12-15

    The metallization procedure, proposed recently for signal improvement in organic secondary ion mass spectrometry (SIMS) (Delcorte, A.; Médard, N.; Bertrand, P. Anal. Chem. 2002, 74, 4955), has been thoroughly tested for a set of kilodalton molecules bearing various functional groups: Irganox 1010, polystyrene, polyalanine, and copper phthalocyanine. In addition to gold, we evaluate the effect of silver evaporation as a sample treatment prior to static SIMS analysis. Ion yields, damage cross sections, and emission efficiencies are compared for Ag- and Au-metallized molecular films, pristine coatings on silicon, and submonolayers of the same molecules adsorbed on silver and gold. The results are sample-dependent, but as an example, the yield enhancement calculated for metallized Irganox films with respect to untreated coatings is larger than 2 orders of magnitude for the quasimolecular ion and a factor of 1-10 for characteristic fragments. Insights into the emission processes of quasimolecular ions from metallized surfaces are deduced from kinetic energy distribution measurements. The advantage of the method for imaging SIMS applications is illustrated by the study of a nonuniform coating of polystyrene oligomers on a 100-μm polypropylene film. The evaporated metal eliminates sample charging and allows us to obtain enhanced-quality images of characteristic fragment ions as well as reasonably contrasted chemical mappings for cationized PS oligomers and large PP chain segments. Finally, we report on the benefit of using metal evaporation as a sample preparation procedure for laser ablation mass spectrometry. Our results show that the fingerprint spectra of Au-covered polystyrene, polypropylene, and Irganox films can be readily obtained under 337-nm irradiation, a wavelength for which the absorption of polyolefins is low. This is probably because the gold clusters embedded in the sample surface absorb and transfer the photon energy to the surrounding organic medium.

  6. Functional Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Greenisen, Michael C.; Hayes, Judith C.; Siconolfi, Steven F.; Moore, Alan D.

    1999-01-01

    The Extended Duration Orbiter Medical Project (EDOMP) was established to address specific issues associated with optimizing the ability of crews to complete mission tasks deemed essential to entry, landing, and egress for spaceflights lasting up to 16 days. The main objectives of this functional performance evaluation were to investigate the physiological effects of long-duration spaceflight on skeletal muscle strength and endurance, as well as aerobic capacity and orthostatic function. Long-duration exposure to a microgravity environment may produce physiological alterations that affect crew ability to complete critical tasks such as extravehicular activity (EVA), intravehicular activity (IVA), and nominal or emergency egress. Ultimately, this information will be used to develop and verify countermeasures. The answers to three specific functional performance questions were sought: (1) What are the performance decrements resulting from missions of varying durations? (2) What are the physical requirements for successful entry, landing, and emergency egress from the Shuttle? and (3) What combination of preflight fitness training and in-flight countermeasures will minimize in-flight muscle performance decrements? To answer these questions, the Exercise Countermeasures Project looked at physiological changes associated with muscle degradation as well as orthostatic intolerance. A means of ensuring motor coordination was necessary to maintain proficiency in piloting skills, EVA, and IVA tasks. In addition, it was necessary to maintain musculoskeletal strength and function to meet the rigors associated with moderate altitude bailout and with nominal or emergency egress from the landed Orbiter. Eight investigations, referred to as Detailed Supplementary Objectives (DSOs) 475, 476, 477, 606, 608, 617, 618, and 624, were conducted to study muscle degradation and the effects of exercise on exercise capacity and orthostatic function (Table 3-1). 

  7. A study on the quantitative evaluation of skin barrier function

    NASA Astrophysics Data System (ADS)

    Maruyama, Tomomi; Kabetani, Yasuhiro; Kido, Michiko; Yamada, Kenji; Oikaze, Hirotoshi; Takechi, Yohei; Furuta, Tomotaka; Ishii, Shoichi; Katayama, Haruna; Jeong, Hieyong; Ohno, Yuko

    2015-03-01

    We propose a quantitative evaluation method for skin barrier function using an Optical Coherence Microscopy system (OCM system) based on the coherence of near-infrared light. Skin problems such as itching and irritation are widely recognized to be caused by impairment of the skin barrier function, which protects against external stimuli and prevents water loss. The common strategy for evaluating skin barrier function is to observe the skin surface and ask patients about their skin condition; such judgments are subjective and depend on the examiner's experience. Microscopy has been used to observe the inner structure of the skin in detail, but in vitro measurements of this kind require tissue sampling. An objective, quantitative evaluation method is therefore needed, together with a non-invasive, non-destructive measurement technique that allows changes to be followed over time; in vivo measurement is thus crucial for evaluating skin barrier function. In this study, we evaluate changes in the structure of the stratum corneum, which is central to barrier function, by comparing water-penetrated skin with normal skin using the OCM system. The proposed method obtains in vivo 3D images of the inner structure of body tissue non-invasively and non-destructively. We formulate the changes in skin ultrastructure after water penetration. Finally, we evaluate the performance limits of the OCM system in order to discuss how it can be improved.

  8. Design, implementation and multisite evaluation of a system suitability protocol for the quantitative assessment of instrument performance in liquid chromatography-multiple reaction monitoring-MS (LC-MRM-MS).

    PubMed

    Abbatiello, Susan E; Mani, D R; Schilling, Birgit; Maclean, Brendan; Zimmerman, Lisa J; Feng, Xingdong; Cusack, Michael P; Sedransk, Nell; Hall, Steven C; Addona, Terri; Allen, Simon; Dodder, Nathan G; Ghosh, Mousumi; Held, Jason M; Hedrick, Victoria; Inerowicz, H Dorota; Jackson, Angela; Keshishian, Hasmik; Kim, Jong Won; Lyssand, John S; Riley, C Paige; Rudnick, Paul; Sadowski, Pawel; Shaddox, Kent; Smith, Derek; Tomazela, Daniela; Wahlander, Asa; Waldemarson, Sofia; Whitwell, Corbin A; You, Jinsam; Zhang, Shucha; Kinsinger, Christopher R; Mesri, Mehdi; Rodriguez, Henry; Borchers, Christoph H; Buck, Charles; Fisher, Susan J; Gibson, Bradford W; Liebler, Daniel; Maccoss, Michael; Neubert, Thomas A; Paulovich, Amanda; Regnier, Fred; Skates, Steven J; Tempst, Paul; Wang, Mu; Carr, Steven A

    2013-09-01

    Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories and ultimately in clinical laboratory settings require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms, configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low multiplex analyses as well as high multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring a range of chromatographic and mass spectrometric metrics including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnoses of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use of and need for an SSP to establish robust and reliable system performance.
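    As a rough illustration (not the published protocol software), the stated pass/fail thresholds can be checked mechanically. All function names and measurement values below are invented for demonstration:

```python
import statistics

# Apply the abstract's pass/fail criteria: peak-area CV < 0.15,
# peak-width CV < 0.15, RT standard deviation < 0.15 min, and
# RT drift < 0.5 min. Data are hypothetical.

def coefficient_of_variation(values):
    return statistics.stdev(values) / statistics.mean(values)

def system_suitability(peak_areas, peak_widths, retention_times):
    rt_drift = max(retention_times) - min(retention_times)
    return {
        "peak_area_cv_ok": coefficient_of_variation(peak_areas) < 0.15,
        "peak_width_cv_ok": coefficient_of_variation(peak_widths) < 0.15,
        "rt_sd_ok": statistics.stdev(retention_times) < 0.15,
        "rt_drift_ok": rt_drift < 0.5,
    }

checks = system_suitability(
    peak_areas=[1.02e6, 0.98e6, 1.05e6, 1.00e6],
    peak_widths=[0.21, 0.22, 0.20, 0.21],          # minutes
    retention_times=[12.31, 12.34, 12.30, 12.33],  # minutes
)
print(all(checks.values()))
```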

  9. Milankovitch radiation variations: a quantitative evaluation.

    PubMed

    Shaw, D M; Donn, W L

    1968-12-13

    A quantitative determination of changes in the surface temperature caused by variations in insolation calculated by Milankovitch has been made through the use of the thermodynamic model of Adem. Under extreme conditions, mean coolings of 3.1 degrees and 2.7 degrees C, respectively, at latitudes 25 degrees and 65 degrees N are obtained for Milankovitch radiation cycles. At the sensitive latitude 65 degrees N, the mean cooling below the present temperature for each of the times of radiation minimum is only 1.4 degrees C. This result indicates that the Milankovitch effect is rather too small to have triggered glacial climates.

  10. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadow maps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the quite unexplored problem of estimating the wavefront's gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the very first time that a complete mathematically grounded treatment of this optical phenomenon is presented, complemented by an image-processing algorithm which allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams.
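    The final reconstruction step described above (Fourier-based integration of estimated partial derivatives to recover the surface) can be sketched in the Frankot-Chellappa style. The gradient-estimation optimization itself is not reproduced here, and the surface below is a synthetic stand-in:

```python
import numpy as np

# Recover a surface z(x, y) from its partial derivatives by solving
# the Poisson problem in the Fourier domain (Frankot-Chellappa style).

def integrate_gradients(gx, gy, dx=1.0, dy=1.0):
    ny, nx = gx.shape
    wx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)  # angular frequencies
    wy = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    WX, WY = np.meshgrid(wx, wy)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0  # avoid 0/0 at the DC term
    Z = (-1j * WX * np.fft.fft2(gx) - 1j * WY * np.fft.fft2(gy)) / denom
    Z[0, 0] = 0.0      # absolute height offset is unrecoverable
    return np.real(np.fft.ifft2(Z))

# Synthetic periodic surface z = cos(x) with analytic gradients
n = 64
dx = 2 * np.pi / n
x = np.arange(n) * dx
z_true = np.tile(np.cos(x), (n, 1))
gx = np.tile(-np.sin(x), (n, 1))  # dz/dx
gy = np.zeros_like(gx)            # dz/dy
z_rec = integrate_gradients(gx, gy, dx=dx, dy=dx)
print(np.allclose(z_rec, z_true, atol=1e-6))
```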

  11. Algorithm performance evaluation

    NASA Astrophysics Data System (ADS)

    Smith, Richard N.; Greci, Anthony M.; Bradley, Philip A.

    1995-03-01

    Traditionally, the performance of adaptive antenna systems is measured using automated antenna array pattern measuring equipment. This measurement equipment produces a plot of the receive gain of the antenna array as a function of angle. However, communications system users more readily accept and understand bit error rate (BER) as a performance measure. The work reported on here was conducted to characterize adaptive antenna receiver performance in terms of overall communications system performance using BER as a performance measure. The adaptive antenna system selected for this work featured a linear array, least mean square (LMS) adaptive algorithm and a high speed phase shift keyed (PSK) communications modem.
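    A minimal sketch of the least mean square (LMS) algorithm named above, here identifying an unknown FIR response from input/desired pairs; the reported system's linear array geometry, PSK modem, and BER measurement are not modeled:

```python
import numpy as np

# LMS adaptive filter: adjust weights by stochastic gradient descent
# on the instantaneous squared error.

def lms(x, d, n_taps=4, mu=0.05):
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        window = x[n - n_taps + 1 : n + 1]  # most recent n_taps samples
        e = d[n] - w @ window               # a priori error
        w += 2 * mu * e * window            # steepest-descent update
    return w

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.4, -0.2, 0.1, 0.05])         # "unknown" system
d = np.convolve(x, h, mode="full")[:len(x)]  # desired response
w = lms(x, d)
# Taps converge to the time-reversed impulse response
print(np.allclose(w, h[::-1], atol=0.01))
```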

  12. Vendor Performance Evaluation.

    ERIC Educational Resources Information Center

    Grant, Joan; Perelmuter, Susan

    1978-01-01

    Vendor selection can mean success or failure of an approval plan; this study evaluates three book vendors by comparing their plans on the bases of speed, bibliographic accuracy, and discounts. (Author/CWM)

  13. Instrument performance evaluation

    SciTech Connect

    Swinth, K.L.

    1993-03-01

    Deficiencies exist in both the performance and the quality of health physics instruments. Recognizing the implications of such deficiencies for the protection of workers and the public, in the early 1980s the DOE and the NRC encouraged the development of a performance standard and established a program to test a series of instruments against criteria in the standard. The purpose of the testing was to establish the practicality of the criteria in the standard, to determine the performance of a cross section of available instruments, and to establish a testing capability. Over 100 instruments were tested, resulting in a practical standard and an understanding of the deficiencies in available instruments. In parallel with the instrument testing, a value-impact study clearly established the benefits of implementing a formal testing program. An ad hoc committee also met several times to establish recommendations for the voluntary implementation of a testing program based on the studies and the performance standard. For several reasons, a formal program did not materialize. Ongoing tests and studies have supported the development of specific instruments and have helped specific clients understand the performance of their instruments. The purpose of this presentation is to trace the history of instrument testing to date and suggest the benefits of a centralized formal program.

  14. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has left the concepts obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods suited to the situation and prior conditions of each study is an important approach for researchers.

  15. CONFOCAL MICROSCOPY SYSTEM PERFORMANCE: QA TESTS, QUANTITATION AND SPECTROSCOPY

    EPA Science Inventory

    Confocal Microscopy System Performance: QA tests, Quantitation and Spectroscopy.

    Robert M. Zucker (1) and Jeremy M. Lerner (2); (1) Reproductive Toxicology Division, National Health and Environmental Effects Research Laboratory, Office of Research Development, U.S. Environmen...

  16. QUANTITATIVE EVALUATION OF FIRE SEPARATION AND BARRIERS

    SciTech Connect

    Coutts, D

    2007-04-17

    Fire barriers and physical separation are key components in managing the fire risk in Nuclear Facilities. The expected performance of these features has often been predicted using rules of thumb or expert judgment. These approaches often lack the convincing technical bases that exist when addressing other Nuclear Facility accident events. This paper presents science-based approaches to demonstrate the effectiveness of fire separation methods.

  17. A quantitative evaluation of alcohol withdrawal tremors.

    PubMed

    Aarabi, Parham; Norouzi, Narges; Dear, Taylor; Carver, Sally; Bromberg, Simon; Gray, Sara; Kahan, Mel; Borgundvaag, Bjug

    2015-01-01

    This paper evaluates the relation between Alcohol Withdrawal Syndrome tremors in the left and right hands of patients. By analyzing 122 recordings from 61 patients in emergency departments, we found a weak relationship between the left and right hand tremor frequencies (correlation coefficient of 0.63). We found a much stronger relationship between the expert physician tremor ratings (on CIWA-Ar 0-7 scale) of the two hands, with a correlation coefficient of 0.923. Next, using a smartphone to collect the tremor data and using a previously developed model for obtaining estimated tremor ratings, we also found a strong correlation (correlation coefficient of 0.852) between the estimates of each hand. Finally, we evaluated different methods of combining the data from the two hands for obtaining a single tremor rating estimate, and found that simply averaging the tremor ratings of the two hands results in the lowest tremor estimate error (an RMSE of 0.977). Looking at the frequency dependence of this error, we found that higher frequency tremors had a much lower estimation error (an RMSE of 1.102 for tremors with frequencies in the 3-6 Hz range as compared to 0.625 for tremors with frequencies in the 7-10 Hz range).
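    The two summary statistics the abstract relies on, Pearson correlation between the hands and RMSE of an averaged estimate against an expert rating, are simple to compute. All ratings below are invented for illustration:

```python
import math

# Pearson correlation and RMSE on hypothetical tremor ratings.

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def rmse(estimates, truth):
    return math.sqrt(sum((e - t) ** 2 for e, t in zip(estimates, truth))
                     / len(truth))

left = [2.1, 3.4, 0.8, 5.2, 4.0]    # estimated rating, left hand
right = [2.5, 3.1, 1.0, 4.8, 4.4]   # estimated rating, right hand
expert = [2, 3, 1, 5, 4]            # expert CIWA-Ar rating
combined = [(l + r) / 2 for l, r in zip(left, right)]
print(round(pearson(left, right), 3), round(rmse(combined, expert), 3))
```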

  18. Evaluation (not validation) of quantitative models.

    PubMed

    Oreskes, N

    1998-12-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid.

  20. Quantitative evaluation of ocean thermal energy conversion (OTEC): executive briefing

    SciTech Connect

    Gritton, E.C.; Pei, R.Y.; Hess, R.W.

    1980-08-01

    Documentation is provided of a briefing summarizing the results of an independent quantitative evaluation of Ocean Thermal Energy Conversion (OTEC) for central station applications. The study concentrated on a central station power plant located in the Gulf of Mexico and delivering power to the mainland United States. The evaluation of OTEC is based on three important issues: resource availability, technical feasibility, and cost.

  1. Quantitative code accuracy evaluation of ISP33

    SciTech Connect

    Kalli, H.; Miwrrin, A.; Purhonen, H.

    1995-09-01

    Aiming at quantifying code accuracy, a methodology based on the Fast Fourier Transform has been developed at the University of Pisa, Italy. The paper gives a short presentation of the methodology and its application to pre-test and post-test calculations submitted to the International Standard Problem ISP33. This was a double-blind natural circulation exercise with a stepwise reduced primary coolant inventory, performed in the PACTEL facility in Finland. PACTEL is a 1/305 volumetrically scaled, full-height simulator of the Russian-type VVER-440 pressurized water reactor, with horizontal steam generators and loop seals in both cold and hot legs. Fifteen foreign organizations participated in ISP33 with 21 blind calculations and 20 post-test calculations; altogether, 10 different thermal-hydraulic codes and code versions were used. The results of the application of the methodology to nine selected measured quantities are summarized.
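    The University of Pisa FFT-based methodology (FFTBM) quantifies accuracy with spectral figures of merit; a minimal sketch of its average-amplitude measure on synthetic traces is shown below (the companion weighted-frequency metric is omitted, and the signals are stand-ins, not ISP33 data):

```python
import numpy as np

# FFTBM-style average amplitude: ratio of the error spectrum
# (calculated minus measured) to the experimental signal spectrum.
# Smaller values indicate a more accurate code calculation.

def fftbm_average_amplitude(calculated, measured):
    error_spectrum = np.abs(np.fft.rfft(calculated - measured))
    exp_spectrum = np.abs(np.fft.rfft(measured))
    return error_spectrum.sum() / exp_spectrum.sum()

t = np.linspace(0, 10, 512)
measured = 1.0 + 0.5 * np.sin(t)         # "experimental" trace
good = measured + 0.02 * np.sin(5 * t)   # small code discrepancy
poor = measured + 0.4 * np.sin(5 * t)    # large code discrepancy
print(fftbm_average_amplitude(good, measured)
      < fftbm_average_amplitude(poor, measured))
```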

  2. Quantitative evaluation of cerebrospinal fluid shunt flow

    SciTech Connect

    Chervu, S.; Chervu, L.R.; Vallabhajosyula, B.; Milstein, D.M.; Shapiro, K.M.; Shulman, K.; Blaufox, M.D.

    1984-01-01

    The authors describe a rigorous method for measuring the flow of cerebrospinal fluid (CSF) in shunt circuits implanted for the relief of obstructive hydrocephalus. Clearance of radioactivity for several calibrated flow rates was determined with a Harvard infusion pump by injecting the Rickham reservoir of a Rickham-Holter valve system with 100 μCi of Tc-99m as pertechnetate. The elliptical and cylindrical Holter valves used as adjunct valves with the Rickham reservoir yielded two different regression lines when the clearances were plotted against flow rates. The experimental regression lines were used to determine the in vivo flow rates from clearances calculated after injecting the Rickham reservoirs of the patients. The unique clearance characteristics of the individual shunt systems available require that calibration curves be derived for an entire system identical to the one implanted in the patient being evaluated, rather than just the injected chamber. Excellent correlation between flow rates and the clinical findings supports the reliability of this method of quantifying CSF shunt flow, and the results are fully accepted by neurosurgeons.
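    The calibration-curve idea (regress measured clearance on known pump flow rates, then invert the fitted line to infer a patient's in vivo flow from clearance) can be sketched as follows; all numbers are invented:

```python
# Least-squares line fit and its inversion for flow estimation.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Calibration data: pump flow rates (mL/h) vs observed clearances
flows = [5, 10, 20, 30, 40]
clearances = [0.9, 1.8, 3.7, 5.5, 7.4]
slope, intercept = fit_line(flows, clearances)

def flow_from_clearance(c):
    # Invert the calibration line: flow = (clearance - b) / m
    return (c - intercept) / slope

print(round(flow_from_clearance(2.75), 1))
```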

  3. Evaluating and Improving Teacher Performance.

    ERIC Educational Resources Information Center

    Manatt, Richard P.

    This workbook, coordinated with Manatt Teacher Performance Evaluation (TPE) workshops, summarizes the large-group presentation in sequence with the transparencies used. The first four modules of the workbook deal with the state of the art of evaluating and improving teacher performance; the development of the TPE system, including selection of…

  4. Quantitative image quality evaluation for cardiac CT reconstructions

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.; Balhorn, William; Okerlund, Darin R.

    2016-03-01

    Maintaining image quality in the presence of motion is always desirable and challenging in clinical Cardiac CT imaging. Different image-reconstruction algorithms are available on current commercial CT systems that attempt to achieve this goal. It is widely accepted that image-quality assessment should be task-based and involve specific tasks, observers, and associated figures of merits. In this work, we developed an observer model that performed the task of estimating the percentage of plaque in a vessel from CT images. We compared task performance of Cardiac CT image data reconstructed using a conventional FBP reconstruction algorithm and the SnapShot Freeze (SSF) algorithm, each at default and optimal reconstruction cardiac phases. The purpose of this work is to design an approach for quantitative image-quality evaluation of temporal resolution for Cardiac CT systems. To simulate heart motion, a moving coronary type phantom synchronized with an ECG signal was used. Three different percentage plaques embedded in a 3 mm vessel phantom were imaged multiple times under motion free, 60 bpm, and 80 bpm heart rates. Static (motion free) images of this phantom were taken as reference images for image template generation. Independent ROIs from the 60 bpm and 80 bpm images were generated by vessel tracking. The observer performed estimation tasks using these ROIs. Ensemble mean square error (EMSE) was used as the figure of merit. Results suggest that the quality of SSF images is superior to the quality of FBP images in higher heart-rate scans.
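    The figure of merit named above, ensemble mean square error (EMSE), is straightforward to compute. A toy comparison on invented plaque-percentage estimates (not the study's data):

```python
# EMSE: mean squared deviation of an ensemble of task estimates
# from the known truth; lower is better.

def emse(estimates, truth):
    return sum((e - truth) ** 2 for e in estimates) / len(estimates)

truth = 40.0  # true percent plaque in the phantom vessel
ssf_estimates = [39.0, 41.5, 40.2, 38.8, 40.9]  # tighter spread
fbp_estimates = [35.5, 44.0, 33.9, 45.2, 37.0]  # motion-degraded
print(emse(ssf_estimates, truth) < emse(fbp_estimates, truth))
```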

  5. Relevance of MTF and NPS in quantitative CT: towards developing a predictable model of quantitative performance

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Richard, Samuel; Samei, Ehsan

    2012-03-01

    The quantification of lung nodule volume based on CT images provides valuable information for disease diagnosis and staging. However, the precision of the quantification is protocol, system, and technique dependent and needs to be evaluated for each specific case. To efficiently investigate the quantitative precision and find an optimal operating point, it is important to develop a predictive model based on basic system parameters. In this study, a Fourier-based metric, the estimability index (e'), was proposed as such a predictor and validated across a variety of imaging conditions. To first obtain the ground truth of quantitative precision, an anthropomorphic chest phantom with synthetic spherical nodules was imaged on a 64-slice CT scanner across a range of protocols (five exposure levels and two reconstruction algorithms). The volumes of nodules were quantified from the images using clinical software, with the precision of the quantification calculated for each protocol. To predict the precision, e' was calculated for each protocol based on several Fourier-based figures of merit, which modeled the characteristics of the quantitation task and the imaging conditions (resolution, noise, etc.) of a particular protocol. Results showed a strong correlation (R2=0.92) between the measured and predicted precision across all protocols, indicating e' is an effective predictor of the quantitative precision. This study provides a useful framework for quantification-oriented optimization of CT protocols.
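    The validation statistic quoted above is a coefficient of determination (R² = 0.92) between measured and predicted precision. A minimal R² computation on made-up per-protocol values:

```python
# Coefficient of determination: 1 - SS_residual / SS_total.

def r_squared(predicted, measured):
    mean_m = sum(measured) / len(measured)
    ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))
    ss_tot = sum((m - mean_m) ** 2 for m in measured)
    return 1 - ss_res / ss_tot

predicted = [0.9, 1.4, 2.1, 2.9, 3.6]  # precision predicted from e'
measured = [1.0, 1.5, 2.0, 3.0, 3.5]   # precision measured on phantom
print(round(r_squared(predicted, measured), 3))
```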

  6. Performance Evaluation and Benchmarking of Intelligent Systems

    SciTech Connect

    Madhavan, Raj; Messina, Elena; Tunstel, Edward

    2009-09-01

    To design and develop capable, dependable, and affordable intelligent systems, their performance must be measurable. Scientific methodologies for standardization and benchmarking are crucial for quantitatively evaluating the performance of emerging robotic and intelligent systems technologies. There is currently no accepted standard for quantitatively measuring the performance of these systems against user-defined requirements; and furthermore, there is no consensus on what objective evaluation procedures need to be followed to understand the performance of these systems. The lack of reproducible and repeatable test methods has precluded researchers working towards a common goal from exchanging and communicating results, inter-comparing system performance, and leveraging previous work that could otherwise avoid duplication and expedite technology transfer. Currently, this lack of cohesion in the community hinders progress in many domains, such as manufacturing, service, healthcare, and security. By providing the research community with access to standardized tools, reference data sets, and open source libraries of solutions, researchers and consumers will be able to evaluate the cost and benefits associated with intelligent systems and associated technologies. In this vein, the edited book volume addresses performance evaluation and metrics for intelligent systems, in general, while emphasizing the need and solutions for standardized methods. To the knowledge of the editors, there is not a single book on the market that is solely dedicated to the subject of performance evaluation and benchmarking of intelligent systems. Even books that address this topic do so only marginally or are out of date. The research work presented in this volume fills this void by drawing from the experiences and insights of experts gained both through theoretical development and practical implementation of intelligent systems in a variety of diverse application domains. 

  7. Performance evaluation soil samples utilizing encapsulation technology

    DOEpatents

    Dahlgran, James R.

    1999-01-01

    Performance evaluation soil samples, and a method of their preparation that uses encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration.

  8. Performance evaluation soil samples utilizing encapsulation technology

    DOEpatents

    Dahlgran, J.R.

    1999-08-17

    Performance evaluation soil samples and method of their preparation uses encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration. 1 fig.

  9. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed on the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  10. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
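    Two of the metrology quantities discussed (bias against a known truth, and the repeatability coefficient, commonly taken as RC = 2.77 × within-subject SD from test-retest pairs) can be computed as follows on invented measurements:

```python
import statistics

# Bias and repeatability coefficient on hypothetical test-retest data.

def bias(measurements, truth):
    return statistics.mean(measurements) - truth

def repeatability_coefficient(pairs):
    # Within-subject variance estimated from test-retest differences
    wsv = statistics.mean([(a - b) ** 2 / 2 for a, b in pairs])
    return 2.77 * wsv ** 0.5  # 2.77 = 1.96 * sqrt(2)

truth = 10.0  # known phantom value
measurements = [10.2, 9.9, 10.4, 10.1]
pairs = [(10.2, 10.0), (9.8, 10.1), (10.3, 10.2)]  # (test, retest)
print(round(bias(measurements, truth), 2),
      round(repeatability_coefficient(pairs), 2))
```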

  11. Room for Improvement: Performance Evaluations.

    ERIC Educational Resources Information Center

    Webb, Gisela

    1989-01-01

    Describes a performance management approach to library personnel management that stresses communication, clarification of goals, and reinforcement of new practices and behaviors. Each phase of the evaluation process (preparation, rating, administrative review, appraisal interview, and follow-up) and special evaluations to be used in cases of…

  12. Quantitative Synthesis: An Actuarial Base for Planning Impact Evaluations.

    ERIC Educational Resources Information Center

    Cordray, David S.; Sonnefeld, L. Joseph

    1985-01-01

    There are numerous micro-level methods decisions associated with planning an impact evaluation. Quantitative synthesis methods can be used to construct an actuarial data base for establishing the likelihood of achieving desired sample sizes, statistical power, and measurement characteristics. (Author/BS)

  13. Performance comparison between static and dynamic cardiac CT on perfusion quantitation and patient classification tasks

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2015-03-01

    Cardiac CT acquisitions for perfusion assessment can be performed in a dynamic or static mode. In this simulation study, we evaluate the relative classification and quantification performance of these modes for assessing myocardial blood flow (MBF). In the dynamic method, a series of low dose cardiac CT acquisitions yields data on contrast bolus dynamics over time; these data are fit with a model to give a quantitative MBF estimate. In the static method, a single CT acquisition is obtained, and the relative CT numbers in the myocardium are used to infer perfusion states. The static method does not directly yield a quantitative estimate of MBF, but these estimates can be roughly approximated by introducing assumed linear relationships between CT number and MBF, consistent with the ways such images are typically visually interpreted. Data obtained by either method may be used for a variety of clinical tasks, including 1) stratifying patients into differing categories of ischemia and 2) using the quantitative MBF estimate directly to evaluate ischemic disease severity. Through simulations, we evaluate the performance on each of these tasks. The dynamic method has very low bias in MBF estimates, making it particularly suitable for quantitative estimation. At matched radiation dose levels, ROC analysis demonstrated that the static method, with its high bias but generally lower variance, has superior performance in stratifying patients, especially for larger patients.

  14. The supervisor's performance appraisal: evaluating the evaluator.

    PubMed

    McConnell, C R

    1993-04-01

    The focus of much performance appraisal in the coming decade or so will likely be on the level of customer satisfaction achieved through performance. Ultimately, evaluating the evaluator--that is, appraising the supervisor--will likely become a matter of assessing how well the supervisor's department meets the needs of its customers. Since meeting the needs of one's customers can well become the strongest determinant of organizational success or failure, it follows that relative success in ensuring these needs are met can become the primary indicator of one's relative success as a supervisor. This has the effect of placing the emphasis on supervisory performance exactly at the point it belongs, right on the bottom-line results of the supervisor's efforts.

  15. Quantitative evaluation of phase processing approaches in susceptibility weighted imaging

    NASA Astrophysics Data System (ADS)

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2012-03-01

    Susceptibility weighted imaging (SWI) takes advantage of the local variation in susceptibility between different tissues to enable highly detailed visualization of the cerebral venous system and sensitive detection of intracranial hemorrhages. Thus, it has been increasingly used in magnetic resonance imaging studies of traumatic brain injury as well as other intracranial pathologies. In SWI, magnitude information is combined with phase information to enhance the susceptibility-induced image contrast. Because of global susceptibility variations across the image, the rate of phase accumulation varies widely, resulting in phase-wrapping artifacts that interfere with the local assessment of phase variation. Homodyne filtering is a common approach to eliminating this global phase variation. However, the filter size requires careful selection in order to preserve image contrast and avoid errors resulting from residual phase wraps. An alternative approach is to apply phase unwrapping prior to high-pass filtering. A suitable phase unwrapping algorithm guarantees no residual phase wraps, but additional computational steps are required. In this work, we quantitatively evaluate these two phase processing approaches on both simulated and real data using different filters and cutoff frequencies. Our analysis leads to an improved understanding of the relationship between phase wraps, susceptibility effects, and acquisition parameters. Although homodyne filtering approaches are faster and more straightforward, phase unwrapping approaches perform more accurately in a wider variety of acquisition scenarios.
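    The two phase-processing pipelines compared in this abstract can be sketched in one dimension (a minimal numpy sketch; the ramp slope, feature width, and 65-sample boxcar filter are illustrative assumptions, not the paper's parameters):

```python
import numpy as np

# Synthetic 1-D phase: a slow global ramp (background field) plus a narrow
# local dip (the susceptibility feature of interest), wrapped into (-pi, pi].
x = np.linspace(0.0, 1.0, 512)
local = -0.5 * np.exp(-((x - 0.5) / 0.02) ** 2)
phase = np.angle(np.exp(1j * (4 * np.pi * x + local)))

kernel = np.ones(65) / 65.0

# Approach 1: homodyne-style filtering -- divide the complex signal by its
# low-pass-filtered version so the slow ramp cancels without unwrapping.
sig = np.exp(1j * phase)
low = np.convolve(sig, kernel, mode="same")
homodyne = np.angle(sig * np.conj(low / np.abs(low)))

# Approach 2: unwrap the phase first, then remove the smooth trend (high-pass).
unwrapped = np.unwrap(phase)
highpass = unwrapped - np.convolve(unwrapped, kernel, mode="same")

# Away from the edges, both estimates recover the local dip near x = 0.5.
```

    The trade-off in the abstract shows up directly: the homodyne result depends on the filter size relative to the local phase rate, while the unwrap-then-filter route costs an extra pass but never leaves residual wraps.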

  16. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability to assign quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software packages assign higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software package to this genome length bias. Therefore, we have made a simple benchmark for the evaluation of taxon counting in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on this simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and that the other two packages were under-performers.
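    The benchmark's counting logic can be sketched numerically (the genome lengths and read length below are hypothetical; real reads would be sequence fragments, but the proportionality argument is the same):

```python
# Three hypothetical genomes at equal copy number; only the lengths differ.
genome_lengths = {"taxA": 1_000_000, "taxB": 3_000_000, "taxC": 6_000_000}
read_len = 150

# Shredding each genome yields a read count proportional to genome length:
# this is exactly the "genome length bias" the benchmark is designed to expose.
reads = {t: n // read_len for t, n in genome_lengths.items()}
total = sum(reads.values())
raw_share = {t: c / total for t, c in reads.items()}  # ~10% / 30% / 60%

# Normalizing read counts by genome length recovers the true, equal taxon
# proportions (~1/3 each), which is what the ideal tool should report.
per_base = {t: reads[t] / genome_lengths[t] for t in reads}
s = sum(per_base.values())
corrected_share = {t: v / s for t, v in per_base.items()}
```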

  17. Quantitative Evaluation of Plant Actin Cytoskeletal Organization During Immune Signaling.

    PubMed

    Lu, Yi-Ju; Day, Brad

    2017-01-01

    High spatial and temporal resolution microscopy-based methods are valuable tools for the precise real-time imaging of changes in cellular organization in response to stimulus perception. Here, we describe a quantitative method for the evaluation of the plant actin cytoskeleton during immune stimulus perception and the activation of defense signaling. As a measure of the biotic stress-induced changes in actin filament organization, we present methods for analyzing changes in actin filament organization following elicitation of pattern-triggered immunity and effector-triggered immunity. Using these methods, it is possible to not only quantitatively evaluate changes in actin cytoskeletal organization following biotic stress perception, but to also use these protocols to assess changes in actin filament organization following perception of a wide range of stimuli, including abiotic and developmental cues. As described herein, we present an example application of this method, designed to evaluate changes in actin cytoskeletal organization following pathogen perception and immune signaling.

  18. Quantitative autoradiographic microimaging in the development and evaluation of radiopharmaceuticals

    SciTech Connect

    Som, P.; Oster, Z.H.

    1994-04-01

    Autoradiographic (ARG) microimaging is the method for depicting the biodistribution of radiocompounds with the highest spatial resolution. ARG is applicable to gamma, positron and negatron emitting radiotracers. Dual or multiple-isotope studies can be performed using half-lives and energies for discrimination of isotopes. Quantitation can be performed by digital videodensitometry and by newer filmless technologies. ARGs obtained at different time intervals provide the time dimension for determination of kinetics.

  19. [Quantitative evaluation of soil hyperspectra denoising with different filters].

    PubMed

    Huang, Ming-Xiang; Wang, Ke; Shi, Zhou; Gong, Jian-Hua; Li, Hong-Yi; Chen, Jie-Liang

    2009-03-01

    The noise distribution of soil hyperspectra measured by an ASD FieldSpec Pro FR was described, and the quantitative evaluation of spectral denoising with six filters was compared. From interpretation of the soil hyperspectra and of the continuum-removed, first-order differential and high-frequency curves, the UV/VNIR (350-1050 nm) exhibits hardly any noise except in the first 40 nm beginning at 350 nm. However, the SWIR (1000-2500 nm) shows a different noise distribution. In particular, the latter half of SWIR 2 (1800-2500 nm) showed more noise, and the intersection spectra of the three spectrometers have more noise than the neighboring spectra. Six filters were chosen for spectral denoising. A smoothing index (SI), horizontal feature reservation index (HFRI) and vertical feature reservation index (VFRI) were designed for evaluating the denoising performance of these filters. Comparison of these indices shows that the WD and MA filters are the optimal choice for filtering the noise, in terms of balancing the contradiction between smoothing ability and feature-reservation ability. Furthermore, the first-order differential data of 66 denoised soil spectra from the 6 filters were respectively used as input to the same PLSR model to predict sand content. The different prediction accuracies caused by the different filters show that, compared to feature-reservation ability, a filter's smoothing ability is the principal factor influencing accuracy. The study can benefit spectral preprocessing and analysis, and also provides a scientific foundation for related spectroscopy applications.
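    The evaluation idea can be sketched as follows. The paper's exact SI/HFRI/VFRI formulas are not given in this abstract, so the roughness and feature-depth measures below are illustrative stand-ins for smoothing ability and feature-reservation ability:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy reflectance spectrum: smooth baseline plus one narrow absorption
# feature, with noise that grows toward long wavelengths (as described for
# the SWIR region in the abstract).
w = np.linspace(0.0, 1.0, 600)
clean = 0.4 + 0.2 * w - 0.15 * np.exp(-((w - 0.7) / 0.01) ** 2)
spectrum = clean + rng.normal(0.0, 0.002 + 0.02 * w)

def moving_average(y, n=9):
    return np.convolve(y, np.ones(n) / n, mode="same")

# Stand-in indices: lower roughness = better smoothing; a larger surviving
# feature depth = better feature reservation.
def roughness(y):
    return float(np.mean(np.diff(y) ** 2))

def feature_depth(y):
    i = int(np.argmin(np.abs(w - 0.7)))
    window = y[i - 30:i + 30]
    return float(window.max() - window.min())

denoised = moving_average(spectrum)
```

    A good filter drives the roughness down sharply while leaving most of the absorption feature intact; a filter that smooths too aggressively would erase the feature along with the noise.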

  20. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Performance evaluation... Architect-Engineer Services 236.604 Performance evaluation. Prepare a separate performance evaluation after... familiar with the architect-engineer contractor's performance....

  1. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  3. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  4. Quantitatively evaluating the CBM reservoir using logging data

    NASA Astrophysics Data System (ADS)

    Liu, Zhidi; Zhao, Jingzhou

    2016-02-01

    In order to evaluate coal bed methane (CBM) reservoirs, this paper selects five parameters: porosity, permeability, CBM content, the coal structure index and effective thickness of the coal seam. Making full use of logging data and laboratory analysis of coal cores, the logging evaluation methods for the five parameters were discussed in detail, and a comprehensive evaluation model of the CBM reservoir was established. The #5 coal seam of the Hancheng mine on the eastern edge of the Ordos Basin in China was quantitatively evaluated using this method. The results show that the CBM reservoir in the study area is better than in the central and northern regions. The actual development of CBM shows that regions with a good reservoir have high gas production, indicating that the method introduced in this paper can evaluate CBM reservoirs more effectively.
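    A comprehensive evaluation model of this kind is often a normalized weighted sum over the chosen parameters. The sketch below is purely illustrative: the weights and favourable ranges are assumptions for demonstration, not values from the paper:

```python
def cbm_score(porosity, permeability, gas_content, structure_index, thickness,
              weights=(0.15, 0.25, 0.30, 0.15, 0.15)):
    # Illustrative comprehensive score: each parameter is normalized to [0, 1]
    # against an assumed favourable range, then combined as a weighted sum.
    # Assumed ranges: fraction, mD, m3/t, dimensionless, m (hypothetical).
    ranges = ((0.0, 0.10), (0.0, 5.0), (0.0, 20.0), (0.0, 1.0), (0.0, 10.0))
    params = (porosity, permeability, gas_content, structure_index, thickness)
    norm = [min(max((p - lo) / (hi - lo), 0.0), 1.0)
            for p, (lo, hi) in zip(params, ranges)]
    return sum(wt * n for wt, n in zip(weights, norm))
```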

  5. Quantitative evaluation of CBM reservoir fracturing quality using logging data

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoyan

    2017-03-01

    This paper presents a method for the quantitative evaluation of the fracturing quality of coalbed methane (CBM) reservoirs using logging data, which will help optimize the choice of fracturing layer. First, to make full use of logging and laboratory analysis data from coal cores, a method to determine the brittleness index of CBM reservoirs is derived using coal industrial components. Second, this paper briefly introduces methodology to compute the horizontal principal stress difference coefficient of coal seams and the minimum horizontal principal stress difference of coal seams and roof and floor. Third, an evaluation model for the coal structure index is established using logging data, which fully accounts for the effect of coal structure on the fracturing quality of CBM reservoirs. Fourth, the development degree of the coal reservoir is evaluated. The evaluation standard for fracturing quality of CBM reservoirs based on these five evaluation parameters is used for quantitative evaluation. The results show that the combination of methods proposed in this paper is effective, and the results are consistent with the fracturing dynamic drainage. A coal seam with a large brittleness index, a large stress difference between the coal seam and roof and floor, a small stress difference coefficient and a high coal structure index has strong fracturing quality.

  6. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease

    PubMed Central

    van Gilst, Merel M.; van Mierlo, Petra; Bloem, Bastiaan R.; Overeem, Sebastiaan

    2015-01-01

    Study Objectives: Many people with Parkinson disease experience “sleep benefit”: temporarily improved mobility upon awakening. Here we used quantitative motor tasks to assess the influence of sleep on motor functioning in Parkinson disease. Design: Eighteen Parkinson patients with and 20 without subjective sleep benefit and 20 healthy controls participated. Before and directly after a regular night of sleep and an afternoon nap, subjects performed the timed pegboard dexterity task and a quantified finger tapping task. Subjective ratings of motor functioning and mood/vigilance were included. Sleep was monitored using polysomnography. Results: On both tasks, patients were overall slower than healthy controls (night: F2,55 = 16.938, P < 0.001; nap: F2,55 = 15.331, P < 0.001). On the pegboard task, there was a small overall effect of night sleep (F1,55 = 9.695, P = 0.003); both patients and controls were on average slightly slower in the morning. However, on both tasks there was no sleep*group interaction, either for nighttime sleep or for the afternoon nap. There was a modest correlation between the score on the pegboard task and self-rated motor symptoms among patients (rho = 0.233, P = 0.004). No correlations between task performance and mood/vigilance or sleep time/efficiency were found. Conclusions: A positive effect of sleep on motor function is commonly reported by Parkinson patients. Here we show that the subjective experience of sleep benefit is not paralleled by an actual improvement in motor functioning. Sleep benefit therefore appears to be a subjective phenomenon and not a Parkinson-specific reduction in symptoms. Citation: van Gilst MM, van Mierlo P, Bloem BR, Overeem S. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease. SLEEP 2015;38(10):1567–1573. PMID:25902811

  7. Quantitative evaluation of hybridization and the impact on biodiversity conservation.

    PubMed

    van Wyk, Anna M; Dalton, Desiré L; Hoban, Sean; Bruford, Michael W; Russo, Isa-Rita M; Birss, Coral; Grobler, Paul; van Vuuren, Bettine Janse; Kotzé, Antoinette

    2017-01-01

    Anthropogenic hybridization is an increasing conservation threat worldwide. In South Africa, recent hybridization is threatening numerous ungulate taxa. For example, the genetic integrity of the near-threatened bontebok (Damaliscus pygargus pygargus) is threatened by hybridization with the more common blesbok (D. p. phillipsi). Identifying nonadmixed parental and admixed individuals is challenging based on morphological traits alone; however, molecular analyses may allow for accurate detection. Once hybrids are identified, population simulation software may assist in determining the optimal conservation management strategy, although quantitative evaluation of hybrid management is rarely performed. In this study, our objectives were to describe species-wide and localized rates of hybridization in nearly 3,000 individuals based on 12 microsatellite loci, quantify the accuracy of hybrid assignment software (STRUCTURE and NEWHYBRIDS), and determine an optimal threshold of bontebok ancestry for management purposes. According to multiple methods, we identified 2,051 bontebok, 657 hybrids, and 29 blesbok. More than two-thirds of locations contained at least some hybrid individuals, with populations varying in the degree of introgression. HYBRIDLAB was used to simulate four generations of coexistence between bontebok and blesbok, and to optimize a threshold of ancestry, where most hybrids will be detected and removed, and the fewest nonadmixed bontebok individuals misclassified as hybrids. Overall, a threshold Q-value (admixture coefficient) of 0.90 would remove 94% of hybrid animals, while a threshold of 0.95 would remove 98% of hybrid animals but also 8% of nonadmixed bontebok. To this end, a threshold of 0.90 was identified as optimal and has since been implemented in formal policy by a provincial nature conservation agency. Due to widespread hybridization, effective conservation plans should be established and enforced to conserve native populations that are
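    The threshold-optimization logic can be sketched with simulated admixture coefficients (the distributions below are hypothetical stand-ins, not HYBRIDLAB output; the real study simulated four generations of coexistence):

```python
import random

random.seed(42)

# Hypothetical Q-values (proportion of bontebok ancestry): nonadmixed
# bontebok cluster near 1.0, while simulated hybrids sit substantially lower.
bontebok = [min(1.0, random.gauss(0.97, 0.03)) for _ in range(2000)]
hybrids = [random.gauss(0.65, 0.15) for _ in range(650)]

def flagged_fraction(q_values, threshold):
    # Animals whose Q falls below the threshold are flagged for removal.
    return sum(q < threshold for q in q_values) / len(q_values)

# Raising the threshold removes more hybrids but also misclassifies more
# nonadmixed bontebok -- the trade-off the study's 0.90 threshold balances.
for t in (0.90, 0.95):
    removed = flagged_fraction(hybrids, t)
    misclassified = flagged_fraction(bontebok, t)
```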

  8. Quantitative imaging to evaluate malignant potential of IPMNs

    PubMed Central

    Hanania, Alexander N.; Bantis, Leonidas E.; Feng, Ziding; Wang, Huamin; Tamm, Eric P.; Katz, Matthew H.; Maitra, Anirban; Koay, Eugene J.

    2016-01-01

    Objective To investigate using quantitative imaging to assess the malignant potential of intraductal papillary mucinous neoplasms (IPMNs) in the pancreas. Background Pancreatic cysts are identified in over 2% of the population and a subset of these, including intraductal papillary mucinous neoplasms (IPMNs), represent pre-malignant lesions. Unfortunately, clinicians cannot accurately predict which of these lesions are likely to progress to pancreatic ductal adenocarcinoma (PDAC). Methods We investigated 360 imaging features within the domains of intensity, texture and shape using pancreatic protocol CT images in 53 patients diagnosed with IPMN (34 “high-grade” [HG] and 19 “low-grade” [LG]) who subsequently underwent surgical resection. We evaluated the performance of these features as well as the Fukuoka criteria for pancreatic cyst resection. Results In our cohort, the Fukuoka criteria had a false positive rate of 36%. We identified 14 imaging biomarkers within Gray-Level Co-Occurrence Matrix (GLCM) that predicted histopathological grade within cyst contours. The most predictive marker differentiated LG and HG lesions with an area under the curve (AUC) of 0.82 at a sensitivity of 85% and specificity of 68%. Using a cross-validated design, the best logistic regression yielded an AUC of 0.96 (σ = 0.05) at a sensitivity of 97% and specificity of 88%. Based on the principal component analysis, HG IPMNs demonstrated a pattern of separation from LG IPMNs. Conclusions HG IPMNs appear to have distinct imaging properties. Further validation of these findings may address a major clinical need in this population by identifying those most likely to benefit from surgical resection. PMID:27588410
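    A Gray-Level Co-Occurrence Matrix feature of the kind evaluated here can be sketched in a few lines (a minimal single-offset GLCM with the standard contrast feature; the paper's 360-feature pipeline is far richer):

```python
import numpy as np

def glcm(img, levels):
    # Gray-Level Co-Occurrence Matrix for the horizontal neighbor offset
    # (0, 1): counts how often gray level i sits immediately left of level j.
    m = np.zeros((levels, levels))
    a, b = img[:, :-1].ravel(), img[:, 1:].ravel()
    np.add.at(m, (a, b), 1)
    return m / m.sum()

def contrast(p):
    # Standard GLCM contrast: sum of p(i, j) * (i - j)^2.
    i, j = np.indices(p.shape)
    return float(np.sum(p * (i - j) ** 2))

# Toy illustration: a uniform region has zero contrast; a checkerboard,
# where every horizontal neighbor differs, has contrast 1.
flat = np.zeros((8, 8), dtype=int)
checker = np.indices((8, 8)).sum(0) % 2
```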

  9. Evaluation of four genes in rice for their suitability as endogenous reference standards in quantitative PCR.

    PubMed

    Wang, Chong; Jiang, Lingxi; Rao, Jun; Liu, Yinan; Yang, Litao; Zhang, Dabing

    2010-11-24

    Quantification of genetically modified (GM) food/feed depends on reliable detection systems for endogenous reference genes. Currently, four endogenous rice reference genes, namely sucrose phosphate synthase (SPS), GOS9, phospholipase D (PLD), and ppi phosphofructokinase (ppi-PPF), have been used in GM rice detection. To compare the applicability of these four rice reference genes in quantitative PCR systems, we analyzed target nucleotide sequence variation in 58 conventional rice varieties from various geographic and phylogenic origins, and their quantification performance was evaluated using quantitative real-time PCR and GeNorm analysis, via a series of statistical calculations yielding an "M value" that is negatively correlated with gene stability. The sequencing analysis showed that the reported GOS9 and PLD TaqMan probe regions had detectable single nucleotide polymorphisms (SNPs) among the tested rice cultivars, while no SNPs were observed in the SPS and ppi-PPF amplicons. Poor quantitative performance was also detected in the cultivars with SNPs using the GOS9 and PLD quantitative PCR systems. Even though the PCR efficiency of the ppi-PPF system was slightly lower, the SPS and ppi-PPF quantitative PCR systems were shown to be applicable for rice endogenous reference assays, with less variation among C(t) values, good reproducibility in quantitative assays, and low M values in the comprehensive quantitative PCR comparison and GeNorm analysis.
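    The GeNorm "M value" mentioned here is the mean pairwise variation of a candidate gene against all other candidates: for each pair, take the standard deviation across samples of the log2 expression ratio. A minimal sketch (toy expression values, not the paper's data):

```python
import math
from statistics import stdev

def genorm_m(expr):
    # expr: {gene: [relative expression per sample]}.
    # Lower M = more stable, hence a better reference gene.
    genes = list(expr)
    m = {}
    for g in genes:
        pairwise = []
        for k in genes:
            if k == g:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expr[g], expr[k])]
            pairwise.append(stdev(ratios))
        m[g] = sum(pairwise) / len(pairwise)
    return m

# Two genes that track each other (constant ratio) score low; a gene whose
# expression fluctuates independently scores high.
expr = {"SPS": [1.0, 2.0, 4.0], "ppi_PPF": [1.1, 2.2, 4.4], "GOS9": [1.0, 8.0, 2.0]}
m = genorm_m(expr)
```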

  10. High-performance quantitative robust switching control for optical telescopes

    NASA Astrophysics Data System (ADS)

    Lounsbury, William P.; Garcia-Sanz, Mario

    2014-07-01

    This paper introduces an innovative robust and nonlinear control design methodology for high-performance servosystems in optical telescopes. The dynamics of optical telescopes typically vary according to azimuth and altitude angles, temperature, friction, speed and acceleration, leading to nonlinearities and plant parameter uncertainty. The methodology proposed in this paper combines robust Quantitative Feedback Theory (QFT) techniques with nonlinear switching strategies that achieve simultaneously the best characteristics of a set of very active (fast) robust QFT controllers and very stable (slow) robust QFT controllers. A general dynamic model and a variety of specifications from several different commercially available amateur Newtonian telescopes are used for the controller design as well as the simulation and validation. It is also proven that the nonlinear/switching controller is stable for any switching strategy and switching velocity, according to described frequency conditions based on common quadratic Lyapunov functions (CQLF) and the circle criterion.

  11. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the asbestos content of rock matrices is a complex operation which is susceptible to significant errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including better representativity of the analyzed sample, more effective recognition of chrysotile, and lower cost. The DIATI LAA internal methodology for PCOM analysis is based on mild grinding of a rock sample, its subdivision into 5-6 grain-size classes smaller than 2 mm, and subsequent microscopic analysis of a portion of each class. PCOM is based on the optical properties of asbestos and of the liquids of known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. For airborne filters, a binomial (Poisson) distribution, which theoretically defines the variation in the count of fibers resulting from the observation of analysis fields chosen randomly on the filter, can be applied. The analysis of rock matrices instead cannot lean on any statistical distribution, because the most important object of the analysis is the size of the asbestiform fibers and bundles of fibers observed, and the resulting relationship between the weight of the fibrous component and that of the granular one. The error evaluation generally provided by public and private institutions varies between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or that link it to the asbestos content. Our work aims to provide a reliable estimation of the error in relation to the applied methodologies and to the total asbestos content, especially for values close to the legal limits.
The error assessments must

  12. Chinese Middle School Teachers' Preferences Regarding Performance Evaluation Measures

    ERIC Educational Resources Information Center

    Liu, Shujie; Xu, Xianxuan; Stronge, James H.

    2016-01-01

    Teacher performance evaluation currently is receiving unprecedented attention from policy makers, scholars, and practitioners worldwide. This study is one of the few studies of teacher perceptions regarding teacher performance measures that focus on China. We employed a quantitative dominant mixed research design to investigate Chinese teachers'…

  13. A Qualitative and Quantitative Evaluation of 8 Clear Sky Models.

    PubMed

    Bruneton, Eric

    2016-10-27

    We provide a qualitative and quantitative evaluation of 8 clear sky models used in Computer Graphics. We compare the models with each other as well as with measurements and with a reference model from the physics community. After a short summary of the physics of the problem, we present the measurements and the reference model, and how we "invert" it to get the model parameters. We then give an overview of each CG model, and detail its scope, its algorithmic complexity, and its results using the same parameters as in the reference model. We also compare the models with a perceptual study. Our quantitative results confirm that the fewer simplifications and approximations used to solve the physical equations, the more accurate the results. We conclude with a discussion of the advantages and drawbacks of each model, and how to further improve their accuracy.

  14. Quantitative projections of a quality measure: Performance of a complex task

    NASA Astrophysics Data System (ADS)

    Christensen, K.; Kleppe, Gisle; Vold, Martin; Frette, Vidar

    2014-12-01

    Complex data series that arise during interaction between humans (operators) and advanced technology in a controlled and realistic setting have been explored. The purpose is to obtain quantitative measures that reflect quality in task performance: on a ship simulator, nine crews have solved the same exercise, and detailed maneuvering histories have been logged. There are many degrees of freedom, some of them connected to the fact that the vessels may be freely moved in any direction. To compare maneuvering histories, several measures were used: the time needed to reach the position of operation, the integrated angle between the hull direction and the direction of motion, and the extent of movement when the vessel is to be manually kept in a fixed position. These measures are expected to reflect quality in performance. We have also obtained expert quality evaluations of the crews. The quantitative measures and the expert evaluations, taken together, allow a ranking of crew performance. However, except for time and integrated angle, there is no correlation between the individual measures. This may indicate that complex situations with social and man-machine interactions need complex measures of quality in task performance. In general terms, we have established a context-dependent and flexible framework with quantitative measures in contact with a social-science concept that is hard to define. This approach may be useful for other (qualitative) concepts in social science that contain important information on the society.
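    One of the maneuvering measures, the integrated angle between the hull direction and the direction of motion, can be sketched directly (a minimal sketch with a hypothetical logged trajectory; headings are in radians, measured counterclockwise from the +x axis):

```python
import math

def integrated_drift_angle(positions, headings, dt=1.0):
    # Sum of the absolute angle between the hull direction (heading) and the
    # direction of motion (course between successive logged positions).
    total = 0.0
    for (x0, y0), (x1, y1), h in zip(positions, positions[1:], headings):
        course = math.atan2(y1 - y0, x1 - x0)
        diff = (h - course + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
        total += abs(diff) * dt
    return total

# A vessel moving straight ahead accumulates no angle; one "crabbing"
# sideways (heading at 90 degrees to its motion) accumulates it steadily.
path = [(float(i), 0.0) for i in range(6)]
aligned = integrated_drift_angle(path, [0.0] * 5)
crabbing = integrated_drift_angle(path, [math.pi / 2] * 5)
```

    A lower integrated angle indicates motion aligned with the hull, one plausible facet of maneuvering quality in the exercise described above.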

  15. The Quantitative Science of Evaluating Imaging Evidence.

    PubMed

    Genders, Tessa S S; Ferket, Bart S; Hunink, M G Myriam

    2017-03-01

    Cardiovascular diagnostic imaging tests are increasingly used in everyday clinical practice, but are often imperfect, just like any other diagnostic test. The performance of a cardiovascular diagnostic imaging test is usually expressed in terms of sensitivity and specificity compared with the reference standard (gold standard) for diagnosing the disease. However, evidence-based application of a diagnostic test also requires knowledge about the pre-test probability of disease, the benefit of making a correct diagnosis, the harm caused by false-positive imaging test results, and potential adverse effects of performing the test itself. To assist in clinical decision making regarding appropriate use of cardiovascular diagnostic imaging tests, we reviewed quantitative concepts related to diagnostic performance (e.g., sensitivity, specificity, predictive values, likelihood ratios), as well as possible biases and solutions in diagnostic performance studies, Bayesian principles, and the threshold approach to decision making.
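    The quantitative concepts reviewed here combine in a simple way: a likelihood ratio converts a pre-test probability into a post-test probability via Bayes' rule on the odds scale. A minimal numerical sketch (function names are ours, not from the paper):

    ```python
    def likelihood_ratios(sensitivity, specificity):
        """Positive and negative likelihood ratios from test characteristics."""
        lr_pos = sensitivity / (1 - specificity)
        lr_neg = (1 - sensitivity) / specificity
        return lr_pos, lr_neg

    def post_test_probability(pre_test_prob, lr):
        """Apply Bayes' rule on the odds scale: post-odds = pre-odds * LR."""
        pre_odds = pre_test_prob / (1 - pre_test_prob)
        post_odds = pre_odds * lr
        return post_odds / (1 + post_odds)
    ```

    For a test with sensitivity 0.90 and specificity 0.80, LR+ is 4.5, and a 20% pre-test probability rises to roughly 53% after a positive result.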

  16. Metrics for Offline Evaluation of Prognostic Performance

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2010-01-01

    Prognostic performance evaluation has gained significant attention in the past few years. Currently, prognostics concepts lack standard definitions and suffer from ambiguous and inconsistent interpretations. This lack of standards is in part due to the varied end-user requirements for different applications, time scales, available information, domain dynamics, and so on. The research community has used a variety of metrics largely based on convenience and their respective requirements. Very little attention has been focused on establishing a standardized approach to compare different efforts. This paper presents several new evaluation metrics tailored for prognostics that were recently introduced and were shown to effectively evaluate various algorithms as compared to other conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. These metrics have the capability of incorporating probabilistic uncertainty estimates from prognostic algorithms. In addition to quantitative assessment they also offer a comprehensive visual perspective that can be used in designing the prognostic system. Several methods are suggested to customize these metrics for different applications. Guidelines are provided to help choose one method over another based on distribution characteristics. Various issues faced by prognostics and its performance evaluation are discussed followed by a formal notational framework to help standardize subsequent developments.

  17. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
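    The mean-square-error criterion itself is easy to demonstrate directly; a toy spatial-domain comparison (our own construction, not the paper's frequency-domain analysis) of zero-order and piecewise-linear reconstruction of a sampled function:

    ```python
    import math

    def sample(f, n, a=0.0, b=1.0):
        """Sample f at n uniformly spaced points on [a, b]."""
        xs = [a + (b - a) * i / (n - 1) for i in range(n)]
        return xs, [f(x) for x in xs]

    def reconstruct_nearest(xs, ys, x):
        """Zero-order (nearest-neighbour) reconstruction."""
        i = min(range(len(xs)), key=lambda j: abs(xs[j] - x))
        return ys[i]

    def reconstruct_linear(xs, ys, x):
        """First-order (piecewise-linear) reconstruction."""
        for i in range(len(xs) - 1):
            if xs[i] <= x <= xs[i + 1]:
                t = (x - xs[i]) / (xs[i + 1] - xs[i])
                return (1 - t) * ys[i] + t * ys[i + 1]
        return ys[-1]

    def mean_square_error(f, xs, ys, recon, m=2000):
        """MSE between f and its reconstruction, averaged over a fine grid."""
        grid = [xs[0] + (xs[-1] - xs[0]) * i / (m - 1) for i in range(m)]
        return sum((f(x) - recon(xs, ys, x)) ** 2 for x in grid) / m

    f = lambda x: math.sin(2 * math.pi * x)
    xs, ys = sample(f, 16)
    ```

    On this band-limited example, the higher-order interpolant gives the smaller mean square error, which is the kind of ranking the paper's criterion formalizes.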

  18. Longitudinal flexural mode utility in quantitative guided wave evaluation

    NASA Astrophysics Data System (ADS)

    Li, Jian

    2001-07-01

    Longitudinal non-axisymmetric flexural mode utility in quantitative guided wave evaluation is examined for pipe and tube inspection. Attention is focused on hollow cylinders. Several source loading problems such as a partial-loading angle beam, an axisymmetric comb transducer and an angle beam array are studied. The Normal Mode Expansion method is employed to simulate the generated guided wave fields. For non-axisymmetric sources, an important angular profile feature is studied. Based on numerical calculations, an angular profile varies with frequency, mode and propagating distance. Since an angular profile determines the energy distribution of the guided waves, the angular profile has a great impact on the pipe inspection capability of guided waves. The simulation of non-axisymmetric angular profiles generated by partial-loading is verified by experiments. An angular profile is the superposition of harmonic axisymmetric and non-axisymmetric modes with various phase velocities. A simpler equation is derived to calculate the phase velocities of the non-axisymmetric guided waves and is used for discussing the characteristics of non-axisymmetric guided waves. Angular profiles have many applications in practical pipe testing. The procedure of building desired angular profiles and also angular profile tuning is discussed. This angular profile tuning process is implemented by a phased transducer array and a special computational algorithm. Since a transducer array plays a critical role in guided wave inspection, the performance of a transducer array is discussed in terms of guided wave mode control ability and excitation sensitivity. With time delay inputs, a transducer array is greatly improved for its mode control ability and sensitivity. The algorithms for setting time delays are derived based on frequency, element spacing and phase velocity. With the help of the conclusions drawn on non-axisymmetric guided waves, a phased circumferential partial-loading array is

  19. Quantitative fuel motion determination with the CABRI fast neutron hodoscope; Evaluation methods and results

    SciTech Connect

    Baumung, K. ); Augier, G. )

    1991-12-01

    The fast neutron hodoscope installed at the CABRI reactor in Cadarache, France, is employed to provide quantitative fuel motion data during experiments in which single liquid-metal fast breeder reactor test pins are subjected to simulated accident conditions. Instrument design and performance are reviewed, the methods for the quantitative evaluation are presented, and error sources are discussed. The most important findings are the axial expansion as a function of time, phenomena related to pin failure (such as time, location, pin failure mode, and fuel mass ejected after failure), and linear fuel mass distributions with a 2-cm axial resolution. In this paper the hodoscope results of the CABRI-1 program are summarized.

  20. Review of progress in quantitative nondestructive evaluation. Vol. 3B

    SciTech Connect

    Thompson, D.O.; Chimenti, D.E.

    1984-01-01

    This two-book volume constitutes the Proceedings of the Tenth Annual Review of Progress in Quantitative Nondestructive Evaluation held in California in 1983. Topics considered include nondestructive evaluation (NDE) reliability, ultrasonics (probability of detection, scattering, sizing, transducers, signal processing, imaging and reconstruction), eddy currents (probability of detection, modeling, sizing, probes), acoustic emission, thermal wave imaging, optical techniques, new techniques (e.g., maximum entropy reconstruction, near-surface inspection of flaws using bulk ultrasonic waves, inversion and reconstruction), composite materials, material properties, acoustoelasticity, residual stress, and new NDE systems (e.g., retirement-for-cause procedures for gas turbine engine components, pulsed eddy current flaw detection and characterization, an ultrasonic inspection protocol for IN100 jet engine materials, electromagnetic on-line monitoring of rotating turbine-generator components). Basic research and early engineering applications are emphasized.

  1. The Nuclear Renaissance — Implications on Quantitative Nondestructive Evaluations

    NASA Astrophysics Data System (ADS)

    Matzie, Regis A.

    2007-03-01

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that have actually turned away from nuclear energy are reconsidering the advisability of this decision. This renaissance provides the opportunity to deploy more advanced reactor designs than are operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  2. The Nuclear Renaissance - Implications on Quantitative Nondestructive Evaluations

    SciTech Connect

    Matzie, Regis A.

    2007-03-21

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that have actually turned away from nuclear energy are reconsidering the advisability of this decision. This renaissance provides the opportunity to deploy more advanced reactor designs than are operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  3. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Performance evaluation... Performance evaluation. Preparation of performance evaluation reports. (a) In addition to the requirements of FAR 36.604, performance evaluation reports shall be prepared for indefinite-delivery type...

  4. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times-mutually dependent in successive steps in the chain-cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
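    The discrete-event idea can be illustrated in miniature (our own sketch; the article's model of the fresh-cut lettuce chain is far richer): even a FIFO shelf with fixed daily arrivals and demand produces storage times that are mutually dependent across successive days through the queue state.

    ```python
    from collections import deque

    def simulate_fifo_storage(arrivals, demand, days):
        """Discrete-event sketch: products arrive each day, a FIFO shelf
        is depleted by daily demand; returns realized storage times (days)."""
        shelf = deque()          # each entry is the arrival day of one product
        storage_times = []
        for day in range(days):
            for _ in range(arrivals):
                shelf.append(day)
            for _ in range(min(demand, len(shelf))):
                storage_times.append(day - shelf.popleft())
        return storage_times
    ```

    When arrivals exceed demand, the queue lengthens and the tail of the storage-time distribution grows, which is exactly the kind of dependency that matters for the tails of microbial risk distributions.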

  5. A peptide-retrieval strategy enables significant improvement of quantitative performance without compromising confidence of identification.

    PubMed

    Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun

    2017-01-30

    Reliable quantification of low-abundance proteins in complex proteomes is challenging largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently-identified proteins were recovered, and the peptide-spectrum-match (PSM) FDR was recalculated and controlled at a confident level of FDR ≤ 1%, while the protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. Increases of >60% in total quantified spectra/peptides were achieved for a spike-in sample set and for a public dataset from CPTAC, respectively. Incorporating the peptide retrieval strategy significantly improved the quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity of confidently discovering significantly-altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, which can be readily implemented in a broad range of quantitative proteomics techniques including label-free or labeling approaches.
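    The target-decoy bookkeeping behind such a strategy can be sketched in a few lines. This is a schematic only (the field names, score threshold, and data are hypothetical, not the authors' pipeline): the decoy-to-target ratio estimates the FDR, and lower-scoring PSMs are retrieved only when they map to already-confident proteins.

    ```python
    def psm_fdr(psms):
        """Estimate PSM-level FDR from target-decoy counts: FDR ~ decoys / targets."""
        decoys = sum(1 for p in psms if p["decoy"])
        targets = len(psms) - decoys
        return decoys / targets if targets else 0.0

    def retrieve_peptides(all_psms, confident_proteins, score_cut):
        """Keep confident PSMs plus lower-scoring PSMs that map to
        already-confident proteins, then re-check the FDR of the kept set."""
        kept = [p for p in all_psms
                if p["score"] >= score_cut or p["protein"] in confident_proteins]
        return kept, psm_fdr(kept)
    ```

    In practice the recalculated FDR of the enlarged set would be compared against the 1% control level before accepting the retrieved peptides.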

  6. Evaluation of Lung Metastasis in Mouse Mammary Tumor Models by Quantitative Real-time PCR

    PubMed Central

    Abt, Melissa A.; Grek, Christina L.; Ghatnekar, Gautam S.; Yeh, Elizabeth S.

    2016-01-01

    Metastatic disease is the spread of malignant tumor cells from the primary cancer site to a distant organ and is the primary cause of cancer associated death. Common sites of metastatic spread include lung, lymph node, brain, and bone. Mechanisms that drive metastasis are intense areas of cancer research. Consequently, effective assays to measure metastatic burden in distant sites of metastasis are instrumental for cancer research. Evaluation of lung metastases in mammary tumor models is generally performed by gross qualitative observation of lung tissue following dissection. Quantitative methods of evaluating metastasis are currently limited to ex vivo and in vivo imaging based techniques that require user defined parameters. Many of these techniques are at the whole organism level rather than the cellular level. Although newer imaging methods utilizing multi-photon microscopy are able to evaluate metastasis at the cellular level, these highly elegant procedures are more suited to evaluating mechanisms of dissemination rather than quantitative assessment of metastatic burden. Here, a simple in vitro method to quantitatively assess metastasis is presented. Using quantitative Real-time PCR (QRT-PCR), tumor cell specific mRNA can be detected within the mouse lung tissue. PMID:26862835

  7. Evaluation of Lung Metastasis in Mouse Mammary Tumor Models by Quantitative Real-time PCR.

    PubMed

    Abt, Melissa A; Grek, Christina L; Ghatnekar, Gautam S; Yeh, Elizabeth S

    2016-01-29

    Metastatic disease is the spread of malignant tumor cells from the primary cancer site to a distant organ and is the primary cause of cancer associated death. Common sites of metastatic spread include lung, lymph node, brain, and bone. Mechanisms that drive metastasis are intense areas of cancer research. Consequently, effective assays to measure metastatic burden in distant sites of metastasis are instrumental for cancer research. Evaluation of lung metastases in mammary tumor models is generally performed by gross qualitative observation of lung tissue following dissection. Quantitative methods of evaluating metastasis are currently limited to ex vivo and in vivo imaging based techniques that require user defined parameters. Many of these techniques are at the whole organism level rather than the cellular level. Although newer imaging methods utilizing multi-photon microscopy are able to evaluate metastasis at the cellular level, these highly elegant procedures are more suited to evaluating mechanisms of dissemination rather than quantitative assessment of metastatic burden. Here, a simple in vitro method to quantitatively assess metastasis is presented. Using quantitative Real-time PCR (QRT-PCR), tumor cell specific mRNA can be detected within the mouse lung tissue.

  8. Formative Evaluation in the Performance Context.

    ERIC Educational Resources Information Center

    Dick, Walter; King, Debby

    1994-01-01

    Reviews the traditional formative evaluation model used by instructional designers; summarizes Kirkpatrick's model of evaluation; proposes the integration of part of Kirkpatrick's model with traditional formative evaluation; and discusses performance-context formative evaluation. (three references) (LRW)

  9. Evaluation of a quantitative plasma PCR plate assay for detecting cytomegalovirus infection in marrow transplant recipients.

    PubMed Central

    Gallez-Hawkins, G M; Tegtmeier, B R; ter Veer, A; Niland, J C; Forman, S J; Zaia, J A

    1997-01-01

    A plasma PCR test, using a nonradioactive PCR plate assay, was evaluated for detection of human cytomegalovirus reactivation. This assay was compared to Southern blotting and found to perform well. As a noncompetitive method of quantitation, it was similar to a competitive method for detecting the number of genome copies per milliliter of plasma in marrow transplant recipients. This is a technically simplified assay with potential for adaptation to automation. PMID:9041438

  10. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    ERIC Educational Resources Information Center

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)

  11. Quantitative surface evaluation by matching experimental and simulated ronchigram images

    NASA Astrophysics Data System (ADS)

    Kantún Montiel, Juana Rosaura; Cordero Dávila, Alberto; González García, Jorge

    2011-09-01

    To estimate surface errors qualitatively with the Ronchi test, the experimental and simulated ronchigrams are compared. Recently, surface errors have been obtained quantitatively by matching the intersection-point coordinates of ronchigram fringes with the x-axis. In this case, a Gaussian fit must be done for each fringe, and interference orders are used in the Malacara algorithm for the simulations. In order to evaluate surface errors, we added an error function in the simulations, described with cubic splines, to the sagitta function of the ideal surface. We used the vectorial transversal aberration formula and a ruling with cosinusoidal transmittance, because these rulings reproduce experimental ronchigram fringe profiles better. Several error functions are tried until the whole experimental ronchigram image is reproduced. The optimization process was done using genetic algorithms.

  12. Shadow photogrammetric apparatus for the quantitative evaluation of corneal buttons.

    PubMed

    Denham, D; Mandelbaum, S; Parel, J M; Holland, S; Pflugfelder, S; Parel, J M

    1989-11-01

    We have developed a technique for the accurate, quantitative, geometric evaluation of trephined and punched corneal buttons. A magnified shadow of the frontal and edge views of a corneal button mounted on the rotary stage of a modified optical comparator is projected onto the screen of the comparator and photographed. This process takes approximately three minutes. The diameters and edge profile at any meridian photographed can subsequently be analyzed from the film. The precision in measuring the diameters of well-cut corneal buttons is +/- 23 microns, and in measuring the angle of the edge profile is +/- 1 degree. Statistical analysis of interobserver variability indicated excellent reproducibility of measurements. Shadow photogrammetry offers a standardized, accurate, and reproducible method for analysis of corneal trephination.

  13. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  14. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  15. Quantitative evaluation of the transplanted lin(-) hematopoietic cell migration kinetics.

    PubMed

    Kašėta, Vytautas; Vaitkuvienė, Aida; Liubavičiūtė, Aušra; Maciulevičienė, Rūta; Stirkė, Arūnas; Biziulevičienė, Genė

    2016-02-01

    Stem cells take part in organogenesis, cell maturation and injury repair. The migration is necessary for each of these functions to occur. The aim of this study was to investigate the kinetics of the transplanted hematopoietic lin(-) cell population (which consists mainly of stem and progenitor cells) in a BALB/c mouse contact hypersensitivity model and quantify the migration to the site of inflammation in the affected foot and other healthy organs. Quantitative analysis was carried out with the real-time polymerase chain reaction method. Spleen, kidney, bone marrow, lung, liver, damaged and healthy foot tissue samples at different time points were collected for analysis. The quantitative data normalization was performed according to the comparative quantification method. The analysis of foot samples shows the significant migration of transplanted cells to the recipient mice affected foot. The quantity was more than 1000 times higher, as compared with that of the untreated foot. Due to the inflammation, the number of donor origin cells migrating to the lungs, liver, spleen and bone marrow was found to be decreased. Our data shows that transplanted cells selectively migrated into the inflammation areas of the foot edema. Also, the inflammation caused a secondary migration in ectopic hematopoietic stem cell niches of the spleen, and re-homing from the spleen to the bone marrow took place.
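    The comparative quantification referred to here is commonly the 2^-ΔΔCt method: the target Ct is normalized to a reference gene and then to a calibrator sample. A minimal sketch (the function name and Ct values below are ours, invented for illustration):

    ```python
    def delta_delta_ct(ct_target_sample, ct_ref_sample,
                       ct_target_calib, ct_ref_calib):
        """Relative quantity by the comparative (2^-ddCt) method."""
        d_sample = ct_target_sample - ct_ref_sample   # normalize to reference gene
        d_calib = ct_target_calib - ct_ref_calib      # same, in the calibrator
        return 2.0 ** -(d_sample - d_calib)           # fold change vs calibrator
    ```

    With target/reference Cts of 20/18 in the treated tissue and 25/18 in the calibrator, the relative quantity is 2^5 = 32-fold, illustrating how a ">1000 times higher" signal corresponds to a ΔΔCt of about -10.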

  16. Quantitative genetic activity graphical profiles for use in chemical evaluation

    SciTech Connect

    Waters, M.D.; Stack, H.F.; Garrett, N.E.; Jackson, M.A.

    1990-12-31

    A graphic approach, termed a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profiles was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiled by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. Through examination of the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in evaluation of chemical analogs. GAPs provided useful data for development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.

  17. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Performance evaluation... Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor performance as required in FAR 36.604. Normally, the performance report must be prepared by the...

  18. Quantitative Evaluation and Selection of Reference Genes for Quantitative RT-PCR in Mouse Acute Pancreatitis

    PubMed Central

    Yan, Zhaoping; Gao, Jinhang; Lv, Xiuhe; Yang, Wenjuan; Wen, Shilei; Tong, Huan; Tang, Chengwei

    2016-01-01

    The analysis of differences in gene expression is dependent on normalization using reference genes. However, the expression of many of these reference genes, as evaluated by quantitative RT-PCR, is upregulated in acute pancreatitis, so they cannot be used as the standard for gene expression in this condition. For this reason, we sought to identify a stable reference gene, or a suitable combination, for expression analysis in acute pancreatitis. The expression stability of 10 reference genes (ACTB, GAPDH, 18sRNA, TUBB, B2M, HPRT1, UBC, YWHAZ, EF-1α, and RPL-13A) was analyzed using geNorm, NormFinder, and BestKeeper software and evaluated according to variations in the raw Ct values. These reference genes were evaluated using a comprehensive method, which ranked the expression stability of these genes as follows (from most stable to least stable): RPL-13A, YWHAZ > HPRT1 > GAPDH > UBC > EF-1α > 18sRNA > B2M > TUBB > ACTB. RPL-13A was the most suitable reference gene, and the combination of RPL-13A and YWHAZ was the most stable group of reference genes in our experiments. The expression levels of ACTB, TUBB, and B2M were found to be significantly upregulated during acute pancreatitis, whereas the expression level of 18sRNA was downregulated. Thus, we recommend the use of RPL-13A or a combination of RPL-13A and YWHAZ for normalization in qRT-PCR analyses of gene expression in mouse models of acute pancreatitis. PMID:27069927
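    Of the tools named, BestKeeper is largely based on raw-Ct variability across samples. A simplified stand-in for that idea (not the actual software; the gene names and Ct values in the usage below are invented) ranks candidate reference genes by Ct standard deviation, lower being more stable:

    ```python
    import statistics

    def rank_reference_genes(ct_values):
        """Rank candidate reference genes by Ct standard deviation
        across samples (lower SD = more stable expression)."""
        scores = {gene: statistics.pstdev(cts) for gene, cts in ct_values.items()}
        return sorted(scores, key=scores.get)
    ```

    A gene whose Ct barely moves across conditions (e.g. RPL-13A here) ranks first, while one that shifts with the disease state (e.g. ACTB) ranks last and should not be used for normalization.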

  19. Predictive Heterosis in Multibreed Evaluations Using Quantitative and Molecular Approaches

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Heterosis is the extra genetic boost in performance obtained by crossing two cattle breeds. It is an important tool for increasing the efficiency of beef production. It is also important to adjust data used to calculate genetic evaluations for differences in heterosis. Good estimates of heterosis...

  20. Quantitative evaluation of magnetic immunoassay with remanence measurement

    NASA Astrophysics Data System (ADS)

    Enpuku, K.; Soejima, K.; Nishimoto, T.; Kuma, H.; Hamasaki, N.; Tsukamoto, A.; Saitoh, K.; Kandori, A.

    2006-05-01

    Magnetic immunoassays utilizing magnetic markers and a high-Tc SQUID have been performed. The marker was designed so as to generate remanence, and its remanence field was measured with the SQUID. The SQUID system was developed so as to measure 12 samples in one measurement sequence. We first detected the antigen human IgE using an IgE standard solution, and demonstrated detection of IgE down to 2 attomol. The binding process between IgE and the marker could be semi-quantitatively explained with the Langmuir-type adsorption model. We also measured IgE in human serums, and demonstrated the usefulness of the present method for practical diagnosis.

  1. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  2. Performance evaluation methodology for historical document image binarization.

    PubMed

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
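
    A minimal sketch of pixel-based binarization evaluation (plain recall, precision, and F-measure; the paper's bias-diminishing weighting scheme is omitted), on hypothetical flattened masks:

```python
def evaluate_binarization(ground_truth, result):
    """Pixel-based recall, precision and F-measure for a binarized
    image against ground truth (1 = text pixel, 0 = background)."""
    pairs = list(zip(ground_truth, result))
    tp = sum(1 for g, r in pairs if g == 1 and r == 1)
    fn = sum(1 for g, r in pairs if g == 1 and r == 0)
    fp = sum(1 for g, r in pairs if g == 0 and r == 1)
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    f_measure = 2 * recall * precision / (recall + precision)
    return recall, precision, f_measure

gt  = [1, 1, 1, 0, 0, 0, 1, 1]   # hypothetical flattened masks
res = [1, 1, 0, 0, 1, 0, 1, 1]
print(evaluate_binarization(gt, res))
```

    The weighting scheme in the paper adjusts these counts so that, for example, a missed thin stroke is penalized differently from a missed blob; the unweighted counts above are the starting point.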

  3. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for evaluating midpalatal suture maturation. Methods The study included 131 subjects over 18 years of age (range, 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimension were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A test in which fractal dimension was used to predict the variable splitting maturation stages into A–C versus D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
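
    Fractal dimension of a skeletonized image is commonly estimated by box counting; the sketch below, with a toy straight-line "skeleton", assumes that method and is not the authors' exact pipeline:

```python
import math

def box_count(points, box_size):
    """Number of boxes of side `box_size` containing at least one
    foreground pixel (points are (x, y) pixel coordinates)."""
    return len({(x // box_size, y // box_size) for x, y in points})

def fractal_dimension(points, sizes=(1, 2, 4, 8)):
    """Least-squares slope of log N(s) versus log(1/s)."""
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(points, s)) for s in sizes]
    n = len(sizes)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Sanity check: a straight line has fractal dimension 1.
line = [(i, i) for i in range(64)]
print(round(fractal_dimension(line), 2))
```

    A smoother (more fused) suture skeleton yields a lower dimension, which is consistent with the negative correlation with maturation stage reported above.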

  4. Quantitative evaluation of activation state in functional brain imaging.

    PubMed

    Hu, Zhenghui; Ni, Pengyu; Liu, Cong; Zhao, Xiaohu; Liu, Huafeng; Shi, Pengcheng

    2012-10-01

    Neuronal activity evokes the hemodynamic changes that give rise to the observed functional magnetic resonance imaging (fMRI) signal. These changes are also regulated by the resting blood volume fraction (V0) associated with regional vasculature. The activation locus detected by means of the change in blood-oxygen-level-dependent (BOLD) signal intensity may thereby deviate from the actual active site due to varied vascular density in the cortex. Furthermore, conventional detection techniques evaluate the statistical significance of the hemodynamic observations. In this sense, the significance level relies not only upon the intensity of the BOLD signal change, but also upon the spatially inhomogeneous fMRI noise distribution that complicates the expression of the results. In this paper, we propose a quantitative strategy for the calibration of activation states to address these challenging problems. The quantitative assessment is based on the estimated neuronal efficacy parameter [Formula: see text] of the hemodynamic model, computed voxel by voxel. It is partly immune to the inhomogeneous fMRI noise by virtue of the strength of the optimization strategy. Moreover, it is easy to incorporate regional vascular information into the activation detection procedure. By combining MR angiography images, this approach can remove large-vessel contamination in fMRI signals and provide more accurate functional localization than classical statistical techniques for clinical applications. It is also helpful for investigating the nonlinear nature of the coupling between synaptic activity and the evoked BOLD response. The proposed method might be considered a potentially useful complement to existing statistical approaches.

  5. Qualitative and quantitative evaluation of solvent systems for countercurrent separation.

    PubMed

    Friesen, J Brent; Ahmed, Sana; Pauli, Guido F

    2015-01-16

    Rational solvent system selection for countercurrent chromatography and centrifugal partition chromatography technology (collectively known as countercurrent separation) studies continues to be a scientific challenge as the fundamental questions of comparing polarity range and selectivity within a solvent system family and between putative orthogonal solvent systems remain unanswered. The current emphasis on metabolomic investigations and analysis of complex mixtures necessitates the use of successive orthogonal countercurrent separation (CS) steps as part of complex fractionation protocols. Addressing the broad range of metabolite polarities demands development of new CS solvent systems with appropriate composition, polarity (π), selectivity (σ), and suitability. In this study, a mixture of twenty commercially available natural products, called the GUESSmix, was utilized to evaluate both solvent system polarity and selectivity characteristics. Comparisons of GUESSmix analyte partition coefficient (K) values give rise to a measure of solvent system polarity range called the GUESSmix polarity index (GUPI). Solvatochromic dye and electrical permittivity measurements were also evaluated for quantitatively assessing solvent system polarity. The relative selectivity of solvent systems was evaluated with the GUESSmix by calculating the pairwise resolution (αip), the number of analytes found in the sweet spot (Nsw), and the pairwise resolution of those sweet spot analytes (αsw). The combination of these parameters allowed for both intra- and inter-family comparison of solvent system selectivity. Finally, 2-dimensional reciprocal shifted symmetry plots (ReSS(2)) were created to visually compare both the polarities and selectivities of solvent system pairs. This study helps pave the way for the development of new solvent systems that are amenable to successive orthogonal CS protocols employed in metabolomic studies.
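
    The partition coefficients and pairwise selectivity underlying GUPI-style comparisons can be sketched as follows; the concentration values and the upper/lower-phase convention are assumptions for illustration, not the paper's data:

```python
def partition_coefficient(c_upper, c_lower):
    """K of an analyte in a biphasic solvent system: ratio of its
    concentrations in the upper and lower phases (which phase is
    stationary depends on the instrument configuration)."""
    return c_upper / c_lower

def selectivity(k_i, k_j):
    """Pairwise separation factor for two analytes, reported >= 1."""
    hi, lo = max(k_i, k_j), min(k_i, k_j)
    return hi / lo

k1 = partition_coefficient(1.2, 1.5)    # hypothetical concentrations
k2 = partition_coefficient(2.0, 1.25)
print(round(selectivity(k1, k2), 2))    # 2.0
```

    Comparing the spread of such K values across a GUESSmix-like analyte panel is what gives a polarity-range index; comparing pairwise factors gives the selectivity measures.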

  6. 13 CFR 304.4 - Performance evaluations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Performance evaluations. 304.4... ECONOMIC DEVELOPMENT DISTRICTS § 304.4 Performance evaluations. (a) EDA shall evaluate the management standards, financial accountability and program performance of each District Organization within three...

  7. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL... Performance evaluation. See 42.1502(f) for the requirements for preparing past performance evaluations...

  8. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL... Performance evaluation. See 42.1502(f) for the requirements for preparing past performance evaluations...

  9. Evaluation of the Quantitative Prediction of a Trend Reversal on the Japanese Stock Market in 1999

    NASA Astrophysics Data System (ADS)

    Johansen, Anders; Sornette, Didier

    In January 1999, the authors published a quantitative prediction that the Nikkei index should recover from its 14-year low in January 1999 and reach ~20 500 a year later. The purpose of the present paper is to evaluate the performance of this specific prediction as well as the underlying model: the forecast, performed at a time when the Nikkei was at its lowest (as we can now judge in hindsight), correctly captured the change of trend as well as the quantitative evolution of the Nikkei index since its inception. As the change of trend from sluggish to recovery was judged quite unlikely by many observers at the time, a Bayesian analysis shows that a skeptical (resp. neutral) Bayesian sees the prior belief in the model amplified into a posterior belief 19 times larger (resp. reaching the 95% level).
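
    The Bayesian amplification argument can be reproduced with a plain odds-form update; the Bayes factor of 20 below is an assumed round number for illustration, not the value computed in the paper:

```python
def posterior(prior, bayes_factor):
    """Posterior belief in the model after observing data that favor
    it by the given Bayes factor (odds-form Bayes update)."""
    odds = bayes_factor * prior / (1.0 - prior)
    return odds / (1.0 + odds)

bf = 20.0          # hypothetical Bayes factor favoring the model
skeptic = 0.01     # a skeptic's low prior belief
neutral = 0.5      # a neutral prior

print(posterior(skeptic, bf) / skeptic)  # skeptic's belief amplified ~17x here
print(posterior(neutral, bf))            # neutral prior pushed to ~0.95
```

    For a small prior, the amplification factor is close to the Bayes factor itself; for a neutral prior, the same evidence lands near the 95% level, matching the structure of the claim above.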

  10. Energy performance evaluation of AAC

    NASA Astrophysics Data System (ADS)

    Aybek, Hulya

    The U.S. building industry constitutes the largest consumer of energy (i.e., electricity, natural gas, petroleum) in the world. The building sector uses almost 41 percent of the primary energy and approximately 72 percent of the available electricity in the United States. As global energy-generating resources are being depleted at exponential rates, the amount of energy consumed and wasted cannot be ignored. Professionals concerned about the environment have placed a high priority on finding solutions that reduce energy consumption while maintaining occupant comfort. Sustainable design and the judicious combination of building materials comprise one solution to this problem. A future including sustainable energy may result from using energy simulation software to accurately estimate energy consumption and from applying building materials that achieve the potential results derived through simulation analysis. Energy-modeling tools assist professionals with making informed decisions about energy performance during the early planning phases of a design project, such as determining the most advantageous combination of building materials, choosing mechanical systems, and determining building orientation on the site. By implementing energy simulation software to estimate the effect of these factors on the energy consumption of a building, designers can make adjustments to their designs during the design phase when the effect on cost is minimal. The primary objective of this research consisted of identifying a method with which to properly select energy-efficient building materials and involved evaluating the potential of these materials to earn LEED credits when properly applied to a structure. In addition, this objective included establishing a framework that provides suggestions for improvements to currently available simulation software that enhance the viability of the estimates concerning energy efficiency and the achievements of LEED credits. 

  11. Quantitative evaluation of solar wind time-shifting methods

    NASA Astrophysics Data System (ADS)

    Cameron, Taylor; Jackel, Brian

    2016-11-01

    Nine years of solar wind dynamic pressure and geosynchronous magnetic field data are used for a large-scale statistical comparison of uncertainties associated with several different algorithms for propagating solar wind measurements. The MVAB-0 scheme is best overall, performing on average a minute more accurately than a flat time-shift. We also evaluate the accuracy of these time-shifting methods as a function of solar wind magnetic field orientation. We find that all time-shifting algorithms perform significantly worse (>5 min) due to geometric effects when the solar wind magnetic field is radial (parallel or antiparallel to the Earth-Sun line). Finally, we present an empirical scheme that performs almost as well as MVAB-0 on average and slightly better than MVAB-0 for intervals with nonradial B.
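
    For reference, the flat time-shift baseline mentioned above is just a ballistic delay, distance divided by solar wind speed; the monitor and target positions below are rough round figures, not the study's data:

```python
def flat_time_shift(x_monitor_km, x_target_km, v_sw_km_s):
    """Flat (ballistic) propagation delay from a solar wind monitor
    to a target location along the Earth-Sun line, in seconds."""
    return (x_monitor_km - x_target_km) / v_sw_km_s

# L1 sits roughly 1.5 million km sunward of Earth; geosynchronous
# orbit is ~6.6 Earth radii (round figures for illustration).
delay_s = flat_time_shift(1_500_000.0, 6.6 * 6371.0, 400.0)
print(round(delay_s / 60))  # delay in minutes (~1 hour at 400 km/s)
```

    Schemes such as MVAB-0 improve on this by tilting the assumed phase front to match the local solar wind structure, which is where the minute-level average gains come from.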

  12. [Quantitative determination of niphensamide by high performance liquid chromatography (HPLC)].

    PubMed

    Long, C; Chen, S; Shi, T

    1998-01-01

    An HPLC method for the quantitative determination of Niphensamide in pesticide powder was developed. Column: Micropak-CH, 5 microns (300 mm x 4.0 mm i.d.); mobile phase: CH3OH-H2O (1:1); detector: UV 254 nm; flow rate: 0.7 mL/min; column temperature: 25 degrees C. Under the above conditions, Niphensamide and other components were separated from each other. The method is simple, rapid, sensitive and accurate.

  13. A Proposed RTN Officer Performance Evaluation System

    DTIC Science & Technology

    1989-12-01

    studied at the Naval Postgraduate School and practical theories relating to personnel management and performance evaluation. The research method includes...various systems are discussed as the researcher perceives them. The fact that there is probably no agreed-upon, fool-proof method of evaluating an...Performance Evaluation System. The research methodology includes the following three components: (1) a study of pertinent performance evaluation

  14. Quantitative evaluation of statistical inference in resting state functional MRI.

    PubMed

    Yang, Xue; Kang, Hakmook; Newton, Allen; Landman, Bennett A

    2012-01-01

    Modern statistical inference techniques may be able to improve the sensitivity and specificity of resting state functional MRI (rs-fMRI) connectivity analysis through more realistic characterization of distributional assumptions. In simulation, the advantages of such modern methods are readily demonstrable. However quantitative empirical validation remains elusive in vivo as the true connectivity patterns are unknown and noise/artifact distributions are challenging to characterize with high fidelity. Recent innovations in capturing finite sample behavior of asymptotically consistent estimators (i.e., SIMulation and EXtrapolation - SIMEX) have enabled direct estimation of bias given single datasets. Herein, we leverage the theoretical core of SIMEX to study the properties of inference methods in the face of diminishing data (in contrast to increasing noise). The stability of inference methods with respect to synthetic loss of empirical data (defined as resilience) is used to quantify the empirical performance of one inference method relative to another. We illustrate this new approach in a comparison of ordinary and robust inference methods with rs-fMRI.

  15. Gas turbine coatings eddy current quantitative and qualitative evaluation

    NASA Astrophysics Data System (ADS)

    Ribichini, Remo; Giolli, Carlo; Scrinzi, Erica

    2017-02-01

    Gas turbine blades (buckets) are among the most critical and expensive components of the engine. Buckets rely on protective coatings in order to withstand the harsh environment in which they operate. The thickness and the microstructure of coatings during the lifespan of a unit are fundamental to evaluate their fitness for service. A frequency scanning Eddy Current instrument can allow the measurement of the thickness and of physical properties of coatings in a Non-Destructive manner. The method employed relies on the acquisition of impedance spectra and on the inversion of the experimental data to derive the coating properties and structure using some assumptions. This article describes the experimental validation performed on several samples and real components in order to assess the performance of the instrument as a coating thickness gage. The application of the technique to support residual life assessment of serviced buckets is also presented.

  16. Using hybrid method to evaluate the green performance in uncertainty.

    PubMed

    Tseng, Ming-Lang; Lan, Lawrence W; Wang, Ray; Chiu, Anthony; Cheng, Hui-Ping

    2011-04-01

    Green performance measurement is vital for enterprises making continuous improvements to maintain sustainable competitive advantages. Evaluation of green performance, however, is a challenging task due to the interdependence of the aspects and criteria and the linguistic vagueness of qualitative information mixed with quantitative data. To deal with this issue, this study proposes a novel approach to evaluate the interdependent aspects and criteria of a firm's green performance. The rationale of the proposed approach, namely the green network balanced scorecard, is to use the balanced scorecard to combine fuzzy set theory with the analytical network process (ANP) and importance-performance analysis (IPA), wherein fuzzy set theory accounts for the linguistic vagueness of qualitative criteria and ANP converts the relations among the interdependent aspects and criteria into an intelligible structural model used in IPA. For the empirical case study, four interdependent aspects and 34 green performance criteria for PCB firms in Taiwan were evaluated. The managerial implications are discussed.

  17. Intentional Movement Performance Ability (IMPA): a method for robot-aided quantitative assessment of motor function.

    PubMed

    Shin, Sung Yul; Kim, Jung Yoon; Lee, Sanghyeop; Lee, Junwon; Kim, Seung-Jong; Kim, ChangHwan

    2013-06-01

    The purpose of this paper is to propose a new assessment method for evaluating the motor function of patients suffering from physical weakness after stroke, incomplete spinal cord injury (iSCI), or other diseases. In this work, we use a robotic device to obtain information on the interaction that occurs between the patient and the robot, and use it as a measure for assessing the patients. The Intentional Movement Performance Ability (IMPA) is defined as the root mean square of the interactive torque while the subject performs a given periodic movement with the robot. IMPA is proposed to quantitatively determine the level of the subject's impaired motor function. The method is indirectly tested by asking healthy subjects to lift a barbell to disturb their motor function. The experimental results show that the IMPA has the potential to provide proper information on the subject's motor function level.
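
    The IMPA definition above (root mean square of the interactive torque over a movement) can be sketched directly; the torque samples are hypothetical:

```python
import math

def impa(interactive_torque):
    """Intentional Movement Performance Ability: root mean square of
    the human-robot interactive torque over one periodic movement."""
    n = len(interactive_torque)
    return math.sqrt(sum(t * t for t in interactive_torque) / n)

# Hypothetical interactive torque samples (N*m) logged during one
# movement cycle; a real system would sample at a fixed rate.
samples = [0.5, -0.3, 0.8, -0.6, 0.1]
print(round(impa(samples), 3))
```

    A larger RMS interactive torque means the robot is doing more of the work against the subject, which is how the index maps onto impairment level.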

  18. 13 CFR 304.4 - Performance evaluations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Performance evaluations. 304.4 Section 304.4 Business Credit and Assistance ECONOMIC DEVELOPMENT ADMINISTRATION, DEPARTMENT OF COMMERCE ECONOMIC DEVELOPMENT DISTRICTS § 304.4 Performance evaluations. (a) EDA shall evaluate the...

  19. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONTRACTING REQUIREMENTS CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 2936.604 Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor... reports must be made using Standard Form 1421, Performance Evaluation (Architect-Engineer) as...

  20. Dual-band infrared thermography for quantitative nondestructive evaluation

    SciTech Connect

    Durbin, P.F.; Del Grande, N.K.; Dolan, K.W.; Perkins, D.E.; Shapiro, A.B.

    1993-04-01

    The authors have developed dual-band infrared (DBIR) thermography that is being applied to quantitative nondestructive evaluation (NDE) of aging aircraft. The DBIR technique resolves 0.2 °C surface temperature differences for inspecting interior flaws in heated aircraft structures. It locates cracks, corrosion sites, disbonds or delaminations in metallic laps and composite patches. By removing clutter from surface roughness effects, the authors clarify interpretation of subsurface flaws. To accomplish this, the authors ratio images recorded at two infrared bands, centered near 5 microns and 10 microns. These image ratios are used to decouple temperature patterns associated with interior flaw sites from spatially varying surface emissivity noise. They also discuss three-dimensional (3D) dynamic thermal imaging of structural flaws using dual-band infrared (DBIR) computed tomography. Conventional thermography provides single-band infrared images which are difficult to interpret. Standard procedures yield imprecise (or qualitative) information about subsurface flaw sites which are typically masked by surface clutter. They use a DBIR imaging technique pioneered at LLNL to capture the time history of surface temperature difference patterns for flash-heated targets. They relate these patterns to the location, size, shape and depth of subsurface flaws. They have demonstrated temperature accuracies of 0.2 °C, timing synchronization of 3 ms (after onset of heat flash) and intervals of 42 ms between images, during an 8 s cooling (and heating) interval characterizing the front (and back) surface temperature-time history of an epoxy-glue disbond site in a flash-heated aluminum lap joint.
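
    The core DBIR step, ratioing images from the two bands so that emissivity clutter common to both largely cancels, can be sketched on toy radiance maps (all values hypothetical):

```python
# Pixelwise ratio of two infrared band images (near 5 um and 10 um).
# Emissivity variation that multiplies both bands roughly equally
# cancels in the ratio, leaving temperature-related contrast.
band_5um  = [[2.0, 2.2], [4.1, 2.1]]    # hypothetical radiance maps
band_10um = [[1.0, 1.1], [1.6, 1.05]]

ratio = [[a / b for a, b in zip(row5, row10)]
         for row5, row10 in zip(band_5um, band_10um)]
print(ratio)  # one pixel stands out from the ~2.0 background
```

    In this toy example three pixels share the same ratio despite different absolute radiances (surface-emissivity clutter), while the remaining pixel's distinct ratio marks a genuine temperature anomaly, i.e. a candidate flaw site.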

  1. Quantitative Integrated Evaluation in the Mars Basin, Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Tichelaar, B. W.; Detomo, R.

    2005-05-01

    Today's exploitation of hydrocarbons in the Deepwater Gulf of Mexico requires a subtle, sophisticated class of opportunities for which uncertainties must be quantified to reduce risk. The explorer is often faced with non-amplitude supported hydrocarbon accumulations, limitations of seismic imaging, and uncertainty in stratigraphy and hydrocarbon kitchens, all in an environment of still-maturing technology and rising drilling costs. However, many of the fundamental Exploration processes that drove the industry in the past in the Gulf of Mexico still apply today. Integration of these historically proven processes with each other and with new technologies, supported by a growing body of knowledge, has provided a significant new methodology for wildcat and near-field Exploration. Even in mature fields, additional opportunities are seldom characterized by unambiguous attributes of direct hydrocarbon indicators or amplitude support. Shell's Quantitative Integrated Evaluation process relies upon visualization of integrated volume-based stratigraphic models of rock and fluid properties, and by relating these properties to measured and predicted seismic responses. An attribute referred to as the Differential Generalized Attribute, which summarizes the differences between multiple scenario response predictions and actual measured data, can then be used to distinguish likely scenarios from unlikely scenarios. This methodology allows competing scenarios to be rapidly tested against the data, and is built upon proprietary knowledge of the physical processes and relationships that likely drive vertical and lateral variation in these models. We will demonstrate the methodology by showing a portion of the Mars Basin and describing the integrated capability that is emplaced at the Exploration phase, and matured throughout the Appraisal, Development and Production life cycle of a basin discovery.

  2. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 236.604 Performance evaluation. (a) Preparation of performance reports. Use DD Form 2631, Performance Evaluation (Architect-Engineer), instead of SF 1421. (2) Prepare a...

  3. Performance evaluation of generalized MSK

    NASA Astrophysics Data System (ADS)

    Galko, P.; Pasupathy, S.

    The computation of the performance of several optimal and suboptimal receivers for generalized MSK is discussed. The optimal receivers considered are Viterbi receivers and the optimal receivers based on a finite observation interval. Two suboptimal receivers, (1) optimized linear receivers based on finite observation intervals and (2) linear receivers based on approximating generalized MSK as an OQPSK signal, are considered as well. It is shown that the former receiver's performance may be computed exactly, while for the latter receiver it is possible to provide arbitrarily tight bounds on the performance. These analyses are illustrated by application to the two popular generalized MSK schemes of duobinary MSK (or FSOQ) and (1 + D)²/4 MSK.

  4. Quantitative polarization and flow evaluation of choroid and sclera by multifunctional Jones matrix optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Sugiyama, S.; Hong, Y.-J.; Kasaragod, D.; Makita, S.; Miura, M.; Ikuno, Y.; Yasuno, Y.

    2016-03-01

    Quantitative evaluation of the optical properties of the choroid and sclera was performed with multifunctional optical coherence tomography. Five normal eyes, five glaucoma eyes, and one choroidal atrophy eye were examined. Among normal eyes, refractive error was found to be correlated with choroidal birefringence, polarization uniformity, and flow, in addition to scleral birefringence. Significant differences were observed between the normal and glaucoma eyes in choroidal polarization uniformity, flow, and scleral birefringence. An automatic segmentation algorithm for the retinal pigment epithelium and chorioscleral interface based on multifunctional signals is also presented.

  5. Evaluating quantitative formulas for dose-response assessment of chemical mixtures.

    PubMed

    Hertzberg, Richard C; Teuschler, Linda K

    2002-12-01

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment difficult: lack of data on mixture dose-response relationships, and the need to address risk from combinations of chemicals because of public demands and statutory requirements. Consequently, the U.S. Environmental Protection Agency has developed methods for carrying out quantitative dose-response assessment for chemical mixtures that require information only on the toxicity of single chemicals and of chemical pair interactions. These formulas are based on plausible ideas and default parameters but minimal supporting data on whole mixtures. Because of this lack of mixture data, the usual evaluation of accuracy (predicted vs. observed) cannot be performed. Two approaches to the evaluation of such formulas are to consider fundamental biological concepts that support the quantitative formulas (e.g., toxicologic similarity) and to determine how well the proposed method performs under simplifying constraints (e.g., as the toxicologic interactions disappear). These ideas are illustrated using dose addition and two weight-of-evidence formulas for incorporating toxicologic interactions.
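
    A minimal sketch of a dose-addition screening formula of the kind discussed, the hazard index HI = sum(dose_i / RfD_i); the doses and reference doses below are illustrative values only:

```python
def hazard_index(doses, reference_doses):
    """Dose-addition screening formula for a chemical mixture:
    HI = sum(dose_i / RfD_i). HI > 1 flags potential concern."""
    return sum(d / rfd for d, rfd in zip(doses, reference_doses))

doses = [0.02, 0.10, 0.01]   # hypothetical exposures (mg/kg-day)
rfds  = [0.05, 0.50, 0.10]   # hypothetical reference doses

print(round(hazard_index(doses, rfds), 2))
```

    Weight-of-evidence formulas extend this by scaling each term according to evidence for pairwise interactions; in the limit where interactions disappear, they should reduce to this simple sum, which is one of the consistency checks described above.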

  6. INTEGRATED WATER TREATMENT SYSTEM PERFORMANCE EVALUATION

    SciTech Connect

    SEXTON RA; MEEUWSEN WE

    2009-03-12

    This document describes the results of an evaluation of the current Integrated Water Treatment System (IWTS) operation against design performance and a determination of short term and long term actions recommended to sustain IWTS performance.

  7. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  8. Segmentation and quantitative evaluation of brain MRI data with a multiphase 3D implicit deformable model

    NASA Astrophysics Data System (ADS)

    Angelini, Elsa D.; Song, Ting; Mensh, Brett D.; Laine, Andrew

    2004-05-01

    Segmentation of three-dimensional anatomical brain images into tissue classes has applications in both clinical and research settings. This paper presents the implementation and quantitative evaluation of a four-phase, three-dimensional active contour within a level set framework for automated segmentation of brain MRIs. The segmentation algorithm performs an optimal partitioning of three-dimensional data based on homogeneity measures that naturally evolves to the extraction of different tissue types in the brain. Random seed initialization was used to speed up numerical computation and avoid the need for a priori information. This random initialization ensures robustness of the method to variations in user expertise, biased a priori information, and errors in input information that could be influenced by variations in image quality. Experimentation on three MRI brain data sets showed that an optimal partitioning successfully labeled regions that accurately identified white matter, gray matter and cerebrospinal fluid in the ventricles. Quantitative evaluation of the segmentation was performed with comparison to manually labeled data and computed false positive and false negative assignments of voxels for the three tissue classes. We report high accuracy for the two comparison cases. These results demonstrate the efficiency and flexibility of this segmentation framework to perform the challenging task of automatically extracting brain tissue volume contours.

  9. Quantitative evaluation of noise reduction and vesselness filters for liver vessel segmentation on abdominal CTA images

    NASA Astrophysics Data System (ADS)

    Luu, Ha Manh; Klink, Camiel; Moelker, Adriaan; Niessen, Wiro; van Walsum, Theo

    2015-05-01

    Liver vessel segmentation in CTA images is a challenging task, especially in the case of noisy images. This paper investigates whether pre-filtering improves liver vessel segmentation in 3D CTA images. We introduce a quantitative evaluation of several well-known filters based on a proposed liver vessel segmentation method for CTA images. We compare the effect of different diffusion techniques, i.e. Regularized Perona-Malik, Hybrid Diffusion with Continuous Switch, and Vessel Enhancing Diffusion, as well as the vesselness approaches proposed by Sato, Frangi, and Erdt. Liver vessel segmentation of the pre-processed images is performed using histogram-based region growing with local maxima as seed points. Quantitative measurements (sensitivity, specificity, and accuracy) are determined based on manual landmarks inside and outside the vessels, followed by t-tests for statistical comparisons on 51 clinical CTA images. The evaluation demonstrates that all the filters yield significantly higher liver vessel segmentation accuracy than no filtering (p < 0.05), with Hybrid Diffusion with Continuous Switch achieving the best performance. Compared to the diffusion filters, vesselness filters have greater sensitivity but less specificity. In addition, the proposed liver vessel segmentation method with pre-filtering is shown to perform robustly on a clinical dataset with a contrast-to-noise ratio as low as 3 dB. The results indicate that the pre-filtering step significantly improves liver vessel segmentation of 3D CTA images.
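
    The landmark-based metrics can be sketched as follows, assuming landmarks placed inside vessels score sensitivity and landmarks placed outside score specificity; the outcomes below are hypothetical:

```python
def landmark_metrics(inside_labels, outside_labels):
    """Sensitivity, specificity and accuracy from manual landmarks:
    `inside_labels` holds the segmentation result at landmarks placed
    inside vessels (1 = labeled as vessel), `outside_labels` at
    landmarks placed outside vessels."""
    tp = sum(inside_labels)
    fn = len(inside_labels) - tp
    tn = sum(1 - label for label in outside_labels)
    fp = len(outside_labels) - tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

inside  = [1, 1, 1, 0, 1]   # hypothetical landmark outcomes
outside = [0, 0, 1, 0]
print(landmark_metrics(inside, outside))
```

    Per-image metrics like these, computed with and without each pre-filter, are what the paired t-tests above compare across the 51 scans.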

  10. Quantitative evaluation of peptide-extraction methods by HPLC-triple-quad MS-MS.

    PubMed

    Du, Yan; Wu, Dapeng; Wu, Qian; Guan, Yafeng

    2015-02-01

    In this study, the efficiency of five peptide-extraction methods—acetonitrile (ACN) precipitation, ultrafiltration, C18 solid-phase extraction (SPE), dispersed SPE with mesoporous carbon CMK-3, and mesoporous silica MCM-41—was quantitatively investigated. With 28 tryptic peptides as target analytes, these methods were evaluated on the basis of recovery and reproducibility by using high-performance liquid chromatography-triple-quad tandem mass spectrometry in selected-reaction-monitoring mode. Because of the distinct extraction mechanisms of the methods, their preferences for extracting peptides of different properties were revealed to be quite different, usually depending on the pI values or hydrophobicity of peptides. When target peptides were spiked in bovine serum albumin (BSA) solution, the extraction efficiency of all the methods except ACN precipitation changed significantly. The binding of BSA with target peptides and nonspecific adsorption on adsorbents were believed to be the ways through which BSA affected the extraction behavior. When spiked in plasma, the performance of all five methods deteriorated substantially, with the number of peptides having recoveries exceeding 70% being 15 for ACN precipitation, and none for the other methods. Finally, the methods were evaluated in terms of the number of identified peptides for extraction of endogenous plasma peptides. Only ultrafiltration and CMK-3 dispersed SPE performed differently from the quantitative results with target peptides, and the wider distribution of the properties of endogenous peptides was believed to be the main reason.

  11. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder.

    PubMed

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-18

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one method of evaluating neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement of pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7-11 years (27 males, six females) and twenty-five adult participants aged 21-29 years (19 males, six females) took part in our experiments. Our results suggest that pronation and supination function in children with ADHD tends to lag behind that of typically developing children by several years. From these results, our system shows promise for objectively evaluating the neurodevelopmental delay of children with ADHD.

  12. Clinical evaluator reliability for quantitative and manual muscle testing measures of strength in children.

    PubMed

    Escolar, D M; Henricson, E K; Mayhew, J; Florence, J; Leshner, R; Patel, K M; Clemens, P R

    2001-06-01

    Measurements of muscle strength in clinical trials of Duchenne muscular dystrophy have relied heavily on manual muscle testing (MMT). The high level of intra- and interrater variability of MMT compromises clinical study results. We compared the reliability of 12 clinical evaluators in performing MMT and quantitative muscle testing (QMT) on 12 children with muscular dystrophy. QMT was reliable, with an intraclass correlation coefficient (ICC) of >0.9 for biceps and grip strength, and >0.8 for quadriceps strength. Training of both subjects and evaluators was easily accomplished. MMT was not as reliable, and required repeated training of evaluators to bring all groups to an ICC >0.75 for shoulder abduction, elbow and hip flexion, knee extension, and ankle dorsiflexion. We conclude that QMT shows greater reliability and is easier to implement than MMT. Consequently, QMT will be a superior measure of strength for use in pediatric, neuromuscular, multicenter clinical trials.
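The ICC reported above is computed from a subjects-by-raters score matrix. The sketch below implements the one-way random-effects form, ICC(1,1) = (MSB - MSW) / (MSB + (k - 1) MSW); the abstract does not state which ICC variant the study used, so this choice is an assumption.

```python
import numpy as np

def icc_1_1(scores):
    """One-way random-effects ICC(1,1) from an (n_subjects, k_raters) array."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    subj_means = scores.mean(axis=1)
    grand = scores.mean()
    # Between-subjects and within-subject mean squares
    msb = k * ((subj_means - grand) ** 2).sum() / (n - 1)
    msw = ((scores - subj_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfectly consistent raters -> ICC = 1
perfect = np.array([[10, 10], [20, 20], [30, 30]])
print(round(icc_1_1(perfect), 3))  # 1.0
```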

  13. GPS User Equipment Performance Evaluation.

    DTIC Science & Technology

    1979-11-01

    Unit Device Controller Assembly (UDCA) * Satellite Signal Generator Assembly (SSGA) * Dynamic Frequency Synthesizer Assembly (DFSA) * Jamming... Post Run Data (PR) under operator control. The UDCA performs the following primary functions: * receive digital data from I)I'A, buffer and transfer to... collected by the 1IJKA, reformatted, and sent to the I)IA, and vice versa, via a data bus interface. The UDCA consists of the following elements...

  14. Lubricant Evaluation and Performance 2

    DTIC Science & Technology

    1994-02-01

    POWER DIRECTORATE, WRIGHT LABORATORY, AIR FORCE MATERIEL COMMAND, WRIGHT-PATTERSON AIR FORCE BASE, OHIO 45433-7103. INTRODUCTION... DEVELOPMENT OF IMPROVED METHODS FOR MEASURING LUBRICANT PERFORMANCE... OXIDATIVE STABILITY OF ESTER BASE LUBRICANTS: a. Introduction... b... (7) Conclusions... i. Stability Testing of Cyclophosphazene Based Fluids: (1) Introduction, (2) Effect of Metal Specimens, (3) Effect of a...

  15. Distance estimation from acceleration for quantitative evaluation of Parkinson tremor.

    PubMed

    Jeon, Hyoseon; Kim, Sang Kyong; Jeon, BeomSeok; Park, Kwang Suk

    2011-01-01

    The purpose of this paper is to assess Parkinson tremor by estimating its actual distance amplitude. We propose a practical, useful and simple method for evaluating Parkinson tremor with a distance value. We measured the resting tremor of 7 Parkinson's disease (PD) patients with a triaxial accelerometer. The resting tremor of each participant was rated on the Unified Parkinson's Disease Rating Scale (UPDRS) by a neurologist. First, we extracted a 7-second segment of the acceleration signal from the recorded data. To estimate the displacement of the tremor, we performed double integration of the acceleration. Prior to double integration, a moving-average filter was used to reduce the error from the integration constant. After estimating displacement, we calculated the distance traveled during 1 s of the segmented signal using Euclidean distance, and compared the distance values with the UPDRS ratings. The average distance traveled during 1 second was 11.52 mm for UPDRS 1, 33.58 mm for UPDRS 2, and 382.22 mm for UPDRS 3. The estimated distance traveled during 1 s was thus proportional to the clinical rating on the UPDRS.
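The pipeline described above (double integration of acceleration with drift correction, then path length over 1 s) can be sketched as follows. This is a single-axis simplification of the paper's triaxial Euclidean distance; simple mean removal stands in for the moving-average drift correction, and the synthetic 5 Hz tremor is an illustrative assumption.

```python
import numpy as np

def tremor_distance(acc, fs):
    """Path length (m) travelled over the record, from single-axis acceleration.

    Double integration with mean removal standing in for the paper's
    moving-average correction of the integration constant.
    """
    vel = np.cumsum(acc) / fs
    vel -= vel.mean()                 # suppress the velocity integration constant
    disp = np.cumsum(vel) / fs
    disp -= disp.mean()               # recentre the displacement estimate
    return np.abs(np.diff(disp)).sum()

# Synthetic 5 Hz tremor of 10 mm amplitude: x = A sin(wt), a = -A w^2 sin(wt)
fs, f, A = 1000, 5.0, 0.010
t = np.arange(0.0, 1.0, 1.0 / fs)
acc = -A * (2 * np.pi * f) ** 2 * np.sin(2 * np.pi * f * t)
d_mm = tremor_distance(acc, fs) * 1000
print(round(d_mm))   # close to the analytic path length 4*A*f*1000 = 200 mm
```

A sinusoid of amplitude A and frequency f travels 4A per cycle, so the analytic 1-s path length is 4*A*f, which the numerical estimate should approximate closely at this sampling rate.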

  16. Evaluation of solar pond performance

    SciTech Connect

    Wittenberg, L.J.

    1980-01-01

    During 1978 the City of Miamisburg, Ohio, constructed a large salt-gradient solar pond as part of its community park development project. The thermal energy stored in the pond is being used to heat an outdoor swimming pool in the summer and an adjacent recreational building during part of the winter. This solar pond, which occupies an area of 2020 m² (22,000 sq ft), was designed from experience obtained at smaller research ponds located at Ohio State University and the University of New Mexico, and at similar ponds operated in Israel. During the summer of 1979, the initial heat (40,000 kWh; 136 million Btu) was withdrawn from the solar pond to heat the outdoor swimming pool. All of the data collection systems were installed and functioned as designed, so that operational data were obtained. The observed performance of the pond was compared with several of the predicted models for this type of pond.

  17. Evaluation of static and dynamic perfusion cardiac computed tomography for quantitation and classification tasks.

    PubMed

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R; La Riviere, Patrick J; Alessio, Adam M

    2016-04-01

    Cardiac computed tomography (CT) acquisitions for perfusion assessment can be performed in a dynamic or static mode. Either method may be used for a variety of clinical tasks, including (1) stratifying patients into categories of ischemia and (2) using a quantitative myocardial blood flow (MBF) estimate to evaluate disease severity. In this simulation study, we compare method performance on these classification and quantification tasks for matched radiation dose levels and for different flow states, patient sizes, and injected contrast levels. Under the conditions simulated, the dynamic method has low bias in MBF estimates (0 to [Formula: see text]) compared to linearly interpreted static assessment (0.45 to [Formula: see text]), making it more suitable for quantitative estimation. At matched radiation dose levels, receiver operating characteristic analysis demonstrated that the static method, with its high bias but generally lower variance, had superior performance ([Formula: see text]) in stratifying patients, especially for larger patients and lower contrast doses (area under the curve [Formula: see text] to 0.96 versus 0.86). We also demonstrate that static assessment with a correctly tuned exponential relationship between the apparent CT number and MBF has superior quantification performance to static assessment with a linear relationship and to dynamic assessment. However, tuning the exponential relationship to the patient and scan characteristics will likely prove challenging. This study demonstrates that the selection and optimization of static or dynamic acquisition modes should depend on the specific clinical task.

  18. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  19. Evaluating IPMN and pancreatic carcinoma utilizing quantitative histopathology.

    PubMed

    Glazer, Evan S; Zhang, Hao Helen; Hill, Kimberly A; Patel, Charmi; Kha, Stephanie T; Yozwiak, Michael L; Bartels, Hubert; Nafissi, Nellie N; Watkins, Joseph C; Alberts, David S; Krouse, Robert S

    2016-10-01

    Intraductal papillary mucinous neoplasms (IPMN) are pancreatic lesions with uncertain biologic behavior. This study sought objective, accurate prediction tools, through the use of quantitative histopathological signatures of nuclear images, for classifying lesions as chronic pancreatitis (CP), IPMN, or pancreatic carcinoma (PC). Forty-four pancreatic resection patients were retrospectively identified for this study (12 CP; 16 IPMN; 16 PC). Regularized multinomial regression quantitatively classified each specimen as CP, IPMN, or PC in an automated, blinded fashion. Classification certainty was determined by subtracting the smallest classification probability from the largest probability (of the three groups). The certainty function varied from 1.0 (perfectly classified) to 0.0 (random). From each lesion, 180 ± 22 nuclei were imaged. Overall classification accuracy was 89.6% with six unique nuclear features. No CP cases were misclassified, 1/16 IPMN cases were misclassified, and 4/16 PC cases were misclassified. Certainty function was 0.75 ± 0.16 for correctly classified lesions and 0.47 ± 0.10 for incorrectly classified lesions (P = 0.0005). Uncertainty was identified in four of the five misclassified lesions. Quantitative histopathology provides a robust, novel method to distinguish among CP, IPMN, and PC with a quantitative measure of uncertainty. This may be useful when there is uncertainty in diagnosis.
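The certainty function defined in the abstract (largest minus smallest class probability, ranging from 1.0 for a perfect call to 0.0 for a uniform one) is straightforward to reproduce:

```python
def classification_certainty(probs):
    """Certainty = largest class probability minus smallest, as defined in the
    abstract: 1.0 for a perfectly confident call, 0.0 for a uniform (random) one."""
    return max(probs) - min(probs)

# Three-class (CP / IPMN / PC) probability vectors
print(classification_certainty([1.0, 0.0, 0.0]))            # 1.0  (perfect)
print(classification_certainty([1 / 3, 1 / 3, 1 / 3]))      # 0.0  (random)
print(round(classification_certainty([0.6, 0.3, 0.1]), 2))  # 0.5
```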

  20. Theory and Practice on Teacher Performance Evaluation

    ERIC Educational Resources Information Center

    Yonghong, Cai; Chongde, Lin

    2006-01-01

    Teacher performance evaluation plays a key role in educational personnel reform, so it has been an important yet difficult issue in educational reform. Previous evaluations of teachers failed to make a strict distinction among the three dominant types of evaluation, namely capability, achievement, and effectiveness. Moreover, teacher performance…

  1. Toward objective and quantitative evaluation of imaging systems using images of phantoms.

    PubMed

    Gagne, Robert M; Gallas, Brandon D; Myers, Kyle J

    2006-01-01

    The use of imaging phantoms is a common method of evaluating image quality in the clinical setting. These evaluations rely on a subjective decision by a human observer with respect to the faintest detectable signal(s) in the image. Because of the variable and subjective nature of the human-observer scores, the evaluations manifest a lack of precision and a potential for bias. The advent of digital imaging systems with their inherent digital data provides the opportunity to use techniques that do not rely on human-observer decisions and thresholds. Using the digital data, signal-detection theory (SDT) provides the basis for more objective and quantitative evaluations which are independent of a human-observer decision threshold. In an SDT framework, the evaluation of imaging phantoms represents a "signal-known-exactly/background-known-exactly" ("SKE/BKE") detection task. In this study, we compute the performance of prewhitening and nonprewhitening model observers in terms of the observer signal-to-noise ratio (SNR) for these "SKE/BKE" tasks. We apply the evaluation methods to a number of imaging systems. For example, we use data from a laboratory implementation of digital radiography and from a full-field digital mammography system in a clinical setting. In addition, we compare our methods to human-observer scoring of a set of digital images of the CDMAM phantom available from the internet (EUREF-European Reference Organization). In the latter case, we show a significant increase in the precision of the quantitative methods versus the variability in the scores from human observers on the same set of images. As regards bias, the performance of a model observer estimated from a finite data set is known to be biased. In this study, we minimize the bias and estimate the variance of the observer SNR using statistical resampling techniques, namely, "bootstrapping" and "shuffling" of the data sets. Our methods provide objective and quantitative evaluation of imaging systems using images of phantoms.
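For an SKE/BKE task, the prewhitening and nonprewhitening observer SNRs have standard closed forms: SNR_PW = sqrt(s^T K^-1 s) and SNR_NPW = (s^T s) / sqrt(s^T K s), where s is the known signal and K the noise covariance. A minimal sketch with an assumed Gaussian signal and exponential noise covariance (not the paper's data):

```python
import numpy as np

def snr_pw(s, K):
    """Prewhitening (ideal linear) observer SNR for an SKE/BKE task."""
    return float(np.sqrt(s @ np.linalg.solve(K, s)))

def snr_npw(s, K):
    """Nonprewhitening observer SNR: the template is the signal itself."""
    return float((s @ s) / np.sqrt(s @ K @ s))

# Toy 1-D "phantom detail": Gaussian blob in correlated Gaussian noise
n = 64
x = np.arange(n)
s = np.exp(-0.5 * ((x - n / 2) / 3.0) ** 2)                # known signal profile
K = 0.1 * np.exp(-np.abs(x[:, None] - x[None, :]) / 5.0)   # noise covariance
print(snr_pw(s, K) >= snr_npw(s, K))  # True: prewhitening is optimal
```

By the Cauchy-Schwarz inequality the prewhitening SNR always bounds the nonprewhitening SNR from above, with equality only for white noise.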

  2. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

    Intraoperative evaluation of the efficacy of Deep Brain Stimulation (DBS) includes evaluation of its effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was proposed previously to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, hence supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band, placed on the patient's hand, which communicated its measurements to a laptop. The latter computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on a general rigidity-reduction model, regardless of the initial severity of the symptom. Thus, to enhance the performance of the previously presented system, we aimed to develop separate models for high and low baseline rigidity, according to the examiner's assessment before any stimulation, allowing a more patient-oriented approach. Additionally, usability was improved by performing the processing in situ on a smartphone instead of a computer. The system has shown to be reliable, presenting an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, that supports the physician in decision-making when setting stimulation parameters.

  3. Evaluation of chemotherapy response in ovarian cancer treatment using quantitative CT image biomarkers: a preliminary study

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Tan, Maxine; McMeekin, Scott; Thai, Theresa; Moore, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2015-03-01

    The purpose of this study is to identify and apply quantitative image biomarkers for early prediction of tumor response to chemotherapy among ovarian cancer patients participating in clinical trials of new drugs. In the experiment, we retrospectively selected 30 cases from patients who participated in Phase I clinical trials of new drugs or drug agents for ovarian cancer treatment. Each case is composed of two sets of CT images acquired pre- and post-treatment (4-6 weeks after starting treatment). A computer-aided detection (CAD) scheme was developed to extract and analyze the quantitative image features of the metastatic tumors previously tracked by the radiologists using the standard Response Evaluation Criteria in Solid Tumors (RECIST) guideline. The CAD scheme first segmented 3-D tumor volumes from the background using a hybrid tumor segmentation scheme. Then, for each segmented tumor, CAD computed three quantitative image features: the change in tumor volume, tumor CT number (density), and density variance. The feature changes were calculated between the matched tumors tracked on the CT images acquired pre- and post-treatment. Finally, CAD predicted each patient's 6-month progression-free survival (PFS) using a decision-tree based classifier. The performance of the CAD scheme was compared with the RECIST category. The results show that the CAD scheme achieved a prediction accuracy of 76.7% (23/30 cases) with a Kappa coefficient of 0.493, which is significantly higher than the performance of RECIST prediction, with a prediction accuracy and Kappa coefficient of 60% (17/30) and 0.062, respectively. This study demonstrates the feasibility of analyzing quantitative image features to improve the accuracy of early prediction of tumor response to new drugs or therapeutic methods for ovarian cancer patients.
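The Kappa coefficients quoted above measure chance-corrected agreement between predicted and observed outcomes. A minimal sketch of Cohen's kappa; the per-patient PFS calls below are hypothetical, since the abstract reports only summary figures.

```python
import numpy as np

def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: (observed - chance agreement) / (1 - chance agreement)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    classes = np.union1d(y_true, y_pred)
    po = np.mean(y_true == y_pred)                       # observed agreement
    pe = sum(np.mean(y_true == c) * np.mean(y_pred == c) # chance agreement
             for c in classes)
    return (po - pe) / (1 - pe)

# Hypothetical 6-month PFS calls (1 = progression-free) for 10 patients
truth = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
pred = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
print(round(cohens_kappa(truth, pred), 2))  # 0.6
```

Kappa of 0 means the classifier does no better than chance, which is why the RECIST value of 0.062 reported above indicates near-chance prediction despite 60% raw accuracy.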

  4. Technology Efficacy in Active Prosthetic Knees for Transfemoral Amputees: A Quantitative Evaluation

    PubMed Central

    El-Sayed, Amr M.; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prostheses. Other studies have examined amputees' gait performance while wearing a specific active prosthesis. This paper combines both insights, that is, a technical examination of the components used with an evaluation of how these improved the gait of their respective users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discusses the current technology in active transfemoral prostheses with respect to functional walking performance among above-knee amputee users, to evaluate each system's efficacy in producing close-to-normal user performance. The performance of the actuator, sensory system, and control technique incorporated in each reported system was evaluated separately, and numerical comparisons were conducted based on the percentage deviation of amputees' gait from a normal gait profile. The results identified particular components that contributed closest-to-normal gait parameters. However, the generalizability of this conclusion is limited by the small number of studies. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent to which each component contributes to the most functional development. PMID:25110727

  5. Impact of quantitative feedback and benchmark selection on radiation use by cardiologists performing cardiac angiography.

    PubMed

    Smith, Ian R; Cameron, James; Brighouse, Russell D; Ryan, Claire M; Foster, Kelley A; Rivers, John T

    2013-06-01

    Audit of and feedback on both group and individual data, provided immediately after the point of care and compared with realistic benchmarks of excellence, have been demonstrated to drive change. This study sought to evaluate the impact of immediate benchmarked quantitative case-based performance feedback on the clinical practice of cardiologists practicing at a private hospital in Brisbane, Australia. The participating cardiologists were assigned to one of two groups: Group 1 received patient and procedural details for review, and Group 2 received the Group 1 data plus detailed radiation data relating to the procedures and comparative benchmarks. In Group 2, Linear-by-Linear Association analysis suggests a link between change in radiation use and initial radiation dose category (p=0.014), with only those initially 'challenged' by the benchmarks showing improvement. Those not 'challenged' by the benchmarks deteriorated in performance, with those starting well below the benchmarks showing the greatest increase in radiation use. Conversely, those blinded to their radiation use (Group 1) showed general improvement in radiation use throughout the study, with those initially performing close to the benchmarks showing the greatest improvement. This study shows that use of non-challenging benchmarks in case-based radiation risk feedback does not promote a reduction in radiation use; indeed, it may contribute to increased doses. Paradoxically, cardiologists who are aware of performance monitoring but blinded to individual case data appear to maintain, if not reduce, their radiation use.

  6. A quantitative approach to evaluate image quality of whole slide imaging scanners

    PubMed Central

    Shrestha, Prarthana; Kneepkens, R.; Vrijnsen, J.; Vossen, D.; Abels, E.; Hulsken, B.

    2016-01-01

    Context: The quality of images produced by whole slide imaging (WSI) scanners has a direct influence on readers' performance and the reliability of the clinical diagnosis. Therefore, WSI scanners should produce not only high-quality but also consistent-quality images. Aim: We aim to evaluate the reproducibility of WSI scanners based on the quality of images produced over time and across multiple scanners. The evaluation is independent of the content or context of the test specimen. Methods: The ultimate judge of image quality is a pathologist; however, subjective evaluations are heavily influenced by the complexity of a case, and subtle variations introduced by a scanner can be easily overlooked. Therefore, we employed a quantitative image quality assessment method based on clinically relevant parameters, such as sharpness and brightness, acquired in a survey of pathologists. The acceptable level of quality per parameter was determined in a subjective study. The evaluation of scanner reproducibility was conducted with Philips Ultra-Fast Scanners. A set of 36 HercepTest™ slides was used in three sub-studies addressing variations due to systems and time, producing 8640 test images for evaluation. Results: The results showed that the majority of images in all the sub-studies are within the acceptable quality level; however, some scanners produce higher quality images more often than others. The results are independent of case types, and they match our perception of quality. Conclusion: The quantitative image quality assessment method was successfully applied to the HercepTest™ slides to evaluate WSI scanner reproducibility. The proposed method is generic and applicable to other types of slide stains and scanners. PMID:28197359
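The abstract does not publish the exact sharpness and brightness formulations, so the sketch below uses common stand-ins (Laplacian variance for sharpness, mean intensity for brightness) to illustrate the idea of parameter-wise quantitative quality scoring:

```python
import numpy as np

def sharpness(img):
    """Variance of a discrete Laplacian: higher for crisper images.
    A common focus metric, assumed here as a stand-in for the paper's measure."""
    lap = (-4 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return float(lap.var())

def brightness(img):
    """Mean intensity as a simple brightness score."""
    return float(img.mean())

rng = np.random.default_rng(1)
crisp = rng.random((64, 64))                 # high-frequency test image
blurred = (crisp
           + np.roll(crisp, 1, 0) + np.roll(crisp, -1, 0)
           + np.roll(crisp, 1, 1) + np.roll(crisp, -1, 1)) / 5.0
print(sharpness(crisp) > sharpness(blurred))  # True: blur lowers the score
```

In a reproducibility study, scores like these would be computed per scanner and per session, then compared against the acceptability thresholds derived from the pathologist survey.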

  7. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)

    SciTech Connect

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed, including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques.

  8. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    Abstract This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
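The reported sensitivity, specificity, PPV, NPV, and accuracy all derive from a 2x2 table of test calls against the painful-TMD diagnosis. A minimal sketch; the counts below are illustrative, not the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic performance measures from 2x2 confusion counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Illustrative counts only (the abstract reports rates, not the 2x2 table)
m = diagnostic_metrics(tp=90, fp=30, fn=10, tn=70)
print(m["sensitivity"], m["specificity"], m["accuracy"])  # 0.9 0.7 0.8
```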

  9. Quantitative performance of a quadrupole-orbitrap-MS in targeted LC-MS determinations of small molecules.

    PubMed

    Grund, Baptiste; Marvin, Laure; Rochat, Bertrand

    2016-05-30

    High-resolution mass spectrometry (HRMS) has traditionally been associated with qualitative and research analysis, and QQQ-MS with quantitative and routine analysis. This view is now challenged, and for this reason we have evaluated the quantitative LC-MS performance of a new high-resolution mass spectrometer (HRMS), a Q-orbitrap-MS, and compared the results with those obtained on a recent triple-quadrupole MS (QQQ-MS). High-resolution full-scan (HR-FS) and MS/MS acquisitions were tested with real plasma extracts or pure standards. Limits of detection, dynamic range, mass accuracy and false positive or false negative detections were determined or investigated with protease inhibitors, tyrosine kinase inhibitors, steroids and metanephrines. Our quantitative results show that today's available HRMS instruments are reliable and sensitive quantitative instruments, with performance comparable to that of QQQ-MS. Taking into account their versatility, user-friendliness and robustness, we believe that HRMS should be seen more and more as key instruments in quantitative LC-MS analyses. In this scenario, most targeted LC-HRMS analyses should be performed by HR-FS, recording virtually "all" ions. In addition to absolute quantifications, HR-FS will allow relative quantification of hundreds of metabolites in plasma, revealing an individual's metabolome and exposome. This phenotyping of known metabolites should promote HRMS in the clinical environment. A few other LC-HRMS analyses should be performed in single-ion-monitoring or MS/MS mode when increased sensitivity and/or detection selectivity is necessary.

  10. Conductor gestures influence evaluations of ensemble performance

    PubMed Central

    Morrison, Steven J.; Price, Harry E.; Smedley, Eric M.; Meals, Cory D.

    2014-01-01

    Previous research has found that listener evaluations of ensemble performances vary depending on the expressivity of the conductor’s gestures, even when performances are otherwise identical. It was the purpose of the present study to test whether this effect of visual information was evident in the evaluation of specific aspects of ensemble performance: articulation and dynamics. We constructed a set of 32 music performances that combined auditory and visual information and were designed to feature a high degree of contrast along one of two target characteristics: articulation and dynamics. We paired each of four music excerpts recorded by a chamber ensemble in both a high- and low-contrast condition with video of four conductors demonstrating high- and low-contrast gesture specifically appropriate to either articulation or dynamics. Using one of two equivalent test forms, college music majors and non-majors (N = 285) viewed sixteen 30 s performances and evaluated the quality of the ensemble’s articulation, dynamics, technique, and tempo along with overall expressivity. Results showed significantly higher evaluations for performances featuring high rather than low conducting expressivity regardless of the ensemble’s performance quality. Evaluations for both articulation and dynamics were strongly and positively correlated with evaluations of overall ensemble expressivity. PMID:25104944

  11. Promising quantitative nondestructive evaluation techniques for composite materials

    NASA Technical Reports Server (NTRS)

    Williams, J. H., Jr.; Lee, S. S.

    1985-01-01

    Some recent results in the area of the ultrasonic, acoustic emission, thermographic, and acousto-ultrasonic NDE of composites are reviewed. In particular, attention is given to the progress in the use of ultrasonic attenuation, acoustic emission (parameter) delay, liquid-crystal thermography, and the stress wave factor in structural integrity monitoring of composite materials. The importance of NDE flaw significance characterizations is emphasized since such characterizations can directly indicate the appropriate NDE technique sensitivity requirements. The role of the NDE of flawed composites with and without overt defects in establishing quantitative accept/reject criteria for structural integrity assessment is discussed.

  12. Quantitative Guidance for Stove Usage and Performance to Achieve Health and Environmental Targets

    PubMed Central

    Chiang, Ranyee A.

    2015-01-01

    Background Displacing the use of polluting and inefficient cookstoves in developing countries is necessary to achieve the potential health and environmental benefits sought through clean cooking solutions. Yet little quantitative context has been provided on how much displacement of traditional technologies is needed to achieve targets for household air pollutant concentrations or fuel savings. Objectives This paper provides instructive guidance on the usage of cooking technologies required to achieve health and environmental improvements. Methods We evaluated different scenarios of displacement of traditional stoves with use of higher performing technologies. The air quality and fuel consumption impacts were estimated for these scenarios using a single-zone box model of indoor air quality and ratios of thermal efficiency. Results Stove performance and usage should be considered together, as lower performing stoves can result in similar or greater benefits than a higher performing stove if the lower performing stove has considerably higher displacement of the baseline stove. Based on the indoor air quality model, there are multiple performance–usage scenarios for achieving modest indoor air quality improvements. To meet World Health Organization guidance levels, however, three-stone fire and basic charcoal stove usage must be nearly eliminated to achieve the particulate matter target (< 1–3 hr/week), and substantially limited to meet the carbon monoxide guideline (< 7–9 hr/week). Conclusions Moderate health gains may be achieved with various performance–usage scenarios. The greatest benefits are estimated to be achieved by near-complete displacement of traditional stoves with clean technologies, emphasizing the need to shift in the long term to near exclusive use of clean fuels and stoves. The performance–usage scenarios are also provided as a tool to guide technology selection and prioritize behavior change opportunities to maximize impact.
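The single-zone box model invoked above reduces, at steady state with complete mixing, to C = E / (V * ACH): concentration equals the emission rate divided by volume times air-exchange rate. A minimal sketch with illustrative parameter values that are not taken from the paper:

```python
def steady_state_concentration(emission_mg_h, volume_m3, ach):
    """Steady-state pollutant concentration (mg/m^3) in a single-zone,
    completely mixed box model: C = E / (V * ACH).

    emission_mg_h: source emission rate (mg/h)
    volume_m3: room volume (m^3)
    ach: air changes per hour (1/h)
    """
    return emission_mg_h / (volume_m3 * ach)

# A stove emitting 50 mg/h of PM2.5 into a 30 m^3 kitchen at 15 air changes/h
c = steady_state_concentration(50.0, 30.0, 15.0)
print(round(c * 1000, 1))  # 111.1 ug/m^3
```

Halving the time a polluting stove is used roughly halves its time-averaged emission rate E, which is how the paper's performance–usage scenarios translate stove displacement into concentration targets.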

  13. ATAMM enhancement and multiprocessor performance evaluation

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.; Som, Sukhamoy; Obando, Rodrigo; Malekpour, Mahyar R.; Jones, Robert L., III; Mandala, Brij Mohan V.

    1991-01-01

    ATAMM (Algorithm To Architecture Mapping Model) enhancement and multiprocessor performance evaluation is discussed. The following topics are included: the ATAMM model; ATAMM enhancement; ADM (Advanced Development Model) implementation of ATAMM; and ATAMM support tools.

  14. Performance Evaluation of Undulator Radiation at CEBAF

    SciTech Connect

    Chuyu Liu, Geoffrey Krafft, Guimei Wang

    2010-05-01

    The performance of undulator radiation (UR) at CEBAF with a 3.5 m helical undulator is evaluated and compared with APS undulator-A radiation in terms of brilliance, peak brilliance, spectral flux, flux density and intensity distribution.

  15. Quantitative analytical method to evaluate the metabolism of vitamin D.

    PubMed

    Mena-Bravo, A; Ferreiro-Vera, C; Priego-Capote, F; Maestro, M A; Mouriño, A; Quesada-Gómez, J M; Luque de Castro, M D

    2015-03-10

    A method for quantitative analysis of vitamin D (both D2 and D3) and its main metabolites - monohydroxylated vitamin D (25-hydroxyvitamin D2 and 25-hydroxyvitamin D3) and dihydroxylated metabolites (1,25-dihydroxyvitamin D2, 1,25-dihydroxyvitamin D3 and 24,25-dihydroxyvitamin D3) - in human serum is reported here. The method is based on direct analysis of serum by an automated platform involving on-line coupling of a solid-phase extraction workstation to a liquid chromatograph-tandem mass spectrometer. Detection of the seven analytes was carried out in the selected reaction monitoring (SRM) mode, and quantitative analysis was supported by the use of stable isotope-labeled internal standards (SIL-ISs). The detection limits were between 0.3 and 75 pg/mL for the target compounds, while precision (expressed as relative standard deviation) was below 13.0% for between-day variability. The method was externally validated according to the vitamin D External Quality Assurance Scheme (DEQAS) through the analysis of ten serum samples provided by this organization. The analytical features of the method support its applicability in nutritional and clinical studies targeted at elucidating the role of vitamin D metabolism.

  16. Quantitative evaluation of bioorthogonal chemistries for surface functionalization of nanoparticles.

    PubMed

    Feldborg, Lise N; Jølck, Rasmus I; Andresen, Thomas L

    2012-12-19

    We present here a highly efficient and chemoselective liposome functionalization method based on oxime bond formation between a hydroxylamine and an aldehyde-modified lipid component. We have conducted a systematic and quantitative comparison of this new approach with other state-of-the-art conjugation reactions in the field. Targeted liposomes that recognize overexpressed receptors or antigens on diseased cells have great potential in therapeutic and diagnostic applications. However, chemical modifications of nanoparticle surfaces by postfunctionalization approaches are less effective than in solution and often not high-yielding. In addition, the conjugation efficiency is often challenging to characterize and therefore not addressed in many reports. We present here an investigation of PEGylated liposomes functionalized with a neuroendocrine tumor targeting peptide (TATE), synthesized with a variety of functionalities that have been used for surface conjugation of nanoparticles. The reaction kinetics and overall yield were quantified by HPLC. Reactions were conducted in solution as well as by postfunctionalization of liposomes in order to study the effects of steric hindrance and possible affinity between the peptide and the liposome surface. These studies demonstrate the importance of choosing the correct chemistry in order to obtain a quantitative surface functionalization of liposomes.

  17. Quantitative analysis and chromatographic fingerprinting for the quality evaluation of Scutellaria baicalensis Georgi using capillary electrophoresis.

    PubMed

    Yu, Ke; Gong, Yifei; Lin, Zhongying; Cheng, Yiyu

    2007-01-17

    A method combining quantitative analysis and chromatographic fingerprinting for the quality evaluation of the Chinese herb Scutellaria baicalensis Georgi using capillary electrophoresis (CE) was developed. The separation was performed with a 50.0 cm (42.0 cm to the detector window) × 75 μm i.d. fused-silica capillary, and the CE fingerprint conditions were optimized using a combination of central composite design and multivariate analysis. The optimized buffer system, containing 15 mM borate, 40 mM phosphate, 15 mM SDS, 15% (v/v) acetonitrile and 7.5% (v/v) 2-propanol, was employed for the method development, and baseline separation was achieved within 15 min. The determination of the major active components (baicalin, baicalein and wogonin) was carried out using the optimized CE conditions. Good linear relationships were obtained over the investigated concentration ranges (R² values: 0.9997 for baicalin, 0.9992 for baicalein, and 0.9983 for wogonin). The average recoveries of these target components ranged between 96.1-105.6%, 98.6-105.2%, and 96.3-105.0%, respectively. CE fingerprints combined with the quantitative analysis can be used for the quality evaluation of S. baicalensis.

  18. Evaluation of high-performance computing software

    SciTech Connect

    Browne, S.; Dongarra, J.; Rowan, T.

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations difficult to find elsewhere.

  19. Improvement of Automotive Part Supplier Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Kongmunee, Chalermkwan; Chutima, Parames

    2016-05-01

    This research investigates the problem of part supplier performance evaluation in a major Japanese automotive plant in Thailand. The plant's current evaluation scheme is based on the experience and self-opinion of the evaluators. As a result, many poorly performing suppliers are still considered good suppliers and are allowed to supply parts to the plant without further improvement obligations. To alleviate this problem, brainstorming sessions among stakeholders and evaluators were formally conducted, yielding an appropriate set of evaluation criteria and sub-criteria. The analytic hierarchy process was then used to find suitable weights for each criterion and sub-criterion. The results show that the newly developed evaluation method is significantly better than the previous one in segregating good from poor suppliers.
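
    The weighting step of the analytic hierarchy process used above can be sketched as follows. This is a generic AHP illustration, not the study's actual criteria or judgments: the pairwise-comparison matrix and criteria names are hypothetical.

```python
# Sketch of AHP criterion weighting: approximate the principal eigenvector of a
# pairwise-comparison matrix by row geometric means, then check consistency.

def ahp_weights(pairwise):
    """Row-geometric-mean approximation, normalized to sum to 1."""
    n = len(pairwise)
    gms = []
    for row in pairwise:
        prod = 1.0
        for v in row:
            prod *= v
        gms.append(prod ** (1.0 / n))
    total = sum(gms)
    return [g / total for g in gms]

def consistency_ratio(pairwise, weights):
    """CR = CI / RI with CI = (lambda_max - n) / (n - 1).
    RI values are Saaty's random indices for n = 1..5; CR < 0.1 is acceptable."""
    n = len(pairwise)
    lam = sum(
        sum(pairwise[i][j] * weights[j] for j in range(n)) / weights[i]
        for i in range(n)
    ) / n
    ci = (lam - n) / (n - 1)
    ri = [0.0, 0.0, 0.58, 0.90, 1.12][n - 1]
    return ci / ri

# Hypothetical criteria: quality vs. delivery vs. cost.
A = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
w = ahp_weights(A)
print([round(x, 3) for x in w], round(consistency_ratio(A, w), 3))
```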

  20. Building Leadership Talent through Performance Evaluation

    ERIC Educational Resources Information Center

    Clifford, Matthew

    2015-01-01

    Most states and districts scramble to provide professional development to support principals, but "principal evaluation" is often lost amid competing priorities. Evaluation is an important method for supporting principal growth, communicating performance expectations to principals, and improving leadership practice. It provides leaders…

  1. Automatic Singing Performance Evaluation for Untrained Singers

    NASA Astrophysics Data System (ADS)

    Cao, Chuan; Li, Ming; Wu, Xiao; Suo, Hongbin; Liu, Jian; Yan, Yonghong

    In this letter, we present an automatic approach to objective singing performance evaluation for untrained singers that relates acoustic measurements to perceptual ratings of singing voice quality. Several acoustic parameters and their combined features are investigated to find objective correspondences to the perceptual evaluation criteria. Experimental results show a relatively strong correlation between perceptual ratings and the combined features, and the reliability of the proposed evaluation system was found to be comparable to that of human judges.

  2. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  3. Using quantitative interference phase microscopy for sperm acrosome evaluation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Balberg, Michal; Kalinowski, Ksawery; Levi, Mattan; Shaked, Natan T.

    2016-03-01

    We demonstrate quantitative assessment of sperm cell morphology, primarily acrosomal volume, using quantitative interference phase microscopy (IPM). Normally, the area of the acrosome is assessed using dyes that stain the acrosomal part of the cell. We imaged fixed individual sperm cells using IPM. The sample was then stained and the same cells were imaged using bright field microscopy (BFM). We identified the acrosome in the stained BFM image and used it to define the corresponding area in the IPM image and to determine a quantitative threshold for evaluating the volume of the acrosome.

  4. Principal Performance Areas and Principal Evaluation.

    ERIC Educational Resources Information Center

    Fletcher, Thomas E.; McInerney, William D.

    1995-01-01

    Summarizes a study that surveyed Indiana superintendents and examined their principal-evaluation instruments. Superintendents were asked which of 21 performance domains were most important and whether these were currently being assessed. Respondents generally agreed that all 21 performance domains identified by the National Policy Board for…

  5. A Quantitative Approach to Evaluating Training Curriculum Content Sampling Adequacy.

    ERIC Educational Resources Information Center

    Bownas, David A.; And Others

    1985-01-01

    Developed and illustrated a technique depicting the fit between training curriculum content and job performance requirements for three Coast Guard schools. Generated a listing of tasks which receive undue emphasis in training, tasks not being taught, and tasks instructors intend to train, but which course graduates are unable to perform.…

  6. Evaluation of Quantitative Environmental Stress Screening (ESS) Methods. Volume 1

    DTIC Science & Technology

    1991-11-01

    The objective of this study was to evaluate the Environmental Stress Screening (ESS) techniques contained in DOD-HDBK-344 by applying the methodology to several electronic products during actual factory production. Validation of the techniques, the development of improved, simplified, and automated procedures, and subsequent revisions to the Handbook were the objectives of the evaluation. The Rome Laboratory has developed techniques which

  7. Use of the Behaviorally Anchored Rating Scale in Evaluating Teacher Performance.

    ERIC Educational Resources Information Center

    Beebe, Robert J.

    The behaviorally anchored rating scale (BARS), a new quantitative method of employee performance evaluation, is advocated for teacher evaluation. Development of a BARS generally consists of five steps: a representative sample of potential raters generates the scales; the group identifies the broad qualities to be evaluated; the group formulates…

  8. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  9. Supplier Performance Evaluation and Rating System (SPEARS)

    SciTech Connect

    Oged, M.; Warner, D.; Gurbuz, E.

    1993-03-01

    The SSCL Magnet Quality Assurance Department has implemented a Supplier Performance Evaluation and Rating System (SPEARS) to assess supplier performance throughout the development and production stages of the SSCL program. The main objectives of SPEARS are to promote teamwork and recognize performance. This paper examines the current implementation of SPEARS. MSD QA supports the development and production of SSC superconducting magnets while implementing the requirements of DOE Order 5700.6C. The MSD QA program is based on the concept of continuous improvement in quality and productivity. The QA program requires that procurement of items and services be controlled to assure conformance to specification. SPEARS has been implemented to meet DOE requirements and to enhance overall confidence in supplier performance. Key elements of SPEARS include supplier evaluation and selection as well as evaluation of furnished quality through source inspection, audit, and receipt inspection. These elements are described in this paper.

  10. Quantitative evaluation of noise reduction strategies in dual-energy imaging.

    PubMed

    Warp, Richard J; Dobbins, James T

    2003-02-01

    In this paper we describe a quantitative evaluation of the performance of three dual-energy noise reduction algorithms: Kalender's correlated noise reduction (KCNR), noise clipping (NOC), and edge-predictive adaptive smoothing (EPAS). These algorithms were compared to a simple smoothing filter approach, using the variance and noise power spectrum measurements of the residual noise in dual-energy images acquired with an a-Si TFT flat-panel x-ray detector. An estimate of the true noise was made through a new method with subpixel accuracy by subtracting an individual image from an ensemble average image. The results indicate that in the lung regions of the tissue image, all three algorithms reduced the noise by similar percentages at high spatial frequencies (KCNR=88%, NOC=88%, EPAS=84%, NOC/KCNR=88%) and somewhat less at low spatial frequencies (KCNR=45%, NOC=54%, EPAS=52%, NOC/KCNR=55%). At low frequencies, the presence of edge artifacts from KCNR made the performance worse, thus NOC or NOC combined with KCNR performed best. At high frequencies, KCNR performed best in the bone image, yet NOC performed best in the tissue image. Noise reduction strategies in dual-energy imaging can be effective and should focus on blending various algorithms depending on anatomical locations.
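
    The noise-estimation idea described above, subtracting an individual image from an ensemble-average image, can be sketched on simulated pixel data. The key detail is the bias correction: for N independent frames, Var(x_i − mean) = σ²(N − 1)/N. The frame count, pixel count, and noise level below are hypothetical.

```python
# Estimate residual noise by subtracting one frame from the ensemble average,
# then undo the (N-1)/N bias so the estimate matches the true per-frame variance.
import random

random.seed(0)
N_FRAMES, N_PIX = 16, 5000
SIGMA = 4.0  # true per-pixel noise standard deviation (simulated)
frames = [[100.0 + random.gauss(0.0, SIGMA) for _ in range(N_PIX)]
          for _ in range(N_FRAMES)]

# Ensemble-average image (per-pixel mean over frames).
mean_img = [sum(f[p] for f in frames) / N_FRAMES for p in range(N_PIX)]

# Residual of a single frame against the ensemble mean.
residual = [frames[0][p] - mean_img[p] for p in range(N_PIX)]
raw_var = sum(r * r for r in residual) / N_PIX
corrected_var = raw_var * N_FRAMES / (N_FRAMES - 1)  # bias correction

print(round(raw_var, 2), round(corrected_var, 2))
```

    The corrected estimate recovers the true variance (here σ² = 16) up to statistical fluctuation, while the raw residual variance is systematically low.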

  11. Quantitative PCR is a valuable tool to monitor performance of DNA-encoded chemical library selections.

    PubMed

    Li, Yizhou; Zimmermann, Gunther; Scheuermann, Jörg; Neri, Dario

    2017-02-21

    Phage-display libraries and DNA-encoded chemical libraries (DECL) represent useful tools for the isolation of specific binding molecules out of large combinatorial sets of compounds. In both methods, specific binders are recovered at the end of affinity capture procedures, using target proteins of interest immobilized on a solid support. However, while the efficiency of phage-display selections is routinely quantified by counting the phage titer before and after the affinity capture step, no similar quantification procedures have been reported for the characterization of DNA-encoded chemical library selections. In this article, we describe the potential and limitations of quantitative PCR (qPCR) methods for the evaluation of selection efficiency, using a combinatorial chemical library with more than 35 million compounds. In the experimental conditions chosen for the selections, a quantification of DNA input/recovery over five orders of magnitude could be performed, revealing a successful enrichment of abundant binders, which could be confirmed by DNA sequencing. qPCR provides rapid information about the performance of selections, thus facilitating the optimization of experimental conditions.
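
    The quantification step described above can be sketched with a standard-curve inversion: a calibration line Ct = slope·log10(N0) + intercept is inverted to estimate the DNA input and recovery. The curve parameters and Ct values below are hypothetical, for illustration only.

```python
# Sketch of qPCR-based quantification of DNA input vs. recovery in a selection.

def molecules_from_ct(ct, slope=-3.32, intercept=40.0):
    """Invert the standard curve Ct = slope * log10(N0) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def recovery(ct_input, ct_recovered):
    """Fraction of input molecules recovered after affinity capture."""
    return molecules_from_ct(ct_recovered) / molecules_from_ct(ct_input)

# Hypothetical Ct values: input library vs. DNA eluted after the capture step.
rec = recovery(ct_input=12.0, ct_recovered=25.3)
print(f"recovery fraction ~ {rec:.2e}")
```

    Because Ct is logarithmic in template amount, a spread of a few tens of cycles covers the several orders of magnitude of input/recovery mentioned in the abstract.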

  12. Quantitative pharmaco-EEG and performance after administration of brotizolam to healthy volunteers

    PubMed Central

    Saletu, B.; Grünberger, J.; Linzmayer, L.

    1983-01-01

    1 The activity of brotizolam (0.1, 0.3 and 0.5 mg) was studied in normal subjects using quantitative pharmaco-EEG, psychometric and clinical evaluation. 2 Power spectral density analysis showed no changes after placebo, while brotizolam increased beta-activity, decreased alpha-activity and increased the average frequency (anxiolytic pharmaco-EEG profile). In addition, 0.3 and 0.5 mg brotizolam augmented delta-activity indicating hypnotic activity. 3 The highest dose (0.5 mg) of brotizolam decreased attention, concentration, psychomotor performance and affectivity, and increased reaction time. The lower doses of brotizolam also caused a decrease in attention and concentration, but tended to improve psychomotor performance, shorten reaction time, and did not influence mood or affectivity. 4 Brotizolam (0.1 mg) is the minimal effective psychoactive dose with a tranquillizing effect, while 0.5 mg and to some extent 0.3 mg induce a sedative effect and may be regarded as hypnotic doses. PMID:6661379

  13. Performance Evaluation Model for Application Layer Firewalls

    PubMed Central

    Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall. PMID:27893803

  14. Performance Evaluation Model for Application Layer Firewalls.

    PubMed

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
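
    The Erlangian queuing analysis referenced above can be sketched with the classic Erlang-C (M/M/c) formulas, which relate the number of service desks to waiting probability and delay. This is a generic illustration, not the paper's model; the arrival and service rates are hypothetical.

```python
# Erlang-C sketch: probability of queuing and mean sojourn time for an M/M/c system.
from math import factorial

def erlang_c(c, lam, mu):
    """Probability an arriving request must wait, with offered load a = lam/mu."""
    a = lam / mu
    rho = a / c
    if rho >= 1.0:
        return 1.0  # unstable regime: every request queues
    top = (a ** c) / (factorial(c) * (1.0 - rho))
    bottom = sum(a ** k / factorial(k) for k in range(c)) + top
    return top / bottom

def mean_delay(c, lam, mu):
    """Mean time in system: queue wait Wq plus service time 1/mu."""
    pw = erlang_c(c, lam, mu)
    wq = pw / (c * mu - lam)
    return wq + 1.0 / mu

# Compare allocating c = 4..6 service desks for lam = 30 req/s, mu = 10 req/s.
for c in range(4, 7):
    print(c, round(erlang_c(c, 30.0, 10.0), 3), round(mean_delay(c, 30.0, 10.0), 4))
```

    Sweeping c like this is the shape of the resource-allocation search the paper describes: pick the smallest allocation whose delay and loss targets are met.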

  15. Evaluation of various real-time reverse transcription quantitative PCR assays for norovirus detection.

    PubMed

    Yoo, Ju Eun; Lee, Cheonghoon; Park, SungJun; Ko, GwangPyo

    2017-02-01

    Human noroviruses are widespread and contagious viruses causing nonbacterial gastroenteritis. Real-time reverse transcription quantitative PCR (real-time RT-qPCR) is currently the gold standard for sensitive and accurate detection for these pathogens and serves as a critical tool in outbreak prevention and control. Different surveillance teams, however, may use different assays and variability in specimen conditions may lead to disagreement in results. Furthermore, the norovirus genome is highly variable and continuously evolving. These issues necessitate the re-examination of the real-time RT-qPCR's robustness in the context of accurate detection as well as the investigation of practical strategies to enhance assay performance. Four widely referenced real-time RT-qPCR assays (Assay A-D) were simultaneously performed to evaluate characteristics such as PCR efficiency, detection limit, as well as sensitivity and specificity with RT-PCR, and to assess the most accurate method for detecting norovirus genogroups I and II. Overall, Assay D was evaluated to be the most precise and accurate assay in this study. A Zen internal quencher, which decreases nonspecific fluorescence during the PCR reaction, was added to Assay D's probe which further improved assay performance. This study compared several detection assays for noroviruses and an improvement strategy based on such comparisons provided useful characterizations of a highly optimized real-time RT-qPCR assay for norovirus detection.
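
    The PCR-efficiency comparison mentioned above is conventionally computed from the slope of a dilution-series standard curve via E = 10^(−1/slope) − 1 (a perfect assay has slope ≈ −3.32, E = 1.0). The Ct values below are hypothetical.

```python
# Sketch of computing qPCR amplification efficiency from a standard curve.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def pcr_efficiency(slope):
    """E = 10**(-1/slope) - 1."""
    return 10 ** (-1.0 / slope) - 1.0

log10_copies = [3.0, 4.0, 5.0, 6.0, 7.0]
ct = [33.1, 29.8, 26.5, 23.2, 19.9]  # hypothetical 10-fold dilution series
slope, intercept = linear_fit(log10_copies, ct)
print(round(slope, 2), round(pcr_efficiency(slope), 3))
```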

  16. A quantitative evaluation of confidence measures for stereo vision.

    PubMed

    Hu, Xiaoyan; Mordohai, Philippos

    2012-11-01

    We present an extensive evaluation of 17 confidence measures for stereo matching that compares the most widely used measures as well as several novel techniques proposed here. We begin by categorizing these methods according to which aspects of stereo cost estimation they take into account and then assess their strengths and weaknesses. The evaluation is conducted using a winner-take-all framework on binocular and multibaseline datasets with ground truth. It measures the capability of each confidence method to rank depth estimates according to their likelihood for being correct, to detect occluded pixels, and to generate low-error depth maps by selecting among multiple hypotheses for each pixel. Our work was motivated by the observation that such an evaluation is missing from the rapidly maturing stereo literature and that our findings would be helpful to researchers in binocular and multiview stereo.
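
    One simple member of the family of confidence measures evaluated in such studies is the peak ratio, which compares the best matching cost to the second-best across the disparity range. The cost values below are synthetic; this is an illustrative sketch, not one of the paper's 17 measures specifically.

```python
# Winner-take-all disparity with a peak-ratio confidence (lower cost is better).

def wta_disparity_and_confidence(costs):
    """costs: matching cost per candidate disparity for one pixel."""
    order = sorted(range(len(costs)), key=lambda d: costs[d])
    best, second = order[0], order[1]
    eps = 1e-9
    confidence = costs[second] / (costs[best] + eps)  # >> 1 means unambiguous
    return best, confidence

ambiguous = [5.0, 5.1, 9.0, 12.0]   # two near-equal minima: low confidence
distinct = [1.0, 9.0, 9.5, 12.0]    # one clear minimum: high confidence
print(wta_disparity_and_confidence(ambiguous))
print(wta_disparity_and_confidence(distinct))
```

    Ranking pixels by such a confidence score is exactly what the evaluation framework measures: a good measure puts likely-correct depth estimates ahead of ambiguous ones.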

  17. Digital holographic microscopy for quantitative cell dynamic evaluation during laser microsurgery

    PubMed Central

    Yu, Lingfeng; Mohanty, Samarendra; Zhang, Jun; Genc, Suzanne; Kim, Myung K.; Berns, Michael W.; Chen, Zhongping

    2010-01-01

    Digital holographic microscopy allows determination of dynamic changes in the optical thickness profile of a transparent object with subwavelength accuracy. Here, we report a quantitative phase laser microsurgery system for evaluation of cellular/ sub-cellular dynamic changes during laser micro-dissection. The proposed method takes advantage of the precise optical manipulation by the laser microbeam and quantitative phase imaging by digital holographic microscopy with high spatial and temporal resolution. This system will permit quantitative evaluation of the damage and/or the repair of the cell or cell organelles in real time. PMID:19582118

  18. Preprocessing of Edge of Light images: towards a quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Forsyth, David S.; Marincak, Anton

    2003-08-01

    A computer vision inspection system, named Edge of Light™ (EOL), was invented and developed at the Institute for Aerospace Research of the National Research Council Canada. One application of interest is the detection and quantitative measurement of "pillowing" caused by corrosion in the faying surfaces of aircraft fuselage joints. To quantify the hidden corrosion, one approach is to relate the average corrosion of a region to the peak-to-peak amplitude between two diagonally adjacent rivet centers. This raises the requirement for automatically locating the rivet centers. The first step to achieve this is rivet edge detection. In this study, gradient-based edge detection, local energy based feature extraction, and an adaptive threshold method were employed to identify the edges of rivets, which facilitates the first step in the EOL quantification procedure. Furthermore, the brightness profile is processed by the derivative operation, which locates the pillowing along the scanning direction. The derivative curves provide an estimate of the inspected surface.
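
    The gradient-based step described above can be sketched in one dimension: differentiate a brightness profile and threshold the gradient magnitude to localize edges. The profile below is synthetic, not EOL data.

```python
# 1-D gradient-based edge localization on a brightness profile.

def gradient(profile):
    """Central-difference derivative of a 1-D intensity profile."""
    g = [0.0] * len(profile)
    for i in range(1, len(profile) - 1):
        g[i] = (profile[i + 1] - profile[i - 1]) / 2.0
    return g

def edge_positions(profile, threshold):
    """Indices where the gradient magnitude exceeds the threshold."""
    g = gradient(profile)
    return [i for i, v in enumerate(g) if abs(v) >= threshold]

# Synthetic profile: dark background with one bright band (e.g., crossing a rivet).
profile = [10.0] * 20 + [200.0] * 10 + [10.0] * 20
print(edge_positions(profile, threshold=50.0))
```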

  19. Performance Evaluation of Nano-JASMINE

    NASA Astrophysics Data System (ADS)

    Hatsutori, Y.; Kobayashi, Y.; Gouda, N.; Yano, T.; Murooka, J.; Niwa, Y.; Yamada, Y.

    2011-02-01

    We report the results of a performance evaluation of the first Japanese astrometry satellite, Nano-JASMINE. It is a very small satellite, weighing only 35 kg, and aims to carry out astrometry measurements of nearby bright stars (z ≤ 7.5 mag) with an accuracy of 3 milli-arcseconds. Nano-JASMINE will be launched by a Cyclone-4 rocket in August 2011 from Brazil. Its performance is currently being evaluated through a series of performance tests and numerical analyses. As a result, the engineering model (EM) of the telescope was measured to achieve diffraction-limited performance, confirming that it performs well enough for scientific astrometry.

  20. Evaluating student performance in clinical dietetics.

    PubMed

    Novascone, M A

    1985-06-01

    The focus of this study was on the development and field-testing of a set of behaviorally anchored rating scales for evaluating the clinical performance of dietetic students. The scales emphasized the application of skills and knowledge. A variation of the Smith-Kendall technique was used to develop the scales. The 42 participants involved in instrument development included dietetic students, didactic and clinical instructors, and dietetic practitioners. The completed instrument contained 8 dimension statements and 70 behavioral anchors. The instrument was field-tested in 16 clinical rotations within 8 dietetic education programs. Evaluators not only rated student performance but also critiqued the format and content of the scales. The mid-to-upper portions of each scale were used most frequently, and little score variation within or across programs was noted. The scales were deemed appropriate for formative evaluation; however, some evaluators who had to grade students' performance expressed a desire for performance standards defined in terms of grades. Because the process used to develop the instrument facilitated the articulation of performance criteria, it is recommended as a practical approach to setting performance standards.

  1. Quantitative Evaluation of 3 DBMS: ORACLE, SEED AND INGRES

    NASA Technical Reports Server (NTRS)

    Sylto, R.

    1984-01-01

    Characteristics required for NASA scientific data base management applications are listed, as well as performance testing objectives. Results obtained for the ORACLE, SEED, and INGRES packages are presented in charts. It is concluded that vendor packages can manage 130 megabytes of data at acceptable load and query rates. Performance tests varying data base designs and various data base management system parameters are valuable to applications for choosing packages and critical to designing effective data bases. An application's productivity increases with the use of a data base management system because of enhanced capabilities such as a screen formatter, a report writer, and a data dictionary.

  2. Quantitative Evaluation of Scintillation Camera Imaging Characteristics of Isotopes Used in Liver Radioembolization

    PubMed Central

    Elschot, Mattijs; Nijsen, Johannes Franciscus Wilhelmus; Dam, Alida Johanna; de Jong, Hugo Wilhelmus Antonius Maria

    2011-01-01

    Background Scintillation camera imaging is used for treatment planning and post-treatment dosimetry in liver radioembolization (RE). In yttrium-90 (90Y) RE, scintigraphic images of technetium-99m (99mTc) are used for treatment planning, while 90Y Bremsstrahlung images are used for post-treatment dosimetry. In holmium-166 (166Ho) RE, scintigraphic images of 166Ho can be used for both treatment planning and post-treatment dosimetry. The aim of this study is to quantitatively evaluate and compare the imaging characteristics of these three isotopes, in order that imaging protocols can be optimized and RE studies with varying isotopes can be compared. Methodology/Principal Findings Phantom experiments were performed in line with NEMA guidelines to assess the spatial resolution, sensitivity, count rate linearity, and contrast recovery of 99mTc, 90Y and 166Ho. In addition, Monte Carlo simulations were performed to obtain detailed information about the history of detected photons. The results showed that the use of a broad energy window and the high-energy collimator gave optimal combination of sensitivity, spatial resolution, and primary photon fraction for 90Y Bremsstrahlung imaging, although differences with the medium-energy collimator were small. For 166Ho, the high-energy collimator also slightly outperformed the medium-energy collimator. In comparison with 99mTc, the image quality of both 90Y and 166Ho is degraded by a lower spatial resolution, a lower sensitivity, and larger scatter and collimator penetration fractions. Conclusions/Significance The quantitative evaluation of the scintillation camera characteristics presented in this study helps to optimize acquisition parameters and supports future analysis of clinical comparisons between RE studies. PMID:22073149

  3. Quantitative Evaluation of Papilledema from Stereoscopic Color Fundus Photographs

    PubMed Central

    Tang, Li; Kardon, Randy H.; Wang, Jui-Kai; Garvin, Mona K.; Lee, Kyungmoo; Abràmoff, Michael D.

    2012-01-01

    Purpose. To derive a computerized measurement of optic disc volume from digital stereoscopic fundus photographs for the purpose of diagnosing and managing papilledema. Methods. Twenty-nine pairs of stereoscopic fundus photographs and optic nerve head (ONH) centered spectral domain optical coherence tomography (SD-OCT) scans were obtained at the same visit in 15 patients with papilledema. Some patients were imaged at multiple visits in order to assess their changes. Three-dimensional shape of the ONH was estimated from stereo fundus photographs using an automated multi-scale stereo correspondence algorithm. We assessed the correlation of the stereo volume measurements with the SD-OCT volume measurements quantitatively, in terms of volume of retinal surface elevation above a reference plane and also to expert grading of papilledema from digital fundus photographs using the Frisén grading scale. Results. The volumetric measurements of retinal surface elevation estimated from stereo fundus photographs and OCT scans were positively correlated (correlation coefficient r2 = 0.60; P < 0.001) and were positively correlated with Frisén grade (Spearman correlation coefficient r = 0.59; P < 0.001). Conclusions. Retinal surface elevation among papilledema patients obtained from stereo fundus photographs compares favorably with that from OCT scans and with expert grading of papilledema severity. Stereoscopic color imaging of the ONH combined with a method of automated shape reconstruction is a low-cost alternative to SD-OCT scans that has potential for a more cost-effective diagnosis and management of papilledema in a telemedical setting. An automated three-dimensional image analysis method was validated that quantifies the retinal surface topography with an imaging modality that has lacked prior objective assessment. PMID:22661468

  4. Quantitative evaluation of the major determinants of human gait.

    PubMed

    Lin, Yi-Chung; Gfoehler, Margit; Pandy, Marcus G

    2014-04-11

Accurate knowledge of the isolated contributions of joint movements to the three-dimensional displacement of the center of mass (COM) is fundamental for understanding the kinematics of normal walking and for improving the treatment of gait disabilities. Saunders et al. (1953) identified six kinematic mechanisms to explain the efficient progression of the whole-body COM in the sagittal, transverse, and coronal planes. These mechanisms, referred to as the major determinants of gait, were pelvic rotation, pelvic list, stance knee flexion, foot and knee mechanisms, and hip adduction. The aim of the present study was to quantitatively assess the contribution of each major gait determinant to the anteroposterior, vertical, and mediolateral displacements of the COM over one gait cycle. The contribution of each gait determinant was found by applying the concept of an 'influence coefficient', wherein the partial derivative of the COM displacement with respect to a prescribed determinant was calculated. The analysis was based on three-dimensional measurements of joint angular displacements obtained from 23 healthy young adults walking at slow, normal, and fast speeds. We found that hip flexion, stance knee flexion, and ankle-foot interaction (comprising ankle plantarflexion, toe flexion, and displacement of the center of pressure) are the major determinants of the displacements of the COM in the sagittal plane, while hip adduction and pelvic list contribute most significantly to the mediolateral displacement of the COM in the coronal plane. Pelvic rotation and pelvic list contribute little to the vertical displacement of the COM at all walking speeds. Pelvic tilt, hip rotation, subtalar inversion, and back extension, abduction and rotation make negligible contributions to the displacements of the COM in all three anatomical planes.
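The influence-coefficient idea, the partial derivative of COM displacement with respect to one prescribed determinant, can be illustrated with a central-difference derivative on a toy planar model. The two-segment geometry and segment lengths below are assumptions for illustration only, not the study's full three-dimensional kinematic chain:

```python
import numpy as np

# Toy planar model: COM height above the ankle as a function of stance-knee
# flexion and hip flexion. Segment lengths (m) are illustrative assumptions.
THIGH, SHANK = 0.45, 0.43

def com_height(knee_flex, hip_flex):
    """Vertical position of a simplified COM for given joint angles (rad)."""
    return SHANK * np.cos(knee_flex) + THIGH * np.cos(hip_flex)

def influence_coefficient(f, angle, other, h=1e-5):
    """Central-difference estimate of the partial derivative of a COM
    coordinate with respect to one prescribed gait determinant."""
    return (f(angle + h, other) - f(angle - h, other)) / (2 * h)

# Sensitivity of COM height to stance-knee flexion near mid-stance.
dz_dknee = influence_coefficient(com_height, np.deg2rad(15.0), np.deg2rad(20.0))
```

For the real analysis the derivative is taken through the measured three-dimensional joint kinematics, but the numerical recipe is the same.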

  5. Skin moisturization by hydrogenated polyisobutene--quantitative and visual evaluation.

    PubMed

    Dayan, Nava; Sivalenka, Rajarajeswari; Chase, John

    2009-01-01

Hydrogenated polyisobutene (HP) is used in topically applied cosmetic/personal care formulations as an emollient that leaves a pleasing skin feel when applied and rubbed in. This effect, although distinguishable to the user, is difficult to define and quantify. Recognizing that some of the physical properties of HP, such as film formation and wear resistance, may contribute to skin moisturization, we designed a short-term pilot study to follow changes in skin moisturization. Incorporating HP at 8% into an o/w emulsion increased viscosity and reduced emulsion droplet size compared with the emollient ester CCT (capric/caprylic triglyceride) or a control formulation. Quantitative data indicate that application of the o/w emulsion formulation containing either HP or CCT significantly elevated skin moisture content and thus reduced transepidermal water loss (TEWL) by a maximum of approximately 33% relative to the control formulation within 3 h, and maintained this effect for up to 6 h. Visual observation of skin treated with the HP-containing formulation showed fine texture and clear contrast compared with the control or the CCT formulation, confirming this effect. As a result of increased hydration, skin conductivity, measured in terms of corneometer values, was also elevated significantly, by about tenfold, as early as 20 min after HP or CCT application, and was maintained throughout the test period. Throughout the test period the HP formulation was 5-10% more effective than the CCT formulation, both in reducing TEWL and in increasing skin conductivity. Thus, compared to the emollient ester (CCT), HP showed a unique capability for a long-lasting effect in retaining moisture and improving skin texture.

  6. Quantitative evaluation of material degradation by Barkhausen noise method

    SciTech Connect

    Yamaguchi, Atsunori; Maeda, Noriyoshi; Sugibayashi, Takuya

    1995-12-01

Evaluating the remaining life of nuclear power plants becomes inevitable as plant operating periods are extended. This paper applied a magnetic method using Barkhausen noise (BHN) to detect degradation caused by fatigue and thermal aging. Low alloy steel (SA 508 cl.2) was fatigued at strain amplitudes of ±1% and ±0.4%, and duplex stainless steel (SCS14A) was heated at 400°C for a long period (thermal aging). For the material degraded by thermal aging, BHN was measured, and a good correlation between magnetic properties and the absorption energy of the material was obtained. For the fatigued material, BHN was measured at each predetermined cycle, the effect of the stress or strain present in the material at the time of measurement was evaluated, and a good correlation between BHN and fatigue damage ratio was obtained.

  7. Quantitative vertebral compression fracture evaluation using a height compass

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Burns, Joseph E.; Wiese, Tatjana; Summers, Ronald M.

    2012-03-01

Vertebral compression fractures can be caused by even minor trauma in patients with pathological conditions such as osteoporosis, and they vary greatly in vertebral body location and compression geometry. The location and morphology of the compression injury can guide decision making for treatment modality (vertebroplasty versus surgical fixation), and can be important for pre-surgical planning. We propose a height compass to evaluate the axial-plane spatial distribution of compression injury (anterior, posterior, lateral, and central), and distinguish it from physiologic height variations of normal vertebrae. The method includes four steps: spine segmentation and partition, endplate detection, height compass computation, and compression fracture evaluation. A height compass is computed for each vertebra, where the vertebral body is partitioned in the axial plane into 17 cells oriented about concentric rings. In the compass structure, a crown-like geometry is produced by three concentric rings which are divided into 8 equal-length arcs by rays subtended by 8 common central angles. The radius of each ring increases multiplicatively, with a resultant structure of a central node and two concentric surrounding bands of cells, each divided into octants. The height value for each octant is calculated and plotted against octants in neighboring vertebrae. The height compass gives an intuitive display of the height distribution and can be used to easily identify fracture regions. Our technique was evaluated on 8 thoraco-abdominal CT scans of patients with reported compression fractures and showed statistically significant differences in height value at the sites of the fractures.
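The 17-cell partition described above (a central node plus two concentric bands of eight octants) can be sketched as a lookup from an axial-plane point to a cell index. The ring radius, growth factor, and cell numbering below are illustrative assumptions:

```python
import math

def compass_cell(x, y, r0=5.0, growth=2.0):
    """Map an axial-plane point (relative to the vertebral centre, mm) to one
    of 17 height-compass cells: cell 0 is the central node, cells 1-8 the
    inner band of octants, cells 9-16 the outer band. r0 and growth are
    illustrative; the paper's ring radii increase multiplicatively."""
    r = math.hypot(x, y)
    if r < r0:
        return 0
    octant = int((math.atan2(y, x) % (2 * math.pi)) / (math.pi / 4))
    band = 0 if r < r0 * growth else 1
    return 1 + band * 8 + octant

# A few sample points scattered over the vertebral cross-section.
cells = {compass_cell(x, y) for x, y in [(0, 0), (7, 0), (-7, 0), (12, 3)]}
```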

  8. An evaluation of protein assays for quantitative determination of drugs.

    PubMed

    Williams, Katherine M; Arthur, Sarah J; Burrell, Gillian; Kelly, Fionnuala; Phillips, Darren W; Marshall, Thomas

    2003-07-31

    We have evaluated the response of six protein assays [the biuret, Lowry, bicinchoninic acid (BCA), Coomassie Brilliant Blue (CBB), Pyrogallol Red-Molybdate (PRM), and benzethonium chloride (BEC)] to 21 pharmaceutical drugs. The drugs evaluated were analgesics (acetaminophen, aspirin, codeine, methadone, morphine and pethidine), antibiotics (amoxicillin, ampicillin, gentamicin, neomycin, penicillin G and vancomycin), antipsychotics (chlorpromazine, fluphenazine, prochlorperazine, promazine and thioridazine) and water-soluble vitamins (ascorbic acid, niacinamide, pantothenic acid and pyridoxine). The biuret, Lowry and BCA assays responded strongly to most of the drugs tested. The PRM assay gave a sensitive response to the aminoglycoside antibiotics (gentamicin and neomycin) and the antipsychotic drugs. In contrast, the CBB assay showed little response to the aminoglycosides and gave a relatively poor response with the antipsychotics. The BEC assay did not respond significantly to the drugs tested. The response of the protein assays to the drugs was further evaluated by investigating the linearity of the response and the combined response of drug plus protein. The results are discussed with reference to drug interference in protein assays and the development of new methods for the quantification of drugs in protein-free solution.

  9. Evaluate reformer performance at a glance

    SciTech Connect

    Nag, A.

    1996-02-01

Catalytic reforming is becoming increasingly important in replacing octane lost as the removal of lead from worldwide gasoline pools continues. A method has been developed that can quickly evaluate the performance of any catalytic reformer. The catalytic naphtha reforming process primarily involves three well-known reactions: aromatization of naphthenes, cyclization of paraffins, and hydrocracking of paraffins. Hydrogen is produced in the process of aromatization and dehydrocyclization of paraffins. Reformer performance is normally evaluated with a reformate analysis (PONA) and the yield of C5+ reformate. This method of quick evaluation of reformer performance is based on the main assumption that the increase in hydrocarbon moles in the process equals the number of C-C bond ruptures, with one mole of hydrogen absorbed to saturate each rupture. This new method calculates aromatization efficiency, paraffin conversion, aromatic selectivity, and finally the paraffin, naphthene and aromatic content of the C5+ reformate.

  10. Quantitative image analysis for evaluating the abrasion resistance of nanoporous silica films on glass

    NASA Astrophysics Data System (ADS)

    Nielsen, Karsten H.; Karlsson, Stefan; Limbach, Rene; Wondraczek, Lothar

    2015-12-01

The abrasion resistance of coated glass surfaces is an important parameter for judging lifetime performance, but practical testing procedures remain overly simplistic and often do not allow direct conclusions about real-world degradation. Here, we combine quantitative two-dimensional image analysis and mechanical abrasion into a facile tool for probing the abrasion resistance of anti-reflective (AR) coatings. We determine variations in the average coated area during and after controlled abrasion. Through comparison with other experimental techniques, we show that this method provides a practical, rapid and versatile tool for the evaluation of the abrasion resistance of sol-gel-derived thin films on glass. The method yields informative data, which correlates with measurements of diffuse reflectance and is further supported by qualitative investigations through scanning electron microscopy. In particular, the method directly addresses degradation of coating performance, i.e., the gradual areal loss of antireflective functionality. As an exemplary subject, we studied the abrasion resistance of state-of-the-art nanoporous SiO2 thin films which were derived from 5-6 wt% aqueous solutions of potassium silicates, or from colloidal suspensions of SiO2 nanoparticles. It is shown how abrasion resistance is governed by coating density and film adhesion, defining the trade-off between optimal AR performance and acceptable mechanical performance.
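The core measurement, the fraction of area still coated before and after abrasion, reduces to thresholding and averaging. The synthetic image and the darker-means-coated convention below are assumptions; the paper's segmentation is more involved:

```python
import numpy as np

def coated_fraction(image, threshold):
    """Fraction of pixels still covered by the coating, from a grayscale
    image where coated regions are darker than exposed glass (an
    illustrative convention, not the paper's actual segmentation)."""
    return float(np.mean(image < threshold))

rng = np.random.default_rng(0)
intact = rng.uniform(0.0, 0.4, size=(64, 64))            # fully coated film
abraded = intact.copy()
abraded[:, 32:] = rng.uniform(0.6, 1.0, size=(64, 32))   # half worn away

frac_before = coated_fraction(intact, 0.5)
frac_after = coated_fraction(abraded, 0.5)
```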

  11. Quantitative image analysis for evaluating the abrasion resistance of nanoporous silica films on glass

    PubMed Central

    Nielsen, Karsten H.; Karlsson, Stefan; Limbach, Rene; Wondraczek, Lothar

    2015-01-01

The abrasion resistance of coated glass surfaces is an important parameter for judging lifetime performance, but practical testing procedures remain overly simplistic and often do not allow direct conclusions about real-world degradation. Here, we combine quantitative two-dimensional image analysis and mechanical abrasion into a facile tool for probing the abrasion resistance of anti-reflective (AR) coatings. We determine variations in the average coated area during and after controlled abrasion. Through comparison with other experimental techniques, we show that this method provides a practical, rapid and versatile tool for the evaluation of the abrasion resistance of sol-gel-derived thin films on glass. The method yields informative data, which correlates with measurements of diffuse reflectance and is further supported by qualitative investigations through scanning electron microscopy. In particular, the method directly addresses degradation of coating performance, i.e., the gradual areal loss of antireflective functionality. As an exemplary subject, we studied the abrasion resistance of state-of-the-art nanoporous SiO2 thin films which were derived from 5–6 wt% aqueous solutions of potassium silicates, or from colloidal suspensions of SiO2 nanoparticles. It is shown how abrasion resistance is governed by coating density and film adhesion, defining the trade-off between optimal AR performance and acceptable mechanical performance. PMID:26656260

  12. Smith Newton Vehicle Performance Evaluation (Brochure)

    SciTech Connect

    Not Available

    2012-08-01

    The Fleet Test and Evaluation Team at the U.S. Department of Energy's National Renewable Energy Laboratory is evaluating and documenting the performance of electric and plug-in hybrid electric drive systems in medium-duty trucks across the nation. Through this project, Smith Electric Vehicles will build and deploy 500 all-electric medium-duty trucks. The trucks will be deployed in diverse climates across the country.

  13. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model

    PubMed Central

    Yoshioka, S.; Matsuhana, B.; Tanaka, S.; Inouye, Y.; Oshima, N.; Kinoshita, S.

    2011-01-01

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model. PMID:20554565
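The multilayer interference underlying the Venetian blind model can be sketched with the ideal two-component relation for the first-order reflection peak; tilting the platelets effectively reduces the cytoplasm spacing and blue-shifts the peak. The thicknesses below are illustrative, though the refractive indices are typical values for guanine platelets and cytoplasm:

```python
def peak_wavelength(n_p, d_p, n_c, d_c):
    """First-order reflection peak (nm) of an ideal two-component multilayer
    (guanine platelet + cytoplasm gap) at normal incidence:
    lambda = 2 * (n_p * d_p + n_c * d_c)."""
    return 2 * (n_p * d_p + n_c * d_c)

# Tilting the platelets (Venetian blind model) effectively shrinks the
# cytoplasm gap d_c, shifting the reflection peak toward shorter wavelengths.
lam_relaxed = peak_wavelength(1.83, 75, 1.36, 110)  # thicknesses in nm
lam_tilted = peak_wavelength(1.83, 75, 1.36, 80)
```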

  14. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model.

    PubMed

    Yoshioka, S; Matsuhana, B; Tanaka, S; Inouye, Y; Oshima, N; Kinoshita, S

    2011-01-06

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model.

  15. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.
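The drop in amplification success after a storage time point can be illustrated with a simple two-by-two test on hypothetical counts (the study's actual design compared repeated extractions across storage methods and markers):

```python
from scipy.stats import fisher_exact

# Hypothetical amplification outcomes (out of 50 samples) at two storage
# time points; not the study's actual data.
table = [[46, 4],    # 6 months:  46 successes, 4 failures
         [28, 22]]   # 12 months: 28 successes, 22 failures
odds_ratio, p_value = fisher_exact(table)
```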

  16. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. A quantitative evaluation of the efficiency and performance of the CTSA program has significant referential meaning for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered both clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for Clinical and Translational Science and attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results.

  17. A new method to evaluate human-robot system performance

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Weisbin, C. R.

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  18. A new method to evaluate human-robot system performance.

    PubMed

    Rodriguez, G; Weisbin, C R

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  19. Quantitative Evaluation of the Reticuloendothelial System Function with Dynamic MRI

    PubMed Central

    Liu, Ting; Choi, Hoon; Zhou, Rong; Chen, I-Wei

    2014-01-01

Purpose To evaluate reticuloendothelial system (RES) function by real-time imaging of the blood clearance and hepatic uptake of superparamagnetic iron oxide nanoparticles (SPIO) using dynamic magnetic resonance imaging (MRI) with two-compartment pharmacokinetic modeling. Materials and Methods The kinetics of blood clearance and hepatic accumulation were recorded in young adult male 01b74 athymic nude mice by dynamic T2*-weighted MRI after injection of different doses of SPIO nanoparticles (0.5, 3 or 10 mg Fe/kg). The association parameter, Kin, dissociation parameter, Kout, and elimination constant, Ke, derived from the dynamic data with a two-compartment model, were used to describe active binding to Kupffer cells and extrahepatic clearance. Clodrosome and liposome treatments were used to deplete macrophages and block RES function, in order to evaluate the ability of the kinetic parameters to characterize macrophage function and density. Results The two-compartment model provided a good description of all data and showed a low sum of squared residuals for all mice (0.27±0.03). A lower Kin, a lower Kout and a lower Ke were found after clodrosome treatment, whereas a lower Kin, a higher Kout and a lower Ke were observed after liposome treatment in comparison to saline treatment (P<0.005). Conclusion Dynamic SPIO-enhanced MR imaging with two-compartment modeling can provide information on RES function at both the cell-number and receptor-function level. PMID:25090653
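A minimal sketch of the two-compartment kinetics, blood exchanging with liver (Kupffer-cell binding) via Kin/Kout plus extrahepatic elimination Ke, assuming illustrative rate constants rather than the paper's fitted values:

```python
from scipy.integrate import solve_ivp

# Rate constants (per minute) are illustrative assumptions.
K_in, K_out, K_e = 0.8, 0.1, 0.05

def rhs(t, y):
    """Blood <-> liver exchange with elimination from the blood pool."""
    c_blood, c_liver = y
    return [-(K_in + K_e) * c_blood + K_out * c_liver,
            K_in * c_blood - K_out * c_liver]

# Normalized tracer concentration starts entirely in the blood.
sol = solve_ivp(rhs, (0.0, 60.0), [1.0, 0.0])
c_blood_end, c_liver_end = sol.y[:, -1]
```

With these rates the tracer transfers rapidly into the liver compartment while the total slowly decays through elimination, qualitatively matching the fast blood clearance and hepatic accumulation described above.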

  20. Computerized quantitative evaluation of mammographic accreditation phantom images

    SciTech Connect

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

Purpose: The objective was to develop and investigate an automated scoring scheme for American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of region of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fiber, mass, and speck were 90%, 80%, and 98%, respectively. Contingency table analysis revealed a significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may achieve a stable assessment of whether a mammographic accreditation phantom image meets the ACR's criteria for test-object visibility, although there is room for improvement in the approach for fiber and mass objects.
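Classification by Mahalanobis distance, as used above for the fiber and mass objects, can be sketched in a two-dimensional feature space. The features (e.g. elongation, area) and training values below are synthetic:

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance from x to a class with the given mean/covariance."""
    d = x - mean
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

# Synthetic training features for two object classes: (elongation, diameter).
fibers = np.array([[8.0, 1.2], [7.5, 1.0], [9.0, 1.4], [8.5, 1.1]])
masses = np.array([[1.5, 4.0], [1.2, 5.2], [1.8, 4.5], [1.4, 4.8]])

def classify(x):
    """Assign the candidate object to the nearer class."""
    d_f = mahalanobis(x, fibers.mean(0), np.cov(fibers.T))
    d_m = mahalanobis(x, masses.mean(0), np.cov(masses.T))
    return "fiber" if d_f < d_m else "mass"

label = classify(np.array([8.2, 1.3]))
```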

  1. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
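The object representation of a fault tree can be sketched in modern Python rather than the Explorer's Flavors extension: each event object stores its own data and evaluates its probability through the tree. The gate logic below assumes independent basic events, and the event names and probabilities are hypothetical:

```python
class BasicEvent:
    """Leaf event carrying its own reliability data."""
    def __init__(self, name, prob):
        self.name, self.prob = name, prob
    def probability(self):
        return self.prob

class OrGate:
    """Fails if any child fails: P = 1 - prod(1 - P_i), independence assumed."""
    def __init__(self, name, children):
        self.name, self.children = name, children
    def probability(self):
        p = 1.0
        for c in self.children:
            p *= 1.0 - c.probability()
        return 1.0 - p

class AndGate:
    """Fails only if all children fail: P = prod(P_i), independence assumed."""
    def __init__(self, name, children):
        self.name, self.children = name, children
    def probability(self):
        p = 1.0
        for c in self.children:
            p *= c.probability()
        return p

pump = BasicEvent("pump-fails", 0.01)
valve = BasicEvent("valve-fails", 0.02)
backup = BasicEvent("backup-fails", 0.05)
top = AndGate("loss-of-flow", [OrGate("primary", [pump, valve]), backup])
p_top = top.probability()
```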

  2. Smith Newton Vehicle Performance Evaluation - Cumulative (Brochure)

    SciTech Connect

    Not Available

    2014-08-01

    The Fleet Test and Evaluation Team at the U.S. Department of Energy's National Renewable Energy Laboratory is evaluating and documenting the performance of electric and plug-in hybrid electric drive systems in medium-duty trucks across the nation. U.S. companies participating in this evaluation project received funding from the American Recovery and Reinvestment Act to cover part of the cost of purchasing these vehicles. Through this project, Smith Electric Vehicles is building and deploying 500 all-electric medium-duty trucks that will be deployed by a variety of companies in diverse climates across the country.

  3. Performance evaluations of demountable electrical connections

    NASA Astrophysics Data System (ADS)

    Niemann, R. C.; Cha, Y. S.; Hull, J. R.; Buckles, W. E.; Daugherty, M. A.

    Electrical conductors operating in cryogenic environments can require demountable connections along their lengths. The connections must have low resistance and high reliability and should allow ready assembly and disassembly. In this work, the performance of two types of connections has been evaluated. The first connection type is a clamped surface-to-surface joint. The second connection type is a screwed joint that incorporates male and female machine-thread components. The connections for copper conductors have been evaluated experimentally at 77 K. Experimental variables included thread surface treatment and assembly methods. The results of the evaluations are presented.

  4. Prospective safety performance evaluation on construction sites.

    PubMed

    Wu, Xianguo; Liu, Qian; Zhang, Limao; Skibniewski, Miroslaw J; Wang, Yanhong

    2015-05-01

This paper presents a systematic Structural Equation Modeling (SEM) based approach for Prospective Safety Performance Evaluation (PSPE) on construction sites, with causal relationships and interactions between enablers and the goals of PSPE taken into account. Based on a sample of 450 valid questionnaire surveys from 30 Chinese construction enterprises, a SEM model with 26 items for PSPE in the context of the Chinese construction industry is established and then verified through a goodness-of-fit test. Three typical types of construction enterprises, namely the state-owned enterprise, the private enterprise and the Sino-foreign joint venture, are selected as samples to measure the level of safety performance, given that their enterprise scale, ownership and business strategies differ. Results provide a full understanding of safety performance practice in the construction industry, and indicate that the overall level of safety performance on working sites is rated at least level III (Fair) or above. This can be explained by the fact that the construction industry has gradually matured under established norms, and construction enterprises must improve their safety performance so as not to be eliminated from the government-led construction industry. The differences in safety performance practice among the construction enterprise categories are compared and analyzed according to the evaluation results. This research provides insights into cause-effect relationships among safety performance factors and goals, which, in turn, can facilitate the achievement of high safety performance in the construction industry.

  5. Quantitative evaluation of stone fragments in extracorporeal shock wave lithotripsy using a time reversal operator

    NASA Astrophysics Data System (ADS)

    Wang, Jen-Chieh; Zhou, Yufeng

    2017-03-01

Extracorporeal shock wave lithotripsy (ESWL) has been used widely in the noninvasive treatment of kidney calculi. Fine fragments less than 2 mm in size can be discharged by urination, which determines the success of ESWL. Although ultrasonic and fluoroscopic imaging are used to localize the calculi, it is challenging to monitor the progress of stone comminution, especially at the late stage of ESWL when fragments spread out as a cloud. The lack of real-time, quantitative evaluation makes the procedure semi-blind, resulting in either under- or over-treatment within the number of pulses permitted by the FDA. The time reversal operator (TRO) method can detect point-like scatterers, and the number of non-zero eigenvalues of the TRO equals the number of scatterers. In this study, the ability of the TRO method to identify stones was validated with both numerical and experimental results for one or two stones of various sizes and locations. Furthermore, the parameters affecting the performance of the TRO method were also investigated. Overall, the TRO method is effective in identifying the fragments in a stone cluster in real time. Further development of a detection system and evaluation of its performance both in vitro and in vivo during ESWL are necessary for clinical application.
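The eigenvalue property the method relies on, that the number of significant eigenvalues of the time-reversal operator equals the number of well-resolved point scatterers, can be checked numerically. The array geometry, wavenumber, and Born-approximation response below are assumptions for illustration:

```python
import numpy as np

# 16-element linear array (1 mm pitch) and two point scatterers ("stones");
# wavenumber for ~1 MHz ultrasound in water (wavelength ~1.5 mm).
k = 2 * np.pi / 1.5e-3
elements = np.array([[i * 1e-3, 0.0] for i in range(16)])
scatterers = np.array([[4e-3, 30e-3], [11e-3, 35e-3]])

def green(a, b):
    """Free-space propagator between two points; only relative phases and
    amplitudes matter for the rank argument."""
    r = np.linalg.norm(a - b)
    return np.exp(1j * k * r) / r

# Inter-element response matrix K under the Born approximation with
# identical reflectivity for each scatterer.
K = np.zeros((16, 16), dtype=complex)
for s in scatterers:
    g = np.array([green(e, s) for e in elements])
    K += np.outer(g, g)

# Time-reversal operator T = K^H K; count the significant eigenvalues.
eigvals = np.linalg.eigvalsh(K.conj().T @ K)[::-1]
n_significant = int(np.sum(eigvals > 1e-3 * eigvals[0]))
```

With two well-resolved scatterers, K has rank two, so exactly two eigenvalues stand above the numerical noise floor.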

  6. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best-performing pair of quantitative features was chosen based on its ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved average sensitivity and specificity of 87% and 61%, respectively. The best-performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.
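Sensitivity and specificity against the histopathology gold standard follow directly from a confusion matrix; the counts below are hypothetical but chosen to land near the reported 87%/85% operating point:

```python
# Confusion matrix of classifier calls vs. histopathology (hypothetical counts).
tp, fn = 26, 4   # neoplastic sites: correctly flagged / missed
tn, fp = 34, 6   # non-neoplastic sites: correctly cleared / falsely flagged

sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate
```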

  7. Improved field experimental designs and quantitative evaluation of aquatic ecosystems

    SciTech Connect

    McKenzie, D.H.; Thomas, J.M.

    1984-05-01

    The paired-station concept and a log-transformed analysis of variance were used to evaluate zooplankton density data collected over five years at an electrical generating station on Lake Michigan. To discuss the example and the field design necessary for a valid statistical analysis, considerable background is provided on (1) selecting sampling station pairs, (2) experimentwise error rates for multi-species analyses, (3) levels of Type I and II error rates, (4) procedures for conducting the field monitoring program, and (5) the consequences of violating statistical assumptions. Details for estimating the sample sizes necessary to detect changes of a specified magnitude are included. Both statistical and biological problems with monitoring programs (as now conducted) are addressed; serial correlation of successive observations in the time series was identified as one principal statistical difficulty. The procedure reduces this problem to a level where statistical methods can be used confidently. 27 references, 4 figures, 2 tables.
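
    The paired-station idea can be illustrated with a short sketch: densities at matched control and impact stations are log-transformed, and the per-date differences are tested. All densities below are hypothetical, and log(x + 1) is one common variance-stabilizing choice for count data, not necessarily the transform used in the paper:

```python
import math
from statistics import mean, stdev

def paired_station_t(control, impact):
    """Paired-station analysis: log(x + 1)-transform the densities
    (guards against zero counts), then compute a paired t statistic
    on the per-date differences between impact and control stations."""
    diffs = [math.log(i + 1) - math.log(c + 1) for c, i in zip(control, impact)]
    n = len(diffs)
    se = stdev(diffs) / math.sqrt(n)
    return mean(diffs) / se, n - 1   # t statistic, degrees of freedom

# Hypothetical zooplankton densities (organisms/m^3) on five sampling dates
control = [120, 340, 95, 410, 220]
impact  = [100, 300, 80, 350, 190]
t, df = paired_station_t(control, impact)   # strongly negative t here
```

    Pairing removes the shared date-to-date variability, which is why the serial-correlation problem mentioned in the abstract is reduced to the within-pair differences.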

  8. Quantitative Ultrasonic Evaluation of Mechanical Properties of Engineering Materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength of engineering materials is reviewed. A dormant concept in nondestructive evaluation (NDE) is invoked. The availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions is discussed. It was shown that ultrasonic methods yield measurements of elastic moduli, microstructure, hardness, fracture toughness, tensile strength, yield strength, and shear strength for a wide range of materials (including many types of metals, ceramics, and fiber composites). It was also indicated that although most of these methods were shown feasible in laboratory studies, more work is needed before they can be used on actual parts in processing, assembly, inspection, and maintenance lines.

  9. Quantitative Evaluation of Strain Near Tooth Fillet by Image Processing

    NASA Astrophysics Data System (ADS)

    Masuyama, Tomoya; Yoshiizumi, Satoshi; Inoue, Katsumi

    The accurate measurement of strain and stress in a tooth is important for the reliable evaluation of the strength or life of gears. In this research, a strain measurement method based on image processing is applied to the analysis of strain near the tooth fillet. The loaded tooth is photographed with a CCD camera and stored as a digital image. The displacement of points on the tooth flank is tracked by the cross-correlation method, and the strain is then calculated. The interrogation window size of the correlation method and the overlap amount affect the accuracy and resolution. For measurements at structures with complicated profiles such as fillets, the interrogation window should remain large and the overlap amount should also be large. The surface condition also affects the accuracy; a white-painted surface with fine black particles is suitable for measurement.
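
    The displacement-tracking step can be sketched in one dimension. Real digital image correlation uses 2-D interrogation windows and subpixel interpolation; the intensity profile below is hypothetical:

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-length intensity profiles."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def track(reference, deformed, start, size, max_shift):
    """Find the integer shift of an interrogation window that maximizes
    normalized cross-correlation between reference and deformed images."""
    window = reference[start:start + size]
    return max(range(-max_shift, max_shift + 1),
               key=lambda s: ncc(window, deformed[start + s:start + s + size]))

# Hypothetical 1-D intensity profile; the deformed image is shifted by 2 pixels
reference = [0, 1, 0, 2, 5, 9, 4, 1, 0, 3, 0, 0, 1, 0, 0]
deformed = [0, 0] + reference[:-2]
shift = track(reference, deformed, start=3, size=5, max_shift=3)  # recovers 2
```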

  10. A Quantitative Evaluation of Medication Histories and Reconciliation by Discipline

    PubMed Central

    Stewart, Michael R.; Fogg, Sarah M.; Schminke, Brandon C.; Zackula, Rosalee E.; Nester, Tina M.; Eidem, Leslie A.; Rosendale, James C.; Ragan, Robert H.; Bond, Jack A.; Goertzen, Kreg W.

    2014-01-01

    Abstract Background/Objective: Medication reconciliation at transitions of care decreases medication errors, hospitalizations, and adverse drug events. We compared inpatient medication histories and reconciliation across disciplines and evaluated the nature of discrepancies. Methods: We conducted a prospective cohort study of patients admitted from the emergency department at our 760-bed hospital. Eligible patients had their medication histories taken and reconciled in order by the admitting nurse (RN), certified pharmacy technician (CPhT), and pharmacist (RPh). Discharge medication reconciliation was not altered. Admission and discharge discrepancies were categorized by discipline, error type, and drug class and were assigned a criticality index score. A discrepancy rating system systematically measured discrepancies. Results: Of 175 consented patients, 153 were evaluated. Total admission and discharge discrepancies were 1,461 and 369, respectively. The average number of medications per participant at admission was 8.59 (1,314) with 9.41 (1,374) at discharge. Most discrepancies were committed by RNs: 53.2% (777) at admission and 56.1% (207) at discharge. Most of these were omissions or incorrect medications. RNs had significantly higher admission discrepancy rates per medication (0.59) compared with CPhTs (0.36) and RPhs (0.16) (P < .001). RPhs corrected significantly more discrepancies per participant than RNs (6.39 vs 0.48; P < .001); average criticality index reduction was 79.0%. Estimated prevented adverse drug events (pADEs) cost savings were $589,744. Conclusions: RPhs committed the fewest discrepancies compared with RNs and CPhTs, resulting in more accurate medication histories and reconciliation. RPh involvement also prevented the greatest number of medication errors, contributing to considerable pADE-related cost savings. PMID:25477614

  11. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature identification during diagnosis. A quantitative and simple distortion evaluation method is therefore needed by both the endoscope industry and medical device regulatory agencies; however, no such method is yet available. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models, which makes them difficult to understand. Commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion and, based on it, ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning across the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, it can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
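
    One way to see how radial distortion is quantified from a grid target is a toy computation. The grid radii and barrel-distortion coefficient below are hypothetical, and the formula is a generic radial-distortion measure, not the paper's actual definition of ML:

```python
def radial_distortion_pct(obj_r, img_r):
    """Percent radial distortion of each grid ring relative to the
    magnification of the innermost ring (a generic D_RAD-style measure)."""
    m0 = img_r[0] / obj_r[0]          # near-axis magnification
    return [100.0 * (ir / (m0 * orr) - 1.0) for orr, ir in zip(obj_r, img_r)]

# Hypothetical barrel distortion: image radius = 10*r*(1 - 0.01*r^2)
obj_r = [1.0, 2.0, 3.0, 4.0]                          # object-side ring radii
img_r = [10.0 * r * (1 - 0.01 * r * r) for r in obj_r]  # measured image radii
d = radial_distortion_pct(obj_r, img_r)  # increasingly negative toward the edge
```

    A local-magnification style analysis would instead differentiate the image radius with respect to the object radius at each field position, giving distortion information everywhere in the field of view rather than only at the image corners.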

  12. Quantitative evaluation of MPTP-treated nonhuman parkinsonian primates in the HALLWAY task.

    PubMed

    Campos-Romo, Aurelio; Ojeda-Flores, Rafael; Moreno-Briseño, Pablo; Fernandez-Ruiz, Juan

    2009-03-15

    Parkinson's disease (PD) is a progressive neurodegenerative disorder. An experimental model of this disease is produced in nonhuman primates by the administration of the neurotoxin 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP). In this work, we put forward a new quantitative evaluation method that uses video recordings to measure the displacement, gait, and gross and fine motor performance of freely moving subjects. Four Vervet monkeys (Cercopithecus aethiops) were trained in a behavioral observation hallway while being recorded with digital video cameras from four different angles. After MPTP intoxication the animals were tested without any drug and after 30 and 90 min of Levodopa/Carbidopa administration. Using a personal computer the following behaviors were measured and evaluated from the video recordings: displacement time across the hallway, reaching time towards rewards, ingestion time, number of attempts to obtain rewards, number of rewards obtained, and level of the highest shelf reached for rewards. Our results show that there was an overall behavioral deterioration after MPTP administration and an overall improvement after Levodopa/Carbidopa treatment. This demonstrates that the HALLWAY task is a sensitive and objective method that allows detailed behavioral evaluation of freely moving monkeys in the MPTP Parkinson's disease model.

  13. Noninvasive Quantitative Evaluation of the Dentin Layer during Dental Procedures Using Optical Coherence Tomography

    PubMed Central

    Sinescu, Cosmin; Negrutiu, Meda Lavinia; Bradu, Adrian; Duma, Virgil-Florin; Podoleanu, Adrian Gh.

    2015-01-01

    A routine cavity preparation of a tooth may lead to opening the pulp chamber. The present study evaluates quantitatively, in real time, for the first time to the best of our knowledge, the drilled cavities during dental procedures. An established noninvasive imaging technique, Optical Coherence Tomography (OCT), is used. The main scope is to prevent accidental openings of the dental pulp chamber. Six teeth with dental cavities have been used in this ex vivo study. The real time assessment of the distances between the bottom of the drilled cavities and the top of the pulp chamber was performed using an in-house assembled OCT system. The evaluation of the remaining dentin thickness (RDT) allowed for the positioning of the drilling tools in the cavities in relation to the pulp horns. Estimations of the safe and of the critical RDT were made; for the latter, the opening of the pulp chamber becomes unavoidable. Also, by following the fractures that can occur when the extent of the decay is too large, the dentist can decide upon the right therapy to follow, endodontic or conventional filling. The study demonstrates the usefulness of OCT imaging in guiding such evaluations during dental procedures. PMID:26078779

  14. Noninvasive Quantitative Evaluation of the Dentin Layer during Dental Procedures Using Optical Coherence Tomography.

    PubMed

    Sinescu, Cosmin; Negrutiu, Meda Lavinia; Bradu, Adrian; Duma, Virgil-Florin; Podoleanu, Adrian Gh

    2015-01-01

    A routine cavity preparation of a tooth may lead to opening the pulp chamber. The present study evaluates quantitatively, in real time, for the first time to the best of our knowledge, the drilled cavities during dental procedures. An established noninvasive imaging technique, Optical Coherence Tomography (OCT), is used. The main scope is to prevent accidental openings of the dental pulp chamber. Six teeth with dental cavities have been used in this ex vivo study. The real time assessment of the distances between the bottom of the drilled cavities and the top of the pulp chamber was performed using an in-house assembled OCT system. The evaluation of the remaining dentin thickness (RDT) allowed for the positioning of the drilling tools in the cavities in relation to the pulp horns. Estimations of the safe and of the critical RDT were made; for the latter, the opening of the pulp chamber becomes unavoidable. Also, by following the fractures that can occur when the extent of the decay is too large, the dentist can decide upon the right therapy to follow, endodontic or conventional filling. The study demonstrates the usefulness of OCT imaging in guiding such evaluations during dental procedures.

  15. Quantitative research on critical thinking and predicting nursing students' NCLEX-RN performance.

    PubMed

    Romeo, Elizabeth M

    2010-07-01

    The concept of critical thinking has been influential in several disciplines. Both education and nursing in general have been attempting to define, teach, and measure this concept for decades. Nurse educators realize that critical thinking is the cornerstone of the objectives and goals for nursing students. The purpose of this article is to review and analyze quantitative research findings relevant to the measurement of critical thinking abilities and skills in undergraduate nursing students and the usefulness of critical thinking as a predictor of National Council Licensure Examination-Registered Nurse (NCLEX-RN) performance. The specific issues that this integrative review examined include assessment and analysis of the theoretical and operational definitions of critical thinking, theoretical frameworks used to guide the studies, instruments used to evaluate critical thinking skills and abilities, and the role of critical thinking as a predictor of NCLEX-RN outcomes. A list of key assumptions related to critical thinking was formulated. The limitations and gaps in the literature were identified, as well as the types of future research needed in this arena.

  16. Evaluating Performance Portability of OpenACC

    SciTech Connect

    Sabne, Amit J; Sakdhnagool, Putt; Lee, Seyong; Vetter, Jeffrey S

    2015-01-01

    Accelerator-based heterogeneous computing is gaining momentum in the High Performance Computing arena. However, the increased complexity of the accelerator architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle the problem. While the abstraction endowed by OpenACC offers productivity, it raises questions about its portability. This paper evaluates the performance portability obtained by OpenACC on twelve OpenACC programs on NVIDIA CUDA, AMD GCN, and Intel MIC architectures. We study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.
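
    A later-proposed summary statistic for exactly this kind of study is the harmonic-mean performance-portability metric of Pennycook et al. (2016). It is not used in this paper, but it compactly aggregates the per-architecture efficiencies such an evaluation measures. The efficiency numbers below are hypothetical:

```python
def performance_portability(efficiencies):
    """Harmonic mean of per-platform efficiencies (fraction of the best
    achievable performance on each platform); defined as 0 if the
    application fails to run on any platform in the set."""
    if any(e == 0.0 for e in efficiencies.values()):
        return 0.0
    return len(efficiencies) / sum(1.0 / e for e in efficiencies.values())

# Hypothetical achieved efficiencies on three target architectures
pp = performance_portability({"cuda": 0.8, "gcn": 0.5, "mic": 0.4})  # ~0.52
```

    The harmonic mean penalizes a program that runs well on one platform but poorly on another, which matches the intuitive notion of portability better than an arithmetic mean would.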

  17. Evaluation of the National Science Foundation's Local Course Improvement Program, Volume II: Quantitative Analyses.

    ERIC Educational Resources Information Center

    Kulik, James A.; And Others

    This report is the second of three volumes describing the results of the evaluation of the National Science Foundation (NSF) Local Course Improvement (LOCI) program. This volume describes the quantitative results of the program. Evaluation of the LOCI program involved answering questions in the areas of the need for science course improvement as…

  18. A statistical model for assessing performance standards for quantitative and semiquantitative disinfectant test methods.

    PubMed

    Parker, Albert E; Hamilton, Martin A; Tomasino, Stephen F

    2014-01-01

    A performance standard for a disinfectant test method can be evaluated by quantifying the (Type I) pass-error rate for ineffective products and the (Type II) fail-error rate for highly effective products. This paper shows how to calculate these error rates for test methods where the log reduction in a microbial population is used as a measure of antimicrobial efficacy. The calculations can be used to assess performance standards that may require multiple tests of multiple microbes at multiple laboratories. Notably, the error rates account for among-laboratory variance of the log reductions estimated from a multilaboratory data set and the correlation among tests of different microbes conducted in the same laboratory. Performance standards that require that a disinfectant product pass all tests or multiple tests on average, are considered. The proposed statistical methodology is flexible and allows for a different acceptable outcome for each microbe tested, since, for example, variability may be different for different microbes. The approach can also be applied to semiquantitative methods for which product efficacy is reported as the number of positive carriers out of a treated set and the density of the microbes on control carriers is quantified, thereby allowing a log reduction to be calculated. Therefore, using the approach described in this paper, the error rates can also be calculated for semiquantitative method performance standards specified solely in terms of the maximum allowable number of positive carriers per test. The calculations are demonstrated in a case study of the current performance standard for the semiquantitative AOAC Use-Dilution Methods for Pseudomonas aeruginosa (964.02) and Staphylococcus aureus (955.15), which allow up to one positive carrier out of a set of 60 inoculated and treated carriers in each test. A simulation study was also conducted to verify the validity of the model's assumptions and accuracy. 
Our approach, easily implemented
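
    The error-rate calculation can be sketched by Monte Carlo simulation. The variance components, the 6-log threshold, and the "pass every test" rule below are hypothetical stand-ins for the paper's analytical model, but they show how among-laboratory variance induces correlation among tests run in the same laboratory:

```python
import random

def pass_rate(true_mean_lr, sd_lab, sd_test, n_tests, standard, trials=20000):
    """Monte Carlo probability that a product passes a 'pass all tests'
    standard. Each log reduction = true mean + shared laboratory effect
    + independent within-lab test error."""
    rng = random.Random(42)
    passes = 0
    for _ in range(trials):
        lab = rng.gauss(0.0, sd_lab)            # among-laboratory effect
        passes += all(true_mean_lr + lab + rng.gauss(0.0, sd_test) >= standard
                      for _ in range(n_tests))
    return passes / trials

# Highly effective product (true LR = 7 vs a 6-log standard):
# the Type II fail-error rate is 1 minus this pass rate.
p = pass_rate(true_mean_lr=7.0, sd_lab=0.5, sd_test=0.4, n_tests=3, standard=6.0)
```

    Running the same function with an ineffective product's true log reduction gives the Type I pass-error rate instead.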

  19. A Management System for Computer Performance Evaluation.

    DTIC Science & Technology

    1981-12-01

    [Fragmented table-of-contents text] Software; Interaction; Design of a CPE Management System; SEAFAC workload, computer hardware, and computer software. The legible portion of the abstract describes the CPE function as a team that can either use or learn to use the tools and techniques of computer performance evaluation.

  20. GENERAL METHODS FOR REMEDIAL PERFORMANCE EVALUATIONS

    EPA Science Inventory

    This document was developed by an EPA-funded project to explain technical considerations and principles necessary to evaluate the performance of ground-water contamination remediations at hazardous waste sites. This is neither a "cookbook", nor an encyclopedia of recommended fi...

  1. Measures of Searcher Performance: A Psychometric Evaluation.

    ERIC Educational Resources Information Center

    Wildemuth, Barbara M.; And Others

    1993-01-01

    Describes a study of medical students that was conducted to evaluate measures of performance on factual searches of INQUIRER, a full-text database in microbiology. Measures relating to recall, precision, search term overlap, and efficiency are discussed; reliability and construct validity are considered; and implications for future research are…

  2. Performance evaluation of lightweight piezocomposite curved actuator

    NASA Astrophysics Data System (ADS)

    Goo, Nam Seo; Kim, Cheol; Park, Hoon C.; Yoon, Kwang J.

    2001-07-01

    A numerical method for the performance evaluation of LIPCA actuators is proposed using a finite element method. Fully-coupled formulations for piezoelectric materials are introduced and eight-node incompatible elements are used. After verifying the developed code, the behavior of LIPCA actuators is investigated.

  3. Optical Storage Performance Modeling and Evaluation.

    ERIC Educational Resources Information Center

    Behera, Bailochan; Singh, Harpreet

    1990-01-01

    Evaluates different types of storage media for long-term archival storage of large amounts of data. Existing storage media are reviewed, including optical disks, optical tape, magnetic storage, and microfilm; three models are proposed based on document storage requirements; performance analysis is considered; and cost effectiveness is discussed.…

  4. Performance evaluation of an air solar collector

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Indoor tests on a single-glazed flat-plate collector are described in this report. The Marshall Space Flight Center solar simulator was used to make the tests, which included evaluation of thermal performance under various combinations of flow rate, incident flux, inlet temperature, and wind speed. Results are presented in graph/table form.

  5. Performance Evaluation of a Semantic Perception Classifier

    DTIC Science & Technology

    2013-09-01

    Performance Evaluation of a Semantic Perception Classifier, by Craig Lennon, Barry Bodt, Marshal Childers, Rick Camden, Arne Suppe, Luis Navarro-Serment, and Nicoleta Florea (Rick Camden and Nicoleta Florea: Engility Corporation; Luis Navarro-Serment and Arne Suppe: Carnegie Mellon University).

  6. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.604... require a performance evaluation report on the work done by the architect-engineer after the completion of or during the construction of the designed project....

  7. EVALUATION OF CONFOCAL MICROSCOPY SYSTEM PERFORMANCE

    EPA Science Inventory

    BACKGROUND. The confocal laser scanning microscope (CLSM) has enormous potential in many biological fields. Currently there is a subjective nature in the assessment of a confocal microscope's performance by primarily evaluating the system with a specific test slide provided by ea...

  8. Performance evaluation of two personal bioaerosol samplers.

    PubMed

    Tolchinsky, Alexander D; Sigaev, Vladimir I; Varfolomeev, Alexander N; Uspenskaya, Svetlana N; Cheng, Yung S; Su, Wei-Chung

    2011-01-01

    In this study, the performance of two newly developed personal bioaerosol samplers for monitoring the level of environmental and occupational airborne microorganisms was evaluated. These new personal bioaerosol samplers were designed based on a swirling cyclone with a recirculating liquid film. The performance evaluation included collection efficiency tests using inert aerosols, a bioaerosol survival test using viable airborne microorganisms, and an evaluation of non-aqueous collection liquid for long-period sampling. The test results showed that these two newly developed personal bioaerosol samplers are capable of high-efficiency aerosol sampling (cutoff diameters are around 0.7 μm for both samplers) and provide acceptable survival for the collected bioaerosols. By using an appropriate non-aqueous collection liquid, these two personal bioaerosol samplers should be able to permit continuous, long-period bioaerosol sampling with considerable viability for the captured bioaerosols.
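
    The reported cutoff (d50) diameter is the particle size collected with 50% efficiency, typically read off a measured efficiency curve. A linear-interpolation sketch with hypothetical calibration points:

```python
def cutoff_d50(diameters, efficiencies):
    """Interpolate the diameter at 50% collection efficiency.
    Inputs are paired points sorted by diameter, with efficiency (%)
    rising through 50 somewhere in the range."""
    pts = list(zip(diameters, efficiencies))
    for (d1, e1), (d2, e2) in zip(pts, pts[1:]):
        if e1 < 50.0 <= e2:
            return d1 + (50.0 - e1) * (d2 - d1) / (e2 - e1)
    raise ValueError("efficiency curve never crosses 50%")

# Hypothetical calibration data for one sampler (diameter in um, efficiency in %)
d50 = cutoff_d50([0.3, 0.5, 0.7, 1.0, 2.0], [5.0, 20.0, 50.0, 90.0, 99.0])
```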

  9. Performance Evaluation of Dense Gas Dispersion Models.

    NASA Astrophysics Data System (ADS)

    Touma, Jawad S.; Cox, William M.; Thistle, Harold; Zapert, James G.

    1995-03-01

    This paper summarizes the results of a study to evaluate the performance of seven dense gas dispersion models using data from three field experiments. Two models (DEGADIS and SLAB) are in the public domain and the other five (AIRTOX, CHARM, FOCUS, SAFEMODE, and TRACE) are proprietary. The field data used are the Desert Tortoise pressurized ammonia releases, Burro liquefied natural gas spill tests, and the Goldfish anhydrous hydrofluoric acid spill experiments. Desert Tortoise and Goldfish releases were simulated as horizontal jet releases, and Burro as a liquid pool. Performance statistics were used to compare maximum observed concentrations and plume half-width to those predicted by each model. Model performance varied and no model exhibited consistently good performance across all three databases. However, when combined across the three databases, all models performed within a factor of 2. Problems encountered are discussed in order to help future investigators.
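
    Two of the standard performance statistics used in such model evaluations, the fraction of predictions within a factor of two (FAC2) and the fractional bias, can be sketched directly. The observed and predicted concentrations below are hypothetical:

```python
def fac2(observed, predicted):
    """Fraction of predictions within a factor of two of the observations."""
    pairs = list(zip(observed, predicted))
    return sum(0.5 <= p / o <= 2.0 for o, p in pairs) / len(pairs)

def fractional_bias(observed, predicted):
    """Fractional bias: 0 is unbiased; negative means over-prediction."""
    mo = sum(observed) / len(observed)
    mp = sum(predicted) / len(predicted)
    return 2.0 * (mo - mp) / (mo + mp)

# Hypothetical maximum concentrations (observed vs model-predicted)
obs  = [100.0, 50.0, 20.0, 10.0]
pred = [120.0, 90.0, 25.0, 4.0]
f  = fac2(obs, pred)             # 3 of 4 predictions within a factor of 2
fb = fractional_bias(obs, pred)  # slight over-prediction on average
```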

  10. DAPAR & ProStaR: software to perform statistical analyses in quantitative discovery proteomics.

    PubMed

    Wieczorek, Samuel; Combes, Florence; Lazar, Cosmin; Giai Gianetto, Quentin; Gatto, Laurent; Dorffer, Alexia; Hesse, Anne-Marie; Couté, Yohann; Ferro, Myriam; Bruley, Christophe; Burger, Thomas

    2017-01-01

    DAPAR and ProStaR are software tools to perform the statistical analysis of label-free XIC-based quantitative discovery proteomics experiments. DAPAR contains procedures to filter, normalize, impute missing values, aggregate peptide intensities, perform null hypothesis significance tests, and select the most likely differentially abundant proteins with a corresponding false discovery rate. ProStaR is a graphical user interface that allows friendly access to the DAPAR functionalities through a web browser.
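
    The final step, selecting differentially abundant proteins at a controlled false discovery rate, commonly uses the Benjamini-Hochberg step-up procedure. A generic sketch (not DAPAR's actual implementation) with hypothetical per-protein p-values:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: return the indices of the
    hypotheses declared significant at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank          # largest rank meeting the step-up condition
    return sorted(order[:k])  # all hypotheses up to that rank are rejected

# Hypothetical p-values from per-protein significance tests
significant = benjamini_hochberg([0.01, 0.02, 0.03, 0.50], alpha=0.05)
```

    Note the step-up logic: a p-value that misses its own threshold can still be rejected if a larger p-value further down the sorted list meets its threshold.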

  11. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  12. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages.

    PubMed

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2013-12-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns.

  13. Evaluating Melanoma Drug Response and Therapeutic Escape with Quantitative Proteomics*

    PubMed Central

    Rebecca, Vito W.; Wood, Elizabeth; Fedorenko, Inna V.; Paraiso, Kim H. T.; Haarberg, H. Eirik; Chen, Yi; Xiang, Yun; Sarnaik, Amod; Gibney, Geoffrey T.; Sondak, Vernon K.; Koomen, John M.; Smalley, Keiran S. M.

    2014-01-01

    The evolution of cancer therapy into complex regimens with multiple drugs requires novel approaches for the development and evaluation of companion biomarkers. Liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM) is a versatile platform for biomarker measurement. In this study, we describe the development and use of the LC-MRM platform to study the adaptive signaling responses of melanoma cells to inhibitors of HSP90 (XL888) and MEK (AZD6244). XL888 had good anti-tumor activity against NRAS mutant melanoma cell lines as well as BRAF mutant cells with acquired resistance to BRAF inhibitors both in vitro and in vivo. LC-MRM analysis showed HSP90 inhibition to be associated with decreased expression of multiple receptor tyrosine kinases, modules in the PI3K/AKT/mammalian target of rapamycin pathway, and the MAPK/CDK4 signaling axis in NRAS mutant melanoma cell lines and the inhibition of PI3K/AKT signaling in BRAF mutant melanoma xenografts with acquired vemurafenib resistance. The LC-MRM approach targeting more than 80 cancer signaling proteins was highly sensitive and could be applied to fine needle aspirates from xenografts and clinical melanoma specimens (using 50 μg of total protein). We further showed MEK inhibition to be associated with signaling through the NFκB and WNT signaling pathways, as well as increased receptor tyrosine kinase expression and activation. Validation studies identified PDGF receptor β signaling as a potential escape mechanism from MEK inhibition, which could be overcome through combined use of AZD6244 and the PDGF receptor inhibitor, crenolanib. Together, our studies show LC-MRM to have unique value as a platform for the systems level understanding of the molecular mechanisms of drug response and therapeutic escape. This work provides the proof-of-principle for the future development of LC-MRM assays for monitoring drug responses in the clinic. PMID:24760959

  14. PEAPOL (Program Evaluation at the Performance Objective Level) Outside Evaluation.

    ERIC Educational Resources Information Center

    Auvil, Mary S.

    In evaluating this pilot project, which developed a computer system for assessing student progress and cost effectiveness as related to achievement of performance objectives, interviews were conducted with project participants, including project staff, school administrators, and the auto shop instructors. Project documents were reviewed and a…

  15. An identity verifier evaluation of performance

    SciTech Connect

    Maxwell, R.L.

    1987-01-01

    Because the development of personnel identity verifiers is active in several areas, it is important that an independent comparative evaluation of such devices be continuously pursued so that the security industry can apply them. An evaluation of several verifiers was recently conducted (in the winter of 1986/1987) at Sandia National Laboratories. In a nonrigorous attempt to comparatively evaluate these verifiers in a field security environment, about 80 individuals were enrolled in five different verifiers. The enrollees were then encouraged to attempt a verification on each device several times a day for about four months, such that both single-try and multiple-try information could be extracted from the data. Results indicated a general improvement in verifier performance with regard to accuracy and operating time compared to previous similar evaluations of verifiers at Sandia.

  16. A quantitative evaluation of dry-sensor electroencephalography

    NASA Astrophysics Data System (ADS)

    Uy, E. Timothy

    Neurologists, neuroscientists, and experimental psychologists study electrical activity within the brain by recording voltage fluctuations at the scalp. This is electroencephalography (EEG). In conventional or "wet" EEG, scalp abrasion and use of electrolytic paste are required to ensure good electrical connection between sensor and skin. Repeated abrasion quickly becomes irritating to subjects, severely limiting the number and frequency of sessions. Several groups have produced "dry" EEG sensors that do not require abrasion or conductive paste. These, in addition to sidestepping the issue of abrasion, promise to reduce setup time from about 30 minutes with a technician to less than 30 seconds without one. The availability of such an instrument would (1) reduce the cost of brain-related medical care, (2) lower the barrier of entry on brain experimentation, and (3) allow individual subjects to contribute substantially more data without fear of abrasion or fatigue. Accuracy of the EEG is paramount in the medical diagnosis of epilepsy, in experimental psychology, and in the burgeoning field of brain-computer interfaces. Without a sufficiently accurate measurement, the advantages of dry sensors remain a moot point. However, even after nearly a decade, demonstrations of dry EEG accuracy with respect to wet have been limited to visual comparison of short snippets of spontaneous EEG, averaged event-related potentials, or plots of power spectrum. In this dissertation, I propose a detailed methodology based on single-trial EEG classification for comparing dry EEG sensors to their wet counterparts. Applied to a set of commercially fabricated dry sensors, this work reveals that dry sensors can perform as well as their wet counterparts with careful screening and attention to the bandwidth of interest.
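
    The proposed comparison reduces to measuring single-trial classification accuracy separately for each sensor type and comparing the two numbers. A minimal stand-in classifier (nearest class mean with leave-one-out evaluation; the toy feature vectors below are hypothetical, not EEG data):

```python
def nearest_mean_accuracy(trials, labels):
    """Leave-one-out accuracy of a nearest-class-mean classifier over
    per-trial feature vectors: a minimal stand-in for single-trial
    EEG classification."""
    correct = 0
    for i, (x, y) in enumerate(zip(trials, labels)):
        means = {}
        for c in set(labels):
            rest = [t for j, (t, l) in enumerate(zip(trials, labels))
                    if j != i and l == c]          # exclude the held-out trial
            means[c] = [sum(col) / len(rest) for col in zip(*rest)]
        pred = min(means, key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(x, means[c])))
        correct += pred == y
    return correct / len(trials)

# Hypothetical, well-separated two-class feature vectors
trials = [[0.0, 0.0], [0.5, 0.2], [0.1, 0.4], [5.0, 5.0], [4.8, 5.1], [5.2, 4.9]]
labels = [0, 0, 0, 1, 1, 1]
acc = nearest_mean_accuracy(trials, labels)
```

    Running the same pipeline on features extracted from wet-sensor and dry-sensor recordings of the same task yields directly comparable accuracy figures, which is the crux of the proposed methodology.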

  17. Simulation-based evaluation of the resolution and quantitative accuracy of temperature-modulated fluorescence tomography

    PubMed Central

    Lin, Yuting; Nouizi, Farouk; Kwong, Tiffany C.; Gulsen, Gultekin

    2016-01-01

Conventional fluorescence tomography (FT) can recover the distribution of fluorescent agents within a highly scattering medium. However, poor spatial resolution remains its foremost limitation. Previously, we introduced a new fluorescence imaging technique termed “temperature-modulated fluorescence tomography” (TM-FT), which provides high-resolution images of fluorophore distribution. TM-FT is a multimodality technique that combines fluorescence imaging with focused ultrasound to locate thermo-sensitive fluorescence probes using a priori spatial information to drastically improve the resolution of conventional FT. In this paper, we present an extensive simulation study to evaluate the performance of the TM-FT technique on complex phantoms with multiple fluorescent targets of various sizes located at different depths. In addition, the performance of the TM-FT is tested in the presence of background fluorescence. The results obtained using our new method are systematically compared with those obtained with the conventional FT. Overall, TM-FT provides higher resolution and superior quantitative accuracy, making it an ideal candidate for in vivo preclinical and clinical imaging. For example, a 4 mm diameter inclusion positioned in the middle of a synthetic slab geometry phantom (D: 40 mm × W: 100 mm) is recovered as an elongated object in the conventional FT (x = 4.5 mm; y = 10.4 mm), while TM-FT recovers it successfully in both directions (x = 3.8 mm; y = 4.6 mm). As a result, the quantitative accuracy of the TM-FT is superior because it recovers the concentration of the agent with a 22% error, in contrast with the 83% error of the conventional FT. PMID:26368884

  18. Blind Source Parameters for Performance Evaluation of Despeckling Filters

    PubMed Central

    Biradar, Nagashettappa; Dewal, M. L.; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics such as the speckle suppression index, the speckle suppression and mean preservation index (SMPI), and the beta metric; these three parameters remove the need for a noise-free reference image. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) filter embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective than the wavelet-based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable, whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images. PMID:27298618
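As a minimal illustration of one of the blind metrics named above, the speckle suppression index is commonly computed as the ratio of the coefficient of variation of the filtered image to that of the original. The sketch below uses synthetic data and a crude 3-point mean filter standing in for a real despeckling filter; none of the values come from the paper.

```python
import numpy as np

def speckle_suppression_index(original, filtered):
    """Ratio of the coefficient of variation of the filtered image
    to that of the original; values below 1 indicate suppression."""
    cv = lambda img: np.std(img) / np.mean(img)
    return cv(filtered) / cv(original)

# Synthetic demo: multiplicative speckle smoothed by a 3-point mean filter.
rng = np.random.default_rng(0)
clean = np.full((64, 64), 100.0)
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)
smoothed = (speckled + np.roll(speckled, 1, 0) + np.roll(speckled, 1, 1)) / 3.0

print(speckle_suppression_index(speckled, smoothed))
```

Averaging three nearly independent speckle samples reduces the standard deviation by roughly √3 while preserving the mean, so the index comes out well below 1.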

  19. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  20. Behavioral patterns of environmental performance evaluation programs.

    PubMed

    Li, Wanxin; Mauerhofer, Volker

    2016-11-01

During the past decades numerous environmental performance evaluation programs have been developed and implemented on different geographic scales. This paper develops a taxonomy of environmental management behavioral patterns in order to provide a practical comparison tool for environmental performance evaluation programs. Ten purposively selected programs are mapped against the four identified behavioral patterns: diagnosis, negotiation, learning, and socialization and learning. Overall, we found that schemes which serve to diagnose environmental abnormalities are mainly externally imposed and have been developed as a result of technical debates concerning data sources, methodology, and ranking criteria. Learning-oriented schemes are characterized by processes through which free exchange of ideas and mutual, adaptive learning can occur. Schemes developed by a higher authority to influence the behavior of lower levels of government have been adopted by the evaluated parties to signal their excellent environmental performance. The evaluation schemes classified as socialization and learning have incorporated dialogue, participation, and capacity building in program design. In conclusion we consider the 'fitness for purpose' of the various schemes, the merits of our analytical model, and the future possibilities of fostering capacity building in the realm of wicked environmental challenges.

  1. Error Reduction Program. [combustor performance evaluation codes

    NASA Technical Reports Server (NTRS)

    Syed, S. A.; Chiappetta, L. M.; Gosman, A. D.

    1985-01-01

The details of a study to select, incorporate, and evaluate the best available finite difference scheme to reduce numerical error in combustor performance evaluation codes are described. The combustor performance computer programs chosen were the two-dimensional and three-dimensional versions of Pratt & Whitney's TEACH code. The criteria used to select schemes required that the difference equations mirror the properties of the governing differential equation, be more accurate than the current hybrid difference scheme, be stable and economical, be compatible with TEACH codes, use only modest amounts of additional storage, and be relatively simple. The methods of assessment used in the selection process consisted of examination of the difference equation, evaluation of the properties of the coefficient matrix, Taylor series analysis, and performance on model problems. Five schemes from the literature and three schemes developed during the course of the study were evaluated. This effort resulted in the incorporation of a scheme in 3D-TEACH which is usually more accurate than the hybrid differencing method and never less accurate.

  2. Performance Evaluation of a Data Validation System

    NASA Technical Reports Server (NTRS)

    Wong, Edmond (Technical Monitor); Sowers, T. Shane; Santi, L. Michael; Bickford, Randall L.

    2005-01-01

    Online data validation is a performance-enhancing component of modern control and health management systems. It is essential that performance of the data validation system be verified prior to its use in a control and health management system. A new Data Qualification and Validation (DQV) Test-bed application was developed to provide a systematic test environment for this performance verification. The DQV Test-bed was used to evaluate a model-based data validation package known as the Data Quality Validation Studio (DQVS). DQVS was employed as the primary data validation component of a rocket engine health management (EHM) system developed under NASA's NGLT (Next Generation Launch Technology) program. In this paper, the DQVS and DQV Test-bed software applications are described, and the DQV Test-bed verification procedure for this EHM system application is presented. Test-bed results are summarized and implications for EHM system performance improvements are discussed.

  3. Cavitation performance evaluation for a condensate pump

    NASA Astrophysics Data System (ADS)

Yu, A.; Yu, W. P.; Pan, Z. B.; Luo, X. W.; Ji, B.; Xu, H. Y.

    2013-12-01

Cavitation in a condensate pump with a specific speed of 95 (m, m³·s⁻¹, min⁻¹) was treated in this study. Cavitation performance for the pump was tested experimentally, and the steady-state cavitating flows in the pump impeller were simulated by a RANS method combined with a homogeneous cavitation model. It is noted that the cavitating flow simulation reasonably depicted cavitation development in the pump. Compared with the test results, the numerical simulation predicted somewhat later performance drops due to cavitation. Unfortunately, the cavitation simulation at the operating condition of 50% of the best efficiency point could not predict the head drop of up to 3%. By applying the concept of relative cavity length, cavitation performance evaluation is achieved. For better application, future study is necessary to establish the relation between relative cavity length and performance drop.

  4. A Monte Carlo and physical phantom evaluation of quantitative In-111 SPECT

    NASA Astrophysics Data System (ADS)

    He, Bin; Du, Yong; Song, Xiyun; Segars, W. Paul; Frey, Eric C.

    2005-09-01

Accurate estimation of the 3D in vivo activity distribution is important for dose estimation in targeted radionuclide therapy (TRT). Although SPECT can potentially provide such estimates, SPECT without compensation for image degrading factors is not quantitatively accurate. In this work, we evaluated quantitative SPECT (QSPECT) reconstruction methods that include compensation for various physical effects. Experimental projection data were obtained using a GE VH/Hawkeye system and an RSD torso phantom. Known activities of In-111 chloride were placed in the lungs, liver, heart, background, and two spherical compartments with inner diameters of 22 mm and 34 mm. The 3D NCAT phantom with organ activities based on clinically derived In-111 ibritumomab tiuxetan data was used for the Monte Carlo (MC) simulation studies. Low-noise projection data were simulated using previously validated MC simulation methods. Fifty sets of noisy projections with realistic count levels were generated. Reconstructions were performed using the OS-EM algorithm with various combinations of attenuation (A), scatter (S), geometric response (G), collimator-detector response (D) and partial volume compensation (PVC). The QSPECT images from the various combinations of compensations were evaluated in terms of the accuracy and precision of the estimates of the total activity in each organ. For experimental data, the errors in organ activities for ADS and PVC compensation were less than 6.5% except for the smaller sphere (-11.9%). For the noisy simulated data, the errors in organ activity for ADS compensation were less than 5.5% except for the lungs (20.9%) and blood vessels (15.2%). Errors for other combinations of compensations were significantly (A, AS) or somewhat (AGS) larger. With added PVC, the error in the organ activities improved slightly except for the lungs (11.5%) and blood vessels (3.6%), where the improvement was more substantial. The standard deviation/mean ratios were all less than 1.5%. …

  5. Performance Evaluation Methods for Assistive Robotic Technology

    NASA Astrophysics Data System (ADS)

    Tsui, Katherine M.; Feil-Seifer, David J.; Matarić, Maja J.; Yanco, Holly A.

    Robots have been developed for several assistive technology domains, including intervention for Autism Spectrum Disorders, eldercare, and post-stroke rehabilitation. Assistive robots have also been used to promote independent living through the use of devices such as intelligent wheelchairs, assistive robotic arms, and external limb prostheses. Work in the broad field of assistive robotic technology can be divided into two major research phases: technology development, in which new devices, software, and interfaces are created; and clinical, in which assistive technology is applied to a given end-user population. Moving from technology development towards clinical applications is a significant challenge. Developing performance metrics for assistive robots poses a related set of challenges. In this paper, we survey several areas of assistive robotic technology in order to derive and demonstrate domain-specific means for evaluating the performance of such systems. We also present two case studies of applied performance measures and a discussion regarding the ubiquity of functional performance measures across the sampled domains. Finally, we present guidelines for incorporating human performance metrics into end-user evaluations of assistive robotic technologies.

  6. Hydroponic isotope labeling of entire plants and high-performance mass spectrometry for quantitative plant proteomics.

    PubMed

    Bindschedler, Laurence V; Mills, Davinia J S; Cramer, Rainer

    2012-01-01

Hydroponic isotope labeling of entire plants (HILEP) combines hydroponic plant cultivation and metabolic labeling with stable isotopes, using (15)N-containing inorganic salts to label whole and mature plants. Employing (15)N salts as the sole nitrogen source for HILEP leads to the production of healthy-looking plants which contain (15)N proteins labeled to nearly 100%. Therefore, HILEP is suitable for quantitative plant proteomic analysis, where plants are grown in either (14)N- or (15)N-hydroponic media and pooled when the biological samples are collected for relative proteome quantitation. The pooled (14)N-/(15)N-protein extracts can be fractionated in any suitable way and digested with a protease for shotgun proteomics, typically using reversed-phase liquid chromatography nano-electrospray ionization tandem mass spectrometry (RPLC-nESI-MS/MS). Best results were obtained with a hybrid ion trap/FT-MS mass spectrometer, combining high mass accuracy and sensitivity for the MS data acquisition with speed and high-throughput MS/MS data acquisition, increasing the number of proteins identified and quantified and improving protein quantitation. Peak processing and picking from raw MS data files, protein identification, and quantitation were performed in a highly automated way using integrated MS data analysis software with minimum manual intervention, thus easing the analytical workflow. In this methodology paper, we describe how to grow Arabidopsis plants hydroponically for isotope labeling using (15)N salts and how to quantitate the resulting proteomes using a convenient workflow that does not require extensive bioinformatics skills.
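The relative quantitation step described above reduces, per peptide, to comparing (14)N and (15)N peak intensities between the pooled samples. A minimal sketch with hypothetical extracted-ion intensities (the numbers below are invented for illustration and are not from the paper):

```python
import numpy as np

# Hypothetical extracted-ion intensities for one peptide across 3 replicates.
light_14N = np.array([8.2e5, 7.9e5, 8.5e5])   # plants grown in 14N medium
heavy_15N = np.array([4.0e5, 4.2e5, 3.9e5])   # plants grown in 15N medium

# Per-replicate light/heavy ratios and the mean log2 fold change.
ratios = light_14N / heavy_15N
log2_ratio = np.log2(ratios.mean())
print(f"mean 14N/15N ratio = {ratios.mean():.2f}, log2 = {log2_ratio:.2f}")
```

In a real workflow these intensities come from the integrated MS analysis software; the arithmetic above is the whole of the relative-quantitation calculation once peak picking is done.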

  7. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provides modelers with statistical goodness-of-fit m...

  8. Evaluation of Performance Management in State Schools: A Case of North Cyprus

    ERIC Educational Resources Information Center

    Atamturk, Hakan; Aksal, Fahriye A.; Gazi, Zehra A.; Atamturk, A. Nurdan

    2011-01-01

    The research study aims to evaluate performance management in the state secondary schools in North Cyprus. This study is significant by shedding a light on perceptions of teachers and headmasters regarding quality control of schools through performance management. In this research, quantitative research was employed, and a survey was conducted to…

  9. Dynamic quantitative echocardiographic evaluation of mitral regurgitation in the operating department.

    PubMed

    Gisbert, Alejandro; Soulière, Vicky; Denault, André Y; Bouchard, Denis; Couture, Pierre; Pellerin, Michel; Carrier, Michel; Levesque, Sylvie; Ducharme, Anique; Basmadjian, Arsène J

    2006-02-01

Hemodynamic modifications induced by general anesthesia could lead to underestimation of mitral regurgitation (MR) severity in the operating department, with potentially serious consequences. The intraoperative severity of MR was prospectively compared with the preoperative baseline evaluation using dynamic quantitative transesophageal echocardiography in 25 stable patients with MR grade 2/4 or greater undergoing coronary bypass, mitral valve operation, or both. Significant changes in the severity of MR using transesophageal echocardiographic criteria occurred after the induction of general anesthesia and with phenylephrine. Quantitative transesophageal echocardiographic evaluation of MR using effective orifice area and vena contracta, together with a phenylephrine challenge, was useful to avoid underestimating MR severity in the operating department.

  10. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

Atherosclerosis is a primary cause of critical ischemic diseases such as myocardial infarction and stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that could evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band-pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber with a 0.7 mm outer diameter, and an irradiation fiber which consists of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected in an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of atherosclerotic plaque phantoms based on the lipid volume fractions was performed down to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
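The spectral angle mapper method named above classifies each pixel by the angle between its measured spectrum and a reference spectrum; smaller angles mean a closer match regardless of overall brightness. A minimal three-band sketch (the reflectance values below are hypothetical, not taken from the paper):

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Three-band spectra (1150, 1200, 1300 nm); values are invented reflectances.
lipid_ref = np.array([0.60, 0.35, 0.55])   # absorption dip near 1200 nm
pixel_a   = np.array([0.58, 0.33, 0.52])   # lipid-like shape, dimmer overall
pixel_b   = np.array([0.50, 0.52, 0.51])   # flat spectrum, non-lipid

print(spectral_angle(pixel_a, lipid_ref) < spectral_angle(pixel_b, lipid_ref))
```

Because SAM depends only on spectral shape, the dimmer but lipid-shaped `pixel_a` matches the reference more closely than the brighter, flat `pixel_b`.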

  11. Performance Evaluation of Kitchen Exhaust Draft Hoods.

    DTIC Science & Technology

    1980-03-01

Performance Evaluation of Kitchen Exhaust Draft Hoods. P. B. Shepherd, R. H. Neisel. Johns-Manville Sales Corporation, Research & Development Center, Ken-Caryl Ranch, Denver, Colorado 80217. March 1980, final report.

  12. Quantitative evaluation of the requirements for the promotion as associate professor at German Medical Faculties

    PubMed Central

    Sorg, Heiko; Knobloch, Karsten

    2012-01-01

Background: First quantitative evaluation of the requirements for promotion to associate professor (AP) at German Medical Faculties. Material and methods: Analysis of the AP regulations of German Medical Faculties according to a validated scoring system, which was adapted to this study. Results: The overall score for the AP requirements at 35 German Medical Faculties was 13.5±0.6 of 20 possible points (95% confidence interval 12.2-14.7). More than 88% of the AP regulations demand sufficient performance in teaching and research with adequate scientific publication. Furthermore, 83% of the faculties expect an expert review of the candidate's performance. Conference presentations required as an assistant professor, as well as reduction of the minimum time as an assistant professor, play only minor roles. Conclusion: The requirements for assistant professors to be nominated as associate professor at German Medical Faculties are high, with only a small range. In detail, however, there still exists large heterogeneity, which hinders equal opportunities and career possibilities. These data might be used for the ongoing objective discussion. PMID:23255964

  13. Performance evaluation and clinical applications of 3D plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Decker, Ryan; Shademan, Azad; Opfermann, Justin; Leonard, Simon; Kim, Peter C. W.; Krieger, Axel

    2015-06-01

The observation and 3D quantification of arbitrary scenes using optical imaging systems is challenging, but increasingly necessary in many fields. This paper provides a technical basis for the application of plenoptic cameras in medical and medical robotics applications, and rigorously evaluates camera integration and performance in the clinical setting. It discusses plenoptic camera calibration and setup, and assesses plenoptic imaging in a clinically relevant context and in the context of other quantitative imaging technologies. We report the methods used for camera calibration, and precision and accuracy results in an ideal and simulated surgical setting. Afterwards, we report performance during a surgical task. Test results showed the average precision of the plenoptic camera to be 0.90 mm, increasing to 1.37 mm for tissue across the calibrated FOV. The ideal accuracy was 1.14 mm. The camera showed submillimeter error during a simulated surgical task.

  14. Quantitative evaluation of regional vegetation ecological environment quality by using remotely sensed data over Qingjiang, Hubei

    NASA Astrophysics Data System (ADS)

    Wang, Cheng; Sun, Yan; Li, Lijun; Zhang, Qiuwen

    2007-11-01

Vegetation cover is an important component, and the best indicator, of the quality of a regional ecological environment. This paper adopts a new method for quantitatively evaluating regional vegetation ecological environment quality (VEEQ), integrating remote sensing technology with a composite-index appraisal model based on multiple linear regression. This method differs from traditional ecological environment research methods. It fully utilizes the advantages of quantitative remote sensing technology: it extracts the key influencing factors of VEEQ, such as vegetation indices (RVI, NDVI, ARVI, TMG), humidity indices (NDMI, MI, TMW), and soil and landform indices (NDSI, TMB, GRABS), as evaluating parameters directly from Landsat-5 TM remotely sensed images, and then feeds these factors into the multiple linear regression evaluation model. Ultimately we obtain the VEEQ evaluation rank figure for the experimental field, part of the Qingjiang region. The multiple linear regression model proved to fit the experimental field well for vegetation ecological environment evaluation research.
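A multiple-linear-regression evaluation model of the kind described can be sketched in a few lines. The predictors and quality scores below are invented for illustration, and only three indices stand in for the paper's full set of vegetation, humidity, and soil indices:

```python
import numpy as np

# Hypothetical per-site predictors (e.g. NDVI, a moisture index, a soil index)
# regressed against field-surveyed quality scores; all values are illustrative.
X = np.array([[0.62, 0.41, 0.20],
              [0.55, 0.38, 0.25],
              [0.30, 0.22, 0.45],
              [0.71, 0.47, 0.15],
              [0.44, 0.30, 0.33]])
y = np.array([0.80, 0.72, 0.35, 0.90, 0.55])   # surveyed quality scores

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
predicted = A @ coef
print(np.round(predicted, 2))
```

The fitted coefficients play the role of the model weights; applying `A @ coef` pixel by pixel over an image would yield the kind of evaluation-rank map the abstract describes.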

  15. Project Performance Evaluation Using Deep Belief Networks

    NASA Astrophysics Data System (ADS)

    Nguvulu, Alick; Yamato, Shoso; Honma, Toshihisa

A Project Assessment Indicator (PAI) Model has recently been applied to evaluate monthly project performance based on 15 project elements derived from the project management (PM) knowledge areas. While the PAI Model comprehensively evaluates project performance, it lacks objectivity and universality. It lacks objectivity because experts assign model weights intuitively, based on their PM skills and experience. It lacks universality because the allocation of ceiling scores to project elements is done ad hoc, based on the empirical rule, without taking into account the interactions between the project elements. This study overcomes these limitations by applying a DBN approach in which the model automatically assigns weights and allocates ceiling scores to the project elements based on the DBN weights, which capture the interactions between the project elements. We train our DBN on 5 IT projects of 12 months' duration and test it on 8 IT projects with less than 12 months' duration. We completely eliminate the manual assignment of weights and compute ceiling scores of project elements based on DBN weights. Our trained DBN evaluates monthly project performance of the 8 test projects based on the 15 project elements to within a monthly relative error margin of between ±1.03% and ±3.30%.

  16. Detection and quantitation of HBV DNA in miniaturized samples: multi centre study to evaluate the performance of the COBAS ® AmpliPrep/COBAS ® TaqMan ® hepatitis B virus (HBV) test v2.0 by the use of plasma or serum specimens.

    PubMed

    Berger, Annemarie; Gohl, Peter; Stürmer, Martin; Rabenau, Holger Felix; Nauck, Markus; Doerr, Hans Wilhelm

    2010-11-01

Laboratory analysis of blood specimens is an increasingly important tool for rapid diagnosis and control of therapy. Miniaturization of test systems is therefore needed, but reduced specimen volumes might impair test quality. For rapid detection and quantitation of HBV DNA, the COBAS(®) AmpliPrep/COBAS(®) TaqMan(®) HBV test has proved a robust instrument in routine diagnostic services. The test system has recently been modified for application to reduced samples of blood plasma and to blood serum as well. The performance of this modified COBAS(®) AmpliPrep/COBAS(®) TaqMan(®) HBV v2.0 (HBV v2.0; this test is currently not available in the USA) test was evaluated by comparison with the former COBAS(®) AmpliPrep/COBAS(®) TaqMan(®) HBV v1.0 (HBV v1.0) test. In this study a platform correlation of both assay versions was performed, including 275 HBV DNA positive EDTA plasma samples. Comparable results were obtained (R(2)=0.97, mean difference -0.03 log(10) IU/ml). The verification of equivalency of the sample matrix (plasma vs. serum samples tested in HBV v2.0 in the same run) showed comparable results for all 278 samples, with R(2)=0.99 and a mean difference of 0.06 log(10) IU/ml. In conclusion, the new test version HBV v2.0 is highly specific and reproducible and accurately quantifies HBV DNA in EDTA plasma and serum samples from patients with chronic HBV infection.
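The platform-correlation statistics quoted above (mean log10 difference and R(2)) can be reproduced on paired measurements in a few lines; the viral-load values below are hypothetical, not from the study:

```python
import numpy as np

# Hypothetical paired viral loads (log10 IU/ml) from two assay versions.
v1 = np.array([2.1, 3.4, 4.8, 5.6, 6.9, 7.3])
v2 = np.array([2.0, 3.5, 4.7, 5.7, 6.9, 7.2])

mean_diff = np.mean(v2 - v1)                 # systematic offset between assays
r2 = np.corrcoef(v1, v2)[0, 1] ** 2          # agreement across the range
print(f"mean difference = {mean_diff:+.2f} log10 IU/ml, R^2 = {r2:.3f}")
```

A small mean difference with a high R(2), as in the study, indicates the two versions agree with no material systematic bias across the quantifiable range.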

  17. Analytical performance evaluation for autonomous sensor fusion

    NASA Astrophysics Data System (ADS)

    Chang, K. C.

    2008-04-01

A distributed data fusion system consists of a network of sensors, each capable of local processing and fusion of sensor data. There has been a great deal of work in developing distributed fusion algorithms applicable to a network-centric architecture. Currently there are at least a few approaches, including naive fusion, cross-correlation fusion, information graph fusion, maximum a posteriori (MAP) fusion, channel filter fusion, and covariance intersection fusion. However, in general, in a distributed system such as an ad hoc sensor network, the communication architecture is not fixed. Each node has knowledge of only its local connectivity, not the global network topology. In those cases, a distributed fusion algorithm based on the information-graph type of approach may not scale, due to its requirement to carry long pedigree information for decorrelation. In this paper, we focus on scalable fusion algorithms and conduct analytical performance evaluation to compare them. The goal is to understand the performance of those algorithms under different operating conditions. Specifically, we evaluate the performance of the channel filter, Chernoff, Shannon, and Bhattacharyya fusion algorithms. We also compare their results to naive fusion and "optimal" centralized fusion algorithms under a specific communication pattern.
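Among the fusion rules listed, covariance intersection has a particularly compact closed form: the fused information matrix is a convex combination of the two input information matrices, which keeps the estimate consistent when the cross-correlation between the inputs is unknown. A minimal sketch with toy numbers (not an evaluation from the paper):

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, omega):
    """Fuse two estimates with unknown cross-correlation.
    omega in [0, 1] weights the two information matrices."""
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(omega * P1i + (1 - omega) * P2i)
    x = P @ (omega * P1i @ x1 + (1 - omega) * P2i @ x2)
    return x, P

# Two 2-D estimates, each more confident in a different axis.
x1, P1 = np.array([1.0, 0.0]), np.diag([2.0, 1.0])
x2, P2 = np.array([1.2, 0.1]), np.diag([1.0, 2.0])
x, P = covariance_intersection(x1, P1, x2, P2, omega=0.5)
print(x, np.diag(P))
```

In practice `omega` is chosen by a small optimization (e.g. minimizing the trace or determinant of the fused covariance); it is fixed at 0.5 here only to keep the sketch short.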

  18. 'Stories' or 'snapshots'? A study directed at comparing qualitative and quantitative approaches to curriculum evaluation.

    PubMed

    Pateman, B; Jinks, A M

    1999-01-01

The focus of this paper is a study designed to explore the validity of quantitative approaches to student evaluation in a pre-registration degree programme. As managers of the students' education we were concerned that the quantitative method, which used lecturer criteria, might not fully represent students' views. The approach taken is that of a process-type strategy for curriculum evaluation as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences, through use of semi-structured interviews. The results are then compared to the current quantitative measurement tools designed to obtain 'snapshots' of the educational effectiveness of the curriculum. The quantitative measurement tools use Likert-scale measurements of teacher-devised criterion statements. The results of the study give a rich source of qualitative data which can be used to inform future curriculum development. However, complete validation of the current quantitative instruments was not achieved in this study. Student and teacher agendas in respect of important issues pertaining to the course programme were found to differ. Limitations of the study are given. There is discussion of the options open to the management team with regard to future development of curriculum evaluation systems.

  19. Preclinical Performance Evaluation of Percutaneous Glucose Biosensors

    PubMed Central

    Soto, Robert J.; Schoenfisch, Mark H.

    2015-01-01

    The utility of continuous glucose monitoring devices remains limited by an obstinate foreign body response (FBR) that degrades the analytical performance of the in vivo sensor. A number of novel materials that resist or delay the FBR have been proposed as outer, tissue-contacting glucose sensor membranes as a strategy to improve sensor accuracy. Traditionally, researchers have examined the ability of a material to minimize the host response by assessing adsorbed cell morphology and tissue histology. However, these techniques do not adequately predict in vivo glucose sensor function, necessitating sensor performance evaluation in a relevant animal model prior to human testing. Herein, the effects of critical experimental parameters, including the animal model and data processing methods, on the reliability and usefulness of preclinical sensor performance data are considered. PMID:26085566

  20. Group 3: Performance evaluation and assessment

    NASA Technical Reports Server (NTRS)

    Frink, A.

    1981-01-01

Line-oriented flight training provides a unique learning experience and an opportunity to look at aspects of performance that other types of training do not provide. Areas such as crew coordination, resource management, and leadership can be readily evaluated in such a format. While individual performance is of the utmost importance, crew performance deserves equal emphasis; therefore, these areas should be carefully observed by the instructors as an area for discussion, in the same way that individual performance is observed. To be effective, it must be accepted by the crew members and administered by the instructors as pure training: learning through experience. To keep open minds and to benefit most from the experience, both in the doing and in the follow-on discussion, it is essential that it be entered into with a feeling of freedom, openness, and enthusiasm. Reserve or defensiveness arising from concern about failure will inhibit participation.

  1. Evaluating Algorithm Performance Metrics Tailored for Prognostics

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2009-01-01

    Prognostics has taken center stage in Condition Based Maintenance (CBM), where it is desired to estimate the Remaining Useful Life (RUL) of a system so that remedial measures may be taken in advance to avoid catastrophic events or unwanted downtimes. Validation of such predictions is an important but difficult proposition, and a lack of appropriate evaluation methods renders prognostics meaningless. Evaluation methods currently used in the research community are not standardized and in many cases do not sufficiently assess the key performance aspects expected of a prognostics algorithm. In this paper we introduce several new evaluation metrics tailored for prognostics and show that they can effectively evaluate various algorithms as compared to other conventional metrics. Specifically, four algorithms are compared: Relevance Vector Machine (RVM), Gaussian Process Regression (GPR), Artificial Neural Network (ANN), and Polynomial Regression (PR). These algorithms vary in complexity and in their ability to manage uncertainty around predicted estimates. Results show that the new metrics rank these algorithms in a different manner and that, depending on the requirements and constraints, suitable metrics may be chosen. Beyond these results, the metrics offer ideas about how metrics suitable for prognostics may be designed so that the evaluation procedure can be standardized.
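    As an illustration of the kind of prognostics-specific metric the paper advocates, here is a hedged sketch of an alpha-lambda style accuracy check and a relative-accuracy score; the function names, thresholds, and numbers are my own, not the authors':

```python
# Illustrative sketch (not the authors' reference implementation): an
# alpha-lambda style accuracy check for RUL predictions, one idea behind
# prognostics-specific metrics. All names and values here are hypothetical.

def alpha_lambda_accurate(true_rul, predicted_rul, alpha=0.2):
    """True if the prediction lies within +/- alpha * true RUL."""
    lower = (1 - alpha) * true_rul
    upper = (1 + alpha) * true_rul
    return lower <= predicted_rul <= upper

def relative_accuracy(true_rul, predicted_rul):
    """Relative accuracy: 1 - |error| / true RUL (higher is better)."""
    return 1.0 - abs(true_rul - predicted_rul) / true_rul

# Example: at a checkpoint the true RUL is 100 cycles, the algorithm says 85.
print(alpha_lambda_accurate(100, 85))        # within the +/-20% cone -> True
print(round(relative_accuracy(100, 85), 2))  # 0.85
```

    Checking accuracy inside a cone that narrows as failure approaches is what distinguishes such metrics from conventional error measures, which treat early and late predictions alike.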

  2. Evaluation of Fourier Transform Profilometry for Quantitative Waste Volume Determination under Simulated Hanford Tank Conditions

    SciTech Connect

    Etheridge, J.A.; Jang, P.R.; Leone, T.; Long, Z.; Norton, O.P.; Okhuysen, W.P.; Monts, D.L.; Coggins, T.L.

    2008-07-01

    The Hanford Site is currently engaged in an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the chemical makeup of the residue. The objective of Mississippi State University's Institute for Clean Energy Technology's (ICET) efforts is to develop, fabricate, and deploy inspection tools for the Hanford waste tanks that will (1) be remotely operable; (2) provide quantitative information on the amount of wastes remaining; and (3) provide information on the spatial distribution of chemical and radioactive species of interest. A collaborative arrangement has been established with the Hanford Site to develop probe-based inspection systems for deployment in the waste tanks. ICET is currently developing an in-tank inspection system based on Fourier Transform Profilometry (FTP), a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP can determine the height (depth) distribution (and hence the volume distribution) of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. FTP therefore has the potential to be utilized for quantitative determination of residual wastes within Hanford waste tanks. We are conducting a multi-stage performance evaluation of FTP in order to document the accuracy, precision, and (minimal) operator dependence of FTP under conditions similar to those that can be expected to pertain within Hanford waste tanks. The successive stages impose aspects that present increasing difficulty and increasingly more accurate approximations of in-tank environments. In this paper, we report our investigations of the dependence of FTP volume determination results upon the analyst and of the

  3. Quantitative Evaluation of DNA Hypermethylation in Malignant and Benign Breast Tissue and Fluids

    PubMed Central

    Zhu, Weizhu; Qin, Wenyi; Hewett, John E.; Sauter, Edward R.

    2012-01-01

    Assessment of DNA methylation has demonstrated alterations in malignant compared to benign breast tissue. The purpose of our study was to 1) confirm the predictive ability of methylation assessment in breast tissue, and 2) use the genes found to be cancer predictive in tissue to evaluate the diagnostic potential of hypermethylation assessment in nipple aspirate fluid (NAF) and mammary ductoscopic (MD) samples. Quantitative methylation specific (qMS)-PCR was conducted on three specimen sets: 44 malignant (CA) and 34 normal (NL) tissue specimens, 18 matched CA, adjacent normal (ANL) tissue and NAF specimens, and 119 MD specimens. Training and validation tissue sets were analyzed to determine the optimal group of cancer predictive genes for NAF and MD analysis. Cytologic review of NAF and MD samples was also performed. Methylation of CCND-2, p16, RAR-β and RASSF-1a was significantly more prevalent in tumor than in normal tissue specimens. Receiver operating characteristic curve analysis demonstrated an area under the curve of 0.96. For the 18 matched CA, ANL and NAF specimens, the four predictive genes identified in cancer tissue showed increased methylation in CA vs. ANL tissue; NAF samples had higher methylation than ANL specimens. Methylation frequency was higher in MD specimens from breasts with cancer than in benign samples for p16 and RASSF-1a. In summary, 1) routine quantitative DNA methylation assessment in NAF and MD samples is possible, and 2) genes hypermethylated in malignant breast tissue are also altered in matched NAF and MD samples, and may be useful to assist in early breast cancer detection. PMID:19618401
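    The ROC analysis mentioned above (area under the curve of 0.96) can be illustrated with a small rank-based AUC computation; the methylation values below are invented for the sketch, not taken from the study:

```python
# Hedged sketch: computing an ROC AUC from methylation scores of tumor vs.
# normal samples, using the Mann-Whitney identity
# AUC = P(score_tumor > score_normal) (+ 0.5 per tie). Data are made up.

def roc_auc(positives, negatives):
    wins = 0.0
    for p in positives:
        for n in negatives:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positives) * len(negatives))

tumor  = [0.9, 0.8, 0.7, 0.6]   # hypothetical methylation levels, cancer tissue
normal = [0.3, 0.2, 0.4, 0.65]  # hypothetical methylation levels, normal tissue
print(roc_auc(tumor, normal))   # 0.9375
```

    An AUC near 1 means the methylation level almost always ranks a tumor sample above a normal one; 0.5 would mean no discriminative power.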

  4. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue, both to prevent the occurrence of cirrhosis and to initiate appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness has been studied for detecting and evaluating tumors. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of the fibrosis stage.

  5. Quantitative Evaluation of a First Year Seminar Program: Relationships to Persistence and Academic Success

    ERIC Educational Resources Information Center

    Jenkins-Guarnieri, Michael A.; Horne, Melissa M.; Wallis, Aaron L.; Rings, Jeffrey A.; Vaughan, Angela L.

    2015-01-01

    In the present study, we conducted a quantitative evaluation of a novel First Year Seminar (FYS) program with a coordinated curriculum implemented at a public, four-year university to assess its potential role in undergraduate student persistence decisions and academic success. Participants were 2,188 first-year students, 342 of whom completed the…

  6. Toward Web-Site Quantitative Evaluation: Defining Quality Characteristics and Attributes.

    ERIC Educational Resources Information Center

    Olsina, L; Rossi, G.

    This paper identifies World Wide Web site characteristics and attributes and groups them in a hierarchy. The primary goal is to classify the elements that might be part of a quantitative evaluation and comparison process. In order to effectively select quality characteristics, different users' needs and behaviors are considered. Following an…

  7. Quantitative Skills as a Graduate Learning Outcome: Exploring Students' Evaluative Expertise

    ERIC Educational Resources Information Center

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2017-01-01

    In the biosciences, quantitative skills are an essential graduate learning outcome. Efforts to evidence student attainment at the whole of degree programme level are rare and making sense of such data is complex. We draw on assessment theories from Sadler (evaluative expertise) and Boud (sustainable assessment) to interpret final-year bioscience…

  8. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  9. Raman spectral imaging for quantitative contaminant evaluation in skim milk powder

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study uses a point-scan Raman spectral imaging system for quantitative detection of melamine in milk powder. A sample depth of 2 mm and corresponding laser intensity of 200 mW were selected after evaluating the penetration of a 785 nm laser through milk powder. Horizontal and vertical spatial r...

  10. Complementarity as a Program Evaluation Strategy: A Focus on Qualitative and Quantitative Methods.

    ERIC Educational Resources Information Center

    Lafleur, Clay

    Use of complementarity as a deliberate and necessary program evaluation strategy is discussed. Quantitative and qualitative approaches are viewed as complementary and can be integrated into a single study. The synergy that results from using complementary methods in a single study seems to enhance understanding and interpretation. A review of the…

  11. Evaluation of a quantitative phosphorus transport model for potential improvement of southern phosphorus indices

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Due to a shortage of available phosphorus (P) loss data sets, simulated data from a quantitative P transport model could be used to evaluate a P-index. However, the model would need to accurately predict the P loss data sets that are available. The objective of this study was to compare predictions ...

  12. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  13. Integrating Qualitative Methods in a Predominantly Quantitative Evaluation: A Case Study and Some Reflections.

    ERIC Educational Resources Information Center

    Mark, Melvin M.; Feller, Irwin; Button, Scott B.

    1997-01-01

    A review of qualitative methods used in a predominantly quantitative evaluation indicates a variety of roles for such a mixing of methods, including framing and revising research questions, assessing the validity of measures and adaptations to program implementation, and gauging the degree of uncertainty and generalizability of conclusions.…

  14. Performance evaluation of triangulation based range sensors.

    PubMed

    Guidi, Gabriele; Russo, Michele; Magrassi, Grazia; Bordegoni, Monica

    2010-01-01

    The performance of 2D digital imaging systems depends on several factors related to both optical and electronic processing. These factors have given rise to standards, conceived for photographic equipment and two-dimensional scanning systems, aimed at estimating parameters such as resolution, noise, or dynamic range. Conversely, no standard test protocols currently exist for evaluating the corresponding performance of 3D imaging systems such as laser scanners or pattern-projection range cameras. This paper focuses on investigating experimental processes for evaluating some critical parameters of 3D equipment by extending the concepts defined by the ISO standards to the 3D domain. The experimental part of this work concerns the characterization of different range sensors through the extraction of their resolution, accuracy, and uncertainty from sets of 3D data acquisitions of specifically designed test objects whose geometrical characteristics are known in advance. The major objective of this contribution is to suggest an easy characterization process for generating a reliable comparison between the performances of different range sensors and for checking whether a specific piece of equipment is compliant with the expected characteristics.
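    The characterization the paper proposes, extracting accuracy and uncertainty from repeated scans of a reference target whose geometry is known in advance, can be sketched as follows (the measurement values are invented):

```python
# Sketch of the kind of characterization described above: given repeated
# range measurements of a reference target at a known distance, estimate
# accuracy (systematic error) and uncertainty (spread). Values are invented.
import math

def characterize(measurements, true_value):
    n = len(measurements)
    mean = sum(measurements) / n
    accuracy = mean - true_value              # systematic offset
    var = sum((m - mean) ** 2 for m in measurements) / (n - 1)
    uncertainty = math.sqrt(var)              # 1-sigma repeatability
    return accuracy, uncertainty

depths_mm = [500.2, 500.4, 499.9, 500.3, 500.2]  # scans of a plane at 500 mm
acc, unc = characterize(depths_mm, 500.0)
print(f"accuracy={acc:.2f} mm, uncertainty={unc:.2f} mm")
```

    Separating the systematic offset from the random spread is what lets two sensors be compared fairly: one may be biased but repeatable, the other unbiased but noisy.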

  15. Performance evaluation of an automotive thermoelectric generator

    NASA Astrophysics Data System (ADS)

    Dubitsky, Andrei O.

    Around 40% of the total fuel energy in typical internal combustion engines (ICEs) is rejected to the environment in the form of exhaust gas waste heat. Efficient recovery of this waste heat in automobiles can promise a fuel economy improvement of 5%. The thermal energy can be harvested through thermoelectric generators (TEGs) utilizing the Seebeck effect. In the present work, a versatile test bench has been designed and built in order to simulate conditions found on test vehicles. This allows experimental performance evaluation and model validation of automotive thermoelectric generators. An electrically heated exhaust gas circuit and a circulator based coolant loop enable integrated system testing of hot and cold side heat exchangers, thermoelectric modules (TEMs), and thermal interface materials at various scales. A transient thermal model of the coolant loop was created in order to design a system which can maintain constant coolant temperature under variable heat input. Additionally, as electrical heaters cannot match the transient response of an ICE, modelling was completed in order to design a relaxed exhaust flow and temperature history utilizing the system thermal lag. This profile reduced required heating power and gas flow rates by over 50%. The test bench was used to evaluate a DOE/GM initial prototype automotive TEG and validate analytical performance models. The maximum electrical power generation was found to be 54 W with a thermal conversion efficiency of 1.8%. It has been found that thermal interface management is critical for achieving maximum system performance, with novel designs being considered for further improvement.

  16. Evaluating iterative reconstruction performance in computed tomography

    SciTech Connect

    Chen, Baiyu Solomon, Justin; Ramirez Giraldo, Juan Carlos; Samei, Ehsan

    2014-12-15

    Purpose: Iterative reconstruction (IR) offers notable advantages in computed tomography (CT). However, its performance characterization is complicated by its potentially nonlinear behavior, impacting performance in terms of specific tasks. This study aimed to evaluate the performance of IR with both task-specific and task-generic strategies. Methods: The performance of IR in CT was mathematically assessed with an observer model that predicted the detection accuracy in terms of the detectability index (d′). d′ was calculated based on the properties of the image noise and resolution, the observer, and the detection task. The characterizations of image noise and resolution were extended to accommodate the nonlinearity of IR. A library of tasks was mathematically modeled at a range of sizes (radius 1–4 mm), contrast levels (10–100 HU), and edge profiles (sharp and soft). Unique d′ values were calculated for each task with respect to five radiation exposure levels (volume CT dose index, CTDI{sub vol}: 3.4–64.8 mGy) and four reconstruction algorithms (filtered backprojection reconstruction, FBP; iterative reconstruction in imaging space, IRIS; and sinogram affirmed iterative reconstruction with strengths of 3 and 5, SAFIRE3 and SAFIRE5; all provided by Siemens Healthcare, Forchheim, Germany). The d′ values were translated into the areas under the receiver operating characteristic curve (AUC) to represent human observer performance. For each task and reconstruction algorithm, a threshold dose was derived as the minimum dose required to achieve a threshold AUC of 0.9. A task-specific dose reduction potential of IR was calculated as the difference between the threshold doses for IR and FBP. A task-generic comparison was further made between IR and FBP in terms of the percent of all tasks yielding an AUC higher than the threshold. Results: IR required less dose than FBP to achieve the threshold AUC. In general, SAFIRE5 showed the most significant dose reduction
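    The abstract's translation from detectability index to observer AUC can be sketched under the standard Gaussian observer assumption AUC = Φ(d′/√2), with the threshold dose taken as the lowest dose whose d′ reaches the target AUC of 0.9. The dose-to-d′ pairs below are invented for illustration, not taken from the study:

```python
# Hedged sketch: translating d' into AUC and finding a threshold dose, as in
# the task-specific strategy described above. Dose/d' pairs are hypothetical.
import math

def auc_from_dprime(dprime):
    # AUC = Phi(d'/sqrt(2)) = 0.5 * (1 + erf(d'/2)) under Gaussian assumptions
    return 0.5 * (1.0 + math.erf(dprime / 2.0))

def threshold_dose(dose_to_dprime, target_auc=0.9):
    # Lowest dose whose AUC reaches the target.
    for dose, d in sorted(dose_to_dprime.items()):
        if auc_from_dprime(d) >= target_auc:
            return dose
    return None

# Hypothetical d' values for one task at five CTDIvol levels (mGy):
measured = {3.4: 0.9, 8.1: 1.4, 16.2: 2.0, 32.4: 2.9, 64.8: 4.1}
print(threshold_dose(measured))  # 16.2
```

    Repeating this for FBP and for an IR algorithm, and differencing the two threshold doses, gives the task-specific dose reduction potential the study reports.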

  17. Seismic Performance Evaluation of Concentrically Braced Frames

    NASA Astrophysics Data System (ADS)

    Hsiao, Po-Chien

    Concentrically braced frames (CBFs) are broadly used as lateral-load resisting systems in buildings throughout the US. In high seismic regions, special concentrically braced frames (SCBFs) are used where ductility under seismic loading is necessary. Their large elastic stiffness and strength efficiently sustain the seismic demands during smaller, more frequent earthquakes. During large, infrequent earthquakes, SCBFs exhibit highly nonlinear behavior due to brace buckling and yielding and the inelastic behavior induced by secondary deformation of the framing system. These response modes reduce the system demands relative to an elastic system without supplemental damping. In design, these reduced demands are estimated using a response modification coefficient, commonly termed the R factor. The R factor values are important to the seismic performance of a building. Procedures put forth in FEMA P695 were developed to determine R factors through a formalized procedure, with the objective of a consistent level of collapse potential for all building types. The primary objective of the research was to evaluate the seismic performance of SCBFs. To achieve this goal, an improved model, including a proposed gusset plate connection model for SCBFs, was developed and validated using a large number of experiments; it permits accurate simulation of inelastic deformations of the brace, gusset plate connections, beams and columns, as well as brace fracture. Response history analyses were conducted using the validated model. A series of SCBF buildings of different story heights were designed and evaluated. The FEMA P695 method and an alternate procedure were applied to SCBFs and NCBFs, which are designed without ductile detailing. The evaluation using the P695 method shows results contrary to the alternate evaluation procedure and to current knowledge, in which short-story SCBF structures are more vulnerable than their taller counterparts and NCBFs are more vulnerable than SCBFs.

  18. Performance evaluation of swimmers: scientific tools.

    PubMed

    Smith, David J; Norris, Stephen R; Hogg, John M

    2002-01-01

    The purpose of this article is to provide a critical commentary of the physiological and psychological tools used in the evaluation of swimmers. The first-level evaluation should be the competitive performance itself, since it is at this juncture that all elements interplay and provide the 'highest form' of assessment. Competition video analysis of major swimming events has progressed to the point where it has become an indispensable tool for coaches, athletes, sport scientists, equipment manufacturers, and even the media. The breakdown of each swimming performance at the individual level to its constituent parts allows for comparison with the predicted or sought after execution, as well as allowing for comparison with identified world competition levels. The use of other 'on-going' monitoring protocols to evaluate training efficacy typically involves criterion 'effort' swims and specific training sets where certain aspects are scrutinised in depth. Physiological parameters that are often examined alongside swimming speed and technical aspects include oxygen uptake, heart rate, blood lactate concentration, blood lactate accumulation and clearance rates. Simple and more complex procedures are available for in-training examination of technical issues. Strength and power may be quantified via several modalities although, typically, tethered swimming and dry-land isokinetic devices are used. The availability of a 'swimming flume' does afford coaches and sport scientists a higher degree of flexibility in the type of monitoring and evaluation that can be undertaken. There is convincing evidence that athletes can be distinguished on the basis of their psychological skills and emotional competencies and that these differences become further accentuated as the athlete improves. No matter what test format is used (physiological, biomechanical or psychological), similar criteria of validity must be ensured so that the test provides useful and associative information

  19. Performance evaluation of MPEG internet video coding

    NASA Astrophysics Data System (ADS)

    Luo, Jiajia; Wang, Ronggang; Fan, Kui; Wang, Zhenyu; Li, Ge; Wang, Wenmin

    2016-09-01

    Internet Video Coding (IVC) has been developed in MPEG by combining well-known existing technology elements and new coding tools with royalty-free declarations. In June 2015, IVC project was approved as ISO/IEC 14496-33 (MPEG- 4 Internet Video Coding). It is believed that this standard can be highly beneficial for video services in the Internet domain. This paper evaluates the objective and subjective performances of IVC by comparing it against Web Video Coding (WVC), Video Coding for Browsers (VCB) and AVC High Profile. Experimental results show that IVC's compression performance is approximately equal to that of the AVC High Profile for typical operational settings, both for streaming and low-delay applications, and is better than WVC and VCB.

  20. Performance Evaluation of Emerging High Performance Computing Technologies using WRF

    NASA Astrophysics Data System (ADS)

    Newby, G. B.; Morton, D.

    2008-12-01

    The Arctic Region Supercomputing Center (ARSC) has evaluated multicore processors and other emerging processor technologies for a variety of high performance computing applications in the earth and space sciences, especially climate and weather applications. A flagship effort has been to assess dual core processor nodes on ARSC's Midnight supercomputer, in which two-socket systems were compared to eight-socket systems. Midnight is utilized for ARSC's twice-daily weather research and forecasting (WRF) model runs, available at weather.arsc.edu. Among other findings on Midnight, it was found that the Hypertransport system for interconnecting Opteron processors, memory, and other subsystems does not scale as well on eight-socket (sixteen processor) systems as on two-socket (four processor) systems. A fundamental limitation is the cache snooping operation performed whenever a computational thread accesses main memory. This increases memory latency as the number of processor sockets increases. This is particularly noticeable on applications such as WRF that are primarily CPU-bound, versus applications that are bound by input/output or communication. The new Cray XT5 supercomputer at ARSC features quad core processors and will host a variety of scaling experiments for WRF, CCSM4, and other models. Early results will be presented, including a series of WRF runs for Alaska with grid resolutions under 2 km. ARSC will discuss a set of standardized test cases for the Alaska domain, similar to existing test cases for CONUS. These test cases will provide different configuration sizes and resolutions, suitable for single processors up to thousands. Beyond multi-core Opteron-based supercomputers, ARSC has examined WRF and other applications on additional emerging technologies. One such technology is the graphics processing unit, or GPU. The 9800-series nVidia GPU was evaluated with the cuBLAS software library. While in-socket GPUs might be forthcoming in the future, current

  1. Performance evaluation of bound diamond ring tools

    SciTech Connect

    Piscotty, M.A.; Taylor, J.S.; Blaedel, K.L.

    1995-07-14

    LLNL is collaborating with the Center for Optics Manufacturing (COM) and the American Precision Optics Manufacturers Association (APOMA) to optimize bound diamond ring tools for the spherical generation of high quality optical surfaces. An important element of this work is establishing an experimentally-verified link between tooling properties and workpiece quality indicators such as roughness, subsurface damage, and removal rate. In this paper, we report on a standardized methodology for assessing ring tool performance and its preliminary application to a set of commercially-available wheels. Our goals are to (1) assist optics manufacturers (users of the ring tools) in evaluating tools and in assessing their applicability for a given operation, and (2) provide performance feedback to wheel manufacturers to help optimize tooling for the optics industry. Our paper includes measurements of wheel performance for three 2-4 micron diamond bronze-bond wheels that were supplied by different manufacturers to nominally-identical specifications. Preliminary data suggest that the differences in performance levels among the wheels were small.

  2. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) the Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe, and discuss in silico, in vitro, in vivo, and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  3. The Evaluation and Quantitation of Dihydrogen Metabolism Using Deuterium Isotope in Rats

    PubMed Central

    Hyspler, Radomir; Ticha, Alena; Schierbeek, Henk; Galkin, Alexander; Zadak, Zdenek

    2015-01-01

    Purpose Despite the significant interest in molecular hydrogen as an antioxidant over the last eight years, its quantitative metabolic parameters in vivo are still lacking, as is an appropriate method for determining hydrogen effectiveness in the mammalian organism under various conditions. Basic Procedures Intraperitoneally-applied deuterium gas was used as a metabolic tracer and deuterium enrichment was determined in the body water pool. Also, in vitro experiments were performed using bovine heart submitochondrial particles to evaluate superoxide formation in Complex I of the respiratory chain. Main Findings A significant oxidation of about 10% of the applied dose was found under physiological conditions in rats, proving its antioxidant properties. Hypoxia or endotoxin application did not exert any effect, whilst pure oxygen inhalation reduced deuterium oxidation. During in vitro experiments, a significant reduction of superoxide formation by Complex I of the respiratory chain was found under the influence of hydrogen. The possible molecular mechanisms of the beneficial effects of hydrogen are discussed, with an emphasis on the role of iron sulphur clusters in reactive oxygen species generation and on iron species-dihydrogen interaction. Principal Conclusions According to our findings, hydrogen may be an efficient, non-toxic, highly bioavailable and low-cost antioxidant supplement for patients with pathological conditions involving ROS-induced oxidative stress. PMID:26103048

  4. Quantitative evaluation of radiation-induced changes in sperm morphology and chromatin distribution

    SciTech Connect

    Aubele, M.; Juetting, U.R.; Rodenacker, K.; Gais, P.; Burger, G.; Hacker-Klom, U. )

    1990-01-01

    Sperm head cytometry provides a useful assay for the detection of radiation-induced damage in mouse germ cells. Exposure of the gonads to radiation is known to lead to an increase of diploid and higher polyploid sperm and of sperm with head shape abnormalities. In the pilot studies reported here, quantitative analysis of the total DNA content, the morphology, and the chromatin distribution of mouse sperm was performed. The goal was to evaluate the discriminative power of features derived by high resolution image cytometry in distinguishing sperm of control and irradiated mice. Our results suggest that besides the induction of the above-mentioned variations in DNA content and shape of the sperm head, changes in the nonhomogeneous chromatin distribution within the sperm may also be used to quantify the radiation effect on sperm cells. Whereas the chromatin distribution features show larger variations for sperm 21 days after exposure (dpr), the shape parameters seem to be more important for discriminating sperm 35 dpr. This may be explained by differentiation processes, which take place in different stages during mouse spermatogenesis.

  5. Flexor and extensor muscle tone evaluated using the quantitative pendulum test in stroke and parkinsonian patients.

    PubMed

    Huang, Han-Wei; Ju, Ming-Shaung; Lin, Chou-Ching K

    2016-05-01

    The aim of this study was to evaluate the flexor and extensor muscle tone of the upper limbs in patients with spasticity or rigidity and to investigate the difference in hypertonia between spasticity and rigidity. The two experimental groups consisted of stroke patients and parkinsonian patients. The control group consisted of age and sex-matched normal subjects. Quantitative upper limb pendulum tests starting from both flexed and extended joint positions were conducted. System identification with a simple linear model was performed and model parameters were derived. The differences between the three groups and two starting positions were investigated by these model parameters and tested by two-way analysis of variance. In total, 57 subjects were recruited, including 22 controls, 14 stroke patients and 21 parkinsonian patients. While stiffness coefficient showed no difference among groups, the number of swings, relaxation index and damping coefficient showed changes suggesting significant hypertonia in the two patient groups. There was no difference between these two patient groups. The test starting from the extended position constantly manifested higher muscle tone in all three groups. In conclusion, the hypertonia of parkinsonian and stroke patients could not be differentiated by the modified pendulum test; the elbow extensors showed a higher muscle tone in both control and patient groups; and hypertonia of both parkinsonian and stroke patients is velocity dependent.
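    The quantitative pendulum test described above fits a simple linear model to the limb's swing; the sketch below (my own generic second-order model with invented parameters, not the authors' identified values) shows how a relaxation index and swing count could be derived from such a model:

```python
# Minimal sketch of the pendulum test: the lower arm is modeled as a linear
# second-order system  I*a'' + B*a' + K*(a - a_eq) = 0, and the relaxation
# index is the first-swing excursion divided by the drop from the start angle
# to the resting angle; higher damping/stiffness (hypertonia) lowers it and
# reduces the number of swings. Parameters are illustrative only.

def pendulum_test(I=0.06, B=0.03, K=1.2, a0=1.2, a_eq=0.0, dt=1e-3, t_end=5.0):
    a, v = a0, 0.0
    trace = [a]
    for _ in range(int(t_end / dt)):
        acc = -(B * v + K * (a - a_eq)) / I   # angular acceleration
        v += acc * dt                          # semi-implicit Euler step
        a += v * dt
        trace.append(a)
    a_min = min(trace)
    relaxation_index = (a0 - a_min) / (a0 - a_eq)
    # swings = number of times the angle crosses the resting position
    swings = sum(1 for x, y in zip(trace, trace[1:])
                 if (x - a_eq) * (y - a_eq) < 0)
    return relaxation_index, swings

ri, n = pendulum_test()
print(f"relaxation index ~ {ri:.2f}, swings = {n}")
```

    Raising B and K in this model (mimicking hypertonia) drives the relaxation index down toward 1 and cuts the swing count, which is the direction of change the study reports for the patient groups.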

  6. Hygienization by anaerobic digestion: comparison between evaluation by cultivation and quantitative real-time PCR.

    PubMed

    Lebuhn, M; Effenberger, M; Garcés, G; Gronauer, A; Wilderer, P A

    2005-01-01

    In order to assess hygienization by anaerobic digestion, evaluation by cultivation was compared with quantitative real-time PCR (qPCR), including optimized DNA extraction and quantification, for samples from a full-scale fermenter cascade (F1, mesophilic; F2, thermophilic; F3, mesophilic). The system was highly effective in inactivating (pathogenic) viable microorganisms, except for spore-formers. Conventionally performed cultivation underestimated viable organisms, particularly in F2 and F3, by a factor of at least 10, as shown by data from extended incubation times, probably owing to the rise of sublethally injured (active but not cultivable) cells. Incubation should hence be extended adequately in incubation-based hygiene monitoring of stressed samples, in order to minimize contamination risks. Although results from qPCR and cultivation agreed for the equilibrated compartments, considerably higher qPCR values were obtained for the fermenters. The difference probably corresponded to DNA copies from decayed cells that had not yet been degraded by the residual microbial activity. Extrapolation from qPCR determination to the quantity of viable organisms is hence not justified for samples that have been exposed to lethal stress.

  7. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because benzodiazepine concentrations in biological samples can vary with bleeding, postmortem changes, and redistribution, which may bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane:ethyl acetate, with subsequent detection by high-performance liquid chromatography coupled to a diode array detector. The method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear calibration curve for each drug was obtained within the range of 30–3000 ng/mL, with a correlation coefficient higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peaks due to interfering substances were observed in the samples. Conclusion: The present method is selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory. PMID:27635251
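
    The linearity and quantitation steps above follow the standard calibration-curve workflow: fit a least-squares line of detector response versus concentration, check the correlation coefficient, then invert the line to quantify an unknown. A minimal sketch (the peak areas below are invented for illustration and are not the study's data):

```python
import math

def linear_fit(x, y):
    """Ordinary least-squares line y = slope*x + intercept, plus Pearson r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / math.sqrt(sxx * syy)

# Hypothetical calibration standards spanning the reported 30-3000 ng/mL range.
conc = [30.0, 100.0, 300.0, 1000.0, 3000.0]    # ng/mL
area = [27.0, 59.0, 163.0, 508.0, 1512.0]      # arbitrary detector units

slope, intercept, r = linear_fit(conc, area)

# Quantify an unknown sample from its peak area.
unknown_area = 260.0
estimated_conc = (unknown_area - intercept) / slope  # ng/mL
```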

  8. Analytical and clinical performance of a new molecular assay for Epstein-Barr virus DNA quantitation.

    PubMed

    Hübner, Margit; Bozic, Michael; Konrad, Petra M; Grohs, Katharina; Santner, Brigitte I; Kessler, Harald H

    2015-02-01

    Quantitation of EBV DNA has been shown to be a useful tool to identify and monitor patients with immunosuppression and high risk for EBV-associated disease. In this study, the analytical and clinical performance of the new Realquality RS-EBV Kit (AB Analitica, Padova, Italy) was investigated. The clinical performance was compared to that of the EBV R-gene (bioMerieux, Varilhes, France) assay. When the accuracy of the new assay was tested, all results except of one were found to be within ±0.5log10 unit of the expected panel results. Determination of linearity showed a quasilinear curve, the between day imprecision ranged from 18% to 88% and the within run imprecision from 16% to 53%. When 96 clinical EDTA whole blood samples were tested, 77 concordant and 19 discordant results were obtained. When the results for the 69 samples quantifiable with both assays were compared, the new assay revealed a mean 0.31log10 unit higher measurement. The new assay proved to be suitable for the detection and quantitation of EBV DNA in EDTA whole blood in the routine diagnostic laboratory. The variation between quantitative results obtained by the assays used in this study reinforces the use of calibrators traceable to the existing international WHO standard making different assays better comparable.

  9. Style-independent document labeling: design and performance evaluation

    NASA Astrophysics Data System (ADS)

    Mao, Song; Kim, Jong Woo; Thoma, George R.

    2003-12-01

    The Medical Article Records System (MARS) has been developed at the U.S. National Library of Medicine (NLM) for automated entry of bibliographic information from medical journals into MEDLINE, the premier bibliographic citation database at NLM. Currently, a rule-based algorithm (called ZoneCzar) is used for labeling important bibliographic fields (title, author, affiliation, and abstract) on medical journal article page images. While rules have been created for medical journals with regular layout types, new rules must be manually created for any input journals with arbitrary or new layout types. It is therefore of interest to label journal articles independently of their layout styles. In this paper, we first describe a system (called ZoneMatch) for automated generation of crucial geometric and non-geometric features of important bibliographic fields based on string-matching and clustering techniques. The rule-based algorithm is then modified to use these features to perform style-independent labeling. We then describe a performance evaluation method for quantitatively evaluating our algorithm and characterizing its error distributions. Experimental results show that the labeling performance of the rule-based algorithm is significantly improved when the generated features are used.

  10. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated into related research in the future.

  11. Improvement and quantitative performance estimation of the back support muscle suit.

    PubMed

    Muramatsu, Y; Umehara, H; Kobayashi, H

    2013-01-01

    We have been developing a wearable muscle suit for direct and physical motion support. The use of the McKibben artificial muscle has opened the way to the introduction of "muscle suits": compact, lightweight, reliable, wearable "assist-bots" enabling manual workers to lift and carry weights. Since back pain is the most serious problem for manual workers, this paper presents improvements to the back-support muscle suit made during the feasibility study, together with a quantitative performance estimation. The improvements comprise the structure of the upper-body frame, the method of attaching the suit to the body, and the addition of axes. In the experiments, we investigated the quantitative performance and efficiency of the back-support muscle suit in vertical lifting of heavy weights by employing integral electromyography (IEMG). The results indicated that IEMG values were reduced by about 40% when the muscle suit was used.
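
    Integral EMG simply rectifies the muscle signal and integrates it over the task, so the reported ~40% reduction corresponds to a proportional drop in this integral. A toy illustration with synthetic signals (the waveform, sampling rate and scaling are assumptions, not the study's recordings):

```python
import math

def iemg(signal, dt):
    """Integral EMG: rectify the signal, then integrate over the task."""
    return sum(abs(s) for s in signal) * dt

dt = 0.001                                    # 1 kHz sampling (assumed)
t = [i * dt for i in range(2000)]             # a 2 s lifting bout
raw = [math.sin(2 * math.pi * 60 * ti) for ti in t]  # 60 Hz burst, no suit
assisted = [0.6 * s for s in raw]             # same pattern, 40% lower effort

reduction_pct = 100.0 * (1.0 - iemg(assisted, dt) / iemg(raw, dt))
```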

  12. Performance Assessment of Human and Cattle Associated Quantitative Real-time PCR Assays - slides

    EPA Science Inventory

    The presentation overview is (1) Single laboratory performance assessment of human- and cattle associated PCR assays and (2) A Field Study: Evaluation of two human fecal waste management practices in Ohio watershed.

  13. Quantitative evaluation of simulated human enamel caries kinetics using photothermal radiometry and modulated luminescence

    NASA Astrophysics Data System (ADS)

    Hellen, Adam; Mandelis, Andreas; Finer, Yoav; Amaechi, Bennett T.

    2011-03-01

    Photothermal radiometry and modulated luminescence (PTR-LUM) is a non-destructive methodology applied to the detection, monitoring and quantification of dental caries. The purpose of this study was to evaluate the efficacy of PTR-LUM in detecting incipient caries lesions and quantifying opto-thermophysical properties as a function of treatment time. Extracted human molars (n=15) were exposed to an acid demineralization gel (pH 4.5) for 10 or 40 days in order to simulate incipient caries lesions. PTR-LUM frequency scans (1 Hz - 1 kHz) were performed prior to and during demineralization. Transverse micro-radiography (TMR) analysis followed at treatment conclusion. A coupled diffuse-photon-density-wave and thermal-wave theoretical model was applied to PTR experimental amplitude and phase data across the frequency range of 4 Hz - 354 Hz to quantitatively evaluate changes in the thermal and optical properties of sound and demineralized enamel. Excellent fits with small residuals were observed between experimental and theoretical data, illustrating the robustness of the computational algorithm. Increased scattering coefficients and poorer thermophysical properties were characteristic of demineralized lesion bodies. Enhanced optical scattering coefficients of demineralized lesions resulted in poorer luminescence yield due to scattering of both incident and converted luminescent photons. Differences in the rate of lesion progression for the 10-day and 40-day samples point to a continuum of surface- and diffusion-controlled mechanisms of lesion formation. PTR-LUM sensitivity to changes in tooth mineralization, coupled with opto-thermophysical property extraction, illustrates the technique's potential for non-destructive quantification of enamel caries.

  14. Evaluation by quantitative image analysis of anticancer drug activity on multicellular spheroids grown in 3D matrices

    PubMed Central

    Gomes, Aurélie; Russo, Adrien; Vidal, Guillaume; Demange, Elise; Pannetier, Pauline; Souguir, Zied; Lagarde, Jean-Michel; Ducommun, Bernard; Lobjois, Valérie

    2016-01-01

    Pharmacological evaluation of anticancer drugs using 3D in vitro models provides invaluable information for predicting in vivo activity. Artificial matrices are currently available that scale up and increase the power of such 3D models. The aim of the present study was to propose an efficient and robust imaging and analysis pipeline to assess, with quantitative parameters, the efficacy of a given cytotoxic drug. HCT116 colorectal adenocarcinoma multicellular spheres were grown in a 3D physiological hyaluronic acid matrix. 3D microscopy was performed with structured illumination, whereas image processing and feature extraction were performed with custom analysis tools. This procedure makes it possible to automatically detect spheres in a large volume of matrix in 96-well plates. It was used to evaluate drug efficacy in HCT116 spheres treated with different concentrations of topotecan, a DNA topoisomerase inhibitor. Following automatic detection and quantification, changes in cluster size distribution were observed, with a topotecan concentration-dependent increase in small clusters reflecting drug cytotoxicity. Quantitative image analysis is thus an effective means to evaluate and quantify the cytotoxic and cytostatic activities of anticancer drugs on 3D multicellular models grown in a physiological matrix. PMID:28105152

  15. Application of quantitative stereology to the evaluation of enzyme-altered foci in rat liver.

    PubMed

    Campbell, H A; Pitot, H C; Potter, V R; Laishes, B A

    1982-02-01

    The mathematical science of quantitative stereology has established relationships for the quantitation of elements in three-dimensional space from observations on two-dimensional planes. This report describes the utilization and importance of such mathematical relationships for the quantitative analysis of focal hepatic lesions in terms relative to the volume of the liver. Three examples are used to demonstrate the utility of such calculations in the three-dimensional quantitation of hepatic focal lesions. The first is a computer-simulated experiment based on defined hypothetical situations. The simulations demonstrate the applicability of the computations described in this report to the evaluation of two-dimensional data from typical animal experiments. The other two examples are taken from actual experiments and involve the transplantation of hepatic cell populations into the liver of suitably prepared hosts and the quantitation of altered foci produced by initiation with diethylnitrosamine-partial hepatectomy followed by promotion with phenobarbital. The quantitation of altered foci by means of a two-dimensional analysis (simple enumeration of focal intersections per area of tissue section) is proportional to the quantitation of foci per volume of liver provided that the mean diameter of the foci for each treatment is sufficiently uniform, as exemplified in the text by the transplantation experiment. When such mean diameters are unequal, as in the diethylnitrosamine-phenobarbital experiment described herein, quantitation from three-dimensional analysis gives significantly different results compared with enumeration of focal intersections on two-dimensional areas. These studies clearly demonstrate that the frequency and size of foci intersections viewed on two-dimensional tissue sections do not necessarily reflect the number or size of foci in the three-dimensional tissue. Only by quantitating the number and size of the foci in relation to the three-dimensional structure of the liver can an accurate assessment be made.
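
    The core stereological relationship at work here is that, for randomly sectioned spherical foci of mean diameter D, the expected number of profiles per unit area of section is N_A = N_V · D. The sketch below (with invented numbers) shows why two treatments with identical two-dimensional counts but different focus diameters imply different numbers of foci per unit volume, which is exactly the pitfall the abstract warns about:

```python
def foci_per_volume(n_a, mean_diameter):
    """Invert E[N_A] = N_V * D for spheres of mean diameter D; units must
    match (e.g. foci/cm^2 of section and cm)."""
    return n_a / mean_diameter

# Hypothetical treatments with identical 2-D counts but different focus sizes.
n_a = 5.0                                # focal transections per cm^2
nv_small = foci_per_volume(n_a, 0.05)    # 0.5 mm foci -> 100 foci/cm^3
nv_large = foci_per_volume(n_a, 0.10)    # 1.0 mm foci ->  50 foci/cm^3
```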

  16. Manipulator Performance Evaluation Using Fitts' Taping Task

    SciTech Connect

    Draper, J.V.; Jared, B.C.; Noakes, M.W.

    1999-04-25

    Metaphorically, a teleoperator with master controllers projects the user's arms and hands into a remote area. Therefore, human users interact with teleoperators at a more fundamental level than they do with most human-machine systems. Instead of inputting decisions about how the system should function, teleoperator users input the movements they might make if they were truly in the remote area, and the remote machine must recreate their trajectories and impedance. This intense human-machine interaction requires displays and controls more carefully attuned to human motor capabilities than is necessary with most systems. It is important for teleoperated manipulators to be able to recreate human trajectories and impedance in real time. One method for assessing manipulator performance is to observe how well a system behaves while a human user completes human dexterity tasks with it. Fitts' tapping task has been used many times in the past for this purpose. This report describes such a performance assessment. The International Submarine Engineering (ISE) Autonomous/Teleoperated Operations Manipulator (ATOM) servomanipulator system was evaluated using a generic positioning accuracy task. The task is a simple one but has the merits of (1) producing a performance function estimate rather than a point estimate and (2) being widely used in the past for human and servomanipulator dexterity tests. Results of testing using this task may, therefore, allow comparison with other manipulators, and the task is generically representative of a broad class of tasks. Results of the testing indicate that the ATOM manipulator is capable of performing the task. Force reflection had a negative impact on task efficiency in these data. This was most likely caused by the high resistance to movement the master controller exhibited with the force reflection engaged. Measurements of exerted forces were not made, so it is not possible to say whether the force reflection helped participants.
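
    Fitts' tapping task scores each movement by an index of difficulty, ID = log2(2D/W) bits, where D is the distance between targets and W is the target width; movement time grows roughly linearly with ID, and throughput (ID divided by movement time) summarizes dexterity. A brief sketch of these standard quantities (the target geometries and times are illustrative, not the ATOM test data):

```python
import math

def index_of_difficulty(distance, width):
    """Fitts' index of difficulty in bits: ID = log2(2D / W)."""
    return math.log2(2.0 * distance / width)

def throughput(distance, width, movement_time):
    """Bits per second achieved for one tapping condition."""
    return index_of_difficulty(distance, width) / movement_time

id_easy = index_of_difficulty(8.0, 2.0)        # wide, close targets: 3 bits
id_hard = index_of_difficulty(32.0, 1.0)       # narrow, far targets: 6 bits
tp = throughput(8.0, 2.0, movement_time=0.75)  # hypothetical 0.75 s per tap
```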

  17. Qualitative and quantitative evaluation of Simon™, a new CE-based automated Western blot system as applied to vaccine development.

    PubMed

    Rustandi, Richard R; Loughney, John W; Hamm, Melissa; Hamm, Christopher; Lancaster, Catherine; Mach, Anna; Ha, Sha

    2012-09-01

    Many CE-based technologies such as imaged capillary IEF, CE-SDS, CZE, and MEKC are well established for analyzing proteins, viruses, and other biomolecules such as polysaccharides. For example, imaged capillary isoelectric focusing (charge-based protein separation) and CE-SDS (size-based protein separation) are standard replacement methods in the biopharmaceutical industry for the tedious and labor-intensive IEF and SDS-PAGE methods, respectively. Another important analytical tool for protein characterization is the Western blot, in which, after size-based separation in SDS-PAGE, the proteins are transferred to a membrane and blotted with specific monoclonal or polyclonal antibodies. Western blotting analysis is applied in many areas such as biomarker research, therapeutic target identification, and vaccine development. Currently, the procedure is very manual, laborious, and time-consuming. Here, we evaluate a new technology called Simple Western™ (or Simon™) for performing automated Western analysis. This new technology is based on CE-SDS, where the separated proteins are attached to the wall of the capillary by a proprietary photo-activated chemical crosslink. Subsequent blotting is done automatically by incubating and washing the capillary with primary antibodies and with secondary antibodies conjugated to horseradish peroxidase, with detection by chemiluminescence. Typically, Western blots are not quantitative, so we also evaluated the quantitative aspects of this new technology. We demonstrate that Simon™ can quantitate specific components in one of our vaccine candidates and that it provides good reproducibility and intermediate precision, with CV < 10%.

  18. Combinative Method Using Multi-components Quantitation and HPLC Fingerprint for Comprehensive Evaluation of Gentiana crassicaulis

    PubMed Central

    Song, Jiuhua; Chen, Fengzheng; Liu, Jiang; Zou, Yuanfeng; Luo, Yun; Yi, Xiaoyan; Meng, Jie; Chen, Xingfu

    2017-01-01

    Background: Gentiana crassicaulis (Cujingqinjiao) is an important traditional Chinese herb. Like those of other herbs, its chemical compounds vary greatly with environmental and genetic factors; as a result, quality differs even among samples from the same region, and quality evaluation is therefore necessary for its safe and effective use. In this study, a comprehensive method combining HPLC quantitative analysis and fingerprints was developed to evaluate the quality of Cujingqinjiao and to classify samples collected from Lijiang City of Yunnan Province. A total of 30 common peaks, including four identified peaks, were found and used for further characterization and quality control of Cujingqinjiao. Twenty-one batches of samples from Lijiang City of Yunnan Province were evaluated by similarity analysis (SA), hierarchical cluster analysis (HCA), principal component analysis (PCA) and factor analysis (FA) according to the characteristics of the common peaks. Results: The obtained data showed good stability and repeatability of the chromatographic fingerprint; similarity values were all above 0.90. This study demonstrated that a combination of chromatographic quantitative analysis and fingerprinting offers an efficient way to evaluate the quality consistency of Cujingqinjiao. Consistent results were obtained, showing that samples from the same origin could be successfully classified into two groups. Conclusion: This study revealed that the combined method is reliable, simple and sensitive for fingerprint analysis and, moreover, for quality control and pattern recognition of Cujingqinjiao.
    SUMMARY: HPLC quantitative analysis and fingerprints were developed to evaluate the quality of Gentiana crassicaulis. Similarity analysis, hierarchical cluster analysis, principal component analysis and factor analysis were employed to analyze the chromatographic dataset.
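
    The similarity-analysis step in fingerprint work typically correlates each batch's vector of common-peak areas against a reference fingerprint, with values above roughly 0.90 taken as consistent. A minimal sketch with invented peak areas (the real analysis used 30 common peaks, not 4):

```python
import math

def cosine_similarity(a, b):
    """Congruence of two chromatographic fingerprints (peak-area vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

reference = [10.0, 5.0, 8.0, 2.0]    # mean fingerprint (hypothetical)
batch_ok = [11.0, 4.5, 8.2, 2.1]     # consistent batch: same peak pattern
batch_odd = [2.0, 10.0, 1.0, 9.0]    # deviant batch: peak pattern inverted

sim_ok = cosine_similarity(reference, batch_ok)
sim_odd = cosine_similarity(reference, batch_odd)
```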

  19. Performance Evaluation Method of Chemical Mechanical Polishing Pad Conditioner Using Digital Image Correlation Processing

    NASA Astrophysics Data System (ADS)

    Uneda, Michio; Omote, Tatsunori; Ishikawa, Ken-ichi; Ichikawa, Koichiro; Doi, Toshiro; Kurokawa, Syuhei; Ohnishi, Osamu

    2012-05-01

    In chemical mechanical polishing (CMP), conditioning is generally used to regenerate the pad surface texture. Currently, the performance evaluation of conditioners depends on the user's experience, so it is important to develop a novel quantitative evaluation method for conditioner performance. In this paper, we propose a novel evaluation method for conditioner performance using digital image correlation (DIC) processing. The proposed method can measure the in-plane micro-deformation distribution of the pad surface texture caused by conditioning. It is found that the pad surface deforms by over 40 µm with conditioning and that the in-plane deformation value increases as the mesh size of the conditioner grains decreases.
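
    Digital image correlation tracks a small subset of the reference image and searches the deformed image for the shift that maximizes the normalized cross-correlation; the winning shift is the local in-plane displacement. A deliberately tiny integer-pixel sketch of this principle (real DIC adds sub-pixel interpolation and a two-sided search, both omitted here; the speckle pattern is synthetic):

```python
import math

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size patches."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    da = [x - ma for x in a]
    db = [x - mb for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den if den else -1.0

def patch(img, r0, c0, size):
    """Flatten a size x size window starting at (r0, c0)."""
    return [img[r][c] for r in range(r0, r0 + size)
            for c in range(c0, c0 + size)]

def find_shift(ref, deformed, r0, c0, size, search):
    """Integer-pixel displacement of the subset at (r0, c0)."""
    template = patch(ref, r0, c0, size)
    best = max((ncc(template, patch(deformed, r0 + dr, c0 + dc, size)), dr, dc)
               for dr in range(search + 1) for dc in range(search + 1))
    return best[1], best[2]

# Synthetic 12x12 speckle pattern, then the same pattern shifted by (2, 1).
ref = [[(3 * i * i + 5 * j + i * j) % 17 for j in range(12)] for i in range(12)]
deformed = [[ref[i - 2][j - 1] if i >= 2 and j >= 1 else 0 for j in range(12)]
            for i in range(12)]

shift = find_shift(ref, deformed, r0=2, c0=2, size=5, search=4)
```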

  20. Evaluation of a takeoff performance monitoring system

    NASA Technical Reports Server (NTRS)

    Middleton, David B.; Srivatsan, Raghavachari

    1987-01-01

    A takeoff performance monitoring system (TOPMS) has been developed to provide the pilot with graphic/numeric information pertinent to his decision to continue or abort a takeoff. The TOPMS instrument display consists primarily of a runway graphic overlaid with symbolic status, predictive, and advisory information including: (1) current position and airspeed, (2) predicted locations for reaching decision speed and rotation speed, (3) groundroll limit for reaching the rotation speed, (4) predicted stop point for an aborted takeoff from current conditions, (5) engine-failure flags, and (6) an overall situation advisory flag which recommends continuation or rejection of the takeoff. In this study, over 30 experienced multiengine pilots evaluated the TOPMS display on the Langley B-737 real-time research simulator. The display was judged to be easy to monitor and comprehend.

  1. Performance evaluation of mail-scanning cameras

    NASA Astrophysics Data System (ADS)

    Rajashekar, Umesh; Vu, Tony Tuan; Hooning, John E.; Bovik, Alan Conrad

    2010-04-01

    Letter-scanning cameras (LSCs) form the front-end imaging systems for virtually all mail-scanning systems currently used to automatically sort mail products. As with any vision-dependent technology, the quality of the images generated by the camera is fundamental to the overall performance of the system. We present novel techniques for objective evaluation of LSCs using comparative imaging, a technique that involves measuring the fidelity of target images produced by a camera with reference to an image of the same target captured at very high quality. Such a framework provides a unique opportunity to directly quantify the camera's ability to capture real-world targets, such as handwritten and printed text. Noncomparative techniques were also used to measure properties such as the camera's modulation transfer function, dynamic range, and signal-to-noise ratio. To simulate real-world imaging conditions, application-specific test samples were designed using actual mail product materials.

  2. Performance Evaluation of the SPT-140

    NASA Technical Reports Server (NTRS)

    Manzella, David; Sarmiento, Charles; Sankovic, John; Haag, Tom

    1997-01-01

    As part of an ongoing cooperative program with industry, an engineering-model SPT-140 Hall thruster, which may be suitable for orbit insertion and station-keeping of geosynchronous communication satellites, was evaluated with respect to thrust and radiated electromagnetic interference at the NASA Lewis Research Center. Performance measurements were made using a laboratory-model propellant feed system and commercial power supplies. The engine was operated in a space simulation chamber capable of providing background pressures of 4 x 10(exp -6) Torr or less during thruster operation. Thrust was measured at input powers ranging from 1.5 to 5 kilowatts with two different output filter configurations. The broadband electromagnetic emission spectra generated by the engine were also measured over a frequency range of 0.01 to 18,000 MHz. These results are compared to the noise threshold of the measurement system and to MIL-STD-461C where appropriate.

  3. Image analysis techniques. The problem of the quantitative evaluation of the chromatin ultrastructure.

    PubMed

    Maraldi, N M; Marinelli, F; Squarzoni, S; Santi, S; Barbieri, M

    1991-02-01

    The application of image analysis methods to conventional thin sections for electron microscopy to analyze the chromatin arrangement is quite limited. We developed a method that utilizes freeze-fractured samples; the results indicate that the method is suitable for identifying the changes in chromatin arrangement that occur in physiological, experimental and pathological conditions. The modern era of image analysis began in 1964, when pictures of the moon transmitted by Ranger 7 were processed by a computer. This processing improved the original picture by enhancing and restoring the image, which was affected by various types of distortion. Such performance was made possible by third-generation computers, which have the speed and storage capabilities required for practical use of image-processing algorithms. Each image can be converted into a two-dimensional light intensity function f(x, y), where x and y are the spatial coordinates and the value of f is proportional to the gray level of the image at that point. The digital image is therefore a matrix whose elements are the pixels (picture elements). A typical digital image can be obtained with a quality comparable to monochrome TV, with a 512×512 pixel array and 64 gray levels. The magnetic disks of commercial minicomputers are thus capable of storing some tens of images, which can be elaborated by the image processor after conversion of the signal into digital form. In biological images obtained by light microscopy, digitization converts the chromatic differences into gray-level intensities, thus allowing the contours of the cytoplasm, the nucleus and the nucleoli to be defined. The use of a quantitative staining method for DNA, the Feulgen reaction, permits evaluation of the ratio between condensed chromatin (stained) and euchromatin (unstained). The digitized images obtained by transmission electron microscopy are rich in details at high resolution. However, the application of image analysis techniques to
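
    In its simplest form, the Feulgen-based measurement described above treats the digitized image as a gray-level matrix f(x, y) and thresholds it to obtain the stained (condensed chromatin) fraction. A toy sketch with an invented 4×4 image (real images are 512×512 with 64 gray levels, and the threshold would be chosen from the staining calibration):

```python
def condensed_fraction(image, threshold):
    """Fraction of pixels at or above the staining threshold, i.e. the
    condensed-chromatin / total-area ratio in this toy model."""
    pixels = [p for row in image for p in row]
    return sum(1 for p in pixels if p >= threshold) / len(pixels)

# Hypothetical 4x4 gray-level image (0 = unstained, 63 = darkest stain).
image = [
    [ 5, 12, 48, 60],
    [ 7, 50, 55,  9],
    [ 3,  8, 52,  6],
    [61,  4, 10,  2],
]

fraction = condensed_fraction(image, threshold=40)  # 6 of 16 pixels stained
```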

  4. Quantitative evaluation on the characteristics of activated sludge granules and flocs using a fuzzy entropy-based approach

    NASA Astrophysics Data System (ADS)

    Fang, Fang; Qiao, Li-Li; Ni, Bing-Jie; Cao, Jia-Shun; Yu, Han-Qing

    2017-02-01

    Activated sludge granules and flocs have their own inherent advantages and disadvantages for wastewater treatment owing to their different characteristics, but quantitative information for their evaluation is still lacking. This work provides a quantitative and comparative evaluation of the characteristics and pollutant removal capacity of granules and flocs using a new methodology that integrates the fuzzy analytic hierarchy process, an accelerating genetic algorithm and the entropy weight method. The evaluation results show a higher overall score for granules, indicating that granules have more favorable characteristics than flocs. Although large granules might suffer from greater mass-transfer limitation and are prone to operating instability, they also enable a higher level of biomass retention, greater settling velocity and a lower sludge volume index compared to flocs. Thus, optimized control of granule size is essential for achieving good pollutant removal performance while sustaining long-term stable operation of granule-based reactors. This new integrated approach is effective in quantifying and differentiating the characteristics of activated sludge granules and flocs. The evaluation results also provide useful information for the application of activated sludge granules in full-scale wastewater treatment plants.
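
    The entropy weight method used in this evaluation assigns each criterion a weight according to how much its values actually vary across the alternatives: a criterion that scores both sludge types identically carries no discriminating information and receives (near-)zero weight. A two-alternative sketch (granules vs. flocs) with invented normalized scores, not the study's decision matrix:

```python
import math

def entropy_weights(matrix):
    """Entropy weights for a decision matrix (rows = alternatives,
    columns = criteria, entries = positive normalized scores)."""
    n, m = len(matrix), len(matrix[0])
    k = 1.0 / math.log(n)
    col_sums = [sum(row[j] for row in matrix) for j in range(m)]
    e = []
    for j in range(m):
        ej = 0.0
        for row in matrix:
            p = row[j] / col_sums[j]
            if p > 0.0:
                ej -= p * math.log(p)       # Shannon entropy of column j
        e.append(k * ej)
    total = sum(1.0 - ej for ej in e)
    return [(1.0 - ej) / total for ej in e]  # more dispersion -> more weight

# Rows: granules, flocs. Columns: settling velocity, biomass activity,
# biomass retention (all hypothetical benefit-normalized scores).
scores = [[0.9, 0.5, 0.8],
          [0.1, 0.5, 0.2]]
weights = entropy_weights(scores)
```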

  5. Quantitative evaluation on the characteristics of activated sludge granules and flocs using a fuzzy entropy-based approach

    PubMed Central

    Fang, Fang; Qiao, Li-Li; Ni, Bing-Jie; Cao, Jia-Shun; Yu, Han-Qing

    2017-01-01

    Activated sludge granules and flocs have their own inherent advantages and disadvantages for wastewater treatment owing to their different characteristics, but quantitative information for their evaluation is still lacking. This work provides a quantitative and comparative evaluation of the characteristics and pollutant removal capacity of granules and flocs using a new methodology that integrates the fuzzy analytic hierarchy process, an accelerating genetic algorithm and the entropy weight method. The evaluation results show a higher overall score for granules, indicating that granules have more favorable characteristics than flocs. Although large granules might suffer from greater mass-transfer limitation and are prone to operating instability, they also enable a higher level of biomass retention, greater settling velocity and a lower sludge volume index compared to flocs. Thus, optimized control of granule size is essential for achieving good pollutant removal performance while sustaining long-term stable operation of granule-based reactors. This new integrated approach is effective in quantifying and differentiating the characteristics of activated sludge granules and flocs. The evaluation results also provide useful information for the application of activated sludge granules in full-scale wastewater treatment plants. PMID:28211540

  6. Performance Evaluations of Ceramic Wafer Seals

    NASA Technical Reports Server (NTRS)

    Dunlap, Patrick H., Jr.; DeMange, Jeffrey J.; Steinetz, Bruce M.

    2006-01-01

    Future hypersonic vehicles will require high temperature, dynamic seals in advanced ramjet/scramjet engines and on the vehicle airframe to seal the perimeters of movable panels, flaps, and doors. Seal temperatures in these locations can exceed 2000 F, especially when the seals are in contact with hot ceramic matrix composite sealing surfaces. NASA Glenn Research Center is developing advanced ceramic wafer seals to meet the needs of these applications. High temperature scrub tests performed between silicon nitride wafers and carbon-silicon carbide rub surfaces revealed high friction forces and evidence of material transfer from the rub surfaces to the wafer seals. Sticking between adjacent wafers was also observed after testing. Several design changes to the wafer seals were evaluated as possible solutions to these concerns. Wafers with recessed sides were evaluated as a potential means of reducing friction between adjacent wafers. Alternative wafer materials are also being considered as a means of reducing friction between the seals and their sealing surfaces, and because the baseline silicon nitride wafer material (AS800) is no longer commercially available.

  7. Rhythmic oscillations in quantitative EEG measured during a continuous performance task.

    PubMed

    Arruda, James E; Zhang, Hongmei; Amoss, R Toby; Coburn, Kerry L; Aue, William R

    2009-03-01

    The objective of the present investigation was to determine whether cyclic variations in human performance recorded during a 30 min continuous performance task would parallel cyclic variations in right-hemisphere beta-wave activity. A fast Fourier transform was applied to the quantitative electroencephalogram (qEEG) and the performance record of each participant (N = 62), producing an individual periodogram for each outcome measure. An average periodogram was then produced for both qEEG and performance by averaging, across the 62 original periodograms, the amplitudes associated with each periodicity. Periodicities ranging from 1.00 to 2.00 min and from 4.70 to 5.70 min with amplitudes greater than would be expected by chance were retained (Smith et al. 2003). The results of the present investigation validate the existence of cyclic variations in human performance identified previously (Smith et al. 2003) and extend those findings by implicating right-hemisphere mediated arousal in the process (Arruda et al. 1996, 1999, 2007). Significant cyclic variations in left-hemisphere beta-wave activity were not observed. Taken together, these findings support a model of sustained attention in which cyclic changes in human performance result from cyclic changes in right-hemisphere arousal.
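
    The group-level analysis described above, individual FFT periodograms averaged bin-by-bin across participants, can be sketched as follows. The time series here are synthetic; the sampling interval, noise level, and the embedded 1.5 min cycle are invented for illustration.

```python
import numpy as np

def periodogram(x):
    """Amplitude periodogram of a demeaned time series via the FFT."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.abs(np.fft.rfft(x)) / len(x)

def average_periodogram(records):
    """Average the individual periodograms across participants,
    as in the study's group-level analysis."""
    return np.vstack([periodogram(r) for r in records]).mean(axis=0)

# Toy example: 62 simulated participants sharing a 1.5 min cycle
rng = np.random.default_rng(0)
t = np.arange(0, 30, 0.1)            # 30 min sampled every 6 s
records = [np.sin(2 * np.pi * t / 1.5) + rng.normal(0, 1, t.size)
           for _ in range(62)]
avg = average_periodogram(records)
freqs = np.fft.rfftfreq(t.size, d=0.1)   # cycles per minute
peak_period = 1.0 / freqs[np.argmax(avg[1:]) + 1]  # skip the DC bin
print(round(peak_period, 2))  # 1.5 (min)
```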

  8. Performance Evaluation Modeling of Network Sensors

    NASA Technical Reports Server (NTRS)

    Clare, Loren P.; Jennings, Esther H.; Gao, Jay L.

    2003-01-01

    Substantial benefits are promised by operating many spatially separated sensors collectively. Such systems are envisioned to consist of sensor nodes connected by a communications network. A simulation tool is being developed to evaluate the performance of networked sensor systems, incorporating such metrics as target detection probabilities, false alarm rates, and classification confusion probabilities. The tool will be used to determine the configuration impacts of spatial laydown, the mixture of different sensor types (acoustic, seismic, imaging, magnetic, RF, etc.), and fusion architecture. The QualNet discrete-event simulation environment serves as the underlying basis for model development and execution; this platform is recognized for its capabilities in efficiently simulating networking among mobile entities that communicate via wireless media. We are extending QualNet's communications modeling constructs to capture the sensing aspects of multi-target sensing (analogous to multiple-access communications), unimodal multi-sensing (broadcast), and multi-modal sensing (multiple channels and correlated transmissions). Methods are also being developed for modeling the sensor signal sources (transmitters), signal propagation through the media, and sensors (receivers) in a manner consistent with the discrete-event paradigm needed for performance determination of sensor network systems. This work is supported under the Microsensors Technical Area of the Army Research Laboratory (ARL) Advanced Sensors Collaborative Technology Alliance.

  9. Performance Evaluation of Photovoltaic Solar Air Conditioning

    NASA Astrophysics Data System (ADS)

    Snegirjovs, A.; Shipkovs, P.; Lebedeva, K.; Kashkarova, G.; Migla, L.; Gantenbein, P.; Omlin, L.

    2016-12-01

    Information on electrically driven solar air conditioning (SAC) is rather scanty, and the available technical data mostly concern large-scale photovoltaic solar air conditioning (PV-SAC) systems. Reliable information about energy output has emerged only in recent years; it is still not easily accessible, and some of its sources are closed. Despite this, solar energy researchers, observers and designers devote special attention to this type of SAC system. In this study, a performance evaluation is carried out for PV-SAC technology using low-power systems (up to 15 kWp of cooling power on average). Such a system contains a PV-driven electric compression chiller with sensible thermal storage for both cold and heat, and a rejected-energy unit used for preheating domestic hot water (DHW). Outside the cooling season, the system can be partly employed in reverse mode for DHW production, with the ambient air serving as the heat source. In addition, free cooling is integrated into the PV-SAC concept.

  10. Evaluation of quantitative accuracy in CZT-based pre-clinical SPECT for various isotopes

    NASA Astrophysics Data System (ADS)

    Park, S.-J.; Yu, A. R.; Kim, Y.-s.; Kang, W.-S.; Jin, S. S.; Kim, J.-S.; Son, T. J.; Kim, H.-J.

    2015-05-01

    In vivo pre-clinical single-photon emission computed tomography (SPECT) is a valuable tool for functional small animal imaging, but several physical factors, such as scatter radiation, limit the quantitative accuracy of conventional scintillation crystal-based SPECT. Semiconductor detectors such as CZT overcome these deficiencies through superior energy resolution. To our knowledge, little scientific information exists regarding the accuracy of quantitative analysis in CZT-based pre-clinical SPECT systems for different isotopes. The aim of this study was to assess the quantitative accuracy of CZT-based pre-clinical SPECT for four isotopes: 201Tl, 99mTc, 123I, and 111In. The quantitative accuracy of the CZT-based Triumph X-SPECT (Gamma-Medica Ideas, Northridge, CA, U.S.A.) was compared with that of a conventional SPECT using GATE simulation. Quantitative errors due to the attenuation and scatter effects were evaluated for all four isotopes with energy windows of 5%, 10%, and 20%. A spherical source containing the isotope was placed at the center of the air-or-water-filled mouse-sized cylinder phantom. The CZT-based pre-clinical SPECT was more accurate than the conventional SPECT. For example, in the conventional SPECT with an energy window of 10%, scatter effects degraded quantitative accuracy by up to 11.52%, 5.10%, 2.88%, and 1.84% for 201Tl, 99mTc, 123I, and 111In, respectively. However, with the CZT-based pre-clinical SPECT, the degradations were only 9.67%, 5.45%, 2.36%, and 1.24% for 201Tl, 99mTc, 123I, and 111In, respectively. As the energy window was increased, the quantitative errors increased in both SPECT systems. Additionally, the isotopes with lower energy of photon emissions had greater quantitative error. Our results demonstrated that the CZT-based pre-clinical SPECT had lower overall quantitative errors due to reduced scatter and high detection efficiency. 
Furthermore, the results of this systematic assessment quantifying the accuracy of these SPECT

  11. Quantitative evaluation of desertification extent based on geographic unit by remote sensing image

    NASA Astrophysics Data System (ADS)

    Wang, Zhoulong; Wang, Dapeng; Zhang, Chunlai; Zhang, Anding

    2007-06-01

    The quantitative evaluation of desertification extent from remotely sensed imagery has been an active topic in remote sensing applications research. The evaluation process should follow principles such as dominance and integration. Traditional methods evaluate desertification extent at the scale of discrete pixels, which fails to take into account the influence of adjacent pixels, produces noise in the evaluation result images, and leads to one-sided results. If filters are applied to reduce this noise, the evaluation results deviate from the true situation. Building on previous research and geographic principles, this paper discusses a method for assessing desertification extent at the scale of geographic units, where each unit is delineated from a vegetation coverage index and spatial information. Test results show that this method provides a more accurate assessment of the ground situation and avoids the limitations of traditional methods.
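
    The unit-scale idea, scoring whole geographic units rather than isolated pixels, can be sketched as follows. The grids and grade values are invented; the actual method delineates units from a vegetation coverage index and spatial information rather than taking them as given.

```python
import numpy as np

def unit_level_grade(pixel_grade, unit_id):
    """Aggregate per-pixel desertification grades to geographic units.

    Every pixel inside a unit receives the unit's mean grade, which
    suppresses pixel-level 'salt-and-pepper' noise without applying a
    post-hoc smoothing filter to the classified image.
    """
    pixel_grade = np.asarray(pixel_grade, dtype=float)
    unit_id = np.asarray(unit_id)
    out = np.empty_like(pixel_grade)
    for u in np.unique(unit_id):
        mask = unit_id == u
        out[mask] = pixel_grade[mask].mean()
    return out

# 2x4 toy scene: two units, one noisy pixel in each
grades = np.array([[1, 1, 3, 3],
                   [1, 2, 3, 2]], dtype=float)
units  = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1]])
print(unit_level_grade(grades, units))  # unit 0 -> 1.25, unit 1 -> 2.75
```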

  12. [Drifts and pernicious effects of the quantitative evaluation of research: the misuse of bibliometrics].

    PubMed

    Gingras, Yves

    2015-06-01

    The quantitative evaluation of scientific research relies increasingly on bibliometric indicators of publications and citations. We present the issues raised by simplistic use of these methods and recall the dangers of using poorly built indicators and technically defective rankings that do not measure the dimensions they are supposed to measure, for example the quality of publications, laboratories or universities. We show that francophone journals are particularly liable to suffer from the misuse of overly simplistic bibliometric rankings of scientific journals.

  13. Evaluating the Performance of Calculus Classes Using Operational Research Tools.

    ERIC Educational Resources Information Center

    Soares de Mello, Joao Carlos C. B.; Lins, Marcos P. E.; Soares de Mello, Maria Helena C.; Gomes, Eliane G.

    2002-01-01

    Compares the efficiency of calculus classes and evaluates two kinds of classes: traditional classes and those that use computational methods in teaching. Applies quantitative evaluation methods using two operational research tools: multicriteria decision aid methods (mainly the MACBETH approach) and data envelopment analysis. (Author/YDS)

  14. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  15. Murine model of disseminated fusariosis: evaluation of the fungal burden by traditional CFU and quantitative PCR.

    PubMed

    González, Gloria M; Márquez, Jazmín; Treviño-Rangel, Rogelio de J; Palma-Nicolás, José P; Garza-González, Elvira; Ceceñas, Luis A; Gerardo González, J

    2013-10-01

    Systemic disease is the most severe clinical form of fusariosis, and its treatment is challenging owing to the refractory response to antifungals. Treatment of murine Fusarium solani infection has been described in models that use CFU quantitation in organs as a parameter of therapeutic efficacy. However, CFU counts do not precisely reflect the amount of cells for filamentous fungi such as F. solani. In this study, we developed a murine model of disseminated fusariosis and compared the fungal burden as measured by two methods: CFU counting and quantitative PCR. ICR and BALB/c mice received an intravenous injection of 1 × 10(7) conidia of F. solani per mouse. On days 2, 5, 7, and 9, mice of each strain were killed, and the spleen and kidneys of each animal were removed and evaluated by qPCR and CFU determinations. Results from the CFU assay indicated that the spleen and kidneys had almost the same fungal burden in both BALB/c and ICR mice throughout the evaluation. In the qPCR assay, the spleen and kidneys of each mouse strain showed increasing fungal burden at each determination throughout the experiment. The fungal load determined by qPCR was significantly greater than that determined from CFU measurements of tissue. qPCR could therefore be considered a tool for quantitative evaluation of fungal burden in experimental disseminated F. solani infection.
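
    qPCR-based burden estimates of the kind compared above rest on a standard curve relating quantification cycle (Cq) to template copy number. A minimal sketch follows; the dilution series, Cq values, and the sample Cq are invented, and the study does not report its curve parameters.

```python
import numpy as np

def fit_standard_curve(log10_copies, cq):
    """Fit the qPCR standard curve Cq = slope*log10(copies) + intercept."""
    slope, intercept = np.polyfit(log10_copies, cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0  # amplification efficiency
    return slope, intercept, efficiency

def quantify(cq, slope, intercept):
    """Convert a sample Cq back to copy number via the standard curve."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical dilution series: 10^3..10^7 copies (ideal slope ~ -3.32)
logs = np.array([3, 4, 5, 6, 7], dtype=float)
cqs = np.array([33.4, 30.1, 26.7, 23.4, 20.1])
slope, intercept, eff = fit_standard_curve(logs, cqs)
print(round(slope, 2), round(eff, 2))
burden = quantify(25.0, slope, intercept)   # a hypothetical tissue sample
print(f"{burden:.3g} copies")
```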

  16. Comparison of the quantitative analysis performance between pulsed voltage atom probe and pulsed laser atom probe.

    PubMed

    Takahashi, J; Kawakami, K; Raabe, D

    2017-01-31

    The difference in quantitative analysis performance between the voltage mode and laser mode of a local electrode atom probe (LEAP3000X HR) was investigated using an Fe-Cu binary model alloy. Solute copper atoms in ferritic iron preferentially field evaporate because their evaporation field is significantly lower than that of the iron matrix, so the apparent concentration of solute copper tends to be lower than the actual concentration. In voltage mode, however, the apparent concentration was higher than the actual concentration at 40 K or below due to detection loss of matrix iron, and the concentration decreased with increasing specimen temperature due to preferential evaporation of solute copper. In laser mode, by contrast, the apparent concentration never exceeded the actual concentration, even at lower temperatures (20 K), and this mode showed better quantitative performance over a wide range of specimen temperatures. These results indicate that the pulsed laser atom probe prevents both detection loss and preferential evaporation under a wide range of measurement conditions.

  17. 40 CFR Table 6 to Subpart Wwww of... - Basic Requirements for Performance Tests, Performance Evaluations, and Design Evaluations for New...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Tests, Performance Evaluations, and Design Evaluations for New and Existing Sources Using Add-On Control... Performance Tests, Performance Evaluations, and Design Evaluations for New and Existing Sources Using Add-On... emissions to an add-on control device that is a PTE Meet the requirements for a PTE EPA method 204...

  18. 40 CFR Table 6 to Subpart Wwww of... - Basic Requirements for Performance Tests, Performance Evaluations, and Design Evaluations for New...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Tests, Performance Evaluations, and Design Evaluations for New and Existing Sources Using Add-On Control... Performance Tests, Performance Evaluations, and Design Evaluations for New and Existing Sources Using Add-On... emissions to an add-on control device that is a PTE Meet the requirements for a PTE EPA method 204...

  19. 40 CFR Table 6 to Subpart Wwww of... - Basic Requirements for Performance Tests, Performance Evaluations, and Design Evaluations for New...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Tests, Performance Evaluations, and Design Evaluations for New and Existing Sources Using Add-On Control... Performance Tests, Performance Evaluations, and Design Evaluations for New and Existing Sources Using Add-On... emissions to an add-on control device that is a PTE Meet the requirements for a PTE EPA method 204...

  20. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to identify quantitative standards for assessing upset recovery performance. This review covers current recovery procedures for both military and commercial aviation and the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, recovery time, and whether the input was correct or incorrect. Other metrics include the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum airspeed, maximum bank angle and maximum g loading are reviewed.

  1. Evaluation of Iron Content in Human Cerebral Cavernous Malformation using Quantitative Susceptibility Mapping

    PubMed Central

    Tan, Huan; Liu, Tian; Wu, Ying; Thacker, Jon; Shenkar, Robert; Mikati, Abdul Ghani; Shi, Changbin; Dykstra, Conner; Wang, Yi; Prasad, Pottumarthi V.; Edelman, Robert R.; Awad, Issam A.

    2014-01-01

    Objectives To investigate and validate quantitative susceptibility mapping (QSM) for lesional iron quantification in cerebral cavernous malformations (CCM). Materials and Methods Magnetic resonance imaging (MRI) studies were performed in phantoms and 16 patients on a 3T scanner. QSM, susceptibility-weighted imaging (SWI), and R2* maps were reconstructed from in vivo data acquired with a three-dimensional, multi-echo, T2*-weighted gradient echo sequence. Magnetic susceptibility measurements were correlated with SWI and R2* results. In addition, iron concentrations from surgically excised CCM lesion specimens were determined using inductively coupled plasma mass spectrometry and correlated with QSM measurements. Results The QSM images demonstrated excellent image quality for depicting CCM lesions in both sporadic and familial cases. Susceptibility measurements revealed a positive linear correlation with R2* values (R2 = 0.99 for total, R2 = 0.69 for mean; p < 0.01). QSM values of known iron-rich brain regions matched closely with previous studies and showed good interobserver consistency. A strong correlation was found between QSM and the iron concentration of phantoms (0.925, p < 0.01), as well as between QSM and mass spectrometry estimates of iron deposition (0.999 for total iron, 0.86 for iron concentration; p < 0.01) in 18 fragments of 4 excised human CCM lesion specimens. Conclusions The ability of QSM to evaluate iron deposition in CCM lesions was illustrated via phantom, in vivo and ex vivo validation studies. QSM may be a potential biomarker for monitoring CCM disease activity and response to treatments. PMID:24619210

  2. Performance Evaluation and the Internet 2 Performance Initiative.

    ERIC Educational Resources Information Center

    Simco, Greg

    2001-01-01

    Explains the Internet 2 collaborative end-to-end performance initiative that focuses on performance measurement, analysis, and improvements that lead to a standard set of network capabilities and limitations. Discusses results of this performance initiative, including providing direction for current and future network development. (Author/LRW)

  3. DRACS thermal performance evaluation for FHR

    SciTech Connect

    Lv, Q.; Lin, H. C.; Kim, I. H.; Sun, X.; Christensen, R. N.; Blue, T. E.; Yoder, G. L.; Wilson, D. F.; Sabharwall, P.

    2015-03-01

    Direct Reactor Auxiliary Cooling System (DRACS) is a passive decay heat removal system proposed for the Fluoride-salt-cooled High-temperature Reactor (FHR) that combines coated particle fuel and a graphite moderator with a liquid fluoride salt as the coolant. The DRACS features three coupled natural circulation/convection loops, relying completely on buoyancy as the driving force. These loops are coupled through two heat exchangers, namely, the DRACS Heat Exchanger and the Natural Draft Heat Exchanger. In addition, a fluidic diode is employed to minimize the parasitic flow into the DRACS primary loop, and correspondingly the heat loss to the DRACS, during normal operation of the reactor, and to keep the DRACS ready for activation, if needed, during accidents. To help with the design and thermal performance evaluation of the DRACS, a computer code has been developed in MATLAB. The code is based on a one-dimensional formulation and solves the energy balance and integral momentum equations. The DRACS system is discretized in the axial direction, and a bulk mean temperature is assumed for each mesh cell. The temperatures of all the cells, as well as the mass flow rates in the DRACS loops, are predicted by solving the governing equations obtained by integrating the energy conservation equation over each cell and integrating the momentum conservation equation over each of the DRACS loops. In addition, an intermediate heat transfer loop equipped with a pump has been modeled in the code, enabling study of the flow reversal phenomenon in the DRACS primary loop associated with a pump trip. Experimental data from a High-Temperature DRACS Test Facility (HTDF) are not yet available to benchmark the code, so a preliminary code validation is performed using the most closely relevant natural circulation experimental data available in the literature. The code is subsequently applied to the HTDF that is under
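
    The solution principle described above, a cell-wise energy balance coupled to an integral momentum balance over each loop, can be illustrated with a single-loop, single-cell toy version. All property values, the lumped friction resistance, and the geometry below are invented; the actual code discretizes three coupled loops in the axial direction.

```python
import math

# Minimal single-loop natural circulation sketch (all parameters hypothetical).
# The loop removes decay heat Q; the flow settles where the buoyancy head
# from the hot-to-cold density difference balances the friction loss.

Q = 200e3        # decay heat removed, W
cp = 2386.0      # salt specific heat, J/(kg K)
beta = 2.0e-4    # thermal expansion coefficient, 1/K
rho = 1940.0     # reference density, kg/m^3
g = 9.81         # gravitational acceleration, m/s^2
H = 5.0          # thermal-centre elevation difference, m
R = 1.0e3        # lumped loop resistance: dp_friction = R * m_dot**2

m_dot = 1.0                                 # initial guess, kg/s
for _ in range(200):
    dT = Q / (m_dot * cp)                   # energy balance over heated section
    dp_buoy = rho * beta * g * H * dT       # integral momentum: buoyancy head
    m_new = math.sqrt(dp_buoy / R)          # balance against friction loss
    m_dot += 0.5 * (m_new - m_dot)          # under-relaxed fixed-point update
print(round(m_dot, 3), round(Q / (m_dot * cp), 1))  # ~1.168 kg/s, ~71.7 K rise
```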

  4. High-Performance Monopropellants and Catalysts Evaluated

    NASA Technical Reports Server (NTRS)

    Reed, Brian D.

    2004-01-01

    The NASA Glenn Research Center is sponsoring efforts to develop advanced monopropellant technology. The focus has been on monopropellant formulations composed of an aqueous solution of hydroxylammonium nitrate (HAN) and a fuel component. HAN-based monopropellants do not have a toxic vapor and do not need the extraordinary procedures for storage, handling, and disposal required of hydrazine (N2H4). Generically, HAN-based monopropellants are denser and have lower freezing points than N2H4. The performance of HAN-based monopropellants depends on the selection of fuel, the HAN-to-fuel ratio, and the amount of water in the formulation. HAN-based monopropellants are not seen as a replacement for N2H4 per se, but rather as a propulsion option in their own right. For example, HAN-based monopropellants would prove beneficial to the orbit insertion of small, power-limited satellites because of this propellant's high performance (reduced system mass), high density (reduced system volume), and low freezing point (elimination of tank and line heaters). Under a Glenn-contracted effort, Aerojet Redmond Rocket Center conducted testing to provide the foundation for the development of monopropellant thrusters with an I(sub sp) goal of 250 sec. A modular, workhorse reactor (representative of a 1-lbf thruster) was used to evaluate HAN formulations with catalyst materials. Stoichiometric, oxygen-rich, and fuel-rich formulations of HAN-methanol and HAN-tris(aminoethyl)amine trinitrate were tested to investigate the effects of stoichiometry on combustion behavior. Aerojet found that fuel-rich formulations degrade the catalyst and reactor faster than oxygen-rich and stoichiometric formulations do. A HAN-methanol formulation with a theoretical Isp of 269 sec (designated HAN269MEO) was selected as the baseline. With a combustion efficiency of at least 93 percent demonstrated for HAN-based monopropellants, HAN269MEO will meet the I(sub sp) 250 sec goal.

  5. Evaluating performance of high efficiency mist eliminators

    SciTech Connect

    Waggoner, Charles A.; Parsons, Michael S.; Giffin, Paxton K.

    2013-07-01

    Processing liquid wastes frequently generates off-gas streams with high humidity and liquid aerosols. Droplet-laden air streams can be produced by tank mixing or sparging and by processes such as reforming or evaporative volume reduction. Unfortunately, these wet air streams represent a genuine threat to HEPA filters. High efficiency mist eliminators (HEMEs) are one option for removal of liquid aerosols with high dissolved or suspended solids content. HEMEs have been used extensively in industrial applications, but they have not seen widespread use in the nuclear industry. Filtering efficiency data and loading curves are not readily available for these units, and the data that exist are not easily translated to operational parameters in liquid waste treatment plants. A specialized test stand has been developed to evaluate the performance of HEME elements under the use conditions of a US DOE facility. HEME elements were tested at three volumetric flow rates using aerosols produced from an iron-rich waste surrogate. The challenge aerosol included submicron particles produced from Laskin nozzles and supermicron particles produced from a hollow cone spray nozzle. Test conditions included ambient temperature and relative humidities greater than 95%. Data collected during testing of HEME elements from three different manufacturers included volumetric flow rate, differential temperature across the filter housing, downstream relative humidity, and differential pressure (dP) across the filter element. Filter challenge was discontinued at three intermediate dPs to allow determination of filter efficiency, first using dioctyl phthalate and then dry surrogate aerosols. Filtering efficiencies of the clean HEME, the clean HEME loaded with water, and the HEME at maximum dP were also collected using the two test aerosols. Results of the testing included differential pressure vs. time loading curves for the nine elements tested along with the mass of moisture and solid
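
    Filtering efficiency in challenge tests of this kind is conventionally computed as one minus penetration, the ratio of downstream to upstream aerosol concentration. A minimal sketch follows; the concentration values are invented and do not come from the study.

```python
def filtering_efficiency(upstream, downstream):
    """Aerosol filtering efficiency from upstream/downstream concentrations.

    Penetration is the fraction of challenge aerosol that passes through
    the element; efficiency is its complement.
    """
    penetration = downstream / upstream
    return 1.0 - penetration

# Hypothetical dioctyl phthalate challenge concentrations (particles/cm^3)
up, down = 1.2e5, 3.6
eff = filtering_efficiency(up, down)
print(f"{eff * 100:.4f}%")  # 99.9970%
```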

  6. Performance and evaluation of real-time multicomputer control systems

    NASA Technical Reports Server (NTRS)

    Shin, K. G.

    1983-01-01

    New performance measures, detailed examples, modeling of error detection process, performance evaluation of rollback recovery methods, experiments on FTMP, and optimal size of an NMR cluster are discussed.

  7. EXTRACTION AND QUANTITATIVE ANALYSIS OF ELEMENTAL SULFUR FROM SULFIDE MINERAL SURFACES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY. (R826189)

    EPA Science Inventory

    A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...

  8. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  9. LANDSAT-4 horizon scanner performance evaluation

    NASA Technical Reports Server (NTRS)

    Bilanow, S.; Chen, L. C.; Davis, W. M.; Stanley, J. P.

    1984-01-01

    Representative data spans covering a little more than a year since the LANDSAT-4 launch were analyzed to evaluate the flight performance of the satellite's horizon scanner. High frequency noise was filtered out by 128-point averaging. The effects of Earth oblateness and spacecraft altitude variations are modeled, and residual systematic errors are analyzed. A model for the predicted radiance effects is compared with the flight data and deficiencies in the radiance effects modeling are noted. Correction coefficients are provided for a finite Fourier series representation of the systematic errors in the data. Analysis of the seasonal dependence of the coefficients indicates the effects of some early mission problems with the reference attitudes which were computed by the onboard computer using star trackers and gyro data. The effects of sun and moon interference, unexplained anomalies in the data, and sensor noise characteristics and their power spectrum are described. The variability of full orbit data averages is shown. Plots of the sensor data for all the available data spans are included.
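
    The finite Fourier series representation of systematic errors mentioned above can be fitted by ordinary least squares against orbit phase. The sketch below uses synthetic data; the harmonic count, amplitudes, and noise level are invented for illustration.

```python
import numpy as np

def fit_fourier_series(theta, residual, n_harmonics):
    """Least-squares fit of a finite Fourier series to systematic errors.

    residual(theta) ~ a0 + sum_k [a_k cos(k*theta) + b_k sin(k*theta)],
    with theta the orbit phase in radians.
    """
    cols = [np.ones_like(theta)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * theta), np.sin(k * theta)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, residual, rcond=None)
    return coeffs

# Synthetic systematic error: 1/rev and 2/rev components plus sensor noise
rng = np.random.default_rng(1)
theta = np.linspace(0, 2 * np.pi, 512, endpoint=False)
truth = 0.1 + 0.05 * np.cos(theta) + 0.02 * np.sin(2 * theta)
resid = truth + rng.normal(0, 0.005, theta.size)
c = fit_fourier_series(theta, resid, n_harmonics=2)
print(np.round(c, 3))  # ~ [0.1, 0.05, 0, 0, 0.02]
```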

  10. Multiple component quantitative analysis for the pattern recognition and quality evaluation of Kalopanacis Cortex using HPLC.

    PubMed

    Men, Chu Van; Jang, Yu Seon; Lee, Kwan Jun; Lee, Jae Hyun; Quang, Tran Hong; Long, Nguyen Van; Luong, Hoang Van; Kim, Young Ho; Kang, Jong Seong

    2011-12-01

    Quantitative and pattern recognition analyses were conducted for quality evaluation of Kalopanacis Cortex (KC) using HPLC. For quantitative analysis, four bioactive compounds, liriodendrin, pinoresinol O-β-D-glucopyranoside, acanthoside B and kalopanaxin B, were determined. The analysis method was optimized and validated on an ODS column with a mobile phase of methanol and aqueous phosphoric acid. Validation gave acceptable linearities (r > 0.9995), recoveries (98.4% to 101.9%) and precisions (RSD < 2.20%). The limits of detection ranged from 0.4 to 0.9 μg/mL. Among the four compounds, liriodendrin is recommended as a marker compound for the quality control of KC. Pattern analysis was successfully carried out on thirty-two samples from four species, and authentic KC samples were completely discriminated from inauthentic species by linear discriminant analysis. The results indicate that the method is suitable for the quantitative analysis of liriodendrin and the quality evaluation of KC.
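
    The validation figures of merit quoted above (linearity, recovery precision, limit of detection) follow standard formulas; the sketch below uses ICH-style LOD/LOQ estimates and invented calibration data, not the study's values.

```python
import numpy as np

def calibration_metrics(conc, response, blank_sd):
    """Linearity, LOD and LOQ for an external-standard calibration.

    LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope (ICH-style estimates,
    with sigma taken as the standard deviation of blank responses).
    """
    slope, intercept = np.polyfit(conc, response, 1)
    r = np.corrcoef(conc, response)[0, 1]
    lod = 3.3 * blank_sd / slope
    loq = 10.0 * blank_sd / slope
    return slope, r, lod, loq

def precision_rsd(replicates):
    """Relative standard deviation (%) of replicate injections."""
    x = np.asarray(replicates, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()

# Hypothetical marker-compound calibration (µg/mL vs peak area)
conc = np.array([5, 10, 25, 50, 100], dtype=float)
area = np.array([52, 103, 256, 511, 1018], dtype=float)
slope, r, lod, loq = calibration_metrics(conc, area, blank_sd=1.5)
print(round(r, 4), round(lod, 2))
print(round(precision_rsd([254, 257, 256, 253, 255]), 2))  # RSD in %
```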

  11. Evaluating 'good governance': The development of a quantitative tool in the Greater Serengeti Ecosystem.

    PubMed

    Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea

    2016-10-01

    Protected areas (PAs) can provide important benefits for conservation and for communities. A key factor in the effective delivery of these benefits is governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles associated with groups of indicators. In contrast to the dominant qualitative approaches, this paper describes the development of a quantitative method for measuring the effectiveness of protected area governance as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. A set of 65 statements related to governance principles was developed from a literature review, and the instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that the statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem-based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate the governance of protected areas from a community-level perspective.
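
    Cronbach's alpha, one of the reliability measures cited above, can be computed directly from a respondents-by-items score matrix. The toy Likert responses below are invented and stand in for the items loading on one governance factor.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix of scores."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = X.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

# Toy data: 6 respondents x 4 Likert items for one hypothetical factor
scores = np.array([[4, 4, 5, 4],
                   [2, 3, 2, 2],
                   [5, 5, 4, 5],
                   [3, 3, 3, 3],
                   [4, 5, 4, 4],
                   [1, 2, 2, 1]])
alpha = cronbach_alpha(scores)
print(round(alpha, 3))  # high internal consistency for these toy items
```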

  12. Quantitative analysis combined with chromatographic fingerprint and antioxidant activities for the comprehensive evaluation of Compound Danshen Tablets.

    PubMed

    Chen, Jiao; Gao, Jiayue; Sun, Guoxiang

    2017-03-01

    The composition of traditional Chinese medicine is extremely complex, so it is difficult to ensure quality consistency. We took Compound Danshen Tablets as the object of study, using high-performance liquid chromatography to establish multiwavelength fusion fingerprints. Characteristic fingerprints of 30 batches of samples were generated at four wavelengths and evaluated by the systematic quantified fingerprint method. An on-line antioxidant determination method was used to determine the antioxidant components in Compound Danshen Tablets. The marker compounds were determined using the external standard method, so the fingerprint analysis also reflects their content. This study elucidated that multiwavelength fusion fingerprint profiles and multiple marker compound analysis, in conjunction with an assay of antioxidant activity, offer a reliable and efficient approach to quantitatively evaluating the quality consistency of traditional Chinese medicine and herbal preparations.

  13. Quantitative methods in the tuberculosis epidemiology and in the evaluation of BCG vaccination programs.

    PubMed

    Lugosi, L

    1986-01-01

    Controversies concerning the protective efficacy of BCG vaccination result mostly from the fact that quantitative methods have not been used in the evaluation of BCG programs. Therefore, to eliminate the current controversy, an unconditional requirement is to apply valid biostatistical models to analyse the results of the BCG programs. In order to achieve objective statistical inferences and epidemiological interpretations, the following conditions should be fulfilled: (1) data for evaluation have to be taken from epidemiological trials free of sampling error; (2) since morbidity rates are not normally distributed, an appropriate normalizing transformation is needed for point and confidence interval estimation; (3) only unbiased point estimates (dependent variables) should be used in valid models for hypothesis tests; (4) in cases of a rejected null hypothesis, the ranked estimates of the compared groups must be evaluated in a multiple comparison model in order to diminish the Type I error in the decision. The following quantitative methods are presented to evaluate the effectiveness of BCG vaccination in Hungary: linear regression analysis, stepwise regression analysis and log-linear analysis.

  14. 48 CFR 2936.201 - Evaluation of contractor performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Evaluation of contractor... Construction 2936.201 Evaluation of contractor performance. The HCA must establish procedures to evaluate construction contractor performance and prepare performance reports as required by FAR 36.201....

  15. 48 CFR 2452.216-73 - Performance evaluation plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Performance evaluation plan... 2452.216-73 Performance evaluation plan. As prescribed in 2416.406(e)(3), insert the following clause in all award fee contracts: Performance Evaluation Plan (AUG 1987) (a) The Government...

  16. Comparison of quantitative structure-activity relationship model performances on carboquinone derivatives.

    PubMed

    Bolboacă, Sorana-Daniela; Jäntschi, Lorentz

    2009-10-14

    Quantitative structure-activity relationship (qSAR) models are used to understand how the structure and activity of chemical compounds relate. In the present study, 37 carboquinone derivatives were evaluated and two different qSAR models were developed using members of the Molecular Descriptors Family (MDF) and the Molecular Descriptors Family on Vertices (MDFV). The usual parameters of regression models and the following estimators were defined and calculated in order to analyze the validity and to compare the models: Akaike's information criteria (three parameters), Schwarz (or Bayesian) information criterion, Amemiya prediction criterion, Hannan-Quinn criterion, Kubinyi function, Steiger's Z test, and Akaike's weights. The MDF and MDFV models proved to have the same estimation ability of the goodness-of-fit according to Steiger's Z test. The MDFV model proved to be the best model for the considered carboquinone derivatives according to the defined information and prediction criteria, Kubinyi function, and Akaike's weights.
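The information criteria used to compare the MDF and MDFV models can be computed from the residual sum of squares of a least-squares fit. A minimal sketch using the standard RSS-based forms of AIC and BIC plus Akaike weights (the numbers in the usage test are illustrative, not the paper's values):

```python
import numpy as np

def info_criteria(rss: float, n: int, k: int):
    """AIC and BIC for a least-squares model with k parameters fit to n points."""
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    return aic, bic

def akaike_weights(aics):
    """Akaike weights: relative support for each model given its AIC."""
    aics = np.asarray(aics, dtype=float)
    delta = aics - aics.min()      # AIC differences from the best model
    w = np.exp(-0.5 * delta)
    return w / w.sum()
```

A model with a markedly lower AIC captures nearly all of the Akaike weight, which is how "best model" conclusions like the paper's are typically phrased.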

  17. Performance evaluation of fault-tolerant systems with application to the IUS

    NASA Technical Reports Server (NTRS)

    Missana, J.-O.; Walker, Bruce K.

    1988-01-01

    A new method for quantitatively evaluating the performance of fault-tolerant systems is developed and applied to an example. The method assumes that the random failure and diagnostic decision behavior of the system can be modeled by a finite state Markov process. A performance value must be assigned to each of the states of the model. The method then generates the moments of the probability mass function of the cumulative performance and uses these to generate a maximum entropy approximation to the performance PMF. Some computational considerations are discussed. The method is applied to a typical mission for the Inertial Upper Stage to examine the attitude accuracy performance of the inertial system.
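As a simplified illustration of the modeling idea (the paper builds higher moments and a maximum-entropy PMF; this sketch computes only the first moment), the expected cumulative performance of a finite-state Markov model is:

```python
import numpy as np

def expected_cumulative_performance(P, r, p0, steps):
    """Mean cumulative performance of a finite-state Markov chain:
    at each step, accrue the state distribution dotted with the per-state
    performance values r, then propagate the distribution through P."""
    P, r, p = np.asarray(P, float), np.asarray(r, float), np.asarray(p0, float)
    total = 0.0
    for _ in range(steps):
        total += p @ r   # expected performance accrued this step
        p = p @ P        # propagate the state distribution one step
    return total
```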

  18. Importance of Purity Evaluation and the Potential of Quantitative 1H NMR as a Purity Assay

    PubMed Central

    2015-01-01

    In any biomedical and chemical context, a truthful description of chemical constitution requires coverage of both structure and purity. This qualification affects all drug molecules, regardless of development stage (early discovery to approved drug) and source (natural product or synthetic). Purity assessment is particularly critical in discovery programs and whenever chemistry is linked with biological and/or therapeutic outcome. Compared with chromatography and elemental analysis, quantitative NMR (qNMR) uses nearly universal detection and provides a versatile and orthogonal means of purity evaluation. Absolute qNMR with flexible calibration captures analytes that frequently escape detection (water, sorbents). Widely accepted structural NMR workflows require minimal or no adjustments to become practical 1H qNMR (qHNMR) procedures with simultaneous qualitative and (absolute) quantitative capability. This study reviews underlying concepts, provides a framework for standard qHNMR purity assays, and shows how adequate accuracy and precision are achieved for the intended use of the material. PMID:25295852
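Absolute qHNMR purity against an internal calibrant follows a standard ratio formula; a minimal sketch (the function name and the example values in the test are illustrative, not from the study):

```python
def qhnmr_purity(I_a, I_c, N_a, N_c, M_a, M_c, m_a, m_c, P_c):
    """Purity (as a fraction) of analyte a versus internal calibrant c:
    I = signal integral, N = number of protons in the integrated signal,
    M = molar mass (g/mol), m = weighed mass (same units for a and c),
    P_c = known purity of the calibrant."""
    return (I_a / I_c) * (N_c / N_a) * (M_a / M_c) * (m_c / m_a) * P_c
```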

  19. The development of a quantitative evaluation tool for simulations in nursing education.

    PubMed

    Todd, Martha; Manz, Julie A; Hawkins, Kim S; Parsons, Mary E; Hercinger, Maribeth

    2008-01-01

    In a complex healthcare environment, educating nursing students to safely care for clients is a challenging endeavor. As the use of high-fidelity simulations increases, the ability to evaluate students is essential. A review of the literature identified a lack of tested simulation evaluation instruments to accurately measure student performance. A simulation evaluation tool was developed and tested with senior nursing students. Content validity was established from the literature and from review of the tool by an expert panel. Reliability was established using sixteen simulation sessions, with two trained evaluators at each session. Percent agreement by evaluators ranged from 84.4% to 89.1%. Additional research is needed to verify these results with different evaluators, varying levels of students, and additional scenarios. A valid, reliable tool to evaluate simulation experiences improves student assessment skills and ultimately clinical performance.

  20. Quantitative Evaluation of Vascularity Using 2-D Power Doppler Ultrasonography May Not Identify Malignancy of the Thyroid.

    PubMed

    Yoon, Jung Hyun; Shin, Hyun Joo; Kim, Eun-Kyung; Moon, Hee Jung; Roh, Yun Ho; Kwak, Jin Young

    2015-11-01

    The purpose of this study was to evaluate the usefulness of a quantitative vascular index in predicting thyroid malignancy. A total of 1309 thyroid nodules in 1257 patients (mean age: 50.2 y, range: 18-83 y) were included. The vascularity pattern and vascular index (VI) measured by quantification software for each nodule were obtained from 2-D power Doppler ultrasonography (US). Gray-scale US + vascularity pattern was compared with gray-scale US + VI with respect to diagnostic performance. Of the 1309 thyroid nodules, 927 (70.8%) were benign and 382 (29.2%) were malignant. The area under the receiver operating characteristic curve (Az) for gray-scale US (0.82) was significantly higher than that for US combined with vascularity pattern (0.77) or VI (0.70, all p < 0.001). Quantified VIs were higher in benign nodules, but did not improve the performance of 2-D US in diagnosing thyroid malignancy.

  1. Space Suit Performance: Methods for Changing the Quality of Quantitative Data

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew; Benson, Elizabeth; Rajulu, Sudhakar

    2014-01-01

    NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. To verify that new suits will enable astronauts to perform to their maximum capacity, prototype suits must be built and tested with human subjects. However, engineers and flight surgeons often have difficulty understanding and applying traditional representations of human data without training. To overcome these challenges, NASA is developing modern simulation and analysis techniques that focus on 3D visualization. Understanding actual performance early in the design cycle is extremely advantageous for increasing performance capabilities, reducing the risk of injury, and reducing costs. The primary objective of this project was to test modern simulation and analysis techniques for evaluating the performance of a human operating in extra-vehicular space suits.

  2. [Quantitative analysis of deacylgymnemic acid by high-performance liquid chromatography].

    PubMed

    Suzuki, K; Ishihara, S; Uchida, M; Komoda, Y

    1993-04-01

    A quantitative high-performance liquid chromatography method was established for determining deacylgymnemic acid (DAGA) in the alkaline hydrolysate of samples containing gymnemic acids, which are ingredients of Gymnema sylvestre R. Br. leaves. This method was used to compare the contents of gymnemic acids in various samples. The amount of gymnemic acids analyzed as DAGA in a 70% ethanol extract of dry leaves was about twice that in a hot water extract. Commercial health-supplement foods from five companies were investigated for their content of gymnemic acids as DAGA, and large differences were found, from 38 to 251 mg in the daily dosage recommended by each company.

  3. [Aldehydes and ketones in silage: quantitative analysis by high performance liquid chromatography].

    PubMed

    Langin, D; Nguyen, P; Dumon, H; Malek, A

    1989-01-01

    Carbonyl compound toxicity is known in several species, but no study has been carried out with ruminants. Such volatile compounds exist in silages. After condensation of aldehydes and ketones with 2,4-dinitrophenylhydrazine, quantitative analysis was performed on 37 silages. Quantities of carbonyl compounds varied from 36 mg/kg of dry matter (DM) to 1,535 mg/kg DM, with a mean value of 642 mg/kg DM. Ethanal accounted for 63% of the total amount of carbonyl compounds (mol/kg DM). Other molecules were propanal, propanone, butanal (n- and iso-), butanone, and n- and iso-pentanal. The total amount of carbonyl compounds correlated positively with the dry matter percentage and negatively with the pH, crude fiber, ash content and volatile fatty acids. Thus, carbonyl compounds seem to be dependent upon silage storage conditions. Lactic flora could be involved in the synthesis of these compounds.

  4. Development and Evaluation of Event-Specific Quantitative PCR Method for Genetically Modified Soybean MON87701.

    PubMed

    Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument; the determined Cf was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. Trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases and RSDr values were less than 30% and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method should thus be applicable to practical analyses for the detection and quantification of MON87701.
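A hedged sketch of how a conversion factor is typically applied in event-specific GMO quantification: the event/endogenous-reference copy-number ratio is divided by Cf to give a weight percentage. The formula form is the commonly used one for such methods; only the Cf value of 1.24 comes from the abstract.

```python
def gmo_content_percent(event_copies: float, reference_copies: float,
                        cf: float = 1.24) -> float:
    """GMO amount (%) from real-time PCR copy numbers.
    Cf converts the event/reference copy-number ratio into a weight ratio;
    1.24 is the value the abstract reports for its instrument."""
    return (event_copies / reference_copies) / cf * 100.0
```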

  5. Assembling of Fluid Filtration System for Quantitative Evaluation of Microleakage in Dental Materials

    PubMed Central

    Javidi, Maryam; Naghavi, Neda; Roohani, Ehsan

    2008-01-01

    INTRODUCTION: There are several methods for evaluating microleakage in dentistry, for example dye or bacterial leakage, electro-chemical methods, radioisotope labeling and fluid filtration. The purpose of this study was to assemble a fluid filtration system for quantitative evaluation of microleakage in dental materials. MATERIALS AND METHODS: The roots were connected to a water-filled tube attached to a pressure supply. A bubble was introduced into the water to measure endodontic leakage. A digital camera and professional software were utilized to record and measure the bubble displacement. RESULTS: Our system was constructed successfully and functioned correctly. CONCLUSION: In this pilot study we found this system efficient for the evaluation of microleakage of dental materials. PMID:24146673

  6. Genetic algorithm based image binarization approach and its quantitative evaluation via pooling

    NASA Astrophysics Data System (ADS)

    Hu, Huijun; Liu, Ya; Liu, Maofu

    2015-12-01

    The binarized image is critical to visual feature extraction, especially of shape features, and image binarization approaches have attracted increasing attention in the past decades. In this paper, a genetic algorithm is applied to optimizing the binarization threshold of strip steel defect images. To evaluate our genetic algorithm based image binarization approach quantitatively, we propose a novel pooling-based evaluation metric, motivated by the information retrieval community, to address the lack of ground-truth binary images. Experimental results show that our genetic algorithm based binarization approach is effective and efficient on strip steel defect images, and that our quantitative pooling-based evaluation metric for image binarization is also feasible and practical.
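The abstract does not give the paper's fitness function; as one plausible sketch, a small genetic algorithm can evolve an 8-bit threshold against Otsu's between-class variance criterion (a standard choice, assumed here):

```python
import random

def between_class_variance(hist, t):
    """Otsu's criterion for threshold t over a 256-bin grayscale histogram."""
    total = sum(hist)
    w0 = sum(hist[:t])
    w1 = total - w0
    if w0 == 0 or w1 == 0:
        return 0.0
    mu0 = sum(i * hist[i] for i in range(t)) / w0        # mean below threshold
    mu1 = sum(i * hist[i] for i in range(t, 256)) / w1   # mean at/above threshold
    return (w0 / total) * (w1 / total) * (mu0 - mu1) ** 2

def ga_threshold(hist, pop_size=20, generations=40, seed=1):
    """Evolve an 8-bit binarization threshold maximizing between-class variance."""
    rng = random.Random(seed)
    pop = [rng.randrange(1, 256) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: between_class_variance(hist, t), reverse=True)
        survivors = pop[: pop_size // 2]                  # selection: keep top half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = (a + b) // 2                          # crossover: average parents
            if rng.random() < 0.3:                        # mutation: small jitter
                child = min(255, max(1, child + rng.randrange(-16, 17)))
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda t: between_class_variance(hist, t))
```

On a clearly bimodal histogram the evolved threshold lands in the valley between the two modes.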

  7. A methodology to quantitatively evaluate the safety of a glazing robot.

    PubMed

    Lee, Seungyeol; Yu, Seungnam; Choi, Junho; Han, Changsoo

    2011-03-01

    A new construction method using robots is spreading widely among construction sites in order to overcome labour shortages and frequent construction accidents. Along with economical efficiency, safety is a very important factor for evaluating the use of construction robots on construction sites. However, the quantitative evaluation of safety is difficult compared with that of economical efficiency. In this study, we suggested a safety evaluation methodology that defines the 'worker' and 'work conditions' as two risk factors: the 'worker' factor is posture load, and the 'work conditions' factor comprises the work environment and the risk exposure time. The posture load evaluation reflects the risk of musculoskeletal disorders which can be caused by work posture and the risk of accidents which can be caused by reduced concentration. We evaluated the risk factors that may cause various accidents such as falling, colliding, capsizing, and squeezing in work environments, and evaluated the operational risk by considering worker exposure time to risky work environments. With the results of the evaluations for each factor, we calculated the general operational risk and deduced the improvement ratio in operational safety from introducing a construction robot. To verify these results, we compared the safety of the existing manual labour and the proposed robotic construction methods for manipulating large glass panels.

  8. Sensitive on-chip quantitative real-time PCR performed on an adaptable and robust platform.

    PubMed

    Lund-Olesen, Torsten; Dufva, Martin; Dahl, John Arne; Collas, Philippe; Hansen, Mikkel Fougt

    2008-12-01

    A robust, flexible and efficient system for performing high sensitivity quantitative on-chip real-time PCR for research purposes is presented. The chips used consist of microchannels etched in silicon. The surface in the channels is a thermally grown silicon dioxide and the channel is sealed by a glass lid. The chips contain four PCR chambers but this number can be increased for further multiplexing. Contrary to PCR chips with oil covered open chambers, these channel-like chambers are easily integrated in lab-on-a-chip devices. The temperature is controlled by a Peltier element and the fluorochrome detector system is a commercially available fluorescence stereo microscope equipped with a CCD camera. The setup shows an excellent signal-to-noise ratio of about 400 compared to that of about 150 obtained in a commercial real time PCR machine. A detection limit of a few copies of target molecules is found, which is 100 to 100,000-fold better than other on-chip real-time PCR systems presented in the literature. This demonstrates that the PCR system can be used for critical applications. We also demonstrate that high quality melting curves can be obtained. Such curves are important in lab-on-a-chip systems for identification of amplified product. The usability of the system is validated by performing quantitative on-chip measurements of the amount of specific gene sequences co-immunoprecipitated with various posttranslationally modified histone proteins. Similar results are obtained from on-chip experiments and experiments carried out in a commercial system on larger sample volumes.

  9. Performance Evaluation of Voice Over Internet Protocol

    DTIC Science & Technology

    2002-12-01

    [Table-of-contents fragment: Table 1, Codec Comparison; Table 2, VoIP Packet Priority; Table 9, Comparison of VoIP Performance Measurement; Table 10, Mean Opinion Score.] A VoIP performance measurement on a local Ethernet is used as the baseline for performance comparison.

  10. 40 CFR Table 6 to Subpart Wwww of... - Basic Requirements for Performance Tests, Performance Evaluations, and Design Evaluations for New...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Tests, Performance Evaluations, and Design Evaluations for New and Existing Sources Using Add-On Control... Tests, Performance Evaluations, and Design Evaluations for New and Existing Sources Using Add-On Control... emissions to an add-on control device that is a PTE Meet the requirements for a PTE EPA method 204...

  11. 40 CFR Table 6 to Subpart Wwww of... - Basic Requirements for Performance Tests, Performance Evaluations, and Design Evaluations for New...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Tests, Performance Evaluations, and Design Evaluations for New and Existing Sources Using Add-On Control... Tests, Performance Evaluations, and Design Evaluations for New and Existing Sources Using Add-On Control... emissions to an add-on control device that is a PTE Meet the requirements for a PTE EPA method 204...

  12. Comparison of Diagnostic Performance of Semi-Quantitative Knee Ultrasound and Knee Radiography with MRI: Oulu Knee Osteoarthritis Study.

    PubMed

    Podlipská, Jana; Guermazi, Ali; Lehenkari, Petri; Niinimäki, Jaakko; Roemer, Frank W; Arokoski, Jari P; Kaukinen, Päivi; Liukkonen, Esa; Lammentausta, Eveliina; Nieminen, Miika T; Tervonen, Osmo; Koski, Juhani M; Saarakkala, Simo

    2016-03-01

    Osteoarthritis (OA) is a common degenerative musculoskeletal disease highly prevalent in aging societies worldwide. Traditionally, knee OA is diagnosed using conventional radiography. However, structural changes of articular cartilage or menisci cannot be directly evaluated using this method. On the other hand, ultrasound is a promising tool able to provide direct information on soft tissue degeneration. The aim of our study was to systematically determine the site-specific diagnostic performance of semi-quantitative ultrasound grading of knee femoral articular cartilage, osteophytes and meniscal extrusion, and of radiographic assessment of joint space narrowing and osteophytes, using MRI as a reference standard. Eighty asymptomatic and 79 symptomatic subjects with mean age of 57.7 years were included in the study. Ultrasound performed best in the assessment of femoral medial and lateral osteophytes, and medial meniscal extrusion. In comparison to radiography, ultrasound performed better or at least equally well in identification of tibio-femoral osteophytes, medial meniscal extrusion and medial femoral cartilage morphological degeneration. Ultrasound provides relevant additional diagnostic information on tissue-specific morphological changes not depicted by conventional radiography. Consequently, the use of ultrasound as a complementary imaging tool along with radiography may enable more accurate and cost-effective diagnostics of knee osteoarthritis at the primary healthcare level.

  13. Quantitative and fiber-selective evaluation of dose-dependent nerve blockade by intrathecal lidocaine in rats.

    PubMed

    Oda, Mayuko; Kitagawa, Norihito; Yang, Bang-Xiang; Totoki, Tadahide; Morimoto, Masatoshi

    2005-03-01

    We investigated whether cutaneous stimulus threshold (CST), as determined using a Neurometer, could be used for quantitative and fiber-differential evaluation of reversible and irreversible nerve block following intrathecal lidocaine administration in rats. Rats with intrathecal catheters were randomly assigned to one of five groups (saline or 2, 5, 10, or 20% lidocaine). Prior to and 4 days after drug administration, CST was determined at 5, 250, and 2000 Hz. In the 2% lidocaine group, CST from the end of lidocaine infusion to recovery from anesthesia was also monitored. Skin-clamp testing and gait observation were performed for comparison with CST findings. Behavioral examinations revealed persistent sensory or motor impairment lasting 4 days in groups receiving ≥5% lidocaine but not in the saline and 2% lidocaine groups. With 2% lidocaine, return to baseline CSTs at 5 and 250 Hz was delayed compared with thresholds at 2000 Hz. Although CSTs in the 5% group at 5 and 250 Hz increased significantly, thresholds at 2000 Hz did not differ from those in rats administered saline. CSTs with ≥10% lidocaine displayed no differences between frequencies. At each frequency, CSTs for rats given ≥5% lidocaine increased in a clearly concentration-dependent manner. These results suggest that CST testing enables evaluation of the different nerve functions of Aβ, Aδ, and C fibers in rats across lidocaine concentrations, and permits quantitative assessment of persistent neurological deficit induced by lidocaine.

  14. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  15. Quantitative evaluation of registration methods for atlas-based diffuse optical tomography

    NASA Astrophysics Data System (ADS)

    Wu, Xue; Eggebrecht, Adam T.; Culver, Joseph P.; Zhan, Yuxuan; Basevi, Hector; Dehghani, Hamid

    2013-06-01

    In Diffuse Optical Tomography (DOT), an atlas-based model can be used as an alternative to a subject-specific anatomical model for recovery of brain activity. The main step in generating an atlas-based subject model is registration of the atlas model to the subject head; the accuracy of DOT then relies on the accuracy of the registration method. In this work, 11 registration methods are quantitatively evaluated. Registration using the EEG 10/20 system with 19 landmarks and a non-iterative point-to-point algorithm yields approximately 1.4 mm surface error and is considered the most efficient registration method.
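A non-iterative point-to-point (landmark) registration of the kind described can be sketched as a Kabsch/Procrustes alignment; this is a generic implementation, not the authors' code, and reports the mean landmark residual rather than their surface-error metric:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src landmarks onto dst
    via the Kabsch algorithm (closed form, no iteration), plus the mean
    residual distance after alignment."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    err = np.linalg.norm((src @ R.T + t) - dst, axis=1).mean()
    return R, t, err
```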

  16. Quantitative evaluation of annular bright-field phase images in STEM.

    PubMed

    Ishida, Takafumi; Kawasaki, Tadahiro; Tanji, Takayoshi; Ikuta, Takashi

    2015-04-01

    A phase reconstruction method based on multiple scanning transmission electron microscope (STEM) images was evaluated quantitatively using image simulations. The simulation results indicated that the phase shift caused by a single atom was proportional to the 0.6th power of the atomic number Z. For a thin SrTiO3 [001] crystal, the reconstructed phase at each atomic column increased according to the specimen thickness. The STEM phase images can quantify the oxygen vacancy concentration if the thickness is less than several nanometers.

  17. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161

  18. Qualitative and quantitative evaluation of in vivo SD-OCT measurement of rat brain

    PubMed Central

    Xie, Yijing; Harsan, Laura-Adela; Bienert, Thomas; Kirch, Robert D.; von Elverfeldt, Dominik; Hofmann, Ulrich G.

    2017-01-01

    OCT has been demonstrated as an efficient imaging modality in various biomedical and clinical applications. However, there is a missing link with respect to the source of contrast between OCT and other modern imaging modalities: no quantitative comparison between them has yet been demonstrated. We evaluated, to our knowledge for the first time, in vivo OCT measurement of rat brain with our previously proposed forward imaging method, both qualitatively and quantitatively correlating OCT with the corresponding T1-weighted and T2-weighted magnetic resonance images, a fiber density map (FDM), and two types of histology staining (cresyl violet and acetylcholinesterase, AChE). Brain anatomical structures were identified and compared across the OCT, MRI and histology modalities. Noticeable resemblances corresponding to certain anatomical structures were found between OCT and the other image profiles. Correlation was quantitatively assessed by estimating the correlation coefficient (R) and mutual information (MI). The results show that the 1-D OCT measurements, in regard to the intensity profile and estimated attenuation factor, do not have a profound linear correlation with the other image modalities, as suggested by the correlation coefficient estimates. However, the mutual information analysis demonstrates markedly high MI values between OCT and MRI signals. PMID:28270970
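The two agreement measures used, the correlation coefficient (R) and mutual information (MI), can be estimated for paired 1-D signal profiles; a minimal histogram-based sketch (the bin count is an assumption, not taken from the paper):

```python
import numpy as np

def pearson_r(x, y):
    """Linear correlation coefficient between two 1-D signals."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

def mutual_information(x, y, bins=32):
    """Histogram-based MI estimate (in bits) between two 1-D signals."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                     # joint distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
```

MI captures nonlinear dependence that R misses, which is why signals can show weak linear correlation but markedly high MI.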

  19. Evaluation of the impact of peak description on the quantitative capabilities of comprehensive two-dimensional liquid chromatography.

    PubMed

    Place, Benjamin J; Morris, Mallory J; Phillips, Melissa M; Sander, Lane C; Rimmer, Catherine A

    2014-11-14

    Comprehensive, two-dimensional liquid chromatography (LC × LC) is a powerful technique for the separation of complex mixtures. Most studies using LC × LC are focused on qualitative efforts, such as increasing peak capacity. The present study examined the use of LC × LC-UV/vis for the separation and quantitation of polycyclic aromatic hydrocarbons (PAHs). More specifically, this study evaluated the impact of different peak integration approaches on the quantitative performance of the LC × LC method. For well-resolved three-dimensional peaks, parameters such as baseline definition, peak base shape, and peak width determination did not have a significant impact on accuracy and precision. For less-resolved peaks, a dropped baseline and the summation of all slices in the peak improved the accuracy and precision of the integration methods. The computational approaches to three-dimensional peak integration are provided, including fully descriptive, select slice, and summed heights integration methods, each with its own strengths and weaknesses. Overall, the integration methods presented quantify each of the PAHs within acceptable precision and accuracy ranges and have comparable performance to that of single dimension liquid chromatography.

  20. The Interpretation of Student Performance on Evaluative Tests.

    ERIC Educational Resources Information Center

    Aikenhead, Glen S.

    Reported is a study on the use of quantitative data in evaluating a science course for the purpose of introducing an alternative form of information presentation capable of supplying qualitative feedback valuable to students, teachers, and curriculum developers. Fifty-five teachers, randomly selected during the 1967-68 Project Physics (PP)…

  1. An evaluation of prospective motion correction (PMC) for high resolution quantitative MRI

    PubMed Central

    Callaghan, Martina F.; Josephs, Oliver; Herbst, Michael; Zaitsev, Maxim; Todd, Nick; Weiskopf, Nikolaus

    2015-01-01

    Quantitative imaging aims to provide in vivo neuroimaging biomarkers with high research and diagnostic value that are sensitive to underlying tissue microstructure. In order to use these data to examine intra-cortical differences or to define boundaries between different myelo-architectural areas, high resolution data are required. The quality of such measurements is degraded in the presence of motion, which hinders insight into brain microstructure. Correction schemes are therefore vital for high resolution, whole brain coverage approaches that have long acquisition times and greater sensitivity to motion. Here we evaluate the use of prospective motion correction (PMC) via an optical tracking system to counter intra-scan motion in a high resolution (800 μm isotropic) multi-parameter mapping (MPM) protocol. Data were acquired on six volunteers using a 2 × 2 factorial design permuting the following conditions: PMC on/off and motion/no motion. In the presence of head motion, PMC-based motion correction considerably improved the quality of the maps as reflected by fewer visible artifacts and improved consistency. The precision of the maps, parameterized through the coefficient of variation in cortical sub-regions, showed improvements of 11–25% in the presence of deliberate head motion. Importantly, in the absence of motion the PMC system did not introduce extraneous artifacts into the quantitative maps. The PMC system based on optical tracking offers a robust approach to minimizing motion artifacts in quantitative anatomical imaging without extending scan times. Such a robust motion correction scheme is crucial in order to achieve the ultra-high resolution required of quantitative imaging for cutting edge in vivo histology applications. PMID:25859178
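    The precision metric used above, the coefficient of variation within a region of interest, and the reported relative improvement can be expressed compactly (the numbers in the usage note are illustrative, not the study's data):

```python
import numpy as np

def coefficient_of_variation(values):
    """CoV (%) of repeated quantitative-map values within an ROI."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

def precision_improvement(cov_off, cov_on):
    """Relative reduction (%) in CoV when PMC is switched on."""
    return 100.0 * (cov_off - cov_on) / cov_off
```

    For example, a CoV falling from 20% (PMC off) to 15% (PMC on) corresponds to a 25% precision improvement, the kind of figure reported in the 11-25% range above.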

  2. Quantitative evaluation by measurement and modeling of the variations in dose distributions deposited in mobile targets.

    PubMed

    Ali, Imad; Alsbou, Nesreen; Taguenang, Jean-Michel; Ahmad, Salahuddin

    2017-03-03

    The objective of this study is to quantitatively evaluate, by measurement and modeling, the variations of dose distributions deposited in a mobile target. The effects of motion-induced variation in dose distribution on tumor dose coverage and sparing of normal tissues were investigated quantitatively. The dose distributions with motion artifacts were modeled considering different motion patterns that included (a) motion with constant speed and (b) sinusoidal motion. The model predictions of the dose distributions with motion artifacts were verified by measurement, where the dose distributions from various plans, including three-dimensional conformal and intensity-modulated fields, were measured with a multiple-diode-array detector (MapCHECK2) mounted on a mobile platform that moves with adjustable motion parameters. For each plan, the dose distributions were measured with MapCHECK2 using motion amplitudes from 0 to 25 mm. In addition, mathematical modeling was developed to predict the variations in the dose distributions and their dependence on the motion parameters, which included amplitude, frequency, and phase for sinusoidal motions. The dose distributions varied with motion and depended on the motion pattern, particularly for sinusoidal motion, where they spread out along the direction of motion. Study results showed that in the dose region between the isocenter and the 50% isodose line, the dose profile decreased with increasing motion amplitude. As the range of motion became larger than the field length along the direction of motion, the dose profiles changed overall, including the central-axis dose and the 50% isodose line. If the total dose was delivered over a time much longer than the period of the motion, variations in motion frequency and phase did not affect the dose profiles. As a result, the motion dose modeling developed in this study provided quantitative characterization of the variation in dose distributions induced by motion, which…
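    For the long-delivery-time limit described above, a common model (assumed here; the paper's own formulation is not reproduced) treats the delivered profile as the static profile convolved with the time-averaged position PDF of the sinusoidal motion:

```python
import numpy as np

def sinusoidal_motion_pdf(amplitude, dx):
    """Time-averaged position PDF of sinusoidal motion x = A*sin(wt),
    p(x) = 1 / (pi * sqrt(A^2 - x^2)), sampled on a grid of spacing dx
    (grid edges offset by dx/2 to avoid the integrable singularities)."""
    x = np.arange(-amplitude + dx / 2, amplitude, dx)
    p = 1.0 / (np.pi * np.sqrt(amplitude**2 - x**2))
    return p / p.sum()                      # normalize discretely

def blurred_dose_profile(static_dose, amplitude, dx):
    """Dose profile delivered to a sinusoidally moving target when the
    delivery spans many motion periods: convolution of the static
    profile with the motion PDF (frequency and phase drop out)."""
    if amplitude <= 0:
        return np.asarray(static_dose, dtype=float)
    kernel = sinusoidal_motion_pdf(amplitude, dx)
    return np.convolve(static_dose, kernel, mode="same")
```

    With a flat field profile, the blurred edges spread along the motion direction and the central-axis dose drops once the motion range exceeds the field length, consistent with the behavior reported above.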

  3. An evaluation of ARM radiosonde operational performance

    SciTech Connect

    Lesht, B.M.

    1995-06-01

    Because the ARM (Atmospheric Radiation Measurement) program uses data from radiosondes for real-time quality control and sensitive modeling applications, it is important to have a quantitative measure of the quality of the radiosonde data themselves. Two methods have been tried for estimating the quality of radiosonde data: comparisons with known standards before launch and examination of pseudo-replicate samples by single sensors aloft. The ground check procedure showed that the ARM radiosondes are within the manufacturer's specifications for measuring relative humidity; procedural artifacts prevented verification for temperature. Pseudo-replicates from ascent and descent suggest that the temperature measurement is within the specified ±0.2 °C. On average, ascent and descent data are similar, but detailed structure may be obscured on descent by loss of sampling density, and the descent involves other uncertainties.

  4. Quantitative evaluation protocol for upper limb motor coordination analysis in patients with ataxia.

    PubMed

    Marini, F; Chwastek, C; Romei, M; Cavalleri, M; Bonato, S; Reni, G

    2010-01-01

    Objective and quantitative measurement is crucial in the definition of functional impairment and in the tracking of disease progress over time in patients affected by progressive pathologies, such as ataxia. A new experimental procedure for the quantitative description of upper limb movement and coordination analysis was developed by integrating an optoelectronic system and a dedicated electronic board with four visual and pressure stimuli. Twenty passive retroreflective markers were placed on the subject's body, and two types of pointing tests were defined: in the first, the subjects were asked to reach each of the three targets with the index finger five consecutive times ("repetitive test"); in the second, the subjects were asked to reach the targets with the index finger in random order ("random test"). The preliminary results showed that patients affected by ataxia took more time, with a less smooth fingertip movement, to perform the reaching tests than healthy subjects. The velocity was lower and its profile more irregular in ataxic subjects. The newly developed experimental procedure appears very promising for the quantitative description of upper limb movements of pathological and healthy subjects, and it appears able to distinguish impairments due to different levels of ataxia.
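    One common way to quantify the "less smooth" and "more irregular" velocity profiles mentioned above is a dimensionless jerk metric; the sketch below is a generic implementation of that idea, not the authors' protocol:

```python
import numpy as np

def dimensionless_jerk(position, dt):
    """Dimensionless squared-jerk measure of a 1-D pointing movement:
    lower values indicate smoother motion. The normalization by
    duration^3 / peak_speed^2 makes the index scale-free."""
    p = np.asarray(position, dtype=float)
    vel = np.gradient(p, dt)                       # finite-difference speed
    jerk = np.gradient(np.gradient(vel, dt), dt)   # third derivative
    duration = dt * (len(p) - 1)
    peak_speed = np.abs(vel).max()
    return (jerk ** 2).sum() * dt * duration ** 3 / peak_speed ** 2
```

    A minimum-jerk-like reach scores low; the same reach with superimposed tremor-like oscillation scores markedly higher, which is the contrast one would expect between healthy and ataxic trajectories.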

  5. FLUORESCENT TRACER EVALUATION OF PROTECTIVE CLOTHING PERFORMANCE

    EPA Science Inventory

    Field studies evaluating chemical protective clothing (CPC), which is often employed as a primary control option to reduce occupational exposures during pesticide applications, are limited. This study, supported by the U.S. Environmental Protection Agency (EPA), was designed to...

  6. Evaluating Performances of Solar-Energy Systems

    NASA Technical Reports Server (NTRS)

    Jaffe, L. D.

    1987-01-01

    The CONC11 computer program calculates the performance of dish-type solar thermal collectors and power systems. A solar thermal power system consists of one or more collectors, power-conversion subsystems, and power-processing subsystems. CONC11 is intended to aid the system designer in comparing the performance of various design alternatives. Written in Athena FORTRAN and Assembler.

  7. Performance evaluation of image processing algorithms in digital mammography

    NASA Astrophysics Data System (ADS)

    Zanca, Federica; Van Ongeval, Chantal; Jacobs, Jurgen; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2008-03-01

    The purpose of the study is to evaluate the performance of different image processing algorithms in terms of the representation of microcalcification clusters in digital mammograms. Clusters were simulated in clinical raw ("for processing") images. The entire dataset consisted of 200 normal mammograms, selected from our clinical routine cases and acquired with a Siemens Novation DR system. In 100 of the normal images a total of 142 clusters were simulated; the remaining 100 normal mammograms served as true negative input cases. Both abnormal and normal images were processed with 5 commercially available processing algorithms: Siemens OpView1 and Siemens OpView2, Agfa Musica1, Sectra Mamea AB Sigmoid and IMS Raffaello Mammo 1.2. Five observers were asked to locate and score the cluster(s) in each image by means of a dedicated software tool. Observer performance was assessed using the JAFROC Figure of Merit. FROC curves, fitted using the IDCA method, have also been calculated. JAFROC analysis revealed significant differences among the image processing algorithms in the detection of microcalcification clusters (p=0.0000369). Calculated average Figures of Merit are: 0.758 for Siemens OpView2, 0.747 for IMS Processing 1.2, 0.736 for Agfa Musica1 processing, 0.706 for Sectra Mamea AB Sigmoid processing and 0.703 for Siemens OpView1. This study is a first step towards a quantitative assessment of image processing in terms of cluster detection in clinical mammograms. Although we showed a significant difference among the image processing algorithms, this method does not on its own allow for a global performance ranking of the investigated algorithms.

  8. Quantitative structure-retention relationships of pesticides in reversed-phase high-performance liquid chromatography.

    PubMed

    Aschi, Massimiliano; D'Archivio, Angelo Antonio; Maggi, Maria Anna; Mazzeo, Pietro; Ruggieri, Fabrizio

    2007-01-23

    In this paper, a quantitative structure-retention relationships (QSRR) method is employed to predict the retention behaviour of pesticides in reversed-phase high-performance liquid chromatography (HPLC). A six-parameter nonlinear model is developed by means of a feed-forward artificial neural network (ANN) with the back-propagation learning rule. Accurate description of the retention factors of 26 compounds, including commonly used insecticides, herbicides and fungicides and some metabolites, is successfully achieved. In addition to the acetonitrile content, included to describe the composition of the water-acetonitrile mobile phase, the octanol-water partition coefficient (from the literature) and four quantum chemical descriptors are considered to account for the effect of solute structure on retention. These are: the total dipole moment, the mean polarizability, the anisotropy of polarizability and a descriptor of hydrogen bonding ability based on the atomic charges on hydrogen bond donor and acceptor chemical functionalities. The proposed nonlinear QSRR model exhibits a high degree of correlation between observed and computed retention factors and a good predictive performance in a wide range of mobile-phase compositions (40-65%, v/v acetonitrile), which supports its application for the prediction of the chromatographic behaviour of unknown pesticides. A multilinear regression model based on the same six descriptors shows a significantly worse predictive capability.
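    A minimal numpy version of a one-hidden-layer feed-forward network trained by back-propagation, of the kind described above, might look as follows (descriptor values, network size, and learning rate are illustrative, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_qsrr_ann(X, y, hidden=4, lr=0.1, epochs=3000):
    """Train a one-hidden-layer tanh network by batch gradient descent
    (back-propagation) to map solute/eluent descriptors X
    (n_samples x n_features) to retention factors y. Returns a
    predictor function."""
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden)
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)               # hidden activations
        pred = h @ W2 + b2                     # linear output unit
        err = pred - y
        # Back-propagate mean-squared-error gradients.
        gW2 = h.T @ err / n
        gb2 = err.mean()
        gh = np.outer(err, W2) * (1 - h**2)    # tanh' = 1 - h^2
        gW1 = X.T @ gh / n
        gb1 = gh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return lambda Xq: np.tanh(Xq @ W1 + b1) @ W2 + b2
```

    In practice the six descriptors named in the abstract (acetonitrile content, log P, and the four quantum chemical descriptors) would be standardized before training, and predictive performance judged on held-out compounds rather than the training set.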

  9. EVALUATION OF QUANTITATIVE REAL TIME PCR FOR THE MEASUREMENT OF HELICOBATER PYLORI AT LOW CONCENTRATIONS IN DRINKING WATER

    EPA Science Inventory

    Aims: To determine the performance of a rapid, real time polymerase chain reaction (PCR) method for the detection and quantitative analysis Helicobacter pylori at low concentrations in drinking water.

    Methods and Results: A rapid DNA extraction and quantitative PCR (QPCR)...

  10. Towards a Quantitative Performance Measurement Framework to Assess the Impact of Geographic Information Standards

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.

    2012-12-01

    Over the last decades, the use of Geographic Information (GI) has gained importance, in the public as well as the private sector. But even though many spatial data sets and related information exist, they are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain, and prepare them for use in applications. Spatial Data Infrastructures (SDI) have therefore been developed to enhance the access, use, and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the 1990s many SDI initiatives have been launched. Ultimately, all these initiatives aim to enhance the flow of spatial data between organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes requires technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI-standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. Therefore the objective of the research is to develop a quantitative framework to assess the impact of GI-standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research will build upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI-performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques, which are frequently used to measure the performance of production processes (Anupindi et al., 2005). Key to reaching the research objectives…

  11. Team Primacy Concept (TPC) Based Employee Evaluation and Job Performance

    ERIC Educational Resources Information Center

    Muniute, Eivina I.; Alfred, Mary V.

    2007-01-01

    This qualitative study explored how employees learn from Team Primacy Concept (TPC) based employee evaluation and how they use the feedback in performing their jobs. TPC based evaluation is a form of multirater evaluation, during which the employee's performance is discussed by one's peers in a face-to-face team setting. The study used Kolb's…

  12. Quantitative Evaluation of the Total Magnetic Moments of Colloidal Magnetic Nanoparticles: A Kinetics-based Method.

    PubMed

    Liu, Haiyi; Sun, Jianfei; Wang, Haoyao; Wang, Peng; Song, Lina; Li, Yang; Chen, Bo; Zhang, Yu; Gu, Ning

    2015-06-08

    A kinetics-based method is proposed to quantitatively characterize the collective magnetization of colloidal magnetic nanoparticles. The method is based on the relationship between the magnetic force on a colloidal droplet and the movement of the droplet under a gradient magnetic field. Through computational analysis of the kinetic parameters, such as displacement, velocity, and acceleration, the magnetization of colloidal magnetic nanoparticles can be calculated. In our experiments, the values measured by using our method exhibited a better linear correlation with magnetothermal heating than those obtained by using a vibrating sample magnetometer and magnetic balance. This finding indicates that this method may be more suitable to evaluate the collective magnetism of colloidal magnetic nanoparticles under low magnetic fields than the commonly used methods. Accurate evaluation of the magnetic properties of colloidal nanoparticles is of great importance for the standardization of magnetic nanomaterials and for their practical application in biomedicine.
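    The kinematic idea can be sketched in a few lines: under a simple force balance m*a = M*V*(dB/dx), with viscous drag neglected (an assumption of this sketch, not necessarily of the paper), the magnetization follows from the tracked droplet trajectory:

```python
import numpy as np

def magnetization_from_kinetics(t, x, droplet_mass, droplet_volume, grad_B):
    """Estimate the collective magnetization M (A/m) of a ferrofluid
    droplet from its tracked position x(t) in a known field gradient
    dB/dx (T/m), using the force balance m*a = M*V*(dB/dx).
    Drag is neglected in this illustrative sketch."""
    a = np.gradient(np.gradient(x, t), t)   # acceleration by finite differences
    a_est = np.median(a)                    # robust to end-point artifacts
    return droplet_mass * a_est / (droplet_volume * grad_B)
```

    In a real measurement the acceleration would vary along the trajectory as the field gradient and drag change, so fitting the full displacement-velocity-acceleration history, as the paper describes, is preferable to a single-point estimate.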

  13. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears that treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasi-experimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  14. Evaluation of green coffee beans quality using near infrared spectroscopy: a quantitative approach.

    PubMed

    Santos, João Rodrigo; Sarraguça, Mafalda C; Rangel, António O S S; Lopes, João A

    2012-12-01

    Characterisation of coffee quality based on bean quality assessment is associated with the relative amount of defective beans among non-defective beans. It is therefore important to develop a methodology capable of identifying the presence of defective beans, one that enables a fast assessment of coffee grade and can become an analytical tool to standardise coffee quality. In this work, a methodology for quality assessment of green coffee based on near infrared spectroscopy (NIRS) is proposed. NIRS is a green-chemistry, low-cost, fast-response technique that requires no sample processing. The applicability of NIRS was evaluated for Arabica and Robusta varieties from different geographical locations. Partial least squares regression was used to relate the NIR spectrum to the mass fraction of defective and non-defective beans. Relative errors around 5% show that NIRS can be a valuable analytical tool for coffee roasters, enabling a simple, fast, and quantitative evaluation of green coffee quality.
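    A compact NIPALS PLS1 implementation illustrates the spectrum-to-mass-fraction regression described above (the spectra below are synthetic; in practice a validated chemometrics package and proper cross-validation would be used):

```python
import numpy as np

def pls1_train(X, y, n_components=2):
    """Minimal NIPALS PLS1 sketch: relates spectra X (n x p) to a scalar
    property y (n,), e.g. the mass fraction of defective beans.
    Returns a predict(Xnew) function."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = X.mean(axis=0), y.mean()
    Xk, yk = X - xm, y - ym                 # mean-centre both blocks
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk                       # weight vector
        w /= np.linalg.norm(w)
        t = Xk @ w                          # scores
        tt = t @ t
        p = Xk.T @ t / tt                   # X loadings
        q = yk @ t / tt                     # y loading
        Xk = Xk - np.outer(t, p)            # deflate
        yk = yk - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)     # regression vector (centred data)
    return lambda Xn: (np.asarray(Xn, dtype=float) - xm) @ B + ym
```

    With spectra that vary linearly with the defective-bean fraction, one or two latent variables already recover the property well; real NIR data typically need more components and preprocessing (e.g. scatter correction).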

  15. Quantitative evaluation of wall heat loads by lost fast ions in the Large Helical Device

    NASA Astrophysics Data System (ADS)

    Morimoto, Junki; Suzuki, Yasuhiro; Seki, Ryosuke

    2016-10-01

    In fusion plasmas, fast ions are produced by neutral beam injection (NBI), ion cyclotron heating (ICH), and fusion reactions. Some fast ions are lost from the plasma because of various drift motions and instabilities. These lost fast ions may damage plasma-facing components such as divertors and diagnostic instruments in fusion reactors. Therefore, the wall heat loads due to lost fast ions in the Large Helical Device (LHD) are under investigation. For this purpose, we have been developing a Monte Carlo code for the quantitative evaluation of wall heat loads that follows the guiding-center orbits of fast ions. Using this code, we investigate the wall heat loads and hitting points of lost fast ions produced by NBI in LHD. Magnetic field configurations, which depend on the beta value, affect the orbits of fast ions and the wall heat loads. Therefore, the wall heat loads due to fast ions in equilibrium magnetic fields, including finite-beta effects and magnetic islands, are quantitatively evaluated. The differences in wall heat loads and particle deposition patterns between the vacuum field and various finite-beta equilibrium fields will be presented at the meeting.

  16. Highly sensitive and quantitative evaluation of the EGFR T790M mutation by nanofluidic digital PCR.

    PubMed

    Iwama, Eiji; Takayama, Koichi; Harada, Taishi; Okamoto, Isamu; Ookubo, Fumihiko; Kishimoto, Junji; Baba, Eishi; Oda, Yoshinao; Nakanishi, Yoichi

    2015-08-21

    The T790M mutation of EGFR is a major mechanism of resistance to treatment with EGFR-TKIs. To date, however, only qualitative detection (presence or absence) of T790M has been described. Digital PCR (dPCR) analysis has recently been applied to the quantitative detection of target molecules in cancer with high sensitivity. In the present study, 25 tumor samples (13 obtained before and 12 after EGFR-TKI treatment) from 18 NSCLC patients with activating EGFR mutations were evaluated for T790M with dPCR. The ratio of the number of T790M alleles to that of activating mutation alleles (T/A) was determined. dPCR detected T790M in all 25 samples. Although T790M was present in all pre-TKI samples from 13 patients, 10 of these patients had a low T/A ratio and manifested substantial tumor shrinkage during treatment with EGFR-TKIs. In six of seven patients for whom both pre- and post-TKI samples were available, the T/A ratio increased markedly during EGFR-TKI treatment. Highly sensitive dPCR thus detected T790M in all NSCLC patients harboring activating EGFR mutations, whether or not they had received EGFR-TKI treatment. Not only highly sensitive but also quantitative detection of T790M is important for evaluation of the contribution of T790M to EGFR-TKI resistance.
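    Digital PCR quantification conventionally applies a Poisson correction to partition counts before forming ratios such as T/A; a minimal sketch of that standard calculation (the partition counts in the usage note are illustrative):

```python
import math

def dpcr_copies(positive, total):
    """Mean target copies per partition from digital-PCR counts, using
    the standard Poisson correction lambda = -ln(1 - p), where p is
    the fraction of positive partitions."""
    p = positive / total
    return -math.log(1.0 - p)

def t790m_allele_ratio(pos_t790m, pos_activating, total):
    """T/A ratio: Poisson-corrected T790M copies relative to
    Poisson-corrected activating-mutation copies."""
    return dpcr_copies(pos_t790m, total) / dpcr_copies(pos_activating, total)
```

    For example, 100 T790M-positive and 1,000 activating-mutation-positive partitions out of 10,000 give a T/A ratio just under 0.1; without the Poisson correction the ratio would be biased upward at high partition occupancy.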

  17. Quantitative evaluation on internal seeing induced by heat-stop of solar telescope.

    PubMed

    Liu, Yangyi; Gu, Naiting; Rao, Changhui

    2015-07-27

    The heat-stop is one of the essential thermal control devices of a solar telescope. The internal seeing induced by its temperature rise will degrade the imaging quality significantly. For quantitative evaluation of internal seeing, an integrated analysis method based on computational fluid dynamics and geometric optics is proposed in this paper. Firstly, the temperature field of the heat-affected zone induced by the heat-stop temperature rise is obtained by computational fluid dynamics calculation. Secondly, the temperature field is transformed into a refractive index field by the corresponding equations. Thirdly, the wavefront aberration induced by internal seeing is calculated by geometric optics, based on optical integration through the refractive index field. This integrated method is applied to the heat-stop of the Chinese Large Solar Telescope to quantitatively evaluate its internal seeing. The analytical results show that the maximum acceptable temperature rise of the heat-stop is 5 K above the ambient air at any telescope pointing direction, under the condition that the root-mean-square of the wavefront aberration induced by internal seeing is less than 25 nm. Furthermore, it is found that the magnitude of the wavefront aberration gradually increases with increasing heat-stop temperature rise for a given telescope pointing direction. Meanwhile, as the telescope pointing varies from the horizontal to the vertical direction, the magnitude of the wavefront aberration first decreases and then increases for the same heat-stop temperature rise.
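    The optical-integration step can be illustrated with a simple ray-sum of refractivity through a gridded temperature field, using the ideal-gas scaling (n - 1) proportional to 1/T at constant pressure (the constants, grid, and straight-ray assumption here are illustrative, not the paper's model):

```python
import numpy as np

def wavefront_rms_from_temperature(T_field, dz, n0_minus_1=2.7e-4, T_ref=293.0):
    """RMS wavefront aberration (m) across an aperture from a gridded
    air-temperature field T_field (nz x nx, kelvin), integrating along
    axis 0 (the propagation direction) with step dz (m). Air
    refractivity is scaled as (n - 1) = n0_minus_1 * T_ref / T."""
    n_minus_1 = n0_minus_1 * T_ref / np.asarray(T_field, dtype=float)
    opd = n_minus_1.sum(axis=0) * dz        # optical path per straight ray
    opd -= opd.mean()                       # remove piston term
    return float(np.sqrt(np.mean(opd ** 2)))
```

    A uniform field yields zero aberration, and a warmer column of air lowers the local optical path, with the RMS growing as the temperature rise increases, matching the trend reported above.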

  18. Quantitative evaluation of malignant gliomas damage induced by photoactivation of IR700 dye

    PubMed Central

    Sakuma, Morito; Kita, Sayaka; Higuchi, Hideo

    2016-01-01

    The processes involved in malignant gliomas damage were quantitatively evaluated by microscopy. The near-infrared fluorescent dye IR700 that is conjugated to an anti-CD133 antibody (IR700-CD133) specifically targets malignant gliomas (U87MG) and stem cells (BT142) and is endocytosed into the cells. The gliomas are then photodamaged by the release of reactive oxygen species (ROS) and the heat induced by illumination of IR700 by a red laser, and the motility of the vesicles within these cells is altered as a result of cellular damage. To investigate these changes in motility, we developed a new method that measures fluctuations in the intensity of phase-contrast images obtained from small areas within cells. The intensity fluctuation in U87MG cells gradually decreased as cell damage progressed, whereas the fluctuation in BT142 cells increased. The endocytosed IR700 dye was co-localized in acidic organelles such as endosomes and lysosomes. The pH in U87MG cells, as monitored by a pH indicator, was decreased and then gradually increased by the illumination of IR700, while the pH in BT142 cells increased monotonically. In these experiments, the processes of cell damage were quantitatively evaluated according to the motility of vesicles and changes in pH. PMID:27877897

  19. Quantitative evaluation of cervical cord compression by computed tomographic myelography in Thoroughbred foals

    PubMed Central

    YAMADA, Kazutaka; SATO, Fumio; HADA, Tetsuro; HORIUCHI, Noriyuki; IKEDA, Hiroki; NISHIHARA, Kahori; SASAKI, Naoki; KOBAYASHI, Yoshiyasu; NAMBO, Yasuo

    2016-01-01

    Five Thoroughbred foals (age, 8–33 weeks; median age, 31 weeks; weight, 122–270 kg; median weight, 249 kg) exhibiting ataxia with suspected cervical myelopathy (n=4) and limb malformation (n=1) were subjected to computed tomographic (CT) myelography. The areas of the subarachnoid space and cervical cord were measured on transverse CT images. The area of the cervical cord was divided by the area of the subarachnoid space, and the resulting stenosis ratios were quantitatively evaluated and compared against histopathological examination. Sites with a ratio above 52.8% could have been primary lesion sites in the histopathological examination, although one site with a ratio of 54.1% was not a primary lesion site. Therefore, in this study, a ratio between 52.8–54.1% was suggested to be borderline for physical compression that damages the cervical cord. Not all the cervical vertebrae could be scanned in three of the five cases. Therefore, CT myelography is not a suitable method for locating the site of compression, but it should be used for quantitative evaluation of cervical stenosis diagnosed by conventional myelography. In conclusion, the stenosis ratios determined using CT myelography could be applicable for detecting primary lesion sites in the cervical cord. PMID:27974873

  20. Panoramic imaging is not suitable for quantitative evaluation, classification, and follow up in unilateral condylar hyperplasia.

    PubMed

    Nolte, J W; Karssemakers, L H E; Grootendorst, D C; Tuinzing, D B; Becking, A G

    2015-05-01

    Patients with suspected unilateral condylar hyperplasia are often screened radiologically with a panoramic radiograph, but this is not sufficient for routine diagnosis and follow up. We have therefore made a quantitative analysis and evaluation of panoramic radiographs in a large group of patients with the condition. During the period 1994-2011, 132 patients with 113 panoramic radiographs were analysed using a validated method. There was good reproducibility between observers, but the condylar neck and head were the regions reported with least reliability. Although in most patients asymmetry of the condylar head, neck, and ramus was confirmed, the kappa coefficient as an indicator of agreement between two observers was poor (-0.040 to 0.504). Hardly any difference between sides was measured at the gonion angle, and the body appeared to be higher on the affected side in 80% of patients. Panoramic radiographs might be suitable for screening, but are not suitable for the quantitative evaluation, classification, and follow up of patients with unilateral condylar hyperplasia.

  1. Quantitative evaluation of stemflow flux during the rainfall-discharge process in a forested area

    NASA Astrophysics Data System (ADS)

    Ikawa, R.; Shimada, J.; Shimizu, T.

    2006-12-01

    Stemflow is very important as a point-source input of precipitation and tree solutes to the ground surface in a forest. However, its hydrological significance has often been neglected because its quantitative contribution per unit area is small compared with throughfall. In densely forested areas with relatively high rainfall, recent studies point out that stemflow has a significant influence on runoff generation, soil erosion, groundwater recharge, soil solution chemistry, and the distribution of understory vegetation and epiphytes (Levia and Frost, 2003). It is known that clear differences in isotopic composition and chemistry exist among gross rainfall, throughfall, and stemflow, even within a single rainfall event. In order to evaluate the stemflow contribution to infiltration into forest soil and groundwater, precise isotopic observation of rainfall and river discharge water during the rainfall-discharge process has been conducted in a densely forested headwater catchment of the Kahoku experimental forest (KHEW: 33°08'N, 133°43'E), Kyushu Island, Japan, since June 2004. Water samples of gross rainfall, throughfall, stemflow, and river water were collected every hour using an automatic water sampler. These samples were analyzed for deuterium and oxygen stable isotopes, inorganic water chemistry, and dissolved silica. To evaluate the stemflow contribution during the rainfall-discharge process, a catchment-scale tank model was constructed using stemflow and throughfall as inputs, and the isotopic fluctuation of river water during a rainfall event was calculated with this model and evaluated against the observed isotopic fluctuation in the river water. At the AGU Fall Meeting, we will explain in more detail the quantitative evaluation method for the stemflow contribution during the rainfall-discharge process using chemical and isotopic data and the tank model.

  2. A novel integrated approach to quantitatively evaluate the efficiency of extracellular polymeric substances (EPS) extraction process.

    PubMed

    Sun, Min; Li, Wen-Wei; Yu, Han-Qing; Harada, Hideki

    2012-12-01

    A novel integrated approach is developed to quantitatively evaluate the extracellular polymeric substances (EPS) extraction efficiency after taking into account EPS yield, EPS damage, and cell lysis. This approach incorporates grey relational analysis and fuzzy logic analysis, in which the evaluation procedure is established on the basis of grey relational coefficients generation, membership functions construction, and fuzzy rules description. The flocculation activity and DNA content of EPS are chosen as the two evaluation responses. To verify the feasibility and effectiveness of this integrated approach, EPS from Bacillus megaterium TF10 are extracted using five different extraction methods, and their extraction efficiencies are evaluated as one real case study. Based on the evaluation results, the maximal extraction grades and corresponding optimal extraction times of the five extraction methods are ordered as EDTA, 10 h > formaldehyde + NaOH, 60 min > heating, 120 min > ultrasonication, 30 min > H₂SO₄, 30 min > control. The proposed approach here offers an effective tool to select appropriate EPS extraction methods and determine the optimal extraction conditions.
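    The grey relational part of the evaluation can be sketched as follows (the deviation-based coefficient with distinguishing coefficient rho = 0.5 is the textbook form; the response values in the usage note are illustrative, not the study's data):

```python
import numpy as np

def grey_relational_grade(reference, alternatives, rho=0.5):
    """Grey relational grades of each alternative (rows) against a
    reference sequence, after min-max normalization of each criterion
    (column). rho is the distinguishing coefficient; the grade is the
    mean grey relational coefficient across criteria."""
    data = np.vstack([reference, alternatives]).astype(float)
    lo, hi = data.min(axis=0), data.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)     # guard constant columns
    norm = (data - lo) / span
    delta = np.abs(norm[1:] - norm[0])         # deviation sequences
    dmin, dmax = delta.min(), delta.max()
    xi = (dmin + rho * dmax) / (delta + rho * dmax)
    return xi.mean(axis=1)
```

    An extraction method whose responses (e.g. flocculation activity and DNA content, the two responses named above) match the reference exactly receives grade 1, and grades fall as the deviation grows, which is what makes the grade usable for ranking extraction methods.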

  3. EVALUATION OF VENTILATION PERFORMANCE FOR INDOOR SPACE

    EPA Science Inventory

    The paper discusses a personal-computer-based application of computational fluid dynamics that can be used to determine the turbulent flow field and time-dependent/steady-state contaminant concentration distributions within isothermal indoor space. (NOTE: Ventilation performance ...

  4. 48 CFR 8.406-7 - Contractor Performance Evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Contractor Performance... ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Federal Supply Schedules 8.406-7 Contractor Performance Evaluation. Ordering activities must prepare an evaluation of contractor performance for...

  5. Patterned Armor Performance Evaluation for Multiple Impacts

    DTIC Science & Technology

    2003-08-01

    Army Research Laboratory, Aberdeen Proving Ground, MD 21005-5069. ARL-TR-3038, August 2003. Patterned Armor ... patterned armor performance against multiple impacts. This performance measure can then be compared to a well-posed multiple-hit criterion to assess ... Aberdeen Proving Ground, MD, September 2002. de Rosset, W. S. Reactive Armor Model Sensitivity Studies; ARL-TR-1849; U.S. Army Research Laboratory

  6. Evaluation of performance impairment by spacecraft contaminants

    NASA Technical Reports Server (NTRS)

    Geller, I.; Hartman, R. J., Jr.; Mendez, V. M.

    1977-01-01

    The environmental contaminants (isolated as off-gases in Skylab and Apollo missions) were evaluated. Specifically, six contaminants were evaluated for their effects on the behavior of juvenile baboons. The concentrations of contaminants were determined through preliminary range-finding studies with laboratory rats. The contaminants evaluated were acetone, methyl ethyl ketone (MEK), methyl isobutyl ketone (MIBK), trichloroethylene (TCE), heptane and Freon 21. When the studies of the individual gases were completed, the baboons were also exposed to a mixture of MEK and TCE. The data obtained revealed alterations in the behavior of baboons exposed to relatively low levels of the contaminants. These findings were presented at the First International Symposium on Voluntary Inhalation of Industrial Solvents in Mexico City, June 21-24, 1976. A preprint of the proceedings is included.

  7. A model for evaluating the social performance of construction waste management

    SciTech Connect

    Yuan Hongping

    2012-06-15

    Highlights: (1) Scant attention is paid to the social performance of construction waste management (CWM). (2) We develop a model for assessing the social performance of CWM. (3) With the model, the social performance of CWM can be quantitatively simulated. - Abstract: Existing literature shows that considerable research effort has been devoted to the economic performance of construction waste management (CWM), but far less attention has been paid to its social performance. This study therefore develops a model for quantitatively evaluating the social performance of CWM using a system dynamics (SD) approach. First, major variables affecting the social performance of CWM are identified, and a holistic system for assessing it is formulated in line with the feedback relationships underlying these variables. The developed system is then converted into an SD model using the software iThink. An empirical case study is finally conducted to demonstrate application of the model. Results of model validation indicate that the model is robust and reasonably reflects the situation of the real system under study. Findings of the case study offer helpful insights into effectively promoting the social performance of CWM for the project investigated. Furthermore, the model shows great potential as an experimental platform for dynamically evaluating the effects of management measures on the social performance of CWM in construction projects.

  8. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    SciTech Connect

    Gauld, Ian C.; Hu, Jianwei; De Baere, P.; Vaccaro, S.; Schwalbach, P.; Liljenfeldt, Henrik; Tobin, Stephen

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel
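The prediction-versus-measurement statistics quoted above (e.g. an average relative standard deviation of 4.6% for neutron count rates) can be sketched as follows; the per-assembly values here are made up for illustration, not measurement data.

```python
import statistics

# Sketch of the comparison statistic implied by the abstract: the relative
# standard deviation (in percent) of predicted-to-measured ratios over a
# set of assemblies. Values are illustrative, not Clab data.
def relative_std_of_ratios(predicted, measured):
    ratios = [p / m for p, m in zip(predicted, measured)]
    return 100.0 * statistics.stdev(ratios) / statistics.mean(ratios)

predicted = [1.02, 0.98, 1.05, 0.95, 1.00]  # normalized ORIGEN predictions
measured = [1.00] * 5                        # normalized Fork measurements
rsd = relative_std_of_ratios(predicted, measured)
```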

  9. Quantitative microscopic evaluation of mucin areas and its percentage in mucinous carcinoma of the breast using tissue histological images.

    PubMed

    Saha, Monjoy; Arun, Indu; Basak, Bijan; Agarwal, Sanjit; Ahmed, Rosina; Chatterjee, Sanjoy; Bhargava, Rohit; Chakraborty, Chandan

    2016-06-01

    Mucinous carcinoma (MC) of the breast is a very rare (∼1-7% of all breast cancers) invasive ductal carcinoma. The presence of pools of extracellular mucin is one of the most important histological features of MC. This paper aims to develop a quantitative computer-aided methodology for automated identification of mucin areas and their percentage using tissue histological images. The proposed method includes pre-processing (i.e., colour space transformation and colour normalization), mucin region segmentation, post-processing, and performance evaluation. The proposed algorithm achieved 97.74% segmentation accuracy against ground truth. In addition, the percentage of mucin present in the tissue regions is calculated via the mucin index (MI) for grading MC (pure, moderately, or minimally mucinous).
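A plausible sketch of the mucin index (MI) named above, assuming MI is defined as the percentage of tissue area segmented as mucin; the exact formula and the toy masks below are assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical mucin index: percentage of tissue pixels labelled as mucin
# by the segmentation stage. Masks here are toy examples.
def mucin_index(mucin_mask, tissue_mask):
    mucin = np.logical_and(mucin_mask, tissue_mask).sum()
    tissue = tissue_mask.sum()
    return 100.0 * mucin / tissue

# Toy 4x4 masks: 12 tissue pixels, 6 of them mucin.
tissue = np.ones((4, 4), dtype=bool)
tissue[0, :] = False            # top row is background
mucin = np.zeros((4, 4), dtype=bool)
mucin[1:3, 1:4] = True          # 6 mucin pixels inside the tissue region
mi = mucin_index(mucin, tissue)
```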

  10. Seismic performance evaluation of substation structures

    SciTech Connect

    Hwang, H.H.M.; Huo, J.R.

    1995-12-31

    This paper presents an approach for evaluating seismic hazards at the site and generating fragility curves for structures such as the capacitor bank in one of the major substations in the Memphis electric transmission system. The results from this study will be used to determine the adequacy of electric supply to several major hospitals in downtown Memphis after a large New Madrid earthquake.

  11. Disaggregating Pupil Performance Scores: Evaluating School Effectiveness.

    ERIC Educational Resources Information Center

    Allen, Henriette L.; Tadlock, James A.

    This report describes various components of the process of disaggregating student achievement data and provides examples of each component. Information is provided to allow school districts to conduct their own school effectiveness evaluation. Results of the California Achievement Tests form one of the bases of the analysis. The underlying…

  12. Performance Evaluation of Artificial Intelligence Systems

    DTIC Science & Technology

    1987-08-17

    difficulty, Veit and Callero (1981) have developed an evaluation technique called the Subjective Transfer Function (STF) approach. In the STF... Information Retrieval, Montreal, Canada. Veit, C.T.; Callero, M. (1981). Subjective Transfer Function Approach to Complex System Analysis. Rand Corp.

  13. Performance evaluation of 1 kW PEFC

    SciTech Connect

    Komaki, Hideaki; Tsuchiyama, Syozo

    1996-12-31

    This report covers part of a joint study on a PEFC propulsion system for surface ships, summarized in a presentation to this Seminar entitled "Study on a PEFC Propulsion System for Surface Ships", which envisages application to a 1,500 DWT cargo vessel. The aspect treated here concerns the effects of conditions particular to shipboard operation on PEFC operating performance. The performance characteristics were examined through tests performed on a 1 kW stack and on a single cell (manufactured by Fuji Electric Co., Ltd.). The tests covered items (1) to (4) cited in the headings of the sections that follow. Specifications of the stack and single cell are as given.

  14. Quantitative evaluation of regularized phase retrieval algorithms on bone scaffolds seeded with bone cells

    NASA Astrophysics Data System (ADS)

    Weber, L.; Langer, M.; Tavella, S.; Ruggiu, A.; Peyrin, F.

    2016-05-01

    In the field of regenerative medicine, there has been a growing interest in studying the combination of bone scaffolds and cells that can maximize newly formed bone. In-line phase-contrast x-ray tomography was used to image porous bone scaffolds (Skelite©), seeded with bone forming cells. This technique allows the quantification of both mineralized and soft tissue, unlike with classical x-ray micro-computed tomography. Phase contrast images were acquired at four distances. The reconstruction is typically performed in two successive steps: phase retrieval and tomographic reconstruction. In this work, different regularization methods were applied to the phase retrieval process. The application of a priori terms for heterogeneous objects enables quantitative 3D imaging of not only bone morphology, mineralization, and soft tissue formation, but also cells trapped in the pre-bone matrix. A statistical study was performed to derive statistically significant information on the different culture conditions.

  15. Performance evaluation of SAR/GMTI algorithms

    NASA Astrophysics Data System (ADS)

    Garber, Wendy; Pierson, William; Mcginnis, Ryan; Majumder, Uttam; Minardi, Michael; Sobota, David

    2016-05-01

    There is a history and understanding of exploiting moving targets within ground moving target indicator (GMTI) data, including methods for modeling performance. However, many assumptions valid for GMTI processing are invalid for synthetic aperture radar (SAR) data. For example, traditional GMTI processing assumes targets are exo-clutter and a system that uses a GMTI waveform, i.e. low bandwidth (BW) and low pulse repetition frequency (PRF). Conversely, SAR imagery is typically formed to focus data at zero Doppler and requires high BW and high PRF. Therefore, many of the techniques used in performance estimation of GMTI systems are not valid for SAR data. However, as demonstrated by papers in the recent literature [1-11], there is interest in exploiting moving targets within SAR data. The techniques employed vary widely, including filter banks to form images at multiple Dopplers, performing smear detection, and attempting to address the issue through waveform design. The above work validates the need for moving target exploitation in SAR data, but it does not represent a theory allowing for the prediction or bounding of performance. This work develops an approach to estimate and/or bound performance for moving target exploitation specific to SAR data. Synthetic SAR data is generated across a range of sensor, environment, and target parameters to test the exploitation algorithms under specific conditions. This provides a design tool allowing radar systems to be tuned for specific moving target exploitation applications. In summary, we derive a set of rules that bound the performance of specific moving target exploitation algorithms under variable operating conditions.

  16. Evaluation of relative MS response factors of drug metabolites for semi-quantitative assessment of chemical liabilities in drug discovery.

    PubMed

    Blanz, Joachim; Williams, Gareth; Dayer, Jerôme; Délémonté, Thierry; Gertsch, Werner; Ramstein, Philippe; Aichholz, Reiner; Trunzer, Markus; Pearson, David

    2017-02-02

    Drug metabolism studies are performed in drug discovery to identify metabolic soft spots, detect potentially toxic or reactive metabolites, and provide an early insight into potential species differences. The relative peak area approach is often used to semi-quantitatively estimate the abundance of metabolites. Differences in the LC/MS responses result in an under- or overestimation of the metabolite and misinterpretation of results. The relative MS response factors of 132 structurally diverse drug candidates and their 233 corresponding metabolites were evaluated using a capillary-LC/HRMS system. All of the synthesized metabolites discussed here were previously identified as key biotransformation products in discovery investigations or predicted to be formed. The most commonly occurring biotransformation mechanisms, such as oxygenation, dealkylation, and amide cleavage, are represented within this dataset. However, relatively few phase II metabolites were evaluated due to the limited availability of authentic standards. 85% of these metabolites had a relative response factor (RF) between 0.2 (5-fold under-prediction) and 2.0 (2-fold over-prediction), and the median MS RF was 0.6. Exceptions to this included very small metabolites that were hardly detectable. Additional experiments performed to understand the impact of the MS platform, flow rate, and concentration suggested that these parameters do not have a significant impact on the RF of the compounds tested. This indicates that the use of relative peak areas to semi-quantitatively estimate the abundance of metabolites is justified in the drug discovery setting in order to guide medicinal chemistry efforts.
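The relative response factor (RF) can be sketched as the metabolite's MS response per unit concentration normalized to the parent drug's; the definition and values below are illustrative assumptions consistent with the abstract's usage (an RF of 0.2 implies a 5-fold under-prediction by raw peak areas).

```python
# Sketch (assumed definition): metabolite response per unit concentration,
# relative to the parent drug's. Areas and concentrations are illustrative.
def response_factor(metabolite_area, metabolite_conc, parent_area, parent_conc):
    return (metabolite_area / metabolite_conc) / (parent_area / parent_conc)

# An RF of 0.5 means raw peak-area ratios under-estimate the metabolite
# 2-fold; dividing the area ratio by the RF corrects the estimate.
rf = response_factor(metabolite_area=5.0e5, metabolite_conc=1.0,
                     parent_area=1.0e6, parent_conc=1.0)
corrected_ratio = (5.0e5 / 1.0e6) / rf
```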

  17. Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging.

    PubMed

    Oishi, Kenichi; Faria, Andreia V; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2013-11-01

    The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in the clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development of the pediatric population. To interpret the values from these MR modalities, a "growth percentile chart," which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured anatomical structures. To avoid inter- and intra-reader variability about the anatomical boundary definition, and hence, to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, has been introduced.

  18. Reprint of "Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging".

    PubMed

    Oishi, Kenichi; Faria, Andreia V; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2014-02-01

    The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in the clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development of the pediatric population. To interpret the values from these MR modalities, a "growth percentile chart," which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured anatomical structures. To avoid inter- and intra-reader variability about the anatomical boundary definition, and hence, to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, has been introduced.

  19. SAFIR operation and evaluation of its performance

    NASA Astrophysics Data System (ADS)

    Kawasaki, Z.-I.; Yamamoto, K.; Matsuura, K.; Richard, P.; Matsui, Toshiaki; Sonoi, Yasuo; Shimokura, Naoyoshi

    1994-06-01

    SAFIR (Surveillance et d'Alerte Foudre par Interferometrie Radioelectrique) has been installed and operated in Japan since June 12, 1991 as a cooperative project among Osaka University, Kansai Electric Power Co., Inc. (KEPCO), and the French manufacturer DIMENSIONS. The operational area covers the Northern Kinki, Wakasa, and Hokuriku Districts; Hokuriku District is well known for its winter thunderstorm activity. The performance was evaluated by computing the cross-correlation between meteorological radar echo patterns and the distribution pattern of lightning discharges detected by SAFIR. We obtained high cross-correlation coefficients and concluded that SAFIR locations have statistically high accuracy. We also present a case study of a lightning strike recorded by KEPCO to evaluate the usefulness of SAFIR warnings.

  20. Using Ratio Analysis to Evaluate Financial Performance.

    ERIC Educational Resources Information Center

    Minter, John; And Others

    1982-01-01

    The ways in which ratio analysis can help in long-range planning, budgeting, and asset management to strengthen financial performance and help avoid financial difficulties are explained. Types of ratios considered include balance sheet ratios, net operating ratios, and contribution and demand ratios. (MSE)

  1. Performance evaluation of a liquid solar collector

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Report describes thermal performance and structural-load tests on commercial single glazed flat-plate solar collector with gross area of 63.5 sq ft that uses water as heat-transfer medium. Report documents test instrumentation and procedures and presents data as tables and graphs. Results are analyzed by standard data-reduction methods.

  2. Game Performance Evaluation in Male Goalball Players

    PubMed Central

    Molik, Bartosz; Morgulec-Adamowicz, Natalia; Kosmol, Andrzej; Perkowski, Krzysztof; Bednarczuk, Grzegorz; Skowroński, Waldemar; Gomez, Miguel Angel; Koc, Krzysztof; Rutkowska, Izabela; Szyman, Robert J

    2015-01-01

    Goalball is a Paralympic sport exclusively for athletes who are visually impaired and blind. The aims of this study were twofold: to describe game performance of elite male goalball players based upon the degree of visual impairment, and to determine if game performance was related to anthropometric characteristics of elite male goalball players. The study sample consisted of 44 male goalball athletes. A total of 38 games were recorded during the Summer Paralympic Games in London 2012. Observations were reported using the Game Efficiency Sheet for Goalball. Additional anthropometric measurements included body mass (kg), body height (cm), the arm span (cm) and length of the body in the defensive position (cm). The results differentiating both groups showed that the players with total blindness obtained higher means than the players with visual impairment for game indicators such as the sum of defense (p = 0.03) and the sum of good defense (p = 0.04). The players with visual impairment obtained higher results than those with total blindness for attack efficiency (p = 0.04), the sum of penalty defenses (p = 0.01), and fouls (p = 0.01). The study showed that athletes with blindness demonstrated higher game performance in defence. However, athletes with visual impairment presented higher efficiency in offensive actions. The analyses confirmed that body mass, body height, the arm span and length of the body in the defensive position did not differentiate players’ performance at the elite level. PMID:26834872

  3. Reduced short term memory in congenital adrenal hyperplasia (CAH) and its relationship to spatial and quantitative performance.

    PubMed

    Collaer, Marcia L; Hindmarsh, Peter C; Pasterski, Vickie; Fane, Briony A; Hines, Melissa

    2016-02-01

    Girls and women with classical congenital adrenal hyperplasia (CAH) experience elevated androgens prenatally and show increased male-typical development for certain behaviors. Further, individuals with CAH receive glucocorticoid (GC) treatment postnatally, and this GC treatment could have negative cognitive consequences. We investigated two alternative hypotheses, that: (a) early androgen exposure in females with CAH masculinizes (improves) spatial perception and quantitative abilities at which males typically outperform females, or (b) CAH is associated with performance decrements in these domains, perhaps due to reduced short-term-memory (STM). Adolescent and adult individuals with CAH (40 female and 29 male) were compared with relative controls (29 female and 30 male) on spatial perception and quantitative abilities as well as on Digit Span (DS) to assess STM and on Vocabulary to assess general intelligence. Females with CAH did not perform better (more male-typical) on spatial perception or quantitative abilities than control females, failing to support the hypothesis of cognitive masculinization. Rather, in the sample as a whole individuals with CAH scored lower on spatial perception (p ≤ .009), a quantitative composite (p ≤ .036), and DS (p ≤ .001), despite no differences in general intelligence. Separate analyses of adolescent and adult participants suggested the spatial and quantitative effects might be present only in adult patients with CAH; however, reduced DS performance was found in patients with CAH regardless of age group. Separate regression analyses showed that DS predicted both spatial perception and quantitative performance (both p ≤ .001), when age, sex, and diagnosis status were controlled. Thus, reduced STM in CAH patients versus controls may have more general cognitive consequences, potentially reducing spatial perception and quantitative skills. Although hyponatremia or other aspects of salt-wasting crises or additional hormone

  4. A quantitative metrology for performance characterization of breast tomosynthesis systems based on an anthropomorphic phantom

    NASA Astrophysics Data System (ADS)

    Ikejimba, Lynda; Chen, Yicheng; Oberhofer, Nadia; Kiarashi, Nooshin; Lo, Joseph Y.; Samei, Ehsan

    2015-03-01

    Purpose: Common methods for assessing the image quality of digital breast tomosynthesis (DBT) devices currently rely on simplified or otherwise unrealistic phantoms, which use inserts in a uniform background and gauge performance by a subjective evaluation of insert visibility. This study proposes a different methodology for assessing system performance using a three-dimensional, clinically-informed anthropomorphic breast phantom. Methods: System performance is assessed by imaging the phantom and computationally characterizing the resultant images in terms of several new metrics: a contrast index (reflecting the local difference between adipose and glandular material), a contrast-to-noise ratio index (reflecting contrast against local background noise), and a nonuniformity index (reflecting contributions of noise and artifacts within uniform adipose regions). Indices were measured at ROI sizes of 10 mm and 37 mm. The method was evaluated at a fixed dose of 1.5 mGy AGD. Results: Results indicated notable differences between systems. At 10 mm, vendor A had the highest contrast index, followed by B and C, in that order. The performance ranking was identical at the larger ROI size. The nonuniformity index similarly exhibited system dependencies correlated with the visual appearance of clutter from out-of-plane artifacts: vendor A had the greatest NI at both ROI sizes, B the second greatest, and C the least. Conclusions: The findings illustrate that the anthropomorphic phantom can be used as a quality-control tool, with results intended to be more reflective of the clinical performance of breast tomosynthesis systems from multiple manufacturers.
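Hedged sketches of the three metrics named above, computed as ROI statistics over glandular and adipose regions; the exact definitions are assumptions for illustration, not the authors' formulas, and the ROIs are synthetic noise fields.

```python
import numpy as np

# Assumed metric definitions, for illustration only:
def contrast_index(gland_roi, adipose_roi):
    """Local adipose/glandular difference relative to adipose signal."""
    return (adipose_roi.mean() - gland_roi.mean()) / adipose_roi.mean()

def cnr_index(gland_roi, adipose_roi):
    """Contrast against local background noise."""
    return abs(adipose_roi.mean() - gland_roi.mean()) / adipose_roi.std()

def nonuniformity_index(adipose_roi):
    """Noise/artifact contribution within a nominally uniform region."""
    return adipose_roi.std() / adipose_roi.mean()

# Synthetic ROIs standing in for phantom regions:
rng = np.random.default_rng(0)
adipose = rng.normal(100.0, 5.0, size=(37, 37))
gland = rng.normal(80.0, 5.0, size=(37, 37))
ci = contrast_index(gland, adipose)
cnr = cnr_index(gland, adipose)
ni = nonuniformity_index(adipose)
```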

  5. Evaluating Suit Fit Using Performance Degradation

    NASA Technical Reports Server (NTRS)

    Margerum, Sarah E.; Cowley, Matthew; Harvill, Lauren; Benson, Elizabeth; Rajulu, Sudhakar

    2012-01-01

    The Mark III planetary technology demonstrator space suit can be tailored to an individual by swapping the modular components of the suit, such as the arms, legs, and gloves, as well as adding or removing sizing inserts in key areas. A method was sought to identify the transition from an ideal suit fit to a bad fit and how to quantify this breakdown using a metric of mobility-based human performance data. To this end, the degradation of the range of motion of the elbow and wrist of the suit as a function of suit sizing modifications was investigated to attempt to improve suit fit. The sizing range tested spanned optimal and poor fit and was adjusted incrementally in order to compare each joint angle across five different sizing configurations. Suited range of motion data were collected using a motion capture system for nine isolated and functional tasks utilizing the elbow and wrist joints. A total of four subjects were tested with motions involving both arms simultaneously as well as the right arm by itself. Findings indicated that no single joint drives the performance of the arm as a function of suit size; instead it is based on the interaction of multiple joints along a limb. To determine a size adjustment range where an individual can operate the suit at an acceptable level, a performance detriment limit was set. This user-selected limit reveals the task-dependent tolerance of the suit fit around optimal size. For example, the isolated joint motion indicated that the suit can deviate from optimal by as little as -0.6 in to -2.6 in before experiencing a 10% performance drop in the wrist or elbow joint. The study identified a preliminary method to quantify the impact of size on performance and developed a new way to gauge tolerances around optimal size.

  6. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need for a study assessing the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks for improving business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process-performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. 
The research also includes a Monte Carlo simulation optimization

  7. Investigation of the performance characteristics of a plasma synthetic jet actuator based on a quantitative Schlieren method

    NASA Astrophysics Data System (ADS)

    Zong, Hao-hua; Wu, Yun; Song, Hui-min; Jia, Min; Liang, Hua; Li, Ying-hong; Zhang, Zhi-bo

    2016-05-01

    A quantitative Schlieren method is developed to calculate the density field of axisymmetric flows. With this method, the flow field structures of plasma synthetic jets are analysed in detail. Major performance parameters, including the maximum density increase behind the shock wave, the expelled mass per pulse, and the impulse, are obtained to evaluate the intensity of the shock wave and the jet. A high-density but low-velocity jet issues out of the cavity after the precursor shock wave, with a vortex ring at the wave front. The vortex ring gradually lags behind the center jet during propagation, and its profile resembles a pair of kidneys in shape. After the jet terminates, the vortex ring breaks down and the whole density field separates into two regions. In one period, the jet front velocity first increases and then decreases, with a maximum value of 270 m s⁻¹. The precursor shock wave velocity decays quickly from 370 m s⁻¹ to 340 m s⁻¹ in the first 50 μs. The variation in the maximum density rise behind the precursor shock wave is similar to that of the jet front velocity. The averaged exit density drops sharply at around 50 μs and then gradually rises. The maximum mass flow rate is about 0.35 g s⁻¹, and the total expelled mass in one period amounts to 26% of the initial cavity gas mass. The impulse produced in the jet stage is estimated to be 5 μN·s. The quantitative Schlieren method developed here can also be used in research on other compressible axisymmetric flows.
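The expelled mass per pulse reported above follows from integrating the exit mass-flow-rate history over one period; a minimal sketch using the trapezoidal rule, with an illustrative triangular pulse rather than the paper's measured waveform:

```python
# Sketch: integrate an exit mass-flow-rate history (g/s vs. s) to get the
# expelled mass per pulse. The waveform below is illustrative only.
def expelled_mass(times_s, mass_flow_g_per_s):
    total = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        total += 0.5 * (mass_flow_g_per_s[i] + mass_flow_g_per_s[i - 1]) * dt
    return total  # grams

# Triangular pulse peaking at the quoted 0.35 g/s, lasting 200 microseconds:
t = [0.0, 100e-6, 200e-6]
mdot = [0.0, 0.35, 0.0]
m = expelled_mass(t, mdot)  # 0.5 * 0.35 * 200e-6 = 3.5e-5 g
```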

  8. Quantitative profiling of perfluoroalkyl substances by ultrahigh-performance liquid chromatography and hybrid quadrupole time-of-flight mass spectrometry.

    PubMed

    Picó, Yolanda; Farré, Marinella; Barceló, Damià

    2015-06-01

    The accurate determination of perfluoroalkyl substances (PFSAs) in water, sediment, fish, meat, and human milk was achieved by ultrahigh-performance liquid chromatography-quadrupole time-of-flight mass spectrometry (UHPLC-QqTOF-MS) with an ABSciex Triple TOF®. A group of 21 PFSAs was selected as targets to evaluate the quantitative capabilities of the method. Full-scan MS acquisition data allow quantification at relevant low levels (0.1-50 ng L⁻¹ in water, 0.05-2 ng g⁻¹ in sediments, 0.01-5 ng g⁻¹ in fish and meat, and 0.005-2 ng g⁻¹ in human milk, depending on the compound). Automatic information-dependent acquisition product ion mass spectrometry (IDA-MS/MS) confirms the identity even of those compounds that presented only one product ion. The preparation of a homemade database using the extracted ion chromatogram (XIC) Manager of the software, based upon retention time, accurate mass, isotopic pattern, and MS/MS library searching, achieves successful identification not only of PFSAs but also of some pharmaceuticals, such as acetaminophen, ibuprofen, salicylic acid, and gemfibrozil. Mean recoveries and relative standard deviations (RSD) were 67-99% (9-16% RSD) for water, 62-103% (8-18% RSD) for sediment, 60-95% (8-17% RSD) for fish, 64-95% (8-15% RSD) for meat, and 63-95% (8-16% RSD) for human milk. The quantitative data obtained for 60 samples by UHPLC-QqTOF-MS agree with those obtained by LC-MS/MS with a triple quadrupole (QqQ).

  9. The use of a battery of tracking tests in the quantitative evaluation of neurological function

    NASA Technical Reports Server (NTRS)

    Repa, B. S.; Albers, J. W.; Potvin, A. R.; Tourtellotte, W. W.

    1972-01-01

    A tracking test battery has been applied in a drug trial designed to compare the efficacy of L-DOPA and amantadine to that of L-DOPA and placebo in the treatment of 28 patients with Parkinson's disease. The drug trial provided an ideal opportunity for objectively evaluating the usefulness of tracking tests in assessing changes in neurologic function. Evaluating changes in patient performance resulting from disease progression and controlled clinical trials is of great importance in establishing effective treatment programs.

  10. Quantitation of meloxicam in the plasma of koalas (Phascolarctos cinereus) by improved high performance liquid chromatography

    PubMed Central

    Kimble, Benjamin; Li, Kong Ming

    2013-01-01

    An improved method to determine meloxicam (MEL) concentrations in koala plasma using reversed phase high performance liquid chromatography equipped with a photo diode array detector was developed and validated. A plasma sample clean-up step was carried out with hydrophilic-lipophilic copolymer solid phase extraction cartridges. MEL was separated from an endogenous interference using an isocratic mobile phase [acetonitrile and 50 mM potassium phosphate buffer (pH 2.15), 45:55 (v:v)] on a Nova-Pak C18 4-µm (300 × 3.9 mm) column. Retention times for MEL and piroxicam were 8.03 and 5.56 min, respectively. Peak area ratios of MEL to the internal standard (IS) were used for regression analysis of the calibration curve, which was linear from 10 to 1,000 ng/mL (r² > 0.9998). Average absolute recovery rates were 91% and 96% for MEL and the IS, respectively. This method had sufficient sensitivity (lower quantitation limit of 10 ng/mL), precision, accuracy, and selectivity for routine analysis of MEL in koala plasma using 250-µL sample volumes. Our technique clearly resolved the MEL peak from the complex koala plasma matrix and accurately measured MEL concentrations in small plasma volumes. PMID:23388431
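
    Internal-standard calibration of the kind described above fits the MEL/IS peak-area ratio against nominal concentration and back-calculates unknowns from the fitted line. The sketch below shows the arithmetic with invented calibration points spanning the reported 10-1,000 ng/mL range; none of the numbers are the paper's data.

```python
# Illustrative internal-standard calibration: ordinary least squares on
# peak-area ratio vs. concentration, then back-calculation of an unknown.

def linfit(xs, ys):
    """Ordinary least-squares slope, intercept and r^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot
    return slope, intercept, r2

conc = [10, 50, 100, 250, 500, 1000]                # ng/mL (assumed standards)
ratio = [0.021, 0.103, 0.205, 0.512, 1.021, 2.043]  # MEL/IS area ratios (invented)

slope, intercept, r2 = linfit(conc, ratio)
unknown = (0.400 - intercept) / slope               # back-calculated conc., ng/mL
print(round(slope, 5), round(r2, 4), round(unknown, 1))
```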

  11. A Computational Approach for Functional Mapping of Quantitative Trait Loci That Regulate Thermal Performance Curves

    PubMed Central

    Yap, John Stephen; Wang, Chenguang; Wu, Rongling

    2007-01-01

    Whether and how the thermal reaction norm is under genetic control is fundamental to understanding the mechanistic basis of adaptation to novel thermal environments. However, the genetic study of the thermal reaction norm is difficult because it is often expressed as a continuous function or curve. Here we derive a statistical model for dissecting thermal performance curves into individual quantitative trait loci (QTL) with the aid of a genetic linkage map. The model is constructed within the maximum likelihood context and implemented with the EM algorithm. It integrates the biological principle of responses to temperature into a framework for genetic mapping through rigorous mathematical functions established to describe the pattern and shape of thermal reaction norms. The biological advantages of the model lie in the decomposition of the genetic causes for the thermal reaction norm into its biologically interpretable modes, such as hotter-colder, faster-slower and generalist-specialist, as well as the formulation of a series of hypotheses at the interface between genetic actions/interactions and temperature-dependent sensitivity. The model is also meritorious in statistics because the precision of parameter estimation and the power of QTL detection can be increased by modeling the mean-covariance structure with a small set of parameters. The results from simulation studies suggest that the model displays favorable statistical properties and can be robust in practical genetic applications. The model provides a conceptual platform for testing many ecologically relevant hypotheses regarding organismic adaptation within the Eco-Devo paradigm. PMID:17579725

  12. Quantitative evaluation of the memory bias effect in ROC studies with PET/CT

    NASA Astrophysics Data System (ADS)

    Kallergi, Maria; Pianou, Nicoletta; Georgakopoulos, Alexandros; Kafiri, Georgia; Pavlou, Spiros; Chatziioannou, Sofia

    2012-02-01

    PURPOSE. The purpose of the study was to evaluate the memory bias effect in ROC experiments with tomographic data and, specifically, in the evaluation of two different PET/CT protocols for the detection and diagnosis of recurrent thyroid cancer. MATERIALS AND METHODS. Two readers participated in an ROC experiment that evaluated tomographic images from 43 patients followed up for thyroid cancer recurrence. Readers evaluated first whole body PET/CT scans of the patients and then a combination of whole body and high-resolution head and neck scans of the same patients. The second set was read twice: once within 48 hours of the first set, and again at least a month later. The detection and diagnostic performances of the readers in the three reading sessions were assessed with the DBMMRMC and LABMRMC software using the area under the ROC curve as a performance index. Performances were also evaluated by comparing the number and size of the detected abnormal foci among the three readings. RESULTS. There was no performance difference between the first and second treatments. There were statistically significant differences between the first and third, and between the second and third treatments, showing that memory can seriously affect the outcome of ROC studies. CONCLUSION. Despite the fact that tomographic data involve numerous image slices per patient, the memory bias effect is present and substantial and should be carefully eliminated from analogous ROC experiments.
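
    The area under the ROC curve used as the performance index above can be computed nonparametrically from reader confidence ratings via the Mann-Whitney statistic. A minimal sketch with invented ratings (not the study's data):

```python
# Nonparametric AUC: probability that a randomly chosen diseased case is
# rated higher than a randomly chosen normal case, counting ties as 1/2.

def auc(pos, neg):
    """Mann-Whitney estimate of the area under the ROC curve."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

diseased = [4, 5, 3, 5, 4, 2]   # reader confidence ratings, abnormal cases
normal   = [1, 2, 2, 3, 1, 1]   # reader confidence ratings, normal cases
print(auc(diseased, normal))
```

    MRMC software such as that cited in the abstract additionally models reader and case variance components; this sketch covers only the single-reader AUC itself.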

  13. PERFORMANCE EVALUATION OF TYPE I MARINE SANITATION DEVICES

    EPA Science Inventory

    This performance test was designed to evaluate the effectiveness of two Type I Marine Sanitation Devices (MSDs): the Electro Scan Model EST 12, manufactured by Raritan Engineering Company, Inc., and the Thermopure-2, manufactured by Gross Mechanical Laboratories, Inc. Performance...

  14. Comparative evaluation of three commercial quantitative cytomegalovirus standards by use of digital and real-time PCR.

    PubMed

    Hayden, R T; Gu, Z; Sam, S S; Sun, Y; Tang, L; Pounds, S; Caliendo, A M

    2015-05-01

    The recent development of the 1st WHO International Standard for human cytomegalovirus (CMV) and the introduction of commercially produced secondary standards have raised hopes of improved agreement among laboratories performing quantitative PCR for CMV. However, data to evaluate the trueness and uniformity of secondary standards and the consistency of results achieved when these materials are run on various assays are lacking. Three concentrations of each of the three commercially prepared secondary CMV standards were tested in quadruplicate by three real-time and two digital PCR methods. The mean results were compared in a pairwise fashion with nominal values provided by each manufacturer. The agreement of results among all methods for each sample and for like concentrations of each standard was also assessed. The relationship between the nominal values of standards and the measured values varied, depending upon the assay used and the manufacturer of the standards, with the degree of bias ranging from +0.6 to -1.0 log10 IU/ml. The mean digital PCR result differed significantly among the secondary standards, as did the results of the real-time PCRs, particularly when plotted against nominal log10 IU values. Commercially available quantitative secondary CMV standards produce variable results when tested by different real-time and digital PCR assays, with various magnitudes of bias compared to nominal values. These findings suggest that the use of such materials may not achieve the intended uniformity among laboratories measuring CMV viral load, as envisioned by adaptation of the WHO standard.
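
    The bias figures quoted above are differences between measured and nominal values on the log10 IU/mL scale. The sketch below shows that computation with invented replicate results for a hypothetical standard; the numbers are not the study's measurements.

```python
# Illustrative log10 bias of a quantitative assay against a nominal
# standard value. All values are invented for illustration.
import math

def log10_bias(measured_iu_ml, nominal_iu_ml):
    """Bias in log10 IU/mL; positive means the assay over-quantifies."""
    return math.log10(measured_iu_ml) - math.log10(nominal_iu_ml)

nominal = 10_000.0                              # 4.0 log10 IU/mL (assumed)
replicates = [7200.0, 8100.0, 6800.0, 7500.0]   # invented assay results
mean_log = sum(math.log10(x) for x in replicates) / len(replicates)
bias = mean_log - math.log10(nominal)
print(round(bias, 2))
```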

  15. Separation and quantitation of metallothioneins by high-performance liquid chromatography coupled with atomic absorption spectrophotometry

    SciTech Connect

    Lehman, L.D.; Klaassen, C.D.

    1986-03-01

    A rapid, reproducible, and sensitive high-performance liquid chromatography (HPLC) method for the determination of the concentrations of metallothionein-I (MT-I) and metallothionein-II (MT-II) in rat liver has been developed. Metallothioneins (MTs) were separated and quantitated by anion-exchange high-performance liquid chromatography coupled with atomic absorption spectrophotometry (AAS). Purified rat liver MT-I and MT-II, used as standards for developing the method, were easily resolved, eluting at 7.5 and 10.4 min, respectively. To establish standard curves, protein concentrations of solutions of the purified MTs were determined by the Kjeldahl method for the determination of nitrogen, after which the standards were saturated with Cd (final concentration of 50 ppm Cd). Rat liver cytosols obtained from untreated and Cd- or Zn-treated rats were prepared for HPLC-AAS analysis by saturation with Cd (50 ppm Cd) followed by heat denaturation (placing in a boiling water bath for 1 min). Based on the method of standard additions, recovery of MTs exceeded 95% and repeated injection of a sample yielded a coefficient of variation of approximately 2%. A detection limit of 5 μg MT/g liver was established for the method. Only MT-II was detected in untreated rats, whereas following exposure to Cd or Zn, both forms of MTs were detected. Concentrations of total MTs in liver of untreated and Cd- or Zn-treated rats were also determined by the Cd/hemoglobin radioassay (which fails to distinguish MT-I from MT-II) and indicated that results obtained with the HPLC-AAS method compared favorably to the Cd/hemoglobin radioassay.

  16. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program

    PubMed Central

    Goodman, Melody S.; Si, Xuemei; Stafford, Jewel D.; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2016-01-01

    Background The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). Objectives We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure that learning objectives were met through mutually beneficial CBPR approaches. Methods A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session, and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyzed results from the quantitative questions on the assessments, pre- and post-tests, and evaluations. Results CARES fellows' knowledge increased at follow-up (75% of questions answered correctly on average) compared with the baseline assessment (38% answered correctly on average); post-test scores were higher than pre-test scores in 9 of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. Conclusions The CARES fellows training program succeeded in satisfying participants and in increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community–academic research partnerships. PMID:22982849

  17. Quality consistency evaluation of Melissa officinalis L. commercial herbs by HPLC fingerprint and quantitation of selected phenolic acids.

    PubMed

    Arceusz, Agnieszka; Wesolowski, Marek

    2013-09-01

    To evaluate the quality consistency of commercial medicinal herbs, a simple and reliable HPLC method with a UV-vis detector was developed, both for fingerprint analysis and for quantitation of some pharmacologically active constituents (marker compounds). Melissa officinalis L. (lemon balm) was chosen for this study because it is widely used as an aromatic, culinary and medicinal remedy. About fifty peaks were found in each chromatogram of a lemon balm extract, including twelve satisfactorily resolved characteristic peaks. A reference chromatographic fingerprint for the studied medicinal herb was calculated using Matlab 9.1 software by analysing all 19 lemon balm samples obtained from 12 Polish manufacturers. The similarity values and the results of principal component analysis revealed that all the samples were highly correlated with the reference fingerprint and could be accurately classified in relation to their quality consistency. Next, a quantitation of selected phenolic acids in the studied samples was performed. The results showed that the levels of phenolic acids, i.e. gallic, chlorogenic, syringic, caffeic, ferulic and rosmarinic, were as follows (mg/g of dry weight): 0.001-0.067, 0.010-0.333, 0.007-0.553, 0.047-0.705, 0.006-1.589 and 0.158-48.608, respectively. Statistical analysis indicated that rosmarinic acid occurs in M. officinalis at the highest level, whereas gallic acid at the lowest. A detailed inspection of these data also revealed that reference chromatographic fingerprints combined with quantitation of pharmacologically active constituents of the plant could be used as an efficient strategy for monitoring the quality consistency of lemon balm.
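
    Fingerprint similarity of the kind described above is commonly scored as the correlation coefficient between a sample chromatogram and the reference fingerprint, each represented as a vector of peak areas. A minimal sketch with invented vectors (not the study's chromatograms):

```python
# Pearson correlation between a reference fingerprint and one batch,
# both given as peak-area vectors. Values are invented examples.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

reference = [1.0, 4.2, 0.8, 9.5, 2.1, 0.3]   # mean peak areas (invented)
sample    = [1.1, 4.0, 0.9, 9.1, 2.3, 0.2]   # one batch (invented)
print(round(pearson(reference, sample), 3))
```

    A batch whose similarity falls below a chosen threshold (often 0.90-0.95 in fingerprint work) would be flagged as inconsistent.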

  18. Evaluating the efficacy of continuous quantitative characters for reconstructing the phylogeny of a morphologically homogeneous spider taxon (Araneae, Mygalomorphae, Antrodiaetidae, Antrodiaetus).

    PubMed

    Hendrixson, Brent E; Bond, Jason E

    2009-10-01

    The use of continuous quantitative characters for phylogenetic analyses has long been contentious in the systematics literature. Recent studies argue for and against their use, but there have been relatively few attempts to evaluate whether these characters provide an accurate estimate of phylogeny, despite the fact that a number of methods have been developed to analyze these types of data for phylogenetic inference. A tree topology will be produced for any given methodology and set of characters, but little can be concluded with regard to the accuracy of the phylogenetic signal without an independent evaluation of those characters. We assess the performance of continuous quantitative characters for the mygalomorph spider genus Antrodiaetus, a group that is morphologically homogeneous and one for which few discrete (morphological) characters have been observed. The phylogenetic signal contained in continuous quantitative characters is compared to an independently derived phylogeny inferred on the basis of multiple nuclear and mitochondrial gene loci. Tree topology randomizations, regression techniques, and topological tests all demonstrate that continuous quantitative characters in Antrodiaetus conflict with the phylogenetic signal contained in the gene trees. Our results show that the use of continuous quantitative characters for phylogenetic reconstruction may be inappropriate for reconstructing Antrodiaetus phylogeny and indicate that due caution should be exercised before employing this character type in the absence of other independently derived sources of characters.

  19. Performance Evaluation of Vinyl Replacement Windows.

    DTIC Science & Technology

    1980-07-15

    [Scanned-document OCR fragments; recoverable details: report by Philip B. Shepherd, Johns-Manville Sales Corporation, Research & Development Center, Ken-Caryl Ranch, Denver, Colorado.]

  20. Brookfield Homes Passive House Performance Evaluation

    SciTech Connect

    Herk, A.; Poerschke, A.; Beach, R.

    2016-02-01

    In 2012-2013, IBACOS worked with a builder, Brookfield Homes in Denver, Colorado, to design and construct a Passive House certified model home. IBACOS used several modeling programs and calculation methods to complete the final design package along with Brookfield's architect, KGA Studio. This design package included upgrades to the thermal enclosure, basement insulation, windows, and heating, ventilation, and air conditioning. Short-term performance testing in the Passive House was done during and after construction.

  1. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2012-01-01

    This presentation is part of a panel discussion on Evaluating Models of Human Performance. The purpose of this panel is to discuss the increasing use of models in the world today and, specifically, to focus on how to describe and evaluate models of human performance. My presentation will focus on generating distributions of performance and on evaluating different strategies for humans performing tasks with mixed-initiative (human-automation) systems. I will also discuss issues with how to provide human performance modeling data to support decisions on acceptability and tradeoffs in the design of safety-critical systems. I will conclude with challenges for the future.

  2. Performance Evaluation Method for Dissimilar Aircraft Designs

    NASA Technical Reports Server (NTRS)

    Walker, H. J.

    1979-01-01

    A rationale is presented for using the square of the wingspan rather than the wing reference area as a basis for nondimensional comparisons of the aerodynamic and performance characteristics of aircraft that differ substantially in planform and loading. Working relationships are developed and illustrated through application to several categories of aircraft covering a range of Mach numbers from 0.60 to 2.00. For each application, direct comparisons of drag polars, lift-to-drag ratios, and maneuverability are shown for both nondimensional systems. The inaccuracies that may arise in the determination of aerodynamic efficiency based on reference area are noted. Span loading is introduced independently in comparing the combined effects of loading and aerodynamic efficiency on overall performance. Performance comparisons are made for the NACA research aircraft, lifting bodies, century-series fighter aircraft, F-111A aircraft with conventional and supercritical wings, and a group of supersonic aircraft including the B-58 and XB-70 bomber aircraft. An idealized configuration is included in each category to serve as a standard for comparing overall efficiency.
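
    The span-squared rationale above can be illustrated with a toy calculation: the same drag force nondimensionalized once by wing reference area S and once by b². The aircraft numbers below are invented for illustration, not taken from the report.

```python
# Comparing area-based and span-squared nondimensionalization of drag.
# All values are assumed placeholders.

def coeff(force, q, ref):
    """Nondimensional coefficient: force / (dynamic pressure * reference)."""
    return force / (q * ref)

q = 10_000.0        # dynamic pressure, Pa (assumed)
drag = 25_000.0     # drag force, N (assumed)
S = 50.0            # wing reference area, m^2 (assumed)
b = 18.0            # wingspan, m (assumed)

cd_area = coeff(drag, q, S)        # conventional C_D based on S
cd_span = coeff(drag, q, b**2)     # span-based coefficient, D / (q b^2)
print(round(cd_area, 4), round(cd_span, 5))
```

    Because b² is unambiguous for any planform while S depends on a reference-area convention, span-based coefficients make aircraft with very different planforms directly comparable, which is the report's point.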

  3. Evaluation of Improvements to Brayton Cycle Performance.

    DTIC Science & Technology

    1986-05-29

    [Scanned-document OCR fragments of the report's reference list, citing Report No. MTR-7274, The MITRE Corporation, July 1976; Boyce, M.P., Vyas, Y.K., and Trevillion, W.L., "The External Combustion Steam Injected Gas Turbine"; Keenan, J.H., Keyes, F.G., Hill, P.G., and Moore, J.G., Steam Tables, John Wiley & Sons, New York, 1978; and Fraas, A.P., Engineering Evaluation of Energy Systems, McGraw-Hill, New York, 1982.]

  4. Evaluation of Mobile Phone Performance for Near-Infrared Fluorescence Imaging.

    PubMed

    Ghassemi, Pejhman; Wang, Bohan; Wang, Jianting; Wang, Quanzeng; Chen, Yu; Pfefer, T Joshua

    2016-08-19

    We have investigated the potential for contrast-enhanced near-infrared fluorescence imaging of tissue on a mobile phone platform. CCD- and phone-based cameras were used to image molded and 3D-printed tissue phantoms, and an ex vivo animal model. Quantitative and qualitative evaluations of image quality demonstrate the viability of this approach and elucidate variations in performance due to wavelength, pixel color and image processing.

  5. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored, as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two-dimensional (or second-order) Monte Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org).
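
    The core idea of a two-dimensional (second-order) Monte Carlo simulation, as implemented by "mc2d", is a nested loop: the outer dimension samples uncertain parameters, the inner dimension samples variability, so the two sources of randomness stay separated. The sketch below illustrates that structure in Python rather than R; the dose-response model and every distribution and parameter value are invented for illustration.

```python
# Two-dimensional Monte Carlo sketch: outer loop = uncertainty about a
# dose-response parameter, inner loop = serving-to-serving variability in
# exposure. All models and numbers are invented placeholders.
import random
import statistics

random.seed(1)

N_UNC, N_VAR = 200, 1000

risk_estimates = []
for _ in range(N_UNC):
    # Uncertainty dimension: imperfect knowledge of the dose-response slope.
    r = random.uniform(0.001, 0.01)
    # Variability dimension: exposure differs from serving to serving.
    risks = []
    for _ in range(N_VAR):
        dose = random.lognormvariate(0.0, 1.0)      # invented exposure model
        risks.append(1.0 - (1.0 - r) ** dose)       # exponential-type model
    risk_estimates.append(statistics.mean(risks))   # mean risk at fixed r

# The spread of risk_estimates now reflects uncertainty only; variability
# was averaged out within each outer iteration.
ordered = sorted(risk_estimates)
lo, hi = ordered[int(0.025 * N_UNC)], ordered[int(0.975 * N_UNC)]
print(round(lo, 4), round(hi, 4))
```

    Reporting the (lo, hi) interval as an uncertainty band around the mean risk is the kind of separated output the package is designed to produce.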

  6. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current research topic worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains, isolated between 2011 and 2012 from urine samples of patients at the clinics of Dr. Antoni Jurasz University Hospital No. 1 in Bydgoszcz, were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains lacked this ability. P. mirabilis rods were shown to form biofilm on the surfaces of both biomaterials tested, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  8. Qualitative and quantitative evaluation of derivatization reagents for different types of protein-bound carbonyl groups.

    PubMed

    Bollineni, Ravi Chand; Fedorova, Maria; Hoffmann, Ralf

    2013-09-07

    Mass spectrometry (MS) of 'carbonylated proteins' often involves derivatization of reactive carbonyl groups to facilitate their enrichment, identification and quantification. Among the many reported reagents, 2,4-dinitrophenylhydrazine (DNPH), biotin hydrazide (BHZ) and O-(biotinylcarbazoylmethyl) hydroxylamine (ARP) are the most frequently used. Despite their common use in carbonylation research, their reactivity towards protein-bound carbonyls has not, to the best of our knowledge, been quantitatively evaluated in detail. Thus we studied the reactivity and specificity of these reagents towards different classes of reactive carbonyl groups (e.g. aldehydes, ketones and lactams), each represented by a synthetic peptide carrying an accordingly modified residue. All three tagging reagents were selective for aliphatic aldehydes and ketones. Lactams and carbonyl-containing tryptophan oxidation products, however, were labelled only at low levels or not at all. Whereas DNPH derivatization was efficient under the published standard conditions, the derivatization conditions for BHZ and ARP had to be altered. Acidic conditions provided quantitative labelling yields for ARP. Peptides derivatized with DNPH, BHZ and ARP fragmented efficiently in tandem mass spectrometry when the experimental conditions were chosen carefully for each reagent. Importantly, the tested carbonylated peptides did not cross-react with amino groups in other proteins present during sample preparation or enzymatic digestion. Thus, it appears favourable to digest proteins first and then derivatize the reactive carbonyl groups more efficiently at the peptide level under acidic conditions. The carbonylated model peptides used in this study might be valid internal standards for carbonylation proteomics.

  9. Quantitative and Qualitative Evaluation of Iranian Researchers’ Scientific Production in Dentistry Subfields

    PubMed Central

    Yaminfirooz, Mousa; Motallebnejad, Mina; Gholinia, Hemmat; Esbakian, Somayeh

    2015-01-01

    Background: As in other fields of medicine, scientific production in the field of dentistry has a significant place. This study aimed at quantitatively and qualitatively evaluating Iranian researchers' scientific output in the field of dentistry and determining their contribution in each of the dentistry subfields and branches. Methods: This research was a scientometric study that applied quantitative and qualitative indices of the Web of Science (WoS). The research population consisted of 927 indexed documents published under the name of Iran in the time span of 1993-2012, which were extracted from WoS on 10 March 2013. The Mann-Whitney test and Pearson correlation coefficient were used for data analyses in SPSS 19. Results: 777 (83.73%) of the indexed items of all scientific output in WoS were scientific articles. The highest growth rate of scientific production, 90%, belonged to the endodontics subfield. The correlation coefficient test showed a significant positive relationship between the number of documents and their publication age (P < 0.0001). There was a significant difference between the mean number of published articles in the first ten-year period (1993-2003) and that of the second one (2004-2013), in favor of the latter (P = 0.001). Conclusions: The distribution frequencies of scientific production in the various subfields of dentistry were very different. The infrastructure needs reinforcement for more balanced scientific production in the field and its related subfields. PMID:26635439

  10. Quantitative evaluation of hidden defects in cast iron components using ultrasound activated lock-in vibrothermography

    SciTech Connect

    Montanini, R.; Freni, F.; Rossi, G. L.

    2012-09-15

    This paper reports one of the first experimental results on the application of ultrasound activated lock-in vibrothermography for quantitative assessment of buried flaws in complex cast parts. The use of amplitude modulated ultrasonic heat generation allowed selective response of defective areas within the part, as the defect itself is turned into a local thermal wave emitter. Quantitative evaluation of hidden damage was accomplished by estimating independently both the area and the depth extension of the buried flaws, while x-ray 3D computed tomography was used as the reference for sizing accuracy assessment. To retrieve the flaw area, a simple yet effective histogram-based phase-image segmentation algorithm with automatic pixel classification has been developed. A clear correlation was found between the thermal (phase) signature measured by the infrared camera on the target surface and the actual mean cross-section area of the flaw. Due to the very fast cycle time (<30 s/part), the method could potentially be applied for 100% quality control of casting components.
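
    One common way to implement histogram-based segmentation with automatic pixel classification, in the spirit of the approach described above, is Otsu's threshold: defect pixels are those above the threshold, and the flaw area is their count times the calibrated pixel area. The tiny synthetic "phase image" and the spatial calibration below are invented for illustration, not thermographic data.

```python
# Otsu thresholding of a synthetic phase image and flaw-area estimation.
# The image and the pixel calibration are invented placeholders.

def otsu_threshold(pixels, levels=256):
    """Otsu's method: threshold maximizing between-class variance."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w0, sum0 = 0, -1.0, 0, 0.0
    for t in range(levels):
        w0 += hist[t]
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        m0 = sum0 / w0                       # mean of class below threshold
        m1 = (sum_all - sum0) / (total - w0)  # mean of class above threshold
        var = w0 * (total - w0) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Synthetic 6x6 "phase image": background ~ 40, defect patch ~ 200.
image = [
    [41, 40, 39, 40, 42, 40],
    [40, 38, 40, 41, 40, 39],
    [40, 41, 199, 201, 40, 40],
    [39, 40, 200, 198, 41, 40],
    [40, 42, 40, 40, 40, 41],
    [41, 40, 39, 40, 38, 40],
]
flat = [p for row in image for p in row]
t = otsu_threshold(flat)
defect_pixels = sum(p > t for p in flat)
pixel_area_mm2 = 0.25                 # assumed spatial calibration, mm^2/pixel
print(t, defect_pixels, defect_pixels * pixel_area_mm2)
```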

  11. Chairside quantitative immunochromatographic evaluation of salivary cotinine and its correlation with chronic periodontitis

    PubMed Central

    Surya, Chamarthi; Swamy, Devulapally Narasimha; Chakrapani, Swarna; Kumar, Surapaneni Sunil

    2012-01-01

    Background: Cigarette smoking is an established and modifiable risk factor for periodontitis, and periodontitis appears to be dose-dependent on smoking. The purpose of this study was to assess a reliable marker of tobacco smoke exposure (salivary cotinine) chairside and to confirm the quantitative association between smoking and chronic periodontitis. Materials and Methods: Saliva samples from 80 males, aged 30-60 years, with chronic periodontitis were evaluated chairside using NicAlert™ cotinine test strips (NCTS). Patients were divided into two groups: A (cotinine negative) and B (cotinine positive). Plaque index (PI), gingival index (GI), gingival bleeding index (GBI), probing pocket depth (PPD), clinical attachment level (CAL), and gingival recession (GR) were compared between the two groups and among the subjects of group B. Results: Comparison showed that the severity of PPD (P<0.001), CAL (P<0.001), and GR (P<0.001) was greater in group B than in group A. The severity of all periodontal parameters increased with increased salivary cotinine among the subjects in group B. Conclusion: A direct quantitative association can be established between salivary cotinine and the severity of periodontitis. Immunochromatography-based cotinine test strips are a relatively easy method for chairside quantification of salivary cotinine. Immediate and personalized feedback from a chairside test can improve compliance and quit rates and ease the reinforcement of smoking cessation. PMID:23492903

  12. The performance environment of the England youth soccer teams: a quantitative investigation.

    PubMed

    Pain, Matthew A; Harwood, Chris G

    2008-09-01

    We examined the performance environment of the England youth soccer teams. Using a conceptually grounded questionnaire developed from the themes identified by Pain and Harwood (2007), 82 players and 23 national coaches and support staff were surveyed directly following international tournaments regarding the factors that positively and negatively influenced performance. The survey enabled data to be captured regarding both the extent and the magnitude of the impact of the factors comprising the performance environment. Overall, team and social factors were generally perceived to have the greatest positive impact, with players and staff showing high levels of consensus in their evaluations. Team leadership and strong team cohesion were identified by both groups as having the greatest positive impact. Far fewer variables were perceived to have a negative impact on performance, especially for players. The main negatives common to both groups were players losing composure during games, player boredom, and a lack of available activities in the hotel. The major findings support those of Pain and Harwood (2007) and, by drawing on a larger sample, corroborate and strengthen the generalizability of those findings.

  13. Phased array performance evaluation with photoelastic visualization

    SciTech Connect

    Ginzel, Robert; Dao, Gavin

    2014-02-18

    New instrumentation and a widening range of phased array transducer options are affording the industry a greater potential. Visualization of the complex wave components using the photoelastic system can greatly enhance understanding of the generated signals. Diffraction, mode conversion and wave front interaction, together with beam forming for linear, sectorial and matrix arrays, will be viewed using the photoelastic system. Beam focus and steering performance will be shown with a range of embedded and surface targets within glass samples. This paper will present principles and sound field images using this visualization system.

  14. ATAMM enhancement and multiprocessing performance evaluation

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.

    1994-01-01

    The algorithm-to-architecture mapping model (ATAMM) is a Petri-net-based model which provides a strategy for periodic execution of a class of real-time algorithms on multicomputer dataflow architectures. The execution of large-grained, decision-free algorithms on homogeneous processing elements is studied. The ATAMM provides an analytical basis for calculating performance bounds on throughput characteristics. Extension of the ATAMM as a strategy for cyclo-static scheduling provides for a truly distributed ATAMM multicomputer operating system. An ATAMM testbed consisting of a centralized graph manager and three processors is described, using embedded firmware on 68HC11 microcontrollers.

  15. Quantitative evaluation of statistical errors in small-angle X-ray scattering measurements

    PubMed Central

    Sedlak, Steffen M.; Bruetzel, Linda K.; Lipfert, Jan

    2017-01-01

    A new model is proposed for the measurement errors incurred in typical small-angle X-ray scattering (SAXS) experiments, which takes into account the setup geometry and physics of the measurement process. The model accurately captures the experimentally determined errors from a large range of synchrotron and in-house anode-based measurements. Its most general formulation gives for the variance of the buffer-subtracted SAXS intensity σ²(q) = [I(q) + const.]/(kq), where I(q) is the scattering intensity as a function of the momentum transfer q; k and const. are fitting parameters that are characteristic of the experimental setup. The model gives a concrete procedure for calculating realistic measurement errors for simulated SAXS profiles. In addition, the results provide guidelines for optimizing SAXS measurements, which are in line with established procedures for SAXS experiments, and enable a quantitative evaluation of measurement errors. PMID:28381982
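The stated variance model translates directly into code for attaching realistic errors to a simulated profile; a minimal sketch (the toy intensity profile and the values of k and const. are illustrative assumptions, not fitted setup parameters):

```python
import numpy as np

def model_sigma(q, I, k, const):
    """Measurement error from the proposed model:
    sigma(q) = sqrt((I(q) + const) / (k * q)),
    i.e. variance sigma^2(q) = [I(q) + const] / (k q)."""
    return np.sqrt((I + const) / (k * q))

# toy scattering profile; k and const would normally be fitted per setup
q = np.linspace(0.01, 0.5, 200)
I = 1e3 * np.exp(-(30.0 * q) ** 2 / 3.0)     # Guinier-like toy intensity
sigma = model_sigma(q, I, k=5e4, const=50.0)
noisy = I + np.random.default_rng(1).normal(0.0, sigma)
```

Note that the error is largest at low q, where both I(q) and the 1/q factor are large, consistent with the form of the model.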

  16. Evaluation of postmortem bacterial migration using culturing and real-time quantitative PCR.

    PubMed

    Tuomisto, Sari; Karhunen, Pekka J; Vuento, Risto; Aittoniemi, Janne; Pessi, Tanja

    2013-07-01

    Postmortem bacteriology can be a valuable tool for evaluating deaths due to bacterial infection or for researching the involvement of bacteria in various diseases. In this study, time-dependent postmortem bacterial migration into the liver, mesenteric lymph node, pericardial fluid, and portal and peripheral veins was analyzed in 33 autopsy cases by bacterial culturing and real-time quantitative polymerase chain reaction (RT-qPCR). None of the cases had suffered or died from bacterial infection. According to culturing, pericardial fluid and liver were the most sterile samples up to 5 days postmortem; in these samples, multigrowth and staphylococci were rarely or never detected. RT-qPCR was more sensitive and showed higher bacterial positivity in all samples. Relative amounts of intestinal bacterial DNA (bifidobacteria, bacteroides, enterobacter, clostridia) increased with time. Sterility of blood samples was low throughout the studied time period (1-7 days). The best postmortem microbiological sampling sites were pericardial fluid and liver, up to 5 days after death.

  17. Quantitative MR evaluation of body composition in patients with Duchenne muscular dystrophy.

    PubMed

    Pichiecchio, Anna; Uggetti, Carla; Egitto, Maria Grazia; Berardinelli, Angela; Orcesi, Simona; Gorni, Ksenija Olga Tatiana; Zanardi, Cristina; Tagliabue, Anna

    2002-11-01

    The aim of this study was to propose a quantitative MR protocol with very short acquisition time and good reliability in volume construction, for the evaluation of body composition in patients affected by Duchenne muscular dystrophy (DMD). This MR protocol was compared with common anthropometric evaluations of the same patients. Nine boys affected by DMD, ranging in age from 6 to 12 years, were selected to undergo MR examination. Transverse T1-weighted spin-echo sequences (0.5 T; TR 300 ms, TE 10 ms, slice thickness 10 mm, slice gap 1 mm) were used for all acquisitions, each consisting of 8 slices and lasting just 54 s. Whole-body examination needed an average of nine acquisitions. Afterwards, images were downloaded to an independent workstation and, through their electronic segmentation with a reference filter, total volume and adipose tissue volumes were calculated manually. This process took up to 2 h for each patient. The MR data were compared with anthropometric evaluations. Affected children have a marked increase in adipose tissue and a decrease in lean tissue compared with reference healthy controls. Mean fat mass calculated by MR is significantly higher than mean fat mass obtained using anthropometric measurements (p < 0.001). Our MR study proved to be accurate and easy to apply, although it was time-consuming. We recommend it in monitoring the progression of the disease and planning DMD patients' diet.

  18. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.

  19. Quantitative analysis of topoisomerase IIα to rapidly evaluate cell proliferation in brain tumors

    SciTech Connect

    Oda, Masashi; Arakawa, Yoshiki; Kano, Hideyuki; Kawabata, Yasuhiro; Katsuki, Takahisa; Shirahata, Mitsuaki; Ono, Makoto; Yamana, Norikazu; Hashimoto, Nobuo; Takahashi, Jun A. . E-mail: jat@kuhp.kyoto-u.ac.jp

    2005-06-17

    Immunohistochemical cell proliferation analyses have come into wide use for evaluation of tumor malignancy. Topoisomerase IIα (topo IIα), an essential nuclear enzyme, is known to have cell-cycle-coupled expression. We here show the usefulness of quantitative analysis of topo IIα mRNA to rapidly evaluate cell proliferation in brain tumors. A protocol to quantify topo IIα mRNA was developed with real-time RT-PCR; quantification took only 3 h from a specimen. A total of 28 brain tumors were analyzed, and the level of topo IIα mRNA was significantly correlated with its immunostaining index (p < 0.0001, r = 0.9077). Furthermore, the assay sharply detected the decrease of topo IIα mRNA in growth-inhibited glioma cells. These results support topo IIα mRNA as a good and rapid indicator for evaluating cell proliferative potential in brain tumors.

  20. Genome-wide evaluation for quantitative trait loci under the variance component model

    PubMed Central

    Han, Lide

    2010-01-01

    The identity-by-descent (IBD) based variance component analysis is an important method for mapping quantitative trait loci (QTL) in outbred populations. The interval-mapping approach and its various modified versions may have limited use in evaluating the genetic variances of the entire genome because they require evaluation of multiple models and model selection. In this study, we developed a multiple variance component model for genome-wide evaluation using both the maximum likelihood (ML) method and the MCMC-implemented Bayesian method. We placed one QTL every few cM on the entire genome and estimated the QTL variances and positions simultaneously in a single model. Genomic regions that have no QTL usually showed no evidence of QTL, while regions with large QTL always showed strong evidence of QTL. While the Bayesian method produced the optimal result, the ML method is computationally more efficient. Simulation experiments were conducted to demonstrate the efficacy of the new methods. Electronic supplementary material The online version of this article (doi:10.1007/s10709-010-9497-1) contains supplementary material, which is available to authorized users. PMID:20835884

  1. Quantitative evaluation of interaction force between functional groups in protein and polymer brush surfaces.

    PubMed

    Sakata, Sho; Inoue, Yuuki; Ishihara, Kazuhiko

    2014-03-18

    To understand the interactions between polymer surfaces and the different functional groups in proteins, interaction forces were quantitatively evaluated by force-versus-distance curve measurements using atomic force microscopy with functional-group-functionalized cantilevers. Various polymer brush surfaces were systematically prepared by surface-initiated atom transfer radical polymerization as well-defined model surfaces for understanding protein adsorption behavior. The polymer brush layers consisted of phosphorylcholine groups (zwitterionic/hydrophilic), trimethylammonium groups (cationic/hydrophilic), sulfonate groups (anionic/hydrophilic), hydroxyl groups (nonionic/hydrophilic), and n-butyl groups (nonionic/hydrophobic) in their side chains. The interaction forces between these polymer brush surfaces and carboxyl, amino, and methyl groups (typical functional groups in proteins) were measured with the functionalized cantilevers. Furthermore, the amount of protein adsorbed on the polymer brush surfaces was quantified by surface plasmon resonance, using albumin (negative net charge) and lysozyme (positive net charge) under physiological conditions. The amount of protein adsorbed corresponded to the interaction forces generated between the functional groups on the cantilever and the polymer brush surfaces. The weakest interaction force and the least protein adsorption were observed for the polymer brush surface with phosphorylcholine groups in the side chain. On the other hand, positively and negatively charged surfaces generated strong forces against the oppositely charged functional groups, and showed significant adsorption of albumin and lysozyme, respectively.
These results indicated that the interaction force at the functional group level might be

  2. What Makes a Good Criminal Justice Professor? A Quantitative Analysis of Student Evaluation Forms

    ERIC Educational Resources Information Center

    Gerkin, Patrick M.; Kierkus, Christopher A.

    2011-01-01

    The goal of this research is to understand how students define teaching effectiveness. By using multivariate regression analysis of 8,000+ student evaluations of teaching compiled by a School of Criminal Justice at a Midwestern public university, this paper explores the relationships between individual indicators of instructor performance (e.g.…

  3. Quantitative evaluation of susceptibility effects caused by dental materials in head magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Strocchi, S.; Ghielmi, M.; Basilico, F.; Macchi, A.; Novario, R.; Ferretti, R.; Binaghi, E.

    2016-03-01

    This work quantitatively evaluates the effects induced by the susceptibility characteristics of materials commonly used in dental practice on the quality of head MR images in a clinical 1.5 T device. The proposed evaluation procedure measures the image artifacts induced by susceptibility in MR images, providing an index consistent with the global degradation as perceived by experts. Susceptibility artifacts were evaluated in a near-clinical setup, using a phantom with susceptibility and geometric characteristics similar to those of a human head. We tested different dental materials (PAL Keramit, Ti6Al4V-ELI, Keramit NP, ILOR F, and Zirconia) and different clinical MR acquisition sequences, such as "classical" SE and fast, gradient, and diffusion sequences. The evaluation is designed as a matching process between reference and artifact-affected images recording the same scene; the extent of the degradation induced by susceptibility is then measured in terms of similarity with the corresponding reference image. The matching process involves a multimodal registration task and the use of an adequate, psychophysically validated similarity index based on the correlation coefficient. The proposed analyses are integrated within a computer-supported procedure that interactively guides users through the phases of the evaluation method. Two-dimensional and three-dimensional indices were computed for each material and each acquisition sequence, and a ranking of the materials was drawn by averaging the results. Zirconia and ILOR F appear to be the best choices from the susceptibility-artifact point of view, followed, in order, by PAL Keramit, Ti6Al4V-ELI, and Keramit NP.
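A correlation-coefficient similarity index of the kind described can be sketched as follows (the toy images are illustrative; the paper's validated index may involve additional processing on top of the plain correlation coefficient):

```python
import numpy as np

def cc_similarity(ref, img):
    """Pearson correlation coefficient between a reference image and an
    artifact-affected image of the same registered scene (1.0 = identical,
    lower values = stronger susceptibility-induced degradation)."""
    a = ref.ravel().astype(float)
    b = img.ravel().astype(float)
    a -= a.mean()
    b -= b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

# toy example: a smooth 'reference' image and a noise-degraded copy
ref = np.add.outer(np.arange(32.0), np.arange(32.0))
noisy = ref + np.random.default_rng(2).normal(0.0, 1.0, ref.shape)
```

In the paper's workflow the two images would first be brought into register (the multimodal registration step) before the index is computed.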

  4. Computer-Aided Image Analysis and Fractal Synthesis in the Quantitative Evaluation of Tumor Aggressiveness in Prostate Carcinomas.

    PubMed

    Waliszewski, Przemyslaw

    2016-01-01

    The subjective evaluation of tumor aggressiveness is a cornerstone of contemporary tumor pathology. A large intra- and interobserver variability is a known limiting factor of this approach. This fundamental weakness influences the statistical deterministic models of progression risk assessment. It is unlikely that the recent modification of tumor grading according to Gleason criteria for prostate carcinoma will cause a qualitative change and significantly improve accuracy. The Gleason system does not allow the identification of low-aggressive carcinomas by precise criteria. This ontological dichotomy implies the application of an objective, quantitative approach to the evaluation of tumor aggressiveness as an alternative. That novel approach must be developed and validated in a manner that is independent of the results of any subjective evaluation. For example, computer-aided image analysis can provide information about the geometry of the spatial distribution of cancer cell nuclei. A series of interrelated complexity measures characterizes the complex tumor images unequivocally. Using those measures, carcinomas can be classified into classes of equivalence and compared with each other. Furthermore, those measures define quantitative criteria for the identification of low- and high-aggressive prostate carcinomas, information that the subjective approach is not able to provide. The co-application of those complexity measures in cluster analysis leads to the conclusion that either the subjective or objective classification of tumor aggressiveness for prostate carcinomas should comprise at most three grades (or classes). Finally, this set of global fractal dimensions enables a look into the dynamics of the underlying cellular system of interacting cells and the reconstruction of the temporal-spatial attractor based on Takens' embedding theorem. Both computer-aided image analysis and the subsequent fractal synthesis could be performed
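As an illustration of a global fractal dimension of the kind the abstract mentions, below is a minimal box-counting estimator for a binary image such as a segmented nuclei pattern (the implementation details and test patterns are assumptions for illustration, not the paper's measures):

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Estimate the box-counting (fractal) dimension of a binary image,
    e.g. the spatial pattern of cancer cell nuclei in a section."""
    counts = []
    h, w = mask.shape
    for s in sizes:
        trimmed = mask[:h - h % s, :w - w % s]
        # count s-by-s boxes containing at least one foreground pixel
        boxes = trimmed.reshape(trimmed.shape[0] // s, s,
                                trimmed.shape[1] // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    # dimension = slope of log N(s) versus log(1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes, float)),
                          np.log(counts), 1)
    return slope

# sanity checks: a filled square has dimension 2, a straight line dimension 1
d_square = box_counting_dimension(np.ones((64, 64), dtype=bool))
line = np.zeros((64, 64), dtype=bool)
line[32, :] = True
d_line = box_counting_dimension(line)
```

A genuinely fractal nuclei pattern yields a non-integer value between these two extremes, which is what makes the dimension usable as a grading criterion.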

  5. Computer-Aided Image Analysis and Fractal Synthesis in the Quantitative Evaluation of Tumor Aggressiveness in Prostate Carcinomas

    PubMed Central

    Waliszewski, Przemyslaw

    2016-01-01

    The subjective evaluation of tumor aggressiveness is a cornerstone of contemporary tumor pathology. A large intra- and interobserver variability is a known limiting factor of this approach. This fundamental weakness influences the statistical deterministic models of progression risk assessment. It is unlikely that the recent modification of tumor grading according to Gleason criteria for prostate carcinoma will cause a qualitative change and significantly improve accuracy. The Gleason system does not allow the identification of low-aggressive carcinomas by precise criteria. This ontological dichotomy implies the application of an objective, quantitative approach to the evaluation of tumor aggressiveness as an alternative. That novel approach must be developed and validated in a manner that is independent of the results of any subjective evaluation. For example, computer-aided image analysis can provide information about the geometry of the spatial distribution of cancer cell nuclei. A series of interrelated complexity measures characterizes the complex tumor images unequivocally. Using those measures, carcinomas can be classified into classes of equivalence and compared with each other. Furthermore, those measures define quantitative criteria for the identification of low- and high-aggressive prostate carcinomas, information that the subjective approach is not able to provide. The co-application of those complexity measures in cluster analysis leads to the conclusion that either the subjective or objective classification of tumor aggressiveness for prostate carcinomas should comprise at most three grades (or classes). Finally, this set of global fractal dimensions enables a look into the dynamics of the underlying cellular system of interacting cells and the reconstruction of the temporal-spatial attractor based on Takens' embedding theorem. Both computer-aided image analysis and the subsequent fractal synthesis could be performed

  6. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... model used by the Center for Biologics Evaluation and Research (CBER) and suggestions for further...: Richard Forshee, Center for Biologics Evaluation and Research (HFM-210), Food and Drug Administration... disease computer simulation models to generate quantitative estimates of the benefits and risks...

  7. Quantitative evaluation of his-tag purification and immunoprecipitation of tristetraprolin and its mutant proteins from transfected human cells

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Histidine (His)-tag is widely used for affinity purification of recombinant proteins, but the yield and purity of expressed proteins are quite different. Little information is available about quantitative evaluation of this procedure. The objective of the current study was to evaluate the His-tag pr...

  8. High-performance piezoelectric nanogenerators for self-powered nanosystems: quantitative standards and figures of merit

    NASA Astrophysics Data System (ADS)

    Wu, Wenzhuo

    2016-03-01

    Harvesting energy from the ambient environment cost-effectively is critical both for addressing worldwide long-term energy needs at the macro-scale and for achieving sustainable, maintenance-free operation of nanodevices at the micro-scale (Wang and Wu 2012 Angew. Chem. Int. Ed. 51 11700-21). Piezoelectric nanogenerator (NG) technology has demonstrated great application potential in harvesting the ubiquitous and abundant mechanical energy. Despite the progress made in this rapidly advancing field, a fundamental understanding and a common standard for consistently quantifying and evaluating the performance of the various types of piezoelectric NGs are still lacking. In their recent study, Crossley and Kar-Narayan (2015 Nanotechnology 26 344001) systematically investigated the dynamical properties of piezoelectric NGs by taking into account the effect of driving mechanism and load frequency on NG performance. They further defined the NGs' figures of merit as the energy harvested normalized by the applied strain or stress for NGs under strain-driven or stress-driven conditions, which are commonly seen in vibrational energy harvesting. This work provides new insight and a feasible approach for consistently evaluating piezoelectric nanomaterials and NG devices, which is important for designing and optimizing nanoscale piezoelectric energy harvesters, as well as for promoting their applications in emerging areas, e.g. the Internet of Things, wearable devices, and self-powered nanosystems.

  9. High-performance piezoelectric nanogenerators for self-powered nanosystems: quantitative standards and figures of merit.

    PubMed

    Wu, Wenzhuo

    2016-03-18

    Harvesting energy from the ambient environment cost-effectively is critical both for addressing worldwide long-term energy needs at the macro-scale and for achieving sustainable, maintenance-free operation of nanodevices at the micro-scale (Wang and Wu 2012 Angew. Chem. Int. Ed. 51 11700-21). Piezoelectric nanogenerator (NG) technology has demonstrated great application potential in harvesting the ubiquitous and abundant mechanical energy. Despite the progress made in this rapidly advancing field, a fundamental understanding and a common standard for consistently quantifying and evaluating the performance of the various types of piezoelectric NGs are still lacking. In their recent study, Crossley and Kar-Narayan (2015 Nanotechnology 26 344001) systematically investigated the dynamical properties of piezoelectric NGs by taking into account the effect of driving mechanism and load frequency on NG performance. They further defined the NGs' figures of merit as the energy harvested normalized by the applied strain or stress for NGs under strain-driven or stress-driven conditions, which are commonly seen in vibrational energy harvesting. This work provides new insight and a feasible approach for consistently evaluating piezoelectric nanomaterials and NG devices, which is important for designing and optimizing nanoscale piezoelectric energy harvesters, as well as for promoting their applications in emerging areas, e.g. the Internet of Things, wearable devices, and self-powered nanosystems.

  10. Traction contact performance evaluation at high speeds

    NASA Technical Reports Server (NTRS)

    Tevaarwerk, J. L.

    1981-01-01

    The results of traction tests performed on two fluids are presented. These tests covered a pressure range of 1.0 to 2.5 GPa, an inlet temperature range of 30 °C to 70 °C, a speed range of 10 to 80 m/s, aspect ratios of 0.5 to 5, and spin from 0 to 2.1 percent. The test results are presented in the form of two dimensionless parameters: the initial traction slope and the maximum traction peak. With the use of a suitable rheological fluid model, the measured traction curves can be reconstituted from these two fluid parameters. More importantly, knowledge of these parameters, together with the fluid rheological model, allows the prediction of traction under conditions of spin, slip, and any combination thereof. Comparison between traction predicted under these conditions and that measured in actual traction tests shows that the method gives good results.
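The abstract states that a full traction curve can be reconstituted from just the initial slope and the maximum peak once a rheological model is chosen; the sketch below uses a tanh saturation law as a stand-in model (both the law and the parameter values are illustrative assumptions, not the model used in the paper):

```python
import numpy as np

def traction_curve(slip, m0, mu_max):
    """Reconstitute a traction curve from its two dimensionless parameters:
    m0 is the initial traction slope (d traction / d slip at zero slip) and
    mu_max is the maximum traction peak. The tanh saturation law used here
    is one plausible rheological form, chosen only for illustration."""
    return mu_max * np.tanh(m0 * np.asarray(slip, float) / mu_max)
```

By construction the curve starts with slope m0 at zero slip and saturates at mu_max for large slip, so the two measured parameters fully determine it.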

  11. Sexism and Beautyism in Women's Evaluations of Peer Performance.

    ERIC Educational Resources Information Center

    Cash, Thomas F.; Trimer, Claire A.

    1984-01-01

    Investigated independent and interactive effects of physical attractiveness (PA), sex, and task sex-typing on performance evaluations by 216 college women. Found that the halo effect ("beauty is talent") of PA operated when subjects evaluated both sexes, with the exception of ratings of attractive women in out-of-role ("masculine") performances.…

  12. 48 CFR 36.201 - Evaluation of contractor performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Evaluation of contractor performance. 36.201 Section 36.201 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Contracting for Construction 36.201 Evaluation of contractor performance. See 42.1502(e) for the...

  13. Evaluation of board performance in Iran’s universities of medical sciences

    PubMed Central

    Sajadi, Haniye Sadat; Maleki, Mohammadreza; Ravaghi, Hamid; Farzan, Homayoun; Aminlou, Hasan; Hadi, Mohammad

    2014-01-01

    Background: The critical role that the board plays in the governance of universities clarifies the necessity of evaluating its performance. This study aimed to evaluate the performance of the boards of medical universities and provide solutions to enhance it. Methods: The first phase of the present study was qualitative research in which data were collected through face-to-face semi-structured interviews and analyzed by a thematic approach. The second phase was a mixed qualitative and quantitative study, with the quantitative part in cross-sectional format and the qualitative part in content analysis format. In the quantitative part, data were collected through the Ministry of Health and Medical Education (MoHME). In the qualitative part, the content of 2,148 resolutions selected by stratified sampling was analyzed. Results: Participants believed that the boards had not performed acceptably for a long time. Results also indicated an increasing number of board meetings and resolutions over these 21 years. The boards' resolutions were mostly operational in domain and administrative in nature, and the share of specific resolutions was greater than that of general ones. Conclusion: Given the current pace of change and development and the need to respond to them in a timely manner, it is recommended to accelerate the slow improvement process of the boards. More delegation of authority and strengthening of the boards' position appear to be effective strategies for speeding up this process. PMID:25337597

  14. Evaluation of PV Module Field Performance

    SciTech Connect

    Wohlgemuth, John; Silverman, Timothy; Miller, David C.; McNutt, Peter; Kempe, Michael; Deceglie, Michael

    2015-06-14

    This paper describes an effort to inspect and evaluate PV modules in order to determine what failure or degradation modes are occurring in field installations. This paper will report on the results of six site visits, including the Sacramento Municipal Utility District (SMUD) Hedge Array, Tucson Electric Power (TEP) Springerville, Central Florida Utility, Florida Solar Energy Center (FSEC), the TEP Solar Test Yard, and University of Toledo installations. The effort here makes use of a recently developed field inspection data collection protocol, and the results were input into a corresponding database. The results of this work have also been used to develop a draft of the IEC standard for climate- and application-specific accelerated stress testing beyond module qualification.

  15. Quantitative evaluation of Leydig cells in testicular biopsies of men with varicocele.

    PubMed

    Francavilla, S; Bruno, B; Martini, M; Moscardelli, S; Properzi, G; Francavilla, F; Santiemma, V; Fabbrini, A

    1986-01-01

    A quantitative analysis of Leydig cells was performed on 23 testicular biopsies of men with left varicocele and sperm counts ranging from zero to 95,000 sperm/mm3. The oligozoospermic patients had more Leydig cells and higher FSH and LH serum levels than the patient group with more than 10,000 sperm/mm3. Leydig cell density was tightly correlated (p < 0.01) with the serum level of LH. In oligozoospermic subjects, altered Leydig cell function could trigger increased LH secretion, which seems likely to be responsible for the stimulation of interstitial cells resulting in an exaggerated recruitment of mature Leydig cells from their precursors. A comparative analysis of the left and right testes failed to show differences in Leydig cell density and spermatogenesis in normozoospermic and oligozoospermic patients, suggesting that the two testes are equally affected by a possible, though unidentified, detrimental effect of the left-sided varicocele.

  16. Quantitative cytology and thyroperoxidase immunochemistry: new tools in evaluating thyroid nodules by fine-needle aspiration.

    PubMed

    Pluot, M; Faroux, M J; Flament, J B; Patey, M; Theobald, S; Delisle, M J

    1996-01-01

    Fine-needle aspiration (FNA) of cold thyroid nodules is proposed to be the most useful diagnostic test for deciding which patients need surgery. A retrospective study of standard cytology (SC) performed in 776 patients who had been operated on showed a sensitivity of 94% and a specificity of 80%. Quantitative cytology (QC) was carried out with a cell image analyzer, which classified the cases as benign or not benign. In 87 cases, the sensitivity and specificity of QC alone were 100% and 76%. When SC and QC were combined, there were no false negative reports. A new monoclonal antithyroperoxidase (TPO) antibody (MoAb47) was tested; the sensitivity and specificity of TPO alone were 97% and 81%. When SC and TPO were combined, specificity rose to 90%. As adjuncts to SC, QC and TPO represent useful tools for selecting patients for surgery.
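The reported effect of combining tests (sensitivity rises toward 1 at a cost in specificity) can be sketched with elementary probability, assuming the two tests err independently, which is an idealization the paper does not claim:

```python
def sensitivity(tp, fn):
    """True-positive rate from counts of true positives and false negatives."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate from counts of true negatives and false positives."""
    return tn / (tn + fp)

def combine_either_positive(sens_a, spec_a, sens_b, spec_b):
    """'Refer to surgery if either test is positive' rule for two tests that
    err independently (an idealization): sensitivity rises, specificity falls."""
    sens = 1.0 - (1.0 - sens_a) * (1.0 - sens_b)
    spec = spec_a * spec_b
    return sens, spec

# SC (94% sens, 80% spec) combined with QC (100% sens, 76% spec)
sens_c, spec_c = combine_either_positive(0.94, 0.80, 1.00, 0.76)
```

With QC's sensitivity at 100%, the combined rule cannot miss a malignancy, which mirrors the abstract's "no false negative reports" for SC plus QC.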

  17. High performance liquid chromatographic assay for the quantitation of total glutathione in plasma

    NASA Technical Reports Server (NTRS)

    Abukhalaf, Imad K.; Silvestrov, Natalia A.; Menter, Julian M.; von Deutsch, Daniel A.; Bayorh, Mohamed A.; Socci, Robin R.; Ganafa, Agaba A.

    2002-01-01

    A simple and widely used homocysteine HPLC procedure was applied to the identification and quantitation of glutathione in plasma. The method, which uses SBDF as a derivatizing agent, requires only 50 microliters of sample. A linear quantitative response curve was generated for glutathione over a concentration range of 0.3125-62.50 micromol/l. Linear regression analysis of the standard curve yielded a correlation coefficient of 0.999. The limit of detection (LOD) and limit of quantitation (LOQ) were 5.0 and 15 pmol, respectively. Glutathione recovery with this method was nearly complete (above 96%). Intra-assay and inter-assay precision studies reflected a high level of reliability and reproducibility. The applicability of the method to the quantitation of glutathione was demonstrated successfully using human and rat plasma samples.
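    The calibration arithmetic behind an assay like this can be sketched with a least-squares fit of detector response against standard concentration, with LOD and LOQ estimated from blank noise via the common 3.3σ/slope and 10σ/slope conventions. All numbers below are invented for illustration, not the assay's data.

```python
# Sketch of an HPLC calibration: ordinary least-squares line through
# concentration/response standards, then LOD/LOQ from blank noise.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical standards (micromol/L) and peak areas (arbitrary units):
conc = [0.3125, 0.625, 1.25, 2.5, 5.0]
area = [31.0, 62.5, 125.0, 250.0, 500.0]   # a nearly ideal linear response

slope, intercept = fit_line(conc, area)

# Common convention: LOD = 3.3*sigma_blank/slope, LOQ = 10*sigma_blank/slope
sigma_blank = 0.5                # assumed standard deviation of blank signal
lod = 3.3 * sigma_blank / slope  # in concentration units
loq = 10.0 * sigma_blank / slope
```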

  18. Holistic Evaluation of Quality Consistency of Ixeris sonchifolia (Bunge) Hance Injectables by Quantitative Fingerprinting in Combination with Antioxidant Activity and Chemometric Methods

    PubMed Central

    Yang, Lanping; Sun, Guoxiang; Guo, Yong; Hou, Zhifei; Chen, Shuai

    2016-01-01

    A widely used herbal medicine, Ixeris sonchifolia (Bge.) Hance Injectable (ISHI) was investigated for quality consistency. Characteristic fingerprints of 23 batches of the ISHI samples were generated at five wavelengths and evaluated by the systematic quantitative fingerprint method (SQFM), together with simultaneous analysis of the content of seven marker compounds. Chemometric methods, i.e., support vector machine (SVM) and principal component analysis (PCA), were applied to assist in fingerprint evaluation of the ISHI samples. Qualitative classification of the ISHI samples by SVM was consistent with PCA, and in agreement with the quantitative evaluation by SQFM. In addition, the antioxidant activities of the ISHI samples were determined by both off-line and on-line DPPH (2,2-diphenyl-1-picrylhydrazyl) radical scavenging assays. A fingerprint–efficacy relationship linking the chemical components and in vitro antioxidant activity was established and validated using the partial least squares (PLS) and orthogonal projection to latent structures (OPLS) models; the online DPPH assay further revealed the components that made a positive contribution to the total antioxidant activity. Therefore, the combined use of chemometric methods, quantitative fingerprint evaluation by SQFM, and multiple marker compound analysis, in conjunction with the assay of antioxidant activity, provides a powerful and holistic approach to evaluating the quality consistency of herbal medicines and their preparations. PMID:26872364

  19. How Universities Evaluate Faculty Performance: A Survey of Department Heads.

    ERIC Educational Resources Information Center

    Centra, John A.

    Department heads from 134 institutions (mainly universities) indicated the weight they generally give to various criteria for evaluating individual faculty members. The questionnaire they responded to included: criteria used for evaluating overall faculty performance; sources of information for evaluating teaching; and kinds of information used…

  20. At-Risk Youth Appearance and Job Performance Evaluation

    ERIC Educational Resources Information Center

    Freeburg, Beth Winfrey; Workman, Jane E.

    2008-01-01

    The goal of this study was to identify the relationship of at-risk youth workplace appearance to other job performance criteria. Employers (n = 30; each employing from 1 to 17 youths) evaluated 178 at-risk high school youths who completed a paid summer employment experience. Appearance evaluations were significantly correlated with evaluations of…

  1. Transformational Classroom Leadership: A Novel Approach to Evaluating Classroom Performance

    ERIC Educational Resources Information Center

    Pounder, James S.

    2008-01-01

    In higher education, student evaluation of teaching is widely used as a measure of an academic's teaching performance despite considerable disagreement as to its value. This paper begins by examining the merit of teaching evaluations with reference to the factors influencing the accuracy of the teaching evaluation process. One of the central…

  2. Using Business Performance To Evaluate Multimedia Training in Manufacturing.

    ERIC Educational Resources Information Center

    Lachenmaier, Lynn S.; Moor, William C.

    1997-01-01

    Discusses training evaluation and shows how an abbreviated form of Kirkpatrick's four-level evaluation model can be used effectively to evaluate multimedia-based manufacturing training. Topics include trends in manufacturing training, quantifying performance improvement, and statistical comparisons using the Mann-Whitney test and the Tukey Quick…

  3. A Performance Measurement and Evaluation Environment for Information Systems.

    ERIC Educational Resources Information Center

    Dominick, Wayne D.

    1987-01-01

    Describes the concept of an integrated environment which allows managers to evaluate and measure the performance of computer based information systems. Both system efficiency evaluation and user interaction evaluation are addressed, and MADAM, a system currently operational at the University of Southwestern Louisiana, is briefly described.…

  4. Thrust Stand for Electric Propulsion Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Markusic, T. E.; Jones, J. E.; Cox, M. D.

    2004-01-01

    An electric propulsion thrust stand capable of supporting thrusters with a total mass of up to 125 kg and 1 mN to 1 N thrust levels has been developed and tested. The mechanical design features a conventional hanging pendulum arm attached to a balance mechanism that transforms horizontal motion into amplified vertical motion, with accommodation for variable displacement sensitivity. Unlike conventional hanging pendulum thrust stands, the deflection is independent of the length of the pendulum arm, and no reference structure is required at the end of the pendulum. Displacement is measured using a non-contact, optical linear gap displacement transducer. Mechanical oscillations are attenuated using a passive, eddy current damper. An on-board microprocessor-based level control system, which includes a two-axis accelerometer and two linear-displacement stepper motors, continuously maintains the level of the balance mechanism, counteracting mechanical zero drift during thruster testing. A thermal control system, which includes heat exchange panels, thermocouples, and a programmable recirculating water chiller, continuously adjusts to varying thermal loads to maintain the balance mechanism temperature and counteract thermal drifts. An in-situ calibration rig allows for steady-state calibration both prior to and during thruster testing. Thrust measurements were carried out on a well-characterized 1 kW Hall thruster; the thrust stand was shown to produce repeatable results consistent with previously published performance data.

  5. Performance evaluation of solar water sterilization system

    SciTech Connect

    Saitoh, Takeo; El-Ghetany, H.H.

    1998-07-01

    In many countries, contaminated water is a major cause of water-borne diseases. Solar energy can be used in this field because ultraviolet solar radiation inactivates micro-organisms. A pilot solar system for sterilizing contaminated water was designed, constructed, and tested. The experimental data showed good viability for using solar energy in the sterilization process. A mathematical model of the solar sterilizer is also presented. The governing equations are solved numerically using the fourth-order Runge-Kutta method. The effects of environmental conditions (ambient temperature, wind speed, and solar radiation) on the solar sterilizer performance are examined. The system is found to be affected by the ambient temperature, wind speed, ultraviolet solar radiation intensity, level of contamination of the water, quantity of water being exposed, contact area between the transparent water container in the solar sterilizer and the absorber plate, and system geometrical parameters. The thermal efficiency of the solar sterilizer was lowest under partly cloudy conditions, low ambient temperature, and high wind speed.
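    The numerical approach named above can be sketched with a classical fourth-order Runge-Kutta step applied to a simple lumped thermal balance, dT/dt = (Q - h·(T - Ta)) / C. The model and all coefficients below are illustrative assumptions, not the paper's governing equations.

```python
# Minimal RK4 integrator applied to a lumped thermal model of a solar
# collector: absorbed power Q heats thermal mass C, losses scale with h.

def rk4_step(f, t, y, dt):
    """One classical fourth-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt * k1 / 2)
    k3 = f(t + dt / 2, y + dt * k2 / 2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

# Assumed coefficients: absorbed power (W), loss coefficient (W/K),
# heat capacity (J/K), ambient temperature (degC).
Q, h, C, Ta = 500.0, 10.0, 4000.0, 25.0

def dTdt(t, T):
    return (Q - h * (T - Ta)) / C

T, t, dt = 25.0, 0.0, 10.0      # start at ambient, 10 s time steps
for _ in range(360):            # integrate one hour of heating
    T = rk4_step(dTdt, t, T, dt)
    t += dt
# T approaches the steady state Ta + Q/h = 75 degC from below.
```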

  6. Brookfield Homes Passive House Performance Evaluation

    SciTech Connect

    Herk, A.; Poerschke, A.; Beach, R.

    2016-02-04

    In 2012-2013, IBACOS worked with a builder, Brookfield Homes in Denver, Colorado, to design and construct a Passive House certified model home. IBACOS used several modeling programs and calculation methods to complete the final design package along with Brookfield's architect, KGA Studio. This design package included upgrades to the thermal enclosure, basement insulation, windows, and heating, ventilation, and air conditioning. Short-term performance testing in the Passive House was done during and after construction. Blower door testing indicated that whole-building air leakage to the outside was 324 CFM and 0.60 ACH50. The other two test homes received only limited short-term post-construction testing by the local energy rater. IBACOS then monitored the energy consumption and whole-house comfort conditions of the occupied Passive House after one year of operation and compared the monitoring results to those for two other occupied test houses in the same area with similar square footage but slightly different floor plans. IBACOS also assisted the builder in researching design scenarios for Zero Energy Ready Home and ENERGY STAR acceptance levels and in conceptualizing product for Denver's Brighton Heights area, where Brookfield was considering building to Zero Energy Ready Home standards. IBACOS provided strategies that Brookfield may draw from should the builder choose to pursue a Zero Energy Ready Home plan for that market.

  7. Inclusion and Student Learning: A Quantitative Comparison of Special and General Education Student Performance Using Team and Solo-Teaching

    ERIC Educational Resources Information Center

    Jamison, Joseph A.

    2013-01-01

    This quantitative study sought to determine whether there were significant statistical differences between the performance scores of special education and general education students' scores when in team or solo-teaching environments as may occur in inclusively taught classrooms. The investigated problem occurs because despite education's stated…

  8. Examination of Information Technology (IT) Certification and the Human Resources (HR) Professional Perception of Job Performance: A Quantitative Study

    ERIC Educational Resources Information Center

    O'Horo, Neal O.

    2013-01-01

    The purpose of this quantitative survey study was to test the Leontief input/output theory relating the input of IT certification to the output of the English-speaking U.S. human resource professional perceived IT professional job performance. Participants (N = 104) rated their perceptions of IT certified vs. non-IT certified professionals' job…

  9. Thrust Stand for Electric Propulsion Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Polzin, Kurt A.; Markusic, Thomas E.; Stanojev, Boris J.; Dehoyos, Amado; Spaun, Benjamin

    2006-01-01

    An electric propulsion thrust stand capable of supporting testing of thrusters having a total mass of up to 125 kg and producing thrust levels from 100 microN to 1 N has been developed and tested. The design features a conventional hanging pendulum arm attached to a balance mechanism that converts horizontal deflections produced by the operating thruster into amplified vertical motion of a secondary arm. The level of amplification is changed through adjustment of the location of one of the pivot points linking the system. The response of the system depends on the relative magnitudes of the restoring moments applied by the displaced thruster mass and the twisting torsional pivots connecting the members of the balance mechanism. Displacement is measured using a non-contact, optical linear gap displacement transducer, and balance oscillatory motion is attenuated using a passive, eddy-current damper. The thrust stand employs an automated leveling and thermal control system. Pools of liquid gallium are used to deliver power to the thruster without using solid wire connections, which can exert undesirable time-varying forces on the balance. These systems serve to eliminate sources of zero-drift that can occur as the stand thermally or mechanically shifts during the course of an experiment. An in-situ calibration rig allows for steady-state calibration before, during, and after thruster operation. Thrust measurements were carried out on a cylindrical Hall thruster that produces mN-level thrust. The measurements were very repeatable, producing results that compare favorably with previously published performance data, but with considerably smaller uncertainty.

  10. Evaluation of quantitative PCR combined with PMA treatment for molecular assessment of microbial water quality.

    PubMed

    Gensberger, Eva Theres; Polt, Marlies; Konrad-Köszler, Marianne; Kinner, Paul; Sessitsch, Angela; Kostić, Tanja

    2014-12-15

    Microbial water quality assessment currently relies on cultivation-based methods. Nucleic acid-based techniques such as quantitative PCR (qPCR) enable more rapid and specific detection of target organisms, and propidium monoazide (PMA) treatment facilitates the exclusion of false positive results caused by DNA from dead cells. Established molecular assays (qPCR and PMA-qPCR) for legally defined microbial quality parameters (Escherichia coli, Enterococcus spp. and Pseudomonas aeruginosa) and for the indicator organism group of coliforms (implemented as molecular detection of Enterobacteriaceae) were evaluated against conventional microbiological methods. The evaluation of an extended set of drinking and process water samples showed that PMA-qPCR for E. coli, Enterococcus spp. and P. aeruginosa resulted in higher specificity, because substantial or complete reduction of false positive signals in comparison to qPCR was obtained. Complete compliance with the reference method was achieved for E. coli PMA-qPCR, and 100% specificity for Enterococcus spp. and P. aeruginosa in the evaluation of process water samples. A major challenge remained the sensitivity of the assays, as shown by false negative results (7-23%), presumably due to insufficient sample preparation (i.e. concentration of bacteria and DNA extraction) rather than the qPCR limit of detection. For the detection of the indicator group of coliforms, the evaluation study revealed that the use of alternative molecular assays based on the taxonomic group of Enterobacteriaceae was not adequate. Given careful optimization of the sensitivity, the highly specific PMA-qPCR could be a valuable tool for rapid detection of hygienic parameters such as E. coli, Enterococcus spp. and P. aeruginosa.

  11. Quantitative Laser Biospeckle Method for the Evaluation of the Activity of Trypanosoma cruzi Using VDRL Plates and Digital Analysis

    PubMed Central

    Grassi, Hilda Cristina; García, Lisbette C.; Lobo-Sulbarán, María Lorena; Velásquez, Ana; Andrades-Grassi, Francisco A.; Cabrera, Humberto; Andrades-Grassi, Jesús E.; Andrades, Efrén D. J.

    2016-01-01

    In this paper we report a quantitative laser Biospeckle method using VDRL plates to monitor the activity of Trypanosoma cruzi, together with the calibration conditions, including three image processing algorithms and three programs (ImageJ and two programs designed in this work). Benznidazole was used as a test drug. Variable volume (constant density) and variable density (constant volume) were used for the quantitative evaluation of parasite activity in calibrated wells of the VDRL plate. The desiccation process within the well was monitored as a function of volume and of the activity of the Biospeckle pattern of the parasites, as well as the quantitative effect of the surface parasite quantity (proportion of the object's plane). A statistical analysis was performed with ANOVA, Tukey post hoc tests, and descriptive statistics using R and R Commander. Conditions of volume (100 μl), parasite density (2-4 x 10^4 parasites/well, in exponential growth phase), assay time (up to 204 min), frame number (11 frames), and algorithm and program (R Commander/SAGA) for image processing were selected to test the effect of variable concentrations of benznidazole (0.0195 to 20 μg/mL / 0.075 to 76.8 μM) at various times (1, 61, 128 and 204 min) on the activity of the Biospeckle pattern. The flat wells of the VDRL plate were found to be suitable for the quantitative calibration of the activity of Trypanosoma cruzi using the appropriate algorithm and program. Under these conditions, benznidazole produces at 1 min an instantaneous effect on the activity of the Biospeckle pattern of T. cruzi, which remains with a similar profile for up to 1 hour. A second effect, which is dependent on concentrations above 1.25 μg/mL and is statistically different from the effect at lower concentrations, causes a decrease in the activity of the Biospeckle pattern. This effect is better detected after 1 hour of drug action. This behavior may be explained by an instantaneous effect on a membrane protein of Trypanosoma cruzi that could

  12. A Quantitative and Qualitative Evaluation of Sentence Boundary Detection for the Clinical Domain.

    PubMed

    Griffis, Denis; Shivade, Chaitanya; Fosler-Lussier, Eric; Lai, Albert M

    2016-01-01

    Sentence boundary detection (SBD) is a critical preprocessing task for many natural language processing (NLP) applications. However, there has been little work on evaluating how well existing methods for SBD perform in the clinical domain. We evaluate five popular off-the-shelf NLP toolkits on the task of SBD in various kinds of text using a diverse set of corpora, including the GENIA corpus of biomedical abstracts, a corpus of clinical notes used in the 2010 i2b2 shared task, and two general-domain corpora (the British National Corpus and Switchboard). We find that, with the exception of the cTAKES system, the toolkits we evaluate perform noticeably worse on clinical text than on general-domain text. We identify and discuss major classes of errors, and suggest directions for future work to improve SBD methods in the clinical domain. We also make the code used for SBD evaluation in this paper available for download at http://github.com/drgriffis/SBD-Evaluation.
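    Scoring SBD output against a gold standard, as in evaluations of this kind, typically reduces to set comparison of predicted and annotated boundary positions. The sketch below uses invented character offsets and computes precision, recall, and F1; it is a generic illustration, not the paper's evaluation code.

```python
# Boundary-level scoring for sentence boundary detection: boundaries are
# character offsets, and exact matches count as true positives.

def prf(gold, pred):
    """Precision, recall, and F1 over sets of boundary offsets."""
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    denom = precision + recall
    f1 = (2 * precision * recall / denom) if denom else 0.0
    return precision, recall, f1

gold = {17, 42, 88, 120}        # offsets marked by human annotators (invented)
pred = {17, 42, 90, 120, 150}   # offsets emitted by a hypothetical toolkit

p, r, f = prf(gold, pred)       # 3 exact matches: p = 0.6, r = 0.75
```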

  13. A Quantitative and Qualitative Evaluation of Sentence Boundary Detection for the Clinical Domain

    PubMed Central

    Griffis, Denis; Shivade, Chaitanya; Fosler-Lussier, Eric; Lai, Albert M.

    2016-01-01

    Sentence boundary detection (SBD) is a critical preprocessing task for many natural language processing (NLP) applications. However, there has been little work on evaluating how well existing methods for SBD perform in the clinical domain. We evaluate five popular off-the-shelf NLP toolkits on the task of SBD in various kinds of text using a diverse set of corpora, including the GENIA corpus of biomedical abstracts, a corpus of clinical notes used in the 2010 i2b2 shared task, and two general-domain corpora (the British National Corpus and Switchboard). We find that, with the exception of the cTAKES system, the toolkits we evaluate perform noticeably worse on clinical text than on general-domain text. We identify and discuss major classes of errors, and suggest directions for future work to improve SBD methods in the clinical domain. We also make the code used for SBD evaluation in this paper available for download at http://github.com/drgriffis/SBD-Evaluation. PMID:27570656

  14. Quantitative evaluation of image processing algorithms for ill-structured road detection and tracking

    NASA Astrophysics Data System (ADS)

    Dufourd, Delphine; Dalgalarrondo, Andre

    2003-09-01

    In a previous presentation at AeroSense 2002, we described a methodology to assess the results of image processing algorithms for ill-structured road detection and tracking. In this paper, we present our first application of this methodology to six edge detectors and a database of about 20,000 images. Our evaluation approach is based on the use of video image sequences, ground truth (reference results established by human experts), and assessment metrics that measure the quality of the image processing results. We need a quantitative, comparative, and repeatable evaluation of many algorithms in order to direct future developments. The main focus of this paper is presenting the lessons learned from applying our methodology. More precisely, we describe the assessment metrics, the algorithms, and the database. We then describe how we extract the strengths and weaknesses of each algorithm and establish a global scoring. The insight we gained into the definition of assessment metrics is also presented. Finally, we suggest some promising directions for the development of road tracking algorithms and complementarities that should be sought. To conclude, we describe future improvements for the database constitution, the assessment tools, and the overall methodology.

  15. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-08-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve 10-day to 5-month sea ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett ice severity index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.

  16. Exploring the utility of quantitative network design in evaluating Arctic sea-ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-03-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve ten-day to five-month sea-ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett Ice Severity Index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea-ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.

  17. Safety evaluation of disposable baby diapers using principles of quantitative risk assessment.

    PubMed

    Rai, Prashant; Lee, Byung-Mu; Liu, Tsung-Yun; Yuhui, Qin; Krause, Edburga; Marsman, Daniel S; Felter, Susan

    2009-01-01

    Baby diapers are complex products consisting of multiple layers of materials, most of which are not in direct contact with the skin. The safety profile of a diaper is determined by the biological properties of individual components and the extent to which the baby is exposed to each component during use. Rigorous evaluation of the toxicological profile and realistic exposure conditions of each material is important to ensure the overall safety of the diaper under normal and foreseeable use conditions. Quantitative risk assessment (QRA) principles may be applied to the safety assessment of diapers and similar products. Exposure to component materials is determined by considering (1) the conditions of product use, (2) the degree to which individual layers of the product contact the skin during use, and (3) the extent to which some components may be extracted by urine and delivered to the skin. This assessment of potential exposure is then combined with data from standard safety assessments of components to determine the margin of safety (MOS). This study examined the application of QRA to the safety evaluation of baby diapers, including risk assessments for some diaper ingredient chemicals for which the establishment of acceptable and safe exposure levels was demonstrated.
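    The margin-of-safety arithmetic described above can be sketched as scaling the amount of an ingredient by its skin-available fraction and use frequency, normalizing by body weight, and dividing an acceptable dose by that exposure. Every number below is an invented placeholder, not a value from the study.

```python
# QRA-style margin-of-safety sketch: MOS = acceptable dose / estimated exposure.

def margin_of_safety(reference_dose, exposure):
    """Dimensionless ratio of an acceptable dose to the estimated exposure."""
    return reference_dose / exposure

# Hypothetical ingredient: 2.0 mg present per diaper, of which 5% can migrate
# to skin; an 8 kg infant uses 6 diapers per day.
amount_mg, extractable, diapers_per_day, body_kg = 2.0, 0.05, 6, 8.0
exposure = amount_mg * extractable * diapers_per_day / body_kg  # mg/kg/day

reference_dose = 1.0   # mg/kg/day; an assumed NOAEL-derived acceptable level
mos = margin_of_safety(reference_dose, exposure)
# A MOS well above 1 (often a factor of 100 or more relative to a NOAEL)
# supports safe use under the stated exposure assumptions.
```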

  18. Quantitative evaluation of postsurgical inflammation by infrared radiation thermometer and laser flare-cell meter.

    PubMed

    Fujishima, H; Toda, I; Yagi, Y; Tsubota, K

    1994-07-01

    Using an infrared radiation thermometer and a laser flare-cell meter, we evaluated intraocular inflammation in 40 patients who had cataract surgery by measuring central corneal temperature, number of cells, and amount of flare in the anterior chamber. Patients were divided into two groups based on duration of surgery: Group A, more than 40 minutes; Group B, less than 40 minutes. In Group A (n = 32), corneal temperature (degrees Celsius) increased by 1.10 +/- 0.57, 0.75 +/- 0.69, 0.41 +/- 0.56, and 0.24 +/- 0.45 on days 1, 2, 14, and 30, respectively. Group B (n = 8) had no significant rise in corneal temperature, but cell count (mean +/- 1 SD) increased to 39.3 +/- 13.6, 36.4 +/- 18.1, 15.5 +/- 16.5, and 4.4 +/- 3.1 on days 1, 2, 7, and 14, respectively. Flare increased to 88.9 +/- 88.9, 45.8 +/- 30.1, 38.3 +/- 25.4, and 18.5 +/- 9.4 on days 2, 7, 14, and 30, respectively. These observations show that the longer the cataract surgery, the greater the inflammation. Although inflammation was evaluated quantitatively by both infrared radiation thermometer and laser flare-cell meter, the latter appears to be more sensitive. Thermometry will only detect the results of very traumatic surgery, with a corresponding breakdown of the blood-aqueous barrier.

  19. A quantitative and standardized robotic method for the evaluation of arm proprioception after stroke.

    PubMed

    Simo, Lucia S; Ghez, Claude; Botzer, Lior; Scheidt, Robert A

    2011-01-01

    Stroke often results in both motor and sensory deficits, which may interact in the manifested functional impairment. Proprioception is known to play important roles in the planning and control of limb posture and movement; however, the impact of proprioceptive deficits on motor function has been difficult to elucidate due in part to the qualitative nature of available clinical tests. We present a quantitative and standardized method for evaluating proprioception in tasks directly relevant to those used to assess motor function. Using a robotic manipulandum that exerted controlled displacements of the hand, stroke participants were evaluated, and compared with a control group, in their ability to detect such displacements in a 2-alternative, forced-choice paradigm. A psychometric function parameterized the decision process underlying the detection of the hand displacements. The shape of this function was determined by a signal detection threshold and by the variability of the response about this threshold. Our automatic procedure differentiates between participants with and without proprioceptive deficits and quantifies functional proprioceptive sensation on a magnitude scale that is meaningful for ongoing studies of degraded motor function in comparable horizontal movements.
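    The decision model described above, a psychometric function parameterized by a detection threshold and a variability term, is commonly modeled as a cumulative Gaussian. The sketch below evaluates such a function for a hypothetical participant; the parameter values are illustrative assumptions, not the study's fitted estimates.

```python
# Cumulative-Gaussian psychometric function for a 2-alternative forced-choice
# displacement-detection task: threshold sets the 50% point, sigma the spread.
import math

def psychometric(x, threshold, sigma):
    """P(respond 'displaced') for a hand displacement of size x."""
    return 0.5 * (1.0 + math.erf((x - threshold) / (sigma * math.sqrt(2.0))))

threshold, sigma = 1.0, 0.4        # cm; hypothetical participant parameters

p_at_threshold = psychometric(1.0, threshold, sigma)  # 0.5 by construction
p_large = psychometric(2.5, threshold, sigma)         # near 1 for large moves
# A participant with proprioceptive deficits would show a larger threshold
# and/or a larger sigma (a shallower, right-shifted curve).
```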

  20. A quantitative health assessment index for rapid evaluation of fish condition in the field

    SciTech Connect

    Adams, S.M. ); Brown, A.M. ); Goede, R.W. )

    1993-01-01

    The health assessment index (HAI) is an extension and refinement of a previously published field necropsy system. The HAI is a quantitative index that allows statistical comparisons of fish health among data sets. Index variables are assigned numerical values based on the degree of severity or damage incurred by an organ or tissue from environmental stressors. This approach has been used to evaluate the general health status of fish populations in a wide range of reservoir types in the Tennessee River basin (North Carolina, Tennessee, Alabama, Kentucky), in Hartwell Reservoir (Georgia, South Carolina), which is contaminated by polychlorinated biphenyls, and in the Pigeon River (Tennessee, North Carolina), which receives effluents from a bleached kraft mill. The ability of the HAI to accurately characterize the health of fish in these systems was evaluated by comparing the index to other types of fish health measures (contaminant, bioindicator, and reproductive analyses) made at the same time as the HAI. In all cases, the HAI demonstrated the same pattern of fish health status between sites as each of the other, more sophisticated health assessment methods. The HAI has proven to be a simple and inexpensive means of rapidly assessing general fish health in field situations. 29 refs., 5 tabs.
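    An additive index of this kind can be sketched as assigning each organ or tissue a numeric severity value and summing them into one score per fish, so that site means can be compared statistically. The tissue names, scoring scale, and values below are invented for illustration, not the published HAI variable set.

```python
# Sketch of an additive fish-health index: per-tissue severity values
# (0 = normal, larger = more damage) summed into one score per fish.

def hai_score(fish):
    """Total index value: sum of per-tissue severity assignments."""
    return sum(fish.values())

# Invented necropsy observations for two fish at one site:
fish_a = {"gills": 0, "liver": 30, "spleen": 0, "fins": 10, "eyes": 0}
fish_b = {"gills": 30, "liver": 30, "spleen": 20, "fins": 10, "eyes": 0}

site_scores = [hai_score(f) for f in (fish_a, fish_b)]  # [40, 90]
mean_hai = sum(site_scores) / len(site_scores)          # 65.0
# A higher mean index at a site indicates poorer average fish health; sites
# can then be compared with standard statistics on these per-fish scores.
```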