Science.gov

Sample records for quantitative performance evaluation

  1. Study on the performance evaluation of quantitative precipitation estimation and quantitative precipitation forecast

    NASA Astrophysics Data System (ADS)

    Yang, H.; Chang, K.; Suk, M.; cha, J.; Choi, Y.

    2011-12-01

Rainfall estimation and short-term (several hours) quantitative precipitation forecasting based on meteorological radar data are among the most intensely studied topics in radar meteorology. The Korean Peninsula has a horizontally narrow land area and complex topography with many mountains, so its rainfall systems change rapidly in many cases. Quantitative precipitation estimation (QPE) and quantitative precipitation forecasts (QPF) provide crucial information for severe weather warning and water management. We have conducted a performance evaluation of the QPE/QPF products of the Korea Meteorological Administration (KMA), the first step towards optimizing the QPE/QPF system in South Korea. The real-time adjusted RAR (Radar-AWS-Rainrate) system agrees better with the observed rain rate than the fixed Z-R relation does, and an additional bias correction of RAR yields slightly better results. A correlation coefficient of R2 = 0.84 is obtained between the daily accumulated observed and RAR-estimated rainfall. RAR will thus be suitable for hydrological applications such as water budget studies. The VSRF (Very Short Range Forecast) shows better performance than MAPLE (McGill Algorithm for Precipitation Nowcasting by Lagrangian Extrapolation) within the first 40 minutes, whereas MAPLE outperforms VSRF after 40 minutes; for hourly forecasts, MAPLE shows the better performance. QPE and QPF are therefore most meaningful for nowcasting (1-2 hours), complementing model forecasts; forecasts longer than 3 hours from a meteorological model are especially useful for applications such as water management.
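The R2 figure reported above reduces to a standard coefficient-of-determination computation on paired daily accumulations. As a minimal sketch (the rainfall values below are synthetic illustrations, not KMA verification data):

```python
# Hedged sketch: coefficient of determination (R^2) between daily
# accumulated gauge-observed rainfall and radar-estimated rainfall.
# The values below are synthetic, not KMA data.
def r_squared(observed, estimated):
    """R^2 = 1 - SS_res / SS_tot for paired accumulations (mm)."""
    mean_obs = sum(observed) / len(observed)
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    ss_res = sum((o - e) ** 2 for o, e in zip(observed, estimated))
    return 1.0 - ss_res / ss_tot

obs = [12.0, 30.5, 8.2, 55.1, 22.3, 40.0]  # gauge accumulations (mm)
est = [11.0, 28.9, 9.5, 52.7, 24.1, 38.2]  # radar (RAR-style) estimates
print(round(r_squared(obs, est), 3))
```

An R2 near 1 indicates that the radar estimates track the gauge accumulations closely, as in the daily-accumulation comparison above.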

  2. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology through the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and skills-lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants in the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator, and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with them. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of what an integrated curriculum for the reasonable and cost-effective assessment of an interventional neuroradiologist's key competencies could look like. In addition to traditional assessment of theoretical knowledge, practical skills are measured using endovascular simulators, yielding objective, quantitative, and constructive data for evaluating both the current performance status of participants and the evolution of their technical competency over time. PMID:26848840

  3. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

Recently, the number of papers on SFC has increased drastically, but few have truly focused on the quantitative performance of the technique. In order to demonstrate the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method, from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated against UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of robust methods. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated with the total-error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method gives accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, the UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349

  4. Quantitative evaluation of carbon isotopic fractionation during reversed-phase high-performance liquid chromatography.

    PubMed

    Caimi, R J; Brenna, J T

    1997-01-01

The fractionation of 13C during low-performance preparative LC and high-performance LC is reported quantitatively for methyl palmitate using high-precision isotope ratio mass spectrometry (IR-MS). For both preparative and high-performance analytical columns, 13C enrichment is about 7% greater than in the parent starting material, drops sharply in the first section of the peak, and then settles to a value about 1% below that of the starting material. Recycling over a single HPLC column did not induce greater fractionation. These results emphasize the importance of quantitative peak collection for high-precision IR-MS studies, particularly of the first part of the peak, where the isotope ratio changes rapidly. PMID:9025266

  5. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems

    PubMed Central

    Junker, Astrid; Muraya, Moses M.; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E.; Meyer, Rhonda C.; Riewe, David; Altmann, Thomas

    2015-01-01

Detailed and standardized protocols for plant cultivation under environmentally controlled conditions are an essential prerequisite for conducting reproducible experiments with precisely defined treatments. Setting up appropriate and well-defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high-throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify the performance dynamics of several hundred plants at a time. Compared to small-scale plant cultivation, HT systems place much higher demands, from a conceptual and a logistic point of view, on the experimental design, the actual plant cultivation conditions, and the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed so that they elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested and optimized with regard to growth substrate, soil coverage, watering regime, and experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system matched well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants, and thereby provide guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data on phenotypic variation for a broad range of applications.

  6. (Un)awareness of unilateral spatial neglect: a quantitative evaluation of performance in visuo-spatial tasks.

    PubMed

    Ronchi, Roberta; Bolognini, Nadia; Gallucci, Marcello; Chiapella, Laura; Algeri, Lorella; Spada, Maria Simonetta; Vallar, Giuseppe

    2014-12-01

Right-brain-damaged patients with unilateral spatial neglect are typically unaware of (anosognosic for) their spatial deficits. However, the scientific literature lacks a systematic, quantitative evaluation of this kind of unawareness, despite the negative impact of anosognosia on rehabilitation programs. This study investigated anosognosia for neglect-related impairments across different clinical tasks by means of a quantitative assessment. Patients were tested in two conditions (before and after execution of each task) in order to evaluate changes in the level of awareness of neglect-related behaviours triggered by task execution. Twenty-nine right-brain-damaged patients (17 with left spatial neglect) and 27 neurologically unimpaired controls entered the study. Anosognosia for spatial deficits was not pervasive: different tasks evoked different degrees of awareness of neglect symptoms. Indeed, patients showed largely preserved awareness of their performance in complex visuo-motor spatial and reading tasks; conversely, they were impaired in evaluating their spatial difficulties in line bisection and drawing from memory, over-estimating their performance. The selectivity of the patients' unawareness of specific manifestations of spatial neglect is further supported by their preserved awareness of performance on a linguistic task and by the absence of anosognosia for hemiplegia. This evidence indicates that discrete processes are involved in the aware monitoring of cognitive and motor performance, and that these can be selectively compromised by brain damage. Awareness of spatial difficulties is supported by a number of distinct components and influenced by the specific skills required to perform a given task.

  7. Quantitative evaluation of the performance of an industrial benchtop enclosing hood.

    PubMed

    He, Xinjian Kevin; Guffey, Steven E

    2013-01-01

Plain benchtop enclosing hoods are assumed to be highly effective in protecting workers from airborne contaminants, but there is little published research to support or rebut that assumption. The purpose of this research was to investigate the performance of a 36 in. wide, 30 in. high, and 40 in. deep benchtop enclosing hood. The study consisted of two parts: (1) investigating the effects of hood face velocity (five levels: 111, 140, 170, 200, and 229 ft/min) and wind tunnel cross-draft velocity (five levels: 14, 26, 36, 46, and 57 ft/min) on a plain benchtop enclosing hood, and (2) studying the effects of specific interventions (no-intervention, collar flange, bottom flange, cowling, and sash) added onto the same enclosing hood. A tracer gas method was used to study the hood's performance inside a 9 ft high, 12 ft wide, and 40 ft long wind tunnel. Freon-134a concentrations were measured at the mouth and nose of an anthropometrically scaled, heated, breathing manikin holding a source between its hands while standing at the enclosing hood's face. Roughly 3 L/min of pure Freon-134a mixed with 9 L/min of helium was released from the source during all tests. Results showed that hood face velocity, wind tunnel cross-draft velocity, and interventions had statistically significant effects (p < 0.05) on the concentrations measured at the manikin's breathing zone. Lower exposures were associated with higher face velocities and higher cross-draft velocities. The highest exposures occurred when the face velocity was at its lowest test value (111 ft/min) and the cross-draft velocity was at its lowest test value (14 ft/min). For the interventions at the hood face, the results showed that flanges and the cowling failed to consistently reduce exposures and often exacerbated them. However, the customized sash reduced exposures to less than the detection limit of 0.1 ppm, so a similar sash should be considered when feasible. The hood face velocity should be at least 150 ft/min.

  8. Fingerprint analysis, multi-component quantitation, and antioxidant activity for the quality evaluation of Salvia miltiorrhiza var. alba by high-performance liquid chromatography and chemometrics.

    PubMed

    Zhang, Danlu; Duan, Xiaoju; Deng, Shuhong; Nie, Lei; Zang, Hengchang

    2015-10-01

Salvia miltiorrhiza Bge. var. alba C.Y. Wu and H.W. Li has wide prospects in clinical practice. A comprehensive method was developed for the quality evaluation of S. miltiorrhiza var. alba based on three quantitative parameters: the high-performance liquid chromatography fingerprint, the contents of ten components, and antioxidant activity. The established method was validated for linearity, precision, repeatability, stability, and recovery. Principal component analysis and hierarchical clustering analysis were both used to evaluate the quality of samples from different origins. The results showed category discrepancies in the quality of S. miltiorrhiza var. alba samples according to the three quantitative parameters. Multivariate linear regression was adopted to explore the relationship between the components and antioxidant activity. Three constituents, namely danshensu, rosmarinic acid, and salvianolic acid B, correlated significantly with antioxidant activity and were successfully elucidated by the optimized multivariate linear regression model. The combined use of high-performance liquid chromatography fingerprint analysis, simultaneous multi-component quantitative analysis, and antioxidant activity for the quality evaluation of S. miltiorrhiza var. alba is a reliable, comprehensive, and promising approach, which might provide a valuable reference for improving the quality control of herbal products in general.

  9. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

We propose a single quantitative metric, the disease evaluation factor (DEF), and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, onto which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated as a multidimensional weighted distance between a given subject's eigencoordinates and the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots and Kolmogorov-Smirnov and χ2 tests to compare the DEF values and to test whether their distribution was normal. Using a linear discriminant test to separate CTRL from probable AD on the basis of the DEF alone, we reached an accuracy of 87%. Such a quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
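As an illustration of the DEF idea only (not the authors' implementation), each subject can be scored by a weighted distance of its component coordinates from the control-group mean, and the two groups separated with a simple threshold. All coordinates, weights, and group assignments below are synthetic:

```python
# Hedged sketch of a DEF-like score: weighted distance of a subject's
# component coordinates from the control-group mean, followed by a
# threshold classifier. All data below are synthetic illustrations.
def def_score(coords, ctrl_mean, weights):
    # weighted Euclidean distance along the salient components
    return sum(w * (c - m) ** 2
               for w, c, m in zip(weights, coords, ctrl_mean)) ** 0.5

def accuracy(ctrl, patients, weights):
    n, dims = len(ctrl), len(ctrl[0])
    ctrl_mean = [sum(s[i] for s in ctrl) / n for i in range(dims)]
    ctrl_scores = [def_score(s, ctrl_mean, weights) for s in ctrl]
    pat_scores = [def_score(s, ctrl_mean, weights) for s in patients]
    # threshold midway between the two groups' mean scores
    thr = (sum(ctrl_scores) / len(ctrl_scores)
           + sum(pat_scores) / len(pat_scores)) / 2
    correct = (sum(s <= thr for s in ctrl_scores)
               + sum(s > thr for s in pat_scores))
    return correct / (len(ctrl) + len(patients))

ctrl = [(0.1, 0.2), (-0.2, 0.0), (0.0, -0.1), (0.2, 0.1)]  # synthetic CTRL
ad = [(1.1, 0.9), (0.9, 1.2), (1.3, 1.0), (1.0, 1.1)]      # synthetic AD
print(accuracy(ctrl, ad, weights=(1.0, 0.5)))
```

On well-separated synthetic groups the threshold classifies every subject correctly; on real eigencoordinates, overlap between groups would lower the accuracy toward figures like the 87% reported above.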

  10. Quantitative evaluation of dermatological antiseptics.

    PubMed

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus.
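The EN 1276-style pass criterion described above reduces to simple arithmetic on viable counts. A minimal sketch (the colony counts are illustrative, not the study's data):

```python
import math

# Hedged sketch of an EN 1276-style pass criterion: an antiseptic
# passes if it achieves at least a 5 log10 reduction in viable count
# within the contact time. Counts below are illustrative only.
def log10_reduction(initial_cfu_per_ml, surviving_cfu_per_ml):
    return math.log10(initial_cfu_per_ml / surviving_cfu_per_ml)

def passes_en1276(initial_cfu_per_ml, surviving_cfu_per_ml):
    return log10_reduction(initial_cfu_per_ml, surviving_cfu_per_ml) >= 5.0

print(passes_en1276(1e8, 1e2))  # 6 log10 reduction -> passes
print(passes_en1276(1e8, 1e4))  # only 4 log10 reduction -> fails
```

This makes concrete why an agent leaving even 0.01% of the inoculum viable (a 4 log10 reduction) still fails the standard.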

  11. Performance evaluation of Laser Induced Breakdown Spectroscopy (LIBS) for quantitative analysis of rare earth elements in phosphate glasses

    NASA Astrophysics Data System (ADS)

    Devangad, Praveen; Unnikrishnan, V. K.; Nayak, Rajesh; Tamboli, M. M.; Muhammed Shameem, K. M.; Santhosh, C.; Kumar, G. A.; Sardar, D. K.

    2016-02-01

In the current study, we determined the elemental compositions of synthesized rare-earth-doped phosphate glasses using a laboratory Laser-Induced Breakdown Spectroscopy (LIBS) system. LIBS spectra of these rare earth (samarium (Sm), thulium (Tm), and ytterbium (Yb)) doped glass samples of known composition were recorded using a highly sensitive detector. The major atomic emission lines of Sm, Tm, and Yb found in the LIBS spectra are reported. Using an atomic emission line of phosphorus as an internal standard, calibration curves were constructed for all the rare earth concentrations, yielding very good linear regression coefficients (R2). The analytical predictive skill of LIBS was further studied using the leave-one-out method. The low correlation uncertainty between the measured LIBS concentration ratios and the certified concentration ratios confirms that the LIBS technique has great potential for quantitative analysis of rare earth elements in a glass matrix.
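The internal-standard calibration and leave-one-out check described above can be sketched as follows; the concentration and intensity ratios are synthetic stand-ins, not the paper's measurements:

```python
# Hedged sketch of internal-standard calibration: fit line-intensity
# ratio (analyte line / phosphorus line) against known concentration
# ratio, then assess predictive skill by leave-one-out. All numbers
# below are synthetic.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return my - slope * mx, slope  # intercept, slope

def loo_rmse(conc_ratio, intensity_ratio):
    errs = []
    for i in range(len(conc_ratio)):
        xs = conc_ratio[:i] + conc_ratio[i + 1:]
        ys = intensity_ratio[:i] + intensity_ratio[i + 1:]
        a, b = fit_line(xs, ys)
        # invert the calibration to predict the held-out concentration
        pred = (intensity_ratio[i] - a) / b
        errs.append((pred - conc_ratio[i]) ** 2)
    return (sum(errs) / len(errs)) ** 0.5

conc = [0.5, 1.0, 1.5, 2.0, 2.5]        # known analyte/P concentration ratios
inten = [0.52, 1.01, 1.49, 2.03, 2.48]  # measured line-intensity ratios
print(round(loo_rmse(conc, inten), 3))
```

A small leave-one-out RMSE relative to the calibration range is what the low correlation uncertainty reported above corresponds to.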

  12. Imaging Performance of Quantitative Transmission Ultrasound

    PubMed Central

    Lenox, Mark W.; Wiskin, James; Lewis, Matthew A.; Darrouzet, Stephen; Borup, David; Hsieh, Scott

    2015-01-01

    Quantitative Transmission Ultrasound (QTUS) is a tomographic transmission ultrasound modality that is capable of generating 3D speed-of-sound maps of objects in the field of view. It performs this measurement by propagating a plane wave through the medium from a transmitter on one side of a water tank to a high resolution receiver on the opposite side. This information is then used via inverse scattering to compute a speed map. In addition, the presence of reflection transducers allows the creation of a high resolution, spatially compounded reflection map that is natively coregistered to the speed map. A prototype QTUS system was evaluated for measurement and geometric accuracy as well as for the ability to correctly determine speed of sound. PMID:26604918

  13. Evaluation of the MicroWorks, Inc. Swab Sampling System (MSSSTM) for Use in Performing Quantitative Swab Sampling.

    PubMed

    Rubio, Sandy; McIver, Dawn; Behm, Natalie; Fisher, Madeline; Fleming, William

    2010-01-01

The purpose of this study was to qualify the MicroWorks, Inc. Swab Sampling System (MSSS™) swab kit for use in sampling cleanroom surfaces for bioburden. A six-part study was performed to demonstrate the suitability of the swab materials, the recovery of bioburden from typical cleanroom surfaces, the neutralization of typical disinfectants used in cleanrooms, the removal of diluents from the swabbed surface, and the hold time for test samples. A total of 13 challenge organisms were used: six National Collection of Type Cultures/American Type Culture Collection (NCTC/ATCC) standard culture organisms and seven environmental isolates recovered from different MedImmune manufacturing facilities. Based on the results of the study, 12 of the challenge organisms were recovered from the calcium alginate swab material and 13 from the sodium citrate diluent at ≥70%. Eleven organisms, including the six NCTC/ATCC organisms and five of the environmental organisms, were recovered from stainless steel, glass, polyvinylchloride curtain material, latex glove material, and neoprene at a rate of ≥70%. Effective neutralization was shown for LpH (an acid phenolic compound manufactured by Steris Corporation, Mentor, OH), Vesphene II, Spor-Klenz, 70% isopropyl alcohol (IPA), and Biocides B, X, and Y when utilizing the filtration/rinsing process; recovery of the six NCTC/ATCC organisms was demonstrated at ≥70%. The study also demonstrated that the diluents could easily be removed from the swabbed surface by following the swab with a 70% IPA wipe, and a hold time of at least 24 h was demonstrated when samples were stored at 2-8 °C. The results of this study demonstrate that the MSSS™ swab kit and the qualified test method recover ≥70% of surface bioburden from common cleanroom surfaces in the presence of a wide variety of disinfectants.

  14. Quantitative evaluation of oxidative stress, chronic inflammatory indices and leptin in cancer patients: correlation with stage and performance status.

    PubMed

    Mantovani, Giovanni; Macciò, Antonio; Madeddu, Clelia; Mura, Loredana; Gramignano, Giulia; Lusso, Maria Rita; Mulas, Carlo; Mudu, Maria Caterina; Murgia, Viviana; Camboni, Paolo; Massa, Elena; Ferreli, Luca; Contu, Paolo; Rinaldi, Augusto; Sanjust, Enrico; Atzei, Davide; Elsener, Bernhard

    2002-03-01

In advanced cancer patients, oxidative stress could take place either at the onset of disease or as a function of disease progression. To test this hypothesis, the following parameters were investigated: the erythrocyte activity of the enzymes superoxide dismutase (SOD) and glutathione peroxidase (GPx), the serum activity of glutathione reductase (GR), and the serum total antioxidant status (TAS). The total antioxidant capacity of plasma low-molecular-weight antioxidants (LMWA) was evaluated by cyclic voltammetry. We further determined the serum levels of the proinflammatory cytokines IL-6 and TNFalpha, as well as IL-2, leptin, and C-reactive protein (CRP). All of these parameters were correlated with the most important clinical indices of the patients, such as stage of disease, ECOG performance status (PS), and clinical response. Eighty-two advanced-stage cancer patients and 36 healthy individuals serving as controls were included in the study. Our findings show that SOD activity was significantly higher, and GPx activity significantly lower, in cancer patients than in controls. Serum values of IL-6, TNFalpha, and CRP were significantly higher in patients than in controls, whereas serum leptin values were significantly lower. SOD activity increased significantly from Stage II/ECOG 0-1 to Stage IV/ECOG 0-1, whereas it decreased significantly in Stage IV/ECOG 3. GPx activity decreased significantly in Stage IV/ECOG 2-3. An inverse correlation between ECOG PS and serum leptin levels was found. Serum levels of IL-2 decreased from Stage II/ECOG 0-1 to Stage IV/ECOG 2-3. A direct correlation between Stage/ECOG PS and serum levels of both IL-6 and CRP was observed. Cisplatin administration induced a significant increase of GPx after 24 h. In conclusion, this is the first study to show that several "biological" parameters of cancer patients, such as antioxidant enzyme activity, cytokines, leptin, and CRP, strictly correlate with the most important clinical indices, namely stage and performance status.

  15. Fingerprint analysis and multi-ingredient quantitative analysis for quality evaluation of Xiaoyanlidan tablets by ultra high performance liquid chromatography with diode array detection.

    PubMed

    Tang, Dao-quan; Li, Zheng; Jiang, Xiang-lan; Li, Yin-jie; Du, Qian; Yang, Dong-zhi

    2014-08-01

    A rapid and sensitive ultra high performance liquid chromatography method with diode array detection was developed for the fingerprint analysis and simultaneous determination of seven active compounds in Xiaoyanlidan (XYLD) tablets. The chromatographic separations were obtained on an Agilent Eclipse plus C18 column (50 × 2.1 mm id, 1.8 μm) using gradient elution with water/formic acid (1%) and acetonitrile at a flow rate of 0.4 mL/min. Within 63 min, 36 peaks could be selected as the common peaks for fingerprint analysis to evaluate the similarities among several samples of XYLD tablets collected from different manufacturers. In quantitative analysis, seven compounds showed good regression (R > 0.9990) within test ranges and the recovery of the method was within the range of 95.9-104.3%. The method was successfully applied to the simultaneous determination of seven compounds in six batches of XYLD tablets. These results demonstrate that the combination of chromatographic fingerprint analysis and simultaneous multi-ingredient quantification using the ultra high performance liquid chromatography method with diode array detection offers a rapid, efficient, and reliable approach for quality evaluation of XYLD tablets.

  16. Quantitative and chemical fingerprint analysis for the quality evaluation of Isatis indigotica based on ultra-performance liquid chromatography with photodiode array detector combined with chemometric methods.

    PubMed

    Shi, Yan-Hong; Xie, Zhi-Yong; Wang, Rui; Huang, Shan-Jun; Li, Yi-Ming; Wang, Zheng-Tao

    2012-01-01

A simple and reliable ultra-performance liquid chromatography method with photodiode array detection (UPLC-PDA) was developed to control the quality of Radix Isatidis (the dried root of Isatis indigotica) by chemical fingerprint analysis and quantitative analysis of eight bioactive constituents: R,S-goitrin, progoitrin, epiprogoitrin, gluconapin, adenosine, uridine, guanosine, and hypoxanthine. In quantitative analysis, the eight components showed good regression (R > 0.9997) within the test ranges, and the recoveries ranged from 99.5% to 103.0%. The UPLC fingerprints of the Radix Isatidis samples were compared by chemometric procedures, including similarity analysis, hierarchical clustering analysis, and principal component analysis. These procedures classified Radix Isatidis and its finished products such that all samples could be successfully grouped into crude herbs, prepared slices, and the adulterant Baphicacanthis cusiae Rhizoma et Radix. The combination of quantitative and chromatographic fingerprint analysis can be used for the quality assessment of Radix Isatidis and its finished products.

  17. Performance evaluation of new automated hepatitis B viral markers in the clinical laboratory: two quantitative hepatitis B surface antigen assays and an HBV core-related antigen assay.

    PubMed

    Park, Yongjung; Hong, Duck Jin; Shin, Saeam; Cho, Yonggeun; Kim, Hyon-Suk

    2012-05-01

We evaluated quantitative hepatitis B surface antigen (qHBsAg) assays and a hepatitis B virus (HBV) core-related antigen (HBcrAg) assay. A total of 529 serum samples from patients with hepatitis B were tested. HBsAg levels were determined using the Elecsys (Roche Diagnostics, Indianapolis, IN) and Architect (Abbott Laboratories, Abbott Park, IL) qHBsAg assays. HBcrAg was measured using the Lumipulse HBcrAg assay (Fujirebio, Tokyo, Japan). Serum aminotransferases and HBV DNA were quantified using the Hitachi 7600 analyzer (Hitachi High-Technologies, Tokyo, Japan) and the Cobas AmpliPrep/Cobas TaqMan test (Roche), respectively. The precision of the qHBsAg and HBcrAg assays was assessed, and the linearity of the qHBsAg assays was verified. All assays showed good precision, with coefficients of variation between 4.5% and 5.3% except at some concentration levels. Both qHBsAg assays showed linearity from 0.1 to 12,000.0 IU/mL and correlated well with each other (r = 0.9934). HBsAg levels correlated with HBV DNA (r = 0.3373) and with HBcrAg (r = 0.5164), and HBcrAg also correlated with HBV DNA (r = 0.5198; P < .0001). These observations could provide impetus for further research to elucidate the clinical usefulness of the qHBsAg and HBcrAg assays.
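The correlation coefficients reported above are standard Pearson r values on paired marker levels. As a minimal sketch (the marker values below are synthetic, not the study's patient data):

```python
# Hedged sketch: Pearson correlation coefficient of the kind reported
# between HBsAg, HBcrAg, and HBV DNA levels. The paired values below
# are synthetic illustrations, not patient data.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

hbsag = [2.1, 3.4, 2.8, 4.0, 3.1, 2.5]    # synthetic log10 IU/mL
hbv_dna = [3.0, 5.1, 4.2, 6.3, 4.8, 3.6]  # synthetic log10 copies/mL
print(round(pearson_r(hbsag, hbv_dna), 3))
```

Moderate r values such as the 0.34-0.52 range above indicate that the markers track each other only loosely across patients, unlike the near-perfect agreement (r = 0.9934) between the two qHBsAg assays.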

  18. Quantitative roadmap of holographic media performance

    NASA Astrophysics Data System (ADS)

    Kowalski, Benjamin A.; McLeod, Robert R.

    2015-09-01

    For holographic photopolymer media, the "formula limit" concept enables facile calculation of the fraction of writing chemistry that is usefully patterned, and the fraction that is wasted. This provides a quantitative context to compare the performance of a diverse range of media formulations from the literature, using only information already reported in the original works. Finally, this analysis is extended to estimate the scope of achievable future performance improvements.

  19. C-arm cone beam CT guidance of sinus and skull base surgery: quantitative surgical performance evaluation and development of a novel high-fidelity phantom

    NASA Astrophysics Data System (ADS)

    Vescan, A. D.; Chan, H.; Daly, M. J.; Witterick, I.; Irish, J. C.; Siewerdsen, J. H.

    2009-02-01

    Surgical simulation has become a critical component of surgical practice and training in the era of high-precision image-guided surgery. While the ability to simulate surgery of the paranasal sinuses and skull base has been conventionally limited to 3D digital simulation or cadaveric dissection, we have developed novel methods employing rapid prototyping technology and 3D printing to create high-fidelity models from real patient images (CT or MR). Such advances allow creation of patient-specific models for preparation, simulation, and training before embarking on the actual surgery. A major challenge included the development of novel material formulations compatible with the rapid prototyping process while presenting anatomically realistic flexibility, cut-ability, drilling purchase, and density (CT number). Initial studies have yielded realistic models of the paranasal sinuses and skull base for simulation and training in image-guided surgery. The process of model development and material selection is reviewed along with the application of the phantoms in studies of high-precision surgery guided by C-arm cone-beam CT (CBCT). Surgical performance is quantitatively evaluated under CBCT guidance, with the high-fidelity phantoms providing an excellent test-bed for reproducible studies across a broad spectrum of challenging surgical tasks. Future work will broaden the atlas of models to include normal anatomical variations as well as a broad spectrum of benign and malignant disease. The role of high-fidelity models produced by rapid prototyping is discussed in the context of patient-specific case simulation, novel technology development (specifically CBCT guidance), and training of future generations of sinus and skull base surgeons.

  20. Influence of sulphur-fumigation on the quality of white ginseng: a quantitative evaluation of major ginsenosides by high performance liquid chromatography.

    PubMed

    Jin, Xin; Zhu, Ling-Ying; Shen, Hong; Xu, Jun; Li, Song-Lin; Jia, Xiao-Bin; Cai, Hao; Cai, Bao-Chang; Yan, Ru

    2012-12-01

    White ginseng is reportedly sulphur-fumigated during post-harvest handling. In the present study, the influence of sulphur-fumigation on the quality of white ginseng and its decoction was quantitatively evaluated through simultaneous quantification of 14 major ginsenosides by a validated high performance liquid chromatography method. A Poroshell 120 EC-C18 (100 mm × 3.0 mm, 2.7 μm) column was chosen for the separation of the major ginsenosides, which were eluted with a water-acetonitrile gradient as the mobile phase. The analytes were monitored by UV at 203 nm. The method was validated in terms of linearity, sensitivity, precision, accuracy and stability. Sulphur-fumigated and non-fumigated white ginseng samples, as well as their respective decoctions, were comparatively analysed with the newly validated method. The contents of the nine ginsenosides detected in the raw materials decreased individually by about 3-85% after sulphur-fumigation, and their total content decreased by almost 54%. In the decoctions of sulphur-fumigated white ginseng, the contents of the 10 ginsenosides detected decreased individually by about 33-83%, and the total ginsenoside content decreased by up to 64% compared with that of non-fumigated white ginseng. In addition, ginsenosides Rh2 and Rg5 could be detected in the decoctions of sulphur-fumigated white ginseng but not in those of non-fumigated white ginseng. These results suggest that sulphur-fumigation significantly influences not only the contents of the original ginsenosides, but also the decocting-induced chemical transformation of ginsenosides in white ginseng.

  1. Combination of quantitative analysis and chemometric analysis for the quality evaluation of three different frankincenses by ultra high performance liquid chromatography and quadrupole time of flight mass spectrometry.

    PubMed

    Zhang, Chao; Sun, Lei; Tian, Run-tao; Jin, Hong-yu; Ma, Shuang-Cheng; Gu, Bing-ren

    2015-10-01

    Frankincense has gained increasing attention in the pharmaceutical industry because of pharmacologically active components such as boswellic acids. However, methods for identification and overall quality evaluation of the three frankincense species recognized in different pharmacopoeias have rarely been reported. In this paper, quantitative analysis and chemometric evaluation were established and applied to the quality control of frankincense, with both analyses conducted under the same chromatographic conditions. In total, 55 samples from four habitats (covering three species) of frankincense were collected, and six boswellic acids were chosen for quantitative analysis. Chemometric analyses, namely similarity analysis, hierarchical cluster analysis, and principal component analysis, were used to discriminate the three species and to reveal the correlation between components and species. In addition, 12 chromatographic peaks were tentatively identified by comparison with reference substances and by quadrupole time-of-flight mass spectrometry. The results indicated that the total boswellic acid profiles of the three species are similar overall, yet their fingerprints can be used to differentiate between them. PMID:26228790
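    The chemometric step described here (principal component analysis plus hierarchical clustering of peak areas) can be sketched generically as below. Everything in the sketch, including the peak-area values and group means, is invented for illustration and is not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical peak-area matrix: rows = frankincense samples, columns = the
# six quantified boswellic-acid peaks (values are purely illustrative).
rng = np.random.default_rng(0)
species_a = rng.normal([5, 3, 2, 8, 1, 4], 0.3, size=(6, 6))
species_b = rng.normal([2, 6, 5, 3, 4, 1], 0.3, size=(6, 6))
X = np.vstack([species_a, species_b])

# Autoscale each peak (zero mean, unit variance) so no single acid dominates.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD: sample scores on the principal components.
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * S
explained = S**2 / np.sum(S**2)   # variance explained per component

# Hierarchical cluster analysis (Ward linkage, Euclidean distances).
Z = linkage(Xs, method="ward")
clusters = fcluster(Z, t=2, criterion="maxclust")
```

With clearly separated species profiles, the two-cluster cut recovers the two synthetic groups, mirroring how clustering separated the real species.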

  2. Quantitative evaluation of gait ataxia by accelerometers.

    PubMed

    Shirai, Shinichi; Yabe, Ichiro; Matsushima, Masaaki; Ito, Yoichi M; Yoneyama, Mitsuru; Sasaki, Hidenao

    2015-11-15

    An appropriate biomarker for spinocerebellar degeneration (SCD) has not been identified. Here, we performed gait analysis on patients with pure cerebellar type SCD and assessed whether the obtained data could be used as a neurophysiological biomarker for cerebellar ataxia. We analyzed 25 SCD patients, 25 patients with Parkinson's disease as a disease control, and 25 healthy control individuals. Acceleration signals during 6 min of walking and 1 min of standing were measured by two sets of triaxial accelerometers secured with a fixation vest to the middle of the lower and upper back of each subject. We extracted two gait parameters, the average and the coefficient of variation of the motion-trajectory amplitude, from each acceleration component. Each component was then analyzed for correlation with the Scale for the Assessment and Rating of Ataxia (SARA) and the Berg Balance Scale (BBS). Based on comparison with the gait of healthy subjects and on correlation with disease severity and specificity, our results suggest that the average medial-lateral amplitude (upper back) during straight gait is a physiological biomarker for cerebellar ataxia, and that gait analysis offers a quantitative and concise evaluation scale for the severity of cerebellar ataxia.
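    The two gait parameters, average and coefficient of variation of the per-stride amplitude, can be sketched as follows. The sampling rate, fixed-window stride segmentation, and synthetic signal are all illustrative assumptions, not the study's protocol:

```python
import numpy as np

def amplitude_stats(acc, fs=100, stride_s=1.0):
    """Mean and coefficient of variation (percent) of per-stride
    peak-to-peak amplitude for one acceleration component.
    fs: sampling rate in Hz; stride_s: assumed stride duration in s."""
    n = int(fs * stride_s)
    strides = [acc[i:i + n] for i in range(0, len(acc) - n + 1, n)]
    amps = np.array([s.max() - s.min() for s in strides])
    mean = amps.mean()
    cv = amps.std(ddof=1) / mean * 100.0
    return mean, cv

# Synthetic medial-lateral acceleration: sinusoidal sway plus noise.
rng = np.random.default_rng(1)
t = np.arange(0, 60, 0.01)
acc = 0.5 * np.sin(2 * np.pi * 1.0 * t) + 0.05 * rng.normal(size=t.size)
mean_amp, cv = amplitude_stats(acc)
```

A larger or more variable amplitude in the patient group than in controls would then be tested for correlation with SARA and BBS scores.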

  3. Apprentice Performance Evaluation.

    ERIC Educational Resources Information Center

    Gast, Clyde W.

    The Granite City (Illinois) Steel apprentices are under a performance evaluation from entry to graduation. Federally approved, the program is guided by joint apprenticeship committees whose monthly meetings include performance evaluation from three information sources: journeymen, supervisors, and instructors. Journeymen's evaluations are made…

  4. A Program to Evaluate Quantitative Analysis Unknowns

    ERIC Educational Resources Information Center

    Potter, Larry; Brown, Bruce

    1978-01-01

    Reports on a computer batch program that will not only perform routine grading using several grading algorithms, but will also calculate various statistical measures by which class performance can be evaluated, and collect cumulative data. (Author/CP)

  5. Influence of processing procedure on the quality of Radix Scrophulariae: a quantitative evaluation of the main compounds obtained by accelerated solvent extraction and high-performance liquid chromatography.

    PubMed

    Cao, Gang; Wu, Xin; Li, Qinglin; Cai, Hao; Cai, Baochang; Zhu, Xuemei

    2015-02-01

    An improved high-performance liquid chromatography with diode array detection combined with accelerated solvent extraction method was used to simultaneously determine six compounds in crude and processed Radix Scrophulariae samples. Accelerated solvent extraction parameters such as extraction solvent, temperature, number of cycles, and analysis procedure were systematically optimized. The results indicated that compared with crude Radix Scrophulariae samples, the processed samples had lower contents of harpagide and harpagoside but higher contents of catalpol, acteoside, angoroside C, and cinnamic acid. The established method was sufficiently rapid and reliable for the global quality evaluation of crude and processed herbal medicines.

  6. Chemical fingerprint and quantitative analysis for the quality evaluation of Vitex negundo seeds by reversed-phase high-performance liquid chromatography coupled with hierarchical clustering analysis.

    PubMed

    Shu, Zhiheng; Li, Xiuqing; Rahman, Khalid; Qin, Luping; Zheng, Chengjian

    2016-01-01

    A simple and efficient method was developed for the chemical fingerprint analysis and simultaneous determination of four phenylnaphthalene-type lignans in Vitex negundo seeds using high-performance liquid chromatography with diode array detection. For fingerprint analysis, 13 V. negundo seed samples were collected from different regions in China, and the fingerprint chromatograms were matched by the computer-aided Similarity Evaluation System for Chromatographic Fingerprint of TCM (Version 2004A). A total of 21 common peaks found in all the chromatograms were used to evaluate the similarity between these samples. Additionally, simultaneous quantification of four major bioactive ingredients was conducted to assess the quality of V. negundo seeds. Our results indicated that the contents of the four lignans varied remarkably among samples collected from different regions. Moreover, hierarchical clustering analysis grouped the 13 samples into three categories, consistent with the chemical patterns of their chromatograms. The method developed in this study provides a solid foundation for the establishment of reasonable quality control standards for V. negundo seeds.
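    The similarity matching between fingerprint chromatograms can be illustrated with a simple cosine measure over common-peak areas. This is a generic stand-in for the dedicated Similarity Evaluation System, and the peak-area vectors are invented:

```python
import numpy as np

def fingerprint_similarity(a, b):
    """Cosine similarity between two chromatographic fingerprints,
    each represented as a vector of common-peak areas."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical reference (mean) fingerprint and one sample fingerprint:
# the sample is a scaled copy with small perturbations, so similarity
# should be close to 1.
ref = np.array([1.0, 0.8, 0.5, 2.1, 0.3])
sample = ref * 1.2 + np.array([0.02, -0.01, 0.0, 0.05, 0.01])
sim = fingerprint_similarity(ref, sample)
```

Cosine similarity is insensitive to overall concentration scaling, which is why proportionally similar chromatograms score near 1 even when absolute peak areas differ.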

  7. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  8. A QUANTITATIVE TECHNIQUE FOR PERFORMING PLASMAPHERESIS

    PubMed Central

    Melnick, Daniel; Cowgill, George R.

    1936-01-01

    1. A special apparatus and technique are described which permit one to conduct plasmapheresis quantitatively. 2. The validity of the methods employed, for determining serum protein concentration and blood volume as prerequisites for the calculation of the amount of blood to be withdrawn, are discussed. PMID:19870575

  9. Evaluating Performance of Components

    NASA Technical Reports Server (NTRS)

    Katz, Daniel; Tisdale, Edwin; Norton, Charles

    2004-01-01

    Parallel Component Performance Benchmarks is a computer program developed to aid the evaluation of the Common Component Architecture (CCA) - a software architecture, based on a component model, that was conceived to foster high-performance computing, including parallel computing. More specifically, this program compares the performances (principally by measuring computing times) of componentized versus conventional versions of the Parallel Pyramid 2D Adaptive Mesh Refinement library - a software library that is used to generate computational meshes for solving physical problems and that is typical of software libraries in use at NASA's Jet Propulsion Laboratory.
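    Comparing componentized versus conventional versions by measuring computing times can be sketched with a generic timing harness. The two workloads below are stand-ins for illustration, not the AMR library:

```python
import time
import statistics

def benchmark(fn, *args, repeats=5):
    """Median wall-clock time of fn(*args) over several repeats
    (the median damps one-off scheduling noise)."""
    times = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn(*args)
        times.append(time.perf_counter() - t0)
    return statistics.median(times)

# Stand-ins for the conventional vs componentized versions of a routine:
# same computation, with an extra call layer modeling component overhead.
def conventional(n):
    return sum(i * i for i in range(n))

def componentized(n):
    def square(i):
        return i * i
    return sum(square(i) for i in range(n))

t_conv = benchmark(conventional, 100_000)
t_comp = benchmark(componentized, 100_000)
overhead_pct = (t_comp - t_conv) / t_conv * 100.0
```

The percentage overhead of the componentized path relative to the conventional one is the kind of figure such a benchmark reports.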

  10. Quantitative criteria for assessment of gamma-ray imager performance

    NASA Astrophysics Data System (ADS)

    Gottesman, Steve; Keller, Kristi; Malik, Hans

    2015-08-01

    In recent years, gamma-ray imagers such as the GammaCam™ and Polaris have demonstrated good imaging performance in the field. Imager performance is often summarized as "resolution", either angular or spatial at some distance from the imager; however, the definition of resolution is not always related to the ability to image an object, and it is difficult to compare imagers quantitatively without a common definition of image quality. This paper examines three categories of definition: point source, line source, and area source. It discusses the details of those definitions and which are more relevant in different situations. Metrics such as full width at half maximum (FWHM), variations on the Rayleigh criterion, and metrics analogous to the National Imagery Interpretability Rating Scale (NIIRS) are discussed. Performance against these metrics is evaluated for a high-resolution coded-aperture imager modeled using the Monte Carlo N-Particle (MCNP) code, and for a medium-resolution imager measured in the lab.
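    FWHM, the first metric named above, can be computed from a sampled point-spread profile by interpolating the half-maximum crossings. This is a generic sketch, not the paper's implementation; the Gaussian profile and its width are invented for the check:

```python
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a sampled, single-peaked profile,
    using linear interpolation at the two half-maximum crossings."""
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i0, i1 = above[0], above[-1]
    # Left crossing: y rises through half between i0-1 and i0.
    xl = np.interp(half, [y[i0 - 1], y[i0]], [x[i0 - 1], x[i0]])
    # Right crossing: y falls through half between i1 and i1+1.
    xr = np.interp(half, [y[i1 + 1], y[i1]], [x[i1 + 1], x[i1]])
    return xr - xl

# Gaussian profile with sigma = 2: analytic FWHM = 2*sqrt(2*ln 2)*sigma.
x = np.linspace(-10, 10, 2001)
sigma = 2.0
y = np.exp(-x**2 / (2 * sigma**2))
measured = fwhm(x, y)
```

For a Gaussian the measured value should match 2·√(2 ln 2)·σ ≈ 2.355σ, which is a useful self-check before applying the routine to modeled or measured imager profiles.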

  11. Assessment beyond Performance: Phenomenography in Educational Evaluation

    ERIC Educational Resources Information Center

    Micari, Marina; Light, Gregory; Calkins, Susanna; Streitwieser, Bernhard

    2007-01-01

    Increasing calls for accountability in education have promoted improvements in quantitative evaluation approaches that measure student performance; however, this has often been to the detriment of qualitative approaches, reducing the richness of educational evaluation as an enterprise. In this article the authors assert that it is not merely…

  12. Quantitative performance assessments for neuromagnetic imaging systems.

    PubMed

    Koga, Ryo; Hiyama, Ei; Matsumoto, Takuya; Sekihara, Kensuke

    2013-01-01

    We have developed a Monte Carlo simulation method to assess the performance of neuromagnetic imaging systems using two kinds of performance metrics: the A-prime metric and spatial resolution. We compute these metrics for virtual sensor systems having 80, 160, 320, and 640 sensors, and discuss how system performance improves with the number of sensors. We also compute these metrics for existing whole-head MEG systems: MEGvision™ (Yokogawa Electric Corporation, Tokyo, Japan), which uses axial-gradiometer sensors, and TRIUX™ (Elekta, Stockholm, Sweden), which uses planar-gradiometer and magnetometer sensors. We discuss performance comparisons between these significantly different systems. PMID:24110711
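    One common definition of an A-prime statistic is the nonparametric A' computed from a hit rate and a false-alarm rate (Pollack and Norman's formula). The paper's exact metric for source detection may differ, so treat the formula below as an assumption rather than the authors' definition:

```python
def a_prime(hit_rate, fa_rate):
    """Nonparametric A' from a hit rate H and false-alarm rate F.
    A' = 0.5 at chance, approaches 1.0 for perfect detection."""
    h, f = hit_rate, fa_rate
    if h == f:
        return 0.5
    if h > f:
        return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
    # Below-chance case: mirror of the formula above.
    return 0.5 - ((f - h) * (1 + f - h)) / (4 * f * (1 - h))
```

For example, a system detecting 90% of true sources with a 10% false-alarm rate scores A' ≈ 0.94, so the metric gives a single number for comparing sensor configurations.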

  14. A Quantitative Evaluation of Dissolved Oxygen Instrumentation

    NASA Technical Reports Server (NTRS)

    Pijanowski, Barbara S.

    1971-01-01

    The implications of the presence of dissolved oxygen in water are discussed in terms of its deleterious or beneficial effects, depending on the functional consequences to those affected, e.g., the industrialist, the oceanographer, and the ecologist. The paper is devoted primarily to an examination of the performance of five commercially available dissolved oxygen meters. The design of each is briefly reviewed and ease or difficulty of use in the field described. Specifically, the evaluation program treated a number of parameters and user considerations including an initial check and trial calibration for each instrument and a discussion of the measurement methodology employed. Detailed test results are given relating to the effects of primary power variation, water-flow sensitivity, response time, relative accuracy of dissolved-oxygen readout, temperature accuracy (for those instruments which included this feature), error and repeatability, stability, pressure and other environmental effects, and test results obtained in the field. Overall instrument performance is summarized comparatively by chart.

  15. Performance of calibration standards for antigen quantitation with flow cytometry.

    PubMed

    Lenkei, R; Gratama, J W; Rothe, G; Schmitz, G; D'hautcourt, J L; Arekrans, A; Mandy, F; Marti, G

    1998-10-01

    Within the framework of activities initiated by the Task Force for Antigen Quantitation of the European Working Group on Clinical Cell Analysis (EWGCCA), an experiment was conducted to evaluate microbead standards used for quantitative flow cytometry (QFCM). A unified window of analysis (UWA) was established on three different instruments (EPICS XL [Coulter Corporation, Miami, FL], FACScan and FACSCalibur [Becton Dickinson, San Jose, CA]) with QC3 microbeads (FCSC, PR). Using this defined fluorescence intensity scale, the performance of several monoclonal antibodies directed to CD3, CD4, and CD8 (conjugated and unconjugated) from three manufacturers (BDIS, Coulter [Immunotech], and DAKO) was tested. In addition, the QIFI system (DAKO), QuantiBRITE (BDIS), and a method of relative fluorescence intensity (RFI, the method of Giorgi) were compared. mAbs reacting with three additional antigens, CD16, CD19, and CD38, were tested on the FACScan instrument. Quantitation was carried out using a single batch of cryopreserved peripheral blood leukocytes, and all tests were performed as single-color analyses. Significant correlations were observed between the antibody-binding capacity (ABC) values of the same CD antigen measured with the various calibrators and with antibodies differing with respect to vendor, labeling, and possibly epitope recognition. Despite these correlations, the ABC values of most monoclonal antibodies differed by 20-40% when determined with different fluorochrome conjugates and different calibrators. The results of this study indicate that, at the present stage of QFCM, consistent ABC values may be attained between laboratories provided that a specific calibration system is used, including specific calibrators, reagents, and protocols.
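    A typical bead-based ABC calibration fits a line on log-log axes between the beads' known antibody-binding capacities and their measured fluorescence intensities, then inverts the fit for cell populations. The sketch below uses invented bead values, not those of QC3, QIFI, or QuantiBRITE:

```python
import numpy as np

# Hypothetical calibration beads: assigned ABC values and their measured
# mean fluorescence intensities (MFI) on one instrument (invented numbers).
bead_abc = np.array([4_000, 25_000, 65_000, 240_000], float)
bead_mfi = np.array([12.0, 70.0, 180.0, 640.0])

# Fit log MFI = a * log ABC + b (calibration is near-linear on log-log axes).
a, b = np.polyfit(np.log10(bead_abc), np.log10(bead_mfi), 1)

def mfi_to_abc(mfi):
    """Convert a stained population's MFI to an ABC estimate
    by inverting the bead calibration line."""
    return 10 ** ((np.log10(mfi) - b) / a)

cd4_abc = mfi_to_abc(130.0)   # e.g. a hypothetical CD4 population
```

The 20-40% inter-calibrator spread reported above corresponds to different bead sets and conjugates shifting the slope and intercept of exactly this kind of calibration line.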

  16. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis in a systematic manner and provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real time, which allows the clinician to continually track a patient's progress and adapt the therapy protocol appropriately. Creating such an evaluation is difficult because of the sparsity of recorded clinical observations, the high dimensionality of movement, and the high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ_i w_i φ_i(x_i), and estimating each attribute-normalization function φ_i(·) by integrating distributions of idealized and deviated movement. The weights w_i are derived from a therapist's pairwise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement in stroke survivors with excellent results: the evaluation results are highly correlated with the therapist's observations.
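    The evaluation function y = Σ_i w_i φ_i(x_i) can be illustrated with a toy example. The Gaussian normalization, attribute names, ideals, and weights below are all placeholders; in the paper φ_i is learned from distributions of idealized versus deviated movement and w_i comes from RankSVM on therapist comparisons:

```python
import numpy as np

def normalize(value, ideal_mean, ideal_sd):
    """Map a kinematic attribute to [0, 1]: near 1 close to the idealized
    movement, falling off with deviation (a Gaussian stand-in for phi_i)."""
    return float(np.exp(-0.5 * ((value - ideal_mean) / ideal_sd) ** 2))

def composite_score(attributes, weights, ideals):
    """y = sum_i w_i * phi_i(x_i), the composite movement-quality measure."""
    phis = [normalize(x, m, s) for x, (m, s) in zip(attributes, ideals)]
    return float(np.dot(weights, phis))

# Illustrative attributes: reach duration (s), peak-speed ratio, trunk lean (deg).
ideals = [(1.2, 0.3), (0.45, 0.1), (5.0, 4.0)]
weights = np.array([0.5, 0.3, 0.2])   # sums to 1, so the score stays in [0, 1]

good = composite_score([1.25, 0.44, 6.0], weights, ideals)   # near-ideal reach
poor = composite_score([2.6, 0.2, 25.0], weights, ideals)    # impaired reach
```

Because each φ_i is reported alongside y, a clinician can see which attribute (e.g. excessive trunk lean) pulled the composite score down, which is benefit (b) above.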

  17. A study on the quantitative evaluation of skin barrier function

    NASA Astrophysics Data System (ADS)

    Maruyama, Tomomi; Kabetani, Yasuhiro; Kido, Michiko; Yamada, Kenji; Oikaze, Hirotoshi; Takechi, Yohei; Furuta, Tomotaka; Ishii, Shoichi; Katayama, Haruna; Jeong, Hieyong; Ohno, Yuko

    2015-03-01

    We propose a quantitative method for evaluating skin barrier function using an optical coherence microscopy (OCM) system that exploits the coherence of near-infrared light. Many skin problems, such as itching and irritation, are recognized to be caused by impairment of the skin barrier function, which protects against external stimuli and prevents loss of water. The common strategy for evaluating skin barrier function is to observe the skin surface and ask patients about their skin condition; such judgements are subjective and depend on the examiner's experience. Microscopy has been used to observe the inner structure of the skin in detail, but in vitro measurements of this kind require tissue sampling. What is needed instead is an objective, quantitative evaluation method that is non-invasive and non-destructive and that permits examination of changes over time; in vivo measurement is therefore crucial. In this study, we evaluate changes in the structure of the stratum corneum, which is central to skin barrier function, by comparing water-penetrated skin with normal skin using the OCM system. The proposed method obtains in vivo 3D images of the inner structure of body tissue non-invasively and non-destructively. We formulate the changes in skin ultrastructure after water penetration, and finally evaluate the performance limits of the present OCM system in order to discuss how it may be improved.

  18. Functional Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Greenisen, Michael C.; Hayes, Judith C.; Siconolfi, Steven F.; Moore, Alan D.

    1999-01-01

    The Extended Duration Orbiter Medical Project (EDOMP) was established to address specific issues associated with optimizing the ability of crews to complete mission tasks deemed essential to entry, landing, and egress for spaceflights lasting up to 16 days. The main objectives of this functional performance evaluation were to investigate the physiological effects of long-duration spaceflight on skeletal muscle strength and endurance, as well as aerobic capacity and orthostatic function. Long-duration exposure to a microgravity environment may produce physiological alterations that affect crew ability to complete critical tasks such as extravehicular activity (EVA), intravehicular activity (IVA), and nominal or emergency egress. Ultimately, this information will be used to develop and verify countermeasures. The answers to three specific functional performance questions were sought: (1) What are the performance decrements resulting from missions of varying durations? (2) What are the physical requirements for successful entry, landing, and emergency egress from the Shuttle? and (3) What combination of preflight fitness training and in-flight countermeasures will minimize in-flight muscle performance decrements? To answer these questions, the Exercise Countermeasures Project looked at physiological changes associated with muscle degradation as well as orthostatic intolerance. A means of ensuring motor coordination was necessary to maintain proficiency in piloting skills, EVA, and IVA tasks. In addition, it was necessary to maintain musculoskeletal strength and function to meet the rigors associated with moderate altitude bailout and with nominal or emergency egress from the landed Orbiter. Eight investigations, referred to as Detailed Supplementary Objectives (DSOs) 475, 476, 477, 606, 608, 617, 618, and 624, were conducted to study muscle degradation and the effects of exercise on exercise capacity and orthostatic function (Table 3-1). 
This chapter is divided into

  19. Evaluating quantitative proton-density-mapping methods.

    PubMed

    Mezer, Aviv; Rokem, Ariel; Berman, Shai; Hastie, Trevor; Wandell, Brian A

    2016-10-01

    Quantitative magnetic resonance imaging (qMRI) aims to quantify tissue parameters by eliminating instrumental bias. We describe qMRI theory, simulations, and software designed to estimate proton density (PD), the apparent local concentration of water protons in the living human brain. First, we show that, in the absence of noise, multichannel coil data contain enough information to separate PD and coil sensitivity, a limiting instrumental bias. Second, we show that, in the presence of noise, regularization by a constraint on the relationship between T1 and PD produces accurate coil sensitivity and PD maps. The ability to measure PD quantitatively has applications in the analysis of in-vivo human brain tissue and enables multisite comparisons between individuals and across instruments. Hum Brain Mapp 37:3623-3635, 2016. © 2016 Wiley Periodicals, Inc.

  1. Evaluating Economic Performance and Policies.

    ERIC Educational Resources Information Center

    Thurow, Lester C.

    1987-01-01

    Argues that a social welfare approach to evaluating economic performance is inappropriate at the high school level. Provides several historical case studies which could be used to augment instruction aimed at the evaluation of economic performance and policies. (JDH)

  2. More Bias in Performance Evaluation?

    ERIC Educational Resources Information Center

    Gallagher, Michael C.

    1978-01-01

    The results of this study indicate that a single performance evaluation should not be used for different purposes since the stated purpose of the evaluation can affect the actual performance rating. (Author/IRT)

  3. Quantitative damage evaluation of localized deep pitting

    SciTech Connect

    Al Beed, A.A.; Al Garni, M.A.

    2000-04-01

    Localized deep pitting is considered difficult to measure and evaluate precisely using simple techniques and everyday analysis approaches. A case study was made of carbon steel heat exchangers that experienced severe pitting in a typical fresh cooling water environment. To evaluate the pitting damage effectively and precisely, a simple measurement and analysis approach was devised. In this article, the pitting measurement technique and the damage evaluation approach are presented and discussed in detail.

  4. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadowmaps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the largely unexplored problem of estimating the wavefront's gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the first time that a complete, mathematically grounded treatment of this optical phenomenon has been presented, complemented by an image-processing algorithm that allows quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device solely through its associated foucaultgrams. PMID:27505659
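    The final step, integrating measured partial derivatives with a Fourier-based algorithm to recover the surface, is commonly done with a Frankot-Chellappa-style least-squares solver. A minimal sketch, assuming periodic boundaries (inherent to the FFT approach) and not the authors' specific algorithm, is:

```python
import numpy as np

def integrate_gradients(p, q, dx=1.0, dy=1.0):
    """Recover a surface z whose x- and y-gradients best match p and q
    in a least-squares sense (Frankot-Chellappa-style Fourier integration).
    Assumes periodic boundary conditions."""
    rows, cols = p.shape
    wx = 2 * np.pi * np.fft.fftfreq(cols, d=dx)[None, :]   # x angular freqs
    wy = 2 * np.pi * np.fft.fftfreq(rows, d=dy)[:, None]   # y angular freqs
    P, Q = np.fft.fft2(p), np.fft.fft2(q)
    denom = wx**2 + wy**2
    denom[0, 0] = 1.0                 # avoid 0/0 at the DC term
    Z = (-1j * wx * P - 1j * wy * Q) / denom
    Z[0, 0] = 0.0                     # mean height is unrecoverable
    return np.real(np.fft.ifft2(Z))

# Sanity check on a smooth periodic surface with known gradients.
n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x)
z_true = np.sin(X) * np.cos(Y)
p = np.cos(X) * np.cos(Y)             # dz/dx
q = -np.sin(X) * np.sin(Y)            # dz/dy
z_rec = integrate_gradients(p, q, dx=x[1] - x[0], dy=x[1] - x[0])
err = np.max(np.abs((z_rec - z_rec.mean()) - (z_true - z_true.mean())))
```

For band-limited periodic surfaces the reconstruction is exact up to an arbitrary mean height, which is why the check subtracts the means before comparing.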

  6. Simulation evaluation of quantitative myocardial perfusion assessment from cardiac CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-03-01

    Contrast enhancement on cardiac CT provides valuable information about myocardial perfusion, and methods have been proposed to assess perfusion with both static and dynamic acquisitions. There is a lack of knowledge and consensus on the appropriate approach to ensure 1) sufficient diagnostic accuracy for clinical decisions and 2) low radiation doses for patient safety. This work developed a thorough dynamic CT simulation and several accepted blood flow estimation techniques to evaluate the performance of perfusion assessment across a range of acquisition and estimation scenarios. Cardiac CT acquisitions were simulated for a range of flow states (flow = 0.5, 1, 2, 3 ml/g/min; cardiac output = 3, 5, 8 L/min) with a validated CT simulator incorporating polyenergetic data acquisition and realistic x-ray flux levels, covering dynamic scenarios with 1, 2, and 3 sec sampling for 30 sec at 25, 70, and 140 mAs. Images were generated using conventional image reconstruction with additional image-based beam hardening correction to account for iodine content. Time attenuation curves were extracted for multiple regions around the myocardium and used to estimate flow. In total, 2,700 independent realizations of dynamic sequences were generated, and multiple MBF estimation methods were applied to each. Evaluation of quantitative kinetic modeling yielded blood flow estimates with a root mean square error (RMSE) of ~0.6 ml/g/min averaged across multiple scenarios, whereas semi-quantitative modeling and qualitative static imaging each resulted in significantly more error (RMSE ≈ 1.2 ml/g/min). For quantitative methods, dose reduction through reduced temporal sampling or reduced tube current had a comparable impact on MBF estimate fidelity.
On average, half dose acquisitions increased the RMSE of estimates by only 18% suggesting that substantial dose reductions can be employed in the context of quantitative myocardial
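    The RMSE figure of merit used above is straightforward to reproduce. The sketch below uses entirely hypothetical MBF estimates against the simulated flow states named in the abstract; the numbers are illustrative only.

```python
import math

def rmse(estimates, truths):
    """Root mean square error between estimated and true MBF (ml/g/min)."""
    assert len(estimates) == len(truths)
    return math.sqrt(sum((e - t) ** 2 for e, t in zip(estimates, truths)) / len(estimates))

# Hypothetical estimates against the simulated flow states from the abstract
true_flows = [0.5, 1.0, 2.0, 3.0]
estimated = [0.9, 1.4, 2.5, 3.7]
print(round(rmse(estimated, true_flows), 3))  # → 0.515 (ml/g/min)
```

    In the study, this error is averaged across acquisition scenarios (sampling interval, tube current) to compare estimation methods at matched dose.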

  7. Investigations of the relationship between use of in vitro cell culture-quantitative PCR and a mouse-based bioassay for evaluating critical factors affecting the disinfection performance of pulsed UV light for treating Cryptosporidium parvum oocysts in saline.

    PubMed

    Garvey, Mary; Farrell, Hugh; Cormican, Martin; Rowan, Neil

    2010-03-01

    Cryptosporidium parvum is an enteric coccidian parasite that is recognised as a frequent cause of water-borne disease in humans. We report for the first time on use of the in vitro HCT-8 cell culture-quantitative PCR (qPCR) assay and the in vivo SCID-mouse bioassay for evaluating critical factors that reduce or eliminate infectivity of C. parvum after irradiating oocysts in saline solution under varying operational conditions with pulsed UV light. Infections post UV treatments were detected by immunofluorescence (IF) microscopy and by quantitative PCR in cell culture, and by IF staining of faeces and by hematoxylin and eosin staining of intestinal villi in mice. There was good agreement between the cell culture-qPCR and mouse assays for determining reduction or elimination of C. parvum infectivity as a consequence of varying UV operating conditions. Reduction in infectivity depended on the intensity of lamp discharge energy applied, the amount of pulsing and the population size of oocysts (P ≤ 0.05). A conventional radiometer was unable to measure fluence or UV dose in saline samples due to the ultra-short, non-continuous nature of the high-energy light pulses. Incorporation of humic acid at a concentration above that found in surface water (i.e., ≤10 ppm) did not significantly affect PUV disinfection capability irrespective of the parameters tested (P ≤ 0.05). These observations show that use of this HCT-8 cell culture assay is equivalent to using the 'gold standard' mouse-based infectivity assay for determining the disinfection performance of PUV for treating C. parvum in saline solution. PMID:20096310

  8. Quantitative versus Qualitative Evaluation: A Tool to Decide Which to Use

    ERIC Educational Resources Information Center

    Dobrovolny, Jackie L.; Fuentes, Stephanie Christine G.

    2008-01-01

    Evaluation is often avoided in human performance technology (HPT), but it is an essential and frequently catalytic activity that adds significant value to projects. Knowing how to approach an evaluation and whether to use qualitative, quantitative, or both methods makes evaluation much easier. In this article, we provide tools to help determine…

  9. Quantitative evaluation of Radix Paeoniae Alba sulfur-fumigated with different durations and purchased from herbal markets: simultaneous determination of twelve components belonging to three chemical types by improved high performance liquid chromatography-diode array detector.

    PubMed

    Kong, Ming; Liu, Huan-Huan; Xu, Jun; Wang, Chun-Ru; Lu, Ming; Wang, Xiao-Ning; Li, You-Bin; Li, Song-Lin

    2014-09-01

    In this study, an improved high performance liquid chromatography-diode array detector (HPLC-DAD) method for simultaneous quantification of twelve major components belonging to three chemical types was developed and validated, and was applied to quantitatively compare the quality of Radix Paeoniae Alba (RPA) sulfur-fumigated for different durations and purchased from commercial herbal markets. The contents of paeoniflorin, benzoylpaeoniflorin, oxypaeoniflorin, benzoic acid and paeonol decreased, whereas that of paeoniflorin sulfonate increased, as the sulfur-fumigation duration was extended. Different levels of paeoniflorin sulfonate were determined in ten of seventeen commercial RPA samples, indicating that these ten samples may have been sulfur-fumigated for different durations. Moreover, the relative standard deviation of the content of each component was higher in the commercial sulfur-fumigated RPA samples than in the commercial non-fumigated RPA samples, and the percentage of the total average content of monoterpene glycosides in the determined analytes was higher in the decoctions of commercial sulfur-fumigated RPA than in those of commercial non-fumigated RPA samples. All these results suggested that the established method is precise, accurate and sensitive enough for the global quality evaluation of sulfur-fumigated RPA, and that sulfur-fumigation not only changes the proportions of bioactive components, but also reduces the quality consistency of both raw materials and aqueous decoctions of RPA.
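    The quality-consistency comparison above rests on the relative standard deviation of each component's content across lots. A minimal sketch, with hypothetical contents (the paper's actual values are not reproduced here):

```python
import statistics

def rsd_percent(contents):
    """Relative standard deviation (%) of one component's measured contents."""
    return 100.0 * statistics.stdev(contents) / statistics.mean(contents)

# Hypothetical paeoniflorin contents (mg/g) across commercial lots
fumigated = [12.1, 8.4, 15.9, 6.2, 10.8]
non_fumigated = [14.2, 13.8, 14.9, 14.0, 14.5]
print(round(rsd_percent(fumigated), 1), round(rsd_percent(non_fumigated), 1))
```

    A higher RSD among fumigated lots, as in this toy data, is the pattern the authors read as reduced quality consistency.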

  10. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted content analysis regarding evaluation methods of qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  11. QUANTITATIVE EVALUATION OF FIRE SEPARATION AND BARRIERS

    SciTech Connect

    Coutts, D

    2007-04-17

    Fire barriers and physical separation are key components in managing the fire risk in Nuclear Facilities. The expected performance of these features has often been predicted using rules-of-thumb or expert judgment. These approaches often lack the convincing technical bases that exist when addressing other Nuclear Facility accident events. This paper presents science-based approaches to demonstrate the effectiveness of fire separation methods.

  12. Quantitative and Qualitative Change in Children's Mental Rotation Performance

    ERIC Educational Resources Information Center

    Geiser, Christian; Lehmann, Wolfgang; Corth, Martin; Eid, Michael

    2008-01-01

    This study investigated quantitative and qualitative changes in mental rotation performance and solution strategies with a focus on sex differences. German children (N = 519) completed the Mental Rotations Test (MRT) in the 5th and 6th grades (interval: one year; age range at time 1: 10-11 years). Boys on average outperformed girls on both…

  13. CONFOCAL MICROSCOPY SYSTEM PERFORMANCE: QA TESTS, QUANTITATION AND SPECTROSCOPY

    EPA Science Inventory

    Confocal Microscopy System Performance: QA tests, Quantitation and Spectroscopy.

    Robert M. Zucker 1 and Jeremy M. Lerner 2,
    1Reproductive Toxicology Division, National Health and Environmental Effects Research Laboratory, Office of Research Development, U.S. Environmen...

  14. Evaluating Student Clinical Performance.

    ERIC Educational Resources Information Center

    Foster, Danny T.

    When the University of Iowa's athletic training education department developed evaluation criteria and methods to be used with students, attention was paid to validity, consistency, observation, and behaviors. The observations of student behaviors reflect three types of learning outcomes important to clinical education: cognitive, psychomotor, and…

  15. Instrument performance evaluation

    SciTech Connect

    Swinth, K.L.

    1993-03-01

    Deficiencies exist in both the performance and the quality of health physics instruments. Recognizing the implications of such deficiencies for the protection of workers and the public, in the early 1980s the DOE and the NRC encouraged the development of a performance standard and established a program to test a series of instruments against criteria in the standard. The purpose of the testing was to establish the practicality of the criteria in the standard, to determine the performance of a cross section of available instruments, and to establish a testing capability. Over 100 instruments were tested, resulting in a practical standard and an understanding of the deficiencies in available instruments. In parallel with the instrument testing, a value-impact study clearly established the benefits of implementing a formal testing program. An ad hoc committee also met several times to establish recommendations for the voluntary implementation of a testing program based on the studies and the performance standard. For several reasons, a formal program did not materialize. Ongoing tests and studies have supported the development of specific instruments and have helped specific clients understand the performance of their instruments. The purpose of this presentation is to trace the history of instrument testing to date and suggest the benefits of a centralized formal program.

  16. Evaluation (not validation) of quantitative models.

    PubMed

    Oreskes, N

    1998-12-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid

  19. Quantitative code accuracy evaluation of ISP33

    SciTech Connect

    Kalli, H.; Miwrrin, A.; Purhonen, H.

    1995-09-01

    Aiming at quantifying code accuracy, a methodology based on the Fast Fourier Transform has been developed at the University of Pisa, Italy. The paper gives a short presentation of the methodology and its application to pre-test and post-test calculations submitted to the International Standard Problem ISP33. This was a double-blind natural circulation exercise with a stepwise reduced primary coolant inventory, performed in the PACTEL facility in Finland. PACTEL is a 1/305 volumetrically scaled, full-height simulator of the Russian-type VVER-440 pressurized water reactor, with horizontal steam generators and loop seals in both cold and hot legs. Fifteen foreign organizations participated in ISP33, with 21 blind calculations and 20 post-test calculations; altogether, 10 different thermal hydraulic codes and code versions were used. The results of the application of the methodology to nine selected measured quantities are summarized.
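    The abstract does not detail the FFT methodology itself. The sketch below is only in its spirit, not the published Pisa implementation (which adds windowing and sampling-rate rules): accuracy is summarized by the spectrum of the calculation-minus-experiment error relative to the spectrum of the experimental signal, plus a frequency centroid locating where the error sits. All signals are hypothetical.

```python
import numpy as np

def fft_accuracy(experimental, calculated):
    """Average amplitude (AA) and weighted frequency (WF) of the error signal,
    a sketch in the spirit of the Pisa FFT-based accuracy methodology."""
    exp = np.asarray(experimental, dtype=float)
    err = np.asarray(calculated, dtype=float) - exp
    F_err = np.abs(np.fft.rfft(err))
    F_exp = np.abs(np.fft.rfft(exp))
    freqs = np.fft.rfftfreq(len(err))
    aa = F_err.sum() / F_exp.sum()            # dimensionless error amplitude
    wf = (freqs * F_err).sum() / F_err.sum()  # frequency centroid of the error
    return aa, wf

# Hypothetical transient: the code result differs from the experiment
# by a small high-frequency component
t = np.linspace(0.0, 10.0, 256)
experiment = np.sin(t)
calculation = np.sin(t) + 0.05 * np.sin(5.0 * t)
AA, WF = fft_accuracy(experiment, calculation)
print(round(AA, 3), round(WF, 4))
```

    Smaller AA means closer agreement with the measured quantity; WF indicates whether the discrepancy is a slow trend error or noise-like.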

  20. Quantitative evaluation of ocean thermal energy conversion (OTEC): executive briefing

    SciTech Connect

    Gritton, E.C.; Pei, R.Y.; Hess, R.W.

    1980-08-01

    Documentation is provided of a briefing summarizing the results of an independent quantitative evaluation of Ocean Thermal Energy Conversion (OTEC) for central station applications. The study concentrated on a central station power plant located in the Gulf of Mexico and delivering power to the mainland United States. The evaluation of OTEC is based on three important issues: resource availability, technical feasibility, and cost.

  1. Quantitative evaluation of cerebrospinal fluid shunt flow

    SciTech Connect

    Chervu, S.; Chervu, L.R.; Vallabhajosyula, B.; Milstein, D.M.; Shapiro, K.M.; Shulman, K.; Blaufox, M.D.

    1984-01-01

    The authors describe a rigorous method for measuring the flow of cerebrospinal fluid (CSF) in shunt circuits implanted for the relief of obstructive hydrocephalus. Clearance of radioactivity for several calibrated flow rates was determined with a Harvard infusion pump by injecting the Rickham reservoir of a Rickham-Holter valve system with 100 µCi of Tc-99m as pertechnetate. The elliptical and the cylindrical Holter valves used as adjunct valves with the Rickham reservoir yielded two different regression lines when the clearances were plotted against flow rates. The experimental regression lines were used to determine the in vivo flow rates from clearances calculated after injecting the Rickham reservoirs of the patients. The unique clearance characteristics of the individual shunt systems available require that calibration curves be derived for an entire system identical to the one implanted in the patient being evaluated, rather than for just the injected chamber. Excellent correlation between flow rates and the clinical findings supports the reliability of this method of quantification of CSF shunt flow, and the results are fully accepted by neurosurgeons.
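    The calibrate-then-invert logic above can be sketched in a few lines: fit a regression line of clearance against known pump flow rates, then invert it to turn a patient's measured clearance into an estimated flow. The calibration points below are hypothetical, not the paper's data.

```python
def fit_line(xs, ys):
    """Least-squares line ys ≈ a*xs + b (clearance vs. calibrated flow rate)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical bench calibration: pump flow rates (ml/h) vs measured clearance (%/min)
flows = [2.0, 5.0, 10.0, 20.0]
clearances = [0.8, 1.7, 3.2, 6.2]
a, b = fit_line(flows, clearances)

def flow_from_clearance(c):
    """Invert the calibration line to estimate in vivo shunt flow."""
    return (c - b) / a

print(round(flow_from_clearance(3.2), 1))  # → 10.0 ml/h
```

    The paper's key caveat carries over directly: the calibration line is valid only for a shunt system identical to the implanted one, since each valve geometry yields its own regression.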

  2. A quantitative method for silica flux evaluation

    NASA Astrophysics Data System (ADS)

    Schonewille, R. H.; O'Connell, G. J.; Toguri, J. M.

    1993-02-01

    In the smelting of copper and copper/nickel concentrates, the role of silica flux is to aid in the removal of iron by forming a slag phase. Alternatively, the role of flux may be regarded as a means of controlling the formation of magnetite, which can severely hinder the operation of a furnace. To adequately control the magnetite level, the flux must react rapidly with all of the FeO within the bath. In the present study, a rapid method for silica flux evaluation that can be used directly in the smelter has been developed. Samples of flux are mixed with iron sulfide and magnetite and then smelted at a temperature of 1250 °C. Argon was swept over the reaction mixture and analyzed continuously for sulfur dioxide. The sulfur dioxide concentration with time was found to contain two peaks, the first one being independent of the flux content of the sample. A flux quality parameter has been defined as the height-to-time ratio of the second peak. The value of this parameter for pure silica is 5100 ppm/min. The effects of silica content, silica particle size, and silicate mineralogy were investigated. It was found that a limiting flux quality is achieved for particle sizes less than 0.1 mm in diameter and that fluxes containing feldspar are generally of a poorer quality. The relative importance of free silica and melting point was also studied using synthetic flux mixtures, with free silica displaying the strongest effect.
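    The flux quality parameter defined above (height-to-time ratio of the second SO2 peak) is easy to compute from a sampled gas trace. A minimal sketch with a hypothetical trace; a real implementation would smooth the signal and threshold out minor local maxima.

```python
def flux_quality(times_min, so2_ppm):
    """Height-to-time ratio (ppm/min) of the second peak in the SO2 trace,
    taking peaks as simple local maxima."""
    peaks = [i for i in range(1, len(so2_ppm) - 1)
             if so2_ppm[i - 1] < so2_ppm[i] >= so2_ppm[i + 1]]
    if len(peaks) < 2:
        raise ValueError("fewer than two peaks in trace")
    i = peaks[1]
    return so2_ppm[i] / times_min[i]

# Hypothetical SO2 trace (minutes vs ppm) with two peaks
t = [0, 1, 2, 3, 4, 5, 6, 7, 8]
so2 = [0, 3000, 1000, 500, 2000, 5100, 3000, 1000, 200]
print(flux_quality(t, so2))  # 5100 ppm at 5 min → 1020.0 ppm/min
```

    Values approaching the pure-silica benchmark of 5100 ppm/min would indicate a high-quality flux under this parameter.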

  3. Quantitative image quality evaluation for cardiac CT reconstructions

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.; Balhorn, William; Okerlund, Darin R.

    2016-03-01

    Maintaining image quality in the presence of motion is always desirable and challenging in clinical Cardiac CT imaging. Different image-reconstruction algorithms are available on current commercial CT systems that attempt to achieve this goal. It is widely accepted that image-quality assessment should be task-based and involve specific tasks, observers, and associated figures of merits. In this work, we developed an observer model that performed the task of estimating the percentage of plaque in a vessel from CT images. We compared task performance of Cardiac CT image data reconstructed using a conventional FBP reconstruction algorithm and the SnapShot Freeze (SSF) algorithm, each at default and optimal reconstruction cardiac phases. The purpose of this work is to design an approach for quantitative image-quality evaluation of temporal resolution for Cardiac CT systems. To simulate heart motion, a moving coronary type phantom synchronized with an ECG signal was used. Three different percentage plaques embedded in a 3 mm vessel phantom were imaged multiple times under motion free, 60 bpm, and 80 bpm heart rates. Static (motion free) images of this phantom were taken as reference images for image template generation. Independent ROIs from the 60 bpm and 80 bpm images were generated by vessel tracking. The observer performed estimation tasks using these ROIs. Ensemble mean square error (EMSE) was used as the figure of merit. Results suggest that the quality of SSF images is superior to the quality of FBP images in higher heart-rate scans.
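    The figure of merit named above, ensemble mean square error of the observer's plaque-percentage estimates against the known phantom value, can be sketched directly. The estimate lists below are hypothetical, not the study's measurements.

```python
def ensemble_mse(estimates, truth):
    """Ensemble mean square error of repeated estimates vs. the known value."""
    return sum((e - truth) ** 2 for e in estimates) / len(estimates)

# Hypothetical repeated estimates of a 50% plaque from two reconstructions
fbp = [44, 58, 41, 60, 46]
ssf = [48, 52, 49, 51, 50]
print(ensemble_mse(ssf, 50) < ensemble_mse(fbp, 50))  # SSF estimates closer to truth
```

    A lower EMSE under motion is how the study operationalizes "superior temporal resolution" as a task-based, quantitative claim rather than a visual one.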

  4. Performance Evaluation: A Deadly Disease?

    ERIC Educational Resources Information Center

    Aluri, Rao; Reichel, Mary

    1994-01-01

    W. Edwards Deming condemned performance evaluations as a deadly disease afflicting American management. He argued that performance evaluations nourish fear, encourage short-term thinking, stifle teamwork, and are no better than lotteries. This article examines library literature from Deming's perspective. Although that literature accepts…

  5. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, obtained with a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  6. Performance Evaluation and Benchmarking of Intelligent Systems

    SciTech Connect

    Madhavan, Raj; Messina, Elena; Tunstel, Edward

    2009-09-01

    To design and develop capable, dependable, and affordable intelligent systems, their performance must be measurable. Scientific methodologies for standardization and benchmarking are crucial for quantitatively evaluating the performance of emerging robotic and intelligent systems technologies. There is currently no accepted standard for quantitatively measuring the performance of these systems against user-defined requirements; and furthermore, there is no consensus on what objective evaluation procedures need to be followed to understand the performance of these systems. The lack of reproducible and repeatable test methods has precluded researchers working towards a common goal from exchanging and communicating results, inter-comparing system performance, and leveraging previous work that could otherwise avoid duplication and expedite technology transfer. Currently, this lack of cohesion in the community hinders progress in many domains, such as manufacturing, service, healthcare, and security. By providing the research community with access to standardized tools, reference data sets, and open source libraries of solutions, researchers and consumers will be able to evaluate the cost and benefits associated with intelligent systems and associated technologies. In this vein, the edited book volume addresses performance evaluation and metrics for intelligent systems, in general, while emphasizing the need and solutions for standardized methods. To the knowledge of the editors, there is not a single book on the market that is solely dedicated to the subject of performance evaluation and benchmarking of intelligent systems. Even books that address this topic do so only marginally or are out of date. The research work presented in this volume fills this void by drawing from the experiences and insights of experts gained both through theoretical development and practical implementation of intelligent systems in a variety of diverse application domains. 
The book presents

  7. Performance evaluation soil samples utilizing encapsulation technology

    DOEpatents

    Dahlgran, James R.

    1999-01-01

    Performance evaluation soil samples, and a method of their preparation, use encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration.

  8. Performance evaluation soil samples utilizing encapsulation technology

    DOEpatents

    Dahlgran, J.R.

    1999-08-17

    Performance evaluation soil samples and method of their preparation uses encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration. 1 fig.

  9. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
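    Two of the metrology areas named above have simple standard estimators: bias as the mean deviation from a known value, and the repeatability coefficient RC = 2.77 × within-subject SD, under which two repeat measurements are expected to differ by less than RC in about 95% of cases. The measurements below are hypothetical.

```python
import statistics

def bias(measurements, true_value):
    """Mean deviation from the known value (the linearity/bias metrology area)."""
    return statistics.mean(m - true_value for m in measurements)

def repeatability_coefficient(replicates):
    """RC = 2.77 * within-subject SD, the 95% limit for the difference
    between two repeat measurements of the same subject."""
    return 2.77 * statistics.stdev(replicates)

# Hypothetical test-retest measurements of a biomarker whose true value is 10.0
reps = [10.2, 9.8, 10.1, 9.9, 10.0]
print(round(bias(reps, 10.0), 3), round(repeatability_coefficient(reps), 3))
```

    Reporting these quantities with a common definition is exactly what lets results from multiple technical performance studies be compared or pooled.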

  10. Increasing productivity through performance evaluation.

    PubMed

    Lachman, V D

    1984-12-01

    Four components form the base for a performance evaluation system. A discussion of management/organizational shortcomings creating performance problems is followed by a focus on the importance of an ongoing discussion of goals between the manager and the subordinate. Six components that impact performance are identified, and practical suggestions are given to increase motivation. A coaching analysis process, as well as counseling and disciplining models, define the steps for solving performance problems.

  11. High-performance hybrid Orbitrap mass spectrometers for quantitative proteome analysis: Observations and implications.

    PubMed

    Williamson, James C; Edwards, Alistair V G; Verano-Braga, Thiago; Schwämmle, Veit; Kjeldsen, Frank; Jensen, Ole N; Larsen, Martin R

    2016-03-01

    We present basic workups and quantitative comparisons for two current generation Orbitrap mass spectrometers, the Q Exactive Plus and Orbitrap Fusion Tribrid, which are widely considered two of the highest performing instruments on the market. We assessed the performance of two quantitative methods on both instruments, namely label-free quantitation and stable isotope labeling using isobaric tags, for studying the heat shock response in Escherichia coli. We investigated the recently reported MS3 method on the Fusion instrument and the potential of MS3-based reporter ion isolation Synchronous Precursor Selection (SPS) and its impact on quantitative accuracy. We confirm that the label-free approach offers a more linear response with a wider dynamic range than MS/MS-based isobaric tag quantitation and that the MS3/SPS approach alleviates but does not eliminate dynamic range compression. We observed, however, that the choice of quantitative approach had little impact on the ability to statistically evaluate the E. coli heat shock response. We conclude that in the experimental conditions tested, MS/MS-based reporter ion quantitation provides reliable biological insight despite the issue of compressed dynamic range, an observation that significantly impacts the choice of instrument.
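    Dynamic range compression can be put in numbers with a simple illustrative definition (not taken from the paper): the fraction of the true fold-change lost when co-isolated background pulls reporter-ion ratios toward unity. All values below are hypothetical.

```python
def ratio_compression(true_ratio, observed_ratio):
    """Fraction of a true fold-change lost to ratio compression
    (illustrative definition; both ratios are fold-changes >= 1)."""
    return 1.0 - (observed_ratio - 1.0) / (true_ratio - 1.0)

# Hypothetical 4-fold true change: plain MS/MS vs an MS3/SPS acquisition
print(round(ratio_compression(4.0, 2.5), 2))  # MS/MS: half the fold-change lost
print(round(ratio_compression(4.0, 3.4), 2))  # MS3/SPS: alleviated, not eliminated
```

    This mirrors the abstract's conclusion: MS3/SPS reduces, but does not eliminate, compression, yet the compressed MS/MS ratios can still support the same statistical conclusions.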

  12. Quantitative vs. Qualitative Approaches to Quality Special Education Program Evaluation.

    ERIC Educational Resources Information Center

    Council of Administrators of Special Education, Inc.

    One in a series of issue papers commissioned by the Council of Administrators of Special Education (CASE), this document presents a comparison of contemporary evaluation approaches for special education programs. The first section describes the two approaches to be compared: (1) traditional scientific inquiry which emphasizes quantitative methods;…

  13. Performance comparison between static and dynamic cardiac CT on perfusion quantitation and patient classification tasks

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2015-03-01

    Cardiac CT acquisitions for perfusion assessment can be performed in a dynamic or static mode. In this simulation study, we evaluate the relative classification and quantification performance of these modes for assessing myocardial blood flow (MBF). In the dynamic method, a series of low dose cardiac CT acquisitions yields data on contrast bolus dynamics over time; these data are fit with a model to give a quantitative MBF estimate. In the static method, a single CT acquisition is obtained, and the relative CT numbers in the myocardium are used to infer perfusion states. The static method does not directly yield a quantitative estimate of MBF, but these estimates can be roughly approximated by introducing assumed linear relationships between CT number and MBF, consistent with the ways such images are typically visually interpreted. Data obtained by either method may be used for a variety of clinical tasks, including 1) stratifying patients into differing categories of ischemia and 2) using the quantitative MBF estimate directly to evaluate ischemic disease severity. Through simulations, we evaluate the performance on each of these tasks. The dynamic method has very low bias in MBF estimates, making it particularly suitable for quantitative estimation. At matched radiation dose levels, ROC analysis demonstrated that the static method, with its high bias but generally lower variance, has superior performance in stratifying patients, especially for larger patients.
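    For empirical data, the ROC analysis used in the stratification comparison reduces to the Mann-Whitney form of the AUC: the probability that a diseased case scores higher than a healthy one, with ties counting half. The scores below are hypothetical.

```python
def roc_auc(scores_pos, scores_neg):
    """Empirical AUC via the Mann-Whitney statistic over all pos/neg pairs."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical perfusion scores for ischemic vs normal territories
ischemic = [0.9, 0.8, 0.7, 0.6]
normal = [0.5, 0.4, 0.6, 0.3]
print(roc_auc(ischemic, normal))  # → 0.96875
```

    Comparing such AUCs at matched radiation dose is how the study concludes that the high-bias, low-variance static method can still stratify patients better than the low-bias dynamic method.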

  14. Room for Improvement: Performance Evaluations.

    ERIC Educational Resources Information Center

    Webb, Gisela

    1989-01-01

    Describes a performance management approach to library personnel management that stresses communication, clarification of goals, and reinforcement of new practices and behaviors. Each phase of the evaluation process (preparation, rating, administrative review, appraisal interview, and follow-up) and special evaluations to be used in cases of…

  15. Evaluating Administrative/Supervisory Performance.

    ERIC Educational Resources Information Center

    Educational Research Service, Arlington, VA.

    This is a report on the third survey conducted on procedures for evaluating the performance of administrators and supervisors in local school systems. A questionnaire was sent to school systems enrolling 25,000 or more pupils, and results indicated that 84 of the 154 responding systems have formal evaluation procedures. Tables and discussions of…

  16. Colorimetric evaluation of display performance

    NASA Astrophysics Data System (ADS)

    Kosmowski, Bogdan B.

    2001-08-01

    The development of information techniques, using new technologies, physical phenomena and coding schemes, enables new application areas to benefit from the introduction of displays. Full utilization of a human operator's visual perception requires a color coding process to be implemented. The evolution of displays, from achromatic (B&W) and monochromatic to multicolor and full-color, enhances the possibilities of information coding, but it also creates a need for quantitative methods of display parameter assessment. Quantitative assessment of color displays restricted to photometric measurements of their parameters is only an estimate and leads to considerable errors. Therefore, measurements of a display's color properties have to be based on spectral measurements of the display and its elements. The quantitative assessment of display system parameters should be made using colorimetric systems such as CIE1931 or CIE1976 LAB or LUV. In the paper, the constraints on the selection of a measurement method for color display evaluation are discussed, and the relations between qualitative assessment of displays and the ergonomic conditions of their application are also presented. The paper presents examples of using the LUV colorimetric system and the color difference (Delta)E in the optimization of color liquid crystal displays.
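The CIE 1976 L*u*v* color difference mentioned above is simply the Euclidean distance between two colors in (L*, u*, v*) coordinates. A minimal sketch, with two illustrative sample colors (not measurements from the paper):

```python
import math

def delta_e_uv(c1, c2):
    """CIE 1976 ΔE*uv: Euclidean distance between (L*, u*, v*) coordinates."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

# Hypothetical target vs. measured display color in L*u*v* space:
target = (50.0, 20.0, -10.0)
measured = (52.0, 18.0, -9.0)
de = delta_e_uv(target, measured)  # ΔE*uv = 3.0
```

A ΔE*uv near 1 is commonly treated as a just-noticeable difference, so minimizing ΔE over a display's gamut is a natural optimization target.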

  17. Quantitative evaluation of photoplethysmographic artifact reduction for pulse oximetry

    NASA Astrophysics Data System (ADS)

    Hayes, Matthew J.; Smith, Peter R.

    1999-01-01

    Motion artefact corruption of pulse oximeter output, causing both measurement inaccuracies and false alarm conditions, is a primary restriction in the current clinical practice and future applications of this useful technique. Artefact reduction in photoplethysmography (PPG), and therefore by application in pulse oximetry, is demonstrated using a novel non-linear methodology recently proposed by the authors. The significance of these processed PPG signals for pulse oximetry measurement is discussed, with particular attention to the normalization inherent in the artefact reduction process. Quantitative experimental investigation of the performance of PPG artefact reduction is then utilized to evaluate this technology for application to pulse oximetry. While the successfully demonstrated reduction of severe artefacts may widen the applicability of all PPG technologies and decrease the occurrence of pulse oximeter false alarms, the observed reduction of slight artefacts suggests that many such effects may go unnoticed in clinical practice. The signal processing and output averaging used in most commercial oximeters can incorporate these artefact errors into the output, while masking the true PPG signal corruption. It is therefore suggested that PPG artefact reduction should be incorporated into conventional pulse oximetry measurement, even in the absence of end-user artefact problems.

  18. Quantitative evaluation of phase processing approaches in susceptibility weighted imaging

    NASA Astrophysics Data System (ADS)

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2012-03-01

    Susceptibility weighted imaging (SWI) takes advantage of the local variation in susceptibility between different tissues to enable highly detailed visualization of the cerebral venous system and sensitive detection of intracranial hemorrhages. Thus, it has been increasingly used in magnetic resonance imaging studies of traumatic brain injury as well as other intracranial pathologies. In SWI, magnitude information is combined with phase information to enhance the susceptibility induced image contrast. Because of global susceptibility variations across the image, the rate of phase accumulation varies widely across the image resulting in phase wrapping artifacts that interfere with the local assessment of phase variation. Homodyne filtering is a common approach to eliminate this global phase variation. However, filter size requires careful selection in order to preserve image contrast and avoid errors resulting from residual phase wraps. An alternative approach is to apply phase unwrapping prior to high pass filtering. A suitable phase unwrapping algorithm guarantees no residual phase wraps but additional computational steps are required. In this work, we quantitatively evaluate these two phase processing approaches on both simulated and real data using different filters and cutoff frequencies. Our analysis leads to an improved understanding of the relationship between phase wraps, susceptibility effects, and acquisition parameters. Although homodyne filtering approaches are faster and more straightforward, phase unwrapping approaches perform more accurately in a wider variety of acquisition scenarios.
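The homodyne filtering approach described above can be sketched as complex division by a low-pass-filtered copy of the image, which cancels the slowly varying background phase. This is an illustrative implementation; the window shape and kernel size are assumptions, not the paper's exact filter:

```python
import numpy as np

def homodyne_phase(complex_img, kernel=32):
    """High-pass phase filter: divide the complex image by its
    low-pass-filtered copy and keep the residual (local) phase."""
    ny, nx = complex_img.shape
    # Centered rectangular low-pass window in k-space (illustrative choice).
    kspace = np.fft.fftshift(np.fft.fft2(complex_img))
    mask = np.zeros_like(kspace)
    cy, cx = ny // 2, nx // 2
    mask[cy - kernel // 2:cy + kernel // 2,
         cx - kernel // 2:cx + kernel // 2] = 1.0
    low_pass = np.fft.ifft2(np.fft.ifftshift(kspace * mask))
    # Multiplying by the conjugate cancels the background phase;
    # the residual phase carries the local susceptibility contrast.
    return np.angle(complex_img * np.conj(low_pass))
```

The trade-off the paper quantifies is visible here: a larger kernel removes more background phase but also suppresses genuine local susceptibility contrast, while a small kernel can leave residual phase wraps.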

  19. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective rather than objective) to decide which alternative is best for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by defining and using a quantitative method for evaluating alternatives. Such a method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as increased sensitivity to the factors involved. Additional side effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called the 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
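A hierarchical weighted average of the kind named above can be sketched as weighted scores rolled up through two levels of criteria. The criteria, weights, and scores below are invented for illustration and are not taken from the NASA report:

```python
# Hypothetical sketch: rank design alternatives by a two-level
# hierarchical weighted average of criterion scores (0-10 scale).

def weighted_average(scores, weights):
    """Combine criterion scores using (unnormalized) weights."""
    total = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total

alternatives = {
    "design_A": {"cost": {"hardware": 7, "maintenance": 5},
                 "performance": {"throughput": 8, "latency": 6}},
    "design_B": {"cost": {"hardware": 4, "maintenance": 8},
                 "performance": {"throughput": 9, "latency": 7}},
}
sub_weights = {"cost": {"hardware": 2, "maintenance": 1},
               "performance": {"throughput": 3, "latency": 1}}
top_weights = {"cost": 1, "performance": 2}

def evaluate(alt):
    # Roll sub-criterion scores up into category scores, then combine.
    category_scores = {cat: weighted_average(alt[cat], sub_weights[cat])
                       for cat in top_weights}
    return weighted_average(category_scores, top_weights)

ranking = sorted(alternatives, key=lambda name: evaluate(alternatives[name]),
                 reverse=True)
```

Making the weights explicit is what delivers the documentation benefit the abstract mentions: the rationale for a choice is recoverable from the weight table rather than hidden in a gut decision.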

  20. Quantitative autoradiographic microimaging in the development and evaluation of radiopharmaceuticals

    SciTech Connect

    Som, P.; Oster, Z.H.

    1994-04-01

    Autoradiographic (ARG) microimaging is the method for depicting the biodistribution of radiocompounds with the highest spatial resolution. ARG is applicable to gamma, positron and negatron emitting radiotracers. Dual or multiple-isotope studies can be performed using half-lives and energies for discrimination of isotopes. Quantitation can be performed by digital videodensitometry and by newer filmless technologies. ARGs obtained at different time intervals provide the time dimension for determination of kinetics.

  1. Quantitative Measures for Evaluation of Ultrasound Therapies of the Prostate

    NASA Astrophysics Data System (ADS)

    Kobelevskiy, Ilya; Burtnyk, Mathieu; Bronskill, Michael; Chopra, Rajiv

    2010-03-01

    Development of non-invasive techniques for prostate cancer treatment requires the implementation of quantitative measures for evaluating treatment results. In this paper, we introduce measures that estimate spatial targeting accuracy and potential thermal damage to the structures surrounding the prostate. The measures were developed for the technique of treating prostate cancer with transurethral ultrasound heating applicators guided by active MR temperature feedback. Variations of ultrasound element length and related MR imaging parameters, such as MR slice thickness and update time, were investigated by performing numerical simulations of the treatment on a database of ten patient prostate geometries segmented from clinical MR images. The susceptibility of each parameter configuration to uncertainty in MR temperature measurements was studied by adding noise to the temperature measurements. Gaussian noise with zero mean and standard deviation of 0, 1, 3 and 5 °C was used to model different levels of uncertainty in MR temperature measurements. Results of the simulations for each parameter configuration were averaged over the database of the ten prostate patient geometries studied. Results have shown that for an update time of 5 seconds, both 3- and 5-mm elements achieve appropriate performance for temperature uncertainty up to 3 °C, while a temperature uncertainty of 5 °C leads to a noticeable reduction in spatial accuracy and an increased risk of damaging the rectal wall. Ten-mm elements lacked spatial accuracy and had a higher risk of damaging the rectal wall compared to 3- and 5-mm elements, but were less sensitive to the level of temperature uncertainty. The effect of changing update time was studied for 5-mm elements. Simulations showed that update time had minor effects on all aspects of treatment for temperature uncertainties of 0 °C and 1 °C, while temperature uncertainties of 3 °C and 5 °C led to reduced spatial accuracy, increased potential damage to the rectal wall, and

  2. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  3. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  4. Quantitatively evaluating the CBM reservoir using logging data

    NASA Astrophysics Data System (ADS)

    Liu, Zhidi; Zhao, Jingzhou

    2016-02-01

    In order to evaluate coal bed methane (CBM) reservoirs, this paper selects five parameters: porosity, permeability, CBM content, the coal structure index, and the effective thickness of the coal seam. Making full use of logging data and laboratory analysis of coal cores, the logging evaluation methods for the five parameters are discussed in detail, and a comprehensive evaluation model of the CBM reservoir is established. The #5 coal seam of the Hancheng mine on the eastern edge of the Ordos Basin in China was quantitatively evaluated using this method. The results show that the CBM reservoir in the study area is better developed than in the central and northern regions. The actual development of CBM shows that regions with good reservoirs have high gas production, indicating that the method introduced in this paper can evaluate CBM reservoirs effectively.

  5. Evaluation of four genes in rice for their suitability as endogenous reference standards in quantitative PCR.

    PubMed

    Wang, Chong; Jiang, Lingxi; Rao, Jun; Liu, Yinan; Yang, Litao; Zhang, Dabing

    2010-11-24

    Quantification of genetically modified (GM) food/feed depends on reliable detection systems for endogenous reference genes. Currently, four endogenous reference genes of rice, sucrose phosphate synthase (SPS), GOS9, phospholipase D (PLD), and ppi phosphofructokinase (ppi-PPF), have been used in GM rice detection. To compare the applicability of these four rice reference genes in quantitative PCR systems, we analyzed target nucleotide sequence variation in 58 conventional rice varieties from various geographic and phylogenic origins; their quantification performance was also evaluated using quantitative real-time PCR and GeNorm analysis, which yields through a series of statistical calculations an "M value" that is negatively correlated with gene stability. The sequencing analysis showed that the reported GOS9 and PLD TaqMan probe regions contained detectable single nucleotide polymorphisms (SNPs) among the tested rice cultivars, while no SNPs were observed in the SPS and ppi-PPF amplicons. Poor quantitative performance was also detectable in the cultivars with SNPs when using the GOS9 and PLD quantitative PCR systems. Even though the PCR efficiency of the ppi-PPF system was slightly lower, the comprehensive quantitative PCR comparison and GeNorm analysis showed the SPS and ppi-PPF systems to be applicable for rice endogenous reference assays, with less variation among C(t) values, good reproducibility in quantitative assays, and low M values.
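The GeNorm M value used above can be sketched as the mean pairwise standard deviation of log-transformed expression ratios between one candidate gene and every other candidate. The gene names below match the abstract, but the expression values are invented for illustration, not data from the study:

```python
import math

def m_value(gene, others, expr):
    """GeNorm-style M: mean pairwise SD of log2 expression ratios
    between `gene` and every other candidate reference gene."""
    sds = []
    for other in others:
        ratios = [math.log2(a / b) for a, b in zip(expr[gene], expr[other])]
        mean = sum(ratios) / len(ratios)
        var = sum((r - mean) ** 2 for r in ratios) / (len(ratios) - 1)
        sds.append(math.sqrt(var))
    return sum(sds) / len(sds)

# Relative expression across four hypothetical samples (not study data):
expr = {
    "SPS":     [1.0, 1.1, 0.9, 1.0],   # stable candidate
    "ppi-PPF": [1.0, 1.2, 1.0, 0.9],   # stable candidate
    "GOS9":    [1.0, 2.5, 0.4, 1.8],   # unstable candidate
}
stability = {g: m_value(g, [o for o in expr if o != g], expr) for g in expr}
# Lower M indicates a more stable reference gene.
```

In this toy data the erratic GOS9 series receives the highest (worst) M value, mirroring the abstract's finding that SPS and ppi-PPF are the more stable references.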

  6. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the asbestos content of rock matrices is a complex operation that is susceptible to significant errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including greater representativeness of the analyzed sample, more effective recognition of chrysotile, and lower cost. The DIATI LAA internal methodology for PCOM analysis is based on mild grinding of a rock sample, its subdivision into 5-6 grain size classes smaller than 2 mm, and subsequent microscopic analysis of a portion of each class. PCOM relies on the optical properties of asbestos and of the liquids of known refractive index in which the particles under analysis are immersed. Error evaluation in the analysis of rock samples, unlike that of airborne filters, cannot be based on a statistical distribution. For airborne filters, a binomial (Poisson) distribution can be applied, which theoretically defines the variation in fiber counts resulting from the observation of analysis fields chosen randomly on the filter. Analysis of rock matrices, in contrast, cannot rely on any statistical distribution, because the most important aspect of the analysis is the size of the asbestiform fibers and fiber bundles observed, and the resulting relationship between the weight of the fibrous component and that of the granular component. The error estimates generally provided by public and private institutions vary between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or that link it to the asbestos content. Our work aims to provide a reliable estimation of the error in relation to the applied methodologies and to the total asbestos content, especially for values close to the legal limits. The error assessments must…

  7. Reliability and Validity of the Professional Counseling Performance Evaluation

    ERIC Educational Resources Information Center

    Shepherd, J. Brad; Britton, Paula J.; Kress, Victoria E.

    2008-01-01

    The definition and measurement of counsellor trainee competency is an issue that has received increased attention yet lacks quantitative study. This research evaluates item responses, scale reliability and intercorrelations, interrater agreement, and criterion-related validity of the Professional Performance Fitness Evaluation/Professional…

  8. Quantitative projections of a quality measure: Performance of a complex task

    NASA Astrophysics Data System (ADS)

    Christensen, K.; Kleppe, Gisle; Vold, Martin; Frette, Vidar

    2014-12-01

    Complex data series that arise during interaction between humans (operators) and advanced technology in a controlled and realistic setting have been explored. The purpose is to obtain quantitative measures that reflect quality in task performance: on a ship simulator, nine crews have solved the same exercise, and detailed maneuvering histories have been logged. There are many degrees of freedom, some of them connected to the fact that the vessels may be freely moved in any direction. To compare maneuvering histories, several measures were used: the time needed to reach the position of operation, the integrated angle between the hull direction and the direction of motion, and the extent of movement when the vessel is to be manually kept in a fixed position. These measures are expected to reflect quality in performance. We have also obtained expert quality evaluations of the crews. The quantitative measures and the expert evaluations, taken together, allow a ranking of crew performance. However, except for time and integrated angle, there is no correlation between the individual measures. This may indicate that complex situations with social and man-machine interactions need complex measures of quality in task performance. In general terms, we have established a context-dependent and flexible framework with quantitative measures in contact with a social-science concept that is hard to define. This approach may be useful for other (qualitative) concepts in social science that contain important information on the society.

  9. Chinese Middle School Teachers' Preferences Regarding Performance Evaluation Measures

    ERIC Educational Resources Information Center

    Liu, Shujie; Xu, Xianxuan; Stronge, James H.

    2016-01-01

    Teacher performance evaluation currently is receiving unprecedented attention from policy makers, scholars, and practitioners worldwide. This study is one of the few studies of teacher perceptions regarding teacher performance measures that focus on China. We employed a quantitative dominant mixed research design to investigate Chinese teachers'…

  10. Performance Criteria and Evaluation System

    1992-06-18

    The Performance Criteria and Evaluation System (PCES) was developed to make a database of criteria accessible to radiation safety staff. The criteria included in the package are applicable to occupational radiation safety at DOE reactor and nonreactor nuclear facilities, but any database of criteria may be created using the Criterion Data Base Utility (CDU). PCES assists personnel in carrying out oversight, line, and support activities.

  11. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
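As a toy illustration of the mean-square-error criterion described above (in the spatial domain, not the paper's frequency-domain formulation), one can compare two simple interpolants reconstructing the same band-limited signal from coarse samples. The test signal and sampling rate are illustrative choices:

```python
import numpy as np

x_dense = np.linspace(0.0, 1.0, 1001)
truth = np.sin(2 * np.pi * 3 * x_dense)   # function to reconstruct
x_samp = np.linspace(0.0, 1.0, 21)        # coarse sample grid
y_samp = np.sin(2 * np.pi * 3 * x_samp)

# Reconstruct on the dense grid with two interpolants from the same samples.
linear = np.interp(x_dense, x_samp, y_samp)          # piecewise linear
idx = np.abs(x_dense[:, None] - x_samp[None, :]).argmin(axis=1)
nearest = y_samp[idx]                                # nearest-neighbour

# Mean square error of the difference between true and reconstructed values.
mse_linear = np.mean((truth - linear) ** 2)
mse_nearest = np.mean((truth - nearest) ** 2)
```

The smoother interpolant yields the smaller mean square error on this signal, which is the kind of ranking the paper's analysis produces analytically, without committing to a particular test function.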

  12. Metrics for Offline Evaluation of Prognostic Performance

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2010-01-01

    Prognostic performance evaluation has gained significant attention in the past few years. Currently, prognostic concepts lack standard definitions and suffer from ambiguous and inconsistent interpretations. This lack of standards is in part due to the varied end-user requirements for different applications, time scales, available information, and domain dynamics. The research community has used a variety of metrics, largely based on convenience and their respective requirements. Very little attention has been focused on establishing a standardized approach for comparing different efforts. This paper presents several new evaluation metrics tailored for prognostics that were recently introduced and shown to evaluate various algorithms more effectively than conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. These metrics can incorporate probabilistic uncertainty estimates from prognostic algorithms, and in addition to quantitative assessment they offer a comprehensive visual perspective that can be used in designing the prognostic system. Several methods are suggested to customize these metrics for different applications, and guidelines are provided to help choose one method over another based on distribution characteristics. Various issues faced by prognostics and its performance evaluation are discussed, followed by a formal notational framework to help standardize subsequent developments.

  13. Optimizing Digital Health Informatics Interventions Through Unobtrusive Quantitative Process Evaluations.

    PubMed

    Gude, Wouter T; van der Veer, Sabine N; de Keizer, Nicolette F; Coiera, Enrico; Peek, Niels

    2016-01-01

    Health informatics interventions such as clinical decision support (CDS) and audit and feedback (A&F) are variably effective at improving care because the underlying mechanisms through which these interventions bring about change are poorly understood. This limits our ability to design better interventions. Process evaluations can be used to improve this understanding by assessing fidelity and quality of implementation, clarifying causal mechanisms, and identifying contextual factors associated with variation in outcomes. Coiera describes the intervention process as a series of stages extending from interactions to outcomes: the "information value chain". However, past process evaluations often did not assess the relationships between those stages. In this paper we argue that the chain can be measured quantitatively and unobtrusively in digital interventions thanks to the availability of electronic data that are a by-product of their use. This provides novel possibilities to study the mechanisms of informatics interventions in detail and inform essential design choices to optimize their efficacy. PMID:27577453

  14. The Nuclear Renaissance - Implications on Quantitative Nondestructive Evaluations

    SciTech Connect

    Matzie, Regis A.

    2007-03-21

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, and Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that had actually turned away from nuclear energy are reconsidering that decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that will shape future requirements for quantitative nondestructive evaluation will be discussed. Features such as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and their impact on quantitative nondestructive evaluation (NDE) approaches will be noted.

  15. Teaching Quantitative Literacy through a Regression Analysis of Exam Performance

    ERIC Educational Resources Information Center

    Lindner, Andrew M.

    2012-01-01

    Quantitative literacy is increasingly essential for both informed citizenship and a variety of careers. Though regression is one of the most common methods in quantitative sociology, it is rarely taught until late in students' college careers. In this article, the author describes a classroom-based activity introducing students to regression…

  16. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Performance evaluation... Performance evaluation. Preparation of performance evaluation reports. (a) In addition to the requirements of FAR 36.604, performance evaluation reports shall be prepared for indefinite-delivery type...

  17. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Performance evaluation... Performance evaluation. Preparation of performance evaluation reports. (a) In addition to the requirements of FAR 36.604, performance evaluation reports shall be prepared for indefinite-delivery type...

  18. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Performance evaluation... Performance evaluation. Preparation of performance evaluation reports. (a) In addition to the requirements of FAR 36.604, performance evaluation reports shall be prepared for indefinite-delivery type...

  19. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Performance evaluation... Performance evaluation. Preparation of performance evaluation reports. (a) In addition to the requirements of FAR 36.604, performance evaluation reports shall be prepared for indefinite-delivery type...

  20. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Performance evaluation... Performance evaluation. Preparation of performance evaluation reports. (a) In addition to the requirements of FAR 36.604, performance evaluation reports shall be prepared for indefinite-delivery type...

  1. Evaluation of Lung Metastasis in Mouse Mammary Tumor Models by Quantitative Real-time PCR

    PubMed Central

    Abt, Melissa A.; Grek, Christina L.; Ghatnekar, Gautam S.; Yeh, Elizabeth S.

    2016-01-01

    Metastatic disease is the spread of malignant tumor cells from the primary cancer site to a distant organ and is the primary cause of cancer-associated death [1]. Common sites of metastatic spread include lung, lymph node, brain, and bone [2]. Mechanisms that drive metastasis are intense areas of cancer research. Consequently, effective assays to measure metastatic burden in distant sites of metastasis are instrumental for cancer research. Evaluation of lung metastases in mammary tumor models is generally performed by gross qualitative observation of lung tissue following dissection. Quantitative methods of evaluating metastasis are currently limited to ex vivo and in vivo imaging-based techniques that require user-defined parameters. Many of these techniques operate at the whole-organism level rather than the cellular level [3-6]. Although newer imaging methods utilizing multi-photon microscopy are able to evaluate metastasis at the cellular level [7], these highly elegant procedures are better suited to evaluating mechanisms of dissemination than to quantitative assessment of metastatic burden. Here, a simple in vitro method to quantitatively assess metastasis is presented. Using quantitative real-time PCR (QRT-PCR), tumor-cell-specific mRNA can be detected within the mouse lung tissue. PMID:26862835
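One common way to turn qRT-PCR Ct values like those collected in the protocol above into a relative measure of tumor-cell mRNA burden is the 2^-ΔΔCt method; whether this particular protocol uses ΔΔCt or a standard curve is not stated, so the gene roles and Ct numbers below are purely illustrative:

```python
# Hedged sketch of relative quantification via the 2^-ΔΔCt method.
# All Ct values and the calibrator choice are hypothetical.

def fold_change(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Fold change of a tumor-cell-specific transcript relative to a
    reference gene, normalized to a calibrator sample (e.g. tumor-free lung)."""
    delta_sample = ct_target - ct_ref          # ΔCt in the test sample
    delta_cal = ct_target_cal - ct_ref_cal     # ΔCt in the calibrator
    return 2.0 ** -(delta_sample - delta_cal)  # 2^-ΔΔCt

# Lung from a tumor-bearing mouse vs. a tumor-free calibrator lung:
fold = fold_change(ct_target=24.0, ct_ref=18.0,
                   ct_target_cal=30.0, ct_ref_cal=18.0)
# A lower Ct for the tumor marker means more template, hence a higher fold change.
```

This assumes near-100% amplification efficiency for both primer pairs; when efficiencies differ, an efficiency-corrected model or a standard curve is the safer choice.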

  2. A review of published quantitative experimental studies on factors affecting laboratory fume hood performance.

    PubMed

    Ahn, Kwangseog; Woskie, Susan; DiBerardinis, Louis; Ellenbecker, Michael

    2008-11-01

    This study attempted to identify the important factors that affect the performance of a laboratory fume hood and the relationship between the factors and hood performance under various conditions by analyzing and generalizing the results from other studies that quantitatively investigated fume hood performance. A literature search identified 43 studies that were published from 1966 to 2006. For each of those studies, information on the type of test methods used, the factors investigated, and the findings were recorded and summarized. Among the 43 quantitative experimental studies, 21 comparable studies were selected, and then a meta-analysis of the comparable studies was conducted. The exposure concentration variable from the resulting 617 independent test conditions was dichotomized into acceptable or unacceptable using the control level of 0.1 ppm tracer gas. Regression analysis using Cox proportional hazards models provided hood failure ratios for potential exposure determinants. The variables that were found to be statistically significant were the presence of a mannequin/human subject, the distance between a source and breathing zone, and the height of sash opening. In summary, performance of laboratory fume hoods was affected mainly by the presence of a mannequin/human subject, distance between a source and breathing zone, and height of sash opening. Presence of a mannequin/human subject in front of the hood adversely affects hood performance. Worker exposures to air contaminants can be greatly reduced by increasing the distance between the contaminant source and breathing zone and by reducing the height of sash opening. Many other factors can also affect hood performance. Checking face velocity by itself is unlikely to be sufficient in evaluating hood performance properly. An evaluation of the performance of a laboratory fume hood should be performed with a human subject or a mannequin in front of the hood and should address the effects of the activities

  3. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 236.604 Performance evaluation. Prepare a separate performance evaluation after... familiar with the architect-engineer contractor's performance....

  4. Quantitative evaluation of mefenamic acid polymorphs by terahertz-chemometrics.

    PubMed

    Otsuka, Makoto; Nishizawa, Jun-ichi; Shibata, Jiro; Ito, Masahiko

    2010-09-01

The purpose of the present study was to measure polymorphic content in a bulk powder, using the mefenamic acid polymorphs of a pharmaceutical as a model drug, with a THz-spectrometer employing frequency-tunable THz-wave generators based on difference-frequency generation in gallium phosphide crystals. Mefenamic acid polymorphic forms I and II were obtained by recrystallisation. Eleven standard samples with form I content varying from 0 to 100% were prepared by physical mixing. After smoothing and area normalisation, the THz-spectra of all standard samples showed an isosbestic point at 3.70 THz. The THz-spectral data sets were arranged into five frequency ranges and pretreated using various functions, and calibration models were then calculated by the partial least squares regression method. The effect of spectral data management on the chemometric parameters of the calibration models was investigated. The predicted form I content showed excellent linearity against the actual content. On the regression vector (RV) corresponding to the absorption THz-spectral data, the peak at 1.45 THz had the highest value and the peak at 2.25 THz the lowest. THz-spectroscopy with chemometrics would be useful for the quantitative evaluation of mefenamic acid polymorphs in the pharmaceutical industry. This method is expected to provide a rapid and nondestructive quantitative analysis of polymorphs. PMID:20665848
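As a minimal stand-in for the partial least squares model in the abstract, a single-peak calibration can be fit by ordinary least squares. The 1.45 THz peak is taken from the abstract, but the content/absorbance values below are hypothetical:

```python
import numpy as np

# Hypothetical standards: form I content (%) vs. peak absorbance at 1.45 THz
content = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
absorbance = np.array([0.10, 0.22, 0.35, 0.47, 0.60])  # illustrative values only

slope, intercept = np.polyfit(content, absorbance, 1)  # linear calibration model

def predict_content(a):
    """Invert the calibration line to estimate form I content from absorbance."""
    return (a - intercept) / slope

print(round(predict_content(0.35), 1))  # a mid-scale absorbance maps to ~50%
```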

  5. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    ERIC Educational Resources Information Center

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)

  6. Evaluation of Slit Sampler in Quantitative Studies of Bacterial Aerosols

    PubMed Central

    Ehrlich, Richard; Miller, Sol; Idoine, L. S.

    1966-01-01

    Quantitative studies were conducted to evaluate the efficiency of the slit sampler in collecting airborne Serratia marcescens and Bacillus subtilis var. niger, and to compare it with the collecting efficiency of the all-glass impinger AGI-30. The slit sampler was approximately 50% less efficient than the AGI-30. This ratio remained the same whether liquid or dry cultures were disseminated when the sample was taken at 2 min of aerosol cloud life. At 30 min of aerosol cloud life, this ratio was approximately 30% for B. subtilis var. niger. S. marcescens recoveries by the slit sampler were, however, only 17% lower than the AGI-30 at 30 min of cloud age, indicating a possible interaction involving the more labile vegetative cells, aerosol age, and method of collection. PMID:4961550

  7. Quantitative surface evaluation by matching experimental and simulated ronchigram images

    NASA Astrophysics Data System (ADS)

    Kantún Montiel, Juana Rosaura; Cordero Dávila, Alberto; González García, Jorge

    2011-09-01

To estimate surface errors qualitatively with the Ronchi test, experimental and simulated ronchigrams are compared. Recently, surface errors have been obtained quantitatively by matching the intersection-point coordinates of ronchigram fringes with the x-axis. In that case, a Gaussian fit must be done for each fringe, and interference orders are used in the Malacara algorithm for the simulations. In order to evaluate surface errors, we added an error function, described with cubic splines, to the sagitta function of the ideal surface in the simulations. We used the vectorial transversal aberration formula and a ruling with cosinusoidal transmittance, because such rulings better reproduce experimental ronchigram fringe profiles. Several error functions are tried until the whole experimental ronchigram image is reproduced. The optimization process was done using genetic algorithms.

  8. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  9. Quantitative Evaluation and Selection of Reference Genes for Quantitative RT-PCR in Mouse Acute Pancreatitis

    PubMed Central

    Yan, Zhaoping; Gao, Jinhang; Lv, Xiuhe; Yang, Wenjuan; Wen, Shilei; Tong, Huan; Tang, Chengwei

    2016-01-01

    The analysis of differences in gene expression is dependent on normalization using reference genes. However, the expression of many of these reference genes, as evaluated by quantitative RT-PCR, is upregulated in acute pancreatitis, so they cannot be used as the standard for gene expression in this condition. For this reason, we sought to identify a stable reference gene, or a suitable combination, for expression analysis in acute pancreatitis. The expression stability of 10 reference genes (ACTB, GAPDH, 18sRNA, TUBB, B2M, HPRT1, UBC, YWHAZ, EF-1α, and RPL-13A) was analyzed using geNorm, NormFinder, and BestKeeper software and evaluated according to variations in the raw Ct values. These reference genes were evaluated using a comprehensive method, which ranked the expression stability of these genes as follows (from most stable to least stable): RPL-13A, YWHAZ > HPRT1 > GAPDH > UBC > EF-1α > 18sRNA > B2M > TUBB > ACTB. RPL-13A was the most suitable reference gene, and the combination of RPL-13A and YWHAZ was the most stable group of reference genes in our experiments. The expression levels of ACTB, TUBB, and B2M were found to be significantly upregulated during acute pancreatitis, whereas the expression level of 18sRNA was downregulated. Thus, we recommend the use of RPL-13A or a combination of RPL-13A and YWHAZ for normalization in qRT-PCR analyses of gene expression in mouse models of acute pancreatitis. PMID:27069927
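The stability ranking idea in this abstract can be illustrated with the BestKeeper-style criterion it mentions (variation in raw Ct values, lower = more stable); the Ct numbers below are invented for illustration:

```python
import statistics

# Hypothetical raw Ct values per candidate reference gene (illustrative only)
ct_values = {
    "RPL-13A": [18.1, 18.3, 18.2, 18.4],  # stable across conditions
    "GAPDH":   [17.0, 17.6, 18.1, 17.3],
    "ACTB":    [16.0, 17.5, 18.9, 15.2],  # upregulated in pancreatitis samples
}

def rank_by_stability(ct):
    """Rank genes by the standard deviation of their Ct values (ascending)."""
    sd = {gene: statistics.pstdev(vals) for gene, vals in ct.items()}
    return sorted(sd, key=sd.get)

print(rank_by_stability(ct_values))  # ['RPL-13A', 'GAPDH', 'ACTB']
```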

  10. Quantitative genetic activity graphical profiles for use in chemical evaluation

    SciTech Connect

    Waters, M.D.; Stack, H.F.; Garrett, N.E.; Jackson, M.A.

    1990-12-31

    A graphic approach, terms a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profile was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiles by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. Through examination of the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in evaluation of chemical analogs. GAPs provided useful data for development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.

  11. [Clinical evaluation of a novel HBsAg quantitative assay].

    PubMed

    Takagi, Kazumi; Tanaka, Yasuhito; Naganuma, Hatsue; Hiramatsu, Kumiko; Iida, Takayasu; Takasaka, Yoshimitsu; Mizokami, Masashi

    2007-07-01

The clinical implication of hepatitis B surface antigen (HBsAg) concentrations in HBV-infected individuals remains unclear. The aim of this study was to evaluate a novel fully automated chemiluminescence enzyme immunoassay (Sysmex HBsAg quantitative assay) by comparative measurements of reference serum samples against two independent commercial assays (Lumipulse f and Architect HBsAg QT). Furthermore, its clinical usefulness was assessed for monitoring serum HBsAg levels during antiviral therapy. A dilution test using 5 reference-serum samples showed a linear correlation in the range from 0.03 to 2,360 IU/ml. HBsAg was measured in a total of 400 serum samples, and 99.8% had consistent results between Sysmex and Lumipulse f. Additionally, a positive linear correlation was observed between Sysmex and Architect. To compare the Architect and Sysmex assays, both methods were applied to quantify HBsAg in serum samples with different HBV genotypes/subgenotypes, as well as in serum containing HBV vaccine escape mutants (126S, 145R). Correlation between the methods was observed in the results for the escape mutants and for the genotypes (A, B, C) common in Japan. During lamivudine therapy, an increase in HBsAg and HBV DNA concentrations was observed to precede the alanine aminotransferase (ALT) elevation associated with the emergence of drug-resistant HBV variants (breakthrough hepatitis). In conclusion, the reliability of the Sysmex HBsAg quantitative assay was confirmed for all HBV genetic variants common in Japan. Monitoring of serum HBsAg concentrations, in addition to HBV DNA quantification, is helpful in evaluating the response to lamivudine treatment and in diagnosing breakthrough hepatitis.

  13. Evaluation of a virucidal quantitative carrier test for surface disinfectants.

    PubMed

    Rabenau, Holger F; Steinmann, Jochen; Rapp, Ingrid; Schwebke, Ingeborg; Eggers, Maren

    2014-01-01

Surface disinfectants are part of broader strategies for preventing the transmission of bacteria, fungi and viruses in medical institutions. To evaluate their virucidal efficacy, these products must be tested with appropriate model viruses with different physico-chemical properties under conditions representing practical application in hospitals. The aim of this study was to evaluate a quantitative carrier assay. Furthermore, different putative model viruses such as adenovirus type 5 (AdV-5) and different animal parvoviruses were evaluated with respect to their tenacity and practicability in laboratory handling. To evaluate the robustness of the method, some of the viruses were tested in parallel in different laboratories in a multi-center study. Different biocides, which are common active ingredients of surface disinfectants, were used in the test. After drying on stainless steel discs as the carrier, model viruses were exposed to different concentrations of three alcohols, peracetic acid (PAA) or glutaraldehyde (GDA), with a fixed exposure time of 5 minutes. Residual virus was determined after treatment by endpoint titration. All parvoviruses exhibited a similar stability with respect to GDA, while AdV-5 was more susceptible. For PAA, the porcine parvovirus was more sensitive than the other parvoviruses, and again, AdV-5 presented a higher susceptibility than the parvoviruses. All parvoviruses were resistant to the alcohols, while AdV-5 was only stable when treated with 2-propanol. The analysis of the results of the multi-center study showed a high reproducibility of this test system. In conclusion, two viruses with different physico-chemical properties can be recommended as appropriate model viruses for the evaluation of the virucidal efficacy of surface disinfectants: AdV-5, which has a high clinical impact, and murine parvovirus (MVM), which had the highest practicability among the parvoviruses tested. PMID:24475079
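The endpoint-titration readout described above is usually summarized as a log10 reduction factor. The titers below are hypothetical, and the 4-log benchmark is a common convention in disinfectant testing rather than a figure from this study:

```python
import math

def log10_reduction(control_titer, treated_titer):
    """Log10 reduction factor between untreated and disinfectant-treated carriers."""
    return math.log10(control_titer) - math.log10(treated_titer)

# Hypothetical titers (TCID50/mL) recovered from the stainless steel carriers
rf = log10_reduction(control_titer=1e7, treated_titer=1e3)
print(rf)         # 4.0
print(rf >= 4.0)  # meets the commonly used >= 4 log10 reduction criterion
```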

  14. SEASAT SAR performance evaluation study

    NASA Technical Reports Server (NTRS)

    1982-01-01

The performance of the SEASAT synthetic aperture radar (SAR) sensor was evaluated using data processed by the MDA digital processor. Two particular aspects are considered: the location accuracy of image data, and the calibration of the measured backscatter amplitude of a set of corner reflectors. The image location accuracy was assessed by selecting identifiable targets in several scenes, converting their image locations to UTM coordinates, and comparing the results to map sheets. The error standard deviation was measured to be approximately 30 meters. The amplitude was calibrated by measuring the responses of the Goldstone corner reflector array and comparing the results to theoretical values. A linear regression of the measured against theoretical values yields a slope of 0.954 with a correlation coefficient of 0.970.
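The amplitude-calibration step above is a straight linear regression of measured against theoretical responses. The corner-reflector numbers below are invented for illustration, so the fitted slope differs from the paper's 0.954:

```python
import numpy as np

# Hypothetical corner-reflector responses (dB); not the Goldstone measurements
theoretical = np.array([20.0, 25.0, 30.0, 35.0])
measured = np.array([19.5, 24.0, 28.6, 33.4])

# Fit measured = slope * theoretical + intercept, and compute the correlation
slope, intercept = np.polyfit(theoretical, measured, 1)
r = np.corrcoef(theoretical, measured)[0, 1]

print(round(slope, 3))  # a slope near (but below) 1, as in the paper
print(r > 0.99)         # near-perfect linear correlation
```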

  15. Quantitative evaluation of digital dental radiograph imaging systems.

    PubMed

    Hildebolt, C F; Vannier, M W; Pilgram, T K; Shrout, M K

    1990-11-01

    Two digital imaging systems, a video camera and analog-to-digital converter, and a charge-coupled device linear photodiode array slide scanner, were tested for their suitability in quantitative studies of periodontal disease. The information content in the original films was estimated, and digital systems were assessed according to these requirements. Radiometric and geometric performance criteria for the digital systems were estimated from measurements and observations. The scanner-based image acquisition (digitization) system had no detectable noise and had a modulation transfer function curve superior to that of the video-based system. The scanner-based system was equivalent to the video-based system in recording radiographic film densities and had more geometric distortion than the video-based system. The comparison demonstrated the superiority of the charge-coupled device linear array system for the quantification of periodontal disease extent and activity. PMID:2234888

  16. 13 CFR 304.4 - Performance evaluations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Performance evaluations. 304.4... ECONOMIC DEVELOPMENT DISTRICTS § 304.4 Performance evaluations. (a) EDA shall evaluate the management... of at least one (1) other District Organization in the performance evaluation on a...

  17. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Performance evaluation... Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor... reports must be made using Standard Form 1421, Performance Evaluation (Architect-Engineer) as...

  18. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Performance evaluation... Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor... reports must be made using Standard Form 1421, Performance Evaluation (Architect-Engineer) as...

  19. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Performance evaluation... Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor... reports must be made using Standard Form 1421, Performance Evaluation (Architect-Engineer) as...

  20. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Performance evaluation... Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor... reports must be made using Standard Form 1421, Performance Evaluation (Architect-Engineer) as...

  1. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  2. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Performance evaluation... Architect-Engineer Services 236.604 Performance evaluation. (a) Preparation of performance reports. Use DD Form 2631, Performance Evaluation (Architect-Engineer), instead of SF 1421. (2) Prepare a...

  3. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  4. Quantitative methods for somatosensory evaluation in atypical odontalgia.

    PubMed

    Porporatti, André Luís; Costa, Yuri Martins; Stuginski-Barbosa, Juliana; Bonjardim, Leonardo Rigoldi; Conti, Paulo César Rodrigues; Svensson, Peter

    2015-01-01

A systematic review was conducted to identify reliable somatosensory evaluation methods for atypical odontalgia (AO) patients. The computerized search included the main databases (MEDLINE, EMBASE, and Cochrane Library). The included studies used the following quantitative sensory testing (QST) methods: mechanical detection threshold (MDT), mechanical pain threshold (MPT) (pinprick), pressure pain threshold (PPT), dynamic mechanical allodynia with a cotton swab (DMA1) or a brush (DMA2), warm detection threshold (WDT), cold detection threshold (CDT), heat pain threshold (HPT), cold pain threshold (CPT), and/or wind-up ratio (WUR). The publications meeting the inclusion criteria revealed that in AO patients only the mechanical allodynia tests (DMA1, DMA2, and WUR) were significantly higher, and the pain threshold to heat stimulation (HPT) significantly lower, on the affected side compared with the contralateral side; for MDT, MPT, PPT, CDT, and WDT, the differences were not significant. These data support the presence of central sensitization features, such as allodynia and temporal summation. In contrast, considerable inconsistencies between studies were found when AO patients were compared with healthy subjects. In clinical settings, the most reliable evaluation method for AO in patients with persistent idiopathic facial pain would be intraindividual assessment using HPT or mechanical allodynia tests. PMID:25627886

  5. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for evaluating midpalatal suture maturation. Methods The study included 131 subjects over 18 years of age (range, 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimension were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A test in which fractal dimension was used to predict the variable that splits maturation stages into A–C versus D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
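The reported cut-off can be applied as a simple dichotomous classifier. The value 1.0235 comes from the abstract, while the helper name and example inputs are illustrative:

```python
FD_CUTOFF = 1.0235  # optimal fractal-dimension cut-off reported in the study

def predicted_stage_group(fractal_dimension, cutoff=FD_CUTOFF):
    """Split subjects into early (A-C) vs. late (D/E) maturation-stage groups.
    Fractal dimension correlates negatively with maturation stage, so values
    above the cut-off suggest the less mature A-C group."""
    return "A-C" if fractal_dimension > cutoff else "D-E"

print(predicted_stage_group(1.10))  # A-C
print(predicted_stage_group(0.98))  # D-E
```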

  7. Improving Student Retention and Performance in Quantitative Courses Using Clickers

    ERIC Educational Resources Information Center

    Liu, Wallace C.; Stengel, Donald N.

    2011-01-01

    Clickers offer instructors of mathematics-related courses an opportunity to involve students actively in class sessions while diminishing the embarrassment of being wrong. This paper reports on the use of clickers in two university-level courses in quantitative analysis and business statistics. Results for student retention and examination…

  8. Improving Library Performance: Quantitative Approaches to Library Planning.

    ERIC Educational Resources Information Center

    Webster, Duane E.

    The use of analytical models and quantitative methods for both short- and long-range problem solving offer library managers an excellent opportunity to improve and rationalize decision-making for strategic and organizational planning. The first step is to identify the problems confronting the library and understand its current capabilities.…

  9. Qualitative and quantitative evaluation of solvent systems for countercurrent separation.

    PubMed

    Friesen, J Brent; Ahmed, Sana; Pauli, Guido F

    2015-01-16

Rational solvent system selection for countercurrent chromatography and centrifugal partition chromatography technology (collectively known as countercurrent separation) continues to be a scientific challenge, as the fundamental questions of comparing polarity range and selectivity within a solvent system family and between putatively orthogonal solvent systems remain unanswered. The current emphasis on metabolomic investigations and the analysis of complex mixtures necessitates the use of successive orthogonal countercurrent separation (CS) steps as part of complex fractionation protocols. Addressing the broad range of metabolite polarities demands the development of new CS solvent systems with appropriate composition, polarity (π), selectivity (σ), and suitability. In this study, a mixture of twenty commercially available natural products, called the GUESSmix, was utilized to evaluate both solvent system polarity and selectivity characteristics. Comparisons of GUESSmix analyte partition coefficient (K) values give rise to a measure of solvent system polarity range called the GUESSmix polarity index (GUPI). Solvatochromic dye and electrical permittivity measurements were also evaluated as quantitative assessments of solvent system polarity. The relative selectivity of solvent systems was evaluated with the GUESSmix by calculating the pairwise resolution (αip), the number of analytes found in the sweet spot (Nsw), and the pairwise resolution of those sweet-spot analytes (αsw). The combination of these parameters allowed both intra- and inter-family comparison of solvent system selectivity. Finally, 2-dimensional reciprocal shifted symmetry plots (ReSS(2)) were created to visually compare both the polarities and selectivities of solvent system pairs. This study helps to pave the way toward the development of new solvent systems that are amenable to successive orthogonal CS protocols employed in metabolomic studies. PMID:25542704
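Two of the selectivity metrics above reduce to short computations on partition coefficients. The K values below are invented, and the 0.4–2.5 "sweet spot" window is a commonly used countercurrent-separation convention, assumed here rather than quoted from the paper:

```python
# Hypothetical partition coefficients for a few GUESSmix-style analytes
K = {"caffeine": 0.2, "vanillin": 0.9, "quercetin": 2.1, "carvone": 6.0}

SWEET_LO, SWEET_HI = 0.4, 2.5  # assumed sweet-spot bounds (common CS convention)

def sweet_spot_analytes(k_values):
    """Analytes whose K falls in the sweet spot; Nsw is the length of this list."""
    return [name for name, k in k_values.items() if SWEET_LO <= k <= SWEET_HI]

def pairwise_resolution(k1, k2):
    """alpha is conventionally the larger K divided by the smaller (alpha >= 1)."""
    hi, lo = (k1, k2) if k1 >= k2 else (k2, k1)
    return hi / lo

print(sweet_spot_analytes(K))                    # ['vanillin', 'quercetin']
print(round(pairwise_resolution(0.9, 2.1), 2))   # 2.33
```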

  11. Performance evaluation methodology for historical document image binarization.

    PubMed

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme. PMID:23008259
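The pixel-based recall and precision at the heart of the scheme (before the paper's weighting modification) can be computed directly; the tiny ground-truth/prediction grids below are illustrative:

```python
def pixel_recall_precision(gt, pred):
    """Unweighted pixel-based recall/precision for a binarization result.
    gt and pred are same-size 2D lists with 1 = text pixel, 0 = background."""
    tp = fp = fn = 0
    for gt_row, pred_row in zip(gt, pred):
        for g, p in zip(gt_row, pred_row):
            if g == 1 and p == 1:
                tp += 1          # text pixel correctly kept
            elif g == 0 and p == 1:
                fp += 1          # false alarm (background marked as text)
            elif g == 1 and p == 0:
                fn += 1          # missed text
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    return recall, precision

gt   = [[1, 1, 0], [0, 1, 0]]   # ground-truth text pixels
pred = [[1, 0, 0], [0, 1, 1]]   # output with one missed pixel, one false alarm

print(pixel_recall_precision(gt, pred))  # both recall and precision are 2/3
```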

  12. Ground truth and benchmarks for performance evaluation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Ayako; Shneier, Michael; Hong, Tsai Hong; Chang, Tommy; Scrapper, Christopher; Cheok, Geraldine S.

    2003-09-01

    Progress in algorithm development and transfer of results to practical applications such as military robotics requires the setup of standard tasks and of standard qualitative and quantitative measurements for performance evaluation and validation. Although the evaluation and validation of algorithms have been discussed for over a decade, the research community still faces a lack of well-defined and standardized methodology. The fundamental problems include a lack of quantifiable measures of performance, a lack of data from state-of-the-art sensors in calibrated real-world environments, and a lack of facilities for conducting realistic experiments. In this research, we propose three methods for creating ground truth databases and benchmarks using multiple sensors. The databases and benchmarks will provide researchers with high quality data from suites of sensors operating in complex environments representing real problems of great relevance to the development of autonomous driving systems. At NIST, we have prototyped a High Mobility Multi-purpose Wheeled Vehicle (HMMWV) system with a suite of sensors including a Riegl ladar, GDRS ladar, stereo CCD, several color cameras, Global Position System (GPS), Inertial Navigation System (INS), pan/tilt encoders, and odometry. All sensors are calibrated with respect to each other in space and time. This allows a database of features and terrain elevation to be built. Ground truth for each sensor can then be extracted from the database. The main goal of this research is to provide ground truth databases for researchers and engineers to evaluate algorithms for effectiveness, efficiency, reliability, and robustness, thus advancing the development of algorithms.

  13. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL... Performance evaluation. See 42.1502(f) for the requirements for preparing past performance evaluations...

  14. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL... Performance evaluation. See 42.1502(f) for the requirements for preparing past performance evaluations...

  15. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL... Performance evaluation. See 42.1502(f) for the requirements for preparing past performance evaluations...

  16. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Performance evaluation. 236.604 Section 236.604 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM... Architect-Engineer Services 236.604 Performance evaluation. Prepare a separate performance evaluation...

  17. 13 CFR 304.4 - Performance evaluations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Performance evaluations. 304.4... ECONOMIC DEVELOPMENT DISTRICTS § 304.4 Performance evaluations. (a) EDA shall evaluate the management... the District Organization continues to receive Investment Assistance. EDA's evaluation shall...

  18. Energy performance evaluation of AAC

    NASA Astrophysics Data System (ADS)

    Aybek, Hulya

    The U.S. building industry constitutes the largest consumer of energy (i.e., electricity, natural gas, petroleum) in the world. The building sector uses almost 41 percent of the primary energy and approximately 72 percent of the available electricity in the United States. As global energy-generating resources are being depleted at exponential rates, the amount of energy consumed and wasted cannot be ignored. Professionals concerned about the environment have placed a high priority on finding solutions that reduce energy consumption while maintaining occupant comfort. Sustainable design and the judicious combination of building materials comprise one solution to this problem. A future including sustainable energy may result from using energy simulation software to accurately estimate energy consumption and from applying building materials that achieve the potential results derived through simulation analysis. Energy-modeling tools assist professionals with making informed decisions about energy performance during the early planning phases of a design project, such as determining the most advantageous combination of building materials, choosing mechanical systems, and determining building orientation on the site. By implementing energy simulation software to estimate the effect of these factors on the energy consumption of a building, designers can make adjustments to their designs during the design phase when the effect on cost is minimal. The primary objective of this research consisted of identifying a method with which to properly select energy-efficient building materials and involved evaluating the potential of these materials to earn LEED credits when properly applied to a structure. In addition, this objective included establishing a framework that provides suggestions for improvements to currently available simulation software that enhance the viability of the estimates concerning energy efficiency and the achievements of LEED credits. 

  19. Evaluating GC/MS Performance

    SciTech Connect

    Alcaraz, A; Dougan, A

    2006-11-26

    and Water Check': By selecting View - Diagnostics/Vacuum Control - Vacuum - Air and Water Check. A Yes/No dialogue box will appear; select No (use current values). It is very important to select No! Otherwise the tune values are drastically altered. The software program will generate a water/air report similar to figure 3. Evaluating the GC/MS system with a performance standard: This procedure should allow the analyst to verify that the chromatographic column and associated components are working adequately to separate the various classes of chemical compounds (e.g., hydrocarbons, alcohols, fatty acids, aromatics, etc.). Use the same GC/MS conditions used to collect the system background and solvent check (part 1 of this document). Figure 5 is an example of a commercial GC/MS column test mixture used to evaluate GC/MS prior to analysis.

  20. A novel quantitative approach for evaluating contact mechanics of meniscal replacements.

    PubMed

    Linder-Ganz, E; Elsner, J J; Danino, A; Guilak, F; Shterling, A

    2010-02-01

    One of the functions of the meniscus is to distribute contact forces over the articular surfaces by increasing the joint contact areas. It is widely accepted that total/partial loss of the meniscus increases the risk of joint degeneration. A short-term method for evaluating whether degenerative arthritis can be prevented or not would be to determine if the peak pressure and contact area coverage of the tibial plateau (TP) in the knee are restored at the time of implantation. Although several published studies already utilized TP contact pressure measurements as an indicator for biomechanical performance of allograft menisci, there is a paucity of a quantitative method for evaluation of these parameters in situ with a single effective parameter. In the present study, we developed such a method and used it to assess the load distribution ability of various meniscal implant configurations in human cadaveric knees (n=3). Contact pressures under the intact meniscus were measured under compression (1200 N, 0 deg flexion). Next, total meniscectomy was performed and the protocol was repeated with meniscal implants. Resultant pressure maps were evaluated for the peak pressure value, total contact area, and its distribution pattern, all with respect to the natural meniscus output. Two other measures--implant-dislocation and implant-impingement on the ligaments--were also considered. If any of these occurred, the score was zeroed. The total implant score was based on an adjusted calculation of the aforementioned measures, where the natural meniscus score was always 100. Laboratory experiments demonstrated a good correlation between qualitative and quantitative evaluations of the same pressure map outputs, especially in cases where there were contradicting indications between different parameters. Overall, the proposed approach provides a novel, validated method for quantitative assessment of the biomechanical performance of meniscal implants, which can be used in various
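
    The scoring logic described above (component measures combined into one score, the natural meniscus fixed at 100, and the score zeroed on dislocation or impingement) can be sketched as follows. The weights and the handling of the ratios are hypothetical illustrations, not the paper's calibration.

```python
def implant_score(peak_ratio, area_ratio, pattern_ratio,
                  dislocated=False, impinged=False,
                  weights=(0.4, 0.4, 0.2)):
    """Hypothetical composite score relative to the natural meniscus (=100).

    Each *_ratio compares the implant's pressure-map measure with the
    natural meniscus (1.0 = identical). Dislocation or ligament
    impingement zeroes the score, as the abstract describes.
    """
    if dislocated or impinged:
        return 0.0
    w_peak, w_area, w_pattern = weights
    # Deviating from the natural meniscus in either direction should
    # lower the score, so ratios above 1 are inverted.
    peak_term = min(peak_ratio, 1 / peak_ratio) if peak_ratio > 0 else 0.0
    area_term = min(area_ratio, 1 / area_ratio) if area_ratio > 0 else 0.0
    pattern_term = max(0.0, min(pattern_ratio, 1.0))
    return 100.0 * (w_peak * peak_term + w_area * area_term
                    + w_pattern * pattern_term)
```

    An implant that exactly reproduces the natural pressure map scores 100; doubling the peak pressure, for instance, pulls the score down through the inverted ratio.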

  1. Towards a Confluence of Quantitative and Qualitative Approaches to Curriculum Evaluation.

    ERIC Educational Resources Information Center

    Smith, D. L.; Fraser, B. J.

    1980-01-01

    Discusses a project in which quantitative and qualitative methodologies were combined in an evaluation of the High School Education Law Project (HELP) in Australia. Qualitative and quantitative evaluation were combined in several aspects of the study including field testing of preliminary versions of HELP materials, further evaluation work on…

  2. How To Evaluate Teacher Performance.

    ERIC Educational Resources Information Center

    Wilson, Laval S.

    Teacher evaluations tend to be like clothes. Whatever is in vogue at the time is utilized extensively by those who are attempting to remain modern and current. If you stay around long enough, the "hot" methods of today will probably recycle to be the new discovery of the future. In the end, each school district develops an evaluation process that…

  3. Quantitative comparison between crowd models for evacuation planning and evaluation

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vaisagh; Lee, Chong Eu; Lees, Michael Harold; Cheong, Siew Ann; Sloot, Peter M. A.

    2014-02-01

    Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we describe a procedure to quantitatively compare different crowd models or between models and real-world data. We simulated three models: (1) the lattice gas model, (2) the social force model, and (3) the RVO2 model, and obtained the distributions of six observables: (1) evacuation time, (2) zoned evacuation time, (3) passage density, (4) total distance traveled, (5) inconvenience, and (6) flow rate. We then used the DISTATIS procedure to compute the compromise matrix of statistical distances between the three models. Projecting the three models onto the first two principal components of the compromise matrix, we find the lattice gas and RVO2 models are similar in terms of the evacuation time, passage density, and flow rates, whereas the social force and RVO2 models are similar in terms of the total distance traveled. Most importantly, we find the zoned evacuation times of the three models to be very different from each other. Thus we propose to use this variable, if it can be measured, as the key test between different models, and also between models and the real world. Finally, we compared the model flow rates against the flow rate of an emergency evacuation during the May 2008 Sichuan earthquake, and found the social force model agrees best with this real data.
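
    One ingredient of such a comparison, the statistical distance between two models' distributions of a single observable, can be sketched with the two-sample Kolmogorov-Smirnov statistic. DISTATIS itself, which aggregates the per-observable distance matrices into a compromise matrix, is not reproduced here.

```python
import bisect

def ks_distance(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap
    between the empirical CDFs of the two samples."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(xs, x):
        # fraction of sorted values <= x
        return bisect.bisect_right(xs, x) / len(xs)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in sorted(set(a) | set(b)))

def distance_matrix(samples_by_model):
    """Pairwise KS distances for one observable across several models."""
    names = list(samples_by_model)
    return {(m, n): ks_distance(samples_by_model[m], samples_by_model[n])
            for m in names for n in names}
```

    Feeding each observable (evacuation time, passage density, and so on) through `distance_matrix` yields one matrix per observable, the inputs DISTATIS would then combine.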

  4. Dual-band infrared thermography for quantitative nondestructive evaluation

    SciTech Connect

    Durbin, P.F.; Del Grande, N.K.; Dolan, K.W.; Perkins, D.E.; Shapiro, A.B.

    1993-04-01

    The authors have developed dual-band infrared (DBIR) thermography that is being applied to quantitative nondestructive evaluation (NDE) of aging aircraft. The DBIR technique resolves 0.2 degrees C surface temperature differences for inspecting interior flaws in heated aircraft structures. It locates cracks, corrosion sites, disbonds or delaminations in metallic laps and composite patches. By removing clutter from surface roughness effects, the authors clarify interpretation of subsurface flaws. To accomplish this, the authors ratio images recorded at two infrared bands, centered near 5 microns and 10 microns. These image ratios are used to decouple temperature patterns associated with interior flaw sites from spatially varying surface emissivity noise. They also discuss three-dimensional (3D) dynamic thermal imaging of structural flaws using dual-band infrared (DBIR) computed tomography. Conventional thermography provides single-band infrared images which are difficult to interpret. Standard procedures yield imprecise (or qualitative) information about subsurface flaw sites which are typically masked by surface clutter. They use a DBIR imaging technique pioneered at LLNL to capture the time history of surface temperature difference patterns for flash-heated targets. They relate these patterns to the location, size, shape and depth of subsurface flaws. They have demonstrated temperature accuracies of 0.2 °C, timing synchronization of 3 ms (after onset of heat flash) and intervals of 42 ms, between images, during an 8 s cooling (and heating) interval characterizing the front (and back) surface temperature-time history of an epoxy-glue disbond site in a flash-heated aluminum lap joint.
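
    The core ratioing step can be sketched as a pixelwise division of the two band images. This is a deliberate simplification: emissivity variations that scale both bands by a common factor cancel in the ratio, but the real technique must also account for the bands' different Planck-law responses to temperature.

```python
def dbir_ratio(band5, band10, eps=1e-9):
    """Pixelwise ratio of two IR band images (2-D lists of radiance values),
    e.g. the ~5 micron band over the ~10 micron band.

    Multiplicative emissivity clutter common to both bands cancels,
    leaving temperature-driven contrast. Simplified sketch only.
    """
    return [[b5 / (b10 + eps) for b5, b10 in zip(r5, r10)]
            for r5, r10 in zip(band5, band10)]
```

    A pixel whose emissivity halves both band radiances produces the same ratio as an unclouded pixel at the same temperature, which is the clutter-rejection idea behind the method.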

  5. Method for evaluating performance of clinical pharmacists.

    PubMed

    Schumock, G T; Leister, K A; Edwards, D; Wareham, P S; Burkhart, V D

    1990-01-01

    A performance-evaluation process that satisfies Joint Commission on Accreditation of Healthcare Organizations criteria and state policies is described. A three-part, criteria-based, weighted performance-evaluation tool specific for clinical pharmacists was designed for use in two institutions affiliated with the University of Washington. The three parts are self-appraisal and goal setting, peer evaluation, and supervisory evaluation. Objective criteria within each section were weighted to reflect the relative importance of that characteristic to the job that the clinical pharmacist performs. The performance score for each criterion is multiplied by the weighted value to produce an outcome score. The peer evaluation and self-appraisal/goal-setting parts of the evaluation are completed before the formal performance-evaluation interview. The supervisory evaluation is completed during the interview. For this evaluation, supervisors use both the standard university employee performance evaluation form and a set of specific criteria applicable to the clinical pharmacists in these institutions. The first performance evaluations done under this new system were conducted in May 1989. Pharmacists believed that the new system was more objective and allowed more interchange between the manager and the pharmacist. The peer-evaluation part of the system was seen as extremely constructive. This three-part, criteria-based system for evaluation of the job performance of clinical pharmacists could easily be adopted by other pharmacy departments.
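
    The weighted scoring rule described above (each criterion's performance score multiplied by its weight to produce an outcome score) can be sketched as follows; the criterion names and weights are illustrative only, not those used at the two institutions.

```python
def outcome_scores(scores, weights):
    """Per-criterion outcome scores and their total.

    scores:  criterion -> performance score assigned in the evaluation
    weights: criterion -> weight reflecting that criterion's relative
             importance to the clinical pharmacist's job
    """
    outcomes = {c: scores[c] * weights[c] for c in scores}
    return outcomes, sum(outcomes.values())
```

    For example, a pharmacist scoring 4 on a clinical criterion weighted 0.6 and 3 on a teaching criterion weighted 0.4 receives outcome scores of 2.4 and 1.2, totaling 3.6.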

  6. Using hybrid method to evaluate the green performance in uncertainty.

    PubMed

    Tseng, Ming-Lang; Lan, Lawrence W; Wang, Ray; Chiu, Anthony; Cheng, Hui-Ping

    2011-04-01

    Green performance measure is vital for enterprises in making continuous improvements to maintain sustainable competitive advantages. Evaluation of green performance, however, is a challenging task due to the dependence complexity of the aspects, criteria, and the linguistic vagueness of some qualitative information and quantitative data together. To deal with this issue, this study proposes a novel approach to evaluate the dependence aspects and criteria of a firm's green performance. The rationale of the proposed approach, namely green network balanced scorecard, is using balanced scorecard to combine fuzzy set theory with analytical network process (ANP) and importance-performance analysis (IPA) methods, wherein fuzzy set theory accounts for the linguistic vagueness of qualitative criteria and ANP converts the relations among the dependence aspects and criteria into an intelligible structural model used by IPA. For the empirical case study, four dependence aspects and 34 green performance criteria for PCB firms in Taiwan were evaluated. The managerial implications are discussed. PMID:20571885
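
    A common way fuzzy set theory turns linguistic vagueness into numbers, and a plausible ingredient of such a scheme, is to map linguistic ratings to triangular fuzzy numbers and defuzzify them by centroid. The scale below is an illustrative assumption; the paper's full fuzzy ANP/IPA pipeline is not reproduced.

```python
def defuzzify_centroid(tfn):
    """Centroid of a triangular fuzzy number (l, m, u): (l + m + u) / 3."""
    l, m, u = tfn
    return (l + m + u) / 3

# Hypothetical linguistic scale on [0, 1]; not the paper's.
LINGUISTIC = {
    'low':    (0.0, 0.0, 0.25),
    'medium': (0.25, 0.5, 0.75),
    'high':   (0.75, 1.0, 1.0),
}
```

    A rater's "medium" thus becomes the crisp value 0.5, and averaging several raters' defuzzified values gives one input a method like IPA can plot.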

  7. Infrared radiometric technique for rapid quantitative evaluation of heat flux distribution over large areas

    NASA Astrophysics Data System (ADS)

    Glazer, Stuart; Siebes, Georg

    1989-03-01

    This paper describes a novel approach for rapid, quantitative measurement of spatially distributed heat flux incident on a plane. The technique utilizes the spatial temperature distribution on an opaque thin film at the location of interest, as measured by an imaging infrared radiometer. Knowledge of film radiative properties, plus quantitative estimates of convection cooling permit the steady state energy balance at any location on the film sheet to be solved for the incident heat flux. Absolute accuracies on the order of 10-15 percent have been obtained in tests performed in air. The method is particularly useful for evaluation of spatial heat flux uniformity from distributed heat sources over large areas. It has recently been used in several applications at the Jet Propulsion Laboratory, including flux uniformity measurements from large distributed quartz lamp arrays used during thermal vacuum testing of several spacecraft components, and flux mapping of a low-power Nd:YAG laser beam.
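
    The steady-state energy balance the method solves can be sketched for a simplified one-sided film: absorbed incident flux balances radiative and convective losses at each pixel. The real film also loses heat from its back face and uses measured radiative properties, so this is a sketch under stated assumptions, not the paper's full model.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def incident_flux(t_film, t_amb, absorptivity, emissivity, h_conv):
    """Incident heat flux (W/m^2) from a one-sided steady-state balance:

        absorptivity * q_inc = emissivity * SIGMA * (T^4 - T_amb^4)
                               + h_conv * (T - T_amb)

    t_film, t_amb in kelvin; h_conv is the convection coefficient
    estimated for the test conditions.
    """
    radiated = emissivity * SIGMA * (t_film**4 - t_amb**4)
    convected = h_conv * (t_film - t_amb)
    return (radiated + convected) / absorptivity
```

    Applying this per pixel of the radiometer's temperature map yields the spatial flux distribution; a film at ambient temperature implies zero incident flux, and hotter pixels imply proportionally more.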

  8. Quantitative polarization and flow evaluation of choroid and sclera by multifunctional Jones matrix optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Sugiyama, S.; Hong, Y.-J.; Kasaragod, D.; Makita, S.; Miura, M.; Ikuno, Y.; Yasuno, Y.

    2016-03-01

    Quantitative evaluation of optical properties of choroid and sclera are performed by multifunctional optical coherence tomography. Five normal eyes, five glaucoma eyes and one choroidal atrophy eye are examined. The refractive error was found to be correlated with choroidal birefringence, polarization uniformity, and flow in addition to scleral birefringence among normal eyes. The significant differences were observed between the normal and the glaucoma eyes, as for choroidal polarization uniformity, flow and scleral birefringence. An automatic segmentation algorithm of retinal pigment epithelium and chorioscleral interface based on multifunctional signals is also presented.

  9. Evaluating quantitative formulas for dose-response assessment of chemical mixtures.

    PubMed Central

    Hertzberg, Richard C; Teuschler, Linda K

    2002-01-01

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment difficult, lack of data on mixture dose-response relationships, and the need to address risk from combinations of chemicals because of public demands and statutory requirements. Consequently, the U.S. Environmental Protection Agency has developed methods for carrying out quantitative dose-response assessment for chemical mixtures that require information only on the toxicity of single chemicals and of chemical pair interactions. These formulas are based on plausible ideas and default parameters but minimal supporting data on whole mixtures. Because of this lack of mixture data, the usual evaluation of accuracy (predicted vs. observed) cannot be performed. Two approaches to the evaluation of such formulas are to consider fundamental biological concepts that support the quantitative formulas (e.g., toxicologic similarity) and to determine how well the proposed method performs under simplifying constraints (e.g., as the toxicologic interactions disappear). These ideas are illustrated using dose addition and two weight-of-evidence formulas for incorporating toxicologic interactions. PMID:12634126
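
    The simplest of the formulas discussed, plain dose addition expressed as a hazard index, can be sketched as follows. The weight-of-evidence modifications that incorporate pairwise toxicologic interactions are the subject of the paper's evaluation and are not reproduced here.

```python
def hazard_index(doses, reference_doses):
    """Dose-addition hazard index: the sum over chemicals of exposure
    dose divided by that chemical's reference dose. HI > 1 flags
    potential concern for the mixture under the dose-addition default.
    """
    return sum(doses[c] / reference_doses[c] for c in doses)
```

    Note the behavior the paper uses as a sanity check: as interactions disappear, interaction-based formulas should reduce to exactly this additive form.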

  10. Segmentation and quantitative evaluation of brain MRI data with a multiphase 3D implicit deformable model

    NASA Astrophysics Data System (ADS)

    Angelini, Elsa D.; Song, Ting; Mensh, Brett D.; Laine, Andrew

    2004-05-01

    Segmentation of three-dimensional anatomical brain images into tissue classes has applications in both clinical and research settings. This paper presents the implementation and quantitative evaluation of a four-phase three-dimensional active contour implemented with a level set framework for automated segmentation of brain MRIs. The segmentation algorithm performs an optimal partitioning of three-dimensional data based on homogeneity measures that naturally evolves to the extraction of different tissue types in the brain. Random seed initialization was used to speed up numerical computation and avoid the need for a priori information. This random initialization ensures robustness of the method to variation of user expertise, biased a priori information and errors in input information that could be influenced by variations in image quality. Experimentation on three MRI brain data sets showed that an optimal partitioning successfully labeled regions that accurately identified white matter, gray matter and cerebrospinal fluid in the ventricles. Quantitative evaluation of the segmentation was performed with comparison to manually labeled data and computed false positive and false negative assignments of voxels for the three organs. We report high accuracy for the two comparison cases. These results demonstrate the efficiency and flexibility of this segmentation framework to perform the challenging task of automatically extracting brain tissue volume contours.

  11. Quantitative performance characterization of three-dimensional noncontact fluorescence molecular tomography

    NASA Astrophysics Data System (ADS)

    Favicchio, Rosy; Psycharakis, Stylianos; Schönig, Kai; Bartsch, Dusan; Mamalaki, Clio; Papamatheakis, Joseph; Ripoll, Jorge; Zacharakis, Giannis

    2016-02-01

    Fluorescent proteins and dyes are routine tools for biological research to describe the behavior of genes, proteins, and cells, as well as more complex physiological dynamics such as vessel permeability and pharmacokinetics. The use of these probes in whole body in vivo imaging would allow extending the range and scope of current biomedical applications and would be of great interest. In order to comply with a wide variety of application demands, in vivo imaging platform requirements span from wide spectral coverage to precise quantification capabilities. Fluorescence molecular tomography (FMT) detects and reconstructs in three dimensions the distribution of a fluorophore in vivo. Noncontact FMT allows fast scanning of an excitation source and noninvasive measurement of emitted fluorescent light using a virtual array detector operating in free space. Here, a rigorous process is defined that fully characterizes the performance of a custom-built horizontal noncontact FMT setup. Dynamic range, sensitivity, and quantitative accuracy across the visible spectrum were evaluated using fluorophores with emissions between 520 and 660 nm. These results demonstrate that high-performance quantitative three-dimensional visible light FMT allowed the detection of challenging mesenteric lymph nodes in vivo and the comparison of spectrally distinct fluorescent reporters in cell culture.

  12. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder

    PubMed Central

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-01

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7–11 years (27 males, six females) and twenty-five adult participants aged 21–29 years (19 males, six females) participated in our experiments. Our results suggested that the pronation and supination function in children with ADHD has a tendency to lag behind that of typically developing children by several years. From these results, our system has the potential to objectively evaluate the neurodevelopmental delay of children with ADHD. PMID:26797613

  13. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder.

    PubMed

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-18

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7-11 years (27 males, six females) and twenty-five adult participants aged 21-29 years (19 males, six females) participated in our experiments. Our results suggested that the pronation and supination function in children with ADHD has a tendency to lag behind that of typically developing children by several years. From these results, our system has the potential to objectively evaluate the neurodevelopmental delay of children with ADHD.

  15. On the quantitative analysis and evaluation of magnetic hysteresis data

    NASA Astrophysics Data System (ADS)

    Jackson, Mike; Solheid, Peter

    2010-04-01

    Magnetic hysteresis data are centrally important in pure and applied rock magnetism, but to date, no objective quantitative methods have been developed for assessment of data quality and of the uncertainty in parameters calculated from imperfect data. We propose several initial steps toward such assessment, using loop symmetry as an important key. With a few notable exceptions (e.g., related to field cooling and exchange bias), magnetic hysteresis loops possess a high degree of inversion symmetry (M(H) = -M(-H)). This property enables us to treat the upper and lower half-loops as replicate measurements for quantification of random noise, drift, and offsets. This, in turn, makes it possible to evaluate the statistical significance of nonlinearity, either in the high-field region (due to nonsaturation of the ferromagnetic moment) or over the complete range of applied fields (due to nonnegligible contribution of ferromagnetic phases to the total magnetic signal). It also allows us to quantify the significance of fitting errors for model loops constructed from analytical basis functions. When a statistically significant high-field nonlinearity is found, magnetic parameters must be calculated by approach-to-saturation fitting, e.g., by a model of the form M(H) = Ms + χHF·H + αH^β. This nonlinear high-field inverse modeling problem is strongly ill-conditioned, resulting in large and strongly covariant uncertainties in the fitted parameters, which we characterize through bootstrap analyses. For a variety of materials, including ferrihydrite and mid-ocean ridge basalts, measured in applied fields up to about 1.5 T, we find that the calculated value of the exponent β is extremely sensitive to small differences in the data or in the method of processing and that the overall uncertainty exceeds the range of physically reasonable values. The "unknowability" of β is accompanied by relatively large uncertainties in the other parameters, which can be characterized, if not
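
    The use of loop inversion symmetry to treat the two half-loops as replicate measurements can be sketched as follows, assuming the upper and lower branches are sampled at mirrored field values and matched by index.

```python
def symmetry_residuals(m_upper, m_lower):
    """Residuals from the inversion symmetry M(H) = -M(-H).

    m_upper[i] is the upper-branch moment at +H_i and m_lower[i] the
    lower-branch moment at -H_i. For a perfectly symmetric, noise-free
    loop every residual is zero, so the spread of the residuals
    estimates random noise, drift, and offsets.
    """
    return [(mu + ml) / 2 for mu, ml in zip(m_upper, m_lower)]

def rms(xs):
    """Root-mean-square of a list of residuals."""
    return (sum(x * x for x in xs) / len(xs)) ** 0.5
```

    A vertical offset of the whole loop shows up as a constant residual, while random noise shows up as scatter about zero; separating the two is the kind of diagnostic the abstract proposes.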

  16. A quantitative method for visual phantom image quality evaluation

    NASA Astrophysics Data System (ADS)

    Chakraborty, Dev P.; Liu, Xiong; O'Shea, Michael; Toto, Lawrence C.

    2000-04-01

    This work presents an image quality evaluation technique for uniform-background target-object phantom images. The Degradation-Comparison-Threshold (DCT) method involves degrading the image quality of a target-containing region with a blocking process and comparing the resulting image to a similarly degraded target-free region. The threshold degradation needed for 92% correct detection of the target region is the image quality measure of the target. Images of the American College of Radiology (ACR) mammography accreditation program phantom were acquired under varying x-ray conditions on a digital mammography machine. Five observers performed ACR and DCT evaluations of the images. A figure-of-merit (FOM) of an evaluation method was defined which takes into account measurement noise and the change of the measure as a function of x-ray exposure to the phantom. The FOM of the DCT method was 4.1 times that of the ACR method for the specks, 2.7 times better for the fibers and 1.4 times better for the masses. For the specks, inter-reader correlations on the same image set increased significantly from 87% for the ACR method to 97% for the DCT method. The viewing time per target for the DCT method was 3 - 5 minutes. The observed greater sensitivity of the DCT method could lead to more precise Quality Control (QC) testing of digital images, which should improve the sensitivity of the QC process to genuine image quality variations. Another benefit of the method is that it can measure the image quality of high detectability target objects, which is impractical by existing methods.
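
    A figure-of-merit of the kind described, sensitivity of the quality measure to x-ray exposure divided by measurement noise, can be sketched as follows. The concrete definition used here (least-squares slope over RMS residual about the fitted line) is an illustrative assumption, not necessarily the paper's exact FOM.

```python
def figure_of_merit(measures, exposures):
    """Slope of measure vs. exposure divided by the noise about that trend.

    A larger value means the evaluation method responds strongly to real
    exposure changes relative to its own measurement noise.
    """
    n = len(measures)
    mean_x = sum(exposures) / n
    mean_y = sum(measures) / n
    sxx = sum((x - mean_x) ** 2 for x in exposures)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(exposures, measures))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    residuals = [y - (slope * x + intercept)
                 for x, y in zip(exposures, measures)]
    noise = (sum(r * r for r in residuals) / n) ** 0.5
    return abs(slope) / noise if noise else float('inf')
```

    Comparing this quantity for the ACR and DCT readings of the same image set is the kind of head-to-head comparison the reported 4.1x, 2.7x, and 1.4x factors summarize.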

  17. Toward More Performance Evaluation in Chemistry

    NASA Astrophysics Data System (ADS)

    Rasp, Sharon L.

    1998-01-01

    The history of the author's experiences in testing and changes in evaluation philosophy are chronicled. Tests in her classroom have moved from solely paper-and-pencil, multiple-choice/objective formats to also include lab performance evaluations. Examples of performance evaluations in both a traditional chemistry course and a consumer-level chemistry course are given. Analysis of students' test results indicates the need to continue to include a variety of methods in evaluating student performance in science.

  18. INTEGRATED WATER TREATMENT SYSTEM PERFORMANCE EVALUATION

    SciTech Connect

    SEXTON RA; MEEUWSEN WE

    2009-03-12

    This document describes the results of an evaluation of the current Integrated Water Treatment System (IWTS) operation against design performance and a determination of short term and long term actions recommended to sustain IWTS performance.

  19. A new performance evaluation tool

    SciTech Connect

    Kindl, F.H.

    1996-12-31

    The paper describes a Steam Cycle Diagnostic Program (SCDP) that has been specifically designed to respond to the increasing need of electric power generators for periodic performance monitoring and quick identification of the causes of any observed increase in fuel consumption. There is a description of program objectives, modeling and test data inputs, results, underlying program logic, validation of program accuracy by comparison with acceptance-test-quality data, and examples of program usage.

  20. Evaluation of static and dynamic perfusion cardiac computed tomography for quantitation and classification tasks.

    PubMed

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R; La Riviere, Patrick J; Alessio, Adam M

    2016-04-01

    Cardiac computed tomography (CT) acquisitions for perfusion assessment can be performed in a dynamic or static mode. Either method may be used for a variety of clinical tasks, including (1) stratifying patients into categories of ischemia and (2) using a quantitative myocardial blood flow (MBF) estimate to evaluate disease severity. In this simulation study, we compare method performance on these classification and quantification tasks for matched radiation dose levels and for different flow states, patient sizes, and injected contrast levels. Under conditions simulated, the dynamic method has low bias in MBF estimates (0 to [Formula: see text]) compared to linearly interpreted static assessment (0.45 to [Formula: see text]), making it more suitable for quantitative estimation. At matched radiation dose levels, receiver operating characteristic analysis demonstrated that the static method, with its high bias but generally lower variance, had superior performance ([Formula: see text]) in stratifying patients, especially for larger patients and lower contrast doses (area under the curve, [Formula: see text] to 0.96, versus 0.86). We also demonstrate that static assessment with a correctly tuned exponential relationship between the apparent CT number and MBF has superior quantification performance to static assessment with a linear relationship and to dynamic assessment. However, tuning the exponential relationship to the patient and scan characteristics will likely prove challenging. This study demonstrates that the selection and optimization of static or dynamic acquisition modes should depend on the specific clinical task.
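    The abstract's central tradeoff, in which a biased but low-variance estimator stratifies patients better than an unbiased but noisy one, can be reproduced in a few lines. All flow values, bias fractions, and noise levels below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def auc(scores_a, scores_b):
    # probability that a draw from scores_a exceeds one from scores_b (Mann-Whitney)
    s = 0.0
    for a in scores_a:
        s += np.sum(a > scores_b) + 0.5 * np.sum(a == scores_b)
    return s / (len(scores_a) * len(scores_b))

# True myocardial blood flow (illustrative units): normal vs. ischemic groups
normal = rng.normal(3.0, 0.3, 500)
ischemic = rng.normal(1.5, 0.3, 500)

def estimate(true_mbf, bias_frac, noise_sd):
    # estimator with multiplicative bias and additive noise
    return true_mbf * (1.0 - bias_frac) + rng.normal(0.0, noise_sd, true_mbf.size)

# dynamic-like: low bias, high variance; static-like: high bias, low variance
auc_dyn = auc(estimate(normal, 0.0, 0.8), estimate(ischemic, 0.0, 0.8))
auc_sta = auc(estimate(normal, 0.3, 0.3), estimate(ischemic, 0.3, 0.3))
print(f"AUC dynamic: {auc_dyn:.3f}, AUC static: {auc_sta:.3f}")
```

    The low-bias, high-variance estimator is the better choice for quantifying MBF, yet the biased, low-variance estimator yields the higher AUC for classification, mirroring the reported result.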

  1. Quantitative, Notional, and Comprehensive Evaluations of Spontaneous Engaged Speech

    ERIC Educational Resources Information Center

    Molholt, Garry; Cabrera, Maria Jose; Kumar, V. K.; Thompsen, Philip

    2011-01-01

    This study provides specific evidence regarding the extent to which quantitative measures, common sense notional measures, and comprehensive measures adequately characterize spontaneous, although engaged, speech. As such, the study contributes to the growing body of literature describing the current limits of automatic systems for evaluating…

  2. S-191 sensor performance evaluation

    NASA Technical Reports Server (NTRS)

    Hughes, C. L.

    1975-01-01

    A final analysis was performed on the Skylab S-191 spectrometer data received from missions SL-2, SL-3, and SL-4. The repeatability and accuracy of the S-191 spectroradiometric internal calibration was determined by correlation to the output obtained from well-defined external targets. These included targets on the moon and earth as well as deep space. In addition, the accuracy of the S-191 short wavelength autocalibration was flight checked by correlating the Earth Resources Experiment Package (EREP) S-191 outputs and the Backup Unit S-191 outputs after viewing selected targets on the moon.

  3. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  4. Evaluation of chemotherapy response in ovarian cancer treatment using quantitative CT image biomarkers: a preliminary study

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Tan, Maxine; McMeekin, Scott; Thai, Theresa; Moore, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2015-03-01

    The purpose of this study is to identify and apply quantitative image biomarkers for early prediction of tumor response to chemotherapy among ovarian cancer patients participating in clinical trials of new drugs. In the experiment, we retrospectively selected 30 cases from the patients who participated in Phase I clinical trials of new drugs or drug agents for ovarian cancer treatment. Each case is composed of two sets of CT images acquired pre- and post-treatment (4-6 weeks after starting treatment). A computer-aided detection (CAD) scheme was developed to extract and analyze the quantitative image features of the metastatic tumors previously tracked by the radiologists using the standard Response Evaluation Criteria in Solid Tumors (RECIST) guideline. The CAD scheme first segmented 3-D tumor volumes from the background using a hybrid tumor segmentation scheme. Then, for each segmented tumor, CAD computed three quantitative image features: the changes in tumor volume, tumor CT number (density), and density variance. The feature changes were calculated between the matched tumors tracked on the CT images acquired pre- and post-treatment. Finally, CAD predicted patients' 6-month progression-free survival (PFS) using a decision-tree-based classifier. The performance of the CAD scheme was compared with the RECIST category. The results show that the CAD scheme achieved a prediction accuracy of 76.7% (23/30 cases) with a Kappa coefficient of 0.493, which is significantly higher than the performance of RECIST prediction with a prediction accuracy and Kappa coefficient of 60% (17/30) and 0.062, respectively. This study demonstrated the feasibility of analyzing quantitative image features to improve the accuracy of early prediction of tumor response to new drugs or therapeutic methods for ovarian cancer patients.
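    The Kappa coefficient used above to compare the CAD and RECIST predictions is straightforward to compute; a minimal sketch (the ten labels below are hypothetical, not the study's data):

```python
def cohens_kappa(y_true, y_pred):
    # observed agreement minus chance agreement, normalized by (1 - chance agreement)
    labels = sorted(set(y_true) | set(y_pred))
    n = len(y_true)
    p_obs = sum(t == p for t, p in zip(y_true, y_pred)) / n
    p_chance = sum(
        (sum(t == c for t in y_true) / n) * (sum(p == c for p in y_pred) / n)
        for c in labels
    )
    return (p_obs - p_chance) / (1 - p_chance)

# Illustrative 10-case example: 1 = responder, 0 = non-responder
truth = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
pred  = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
print(f"kappa = {cohens_kappa(truth, pred):.2f}")  # 0.60 for 80% raw agreement
```

    Kappa discounts the agreement expected by chance, which is why a 76.7% raw accuracy can correspond to a kappa of only about 0.5.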

  5. Evaluating Economic Performance and Policies: A Comment.

    ERIC Educational Resources Information Center

    Schur, Leon M.

    1987-01-01

    Offers a critique of Thurow's paper on the evaluation of economic performance (see SO516719). Concludes that the alternative offered by Thurow is inadequate, and states that the standards developed by the "Framework" are adequate for evaluating economic performance and policies. (JDH)

  6. Technology Efficacy in Active Prosthetic Knees for Transfemoral Amputees: A Quantitative Evaluation

    PubMed Central

    El-Sayed, Amr M.; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prostheses. Other studies have examined amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights, that is, a technical examination of the components used, with an evaluation of how these improved the gait of respective users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prostheses with respect to functional walking performance amongst above-knee amputee users, to evaluate each system's efficacy in producing close-to-normal user performance. The performance of the actuator, sensory system, and control technique incorporated in each reported system was evaluated separately, and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, the conclusions are of limited generalizability due to the small number of studies. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent of each component's contribution to the most functional design. PMID:25110727

  7. Steroidomic Footprinting Based on Ultra-High Performance Liquid Chromatography Coupled with Qualitative and Quantitative High-Resolution Mass Spectrometry for the Evaluation of Endocrine Disrupting Chemicals in H295R Cells.

    PubMed

    Tonoli, David; Fürstenberger, Cornelia; Boccard, Julien; Hochstrasser, Denis; Jeanneret, Fabienne; Odermatt, Alex; Rudaz, Serge

    2015-05-18

    The screening of endocrine disrupting chemicals (EDCs) that may alter steroidogenesis represents a highly important field mainly due to the numerous pathologies, such as cancer, diabetes, obesity, osteoporosis, and infertility that have been related to impaired steroid-mediated regulation. The adrenal H295R cell model has been validated to study steroidogenesis by the Organization for Economic Co-operation and Development (OECD) guideline. However, this guideline focuses solely on testosterone and estradiol monitoring, hormones not typically produced by the adrenals, hence limiting possible in-depth mechanistic investigations. The present work proposes an untargeted steroidomic footprinting workflow based on ultra-high pressure liquid chromatography (UHPLC) coupled to high-resolution MS for the screening and mechanistic investigations of EDCs in H295R cell supernatants. A suspected EDC, triclocarban (TCC), used in detergents, cosmetics, and personal care products, was selected to demonstrate the efficiency of the reported methodology, allowing the simultaneous assessment of a steroidomic footprint and quantification of a selected subset of steroids in a single analysis. The effects of exposure to increasing TCC concentrations were assessed, and the selection of features with database matching followed by multivariate analysis has led to the selection of the most salient affected steroids. Using correlation analysis, 11 steroids were associated with a high, 18 with a medium, and 8 with a relatively low sensitivity behavior to TCC. Among the candidates, 13 identified steroids were simultaneously quantified, leading to the evaluation and localization of the disruption of steroidogenesis caused by TCC upstream of the formation of pregnenolone. The remaining candidates could be associated with a specific steroid class (progestogens and corticosteroids, or androgens) and represent a specific footprint of steroidogenesis disruption by TCC. 
This strategy was devised to be

  8. Objective and quantitative evaluation of motor function in a monkey model of Parkinson's disease.

    PubMed

    Saiki, Hidemoto; Hayashi, Takuya; Takahashi, Ryosuke; Takahashi, Jun

    2010-07-15

    Monkeys treated with 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) are currently the best animal model for Parkinson's disease (PD) and have been widely used for physiological and pharmacological investigations. However, objective and quantitative assessments have not been established for grading their motor behaviors. In order to develop a method for an unbiased evaluation, we performed a video-based assessment, used qualitative rating scales, and carried out an in vivo investigation of dopamine (DA) transporter binding in systemically MPTP-treated monkeys. The video-based analysis of spontaneous movement clearly demonstrated a significant correlation with the qualitative rating score. The assessment of DA transporter (DAT) function by [(11)C]-CFT-PET showed that, when compared with normal animals, the MPTP-treated animals exhibited decreased CFT binding in the bilateral striatum, particularly in the dorsal part of the putamen and caudate. Among the MPTP-treated monkeys, an unbiased PET analysis revealed a significant correlation between CFT binding in the midbrain and qualitative rating scores or the amount of spontaneous movement. These results indicate that a video-based analysis can be a reliable tool for an objective and quantitative evaluation of motor dysfunction in MPTP-treated monkeys, and furthermore, that DAT function in the midbrain may also be important for the evaluation.

  9. Theory and Practice on Teacher Performance Evaluation

    ERIC Educational Resources Information Center

    Yonghong, Cai; Chongde, Lin

    2006-01-01

    Teacher performance evaluation plays a key role in educational personnel reform, so it has been an important yet difficult issue in educational reform. Previous evaluations on teachers failed to make strict distinction among the three dominant types of evaluation, namely, capability, achievement, and effectiveness. Moreover, teacher performance…

  10. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)

    SciTech Connect

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed, including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  11. Performance of two real-time PCR assays for hepatitis B virus DNA detection and quantitation.

    PubMed

    Kania, Dramane; Ottomani, Laure; Meda, Nicolas; Peries, Marianne; Dujols, Pierre; Bolloré, Karine; Rénier, Wendy; Viljoen, Johannes; Ducos, Jacques; Van de Perre, Philippe; Tuaillon, Edouard

    2014-06-01

    In-house developed real-time PCR (qPCR) techniques could be useful adjuncts to the management of hepatitis B virus (HBV) infection in resource-limited settings with high prevalence. Two qPCR assays (qPCR1 and qPCR2), based on primers/probes targeting conserved regions of the X and S genes of HBV respectively, were evaluated using clinical samples of varying HBV genotypes, and compared to the commercial Roche Cobas AmpliPrep/Cobas TaqMan HBV Test v2.0. The lower detection limit (LDL) was established at 104 IU/ml for qPCR1, and 91 IU/ml for qPCR2. Good agreement and correlation were obtained between the Roche assay and both qPCR assays (r = 0.834 for qPCR1 and r = 0.870 for qPCR2). Differences in HBV DNA load of > 0.5 Log10 IU/ml between the Roche and the qPCR assays were found for 49/122 samples with qPCR1, and 35/122 samples with qPCR2. qPCR1 tended to underestimate HBV DNA quantity in samples with a low viral load and overestimate HBV DNA concentration in samples with a high viral load when compared to the Roche test. Both molecular tools that were developed, used on an open real-time PCR system, were reliable for HBV DNA detection and quantitation. The qPCR2 assay performed better than qPCR1 and had the additional advantage of detecting and quantitating various HBV genotypes. This low-cost quantitative HBV DNA PCR assay may be an alternative solution when implementing national programmes to diagnose, monitor and treat HBV infection in low- to middle-income countries where testing for HBV DNA is not available in governmental health programmes.
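    The agreement criterion used above, flagging samples whose results differ by more than 0.5 Log10 IU/ml between two assays, can be sketched as follows (the viral-load values below are made up):

```python
import numpy as np

def discordance_rate(ref_log10, test_log10, threshold=0.5):
    # fraction of samples whose viral loads differ by more than `threshold`
    # Log10 IU/ml between a reference assay and a test assay
    diff = np.asarray(test_log10) - np.asarray(ref_log10)
    return float(np.mean(np.abs(diff) > threshold))

# Illustrative paired results (Log10 IU/ml): reference assay vs. in-house qPCR
ref  = [2.1, 3.5, 4.8, 6.2, 7.0]
test = [2.0, 4.2, 4.6, 5.5, 7.1]
print(discordance_rate(ref, test))  # 2 of 5 pairs exceed 0.5 Log10 -> 0.4
```

    Applied to the study's counts, 49/122 and 35/122 correspond to discordance rates of roughly 40% and 29% for qPCR1 and qPCR2, respectively.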

  12. Quantitative measurement of porphyrins in biological tissues and evaluation of tissue porphyrins during toxicant exposures.

    PubMed

    Woods, J S; Miller, H D

    1993-10-01

    Porphyrins are formed in most eukaryotic tissues as intermediates in the biosynthesis of heme. Assessment of changes in tissue porphyrin levels occurring in response to the actions of various drugs or toxicants is potentially useful in the evaluation of chemical exposures and effects. The present paper describes a rapid and sensitive method for the extraction and quantitation of porphyrins in biological tissues which overcomes difficulties encountered in previously described methods, particularly the loss of porphyrins during extraction and interference of porphyrin quantitation by coeluting fluorescent tissue constituents. In this procedure 8- through 2-carboxyl porphyrins are quantitatively extracted from tissue homogenates using HCl and methanol and are subsequently separated from potentially interfering contaminants by sequential methanol/phosphate elution on a C-18 preparatory column. Porphyrins are then separated and measured by reversed-phase high-performance liquid chromatography and spectrofluorometric techniques. Recovery of tissue porphyrins using this method is close to 100% with an intraassay variability of less than 10%. We have employed this procedure to measure liver and kidney porphyrin concentrations in male Fischer rats and to define the distinctive changes in tissue porphyrin patterns associated with treatment with the hepatic and renal porphyrinogenic chemicals, allylisopropylacetamide, and methyl mercury hydroxide, respectively. This method is applicable to the measurement of tissue porphyrin changes resulting from drug or toxicant exposures in clinical, experimental or environmental assessments.

  13. Evaluating survival model performance: a graphical approach.

    PubMed

    Mandel, M; Galai, N; Simchen, E

    2005-06-30

    In the last decade, many statistics have been suggested to evaluate the performance of survival models. These statistics evaluate the overall performance of a model ignoring possible variability in performance over time. Using an extension of measures used in binary regression, we propose a graphical method to depict the performance of a survival model over time. The method provides estimates of performance at specific time points and can be used as an informal test for detecting time varying effects of covariates in the Cox model framework. The method is illustrated on real and simulated data using Cox proportional hazard model and rank statistics.

  14. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
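    The reported summaries all follow from a standard 2x2 confusion matrix. A sketch with illustrative counts (chosen to be consistent with the study's 241/460 split, but not the study's exact table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    # standard confusion-matrix summaries used in the abstract
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Illustrative counts only: 241 painful TMJs (tp + fn), 219 non-painful (tn + fp)
m = diagnostic_metrics(tp=151, fp=89, fn=90, tn=130)
print({k: round(v, 3) for k, v in m.items()})
```

    With these assumed counts, sensitivity and specificity come out near the reported 62.8% and 59.6%, illustrating how each figure is derived.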

  15. 35-mm film scanner as an intraoral dental radiograph digitizer. I: A quantitative evaluation.

    PubMed

    Shrout, M K; Potter, B J; Yurgalavage, H M; Hildebolt, C F; Vannier, M W

    1993-10-01

    A 35-mm slide scanner digital imaging system was tested for its suitability in digitizing intraoral dental radiographic film for quantitative studies. The system (Nikon model LS-3510AF Nikon Electronic Imaging, Nikon, Inc., Melville, N.Y.) uses a charge-coupled device linear photodiode array. The data content in the original film images was evaluated, and the system performance assessed objectively with the use of specially designed test films. Radiometric and geometric performances for the digitizing system were extracted from measurements and observations, and these were compared with published data for two other film digitizing systems (video camera DAGE MTI, Michigan City, Ind. and Barneyscan 35-mm film digitizer Barneyscan, Berkeley, Calif.). The techniques used to evaluate this system are easy and suitable for evaluation of any digitizing system. This scanner system (Nikon) was superior to previously evaluated systems in transforming and recording radiographic film densities across the range (0.3 to 2.0 optical density units) of clinically relevant optical densities. The scanner offers substantial advantage over the other digitizing systems for gray scale information from clinically important optical densities. PMID:8233432

  16. Quantitative performance of a quadrupole-orbitrap-MS in targeted LC-MS determinations of small molecules.

    PubMed

    Grund, Baptiste; Marvin, Laure; Rochat, Bertrand

    2016-05-30

    High-resolution mass spectrometry (HRMS) has been associated with qualitative and research analysis and QQQ-MS with quantitative and routine analysis. This view is now challenged and for this reason, we have evaluated the quantitative LC-MS performance of a new high-resolution mass spectrometer (HRMS), a Q-orbitrap-MS, and compared the results obtained with a recent triple-quadrupole MS (QQQ-MS). High-resolution full-scan (HR-FS) and MS/MS acquisitions have been tested with real plasma extracts or pure standards. Limits of detection, dynamic range, mass accuracy and false positive or false negative detections have been determined or investigated with protease inhibitors, tyrosine kinase inhibitors, steroids and metanephrines. Our quantitative results show that today's available HRMS are reliable and sensitive quantitative instruments and comparable to QQQ-MS quantitative performance. Taking into account their versatility, user-friendliness and robustness, we believe that HRMS should be seen more and more as key instruments in quantitative LC-MS analyses. In this scenario, most targeted LC-HRMS analyses should be performed by HR-FS recording virtually "all" ions. In addition to absolute quantifications, HR-FS will allow the relative quantifications of hundreds of metabolites in plasma revealing individual's metabolome and exposome. This phenotyping of known metabolites should promote HRMS in clinical environment. A few other LC-HRMS analyses should be performed in single-ion-monitoring or MS/MS mode when increased sensitivity and/or detection selectivity will be necessary.

  17. A new approach to quantitative NMR: fluoroquinolones analysis by evaluating the chemical shift displacements.

    PubMed

    Michaleas, S; Antoniadou-Vyza, E

    2006-10-11

    Quantitative NMR spectroscopy is always an attractive goal, as the identity and quantity could be simultaneously determined. Although significant advancements have been achieved in this field, it is common that all reported quantitative NMR methods perform the analysis using the average integral intensities of selected signals. During the calculation of the area under NMR peaks, several response problems can occur that must be treated carefully to avoid inaccuracies. In the method proposed in this work, the quantitative information is obtained by measuring the chemical shift displacements of selected protons, which is a straightforward and highly reproducible process. The (1)H NMR spectra of multiple fluoroquinolone (FQ) solutions revealed that the chemical shifts of protons, especially the aromatic ones, were concentration dependent for all tested compounds, as a result of extensive self-association phenomena. In the present work a novel methodology is described for the quantitation of several FQs based on this dependence. The proposed method was applied to Ciprofloxacin solutions over a wide range of concentrations. Evaluation of the obtained data presented acceptable characteristics regarding accuracy, precision, and robustness. The applicability limitations of this method were found to be posed by current instrumentation, mainly by the magnetic field frequency, e.g., the slope of the response function achieved with a 400 MHz instrument was twice that achieved at 200 MHz. The pH effect was negligible from pD 2.5 to 5.5. The phenomenon appeared in a pattern that can be applied to a plethora of drug categories exhibiting self-association phenomena, in a range of concentration determined by the magnet strength of the instrument.
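    Quantitation from concentration-dependent chemical shift displacements amounts to building a calibration curve of shift versus concentration and inverting it. A hedged sketch with invented calibration data (a linear response is assumed over the range, though the abstract notes the slope depends on field strength):

```python
import numpy as np

# Hypothetical calibration: observed 1H shift (ppm) of an aromatic FQ proton
# versus concentration (mM); upfield drift with increasing self-association.
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
shift = np.array([8.652, 8.647, 8.633, 8.610, 8.565])

slope, intercept = np.polyfit(conc, shift, 1)

def concentration_from_shift(delta):
    # invert the linear calibration (linearity over this range is assumed)
    return (delta - intercept) / slope

print(f"estimated concentration: {concentration_from_shift(8.62):.1f} mM")
```

    A steeper calibration slope, as reported at higher field strength, directly improves the concentration resolution of the inversion.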

  18. Quantitative Guidance for Stove Usage and Performance to Achieve Health and Environmental Targets

    PubMed Central

    Chiang, Ranyee A.

    2015-01-01

    Background Displacing the use of polluting and inefficient cookstoves in developing countries is necessary to achieve the potential health and environmental benefits sought through clean cooking solutions. Yet little quantitative context has been provided on how much displacement of traditional technologies is needed to achieve targets for household air pollutant concentrations or fuel savings. Objectives This paper provides instructive guidance on the usage of cooking technologies required to achieve health and environmental improvements. Methods We evaluated different scenarios of displacement of traditional stoves with use of higher performing technologies. The air quality and fuel consumption impacts were estimated for these scenarios using a single-zone box model of indoor air quality and ratios of thermal efficiency. Results Stove performance and usage should be considered together, as lower performing stoves can result in similar or greater benefits than a higher performing stove if the lower performing stove has considerably higher displacement of the baseline stove. Based on the indoor air quality model, there are multiple performance–usage scenarios for achieving modest indoor air quality improvements. To meet World Health Organization guidance levels, however, three-stone fire and basic charcoal stove usage must be nearly eliminated to achieve the particulate matter target (< 1–3 hr/week), and substantially limited to meet the carbon monoxide guideline (< 7–9 hr/week). Conclusions Moderate health gains may be achieved with various performance–usage scenarios. The greatest benefits are estimated to be achieved by near-complete displacement of traditional stoves with clean technologies, emphasizing the need to shift in the long term to near exclusive use of clean fuels and stoves. The performance–usage scenarios are also provided as a tool to guide technology selection and prioritize behavior change opportunities to maximize impact.
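    The single-zone box model mentioned in the Methods can be sketched in its simplest steady-state form, C = G/(aV), combined with a duty-cycle weekly average. Every parameter value here is illustrative, not taken from the paper:

```python
def steady_state_pm(emission_mg_per_hr, air_exchange_per_hr, volume_m3):
    # well-mixed single-zone box at steady state: C = G / (a * V), in ug/m3
    return emission_mg_per_hr * 1000.0 / (air_exchange_per_hr * volume_m3)

def weekly_average(conc_during_use_ugm3, use_hr_per_week):
    # crude duty-cycle average over a 168-hour week (ignores decay tails)
    return conc_during_use_ugm3 * use_hr_per_week / 168.0

# Illustrative inputs: PM2.5 emission rate, kitchen air exchange rate, kitchen volume
c_use = steady_state_pm(emission_mg_per_hr=60.0, air_exchange_per_hr=15.0, volume_m3=30.0)
print(f"during use: {c_use:.0f} ug/m3; "
      f"weekly average at 21 hr/week: {weekly_average(c_use, 21):.1f} ug/m3")
```

    This is the mechanism behind the performance-usage scenarios: the same weekly-average target can be met by lowering the emission rate (performance) or the hours of use (usage).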

  19. Conductor gestures influence evaluations of ensemble performance

    PubMed Central

    Morrison, Steven J.; Price, Harry E.; Smedley, Eric M.; Meals, Cory D.

    2014-01-01

    Previous research has found that listener evaluations of ensemble performances vary depending on the expressivity of the conductor’s gestures, even when performances are otherwise identical. It was the purpose of the present study to test whether this effect of visual information was evident in the evaluation of specific aspects of ensemble performance: articulation and dynamics. We constructed a set of 32 music performances that combined auditory and visual information and were designed to feature a high degree of contrast along one of two target characteristics: articulation and dynamics. We paired each of four music excerpts recorded by a chamber ensemble in both a high- and low-contrast condition with video of four conductors demonstrating high- and low-contrast gesture specifically appropriate to either articulation or dynamics. Using one of two equivalent test forms, college music majors and non-majors (N = 285) viewed sixteen 30 s performances and evaluated the quality of the ensemble’s articulation, dynamics, technique, and tempo along with overall expressivity. Results showed significantly higher evaluations for performances featuring high rather than low conducting expressivity regardless of the ensemble’s performance quality. Evaluations for both articulation and dynamics were strongly and positively correlated with evaluations of overall ensemble expressivity. PMID:25104944

  20. Conductor gestures influence evaluations of ensemble performance.

    PubMed

    Morrison, Steven J; Price, Harry E; Smedley, Eric M; Meals, Cory D

    2014-01-01

    Previous research has found that listener evaluations of ensemble performances vary depending on the expressivity of the conductor's gestures, even when performances are otherwise identical. It was the purpose of the present study to test whether this effect of visual information was evident in the evaluation of specific aspects of ensemble performance: articulation and dynamics. We constructed a set of 32 music performances that combined auditory and visual information and were designed to feature a high degree of contrast along one of two target characteristics: articulation and dynamics. We paired each of four music excerpts recorded by a chamber ensemble in both a high- and low-contrast condition with video of four conductors demonstrating high- and low-contrast gesture specifically appropriate to either articulation or dynamics. Using one of two equivalent test forms, college music majors and non-majors (N = 285) viewed sixteen 30 s performances and evaluated the quality of the ensemble's articulation, dynamics, technique, and tempo along with overall expressivity. Results showed significantly higher evaluations for performances featuring high rather than low conducting expressivity regardless of the ensemble's performance quality. Evaluations for both articulation and dynamics were strongly and positively correlated with evaluations of overall ensemble expressivity. PMID:25104944

  1. Application of a quantitative carrier test to evaluate microbicides against mycobacteria.

    PubMed

    Springthorpe, V Susan; Sattar, Syed A

    2007-01-01

    Microbicides for reprocessing heat-sensitive medical devices, such as flexible endoscopes, must be mycobactericidal to reduce the risk of nosocomial infections. Suspension test methods currently used for efficacy evaluation lack the stringency required for assessing inactivation of mycobacteria on surfaces. The quantitative carrier test method reported here is based on mycobacteria-contaminated reference carrier disks of brushed stainless steel. Each disk was contaminated with 10 microL of a suspension of Mycobacterium terrae containing a soil load. Each disk with a dried inoculum was placed in a glass or Teflon vial, and then overlaid with 50 microL of the test formulation or 50 microL saline for the control carriers. Five test and 3 control disks were used in each run. At the end of the contact time, each vial received 9.95 mL neutralizer solution with 0.1% Tween-80 to stop the reaction and perform the initial microbicide dilution. The inoculum was eluted by mixing on a Vortex mixer for 60 s, and the eluates, together with the saline subsequently used to wash the vials and funnels, were membrane-filtered. Filters were placed on plates of Middlebrook 7H11 agar and incubated at 37 degrees C for at least 30 days before colonies were counted and log10 reductions in colony-forming units were calculated. Tests with a range of commercially available products with claims against mycobacteria, or believed to be broad-spectrum microbicides, showed that the method gave reproducible results. Products used included oxidizing agents (sodium hypochlorite and an iodophore), a phenolic, a quaternary ammonium compound, and ortho-phthalaldehyde. This method provides a much more realistic evaluation than the quantitative suspension test currently used for registration of mycobactericidal formulations and, when performed at different product concentrations, allows an assessment of any safety margin or risk in using the test formulation in the field.
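
The log10-reduction arithmetic described above can be sketched as follows; the CFU counts are illustrative values, not data from the study:

```python
import math

def log10_reduction(control_cfu, test_cfu):
    """Log10 reduction: log10 of mean control CFU minus log10 of mean test CFU."""
    mean_control = sum(control_cfu) / len(control_cfu)
    mean_test = sum(test_cfu) / len(test_cfu)
    return math.log10(mean_control) - math.log10(mean_test)

# 3 control and 5 test carriers, as in the protocol above (counts are made up)
controls = [2.1e6, 1.8e6, 2.4e6]
tests = [150, 90, 210, 120, 180]
lr = log10_reduction(controls, tests)
```

A formulation is typically judged against a required reduction (often 4 log10 or more for high-level claims); the sketch above would report roughly a 4.1 log10 reduction for these counts.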

  2. An evaluation of recent quantitative magnetospheric magnetic field models

    NASA Technical Reports Server (NTRS)

    Walker, R. J.

    1976-01-01

    Magnetospheric field models involving dipole tilt effects are discussed, with particular reference to defined magnetopause models and boundary surface models. The models are compared with observations, and with each other, whenever possible. It is shown that models containing only contributions from magnetopause and tail current systems reproduce the observed quiet-time field only qualitatively. The best quantitative agreement between models and observations occurs when currents distributed in the inner magnetosphere are added to the magnetopause and tail current systems. All of the models fall short in the region around the polar cusp. Obtaining physically reasonable gradients there should have high priority in the development of future models.

  3. Performance Evaluation of Undulator Radiation at CEBAF

    SciTech Connect

    Chuyu Liu, Geoffrey Krafft, Guimei Wang

    2010-05-01

    The performance of undulator radiation (UR) at CEBAF with a 3.5 m helical undulator is evaluated and compared with APS undulator-A radiation in terms of brilliance, peak brilliance, spectral flux, flux density and intensity distribution.

  4. ATAMM enhancement and multiprocessor performance evaluation

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.; Som, Sukhamoy; Obando, Rodrigo; Malekpour, Mahyar R.; Jones, Robert L., III; Mandala, Brij Mohan V.

    1991-01-01

    ATAMM (Algorithm To Architecture Mapping Model) enhancement and multiprocessor performance evaluation is discussed. The following topics are included: the ATAMM model; ATAMM enhancement; ADM (Advanced Development Model) implementation of ATAMM; and ATAMM support tools.

  5. Using quantitative interference phase microscopy for sperm acrosome evaluation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Balberg, Michal; Kalinowski, Ksawery; Levi, Mattan; Shaked, Natan T.

    2016-03-01

    We demonstrate quantitative assessment of sperm cell morphology, primarily acrosomal volume, using quantitative interference phase microscopy (IPM). Normally, the area of the acrosome is assessed using dyes that stain the acrosomal part of the cell. We imaged fixed individual sperm cells using IPM; the sample was then stained and the same cells were imaged using bright-field microscopy (BFM). We identified the acrosome in the stained BFM image and used it to define the corresponding area in the IPM image and to determine a quantitative threshold for evaluating the volume of the acrosome.

  6. A Quantitative Approach to Evaluating Training Curriculum Content Sampling Adequacy.

    ERIC Educational Resources Information Center

    Bownas, David A.; And Others

    1985-01-01

    Developed and illustrated a technique depicting the fit between training curriculum content and job performance requirements for three Coast Guard schools. Generated a listing of tasks which receive undue emphasis in training, tasks not being taught, and tasks instructors intend to train, but which course graduates are unable to perform.…

  7. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  8. Improvement of Automotive Part Supplier Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Kongmunee, Chalermkwan; Chutima, Parames

    2016-05-01

    This research investigates the problem of part supplier performance evaluation in a major Japanese automotive plant in Thailand. The plant's current evaluation scheme is based on the experience and personal opinions of the evaluators. As a result, many poorly performing suppliers are still rated as good suppliers and allowed to supply parts to the plant without any obligation to improve. To alleviate this problem, a brainstorming session among stakeholders and evaluators was formally conducted, yielding an appropriate set of evaluation criteria and sub-criteria. The analytic hierarchy process was then used to find suitable weights for each criterion and sub-criterion. The results show that the newly developed evaluation method is significantly better than the previous one at distinguishing good suppliers from poor ones.
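
The analytic-hierarchy-process weighting step mentioned above can be sketched as follows; the three criteria names and the pairwise comparison matrix are hypothetical, and the weights are the normalized principal eigenvector of that matrix:

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparisons for three supplier criteria
# (quality vs. delivery vs. cost); entry A[i, j] says how much more important
# criterion i is than criterion j.
A = np.array([
    [1.0,  3.0, 5.0],
    [1/3,  1.0, 3.0],
    [1/5,  1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                       # criterion weights, sum to 1

# Consistency check: CR below 0.1 is conventionally acceptable (RI = 0.58 for n = 3)
lam = eigvals.real[k]
ci = (lam - 3) / 2
cr = ci / 0.58
```

For this matrix the quality criterion dominates (weight about 0.64), and the consistency ratio is well under the 0.1 rule of thumb.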

  9. Evaluation of high-performance computing software

    SciTech Connect

    Browne, S.; Dongarra, J.; Rowan, T.

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and to HPC-Netlib, a high-performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations that are difficult to find elsewhere.

  10. Multicenter Evaluation of the Elecsys Hepatitis B Surface Antigen Quantitative Assay ▿

    PubMed Central

    Zacher, B. J.; Moriconi, F.; Bowden, S.; Hammond, R.; Louisirirotchanakul, S.; Phisalprapa, P.; Tanwandee, T.; Wursthorn, K.; Brunetto, M. R.; Wedemeyer, H.; Bonino, F.

    2011-01-01

    The Elecsys hepatitis B surface antigen (HBsAg) II quantitative assay is a new quantitative electrochemiluminescence immunoassay which uses onboard dilution and a simple algorithm to determine HBsAg levels expressed in international units (IU)/ml (standardized against the World Health Organization [WHO] Second International Standard). This study evaluated its performance using routine serum samples from a wide range of HBsAg carriers and patients with chronic hepatitis B (CHB). HBsAg levels were measured in serum samples collected independently by five centers in Europe, Australia, and Asia. Serial dilution analyses were performed to assess the recommended dilution algorithm and determine the assay range free of hook effect. Assay precision was also established. Following assessment of serial dilutions (1:100 to 1:1,000,000) of the 611 samples analyzed, 70.0% and 85.6% of samples tested with analyzers incorporating 1:100 (Elecsys 2010 and cobas e 411) and 1:400 (Modular Analytics E170) onboard dilution, respectively, fell within the linear range of the assay, providing a final result on the first test. No high-dose hook effect was seen up to the maximum HBsAg serum level tested (870,000 IU/ml) using the dilution algorithm. HBsAg levels were reliably determined across all hepatitis B virus (HBV) genotypes, phases of HBV infection, and stages of disease tested. Precision was high across all analyzers (% coefficient of variation [CV], 1.4 to 9.6; HBsAg concentrations, 0.1 to 37,300 IU/ml). The Elecsys HBsAg II quantitative assay accurately and reliably quantifies HBsAg in routine clinical samples. Onboard dilution minimizes retesting and reduces the potential for error. PMID:21880853

  11. Multicenter evaluation of the Elecsys hepatitis B surface antigen quantitative assay.

    PubMed

    Zacher, B J; Moriconi, F; Bowden, S; Hammond, R; Louisirirotchanakul, S; Phisalprapa, P; Tanwandee, T; Wursthorn, K; Brunetto, M R; Wedemeyer, H; Bonino, F

    2011-11-01

    The Elecsys hepatitis B surface antigen (HBsAg) II quantitative assay is a new quantitative electrochemiluminescence immunoassay which uses onboard dilution and a simple algorithm to determine HBsAg levels expressed in international units (IU)/ml (standardized against the World Health Organization [WHO] Second International Standard). This study evaluated its performance using routine serum samples from a wide range of HBsAg carriers and patients with chronic hepatitis B (CHB). HBsAg levels were measured in serum samples collected independently by five centers in Europe, Australia, and Asia. Serial dilution analyses were performed to assess the recommended dilution algorithm and determine the assay range free of hook effect. Assay precision was also established. Following assessment of serial dilutions (1:100 to 1:1,000,000) of the 611 samples analyzed, 70.0% and 85.6% of samples tested with analyzers incorporating 1:100 (Elecsys 2010 and cobas e 411) and 1:400 (Modular Analytics E170) onboard dilution, respectively, fell within the linear range of the assay, providing a final result on the first test. No high-dose hook effect was seen up to the maximum HBsAg serum level tested (870,000 IU/ml) using the dilution algorithm. HBsAg levels were reliably determined across all hepatitis B virus (HBV) genotypes, phases of HBV infection, and stages of disease tested. Precision was high across all analyzers (% coefficient of variation [CV], 1.4 to 9.6; HBsAg concentrations, 0.1 to 37,300 IU/ml). The Elecsys HBsAg II quantitative assay accurately and reliably quantifies HBsAg in routine clinical samples. Onboard dilution minimizes retesting and reduces the potential for error.
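
The retest logic behind an onboard-dilution workflow like the one described can be sketched roughly as follows; the linear-range limits, the function name, and the simulated-reading arithmetic are illustrative assumptions, not the assay's actual specifications:

```python
def quantify_hbsag(sample_iu_ml, onboard_dilution, linear_lo=0.05, linear_hi=250.0):
    """Sketch of an onboard-dilution quantification step (illustrative limits).

    The analyzer measures the diluted sample; if the diluted reading falls
    within the assay's linear range, the result is the reading multiplied back
    by the dilution factor, otherwise the sample is flagged for retest at a
    different dilution.
    """
    diluted_reading = sample_iu_ml / onboard_dilution  # simulated instrument reading
    if diluted_reading > linear_hi:
        return None  # above linear range: retest at a higher dilution
    if diluted_reading < linear_lo:
        return None  # below linear range: retest at a lower dilution (or neat)
    return diluted_reading * onboard_dilution

result = quantify_hbsag(10000, onboard_dilution=100)      # falls in range first time
flagged = quantify_hbsag(870000, onboard_dilution=100)    # needs a higher dilution
```

This mirrors the study's observation that a larger onboard dilution (1:400 vs. 1:100) lets more samples fall within the linear range on the first test.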

  12. Performance-Based Evaluation and School Librarians

    ERIC Educational Resources Information Center

    Church, Audrey P.

    2015-01-01

    Evaluation of instructional personnel is standard procedure in our Pre-K-12 public schools, and its purpose is to document educator effectiveness. With Race to the Top and No Child Left Behind waivers, states are required to implement performance-based evaluations that demonstrate student academic progress. This three-year study describes the…

  13. Building Leadership Talent through Performance Evaluation

    ERIC Educational Resources Information Center

    Clifford, Matthew

    2015-01-01

    Most states and districts scramble to provide professional development to support principals, but "principal evaluation" is often lost amid competing priorities. Evaluation is an important method for supporting principal growth, communicating performance expectations to principals, and improving leadership practice. It provides leaders…

  14. Reference Service Standards, Performance Criteria, and Evaluation.

    ERIC Educational Resources Information Center

    Schwartz, Diane G.; Eakin, Dottie

    1986-01-01

    Describes process by which reference service standards were developed at a university medical library and their impact on the evaluation of work of librarians. Highlights include establishment of preliminary criteria, literature review, reference service standards, performance evaluation, peer review, and staff development. Checklist of reference…

  15. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  16. Evaluating Economic Performance and Policies: A Comment.

    ERIC Educational Resources Information Center

    Walstad, William B.

    1987-01-01

    Critiques Thurow's paper on the evaluation of economic performance (see SO516719). Concludes that the Joint Council's "Framework" offers a solid foundation for teaching about economic performance if the Joint Council can persuade high school economics teachers to use it. (JDH)

  17. Quantitative pharmaco-EEG and performance after administration of brotizolam to healthy volunteers

    PubMed Central

    Saletu, B.; Grünberger, J.; Linzmayer, L.

    1983-01-01

    1 The activity of brotizolam (0.1, 0.3 and 0.5 mg) was studied in normal subjects using quantitative pharmaco-EEG, psychometric and clinical evaluation. 2 Power spectral density analysis showed no changes after placebo, while brotizolam increased beta-activity, decreased alpha-activity and increased the average frequency (anxiolytic pharmaco-EEG profile). In addition, 0.3 and 0.5 mg brotizolam augmented delta-activity indicating hypnotic activity. 3 The highest dose (0.5 mg) of brotizolam decreased attention, concentration, psychomotor performance and affectivity, and increased reaction time. The lower doses of brotizolam also caused a decrease in attention and concentration, but tended to improve psychomotor performance, shorten reaction time, and did not influence mood or affectivity. 4 Brotizolam (0.1 mg) is the minimal effective psychoactive dose with a tranquillizing effect, while 0.5 mg and to some extent 0.3 mg induce a sedative effect and may be regarded as hypnotic doses. PMID:6661379
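
Power spectral density analysis of the kind used in quantitative pharmaco-EEG can be illustrated with a minimal band-power computation; a plain periodogram is used here for brevity (Welch averaging would be more typical), and the test signal is synthetic:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Total periodogram power in the frequency band [lo, hi) Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1/fs)
    psd = np.abs(np.fft.rfft(signal))**2 / (fs * len(signal))
    mask = (freqs >= lo) & (freqs < hi)
    return float(psd[mask].sum())

fs = 256                               # Hz, a common EEG sampling rate
t = np.arange(0, 4, 1/fs)              # 4 s epoch
eeg = np.sin(2 * np.pi * 10 * t)       # synthetic pure 10 Hz "alpha" signal

alpha = band_power(eeg, fs, 8, 13)     # alpha band
beta = band_power(eeg, fs, 13, 30)     # beta band
```

Drug effects such as the beta increase and alpha decrease reported above would show up as shifts in the relative power of these bands across epochs.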

  18. Digital holographic microscopy for quantitative cell dynamic evaluation during laser microsurgery

    PubMed Central

    Yu, Lingfeng; Mohanty, Samarendra; Zhang, Jun; Genc, Suzanne; Kim, Myung K.; Berns, Michael W.; Chen, Zhongping

    2010-01-01

    Digital holographic microscopy allows determination of dynamic changes in the optical thickness profile of a transparent object with subwavelength accuracy. Here, we report a quantitative phase laser microsurgery system for evaluation of cellular/ sub-cellular dynamic changes during laser micro-dissection. The proposed method takes advantage of the precise optical manipulation by the laser microbeam and quantitative phase imaging by digital holographic microscopy with high spatial and temporal resolution. This system will permit quantitative evaluation of the damage and/or the repair of the cell or cell organelles in real time. PMID:19582118

  19. A lighting metric for quantitative evaluation of accent lighting systems

    NASA Astrophysics Data System (ADS)

    Acholo, Cyril O.; Connor, Kenneth A.; Radke, Richard J.

    2014-09-01

    Accent lighting is critical for artwork and sculpture lighting in museums and for subject lighting on stage and in film and television. The research problem of designing effective lighting in such settings has been revived recently with the rise of light-emitting-diode-based solid-state lighting. In this work, we propose an easy-to-apply quantitative measure of a scene's visual quality as perceived by human viewers. We consider a well-accent-lit scene as one that maximizes the information about the scene (in an information-theoretic sense) available to the viewer. We propose a metric based on the entropy of the distribution of colors extracted from an image of the scene from the viewer's perspective. We demonstrate that optimizing the metric as a function of illumination configuration (i.e., position, orientation, and spectral composition) results in natural, pleasing accent lighting. We use a photorealistic simulation tool to validate the proposed approach, showing its successful application to two- and three-dimensional scenes.
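
The entropy-of-color-distribution idea can be sketched as follows, assuming an 8-bit RGB image and a coarse joint color histogram; the bin count is an illustrative choice, not the paper's:

```python
import numpy as np

def color_entropy(rgb_image, bins=8):
    """Shannon entropy (bits) of the joint RGB color histogram of a scene image."""
    pixels = rgb_image.reshape(-1, 3)
    hist, _ = np.histogramdd(pixels, bins=(bins,) * 3, range=((0, 256),) * 3)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]                       # 0 log 0 is taken as 0
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
varied = rng.integers(0, 256, size=(64, 64, 3))   # many visible colors
flat = np.full((64, 64, 3), 128)                   # a single flat color
```

A richly lit scene revealing many distinct colors scores high; a flat, uniformly washed-out scene scores zero, which is the sense in which the metric rewards informative lighting.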

  20. Use of the Behaviorally Anchored Rating Scale in Evaluating Teacher Performance.

    ERIC Educational Resources Information Center

    Beebe, Robert J.

    Behaviorally anchored rating scales (BARS), a new quantitative method of employee performance evaluation, is advocated for teacher evaluation. Development of a BARS consists generally of five steps: a representative sample of potential raters generates the scales; the group identifies the broad qualities to be evaluated; the group formulates…

  1. Evaluation of a quantitative fit testing method for N95 filtering facepiece respirators.

    PubMed

    Janssen, Larry; Luinenburg, Michael D; Mullins, Haskell E; Danisch, Susan G; Nelson, Thomas J

    2003-01-01

    A method for performing quantitative fit tests (QNFT) with N95 filtering facepiece respirators was developed by earlier investigators. The method employs a simple clamping device to allow the penetration of submicron aerosols through N95 filter media to be measured. The measured value is subtracted from total penetration, with the assumption that the remaining penetration represents faceseal leakage. The developers have used the clamp to assess respirator performance. This study evaluated the clamp's ability to measure filter penetration and determine fit factors. In Phase 1, subjects were quantitatively fit-tested with elastomeric half-facepiece respirators using both generated and ambient aerosols. QNFT were done with each aerosol with both P100 and N95 filters without disturbing the facepiece. In Phase 2 of the study elastomeric half facepieces were sealed to subjects' faces to eliminate faceseal leakage. Ambient aerosol QNFT were performed with P100 and N95 filters without disturbing the facepiece. In both phases the clamp was used to measure N95 filter penetration, which was then subtracted from total penetration for the N95 QNFT. It was hypothesized that N95 fit factors corrected for filter penetration would equal the P100 fit factors. Mean corrected N95 fit factors were significantly different from the P100 fit factors in each phase of the study. In addition, there was essentially no correlation between corrected N95 fit factors and P100 fit factors. It was concluded that the clamp method should not be used to fit-test N95 filtering facepieces or otherwise assess respirator performance. PMID:12908863
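
The correction arithmetic evaluated in this study can be sketched as follows; the penetration fractions are illustrative:

```python
def corrected_fit_factor(total_penetration, filter_penetration):
    """Clamp-method arithmetic as described above: faceseal leakage is taken to
    be total penetration minus the separately measured filter penetration, and
    the corrected fit factor is the reciprocal of that leakage fraction."""
    faceseal = total_penetration - filter_penetration
    if faceseal <= 0:
        raise ValueError("filter penetration exceeds total penetration")
    return 1.0 / faceseal

# e.g. 1.2% total aerosol penetration, of which 1.0% passes through the filter
ff = corrected_fit_factor(total_penetration=0.012, filter_penetration=0.010)
```

The study's finding was that fit factors corrected this way did not match the P100 reference fit factors, which is why the authors advise against the clamp method for N95 fit testing.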

  2. A Quantitative Investigation of Stakeholder Variation in Training Program Evaluation.

    ERIC Educational Resources Information Center

    Michalski, Greg V.

    A survey was conducted to investigate variation in stakeholder perceptions of training results and evaluation within the context of a high-technology product development firm (the case organization). A scannable questionnaire survey booklet was developed and scanned data were exported and analyzed. Based on an achieved sample of 280 (70% response…

  3. AMTEC RC-10 Performance Evaluation Test Program

    NASA Astrophysics Data System (ADS)

    Schuller, Michael; Reiners, Elinor; Lemire, Robert; Sievers, Robert

    1994-07-01

    The Phillips Laboratory Power and Thermal Management Division (PL/VTP), in conjunction with ORION International Technologies, initiated the Alkali Metal Thermal to Electric Conversion (AMTEC) Remote Condensed-10% efficient (RC-10) Performance Evaluation Test Program to investigate cell design variations intended to increase the efficiency of AMTEC cells. The RC-10 cell, fabricated by Advanced Modular Power Systems, uses a remote condensing region to reduce radiative heat losses from the electrode, and has operated at 10% efficiency. PL/VTP tested the RC-10 to evaluate its performance and efficiency. The impact of temperature variations along the length of the cell wall on performance was evaluated. Testing was performed in air, with a "guard heater" surrounding the cell to simulate the cell's system environment.

  4. Quantitative vertebral compression fracture evaluation using a height compass

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Burns, Joseph E.; Wiese, Tatjana; Summers, Ronald M.

    2012-03-01

    Vertebral compression fractures can be caused by even minor trauma in patients with pathological conditions such as osteoporosis, and vary greatly in vertebral body location and compression geometry. The location and morphology of the compression injury can guide decision making for treatment modality (vertebroplasty versus surgical fixation) and can be important for pre-surgical planning. We propose a height compass to evaluate the axial-plane spatial distribution of compression injury (anterior, posterior, lateral, and central) and distinguish it from physiologic height variations of normal vertebrae. The method includes four steps: spine segmentation and partition, endplate detection, height compass computation, and compression fracture evaluation. A height compass is computed for each vertebra, where the vertebral body is partitioned in the axial plane into 17 cells oriented about concentric rings. In the compass structure, a crown-like geometry is produced by three concentric rings, which are divided into eight equal-length arcs by rays subtended by eight common central angles. The radius of each ring increases multiplicatively, with the resultant structure of a central node and two concentric surrounding bands of cells, each divided into octants. The height value for each octant is calculated and plotted against octants in neighboring vertebrae. The height compass gives an intuitive display of the height distribution and can be used to easily identify the fracture regions. Our technique was evaluated on 8 thoraco-abdominal CT scans of patients with reported compression fractures and showed statistically significant differences in height value at the sites of the fractures.
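
The 17-cell partition (a central node plus two concentric bands of eight octants) can be sketched as a point-to-cell mapping; the cell numbering and the radii passed in below are illustrative assumptions:

```python
import math

def compass_cell(x, y, r1, r2, r3):
    """Map an axial-plane point (relative to the vertebral body centroid) to one
    of 17 height-compass cells: cell 0 is the central node (r < r1); cells 1-8
    and 9-16 are the octants of the inner and outer bands. In the paper the
    ring radii grow multiplicatively; here they are free parameters."""
    r = math.hypot(x, y)
    if r >= r3:
        return None  # outside the compass
    if r < r1:
        return 0     # central node
    octant = int(((math.degrees(math.atan2(y, x)) + 360) % 360) // 45)
    band = 1 if r < r2 else 2
    return (band - 1) * 8 + octant + 1
```

Averaging a height map over each cell, and comparing cells against the corresponding cells of neighboring vertebrae, localizes the compression (anterior, posterior, lateral, or central).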

  5. Quantitative image analysis for evaluating the abrasion resistance of nanoporous silica films on glass

    PubMed Central

    Nielsen, Karsten H.; Karlsson, Stefan; Limbach, Rene; Wondraczek, Lothar

    2015-01-01

    The abrasion resistance of coated glass surfaces is an important parameter for judging lifetime performance, but practical testing procedures remain overly simplistic and often do not allow for direct conclusions about real-world degradation. Here, we combine quantitative two-dimensional image analysis and mechanical abrasion into a facile tool for probing the abrasion resistance of anti-reflective (AR) coatings. We determine variations in the average coated area, during and after controlled abrasion. Through comparison with other experimental techniques, we show that this method provides a practical, rapid and versatile tool for the evaluation of the abrasion resistance of sol-gel-derived thin films on glass. The method yields informative data, which correlates with measurements of diffuse reflectance and is further supported by qualitative investigations through scanning electron microscopy. In particular, the method directly addresses degradation of coating performance, i.e., the gradual areal loss of antireflective functionality. As an exemplary subject, we studied the abrasion resistance of state-of-the-art nanoporous SiO2 thin films which were derived from 5–6 wt% aqueous solutions of potassium silicates, or from colloidal suspensions of SiO2 nanoparticles. It is shown how abrasion resistance is governed by coating density and film adhesion, defining the trade-off between optimal AR performance and acceptable mechanical performance.
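
The areal-loss measurement can be sketched as a simple thresholded pixel count; the threshold and the synthetic before/after images below are illustrative (real analysis would add illumination correction and a data-driven threshold such as Otsu's method):

```python
import numpy as np

def coated_fraction(gray, threshold):
    """Fraction of pixels classified as still coated, taking coated regions to
    appear darker than the abraded (bright) glass in this illustration."""
    return float((gray < threshold).mean())

before = np.full((100, 100), 40)   # uniformly dark: fully coated
after = before.copy()
after[:20, :] = 200                # abrasion exposes a bright stripe

loss = coated_fraction(before, 128) - coated_fraction(after, 128)  # areal loss
```

Tracking this fraction during controlled abrasion cycles gives the gradual areal loss of AR functionality that the paper uses as its degradation measure.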

  6. Effects of Performers' External Characteristics on Performance Evaluations.

    ERIC Educational Resources Information Center

    Bermingham, Gudrun A.

    2000-01-01

    States that fairness has been a major concern in the field of music adjudication. Reviews the research literature to reveal information about three external characteristics (race, gender, and physical attractiveness) that may affect judges' performance evaluations and influence fairness of music adjudication. Includes references. (CMK)

  7. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model

    PubMed Central

    Yoshioka, S.; Matsuhana, B.; Tanaka, S.; Inouye, Y.; Oshima, N.; Kinoshita, S.

    2011-01-01

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model. PMID:20554565

  8. Introduction of a method for quantitative evaluation of spontaneous motor activity development with age in infants.

    PubMed

    Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter

    2012-04-01

    Coordination between perception and action is required to interact with the environment successfully. This is already trained by very young infants, who perform spontaneous movements to learn how their body interacts with the environment. The strategies the infants use for this purpose change with age, so very early progress in action control can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced that allows quantitative evaluation of the development of spontaneous motor activity in newborns. The methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and the subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies such as infantile cerebral palsy significantly influence the development of motor activity. Since the introduced methodology is objective and quantitative, it is suitable for monitoring how newborns train the cognitive processes that will enable them to cope with their environment through motor interaction.

  9. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model.

    PubMed

    Yoshioka, S; Matsuhana, B; Tanaka, S; Inouye, Y; Oshima, N; Kinoshita, S

    2011-01-01

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model.
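
The multilayer-interference arithmetic underlying the Venetian blind model can be sketched with the ideal-multilayer condition; the refractive indices and thicknesses below are illustrative values for guanine platelets and cytoplasm gaps, not measurements from this study:

```python
import math

def peak_wavelength(n_p, d_p, n_c, d_c, theta_deg=0.0):
    """First-order reflection peak of an ideal two-component multilayer,
    lambda = 2 * (n_p d_p cos(theta_p) + n_c d_c cos(theta_c)),
    with the propagation angle in each layer given by Snell's law from the
    incidence angle in the surrounding water (n = 1.33)."""
    n_w = 1.33
    t_p = math.asin(n_w * math.sin(math.radians(theta_deg)) / n_p)
    t_c = math.asin(n_w * math.sin(math.radians(theta_deg)) / n_c)
    return 2 * (n_p * d_p * math.cos(t_p) + n_c * d_c * math.cos(t_c))

# illustrative platelet (n ~ 1.83) and cytoplasm (n ~ 1.33) layers, nm
lam_normal = peak_wavelength(1.83, 70, 1.33, 100)
lam_tilted = peak_wavelength(1.83, 70, 1.33, 100, theta_deg=30)
```

Tilting the platelets changes the effective spacing seen by the incident light, shifting the reflection peak; measuring the reflected spectrum and direction simultaneously, as the authors did, makes that coupling testable.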

  10. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  11. Quantitative analysis combined with chromatographic fingerprint for comprehensive evaluation of Xiaoer Chaigui Tuire granules by HPLC-DAD.

    PubMed

    Liu, Hong-Ming; Nie, Lei

    2015-01-01

    Quantitative analysis of eight major components combined with chromatographic fingerprinting based on high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD) was developed for the quality evaluation of Xiaoer Chaigui Tuire granules (XCTG), a traditional Chinese medicine (TCM) preparation. Each compound was identified by comparing the retention time and UV spectrum of each chromatographic peak with those of the corresponding standard compound. Baseline separation was achieved on an Agilent Zorbax SB-C18 column with gradient elution of acetonitrile and 0.1% (v/v) phosphoric acid. The developed method was validated for linearity, precision, repeatability, stability, and recovery, and was subsequently applied to the quality evaluation of 12 batches of XCTG using similarity analysis, principal component analysis, and cluster analysis. Quantitative analysis combined with HPLC fingerprinting offers an efficient, reliable, and practical approach for quality evaluation of XCTG.
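
Fingerprint similarity in such TCM quality evaluations is commonly computed as a cosine (congruence) coefficient between fingerprint vectors; a minimal sketch, with made-up peak-area vectors:

```python
import numpy as np

def fingerprint_similarity(a, b):
    """Cosine similarity between two chromatographic fingerprints (vectors of
    peak areas, or digitized chromatograms on a common retention-time axis)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

batch1 = [120, 340, 55, 980, 210]
batch2 = [115, 360, 50, 940, 205]   # similar batch: near-identical profile
batch3 = [900, 40, 600, 10, 800]    # dissimilar profile
```

Batches whose similarity to a reference fingerprint falls below a chosen cutoff (often around 0.90) would be flagged for further examination alongside the PCA and cluster analyses.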

  12. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor, which was modified to display and edit the fault trees.
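
The object representation of a fault tree can be sketched as follows. This is a minimal modern analogue of the idea (the original used the Flavors extension to LISP, not Python); the event names and probabilities are invented, and independence of basic events is assumed:

```python
class Event:
    """Basic event with a fixed failure probability stored on the object."""
    def __init__(self, name, prob):
        self.name, self.prob = name, prob
    def probability(self):
        return self.prob

class AndGate:
    """Fails only if all child events fail (independent events assumed)."""
    def __init__(self, name, children):
        self.name, self.children = name, children
    def probability(self):
        p = 1.0
        for c in self.children:
            p *= c.probability()
        return p

class OrGate:
    """Fails if any child event fails."""
    def __init__(self, name, children):
        self.name, self.children = name, children
    def probability(self):
        p = 1.0
        for c in self.children:
            p *= 1.0 - c.probability()
        return 1.0 - p

# Hypothetical system: a redundant pump pair in parallel with a controller.
top = OrGate("system failure", [
    AndGate("redundant pair", [Event("pump A", 0.1), Event("pump B", 0.1)]),
    Event("controller", 0.01),
])
print(round(top.probability(), 5))  # 0.0199
```

Because each gate is an object, intermediate results and structural information can be cached on the nodes during tree reduction, which is the point the abstract makes.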

  13. Computerized quantitative evaluation of mammographic accreditation phantom images

    SciTech Connect

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

Purpose: The objective was to develop and investigate an automated scoring scheme for American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of region of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fiber, mass, and speck objects were 90%, 80%, and 98%, respectively. Contingency table analysis revealed a significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may provide a stable assessment of test-object visibility and of whether a mammographic accreditation phantom image meets the ACR's criteria, although there is room for improvement in the approach for fiber and mass objects.
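
A minimal sketch of the Mahalanobis-distance classification step, assuming hypothetical feature names and class statistics (the paper's actual features and training data are not reproduced here) and a diagonal covariance for simplicity:

```python
from math import sqrt

def mahalanobis(x, mean, var):
    """Mahalanobis distance assuming a diagonal covariance
    (one variance per feature)."""
    return sqrt(sum((xi - mi) ** 2 / vi for xi, mi, vi in zip(x, mean, var)))

# Hypothetical class statistics for 'fiber' vs 'mass' phantom objects
# (illustrative features: elongation, mean contrast).
classes = {
    "fiber": {"mean": [0.9, 0.3], "var": [0.01, 0.02]},
    "mass":  {"mean": [0.2, 0.6], "var": [0.02, 0.02]},
}

def classify(x):
    """Assign x to the class whose mean is nearest in Mahalanobis distance."""
    return min(classes, key=lambda c: mahalanobis(x, classes[c]["mean"],
                                                  classes[c]["var"]))

print(classify([0.85, 0.35]))  # fiber
```

Dividing each squared deviation by the class variance is what distinguishes this from plain Euclidean distance: features that vary widely within a class count for less.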

  14. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. Quantitative evaluation of the efficiency and performance of the CTSA program has significant referential meaning for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered both clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for Clinical and Translational Science and attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate the various research results.

  15. Quantitative Evaluation of Strain Near Tooth Fillet by Image Processing

    NASA Astrophysics Data System (ADS)

    Masuyama, Tomoya; Yoshiizumi, Satoshi; Inoue, Katsumi

The accurate measurement of strain and stress in a tooth is important for the reliable evaluation of the strength or life of gears. In this research, a strain measurement method based on image processing is applied to the analysis of strain near the tooth fillet. The loaded tooth is photographed using a CCD camera and stored as a digital image. The displacement of points on the tooth flank is tracked by the cross-correlation method, and the strain is then calculated. The interrogation window size of the correlation method and the overlap amount affect the accuracy and resolution. For measurements of structures with complicated profiles, such as fillets, the interrogation window should remain large and the overlap amount should also be large. The surface condition also affects the accuracy: a white-painted surface speckled with small black particles is suitable for measurement.
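
The displacement-tracking step can be illustrated with a one-dimensional normalized cross-correlation search; the signals, window size, and shift range below are invented toy values, far simpler than real tooth-flank images:

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-length windows."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def track(reference, deformed, win, max_shift):
    """Find the integer shift that best aligns a window taken from
    `reference` within the `deformed` signal."""
    template = reference[:win]
    return max(range(max_shift + 1),
               key=lambda s: ncc(template, deformed[s:s + win]))

ref = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0]
shifted = [0, 0, 0, 1, 4, 9, 4, 1, 0, 0]  # the pattern moved 2 samples
print(track(ref, shifted, win=6, max_shift=4))  # 2
```

In the paper's 2-D setting the same search runs over image patches, and the strain is derived from the gradient of the resulting displacement field; sub-pixel interpolation, omitted here, is what makes the window-size/overlap trade-off matter.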

  16. Towards the quantitative evaluation of visual attention models.

    PubMed

    Bylinskii, Z; DeGennaro, E M; Rajalingham, R; Ruda, H; Zhang, J; Tsotsos, J K

    2015-11-01

    Scores of visual attention models have been developed over the past several decades of research. Differences in implementation, assumptions, and evaluations have made comparison of these models very difficult. Taxonomies have been constructed in an attempt at the organization and classification of models, but are not sufficient at quantifying which classes of models are most capable of explaining available data. At the same time, a multitude of physiological and behavioral findings have been published, measuring various aspects of human and non-human primate visual attention. All of these elements highlight the need to integrate the computational models with the data by (1) operationalizing the definitions of visual attention tasks and (2) designing benchmark datasets to measure success on specific tasks, under these definitions. In this paper, we provide some examples of operationalizing and benchmarking different visual attention tasks, along with the relevant design considerations.

  18. Quantitative Ultrasonic Evaluation of Mechanical Properties of Engineering Materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength of engineering materials is reviewed. A dormant concept in nondestructive evaluation (NDE) is invoked. The availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions is discussed. It was shown that ultrasonic methods yield measurements of elastic moduli, microstructure, hardness, fracture toughness, tensile strength, yield strength, and shear strength for a wide range of materials (including many types of metals, ceramics, and fiber composites). It was also indicated that although most of these methods were shown feasible in laboratory studies, more work is needed before they can be used on actual parts in processing, assembly, inspection, and maintenance lines.

  19. A Quantitative Evaluation of Medication Histories and Reconciliation by Discipline

    PubMed Central

    Stewart, Michael R.; Fogg, Sarah M.; Schminke, Brandon C.; Zackula, Rosalee E.; Nester, Tina M.; Eidem, Leslie A.; Rosendale, James C.; Ragan, Robert H.; Bond, Jack A.; Goertzen, Kreg W.

    2014-01-01

    Abstract Background/Objective: Medication reconciliation at transitions of care decreases medication errors, hospitalizations, and adverse drug events. We compared inpatient medication histories and reconciliation across disciplines and evaluated the nature of discrepancies. Methods: We conducted a prospective cohort study of patients admitted from the emergency department at our 760-bed hospital. Eligible patients had their medication histories conducted and reconciled in order by the admitting nurse (RN), certified pharmacy technician (CPhT), and pharmacist (RPh). Discharge medication reconciliation was not altered. Admission and discharge discrepancies were categorized by discipline, error type, and drug class and were assigned a criticality index score. A discrepancy rating system systematically measured discrepancies. Results: Of 175 consented patients, 153 were evaluated. Total admission and discharge discrepancies were 1,461 and 369, respectively. The average number of medications per participant at admission was 8.59 (1,314) with 9.41 (1,374) at discharge. Most discrepancies were committed by RNs: 53.2% (777) at admission and 56.1% (207) at discharge. The majority were omitted or incorrect. RNs had significantly higher admission discrepancy rates per medication (0.59) compared with CPhTs (0.36) and RPhs (0.16) (P < .001). RPhs corrected significantly more discrepancies per participant than RNs (6.39 vs 0.48; P < .001); average criticality index reduction was 79.0%. Estimated prevented adverse drug events (pADEs) cost savings were $589,744. Conclusions: RPhs committed the fewest discrepancies compared with RNs and CPhTs, resulting in more accurate medication histories and reconciliation. RPh involvement also prevented the greatest number of medication errors, contributing to considerable pADE-related cost savings. PMID:25477614

  20. Smith Newton Vehicle Performance Evaluation (Brochure)

    SciTech Connect

    Not Available

    2012-08-01

    The Fleet Test and Evaluation Team at the U.S. Department of Energy's National Renewable Energy Laboratory is evaluating and documenting the performance of electric and plug-in hybrid electric drive systems in medium-duty trucks across the nation. Through this project, Smith Electric Vehicles will build and deploy 500 all-electric medium-duty trucks. The trucks will be deployed in diverse climates across the country.

  1. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature identification during diagnosis. A quantitative and simple distortion evaluation method is therefore needed by both the endoscopic industry and medical device regulatory agencies; however, no such method is yet available. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models, which makes them difficult to understand. Commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion and, based on it, ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has a clear physical meaning across the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can help bring endoscopic technology to market and could be adopted in an international endoscope standard.
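
The basic idea of local magnification can be sketched as the ratio of image-space to object-space spacing between adjacent points of a grid target, with distortion expressed as the relative change of ML against the image center. The measured radii below are hypothetical values chosen to mimic barrel distortion, not data from the paper:

```python
# Hypothetical measured image radii (mm) of equally spaced object points
# along one radius of a grid target; object spacing is 1 mm.
object_r = [0.0, 1.0, 2.0, 3.0, 4.0]
image_r  = [0.0, 2.0, 3.9, 5.6, 7.0]  # compression toward the edge

def local_magnification(obj, img):
    """Local magnification between adjacent grid points: d(image)/d(object)."""
    return [(img[i + 1] - img[i]) / (obj[i + 1] - obj[i])
            for i in range(len(obj) - 1)]

ml = local_magnification(object_r, image_r)
center = ml[0]
# Relative change of local magnification vs. the center, in percent.
distortion = [round(100 * (m / center - 1), 1) for m in ml]
print(distortion)  # [0.0, -5.0, -15.0, -30.0]
```

Because ML is defined at every field position, a radial distortion figure like DRAD falls out of the same data, which is the sense in which the method has "clear physical meaning in the whole field of view."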

  2. Quantitative evaluation of MPTP-treated nonhuman parkinsonian primates in the HALLWAY task.

    PubMed

    Campos-Romo, Aurelio; Ojeda-Flores, Rafael; Moreno-Briseño, Pablo; Fernandez-Ruiz, Juan

    2009-03-15

Parkinson's disease (PD) is a progressive neurodegenerative disorder. An experimental model of this disease is produced in nonhuman primates by administration of the neurotoxin 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP). In this work, we put forward a new quantitative evaluation method that uses video recordings to measure the displacement, gait, and gross and fine motor performance of freely moving subjects. Four Vervet monkeys (Cercopithecus aethiops) were trained in a behavioral observation hallway while being recorded with digital video cameras from four different angles. After MPTP intoxication the animals were tested without any drug and after 30 and 90 min of Levodopa/Carbidopa administration. Using a personal computer, the following behaviors were measured and evaluated from the video recordings: displacement time across the hallway, reaching time towards rewards, ingestion time, number of attempts to obtain rewards, number of rewards obtained, and level of the highest shelf reached for rewards. Our results show that there was an overall behavioral deterioration after MPTP administration and an overall improvement after Levodopa/Carbidopa treatment. This demonstrates that the HALLWAY task is a sensitive and objective method that allows detailed behavioral evaluation of freely moving monkeys in the MPTP Parkinson's disease model.

  3. Noninvasive Quantitative Evaluation of the Dentin Layer during Dental Procedures Using Optical Coherence Tomography

    PubMed Central

    Sinescu, Cosmin; Negrutiu, Meda Lavinia; Bradu, Adrian; Duma, Virgil-Florin; Podoleanu, Adrian Gh.

    2015-01-01

A routine cavity preparation of a tooth may lead to opening the pulp chamber. The present study evaluates drilled cavities quantitatively, in real time, during dental procedures, for the first time to the best of our knowledge. An established noninvasive imaging technique, Optical Coherence Tomography (OCT), is used. The main scope is to prevent accidental openings of the dental pulp chamber. Six teeth with dental cavities were used in this ex vivo study. The real-time assessment of the distances between the bottom of the drilled cavities and the top of the pulp chamber was performed using a custom-assembled OCT system. The evaluation of the remaining dentin thickness (RDT) allowed for the positioning of the drilling tools in the cavities in relation to the pulp horns. Estimations of the safe and of the critical RDT were made; for the latter, the opening of the pulp chamber becomes unavoidable. Also, by following the fractures that can occur when the extent of the decay is too large, the dentist can decide upon the right therapy to follow, endodontic or conventional filling. The study demonstrates the usefulness of OCT imaging in guiding such evaluations during dental procedures. PMID:26078779
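
Converting an OCT A-scan measurement to a remaining dentin thickness amounts to scaling the pixel distance between the cavity floor and the pulp-chamber roof by the axial pixel size and the refractive index of dentin. The function below is a sketch with assumed calibration values (4 µm per pixel, n ≈ 1.54); the paper's actual system parameters are not reproduced here:

```python
def remaining_dentin_thickness(cavity_floor_px, pulp_roof_px,
                               pixel_um=4.0, n_dentin=1.54):
    """Geometric remaining dentin thickness (micrometres) from the pixel
    indices of the cavity floor and pulp-chamber roof in an OCT A-scan.
    OCT measures optical path length, so the pixel distance is divided
    by the refractive index of dentin."""
    return (pulp_roof_px - cavity_floor_px) * pixel_um / n_dentin

# With these assumed values, 385 pixels of optical path ~ 1 mm of dentin.
print(round(remaining_dentin_thickness(100, 485), 1))  # 1000.0
```

Comparing this value against the safe and critical RDT thresholds is what lets the operator stop drilling before the pulp chamber is opened.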

  4. Performance and Evaluation of LISP Systems

    SciTech Connect

    Gabriel, R.P.

    1985-01-01

    The final report of the Stanford Lisp Performance Study, Performance and Evaluation of Lisp Systems is the first book to present descriptions on Lisp implementation techniques actually in use. It provides performance information using the tools of benchmarking to measure the various Lisp systems, and provides an understanding of the technical tradeoffs made during the implementation of a Lisp system. The study is divided into three parts. The first provides the theoretical background, outlining the factors that go into evaluating the performance of a Lisp system. The second part presents the Lisp implementations: MacLisp, MIT CADR, LMI Lambda, S-I Lisp, Franz Lisp, MIL, Spice Lisp, Vax Common Lisp, Portable Standard Lisp, and Xerox D-Machine. A final part describes the benchmark suite that was used during the major portion of the study and the results themselves.

  5. A new method to evaluate human-robot system performance

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Weisbin, C. R.

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  6. A new method to evaluate human-robot system performance.

    PubMed

    Rodriguez, G; Weisbin, C R

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  7. Evaluation of the National Science Foundation's Local Course Improvement Program, Volume II: Quantitative Analyses.

    ERIC Educational Resources Information Center

    Kulik, James A.; And Others

    This report is the second of three volumes describing the results of the evaluation of the National Science Foundation (NSF) Local Course Improvement (LOCI) program. This volume describes the quantitative results of the program. Evaluation of the LOCI program involved answering questions in the areas of the need for science course improvement as…

  8. Smith Newton Vehicle Performance Evaluation - Cumulative (Brochure)

    SciTech Connect

    Not Available

    2014-08-01

    The Fleet Test and Evaluation Team at the U.S. Department of Energy's National Renewable Energy Laboratory is evaluating and documenting the performance of electric and plug-in hybrid electric drive systems in medium-duty trucks across the nation. U.S. companies participating in this evaluation project received funding from the American Recovery and Reinvestment Act to cover part of the cost of purchasing these vehicles. Through this project, Smith Electric Vehicles is building and deploying 500 all-electric medium-duty trucks that will be deployed by a variety of companies in diverse climates across the country.

  9. Prospective safety performance evaluation on construction sites.

    PubMed

    Wu, Xianguo; Liu, Qian; Zhang, Limao; Skibniewski, Miroslaw J; Wang, Yanhong

    2015-05-01

This paper presents a systematic Structural Equation Modeling (SEM) based approach for Prospective Safety Performance Evaluation (PSPE) on construction sites, with causal relationships and interactions between enablers and the goals of PSPE taken into account. Based on a sample of 450 valid questionnaire surveys from 30 Chinese construction enterprises, a SEM model with 26 items for PSPE in the context of the Chinese construction industry is established and then verified through a goodness-of-fit test. Three typical types of construction enterprises, namely the state-owned enterprise, the private enterprise and the Sino-foreign joint venture, are selected as samples to measure the level of safety performance, given that enterprise scale, ownership and business strategy differ among them. Results provide a full understanding of safety performance practice in the construction industry, and indicate that overall safety performance on working sites is rated at level III (Fair) or above. This can be explained by the fact that the construction industry has gradually matured under these norms, and construction enterprises must improve their safety performance so as not to be eliminated from the government-led construction industry. The differences in safety performance practice among construction enterprise categories are compared and analyzed according to the evaluation results. This research provides insights into cause-effect relationships among safety performance factors and goals, which, in turn, can facilitate the improvement of safety performance in the construction industry.

  11. SAT-M Performance of Women Intending Quantitative Fields of Study.

    ERIC Educational Resources Information Center

    Ethington, Corinna A.

    This study assessed patterns of differences in quantitative performance across groups of intended undergraduate majors consistent with those previously found for students who had completed their undergraduate study. Data were drawn from the College Board Admissions Testing Program's national sample of 10,000 college-bound high school seniors in…

  12. A statistical model for assessing performance standards for quantitative and semiquantitative disinfectant test methods.

    PubMed

    Parker, Albert E; Hamilton, Martin A; Tomasino, Stephen F

    2014-01-01

A performance standard for a disinfectant test method can be evaluated by quantifying the (Type I) pass-error rate for ineffective products and the (Type II) fail-error rate for highly effective products. This paper shows how to calculate these error rates for test methods where the log reduction in a microbial population is used as a measure of antimicrobial efficacy. The calculations can be used to assess performance standards that may require multiple tests of multiple microbes at multiple laboratories. Notably, the error rates account for among-laboratory variance of the log reductions estimated from a multilaboratory data set and the correlation among tests of different microbes conducted in the same laboratory. Performance standards that require a disinfectant product to pass all tests, or multiple tests on average, are considered. The proposed statistical methodology is flexible and allows for a different acceptable outcome for each microbe tested, since, for example, variability may be different for different microbes. The approach can also be applied to semiquantitative methods for which product efficacy is reported as the number of positive carriers out of a treated set and the density of the microbes on control carriers is quantified, thereby allowing a log reduction to be calculated. Therefore, using the approach described in this paper, the error rates can also be calculated for semiquantitative method performance standards specified solely in terms of the maximum allowable number of positive carriers per test. The calculations are demonstrated in a case study of the current performance standard for the semiquantitative AOAC Use-Dilution Methods for Pseudomonas aeruginosa (964.02) and Staphylococcus aureus (955.15), which allow up to one positive carrier out of a set of 60 inoculated and treated carriers in each test. A simulation study was also conducted to verify the validity of the model's assumptions and accuracy.
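
The pass/fail error-rate calculation can be illustrated with a simple Monte Carlo sketch of an "all tests must meet the standard" rule under among-laboratory and within-laboratory variability. The standard, standard deviations, and true log reductions below are hypothetical, not the values for the AOAC methods:

```python
import random

def pass_probability(true_lr, lab_sd, test_sd, standard=5.0,
                     n_labs=3, trials=10000, seed=1):
    """Monte Carlo estimate of the probability that a product passes a
    'every lab's test must meet the log-reduction standard' rule, with
    among-lab bias and within-lab test noise both modeled as Gaussian."""
    rng = random.Random(seed)
    passes = 0
    for _ in range(trials):
        ok = True
        for _ in range(n_labs):
            lab_bias = rng.gauss(0.0, lab_sd)
            lr = true_lr + lab_bias + rng.gauss(0.0, test_sd)
            if lr < standard:
                ok = False
                break
        passes += ok
    return passes / trials

# Type I (pass) error: an ineffective product (true LR = 4) slipping through.
# Type II (fail) error: an effective product (true LR = 7) failing.
type1 = pass_probability(true_lr=4.0, lab_sd=0.5, test_sd=0.5)
type2 = 1.0 - pass_probability(true_lr=7.0, lab_sd=0.5, test_sd=0.5)
print(round(type1, 4), round(type2, 4))
```

The paper derives these rates analytically, which also handles correlation among microbes tested in the same laboratory; the simulation above only conveys the structure of the calculation.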

  13. Quantitative Evaluation of the Stability of Engineered Water Soluble Nanoparticles

    NASA Astrophysics Data System (ADS)

    Mulvihill, M. J.; Habas, S.; Mokari, T.; Wan, J.

    2009-12-01

Stability of nanoparticle solutions is a key factor dictating the bioavailability and transport characteristics of nanoparticles (NPs) in the environment. The synthesis of materials with dimensions less than 100 nm relies on the ability to stabilize surfaces. If the stabilization of the material is disrupted by aggregation, precipitation, or dissolution, the chemical and physical properties often revert to those of the bulk material or molecular constituents. We synthesized CdSe and gold NPs and studied their aggregation rate and critical coagulation concentration (CCC) using Dynamic Light Scattering (DLS). The chemical and physical properties of our NPs have been characterized by Transmission Electron Microscopy (TEM), UV-VIS spectroscopy, IR spectroscopy, zeta potential measurements, and Nuclear Magnetic Resonance (NMR) measurements. This comprehensive approach to synthesis and characterization enables the isolation of design parameters with greater precision than can be obtained using commercially available NPs. This research evaluates NP design parameters, including composition, size, and surface coating, as a function of concentration, pH, and ionic strength, to determine which factors most affect NP stability. The aggregation characteristics of both gold and cadmium selenide NPs, 2-12 nm in diameter and capped with various ligands, have been studied. While previous work demonstrates that these variables influence stability, it does not systematically compare their relative significance. Our results indicate that changing the ligand shell radically affects the stability of NPs as a function of both pH and ionic strength, while changing the material from CdSe to gold has only a moderate influence on the stability and aggregation characteristics of our particles. Additionally, the ligand charge, length, and binding affinity all significantly affect NP stability. Funding was provided by the U.S. Department of Energy.

  14. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  15. Hypersonic Interceptor Performance Evaluation Center aero-optics performance predictions

    NASA Astrophysics Data System (ADS)

    Sutton, George W.; Pond, John E.; Snow, Ronald; Hwang, Yanfang

    1993-06-01

This paper describes the Hypersonic Interceptor Performance Evaluation Center's (HIPEC) aero-optics performance prediction capability. It includes code results for three-dimensional shapes and comparisons to initial experiments. HIPEC consists of a collection of aerothermal and aerodynamic computational codes that are capable of covering the entire flight regime from subsonic to hypersonic flow and include chemical reactions and turbulence. Heat transfer to the various surfaces is calculated as an input to cooling and ablation processes. HIPEC also has aero-optics codes to determine the effect of the mean flowfield and turbulence on the tracking and imaging capability of on-board optical sensors. The paper concentrates on the latter aspects.

  16. A quantitative evaluation of dry-sensor electroencephalography

    NASA Astrophysics Data System (ADS)

    Uy, E. Timothy

    Neurologists, neuroscientists, and experimental psychologists study electrical activity within the brain by recording voltage fluctuations at the scalp. This is electroencephalography (EEG). In conventional or "wet" EEG, scalp abrasion and use of electrolytic paste are required to ensure a good electrical connection between sensor and skin. Repeated abrasion quickly becomes irritating to subjects, severely limiting the number and frequency of sessions. Several groups have produced "dry" EEG sensors that do not require abrasion or conductive paste. These, in addition to sidestepping the issue of abrasion, promise to reduce setup time from about 30 minutes with a technician to less than 30 seconds without one. The availability of such an instrument would (1) reduce the cost of brain-related medical care, (2) lower the barrier of entry on brain experimentation, and (3) allow individual subjects to contribute substantially more data without fear of abrasion or fatigue. Accuracy of the EEG is paramount in the medical diagnosis of epilepsy, in experimental psychology and in the burgeoning field of brain-computer interface. Without a sufficiently accurate measurement, the advantages of dry sensors remain a moot point. However, even after nearly a decade, demonstrations of dry EEG accuracy with respect to wet have been limited to visual comparison of short snippets of spontaneous EEG, averaged event-related potentials or plots of power spectrum. In this dissertation, I propose a detailed methodology based on single-trial EEG classification for comparing dry EEG sensors to their wet counterparts. Applied to a set of commercially fabricated dry sensors, this work reveals that dry sensors can perform as well as their wet counterparts with careful screening and attention to the bandwidth of interest.

  17. Evaluating Performance Portability of OpenACC

    SciTech Connect

    Sabne, Amit J; Sakdhnagool, Putt; Lee, Seyong; Vetter, Jeffrey S

    2015-01-01

    Accelerator-based heterogeneous computing is gaining momentum in the High Performance Computing arena. However, the increased complexity of accelerator architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle the problem. While the abstraction endowed by OpenACC offers productivity, it raises questions about its portability. This paper evaluates the performance portability obtained by OpenACC on twelve OpenACC programs on NVIDIA CUDA, AMD GCN, and Intel MIC architectures. We study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.

  18. Performance evaluation of an air solar collector

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Indoor tests on single-glazed flat-plate collector are described in report. Marshall Space Flight Center solar simulator is used to make tests. Tests included evaluations of thermal performance under various combinations of flow rate, incident flux, inlet temperature, and wind speed. Results are presented in graph/table form.

  19. ASBESTOS IN DRINKING WATER PERFORMANCE EVALUATION STUDIES

    EPA Science Inventory

    Performance evaluations of laboratories testing for asbestos in drinking water according to USEPA Test Method 100.1 or 100.2 are complicated by the difficulty of providing stable sample dispersions of asbestos in water. Reference samples of a graduated series of chrysotile asbes...

  1. A New Approach to Evaluating Performance.

    PubMed

    Bleich, Michael R

    2016-09-01

    A leadership task is evaluating the performance of individuals for organizational fit. Traditional approaches have included leader-subordinate reviews, self-review, and peer review. A new approach is evolving in team-based organizations, introduced in this article. J Contin Educ Nurs. 2016;47(9):393-394. PMID:27580504

  2. An Evaluation of a Performance Contract.

    ERIC Educational Resources Information Center

    Dembo, Myron H.; Wilson, Donald E.

    This paper reports an evaluation of a performance contract in reading with 2,500 seventh-grade students. Seventy-five percent of the students were to increase their reading speed five times over their beginning level with ten percent more comprehension after three months of instruction. Results indicated that only thirteen percent of the students…

  3. GENERAL METHODS FOR REMEDIAL PERFORMANCE EVALUATIONS

    EPA Science Inventory

    This document was developed by an EPA-funded project to explain technical considerations and principles necessary to evaluate the performance of ground-water contamination remediations at hazardous waste sites. This is neither a "cookbook", nor an encyclopedia of recommended fi...

  4. EVALUATION OF CONFOCAL MICROSCOPY SYSTEM PERFORMANCE

    EPA Science Inventory

    BACKGROUND. The confocal laser scanning microscope (CLSM) has enormous potential in many biological fields. Currently there is a subjective nature in the assessment of a confocal microscope's performance by primarily evaluating the system with a specific test slide provided by ea...

  5. Simulation-based evaluation of the resolution and quantitative accuracy of temperature-modulated fluorescence tomography

    PubMed Central

    Lin, Yuting; Nouizi, Farouk; Kwong, Tiffany C.; Gulsen, Gultekin

    2016-01-01

    Conventional fluorescence tomography (FT) can recover the distribution of fluorescent agents within a highly scattering medium. However, poor spatial resolution remains its foremost limitation. Previously, we introduced a new fluorescence imaging technique termed “temperature-modulated fluorescence tomography” (TM-FT), which provides high-resolution images of fluorophore distribution. TM-FT is a multimodality technique that combines fluorescence imaging with focused ultrasound to locate thermo-sensitive fluorescence probes using a priori spatial information to drastically improve the resolution of conventional FT. In this paper, we present an extensive simulation study to evaluate the performance of the TM-FT technique on complex phantoms with multiple fluorescent targets of various sizes located at different depths. In addition, the performance of the TM-FT is tested in the presence of background fluorescence. The results obtained using our new method are systematically compared with those obtained with the conventional FT. Overall, TM-FT provides higher resolution and superior quantitative accuracy, making it an ideal candidate for in vivo preclinical and clinical imaging. For example, a 4 mm diameter inclusion positioned in the middle of a synthetic slab geometry phantom (D: 40 mm × W: 100 mm) is recovered as an elongated object in the conventional FT (x = 4.5 mm; y = 10.4 mm), while TM-FT recovers it successfully in both directions (x = 3.8 mm; y = 4.6 mm). As a result, the quantitative accuracy of the TM-FT is superior because it recovers the concentration of the agent with a 22% error, which is in contrast with the 83% error of the conventional FT. PMID:26368884

  6. Performance and race in evaluating minority mayors.

    PubMed

    Howell, S E

    2001-01-01

    This research compares a performance model to a racial model in explaining approval of a black mayor. The performance model emphasizes citizen evaluations of conditions in the city and the mayor's perceived effectiveness in dealing with urban problems. The racial model stipulates that approval of a black mayor is based primarily on racial identification or racism. A model of mayoral approval is tested with two surveys over different years of citizens in a city that has had 20 years' experience with black mayors. Findings indicate that performance matters when evaluating black mayors, indicating that the national performance models of presidential approval are generalizable to local settings with black executives. Implications for black officeholders are discussed. However, the racial model is alive and well, as indicated by its impact on approval and the finding that, in this context, performance matters more to white voters than to black voters. A final, highly tentative conclusion is offered that context conditions the relative power of these models. The performance model may explain more variation in approval of the black mayor than the racial model in a context of rapidly changing city conditions that focuses citizen attention on performance, but during a period of relative stability the two models are evenly matched.

  7. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  8. 40 CFR 63.5850 - How do I conduct performance tests, performance evaluations, and design evaluations?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Pollutants: Reinforced Plastic Composites Production Testing and Initial Compliance Requirements § 63.5850... performance test, performance evaluation, and design evaluation in 40 CFR part 63, subpart SS, that applies to... requirements in § 63.7(e)(1) and under the specific conditions that 40 CFR part 63, subpart SS, specifies....

  9. 40 CFR 63.5850 - How do I conduct performance tests, performance evaluations, and design evaluations?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Pollutants: Reinforced Plastic Composites Production Testing and Initial Compliance Requirements § 63.5850... performance test, performance evaluation, and design evaluation in 40 CFR part 63, subpart SS, that applies to... requirements in § 63.7(e)(1) and under the specific conditions that 40 CFR part 63, subpart SS, specifies....

  10. 40 CFR 63.5850 - How do I conduct performance tests, performance evaluations, and design evaluations?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... performance test, performance evaluation, and design evaluation in 40 CFR part 63, subpart SS, that applies to... requirements in § 63.7(e)(1) and under the specific conditions that 40 CFR part 63, subpart SS, specifies. (c... and under the specific conditions that 40 CFR part 63, subpart SS, specifies. (d) You may not...

  11. Blind Source Parameters for Performance Evaluation of Despeckling Filters.

    PubMed

    Biradar, Nagashettappa; Dewal, M L; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

    Speckle noise is inherent to transthoracic echocardiographic images, and a standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may therefore not reflect true filter performance on echocardiographic images. Instead, despeckling performance can be evaluated using blind assessment metrics such as the speckle suppression index (SSI), the speckle suppression and mean preservation index (SMPI), and the beta metric; these three parameters remove the need for a noise-free reference image. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters, along with clinical validation. The noise is effectively suppressed using logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet-based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable, whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images. PMID:27298618
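
    As a rough illustration of how such reference-free assessment works, the sketch below implements the three blind metrics named in the abstract, using definitions common in the despeckling literature (the paper's exact normalizations may differ); the test image and the crude mean filter are purely illustrative:

```python
import numpy as np

def ssi(original, filtered):
    """Speckle Suppression Index: ratio of coefficients of variation.
    Values below 1 indicate speckle suppression (lower is better)."""
    cv_f = filtered.std() / filtered.mean()
    cv_o = original.std() / original.mean()
    return cv_f / cv_o

def smpi(original, filtered):
    """Speckle Suppression and Mean Preservation Index: like SSI but
    penalizing filters that shift the mean brightness (lower is better)."""
    q = 1.0 + abs(original.mean() - filtered.mean())
    return q * filtered.std() / original.std()

def beta_metric(original, filtered):
    """Edge-preservation beta: correlation between the Laplacians of the
    original and filtered images (closer to 1 is better)."""
    def laplacian(img):
        # 4-neighbour discrete Laplacian with edge padding
        p = np.pad(img.astype(float), 1, mode="edge")
        return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
                - 4 * p[1:-1, 1:-1])
    do = laplacian(original) - laplacian(original).mean()
    df = laplacian(filtered) - laplacian(filtered).mean()
    return (do * df).sum() / np.sqrt((do ** 2).sum() * (df ** 2).sum())

# Toy check: smoothing a speckled image should give SSI < 1
rng = np.random.default_rng(0)
clean = np.tile(np.linspace(50, 200, 64), (64, 1))
noisy = clean * rng.gamma(4, 0.25, clean.shape)       # multiplicative speckle
smooth = (noisy + np.roll(noisy, 1, 0) + np.roll(noisy, 1, 1)
          + np.roll(noisy, 1, (0, 1))) / 4            # crude 2x2 mean filter
print(ssi(noisy, smooth) < 1.0)
```

    An identity "filter" scores SSI = SMPI = beta = 1 under these definitions, which makes the three scales easy to interpret.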

  13. Behavioral patterns of environmental performance evaluation programs.

    PubMed

    Li, Wanxin; Mauerhofer, Volker

    2016-11-01

    During the past decades numerous environmental performance evaluation programs have been developed and implemented on different geographic scales. This paper develops a taxonomy of environmental management behavioral patterns in order to provide a practical comparison tool for environmental performance evaluation programs. Ten such programs, purposively selected, are mapped against four identified behavioral patterns: diagnosis, negotiation, learning, and socialization and learning. Overall, we found that schemes which serve to diagnose environmental abnormalities are mainly externally imposed and have been developed as a result of technical debates concerning data sources, methodology, and ranking criteria. Learning-oriented schemes are characterized by processes through which free exchange of ideas and mutual, adaptive learning can occur. Schemes developed by a higher authority to influence the behavior of lower levels of government have been adopted by the evaluated parties to signal their excellent environmental performance. The socialization-and-learning evaluation schemes have incorporated dialogue, participation, and capacity building in program design. In conclusion we consider the 'fitness for purpose' of the various schemes, the merits of our analytical model, and the future possibilities of fostering capacity building in the realm of wicked environmental challenges. PMID:27513220

  15. Error Reduction Program. [combustor performance evaluation codes

    NASA Technical Reports Server (NTRS)

    Syed, S. A.; Chiappetta, L. M.; Gosman, A. D.

    1985-01-01

    The details of a study to select, incorporate and evaluate the best available finite difference scheme to reduce numerical error in combustor performance evaluation codes are described. The combustor performance computer programs chosen were the two dimensional and three dimensional versions of Pratt & Whitney's TEACH code. The criteria used to select schemes required that the difference equations mirror the properties of the governing differential equation, be more accurate than the current hybrid difference scheme, be stable and economical, be compatible with TEACH codes, use only modest amounts of additional storage, and be relatively simple. The methods of assessment used in the selection process consisted of examination of the difference equation, evaluation of the properties of the coefficient matrix, Taylor series analysis, and performance on model problems. Five schemes from the literature and three schemes developed during the course of the study were evaluated. This effort resulted in the incorporation of a scheme in 3D-TEACH which is usually more accurate than the hybrid differencing method and never less accurate.

  16. Hydroponic isotope labeling of entire plants and high-performance mass spectrometry for quantitative plant proteomics.

    PubMed

    Bindschedler, Laurence V; Mills, Davinia J S; Cramer, Rainer

    2012-01-01

    Hydroponic isotope labeling of entire plants (HILEP) combines hydroponic plant cultivation and metabolic labeling with stable isotopes using (15)N-containing inorganic salts to label whole and mature plants. Employing (15)N salts as the sole nitrogen source for HILEP leads to the production of healthy-looking plants which contain (15)N proteins labeled to nearly 100%. Therefore, HILEP is suitable for quantitative plant proteomic analysis, where plants are grown in either (14)N- or (15)N-hydroponic media and pooled when the biological samples are collected for relative proteome quantitation. The pooled (14)N-/(15)N-protein extracts can be fractionated in any suitable way and digested with a protease for shotgun proteomics, typically using reverse-phase liquid chromatography nanoelectrospray ionization tandem mass spectrometry (RPLC-nESI-MS/MS). Best results were obtained with a hybrid ion trap/FT-MS mass spectrometer, combining high mass accuracy and sensitivity for the MS data acquisition with speed and high-throughput MS/MS data acquisition, increasing the number of proteins identified and quantified and improving protein quantitation. Peak processing and picking from raw MS data files, protein identification, and quantitation were performed in a highly automated way using integrated MS data analysis software with minimum manual intervention, thus easing the analytical workflow. In this methodology paper, we describe how to grow Arabidopsis plants hydroponically for isotope labeling using (15)N salts and how to quantitate the resulting proteomes using a convenient workflow that does not require extensive bioinformatics skills.

  17. Performance evaluation of fingerprint verification systems.

    PubMed

    Cappelli, Raffaele; Maio, Dario; Maltoni, Davide; Wayman, James L; Jain, Anil K

    2006-01-01

    This paper is concerned with the performance evaluation of fingerprint verification systems. After an initial classification of biometric testing initiatives, we explore both the theoretical and practical issues related to performance evaluation by presenting the outcome of the recent Fingerprint Verification Competition (FVC2004). FVC2004 was organized by the authors of this work for the purpose of assessing the state-of-the-art in this challenging pattern recognition application and making available a new common benchmark for an unambiguous comparison of fingerprint-based biometric systems. FVC2004 is an independent, strongly supervised evaluation performed at the evaluators' site on evaluators' hardware. This allowed the test to be completely controlled and the computation times of different algorithms to be fairly compared. The experience and feedback received from previous, similar competitions (FVC2000 and FVC2002) allowed us to improve the organization and methodology of FVC2004 and to capture the attention of a significantly higher number of academic and commercial organizations (67 algorithms were submitted for FVC2004). A new, "Light" competition category was included to estimate the loss of matching performance caused by imposing computational constraints. This paper discusses data collection and testing protocols, and includes a detailed analysis of the results. We introduce a simple but effective method for comparing algorithms at the score level, allowing us to isolate difficult cases (images) and to study error correlations and algorithm "fusion." The huge amount of information obtained, including a structured classification of the submitted algorithms on the basis of their features, makes it possible to better understand how current fingerprint recognition systems work and to delineate useful research directions for the future.

  18. Performance Evaluation of a Data Validation System

    NASA Technical Reports Server (NTRS)

    Wong, Edmond (Technical Monitor); Sowers, T. Shane; Santi, L. Michael; Bickford, Randall L.

    2005-01-01

    Online data validation is a performance-enhancing component of modern control and health management systems. It is essential that performance of the data validation system be verified prior to its use in a control and health management system. A new Data Qualification and Validation (DQV) Test-bed application was developed to provide a systematic test environment for this performance verification. The DQV Test-bed was used to evaluate a model-based data validation package known as the Data Quality Validation Studio (DQVS). DQVS was employed as the primary data validation component of a rocket engine health management (EHM) system developed under NASA's NGLT (Next Generation Launch Technology) program. In this paper, the DQVS and DQV Test-bed software applications are described, and the DQV Test-bed verification procedure for this EHM system application is presented. Test-bed results are summarized and implications for EHM system performance improvements are discussed.

  19. OMPS SDR Status and Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Pan, S.; Weng, F.; Wu, X.; Flynn, L. E.; Jaross, G.; Buss, R. H.; Niu, J.; Seftor, C. J.

    2012-12-01

    Launched on October 28, 2011, OMPS has successfully passed through its early operational phases, from Launch, Early Orbit, and Activation (LEO&A) to Early Orbit Checkout (EOC), and is currently in the Intensive Cal/Val (ICV) phase. OMPS data gathered during the on-orbit calibration and validation activities allow us to evaluate the instrument's on-orbit performance and validate the Sensor Data Records (SDRs). Detector trends show that offset, gain, and dark current rates remain within 0.2% of the pre-launch values, with significant margin below sensor requirements. Detector gain and offset trends are generally stable, and observed solar irradiance is within an average of 2% of predicted values. This presentation will update the status of the OMPS SDRs with newly established calibration measurements. Examples of analyses of dark calibration, linearity performance, solar irradiance validation, sensor noise, and wavelength change are provided.

  20. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

    Atherosclerosis is a primary cause of critical ischemic diseases like heart infarction or stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that could evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber having 0.7 mm outer diameter, and an irradiation fiber which consists of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, 20 vol.% of lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected by an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of atherosclerotic plaque phantoms based on the lipid volume fractions was performed up to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
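
    The spectral angle mapper (SAM) processing mentioned above compares each pixel's spectrum with reference spectra by the angle between them, which makes the classification insensitive to overall illumination scaling. A minimal sketch follows; the three-band spectra and the 0.2 rad threshold are illustrative assumptions, not values from the paper:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a measured spectrum and a
    reference spectrum; smaller angles mean a closer spectral match."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel)
                                      * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def sam_classify(cube, references, threshold):
    """Label each pixel of an (H, W, bands) cube with the index of the
    closest reference spectrum, or -1 if no angle is below threshold."""
    h, w, b = cube.shape
    flat = cube.reshape(-1, b)
    angles = np.stack([
        np.array([spectral_angle(p, r) for p in flat]) for r in references
    ], axis=1)                                # shape (pixels, n_references)
    best = angles.argmin(axis=1)
    best[angles.min(axis=1) > threshold] = -1
    return best.reshape(h, w)

# Hypothetical 3-band reflectance spectra at 1150/1200/1300 nm
lipid  = np.array([0.60, 0.35, 0.55])   # absorption dip near 1200 nm
normal = np.array([0.55, 0.50, 0.50])   # relatively flat spectrum
cube = np.empty((2, 2, 3))
cube[0, :, :] = lipid * 1.1             # brighter lipid-like pixels
cube[1, :, :] = normal * 0.9            # darker normal-like pixels
labels = sam_classify(cube, [lipid, normal], threshold=0.2)
print(labels)   # lipid-like pixels -> 0, normal-like pixels -> 1
```

    Because the angle ignores vector magnitude, the scaled pixels still map to their own reference, which is the property that lets SAM enhance plaque areas that plain intensity imaging misses.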

  1. Performance Evaluation Methods for Assistive Robotic Technology

    NASA Astrophysics Data System (ADS)

    Tsui, Katherine M.; Feil-Seifer, David J.; Matarić, Maja J.; Yanco, Holly A.

    Robots have been developed for several assistive technology domains, including intervention for Autism Spectrum Disorders, eldercare, and post-stroke rehabilitation. Assistive robots have also been used to promote independent living through the use of devices such as intelligent wheelchairs, assistive robotic arms, and external limb prostheses. Work in the broad field of assistive robotic technology can be divided into two major research phases: technology development, in which new devices, software, and interfaces are created; and clinical, in which assistive technology is applied to a given end-user population. Moving from technology development towards clinical applications is a significant challenge. Developing performance metrics for assistive robots poses a related set of challenges. In this paper, we survey several areas of assistive robotic technology in order to derive and demonstrate domain-specific means for evaluating the performance of such systems. We also present two case studies of applied performance measures and a discussion regarding the ubiquity of functional performance measures across the sampled domains. Finally, we present guidelines for incorporating human performance metrics into end-user evaluations of assistive robotic technologies.

  2. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  3. Evaluation of Performance Management in State Schools: A Case of North Cyprus

    ERIC Educational Resources Information Center

    Atamturk, Hakan; Aksal, Fahriye A.; Gazi, Zehra A.; Atamturk, A. Nurdan

    2011-01-01

    The research study aims to evaluate performance management in the state secondary schools in North Cyprus. This study is significant by shedding a light on perceptions of teachers and headmasters regarding quality control of schools through performance management. In this research, quantitative research was employed, and a survey was conducted to…

  4. Performance evaluation of two OCR systems

    SciTech Connect

    Chen, S.; Subramaniam, S.; Haralick, R.M.; Phillips, I.T.

    1994-12-31

    An experimental protocol for the performance evaluation of Optical Character Recognition (OCR) algorithms is described. The protocol is intended to serve as a model for using the University of Washington English Document Image Database-I to evaluate OCR systems. The plain text zones (without special symbols) in this database have over 2,300,000 characters. The performances of two UNIX-based OCR systems, namely Caere OCR v109a and Xerox ScanWorX v2.0, are measured. The results suggest that Caere OCR outperforms ScanWorX in terms of recognition accuracy; however, ScanWorX is more robust in the presence of image flaws.
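
    Character-level recognition accuracy of the kind reported in such OCR evaluations is typically derived from the edit distance between the recognized text and the ground truth. A minimal sketch, using a common scoring convention that may differ from the protocol's exact definition:

```python
def levenshtein(ref, hyp):
    """Minimum number of insertions, deletions, and substitutions
    needed to turn the recognized string into the reference string."""
    prev = list(range(len(hyp) + 1))
    for i, rc in enumerate(ref, 1):
        cur = [i]
        for j, hc in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (rc != hc)))   # substitution
        prev = cur
    return prev[-1]

def char_accuracy(ref, hyp):
    """Character recognition accuracy: 1 - (edit distance / ref length)."""
    return 1.0 - levenshtein(ref, hyp) / len(ref)

# One substituted character ("m" -> "n") in an 11-character reference
print(round(char_accuracy("performance", "perfornance"), 3))  # -> 0.909
```

    Summing edit distances over all zones and dividing by the total reference character count gives a corpus-level accuracy comparable across OCR systems.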

  5. A performance evaluation of biometric identification devices

    SciTech Connect

    Holmes, J.P.; Maxwell, R.L.; Wright, L.J.

    1990-06-01

    A biometric identification device is an automatic device that can verify a person's identity from a measurement of a physical feature or repeatable action of the individual. A reference measurement of the biometric is obtained when the individual is enrolled on the device. Subsequent verifications are made by comparing the submitted biometric feature against the reference sample. Sandia Laboratories has been evaluating the relative performance of several identity verifiers, using volunteer test subjects. Sandia testing methods and results are discussed.

  6. Automated Laser Seeker Performance Evaluation System (ALSPES)

    NASA Astrophysics Data System (ADS)

    Martin, Randal G.; Robinson, Elisa L.

    1988-01-01

    The Automated Laser Seeker Performance Evaluation System (ALSPES), which supports the Hellfire missile and Copperhead projectile laser seekers, is discussed. The ALSPES capabilities in manual and automatic operation are described, and the ALSPES test hardware is examined, including the computer system, the laser/attenuator, optics systems, seeker test fixture, and the measurement and test equipment. The calibration of laser energy and test signals in ALSPES is considered.

  7. Performance evaluation and clinical applications of 3D plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Decker, Ryan; Shademan, Azad; Opfermann, Justin; Leonard, Simon; Kim, Peter C. W.; Krieger, Axel

    2015-06-01

    The observation and 3D quantification of arbitrary scenes using optical imaging systems is challenging, but increasingly necessary in many fields. This paper provides a technical basis for the application of plenoptic cameras in medical and medical robotics applications, and rigorously evaluates camera integration and performance in the clinical setting. It discusses plenoptic camera calibration and setup, assesses plenoptic imaging in a clinically relevant context, and in the context of other quantitative imaging technologies. We report the methods used for camera calibration, precision and accuracy results in an ideal and simulated surgical setting. Afterwards, we report performance during a surgical task. Test results showed the average precision of the plenoptic camera to be 0.90 mm, increasing to 1.37 mm for tissue across the calibrated FOV. The ideal accuracy was 1.14 mm. The camera showed submillimeter error during a simulated surgical task.

  8. Evaluation of Fourier Transform Profilometry for Quantitative Waste Volume Determination under Simulated Hanford Tank Conditions

    SciTech Connect

    Etheridge, J.A.; Jang, P.R.; Leone, T.; Long, Z.; Norton, O.P.; Okhuysen, W.P.; Monts, D.L.; Coggins, T.L.

    2008-07-01

    The Hanford Site is currently in the process of an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the chemical makeup of the residue. The objective of Mississippi State University's Institute for Clean Energy Technology's (ICET) efforts is to develop, fabricate, and deploy inspection tools for the Hanford waste tanks that will (1) be remotely operable; (2) provide quantitative information on the amount of wastes remaining; and (3) provide information on the spatial distribution of chemical and radioactive species of interest. A collaborative arrangement has been established with the Hanford Site to develop probe-based inspection systems for deployment in the waste tanks. ICET is currently developing an in-tank inspection system based on Fourier Transform Profilometry (FTP). FTP is a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP is capable of determining the height (depth) distribution (and hence volume distribution) of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. Hence FTP has the potential to be utilized for quantitative determination of residual wastes within Hanford waste tanks. We are conducting a multi-stage performance evaluation of FTP in order to document the accuracy, precision, and operator dependence (minimal) of FTP under conditions similar to those that can be expected to pertain within Hanford waste tanks. The successive stages impose aspects that present increasing difficulty and increasingly more accurate approximations of in-tank environments. In this paper, we report our investigations of the dependence of the analyst upon FTP volume determination results and of the
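    The fringe-projection principle this record describes can be sketched in one dimension: a carrier fringe pattern is phase-modulated by surface height, and isolating the carrier's spectral lobe recovers that phase. This is a minimal illustration, not ICET's implementation; the carrier frequency, the synthetic bump profile, and the phase-to-height scale factor are all invented for the example.

    ```python
    import numpy as np

    # Minimal 1-D Fourier Transform Profilometry sketch (illustrative values).
    N = 512
    x = np.arange(N)
    f0 = 32 / N                                        # carrier frequency, cycles/pixel
    height = 2.0 * np.exp(-((x - 256) / 60.0) ** 2)    # synthetic surface bump
    phase_true = 1.5 * height                          # phase shift proportional to height (scale assumed)
    fringe = 1 + np.cos(2 * np.pi * f0 * x + phase_true)

    # Isolate the +f0 spectral lobe, then demodulate.
    F = np.fft.fft(fringe)
    k = np.fft.fftfreq(N)
    band = (k > f0 / 2) & (k < 3 * f0 / 2)             # keep only the positive carrier lobe
    analytic = np.fft.ifft(np.where(band, F, 0))
    phase = np.unwrap(np.angle(analytic)) - 2 * np.pi * f0 * x
    phase -= phase[0]                                   # reference to the flat background
    height_rec = phase / 1.5                            # invert the assumed phase-to-height scale
    print(np.max(np.abs(height_rec - height)))          # small reconstruction residual
    ```

    Integrating the recovered height map over the tank floor area is what yields the residual-waste volume estimate the record refers to.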

  9. Group 3: Performance evaluation and assessment

    NASA Technical Reports Server (NTRS)

    Frink, A.

    1981-01-01

    Line-oriented flight training provides a unique learning experience and an opportunity to look at aspects of performance that other types of training do not provide. Areas such as crew coordination, resource management, and leadership can be readily evaluated in such a format. While individual performance is of the utmost importance, crew performance deserves equal emphasis; these areas should therefore be carefully observed by the instructors as an area for discussion in the same way that individual performance is observed. To be effective, it must be accepted by the crew members and administered by the instructors as pure training: learning through experience. To keep minds open and to benefit most from the experience, both in the doing and in the follow-on discussion, it is essential that it be entered into with a feeling of freedom, openness, and enthusiasm. Reserve or defensiveness arising from concern about failure will inhibit participation.

  10. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue in preventing the occurrence of cirrhosis and initiating appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness has been studied for detecting and evaluating tumors. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of fibrosis stage.

  11. Evaluating Algorithm Performance Metrics Tailored for Prognostics

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2009-01-01

    Prognostics has taken a center stage in Condition Based Maintenance (CBM), where it is desired to estimate the Remaining Useful Life (RUL) of the system so that remedial measures may be taken in advance to avoid catastrophic events or unwanted downtimes. Validation of such predictions is an important but difficult proposition, and a lack of appropriate evaluation methods renders prognostics meaningless. Evaluation methods currently used in the research community are not standardized and in many cases do not sufficiently assess the key performance aspects expected of a prognostics algorithm. In this paper we introduce several new evaluation metrics tailored for prognostics and show that they can effectively evaluate various algorithms as compared to other conventional metrics. Specifically, four algorithms, namely Relevance Vector Machine (RVM), Gaussian Process Regression (GPR), Artificial Neural Network (ANN), and Polynomial Regression (PR), are compared. These algorithms vary in complexity and in their ability to manage uncertainty around predicted estimates. Results show that the new metrics rank these algorithms in a different manner and that, depending on the requirements and constraints, suitable metrics may be chosen. Beyond these results, these metrics offer ideas about how metrics suitable to prognostics may be designed so that the evaluation procedure can be standardized.
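    One example of a prognostics-tailored metric of the kind this record argues for is Relative Accuracy, which scores a RUL prediction against the true RUL remaining at the evaluation instant rather than as a raw error. The sketch below is illustrative only; the precise metric definitions and notation in the paper may differ.

    ```python
    def relative_accuracy(rul_true, rul_pred):
        """Relative Accuracy: 1.0 is a perfect RUL call; lower is worse."""
        return 1.0 - abs(rul_true - rul_pred) / rul_true

    # Ground-truth end of life at t = 100; evaluate a prediction made
    # halfway through life, when 50 cycles actually remain.
    print(relative_accuracy(rul_true=50, rul_pred=45))  # 0.9
    ```

    Because the score is normalized by the true RUL, the same absolute error is penalized more heavily near end of life, which is exactly when prognostic accuracy matters most.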

  12. A Method for Quantitative Evaluation of the Results of Postural Tests.

    PubMed

    Alifirova, V M; Brazovskii, K S; Zhukova, I A; Pekker, Ya S; Tolmachev, I V; Fokin, V A

    2016-07-01

    A method for quantitative evaluation of the results of postural tests is proposed. The method is based on contact-free measurements of 3D coordinates of body point movements. The result can serve as an integral test based on the Mahalanobis distance. PMID:27492397
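    An integral score based on the Mahalanobis distance, as this record describes, can be sketched as follows: pooled movement features from a test subject are scored against the distribution of a reference (control) population. The feature layout, sample sizes, and data below are invented for illustration.

    ```python
    import numpy as np

    def mahalanobis_score(sample, reference):
        """Mahalanobis distance of one observation vector from a reference
        population (rows = trials, columns = pooled coordinate features)."""
        mu = reference.mean(axis=0)
        cov = np.cov(reference, rowvar=False)
        diff = sample - mu
        return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

    # Illustrative only: 50 control trials of 6 postural features
    # derived from contact-free 3D body-point measurements.
    rng = np.random.default_rng(0)
    controls = rng.normal(0.0, 1.0, size=(50, 6))
    patient = np.full(6, 3.0)   # a strongly atypical posture vector
    print(mahalanobis_score(patient, controls))   # large distance => abnormal
    ```

    Unlike a per-feature comparison, the Mahalanobis distance accounts for correlations between features, which is what makes it usable as a single integral test statistic.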

  13. Quantitative Evaluation of a First Year Seminar Program: Relationships to Persistence and Academic Success

    ERIC Educational Resources Information Center

    Jenkins-Guarnieri, Michael A.; Horne, Melissa M.; Wallis, Aaron L.; Rings, Jeffrey A.; Vaughan, Angela L.

    2015-01-01

    In the present study, we conducted a quantitative evaluation of a novel First Year Seminar (FYS) program with a coordinated curriculum implemented at a public, four-year university to assess its potential role in undergraduate student persistence decisions and academic success. Participants were 2,188 first-year students, 342 of whom completed the…

  14. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  15. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  16. Raman spectral imaging for quantitative contaminant evaluation in skim milk powder

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study uses a point-scan Raman spectral imaging system for quantitative detection of melamine in milk powder. A sample depth of 2 mm and corresponding laser intensity of 200 mW were selected after evaluating the penetration of a 785 nm laser through milk powder. Horizontal and vertical spatial r...

  17. Diagnostic performance of quantitative coronary computed tomography angiography and quantitative coronary angiography to predict hemodynamic significance of intermediate-grade stenoses.

    PubMed

    Ghekiere, Olivier; Dewilde, Willem; Bellekens, Michel; Hoa, Denis; Couvreur, Thierry; Djekic, Julien; Coolen, Tim; Mancini, Isabelle; Vanhoenacker, Piet K; Dendale, Paul; Nchimi, Alain

    2015-12-01

    Fractional flow reserve (FFR) during invasive coronary angiography has become an established tool for guiding treatment. However, only one-third of intermediate-grade coronary artery stenoses (ICAS) are hemodynamically significant and require coronary revascularization. Additionally, the severity of stenosis visually established by coronary computed tomography angiography (CCTA) does not reliably correlate with the functional severity. Therefore, additional angiographic morphologic descriptors affecting hemodynamic significance are required. The aim was to evaluate quantitative stenosis analysis and plaque descriptors by CCTA in predicting the hemodynamic significance of ICAS and to compare them with quantitative catheter coronary angiography (QCA). QCA was performed in 65 patients (mean age 63 ± 9 years; 47 men) with 76 ICAS (40-70%) on CCTA. Plaque descriptors were determined including circumferential extent of calcification, plaque composition, minimal lumen diameter (MLD) and area, diameter stenosis percentage (Ds %), area stenosis percentage and stenosis length on CCTA. MLD and Ds % were also analyzed on QCA. FFR was measured on 52 ICAS lesions on CCTA and QCA. The diagnostic values of the best CCTA and QCA descriptors were calculated for ICAS with FFR ≤ 0.80. Of the 76 ICAS on CCTA, 52 (68%) had a Ds % between 40 and 70% on QCA. Significant intertechnique correlations were found between CCTA and QCA for MLD and Ds % (p < 0.001). In 17 (33%) of the 52 ICAS lesions on QCA, FFR values were ≤ 0.80. Calcification circumference extent (p = 0.50) and plaque composition assessment (p = 0.59) did not correlate with the hemodynamic significance. The best predictors for FFR ≤ 0.80 stenosis were ≤ 1.35 mm MLD (82% sensitivity, 66% specificity) and ≤ 2.3 mm² minimal lumen area (88% sensitivity, 60% specificity) on CCTA, and ≤ 1.1 mm MLD (59% sensitivity, 77% specificity) on QCA. Quantitative CCTA and QCA poorly predict hemodynamic significance of ICAS, though CCTA seems to
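    The sensitivity/specificity pairs quoted in this record come from thresholding a single descriptor (e.g. MLD ≤ 1.35 mm) against the FFR ≤ 0.80 reference standard. A minimal sketch of that computation; the lesion data below are invented for illustration, only the cutoff values echo the record.

    ```python
    def sens_spec(mld, ffr, cutoff=1.35):
        """Sensitivity/specificity of an 'MLD <= cutoff' rule against
        an 'FFR <= 0.80' reference (hemodynamically significant)."""
        positive = [f <= 0.80 for f in ffr]           # reference-positive lesions
        predicted = [m <= cutoff for m in mld]        # rule-positive lesions
        tp = sum(p and d for p, d in zip(positive, predicted))
        tn = sum((not p) and (not d) for p, d in zip(positive, predicted))
        sens = tp / sum(positive)
        spec = tn / (len(ffr) - sum(positive))
        return sens, spec

    # Invented toy data: six lesions with MLD (mm) and measured FFR.
    mld = [1.0, 1.2, 1.3, 1.5, 1.8, 2.0]
    ffr = [0.72, 0.78, 0.85, 0.79, 0.88, 0.90]
    print(sens_spec(mld, ffr))   # (2/3, 2/3) on this toy data
    ```

    Sweeping `cutoff` over its range and plotting sensitivity against (1 − specificity) is how the optimal thresholds reported above would typically be selected.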

  18. Evaluation of Quantitative Precipitation Estimations (QPE) and Hydrological Modelling in IFloodS Focal Basins

    NASA Astrophysics Data System (ADS)

    Wu, H.; Adler, R. F.; Huffman, G. J.; Tian, Y.

    2015-12-01

    A hydrology approach based on intercomparisons of multiple-product-driven hydrological simulations was implemented to reliably evaluate both quantitative precipitation estimations (QPE) and hydrological modelling at river basin and subbasin scales in the IFloodS focal basin, the Iowa-Cedar River Basin (ICRB), over a long-term (2002-2013) and a short-term period (Apr. 1-June 30, 2013). A reference precipitation dataset was first created for the evaluation by reversing the mean annual precipitation from independent observed streamflow and a satellite-based ET product, and then disaggregating it to 3-hourly time steps based on NLDAS2. The intercomparisons, from different perspectives, consistently showed that QPE products with less bias led to better streamflow simulations. The reference dataset led to the best overall model performance, slightly better than the original NLDAS2 (biased -4%), which in turn yielded better model performance than all other products in the long-term simulations, with daily and monthly NSC of 0.81 and 0.88, respectively, and a MARE of -2% when compared to observed streamflow at the ICRB outlet, while producing reasonable water budget simulations. Other products (CPC-U, StageIV, TMPARP, and satellite-only products) yielded progressively lower performance. All products with long-term records showed consistent merit over the IFloodS period, while Q2 appeared to be the best estimate for the short-term period. Good correlation between bias in precipitation and in streamflow was found at all scales from annual to daily, while the relation and its slope depended on season, river basin concentration (routing) time, and antecedent basin water storage. Precipitation products also showed significant impacts on streamflow and peak timing. Although satellite-only products could yield even better simulations (than conventional products) for some sub-basins in the short-term evaluation, they had less temporal-spatial quality consistency. 
While the
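    The skill scores quoted in this record can be sketched as follows. NSC is taken here to be the Nash-Sutcliffe efficiency, and MARE is read as a signed mean relative error (to match the quoted -2%); both readings are assumptions about the abstract's abbreviations, and the flow values are invented.

    ```python
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency: 1 is perfect; 0 is no better
        than always predicting the observed mean."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def mean_relative_error(obs, sim):
        """Signed mean relative error; negative indicates underestimation."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return float(np.mean((sim - obs) / obs))

    obs = np.array([10.0, 20.0, 30.0, 40.0])   # e.g. observed daily flows
    sim = np.array([12.0, 18.0, 33.0, 37.0])   # e.g. simulated flows
    print(round(float(nse(obs, sim)), 3), round(mean_relative_error(obs, sim), 5))
    ```

    Scores like these, computed per product against the same observed streamflow, are what make the product-to-product intercomparison above possible.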

  19. 40 CFR 63.5850 - How do I conduct performance tests, performance evaluations, and design evaluations?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: Reinforced Plastic Composites Production Testing and Initial Compliance Requirements § 63.5850 How do I... test, performance evaluation, and design evaluation in 40 CFR part 63, subpart SS, that applies to you... requirements in § 63.7(e)(1) and under the specific conditions that 40 CFR part 63, subpart SS, specifies....

  20. 40 CFR 63.5850 - How do I conduct performance tests, performance evaluations, and design evaluations?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... test, performance evaluation, and design evaluation in 40 CFR part 63, subpart SS, that applies to you... requirements in § 63.7(e)(1) and under the specific conditions that 40 CFR part 63, subpart SS, specifies. (c... and under the specific conditions that 40 CFR part 63, subpart SS, specifies. (d) You may not...

  1. Performance Evaluation of Triangulation Based Range Sensors

    PubMed Central

    Guidi, Gabriele; Russo, Michele; Magrassi, Grazia; Bordegoni, Monica

    2010-01-01

    The performance of 2D digital imaging systems depends on several factors related to both optical and electronic processing. These concepts have given rise to standards, conceived for photographic equipment and two-dimensional scanning systems, aimed at estimating parameters such as resolution, noise, or dynamic range. Conversely, no standard test protocols currently exist for evaluating the corresponding performance of 3D imaging systems such as laser scanners or pattern-projection range cameras. This paper is focused on investigating experimental processes for evaluating some critical parameters of 3D equipment, by extending the concepts defined by the ISO standards to the 3D domain. The experimental part of this work concerns the characterization of different range sensors through the extraction of their resolution, accuracy, and uncertainty from sets of 3D data acquisitions of specifically designed test objects whose geometrical characteristics are known in advance. The major objective of this contribution is to suggest an easy characterization process for generating a reliable comparison between the performances of different range sensors and for checking whether a specific piece of equipment is compliant with the expected characteristics. PMID:22163599

  2. Performance evaluation of an automotive thermoelectric generator

    NASA Astrophysics Data System (ADS)

    Dubitsky, Andrei O.

    Around 40% of the total fuel energy in typical internal combustion engines (ICEs) is rejected to the environment in the form of exhaust gas waste heat. Efficient recovery of this waste heat in automobiles can promise a fuel economy improvement of 5%. The thermal energy can be harvested through thermoelectric generators (TEGs) utilizing the Seebeck effect. In the present work, a versatile test bench has been designed and built in order to simulate conditions found on test vehicles. This allows experimental performance evaluation and model validation of automotive thermoelectric generators. An electrically heated exhaust gas circuit and a circulator based coolant loop enable integrated system testing of hot and cold side heat exchangers, thermoelectric modules (TEMs), and thermal interface materials at various scales. A transient thermal model of the coolant loop was created in order to design a system which can maintain constant coolant temperature under variable heat input. Additionally, as electrical heaters cannot match the transient response of an ICE, modelling was completed in order to design a relaxed exhaust flow and temperature history utilizing the system thermal lag. This profile reduced required heating power and gas flow rates by over 50%. The test bench was used to evaluate a DOE/GM initial prototype automotive TEG and validate analytical performance models. The maximum electrical power generation was found to be 54 W with a thermal conversion efficiency of 1.8%. It has been found that thermal interface management is critical for achieving maximum system performance, with novel designs being considered for further improvement.

  3. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  4. Evaluating iterative reconstruction performance in computed tomography

    SciTech Connect

    Chen, Baiyu Solomon, Justin; Ramirez Giraldo, Juan Carlos; Samei, Ehsan

    2014-12-15

    Purpose: Iterative reconstruction (IR) offers notable advantages in computed tomography (CT). However, its performance characterization is complicated by its potentially nonlinear behavior, impacting performance in terms of specific tasks. This study aimed to evaluate the performance of IR with both task-specific and task-generic strategies. Methods: The performance of IR in CT was mathematically assessed with an observer model that predicted the detection accuracy in terms of the detectability index (d′). d′ was calculated based on the properties of the image noise and resolution, the observer, and the detection task. The characterizations of image noise and resolution were extended to accommodate the nonlinearity of IR. A library of tasks was mathematically modeled at a range of sizes (radius 1–4 mm), contrast levels (10–100 HU), and edge profiles (sharp and soft). Unique d′ values were calculated for each task with respect to five radiation exposure levels (volume CT dose index, CTDIvol: 3.4–64.8 mGy) and four reconstruction algorithms (filtered backprojection reconstruction, FBP; iterative reconstruction in imaging space, IRIS; and sinogram affirmed iterative reconstruction with strengths of 3 and 5, SAFIRE3 and SAFIRE5; all provided by Siemens Healthcare, Forchheim, Germany). The d′ values were translated into the areas under the receiver operating characteristic curve (AUC) to represent human observer performance. For each task and reconstruction algorithm, a threshold dose was derived as the minimum dose required to achieve a threshold AUC of 0.9. A task-specific dose reduction potential of IR was calculated as the difference between the threshold doses for IR and FBP. A task-generic comparison was further made between IR and FBP in terms of the percent of all tasks yielding an AUC higher than the threshold. Results: IR required less dose than FBP to achieve the threshold AUC. In general, SAFIRE5 showed the most significant dose reduction
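    Under an equal-variance Gaussian observer model, the standard translation from detectability index to AUC is AUC = Φ(d′/√2), with Φ the standard normal CDF; whether the study used exactly this observer variant is an assumption here. A short sketch of the conversion and of recovering the d′ that meets the AUC = 0.9 threshold mentioned above:

    ```python
    from math import erf

    def auc_from_dprime(d):
        # Equal-variance Gaussian observer: AUC = Phi(d'/sqrt(2)),
        # which simplifies to 0.5 * (1 + erf(d'/2)).
        return 0.5 * (1.0 + erf(d / 2.0))

    # Invert numerically by bisection: the d' needed for AUC = 0.9.
    lo, hi = 0.0, 5.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if auc_from_dprime(mid) < 0.9:
            lo = mid
        else:
            hi = mid
    d_threshold = 0.5 * (lo + hi)
    print(round(d_threshold, 3))  # ≈ 1.812
    ```

    The per-task threshold dose in the study is then the lowest dose whose computed d′ reaches this value, and the dose-reduction potential is the difference of those thresholds between IR and FBP.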

  5. Evaluation of Large Scale Quantitative Proteomic Assay Development Using Peptide Affinity-based Mass Spectrometry*

    PubMed Central

    Whiteaker, Jeffrey R.; Zhao, Lei; Abbatiello, Susan E.; Burgess, Michael; Kuhn, Eric; Lin, ChenWei; Pope, Matthew E.; Razavi, Morteza; Anderson, N. Leigh; Pearson, Terry W.; Carr, Steven A.; Paulovich, Amanda G.

    2011-01-01

    Stable isotope standards and capture by antipeptide antibodies (SISCAPA) couples affinity enrichment of peptides with stable isotope dilution and detection by multiple reaction monitoring mass spectrometry to provide quantitative measurement of peptides as surrogates for their respective proteins. In this report, we describe a feasibility study to determine the success rate for production of suitable antibodies for SISCAPA assays in order to inform strategies for large-scale assay development. A workflow was designed that included a multiplex immunization strategy in which up to five proteotypic peptides from a single protein target were used to immunize individual rabbits. A total of 403 proteotypic tryptic peptides representing 89 protein targets were used as immunogens. Antipeptide antibody titers were measured by ELISA and 220 antipeptide antibodies representing 89 proteins were chosen for affinity purification. These antibodies were characterized with respect to their performance in SISCAPA-multiple reaction monitoring assays using trypsin-digested human plasma matrix. More than half of the assays generated were capable of detecting the target peptide at concentrations of less than 0.5 fmol/μl in human plasma, corresponding to protein concentrations of less than 100 ng/ml. The strategy of multiplexing five peptide immunogens was successful in generating a working assay for 100% of the targeted proteins in this evaluation study. These results indicate it is feasible for a single laboratory to develop hundreds of assays per year and allow planning for cost-effective generation of SISCAPA assays. PMID:21245105

  6. Quantitative evaluation of radiation-induced changes in sperm morphology and chromatin distribution

    SciTech Connect

    Aubele, M.; Juetting, U.R.; Rodenacker, K.; Gais, P.; Burger, G.; Hacker-Klom, U.

    1990-01-01

    Sperm head cytometry provides a useful assay for the detection of radiation-induced damage in mouse germ cells. Exposure of the gonads to radiation is known to lead to an increase of diploid and higher polyploid sperm and of sperm with head shape abnormalities. In the pilot studies reported here quantitative analysis of the total DNA content, the morphology, and the chromatin distribution of mouse sperm was performed. The goal was to evaluate the discriminative power of features derived by high resolution image cytometry in distinguishing sperm of control and irradiated mice. Our results suggest that besides the induction of the above mentioned variations in DNA content and shape of sperm head, changes of the nonhomogeneous chromatin distribution within the sperm may also be used to quantify the radiation effect on sperm cells. Whereas the chromatin distribution features show larger variations for sperm 21 days after exposure (dpr), the shape parameters seem to be more important to discriminate sperm 35 dpr. This may be explained by differentiation processes, which take place in different stages during mouse spermatogenesis.

  7. Flexor and extensor muscle tone evaluated using the quantitative pendulum test in stroke and parkinsonian patients.

    PubMed

    Huang, Han-Wei; Ju, Ming-Shaung; Lin, Chou-Ching K

    2016-05-01

    The aim of this study was to evaluate the flexor and extensor muscle tone of the upper limbs in patients with spasticity or rigidity and to investigate the difference in hypertonia between spasticity and rigidity. The two experimental groups consisted of stroke patients and parkinsonian patients. The control group consisted of age and sex-matched normal subjects. Quantitative upper limb pendulum tests starting from both flexed and extended joint positions were conducted. System identification with a simple linear model was performed and model parameters were derived. The differences between the three groups and two starting positions were investigated by these model parameters and tested by two-way analysis of variance. In total, 57 subjects were recruited, including 22 controls, 14 stroke patients and 21 parkinsonian patients. While stiffness coefficient showed no difference among groups, the number of swings, relaxation index and damping coefficient showed changes suggesting significant hypertonia in the two patient groups. There was no difference between these two patient groups. The test starting from the extended position constantly manifested higher muscle tone in all three groups. In conclusion, the hypertonia of parkinsonian and stroke patients could not be differentiated by the modified pendulum test; the elbow extensors showed a higher muscle tone in both control and patient groups; and hypertonia of both parkinsonian and stroke patients is velocity dependent.
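    The "simple linear model" fitted to pendulum-test data is commonly a second-order limb model, J·θ″ + B·θ′ + K·θ = 0 (inertia, damping, stiffness). A hedged sketch with illustrative parameter values (not taken from the study) showing how heavier damping, as in hypertonia, reduces the number of swings:

    ```python
    import numpy as np

    def simulate(J=0.05, B=0.02, K=1.0, theta0=1.0, dt=1e-3, T=10.0):
        """Semi-implicit Euler integration of J*th'' + B*th' + K*th = 0,
        released from rest at angle theta0 (all values illustrative)."""
        theta, omega, trace = theta0, 0.0, []
        for _ in range(int(T / dt)):
            alpha = -(B * omega + K * theta) / J
            omega += alpha * dt
            theta += omega * dt
            trace.append(theta)
        return np.array(trace)

    def n_swings(trace, floor=0.05):
        """Count direction reversals whose amplitude still exceeds `floor`."""
        ext = np.diff(np.sign(np.diff(trace))) != 0     # local extrema
        peaks = np.abs(trace[1:-1][ext])
        return int(np.sum(peaks > floor))

    normal = simulate(B=0.02)       # lightly damped limb: many swings
    hypertonic = simulate(B=0.25)   # heavier damping: swings die out fast
    print(n_swings(normal), n_swings(hypertonic))
    ```

    Fitting J, B, and K to a recorded trace (system identification, as in the study) turns the swing count, relaxation index, and damping coefficient into quantitative muscle-tone measures.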

  8. The Evaluation and Quantitation of Dihydrogen Metabolism Using Deuterium Isotope in Rats

    PubMed Central

    Hyspler, Radomir; Ticha, Alena; Schierbeek, Henk; Galkin, Alexander; Zadak, Zdenek

    2015-01-01

    Purpose Despite the significant interest in molecular hydrogen as an antioxidant in the last eight years, its quantitative metabolic parameters in vivo are still lacking, as is an appropriate method for determination of hydrogen effectivity in the mammalian organism under various conditions. Basic Procedures Intraperitoneally-applied deuterium gas was used as a metabolic tracer and deuterium enrichment was determined in the body water pool. Also, in vitro experiments were performed using bovine heart submitochondrial particles to evaluate superoxide formation in Complex I of the respiratory chain. Main Findings A significant oxidation of about 10% of the applied dose was found under physiological conditions in rats, proving its antioxidant properties. Hypoxia or endotoxin application did not exert any effect, whilst pure oxygen inhalation reduced deuterium oxidation. During in vitro experiments, a significant reduction of superoxide formation by Complex I of the respiratory chain was found under the influence of hydrogen. The possible molecular mechanisms of the beneficial effects of hydrogen are discussed, with an emphasis on the role of iron sulphur clusters in reactive oxygen species generation and on iron species-dihydrogen interaction. Principal Conclusions According to our findings, hydrogen may be an efficient, non-toxic, highly bioavailable and low-cost antioxidant supplement for patients with pathological conditions involving ROS-induced oxidative stress. PMID:26103048

  10. A performance evaluation of personnel identity verifiers

    SciTech Connect

    Maxwell, R.L.; Wright, L.J.

    1987-01-01

    Personnel identity verification devices, which are based on the examination and assessment of a body feature or a unique repeatable personal action, are steadily improving. These biometric devices are becoming more practical with respect to accuracy, speed, user compatibility, reliability and cost, but more development is necessary to satisfy the varied and sometimes ill-defined future requirements of the security industry. In an attempt to maintain an awareness of the availability and the capabilities of identity verifiers for the DOE security community, Sandia Laboratories continue to comparatively evaluate the capabilities and improvements of developing devices. An evaluation of several recently available verifiers is discussed in this paper. Operating environments and procedures more typical of physical access control use can reveal performance substantially different from the basic laboratory tests.

  12. Seismic Performance Evaluation of Concentrically Braced Frames

    NASA Astrophysics Data System (ADS)

    Hsiao, Po-Chien

Concentrically braced frames (CBFs) are broadly used as lateral-load resisting systems in buildings throughout the US. In high seismic regions, special concentrically braced frames (SCBFs) are used where ductility under seismic loading is necessary. Their large elastic stiffness and strength efficiently sustain the seismic demands of smaller, more frequent earthquakes. During large, infrequent earthquakes, SCBFs exhibit highly nonlinear behavior due to brace buckling and yielding and the inelastic behavior induced by secondary deformation of the framing system. These response modes reduce the system demands relative to an elastic system without supplemental damping. In design, these reduced demands are estimated using a response modification coefficient, commonly termed the R factor. R-factor values are important to the seismic performance of a building. Procedures put forth in FEMA P695 were developed to evaluate R factors through a formalized procedure, with the objective of a consistent level of collapse potential across all building types. The primary objective of this research was to evaluate the seismic performance of SCBFs. To achieve this goal, an improved model, including a proposed gusset plate connection model for SCBFs that permits accurate simulation of inelastic deformations of the brace, gusset plate connections, beams, and columns, as well as brace fracture, was developed and validated against a large number of experiments. Response history analyses were conducted using the validated model. A series of SCBF buildings of different story heights were designed and evaluated. The FEMA P695 method and an alternate procedure were applied to SCBFs and to NCBFs, which are designed without ductile detailing. The evaluation using the P695 method contradicts both the alternate evaluation procedure and current knowledge, which hold that short-story SCBF structures are more vulnerable than their taller counterparts and that NCBFs are more vulnerable than SCBFs.

  13. Quantitative evaluation of oligonucleotide surface concentrations using polymerization-based amplification

    PubMed Central

    Hansen, Ryan R.; Avens, Heather J.; Shenoy, Raveesh

    2008-01-01

Quantitative evaluation of minimal polynucleotide concentrations has become a critical analysis among a myriad of applications found in molecular diagnostic technology. Development of high-throughput, nonenzymatic assays that are sensitive, quantitative and yet feasible for point-of-care testing is thus beneficial for routine implementation. Here, we develop a nonenzymatic method for quantifying surface concentrations of labeled DNA targets by coupling regulated amounts of polymer growth to complementary biomolecular binding on array-based biochips. Polymer film thickness measurements in the 20–220 nm range vary logarithmically with labeled DNA surface concentrations over two orders of magnitude, with a lower limit of quantitation of 60 molecules/μm² (∼10⁶ target molecules). In an effort to develop this amplification method towards compatibility with fluorescence-based methods of characterization, incorporation of fluorescent nanoparticles into the polymer films is also evaluated. The resulting gains in fluorescent signal enable quantification using detection instrumentation amenable to point-of-care settings. [Figure: polymerization-based amplification for quantitative evaluation of 3′-biotinylated oligonucleotide surface concentrations] PMID:18661123

  14. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

Objective: Benzodiazepines are among the most frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because variations in benzodiazepine concentrations in biological samples caused by bleeding, postmortem changes, and redistribution can bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out by liquid–liquid extraction with n-hexane:ethyl acetate, with subsequent detection by high-performance liquid chromatography coupled to a diode array detector. The method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear calibration curve for each drug was obtained within the range of 30–3000 ng/mL, with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peaks due to interfering substances were observed in the samples. Conclusion: The present method is selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory. PMID:27635251
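
    For context, the linearity check described above amounts to an ordinary least-squares calibration over the working range, with the coefficient of determination as the acceptance criterion. A sketch with made-up peak-area data over the same 30–3000 ng/mL range (the numbers are hypothetical, not the study's):

    ```python
    import numpy as np

    # Hypothetical calibration: spiked concentrations (ng/mL) vs. HPLC peak area,
    # with small residuals added to mimic instrument noise
    conc = np.array([30.0, 100.0, 300.0, 1000.0, 3000.0])
    area = 1.8 * conc + 5.0 + np.array([2.0, -3.0, 4.0, -6.0, 5.0])

    # Least-squares line and coefficient of determination (linearity check)
    slope, intercept = np.polyfit(conc, area, 1)
    pred = slope * conc + intercept
    r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)
    ```

    A calibration passing the criterion in the abstract would show r² (and hence the correlation coefficient) above 0.99 across all five levels.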

  16. Performance evaluation of swimmers: scientific tools.

    PubMed

    Smith, David J; Norris, Stephen R; Hogg, John M

    2002-01-01

The purpose of this article is to provide a critical commentary on the physiological and psychological tools used in the evaluation of swimmers. The first-level evaluation should be the competitive performance itself, since it is at this juncture that all elements interplay and provide the 'highest form' of assessment. Competition video analysis of major swimming events has progressed to the point where it has become an indispensable tool for coaches, athletes, sport scientists, equipment manufacturers, and even the media. The breakdown of each swimming performance at the individual level into its constituent parts allows for comparison with the predicted or sought-after execution, as well as with identified world competition levels. The use of other 'on-going' monitoring protocols to evaluate training efficacy typically involves criterion 'effort' swims and specific training sets where certain aspects are scrutinised in depth. Physiological parameters that are often examined alongside swimming speed and technical aspects include oxygen uptake, heart rate, blood lactate concentration, and blood lactate accumulation and clearance rates. Simple and more complex procedures are available for in-training examination of technical issues. Strength and power may be quantified via several modalities although, typically, tethered swimming and dry-land isokinetic devices are used. The availability of a 'swimming flume' does afford coaches and sport scientists a higher degree of flexibility in the type of monitoring and evaluation that can be undertaken. There is convincing evidence that athletes can be distinguished on the basis of their psychological skills and emotional competencies and that these differences become further accentuated as the athlete improves. No matter what test format is used (physiological, biomechanical or psychological), similar criteria of validity must be ensured so that the test provides useful and associative information.

  18. Comparative Evaluation of Software Features and Performances.

    PubMed

    Cecconi, Daniela

    2016-01-01

    Analysis of two-dimensional gel images is a crucial step for the determination of changes in the protein expression, but at present, it still represents one of the bottlenecks in 2-DE studies. Over the years, different commercial and academic software packages have been developed for the analysis of 2-DE images. Each of these shows different advantageous characteristics in terms of quality of analysis. In this chapter, the characteristics of the different commercial software packages are compared in order to evaluate their main features and performances.

  19. Performance evaluation of TCP over ABT protocols

    NASA Astrophysics Data System (ADS)

    Ata, Shingo; Murata, Masayuki; Miyahara, Hideo

    1998-10-01

ABT is promising for effectively transferring highly bursty data traffic in ATM networks. Most past studies have focused on the data transfer capability of ABT within the ATM layer. In practice, however, the upper-layer transport protocol must also be considered, since it provides its own network congestion control mechanism. One such example is TCP, which is now widely used in the Internet. In this paper, we evaluate the performance of TCP over ABT protocols. Simulation results show that the retransmission mechanism of ABT can effectively overlay the TCP congestion control mechanism, so that TCP operates in a stable fashion and serves chiefly as an error recovery mechanism.

  20. MSAD actuator solenoid, performance evaluation and modification

    SciTech Connect

    North, G.

    1983-04-19

    A small conical-faced solenoid actuator is tested in order to develop design criteria for improved performance including increased pull sensitivity. In addition to increased pull for the normal electrical inputs, a reduction in pull response to short duration electrical noise pulses is also required. Along with dynamic testing of the solenoid, a linear circuit model is developed. This model permits calculation of the dynamic forces and currents which can be expected with various electrical inputs. The model parameters are related to the actual solenoid and allow the effects of winding density and shading rings to be evaluated.

  1. Sandia solar dryer: preliminary performance evaluation

    SciTech Connect

    Glass, J.S.; Holm-Hansen, T.; Tills, J.; Pierce, J.D.

    1986-01-01

    Preliminary performance evaluations were conducted with the prototype modular solar dryer for wastewater sludge at Sandia National Laboratories. Operational parameters which appeared to influence sludge drying efficiency included condensation system capacity and air turbulence at the sludge surface. Sludge heating profiles showed dependencies on sludge moisture content, sludge depth and seasonal variability in available solar energy. Heat-pasteurization of sludge in the module was demonstrated in two dynamic-processing experiments. Through balanced utilization of drying and heating functions, the facility has the potential for year-round sludge treatment application.

  2. Performance Evaluation of Phasor Measurement Systems

    SciTech Connect

    Huang, Zhenyu; Kasztenny, Bogdan; Madani, Vahid; Martin, Kenneth E.; Meliopoulos, Sakis; Novosel, Damir; Stenbakken, Jerry

    2008-07-20

    After two decades of phasor network deployment, phasor measurements are now available at many major substations and power plants. The North American SynchroPhasor Initiative (NASPI), supported by both the US Department of Energy and the North American Electricity Reliability Council (NERC), provides a forum to facilitate the efforts in phasor technology in North America. Phasor applications have been explored and some are in today’s utility practice. IEEE C37.118 Standard is a milestone in standardizing phasor measurements and defining performance requirements. To comply with IEEE C37.118 and to better understand the impact of phasor quality on applications, the NASPI Performance and Standards Task Team (PSTT) initiated and accomplished the development of two important documents to address characterization of PMUs and instrumentation channels, which leverage prior work (esp. in WECC) and international experience. This paper summarizes the accomplished PSTT work and presents the methods for phasor measurement evaluation.
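
    IEEE C37.118 defines how synchrophasors are represented and bounds estimation error (via total vector error); the quantity under evaluation is the phasor estimate itself. As a rough illustration of that quantity, not the standard's compliance tests, a textbook one-cycle DFT phasor estimator can be sketched as follows (sampling rate and signal are hypothetical):

    ```python
    import numpy as np

    def phasor_estimate(samples, fs, f0=60.0):
        """One-cycle DFT estimate of a synchrophasor: RMS magnitude and
        phase angle (degrees) of the fundamental component."""
        n = int(round(fs / f0))                # samples per nominal cycle
        k = np.arange(n)
        x = np.asarray(samples[:n], dtype=float)
        # correlate one cycle with the fundamental-frequency complex exponential
        ph = (2.0 / n) * np.sum(x * np.exp(-2j * np.pi * k / n))
        return np.abs(ph) / np.sqrt(2.0), np.degrees(np.angle(ph))

    fs = 1920.0                                # 32 samples per 60 Hz cycle
    t = np.arange(32) / fs
    signal = 100.0 * np.cos(2 * np.pi * 60.0 * t + np.radians(30.0))

    mag_rms, angle_deg = phasor_estimate(signal, fs)
    ```

    For this clean 100 V-peak, 30-degree test signal the estimator recovers the RMS magnitude and angle exactly; PMU characterization of the kind the PSTT documents address concerns how such estimates degrade under off-nominal frequency, harmonics, and instrumentation-channel distortion.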

  3. Factory performance evaluations of engineering controls for asphalt paving equipment.

    PubMed

    Mead, K R; Mickelsen, R L; Brumagin, T E

    1999-08-01

    This article describes a unique analytical tool to assist the development and implementation of engineering controls for the asphalt paving industry. Through an agreement with the U.S. Department of Transportation, the National Asphalt Pavement Association (NAPA) requested that the National Institute for Occupational Safety and Health (NIOSH) assist U.S. manufacturers of asphalt paving equipment with the development and evaluation of engineering controls. The intended function of the controls was to capture and remove asphalt emissions generated during the paving process. NIOSH engineers developed a protocol to evaluate prototype engineering controls using qualitative smoke and quantitative tracer gas methods. Video recordings documented each prototype's ability to capture theatrical smoke under "managed" indoor conditions. Sulfur hexafluoride (SF6), released as a tracer gas, enabled quantification of the capture efficiency and exhaust flow rate for each prototype. During indoor evaluations, individual prototypes' capture efficiencies averaged from 7 percent to 100 percent. Outdoor evaluations resulted in average capture efficiencies ranging from 81 percent down to 1 percent as wind gusts disrupted the ability of the controls to capture the SF6. The tracer gas testing protocol successfully revealed deficiencies in prototype designs which otherwise may have gone undetected. It also showed that the combination of a good enclosure and higher exhaust ventilation rate provided the highest capture efficiency. Some manufacturers used the stationary evaluation results to compare performances among multiple hood designs. All the manufacturers identified areas where their prototype designs were susceptible to cross-draft interferences. These stationary performance evaluations proved to be a valuable method to identify strengths and weaknesses in individual designs and subsequently optimize those designs prior to expensive analytical field studies. PMID:10462852
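
    The SF6 capture efficiency reduces to a mass balance: the fraction of the released tracer recovered in the exhaust stream. A minimal sketch with hypothetical flow numbers (the values are illustrative, not from the study):

    ```python
    def capture_efficiency(release_lpm, exhaust_m3_per_min, exhaust_ppm):
        """Fraction of a constant-rate SF6 release recovered in the exhaust:
        (exhaust airflow x tracer concentration) / tracer release rate."""
        # tracer volume flow carried by the exhaust air, in L/min
        captured_lpm = exhaust_m3_per_min * 1000.0 * exhaust_ppm * 1e-6
        return captured_lpm / release_lpm

    # e.g. 0.1 L/min SF6 released; exhaust moves 50 m^3/min at 1.6 ppm SF6
    eff = capture_efficiency(0.1, 50.0, 1.6)   # about 0.8, i.e. 80% capture
    ```

    Under this mass balance, wind gusts that carry tracer past the hood show up directly as a lower measured concentration in the exhaust, which matches the drop from indoor to outdoor capture efficiencies reported above.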

  4. Evaluating cryostat performance for naval applications

    NASA Astrophysics Data System (ADS)

    Knoll, David; Willen, Dag; Fesmire, James; Johnson, Wesley; Smith, Jonathan; Meneghelli, Barry; Demko, Jonathan; George, Daniel; Fowler, Brian; Huber, Patti

    2012-06-01

The Navy intends to use High Temperature Superconducting Degaussing (HTSDG) coil systems on future Navy platforms. The Navy Metalworking Center (NMC) is leading a team that is addressing cryostat configuration and manufacturing issues associated with fabricating long lengths of flexible, vacuum-jacketed cryostats that meet Navy shipboard performance requirements. The project includes provisions to evaluate reliability performance, as well as proofing of fabrication techniques. Navy cryostat performance specifications include less than 1 W/m heat loss, 2 MPa working pressure, and a 25-year vacuum life. Cryostat multilayer insulation (MLI) systems developed on the project have been validated using a standardized cryogenic test facility and implemented on 5-meter-long test samples. Performance data from these test samples, which were characterized using both LN2 boiloff and flow-through measurement techniques, will be presented. NMC is working with an Integrated Project Team consisting of Naval Sea Systems Command, Naval Surface Warfare Center-Carderock Division, Southwire Company, nkt cables, Oak Ridge National Laboratory (ORNL), ASRC Aerospace, and NASA Kennedy Space Center (NASA-KSC) to complete these efforts. Approved for public release; distribution is unlimited. This material is submitted with the understanding that right of reproduction for governmental purposes is reserved for the Office of Naval Research, Arlington, Virginia 22203-1995.

  5. Performance evaluation of bound diamond ring tools

    SciTech Connect

    Piscotty, M.A.; Taylor, J.S.; Blaedel, K.L.

    1995-07-14

LLNL is collaborating with the Center for Optics Manufacturing (COM) and the American Precision Optics Manufacturers Association (APOMA) to optimize bound diamond ring tools for the spherical generation of high quality optical surfaces. An important element of this work is establishing an experimentally verified link between tooling properties and workpiece quality indicators such as roughness, subsurface damage and removal rate. In this paper, we report on a standardized methodology for assessing ring tool performance and its preliminary application to a set of commercially available wheels. Our goals are to (1) assist optics manufacturers (users of the ring tools) in evaluating tools and in assessing their applicability for a given operation, and (2) provide performance feedback to wheel manufacturers to help optimize tooling for the optics industry. Our paper includes measurements of wheel performance for three 2-4 micron diamond bronze-bond wheels that were supplied by different manufacturers to nominally identical specifications. Preliminary data suggest that the differences in performance among the wheels were small.

  6. 40 CFR 35.9055 - Evaluation of recipient performance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Evaluation of recipient performance. 35... Evaluation of recipient performance. The Regional Administrator will oversee each recipient's performance... schedule for evaluation in the assistance agreement and will evaluate recipient performance and...

  7. 48 CFR 436.201 - Evaluation of contractor performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Construction 436.201 Evaluation of contractor performance. Preparation of performance evaluation reports. In addition to the requirements of FAR 36.201, performance evaluation reports shall be prepared for indefinite... of services to be ordered exceeds $500,000.00. For these contracts, performance evaluation...

  8. Performance Assessment of Human and Cattle Associated Quantitative Real-time PCR Assays - slides

    EPA Science Inventory

The presentation covers (1) a single-laboratory performance assessment of human- and cattle-associated PCR assays and (2) a field study evaluating two human fecal waste management practices in an Ohio watershed.

  9. Style-independent document labeling: design and performance evaluation

    NASA Astrophysics Data System (ADS)

    Mao, Song; Kim, Jong Woo; Thoma, George R.

    2003-12-01

The Medical Article Records System, or MARS, has been developed at the U.S. National Library of Medicine (NLM) for automated entry of bibliographic information from medical journals into MEDLINE, the premier bibliographic citation database at NLM. Currently, a rule-based algorithm (called ZoneCzar) is used for labeling important bibliographic fields (title, author, affiliation, and abstract) on medical journal article page images. While rules have been created for medical journals with regular layout types, new rules have to be created manually for any input journal with an arbitrary or new layout type. It is therefore of interest to label journal articles independently of their layout styles. In this paper, we first describe a system (called ZoneMatch) for automated generation of crucial geometric and non-geometric features of important bibliographic fields based on string-matching and clustering techniques. The rule-based algorithm is then modified to use these features to perform style-independent labeling. We then describe a performance evaluation method for quantitatively evaluating our algorithm and characterizing its error distributions. Experimental results show that the labeling performance of the rule-based algorithm is significantly improved when the generated features are used.

  10. Quantitative evaluation of simulated human enamel caries kinetics using photothermal radiometry and modulated luminescence

    NASA Astrophysics Data System (ADS)

    Hellen, Adam; Mandelis, Andreas; Finer, Yoav; Amaechi, Bennett T.

    2011-03-01

Photothermal radiometry and modulated luminescence (PTR-LUM) is a non-destructive methodology applied to the detection, monitoring and quantification of dental caries. The purpose of this study was to evaluate the efficacy of PTR-LUM in detecting incipient caries lesions and quantifying opto-thermophysical properties as a function of treatment time. Extracted human molars (n=15) were exposed to an acid demineralization gel (pH 4.5) for 10 or 40 days in order to simulate incipient caries lesions. PTR-LUM frequency scans (1 Hz - 1 kHz) were performed prior to and during demineralization. Transverse micro-radiography (TMR) analysis followed at treatment conclusion. A coupled diffuse-photon-density-wave and thermal-wave theoretical model was applied to PTR experimental amplitude and phase data across the frequency range of 4 Hz - 354 Hz to quantitatively evaluate changes in the thermal and optical properties of sound and demineralized enamel. Excellent fits with small residuals were observed between experimental and theoretical data, illustrating the robustness of the computational algorithm. Increased scattering coefficients and poorer thermophysical properties were characteristic of demineralized lesion bodies. Enhanced optical scattering in demineralized lesions resulted in poorer luminescence yield due to scattering of both incident and converted luminescent photons. Differences in the rate of lesion progression between the 10-day and 40-day samples point to a continuum of surface- and diffusion-controlled mechanisms of lesion formation. PTR-LUM sensitivity to changes in tooth mineralization, coupled with opto-thermophysical property extraction, illustrates the technique's potential for non-destructive quantification of enamel caries.

  11. Quantitative Evaluation of Liver Fibrosis Using Multi-Rayleigh Model with Hypoechoic Component

    NASA Astrophysics Data System (ADS)

    Higuchi, Tatsuya; Hirata, Shinnosuke; Yamaguchi, Tadashi; Hachiya, Hiroyuki

    2013-07-01

To realize a quantitative diagnostic method for liver fibrosis, we have been developing a modeling method for the probability density function of the echo amplitude. In our previous model, the approximation accuracy was insufficient in regions with hypoechoic tissue such as a nodule or a blood vessel. In this study, we examined a multi-Rayleigh model with three Rayleigh distributions, corresponding to the echo-amplitude distributions of hypoechoic, normal, and fibrous tissue. We showed quantitatively that the proposed model can represent the amplitude distribution of liver fibrosis echo data containing hypoechoic tissue, using the Kullback-Leibler (KL) divergence, an index of the difference between two probability distributions. We also found that fibrosis indices can be estimated stably with the proposed model even if hypoechoic tissue is included in the region of interest. We conclude that the multi-Rayleigh model with three components can be used to evaluate the progression of liver fibrosis quantitatively.
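
    As an illustration of the approach (the component weights and scale parameters below are invented, not the paper's fits), a three-component Rayleigh mixture and a discretized KL divergence between two sampled amplitude distributions can be written as:

    ```python
    import numpy as np

    def multi_rayleigh_pdf(x, weights, sigmas):
        """Three-component Rayleigh mixture for the echo-amplitude pdf:
        hypoechoic, normal, and fibrous tissue components."""
        p = np.zeros_like(x)
        for w, s in zip(weights, sigmas):
            p += w * (x / s**2) * np.exp(-x**2 / (2 * s**2))
        return p

    def kl_divergence(p, q, dx):
        """Discretized KL divergence D(p || q) between two sampled pdfs."""
        mask = (p > 0) & (q > 0)
        return np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx

    x = np.linspace(1e-6, 5.0, 2000)
    dx = x[1] - x[0]
    # hypothetical three-component fit vs. a single-Rayleigh reference
    p = multi_rayleigh_pdf(x, [0.2, 0.5, 0.3], [0.3, 0.7, 1.2])
    q = multi_rayleigh_pdf(x, [0.0, 1.0, 0.0], [0.7, 0.7, 0.7])
    ```

    A small KL divergence between the fitted mixture and the measured amplitude histogram indicates an adequate model, which is how the paper quantifies approximation accuracy.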

  12. Combining qualitative and quantitative imaging evaluation for the assessment of genomic DNA integrity: The SPIDIA experience.

    PubMed

    Ciniselli, Chiara Maura; Pizzamiglio, Sara; Malentacchi, Francesca; Gelmini, Stefania; Pazzagli, Mario; Hartmann, Christina C; Ibrahim-Gawel, Hady; Verderio, Paolo

    2015-06-15

    In this note, we present an ad hoc procedure that combines qualitative (visual evaluation) and quantitative (ImageJ software) evaluations of Pulsed-Field Gel Electrophoresis (PFGE) images to assess the genomic DNA (gDNA) integrity of analyzed samples. This procedure could be suitable for the analysis of a large number of images by taking into consideration both the expertise of researchers and the objectiveness of the software. We applied this procedure on the first SPIDIA DNA External Quality Assessment (EQA) samples. Results show that the classification obtained by this ad hoc procedure allows a more accurate evaluation of gDNA integrity with respect to a single approach.

  13. Evaluation of stroke performance in tennis.

    PubMed

    Vergauwen, L; Spaepen, A J; Lefevre, J; Hespel, P

    1998-08-01

In the present studies, the Leuven Tennis Performance Test (LTPT), a newly developed test procedure to measure stroke performance in match-like conditions in elite tennis players, was evaluated as to its value for research purposes. The LTPT is conducted on a regular tennis court. It consists of first and second services, and of returning balls projected by a machine to target zones indicated by a lighted sign. Neutral, defensive, and offensive tactical situations are elicited by appropriately programming the machine. Stroke quality is determined from simultaneous measurements of error rate, ball velocity, and precision of ball placement. A velocity/precision (VP) index and a velocity/precision/error (VPE) index are also calculated. The validity and sensitivity of the LTPT were determined by verifying whether LTPT scores reflect minor differences in tennis ranking on the one hand and the effects of fatigue on the other hand. Compared with lower ranked players, higher ones made fewer errors (P < 0.05). In addition, stroke velocity was higher (P < 0.05), and lateral stroke precision, VP, and VPE scores were better (P < 0.05) in the latter. Furthermore, fatigue induced by a prolonged tennis load increased (P < 0.05) error rate and decreased (P < 0.05) stroke velocity and the VP and VPE indices. It is concluded that the LTPT is an accurate, reliable, and valid instrument for the evaluation of stroke quality in high-level tennis players. PMID:9710870

  14. Evaluation of quantitative 90Y SPECT based on experimental phantom studies

    NASA Astrophysics Data System (ADS)

    Minarik, D.; Sjögreen Gleisner, K.; Ljungberg, M.

    2008-10-01

    In SPECT imaging of pure beta emitters, such as 90Y, the acquired spectrum is very complex, which increases the demands on the imaging protocol and the reconstruction. In this work, we have evaluated the quantitative accuracy of bremsstrahlung SPECT with focus on the reconstruction algorithm including model-based attenuation, scatter and collimator-detector response (CDR) compensations. The scatter and CDR compensation methods require pre-calculated point-spread functions, which were generated with the SIMIND MC program. The SIMIND program is dedicated for simulation of scintillation camera imaging and only handles photons. The aim of this work was therefore twofold. The first aim was to implement simulation of bremsstrahlung imaging into the SIMIND code and to validate simulations against experimental measurements. The second was to investigate the quality of bremsstrahlung SPECT imaging and to evaluate the possibility of quantifying the activity in differently shaped sources. In addition, a feasibility test was performed on a patient that underwent treatment with 90Y-Ibritumomab tiuxetan (Zevalin®). The MCNPX MC program was used to generate bremsstrahlung photon spectra which were used as source input in the SIMIND program. The obtained bremsstrahlung spectra were separately validated by experimental measurement using a HPGe detector. Validation of the SIMIND generated images was done by a comparison to gamma camera measurements of a syringe containing 90Y. Results showed a slight deviation between simulations and measurements in image regions outside the source, but the agreement was sufficient for the purpose of generating scatter and CDR kernels. For the bremsstrahlung SPECT experiment, the RSD torso phantom with 90Y in the liver insert was measured with and without background activities. Projection data were obtained using a GE VH/Hawkeye system. Image reconstruction was performed by using the OSEM algorithm with and without different combinations of model

  15. Evaluation of Infiltration Basin Performance in Florida

    NASA Astrophysics Data System (ADS)

    Bean, E.

    2012-12-01

    Infiltration basins are commonly used to reduce or eliminate urban runoff in Florida. For permitting purposes, basins are required to recover their design volume, the runoff from a one-inch rainfall event, within 72 hours to satisfy the design criteria, and are not required to account for groundwater mounding if volume recovery can be accomplished by filling of soil porosity by vertical infiltration below the basin surface. Forty infiltration basins were included in a field study to determine whether basin hydraulic performance differed significantly from design. Basins ranged in age from less than one year to over twenty years, and land uses were equally divided between Florida Department of Transportation (FDOT) and residential developments. Six test sites within each basin were typically selected to measure infiltration rates using a double-ring infiltrometer (DRI), a common method for infiltration basin sizing. Measured rates were statistically compared to design infiltration rates, taking factors of safety into account. In addition, a surface soil boring was collected from each test site for a series of analyses, including soil texture, bulk density, and organic matter content. Eleven of the 40 evaluated basins were monitored between March 2008 and January 2012 to determine whether they recovered their volumes from one-inch events within 72 hours and to evaluate the effectiveness of using DRI rates to assess basin performance. Based on DRI rates, 16 (40%) basins had rates less than their design rates, 10 (25%) had rates equal to their design rates, and 14 (35%) had rates greater than their design rates. Additionally, basins with coarser soils were more likely to have DRI rates greater than design, and FDOT basins were more likely than residential basins to have infiltration rates at or above their design rates. Five of the eleven monitored basins were expected to function as designed by recovering their

  16. Comprehensive evaluation of direct injection mass spectrometry for the quantitative profiling of volatiles in food samples.

    PubMed

    Lebrón-Aguilar, R; Soria, A C; Quintanilla-López, J E

    2016-10-28

    Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography-mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue 'Quantitative mass spectrometry'. PMID:27644978
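    The multivariate regression step described above can be illustrated with a minimal sketch: if each DIMS fingerprint is approximately a linear combination of pure-component spectra, ordinary least squares recovers the component amounts. The function name and toy spectra below are invented for illustration; the paper's actual multivariate procedures are more elaborate.

```python
def quantify_two_components(mix, spec_a, spec_b):
    """Estimate concentrations (ca, cb) such that
    mix ~ ca * spec_a + cb * spec_b, by solving the 2x2 normal
    equations of an ordinary least-squares fit. All three arguments
    are intensity lists over the same m/z channels."""
    saa = sum(a * a for a in spec_a)
    sbb = sum(b * b for b in spec_b)
    sab = sum(a * b for a, b in zip(spec_a, spec_b))
    sam = sum(a * m for a, m in zip(spec_a, mix))
    sbm = sum(b * m for b, m in zip(spec_b, mix))
    det = saa * sbb - sab * sab
    ca = (sam * sbb - sbm * sab) / det
    cb = (sbm * saa - sam * sab) / det
    return ca, cb

# Toy spectra over three m/z channels; the mixture is 2*A + 3*B.
ca, cb = quantify_two_components([2.0, 3.0, 5.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0])
```

    In practice more components and channels are involved, and methods such as partial least squares handle the collinearity among similar spectra that the abstract identifies as a limiting factor.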

  18. Qualitative and quantitative evaluation of Simon™, a new CE-based automated Western blot system as applied to vaccine development.

    PubMed

    Rustandi, Richard R; Loughney, John W; Hamm, Melissa; Hamm, Christopher; Lancaster, Catherine; Mach, Anna; Ha, Sha

    2012-09-01

    Many CE-based technologies such as imaged capillary IEF, CE-SDS, CZE, and MEKC are well established for analyzing proteins, viruses, or other biomolecules such as polysaccharides. For example, imaged capillary isoelectric focusing (charge-based protein separation) and CE-SDS (size-based protein separation) are standard replacement methods in the biopharmaceutical industry for the tedious and labor-intensive IEF and SDS-PAGE methods, respectively. Another important analytical tool for protein characterization is the Western blot, where after size-based separation in SDS-PAGE the proteins are transferred to a membrane and blotted with specific monoclonal or polyclonal antibodies. Western blotting analysis is applied in many areas such as biomarker research, therapeutic target identification, and vaccine development. Currently, the procedure is very manual, laborious, and time-consuming. Here, we evaluate a new technology called Simple Western™ (or Simon™) for performing automated Western analysis. This new technology is based on CE-SDS, where the separated proteins are attached to the wall of the capillary by a proprietary photo-activated chemical crosslink. Subsequent blotting is done automatically by incubating and washing the capillary with primary and secondary antibodies conjugated with horseradish peroxidase and detected with chemiluminescence. Typically, Western blots are not quantitative, hence we also evaluated the quantitative capabilities of this new technology. We demonstrate that Simon™ can quantitate specific components in one of our vaccine candidates and that it provides good reproducibility and intermediate precision with CV <10%. PMID:22965727

  19. Performance characteristics of a quantitative, homogeneous TaqMan RT-PCR test for HCV RNA.

    PubMed

    Kleiber, J; Walter, T; Haberhausen, G; Tsang, S; Babiel, R; Rosenstraus, M

    2000-08-01

    We developed a homogeneous format reverse transcription-polymerase chain reaction assay for quantitating hepatitis C virus (HCV) RNA based on the TaqMan principle, in which signal is generated by cleaving a target-specific probe during amplification. The test uses two probes, one specific for HCV and one specific for an internal control, containing fluorophores with different emission spectra. Titers are calculated in international units (IU)/ml by comparing the HCV signal generated by test samples to that generated by a set of external standards. Endpoint titration experiments demonstrated that samples containing 28 IU/ml give positive results 95% of the time. Based on these data, the limit of detection was set conservatively at 40 IU/ml. All HCV genotypes were amplified with equal efficiency and accurately quantitated: when equal quantities of RNA were tested, each genotype produced virtually identical fluorescent signals. The test exhibited a linear range extending from 64 to 4,180,000 IU/ml and excellent reproducibility, with coefficients of variation ranging from 21.6 to 30.4%, which implies that titers that differ by twofold (0.3 log10) are statistically significant (P = 0.005). The test did not react with other organisms likely to co-infect patients with hepatitis C and exhibited a specificity of 99% when evaluated on a set of samples from HCV seronegative blood donors. In interferon-treated patients, the patterns of viral load changes revealed by the TaqMan HCV quantitative test distinguished responders from nonresponders and responder-relapsers. These data indicate that the TaqMan quantitative HCV test provides an attractive alternative for measuring HCV viral load and should prove useful for prognosis and for monitoring the efficacy of antiviral treatments.
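    The twofold (0.3 log10) significance threshold quoted above follows from log10(2) ≈ 0.301. A minimal sketch of how such a threshold might be applied to a pair of titers; the function names are hypothetical, not from the assay software.

```python
import math

def log10_difference(titer_a, titer_b):
    """Difference between two HCV RNA titers (IU/ml) on the log10 scale."""
    return abs(math.log10(titer_a) - math.log10(titer_b))

def is_significant(titer_a, titer_b, threshold_log10=0.3):
    """Treat a titer change as meaningful only if it exceeds the
    ~twofold (0.3 log10) threshold implied by the assay's CV."""
    return log10_difference(titer_a, titer_b) >= threshold_log10
```

    For example, a change from 1000 to 2000 IU/ml (0.301 log10) crosses the threshold, while a change from 1000 to 1500 IU/ml (0.176 log10) does not.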

  20. Image analysis techniques. The problem of the quantitative evaluation of the chromatin ultrastructure.

    PubMed

    Maraldi, N M; Marinelli, F; Squarzoni, S; Santi, S; Barbieri, M

    1991-02-01

    The application of image analysis methods to conventional thin sections for electron microscopy to analyze the chromatin arrangement is quite limited. We developed a method which utilizes freeze-fractured samples; the results indicate that the method is suitable for identifying the changes in chromatin arrangement which occur in physiological, experimental and pathological conditions. The modern era of image analysis began in 1964, when pictures of the moon transmitted by Ranger 7 were processed by a computer. This processing improved the original picture by enhancing and restoring the image, which was affected by various types of distortion. Such processing was made possible by third-generation computers having the speed and storage capabilities required for practical use of image processing algorithms. Each image can be converted into a two-dimensional light intensity function f(x, y), where x and y are the spatial coordinates and the value of f is proportional to the gray level of the image at that point. The digital image is therefore a matrix whose elements are the pixels (picture elements). A typical digital image can be obtained with a quality comparable to monochrome TV, with a 512×512 pixel array and 64 gray levels. The magnetic disks of commercial minicomputers are thus capable of storing some tens of images, which can be elaborated by the image processor after the signal is converted into digital form. In biological images obtained by light microscopy, digitization converts chromatic differences into gray-level intensities, thus allowing the contours of the cytoplasm, the nucleus and the nucleoli to be defined. The use of a quantitative staining method for DNA, the Feulgen reaction, permits evaluation of the ratio between condensed chromatin (stained) and euchromatin (unstained). The digitized images obtained by transmission electron microscopy are rich in details at high resolution. However, the application of image analysis techniques to
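    The condensed-chromatin/euchromatin ratio described above amounts to thresholding the digitized gray-level matrix. A toy sketch under the abstract's conventions (a pixel matrix with gray levels in 0-63); the threshold and data below are invented for illustration.

```python
def chromatin_ratio(image, threshold):
    """Ratio of condensed chromatin (Feulgen-stained, gray level at or
    above `threshold`) to euchromatin (below `threshold`) in a
    digitized nuclear image given as a matrix of gray levels."""
    condensed = sum(1 for row in image for px in row if px >= threshold)
    euchromatin = sum(1 for row in image for px in row if px < threshold)
    return condensed / euchromatin

# Toy 4x4 "nucleus" with gray levels in 0-63 (invented values).
nucleus = [
    [10, 50, 60, 12],
    [ 5, 55, 58,  8],
    [ 7, 52, 11,  9],
    [ 6, 13, 14, 40],
]
ratio = chromatin_ratio(nucleus, threshold=32)
```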

  1. Quantitative analysis of three chiral pesticide enantiomers by high-performance column liquid chromatography.

    PubMed

    Wang, Peng; Liu, Donghui; Gu, Xu; Jiang, Shuren; Zhou, Zhiqiang

    2008-01-01

    Methods for the enantiomeric quantitative determination of 3 chiral pesticides, paclobutrazol, myclobutanil, and uniconazole, and their residues in soil and water are reported. An effective chiral high-performance liquid chromatographic (HPLC)-UV method using an amylose tris(3,5-dimethylphenylcarbamate) (AD) column was developed for resolving the enantiomers and for quantitative determination. The enantiomers were identified by a circular dichroism detector. Validation involved complete resolution of each of the 2 enantiomers, plus determination of linearity, precision, and limit of detection (LOD). The pesticide enantiomers were isolated by solvent extraction from soil and by C18 solid-phase extraction from water. The 2 enantiomers of the 3 pesticides could be completely separated on the AD column using an n-hexane/isopropanol mobile phase. The linearity and precision results indicated that the method was reliable for quantitative analysis of the enantiomers. LODs were 0.025, 0.05, and 0.05 mg/kg for each enantiomer of paclobutrazol, myclobutanil, and uniconazole, respectively. Recovery and precision data showed that the pretreatment procedures were satisfactory for enantiomer extraction and cleanup. This method can be used for optical purity determination of technical material and for analysis of environmental residues.

  2. Quantitative evaluation of ozone and selected climate parameters in a set of EMAC simulations

    NASA Astrophysics Data System (ADS)

    Righi, M.; Eyring, V.; Gottschaldt, K.-D.; Klinger, C.; Frank, F.; Jöckel, P.; Cionni, I.

    2014-10-01

    Four simulations with the ECHAM/MESSy Atmospheric Chemistry (EMAC) model have been evaluated with the Earth System Model Validation Tool (ESMValTool) to identify differences in simulated ozone and selected climate parameters that resulted from (i) different setups of the EMAC model (nudged vs. free-running) and (ii) different boundary conditions (emissions, sea surface temperatures (SSTs) and sea-ice concentrations (SICs)). To assess the relative performance of the simulations, quantitative performance metrics are calculated consistently for the climate parameters and ozone. This is important for the interpretation of the evaluation results since biases in climate can impact on biases in chemistry and vice versa. The observational datasets used for the evaluation include ozonesonde and aircraft data, meteorological reanalyses and satellite measurements. The results from a previous EMAC evaluation of a model simulation with weak nudging towards realistic meteorology in the troposphere have been compared to new simulations with different model setups and updated emission datasets in free-running timeslice and nudged Quasi Chemistry-Transport Model (QCTM) mode. The latter two configurations are particularly important for chemistry-climate projections and for the quantification of individual sources (e.g. transport sector) that lead to small chemical perturbations of the climate system, respectively. With the exception of some specific features which are detailed in this study, no large differences that could be related to the different setups of the EMAC simulations (nudged vs. free-running) were found, which offers the possibility to evaluate and improve the overall model with the help of shorter nudged simulations. The main difference between the two setups is a better representation of the tropospheric and stratospheric temperature in the nudged simulations, which also better reproduce stratospheric water vapour concentrations, due to the improved simulation of

  3. Quantitative evaluation of ozone and selected climate parameters in a set of EMAC simulations

    NASA Astrophysics Data System (ADS)

    Righi, M.; Eyring, V.; Gottschaldt, K.-D.; Klinger, C.; Frank, F.; Jöckel, P.; Cionni, I.

    2015-03-01

    Four simulations with the ECHAM/MESSy Atmospheric Chemistry (EMAC) model have been evaluated with the Earth System Model Validation Tool (ESMValTool) to identify differences in simulated ozone and selected climate parameters that resulted from (i) different setups of the EMAC model (nudged vs. free-running) and (ii) different boundary conditions (emissions, sea surface temperatures (SSTs) and sea ice concentrations (SICs)). To assess the relative performance of the simulations, quantitative performance metrics are calculated consistently for the climate parameters and ozone. This is important for the interpretation of the evaluation results since biases in climate can impact on biases in chemistry and vice versa. The observational data sets used for the evaluation include ozonesonde and aircraft data, meteorological reanalyses and satellite measurements. The results from a previous EMAC evaluation of a model simulation with nudging towards realistic meteorology in the troposphere have been compared to new simulations with different model setups and updated emission data sets in free-running time slice and nudged quasi chemistry-transport model (QCTM) mode. The latter two configurations are particularly important for chemistry-climate projections and for the quantification of individual sources (e.g., the transport sector) that lead to small chemical perturbations of the climate system, respectively. With the exception of some specific features which are detailed in this study, no large differences that could be related to the different setups (nudged vs. free-running) of the EMAC simulations were found, which offers the possibility to evaluate and improve the overall model with the help of shorter nudged simulations. 
The main difference between the two setups is a better representation of the tropospheric and stratospheric temperature in the nudged simulations, which also better reproduce stratospheric water vapor concentrations, due to the improved simulation of
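    The quantitative performance metrics mentioned in these two abstracts are computed by the ESMValTool. As a minimal illustration of the kind of metric involved, the sketch below computes an RMSE between a simulated and an observed field; the function name and numbers are invented, and ESMValTool itself offers many more diagnostics.

```python
def rmse(model, obs):
    """Root-mean-square error between simulated and observed values,
    one of the simplest performance metrics used when grading model
    configurations against reference data."""
    n = len(model)
    return (sum((m - o) ** 2 for m, o in zip(model, obs)) / n) ** 0.5

# Illustrative "fields" flattened to 1-D (invented numbers).
score = rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0])
```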

  4. Manipulator Performance Evaluation Using Fitts' Taping Task

    SciTech Connect

    Draper, J.V.; Jared, B.C.; Noakes, M.W.

    1999-04-25

    Metaphorically, a teleoperator with master controllers projects the user's arms and hands into a remote area. Therefore, human users interact with teleoperators at a more fundamental level than they do with most human-machine systems. Instead of inputting decisions about how the system should function, teleoperator users input the movements they might make if they were truly in the remote area, and the remote machine must recreate their trajectories and impedance. This intense human-machine interaction requires displays and controls more carefully attuned to human motor capabilities than is necessary with most systems. It is important for teleoperated manipulators to be able to recreate human trajectories and impedance in real time. One method for assessing manipulator performance is to observe how well a system behaves while a human user completes human dexterity tasks with it. Fitts' tapping task has been used many times in the past for this purpose. This report describes such a performance assessment. The International Submarine Engineering (ISE) Autonomous/Teleoperated Operations Manipulator (ATOM) servomanipulator system was evaluated using a generic positioning accuracy task. The task is a simple one but has the merits of (1) producing a performance function estimate rather than a point estimate and (2) being widely used in the past for human and servomanipulator dexterity tests. Results of testing using this task may therefore allow comparison with other manipulators, and the task is generically representative of a broad class of tasks. Results of the testing indicate that the ATOM manipulator is capable of performing the task. Force reflection had a negative impact on task efficiency in these data. This was most likely caused by the high resistance to movement the master controller exhibited with force reflection engaged. Measurements of exerted forces were not made, so it is not possible to say whether the force reflection helped participants
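    Fitts' tapping task is normally analyzed by fitting Fitts' law, MT = a + b·ID with ID = log2(2A/W), where A is movement amplitude and W is target width. A minimal sketch of that analysis; the tapping data below are invented and the report's own measurements are not reproduced here.

```python
import math

def index_of_difficulty(amplitude, width):
    """Fitts' index of difficulty in bits: ID = log2(2A / W)."""
    return math.log2(2.0 * amplitude / width)

def fit_fitts_law(ids, movement_times):
    """Ordinary least-squares fit of Fitts' law MT = a + b * ID.
    Returns (a, b); 1/b is the index of performance (bits/s)."""
    n = len(ids)
    mean_id = sum(ids) / n
    mean_mt = sum(movement_times) / n
    b = (sum((i - mean_id) * (t - mean_mt) for i, t in zip(ids, movement_times))
         / sum((i - mean_id) ** 2 for i in ids))
    a = mean_mt - b * mean_id
    return a, b

# Illustrative (made-up) tapping conditions: amplitude/width in mm,
# mean movement time in seconds for each condition.
conditions = [(100, 25), (200, 25), (400, 25)]
ids = [index_of_difficulty(amp, w) for amp, w in conditions]  # 3, 4, 5 bits
times = [0.45, 0.58, 0.71]
a, b = fit_fitts_law(ids, times)
```

    Fitting MT against ID for each condition yields the performance function estimate (intercept a and slope b) rather than a single point estimate, which is the merit the report cites.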

  5. Establishment and evaluation of event-specific quantitative PCR method for genetically modified soybean MON89788.

    PubMed

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Kitta, Kazumi

    2010-01-01

    A novel real-time PCR-based analytical method was established for the event-specific quantification of the GM soybean event MON89788. The conversion factor (Cf), which is required to calculate the GMO amount, was experimentally determined. The quantitative method was evaluated by a single-laboratory analysis and by a blind test in a multi-laboratory trial. The limit of quantitation for the method was estimated to be 0.1% or lower. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), and the determined bias and RSDR values for the method were both less than 20%. These results suggest that the established method is suitable for practical detection and quantification of MON89788. PMID:21071908
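    Event-specific GMO quantification typically converts the measured event/taxon copy-number ratio to a GM percentage via the conversion factor. A minimal sketch under that convention; the function name and numbers are illustrative, not taken from the paper.

```python
def gmo_percent(event_copies, taxon_copies, cf):
    """GM amount (%) from event-specific and taxon-specific qPCR copy
    numbers. `cf` is the experimentally determined conversion factor:
    the event/taxon copy-number ratio expected for a 100% GM sample."""
    return (event_copies / taxon_copies) / cf * 100.0

# Illustrative (invented) numbers: yields a 1.0% GM content.
percent = gmo_percent(event_copies=50.0, taxon_copies=10000.0, cf=0.5)
```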

  7. Quantitative and confirmative performance of liquid chromatography coupled to high-resolution mass spectrometry compared to tandem mass spectrometry.

    PubMed

    Kaufmann, Anton; Butcher, Patrick; Maden, Kathryn; Walker, Stephan; Widmer, Miryam

    2011-04-15

    The quantitative and confirmative performance of two different mass spectrometry (MS) techniques (high-resolution MS and tandem MS) was critically compared. A new extraction and clean-up protocol, developed to cover more than 100 different veterinary drugs at trace levels in a number of animal tissue and honey matrices, was evaluated. Both detection techniques, high-resolution mass spectrometry (HRMS) (single-stage Orbitrap instrument operated at 50 000 full width at half maximum) and tandem mass spectrometry (MS/MS) (quadrupole technology), were used to validate the method according to EU Commission Decision 2002/657/EC. Equal or even slightly better quantitative performance was observed for the HRMS-based approach. Sensitivity is higher for unit-mass-resolution MS/MS if only a subset of the 100 compounds has to be monitored. Confirmation of suspected positive findings can be done by evaluating the intensity ratio between different MS/MS transitions, or by accurate-mass-based product ion traces (no precursor selection applied). MS/MS relies on compound-specific optimized transitions; hence the second, confirmatory transition generally shows relatively high ion abundance (fragmentation efficacy). This is often not the case in single-stage HRMS, since a generic (not compound-optimized) collision energy is applied. Hence, confirmation of analytes present at low levels is superior when performed by MS/MS. Slightly better precision, but poorer accuracy (fortified matrix extracts versus pure standard solution) of ion ratios was observed when comparing data obtained by HRMS versus MS/MS. PMID:21416536
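    Confirmation by MS/MS ion ratios, as described above, is usually judged against the maximum permitted tolerances tabulated in Commission Decision 2002/657/EC for LC-MS. A minimal sketch of that check; the function names are invented.

```python
def ion_ratio_tolerance(reference_ratio):
    """Maximum permitted relative deviation of a product-ion ratio
    (LC-MS) under Commission Decision 2002/657/EC."""
    if reference_ratio > 0.5:
        return 0.20
    if reference_ratio > 0.2:
        return 0.25
    if reference_ratio > 0.1:
        return 0.30
    return 0.50

def confirms(sample_ratio, reference_ratio):
    """True if the sample's ion ratio falls within the tolerance
    window around the reference (standard) ion ratio."""
    tolerance = ion_ratio_tolerance(reference_ratio)
    return abs(sample_ratio - reference_ratio) <= tolerance * reference_ratio
```

    For example, a sample ratio of 0.55 against a reference of 0.60 falls within the ±20% window, while 0.05 against a reference of 0.15 falls outside the ±30% window.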

  8. A Method for Missile Autopilot Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Eguchi, Hirofumi

    The essential benefit of HardWare-In-the-Loop (HWIL) simulation can be summarized as follows: the performance of an autopilot system is evaluated realistically, without modeling error, by using actual hardware such as seeker systems, autopilot systems and servo equipment. HWIL simulation, however, requires very expensive facilities; in these facilities, the target model generator is the indispensable subsystem. In this paper, one example of an HWIL simulation facility with a target model generator for RF seeker systems is introduced first. This generator, however, has a functional limitation on the line-of-sight angle, as do almost all other generators, so a test method to overcome the line-of-sight angle limitation is proposed.

  9. [Drifts and pernicious effects of the quantitative evaluation of research: the misuse of bibliometrics].

    PubMed

    Gingras, Yves

    2015-06-01

    The quantitative evaluation of scientific research relies increasingly on bibliometric indicators of publications and citations. We present the issues raised by the simplistic use of these methods and recall the dangers of using poorly built indicators and technically defective rankings that do not measure the dimensions they are supposed to measure, for example the quality of publications, laboratories or universities. We show that francophone journals are particularly likely to suffer from the misuse of overly simplistic bibliometric rankings of scientific journals.

  10. Performance evaluation of salivary amylase activity monitor.

    PubMed

    Yamaguchi, Masaki; Kanemori, Takahiro; Kanemaru, Masashi; Takai, Noriyasu; Mizuno, Yasufumi; Yoshida, Hiroshi

    2004-10-15

    In order to quantify psychological stress and to distinguish eustress from distress, we have been investigating the establishment of a method that can quantify salivary amylase activity. Salivary glands not only act as amplifiers of low levels of norepinephrine, but also respond more quickly and sensitively to psychological stress than cortisol does. Moreover, the time-course changes of salivary amylase activity may make it possible to distinguish eustress from distress. Thus, salivary amylase activity can be utilized as an excellent index of psychological stress. However, for a dry chemistry system, a method for quantification of the enzymatic activity still needs to be established that can both provide sufficient substrate in a testing tape and control the enzymatic reaction time. Moreover, it is necessary to develop a method that retains the advantages of using saliva, such as ease of collection, rapidity of response, and usability at any time. In order to establish an easy method to monitor salivary amylase activity, a salivary transcription device was fabricated to control the enzymatic reaction time. The fabricated salivary amylase activity monitor consisted of three devices: the salivary transcription device, a testing-strip and an optical analyzer. By adding maltose as a competitive inhibitor to the substrate Gal-G2-CNP, a broad-range activity testing-strip was fabricated that could measure salivary amylase activity over a range of 0-200 kU/l within 150 s. The calibration curve of the monitor for salivary amylase activity showed R2 = 0.941, indicating that it was possible to use this monitor for the analysis of salivary amylase activity without the need to determine the salivary volume quantitatively. In order to evaluate the assay variability of the monitor, salivary amylase activity was measured using the Kraepelin psychodiagnostic test as a psychological stressor. A significant difference in salivary amylase activity was recognized

  11. Performance Evaluations of Ceramic Wafer Seals

    NASA Technical Reports Server (NTRS)

    Dunlap, Patrick H., Jr.; DeMange, Jeffrey J.; Steinetz, Bruce M.

    2006-01-01

    Future hypersonic vehicles will require high temperature, dynamic seals in advanced ramjet/scramjet engines and on the vehicle airframe to seal the perimeters of movable panels, flaps, and doors. Seal temperatures in these locations can exceed 2000 F, especially when the seals are in contact with hot ceramic matrix composite sealing surfaces. NASA Glenn Research Center is developing advanced ceramic wafer seals to meet the needs of these applications. High temperature scrub tests performed between silicon nitride wafers and carbon-silicon carbide rub surfaces revealed high friction forces and evidence of material transfer from the rub surfaces to the wafer seals. Stickage between adjacent wafers was also observed after testing. Several design changes to the wafer seals were evaluated as possible solutions to these concerns. Wafers with recessed sides were evaluated as a potential means of reducing friction between adjacent wafers. Alternative wafer materials are also being considered as a means of reducing friction between the seals and their sealing surfaces and because the baseline silicon nitride wafer material (AS800) is no longer commercially available.

  12. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  13. 48 CFR 236.201 - Evaluation of contractor performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONTRACTS Special Aspects of Contracting for Construction 236.201 Evaluation of contractor performance. (a) Preparation of performance evaluation reports. Use DD Form 2626, Performance Evaluation (Construction... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Evaluation of...

  14. A quantitative measure for degree of automation and its relation to system performance and mental load.

    PubMed

    Wei, Z G; Macwan, A P; Wieringa, P A

    1998-06-01

    In this paper we quantitatively model degree of automation (DofA) in supervisory control as a function of the number and nature of tasks to be performed by the operator and automation. This model uses a task weighting scheme in which weighting factors are obtained from task demand load, task mental load, and task effect on system performance. The computation of DofA is demonstrated using an experimental system. Based on controlled experiments using operators, analyses of the task effect on system performance, the prediction and assessment of task demand load, and the prediction of mental load were performed. Each experiment had a different DofA. The effect of a change in DofA on system performance and mental load was investigated. It was found that system performance became less sensitive to changes in DofA at higher levels of DofA. The experimental data showed that when the operator controlled a partly automated system, perceived mental load could be predicted from the task mental load for each task component, as calculated by analyzing a situation in which all tasks were manually controlled. Actual or potential applications of this research include a methodology to balance and optimize the automation of complex industrial systems.
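    The DofA model above weights each task and compares the automated share against the total task load. A minimal sketch of that idea; the reduction of task demand load, task mental load and task effect on system performance to a single weight per task is an assumption made here for illustration, and the function name is invented.

```python
def degree_of_automation(tasks):
    """Degree of automation as the weighted share of automated tasks.

    tasks: list of (weight, automated) pairs, where `weight` stands in
    for the paper's combined task weighting (demand load, mental load,
    effect on system performance)."""
    total = sum(weight for weight, _ in tasks)
    automated = sum(weight for weight, is_auto in tasks if is_auto)
    return automated / total

# Illustrative task set: two automated tasks, one manual.
dofa = degree_of_automation([(2.0, True), (1.0, False), (1.0, True)])
```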

  15. 48 CFR 1252.216-72 - Performance evaluation plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

....216-72 Performance evaluation plan. As prescribed in (TAR) 48 CFR 1216.406(b), insert the following clause: Performance Evaluation Plan (OCT 1994) (a) A Performance Evaluation Plan shall be unilaterally...

  16. Evaluating the Performance of Calculus Classes Using Operational Research Tools.

    ERIC Educational Resources Information Center

    Soares de Mello, Joao Carlos C. B.; Lins, Marcos P. E.; Soares de Mello, Maria Helena C.; Gomes, Eliane G.

    2002-01-01

    Compares the efficiency of calculus classes and evaluates two kinds of classes: traditional and others that use computational methods in teaching. Applies quantitative evaluation methods using two operational research tools, multicriteria decision aid methods (mainly using the MACBETH approach) and data development analysis. (Author/YDS)

  17. Quantitative Evaluation of Microdistortions in Bowman's Layer and Corneal Deformation after Small Incision Lenticule Extraction

    PubMed Central

    Shroff, Rushad; Francis, Mathew; Pahuja, Natasha; Veeboy, Leio; Shetty, Rohit; Roy, Abhijit Sinha

    2016-01-01

Purpose To quantitatively evaluate microdistortions in Bowman's layer and the change in corneal stiffness after small incision lenticule extraction (SMILE). Methods This was a prospective, longitudinal, and interventional study. Thirty eyes of 30 patients were screened preoperatively and underwent SMILE for treatment of myopia with astigmatism. Visual acuity, refraction, optical coherence tomography (OCT; Bioptigen, Inc., Morrisville, NC) imaging of the layer, and air-puff applanation (Corvis-ST, OCULUS Optikgeräte GmbH, Germany) were performed before and after surgery (1 day, 1 week, and 1 month). The Bowman's Roughness Index (BRI) was defined as the enclosed area between the actual and an ideal smooth layer to quantify the microdistortions. A viscoelastic model was used to quantify the change in corneal stiffness from applanation. Results Uncorrected distance visual acuity improved (P < 0.001) and refractive error decreased (P < 0.0001) after SMILE. BRI increased from preoperative levels (1.81 × 10−3 mm2) to 1 week (3.14 × 10−3 mm2) after SMILE (P < 0.05) and then decreased up to a month (2.43 × 10−3 mm2; P < 0.05). The increase in the magnitude of the index correlated positively with refractive error (P = 0.02). However, corneal stiffness decreased after SMILE (105.86 ± 1.4 N/m versus 97.97 ± 1.21 N/m at 1 month, P = 0.001). The decrease in corneal stiffness did not correlate with refractive error (P = 0.61). Conclusions BRI correlated positively with the magnitude of refractive error. However, the decrease in corneal stiffness, assessed by air-puff applanation, may not be related to microdistortions after SMILE. Translational Relevance An objective method of quantifying Bowman's layer microdistortions using OCT was developed to monitor corneal wound healing and improve lenticule extraction methods. PMID:27777827
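The BRI definition above, the enclosed area between the measured layer and an ideal smooth layer, can be illustrated with a trapezoidal-rule integration. The profile samples and units below are invented, not data from the study.

```python
# Hypothetical sketch of a Bowman's Roughness Index-style metric: the enclosed
# area between a measured layer profile and an idealized smooth profile,
# integrated with the trapezoidal rule. All sample values are invented.

def roughness_index(x, actual, ideal):
    """Area (mm^2) enclosed between the measured and ideal layer profiles."""
    area = 0.0
    for i in range(1, len(x)):
        d0 = abs(actual[i - 1] - ideal[i - 1])
        d1 = abs(actual[i] - ideal[i])
        area += 0.5 * (d0 + d1) * (x[i] - x[i - 1])
    return area

x      = [0.0, 0.5, 1.0, 1.5, 2.0]        # lateral position, mm
ideal  = [0.0, 0.0, 0.0, 0.0, 0.0]        # idealized smooth layer
actual = [0.0, 0.002, 0.001, 0.003, 0.0]  # measured layer with microdistortions
print(roughness_index(x, actual, ideal))  # enclosed area, mm^2
```

A perfectly smooth measured layer gives an index of zero, so larger values correspond to stronger microdistortions, consistent with the postoperative rise and partial recovery reported above.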

  18. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

Accidents attributable to in-flight loss of control are the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to determine and identify the quantitative standards for assessing upset recovery performance. This review contains current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, and recovery time, as well as whether that input was correct or incorrect. Other metrics included are: the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum air speed, maximum bank angle, and maximum g loading are reviewed as well.

  19. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  20. Quantitative morphological evaluation of laser ablation on calculus using full-field optical coherence microscopy

    NASA Astrophysics Data System (ADS)

    Xiao, Q.; Lü, T.; Li, Z.; Fu, L.

    2011-10-01

Quantitative morphological evaluation at high resolution is of significance for the study of laser-tissue interaction. In this paper, a full-field optical coherence microscopy (OCM) system with a high resolution of ~2 μm was developed to investigate ablation of urinary calculus by a free-running Er:YAG laser. We quantitatively studied the morphological variation corresponding to changes in the energy setting of the Er:YAG laser. The experimental results show that full-field OCM enables quantitative evaluation of the morphological shape of craters and material removal, and particularly the fine structure. We also built a heat conduction model to simulate the process of laser-calculus interaction using the finite element method. Through the simulation, the removal region of the calculus was calculated according to the temperature distribution. As a result, the depth, width, volume, and cross-sectional profile of the crater in the calculus measured by full-field OCM matched well with the theoretical results based on the heat conduction model. Both experimental and theoretical results confirm that thermal interaction is the dominant effect in the ablation of calculus by the Er:YAG laser, demonstrating the effectiveness of full-field OCM in studying laser-tissue interactions.

  1. Evaluating 'good governance': The development of a quantitative tool in the Greater Serengeti Ecosystem.

    PubMed

    Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea

    2016-10-01

    Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The research developed a quantitative method for developing effectiveness measures of PA governance, using a set of 65 statements related to governance principles developed from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective. PMID:27566933
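The factor reliability reported above rests on Cronbach's alpha, which can be computed directly from survey responses. The response matrix below is invented for illustration (rows of scores per statement within one factor), not data from the Serengeti survey.

```python
# Minimal sketch of Cronbach's alpha, the reliability statistic used to judge
# each extracted governance factor. Survey scores below are invented
# (each inner list holds one statement's scores across respondents).

def cronbach_alpha(items):
    """items: list of columns, one per statement; each column lists respondent scores."""
    k = len(items)           # number of statements in the factor
    n = len(items[0])        # number of respondents

    def var(xs):             # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

items = [
    [4, 5, 3, 4, 5],   # statement 1 scores
    [4, 4, 3, 5, 5],   # statement 2 scores
    [5, 5, 2, 4, 4],   # statement 3 scores
]
alpha = cronbach_alpha(items)
print(round(alpha, 3))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the sense in which the study reports "high psychometric validity."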

  3. Quantitative morphologic evaluation of magnetic resonance imaging during and after treatment of childhood leukemia

    PubMed Central

    Reddick, Wilburn E.; Laningham, Fred H.; Glass, John O.; Pui, Ching-Hon

    2008-01-01

Introduction Medical advances over the last several decades, including CNS prophylaxis, have greatly increased survival in children with leukemia. As survival rates have increased, clinicians and scientists have been afforded the opportunity to further develop treatments that improve the quality of life of survivors by minimizing long-term adverse effects. When evaluating the effect of antileukemia therapy on the developing brain, magnetic resonance (MR) imaging has been the preferred modality because it quantifies morphologic changes objectively and noninvasively. Method and results Computer-aided detection of changes on neuroimages enables us to objectively differentiate leukoencephalopathy from normal maturation of the developing brain. Quantitative tissue segmentation algorithms and relaxometry measures have been used to determine the prevalence, extent, and intensity of white matter changes that occur during therapy. More recently, diffusion tensor imaging has been used to quantify microstructural changes in the integrity of the white matter fiber tracts. MR perfusion imaging can be used to noninvasively monitor vascular changes during therapy. Changes in quantitative MR measures have been associated, to some degree, with changes in neurocognitive function during and after treatment. Conclusion In this review, we present recent advances in the quantitative evaluation of MR imaging and discuss how these methods hold promise to further elucidate the pathophysiologic effects of treatment for childhood leukemia. PMID:17653705

  4. Quantitative evaluation of the cutting quality and abrasive resistance of scalers.

    PubMed

    Kaya, H; Fujimura, T; Kimura, S

    1995-01-01

An automatic scaling apparatus that simulated the scaling process of hand instrumentation was developed to quantitatively analyze the cutting quality and abrasive resistance of scalers. We first tested 4 synthetic resins as the abraded material. Of the 4 synthetic resins tested, polycarbonate resin proved most similar to dentin. The effects of lateral scaling forces (700, 500, and 300 dyne) and scaler angles (70 degrees to 95 degrees) on the cutting quality and abrasive resistance of scalers were evaluated quantitatively by the amount of the abraded material worn away in 1,000 strokes. Comparison of the 3 scaling forces showed greater abrasion at higher forces than at lower forces. This suggests that a decrease in the amount removed by abrasion could be compensated for by increasing the lateral scaling force. Regarding the scaler angle, results indicated that the amount of material removed increased with the scaler angle up to 87 degrees, but then rapidly decreased at an angle of 90 degrees or more. The most effective scaling angle was 87 degrees, and this was not affected by scaling force. These results suggest that the greatest amount of removal could be obtained at a scaling angle of 87 degrees and a scaling force of 700 dyne. The present findings suggest that the automatic scaling apparatus could be a useful tool for quantitatively evaluating the cutting quality and abrasive resistance of scalers.

  5. High-Performance Monopropellants and Catalysts Evaluated

    NASA Technical Reports Server (NTRS)

    Reed, Brian D.

    2004-01-01

The NASA Glenn Research Center is sponsoring efforts to develop advanced monopropellant technology. The focus has been on monopropellant formulations composed of an aqueous solution of hydroxylammonium nitrate (HAN) and a fuel component. HAN-based monopropellants do not have a toxic vapor and do not need the extraordinary procedures for storage, handling, and disposal required of hydrazine (N2H4). Generically, HAN-based monopropellants are denser and have lower freezing points than N2H4. The performance of HAN-based monopropellants depends on the selection of fuel, the HAN-to-fuel ratio, and the amount of water in the formulation. HAN-based monopropellants are not seen as a replacement for N2H4 per se, but rather as a propulsion option in their own right. For example, HAN-based monopropellants would prove beneficial for the orbit insertion of small, power-limited satellites because of this propellant's high performance (reduced system mass), high density (reduced system volume), and low freezing point (elimination of tank and line heaters). Under a Glenn-contracted effort, Aerojet Redmond Rocket Center conducted testing to provide the foundation for the development of monopropellant thrusters with an I(sub sp) goal of 250 sec. A modular, workhorse reactor (representative of a 1-lbf thruster) was used to evaluate HAN formulations with catalyst materials. Stoichiometric, oxygen-rich, and fuel-rich formulations of HAN-methanol and HAN-tris(aminoethyl)amine trinitrate were tested to investigate the effects of stoichiometry on combustion behavior. Aerojet found that fuel-rich formulations degrade the catalyst and reactor faster than oxygen-rich and stoichiometric formulations do. A HAN-methanol formulation with a theoretical I(sub sp) of 269 sec (designated HAN269MEO) was selected as the baseline. With a combustion efficiency of at least 93 percent demonstrated for HAN-based monopropellants, HAN269MEO will meet the 250-sec I(sub sp) goal.

  6. DRACS thermal performance evaluation for FHR

    SciTech Connect

    Lv, Q.; Lin, H. C.; Kim, I. H.; Sun, X.; Christensen, R. N.; Blue, T. E.; Yoder, G. L.; Wilson, D. F.; Sabharwall, P.

    2015-03-01

Direct Reactor Auxiliary Cooling System (DRACS) is a passive decay heat removal system proposed for the Fluoride-salt-cooled High-temperature Reactor (FHR), which combines coated particle fuel and a graphite moderator with a liquid fluoride salt as the coolant. The DRACS features three coupled natural circulation/convection loops, relying completely on buoyancy as the driving force. These loops are coupled through two heat exchangers, namely, the DRACS Heat Exchanger and the Natural Draft Heat Exchanger. In addition, a fluidic diode is employed to minimize the parasitic flow into the DRACS primary loop and correspondingly the heat loss to the DRACS during normal operation of the reactor, and to keep the DRACS ready for activation, if needed, during accidents. To help with the design and thermal performance evaluation of the DRACS, a computer code has been developed in MATLAB. This code is based on a one-dimensional formulation, and its principle is to solve the energy balance and integral momentum equations. By discretizing the DRACS system in the axial direction, a bulk mean temperature is assumed for each mesh cell. The temperatures of all the cells, as well as the mass flow rates in the DRACS loops, are predicted by solving the governing equations that are obtained by integrating the energy conservation equation over each cell and integrating the momentum conservation equation over each of the DRACS loops. In addition, an intermediate heat transfer loop equipped with a pump has also been modeled in the code. This enables the study of the flow reversal phenomenon in the DRACS primary loop associated with the pump trip process. Experimental data from a High-Temperature DRACS Test Facility (HTDF) are not available yet to benchmark the code. A preliminary code validation is performed using natural circulation experimental data available in the literature that are as closely relevant as possible. The code is subsequently applied to the HTDF that is under
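The coupled energy/momentum solution described above can be reduced to a toy single-loop, steady-state balance: the buoyancy head rho·beta·dT·g·H equals the friction loss K·mdot²/(2·rho·A²), with the loop temperature rise fixed by the energy balance dT = Q/(mdot·cp). All property values and loop geometry below are invented placeholders, not FHR design data, and the closed-form solve stands in for the code's iterative 1-D scheme.

```python
# Toy steady-state natural-circulation balance for one loop, illustrating the
# kind of coupled energy/momentum solution a DRACS-style 1-D code performs.
# All numbers below are invented placeholders.

def loop_mass_flow(Q, rho, beta, cp, g, H, K, A):
    """Equate buoyancy head rho*beta*dT*g*H to friction K*mdot^2/(2*rho*A^2),
    with dT = Q/(mdot*cp) from the energy balance, and solve for mdot."""
    return (2.0 * rho**2 * beta * g * H * Q * A**2 / (cp * K)) ** (1.0 / 3.0)

Q    = 5.0e4    # decay heat removed, W
rho  = 2000.0   # coolant density, kg/m^3
beta = 2.0e-4   # thermal expansion coefficient, 1/K
cp   = 2400.0   # specific heat, J/(kg K)
g    = 9.81     # gravitational acceleration, m/s^2
H    = 3.0      # elevation difference between thermal centers, m
K    = 20.0     # lumped loop loss coefficient
A    = 0.01     # flow area, m^2

mdot = loop_mass_flow(Q, rho, beta, cp, g, H, K, A)
dT = Q / (mdot * cp)
print(round(mdot, 3), round(dT, 1))  # mass flow (kg/s) and loop temperature rise (K)
```

Because buoyancy is the only driving force, the flow rate scales as the cube root of the decay heat, one reason passive loops remain effective over a wide range of heat loads.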

  7. Quantitative evaluation of periprosthetic infection by real-time polymerase chain reaction: a comparison with conventional methods.

    PubMed

    Miyamae, Yushi; Inaba, Yutaka; Kobayashi, Naomi; Choe, Hyonmin; Ike, Hiroyuki; Momose, Takako; Fujiwara, Shusuke; Saito, Tomoyuki

    2012-10-01

    Several recent studies have demonstrated the limited accuracy of conventional culture methods for diagnosing periprosthetic infections. We have applied real-time polymerase chain reaction (PCR) assays for the rapid identification of bacteria around implants and reported its utility. However, the capability of quantification is also a useful feature of this type of assay. The aim of our study was to validate the usefulness of quantitative analyses using real-time PCR of cases with clinical periprosthetic infections in comparison with more established tests, such as C-reactive protein (CRP) levels, microbiologic cultures, and histopathology. Fifty-six joints with suspected infections were reviewed retrospectively. A universal PCR assay was used to perform the quantitative analyses. The differences in the threshold cycles between clinical samples and a negative control (∆Ct) in each case were calculated. The results of the quantitative PCR assay were compared with CRP levels, microbiologic cultures, and histopathology. There was a significant correlation found between the CRP and ∆Ct values. There were also significant differences found in the ∆Ct values according to CRP levels, with higher CRP levels showing higher ∆Ct values. Similarly, there were significant differences in the ∆Ct measurements in our culture results and among our pathologic evaluations. We confirmed that quantification by universal PCR based on the ∆Ct correlated with preoperative CRP levels and was associated with the microbiologic culture results and pathologic severity. This quantification method may be valuable for assessing infection severity.
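The ∆Ct quantity above can be illustrated with a short calculation: bacterial load is reflected in how many cycles earlier a clinical sample crosses the detection threshold than the negative control. The Ct values below are invented, and the 2^∆Ct fold-change conversion assumes ideal doubling per cycle.

```python
# Minimal sketch of the delta-Ct idea behind quantitative (real-time) PCR.
# Ct values below are invented for illustration.

def delta_ct(ct_negative_control, ct_sample):
    """Larger delta-Ct means more starting template; roughly a 2**dCt-fold
    difference, assuming ideal doubling each PCR cycle."""
    return ct_negative_control - ct_sample

ct_control = 38.0  # threshold cycle of the negative control
ct_sample = 26.5   # threshold cycle of a periprosthetic tissue sample
dct = delta_ct(ct_control, ct_sample)
fold = 2 ** dct    # approximate fold difference in template
print(dct, round(fold))
```

This monotone link between template amount and ∆Ct is what lets the study correlate ∆Ct with CRP levels and pathologic severity.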

  8. A new dynamic myocardial phantom for evaluation of SPECT and PET quantitation in systolic and diastolic conditions

    SciTech Connect

    Dreuille, O. de; Bendriem, B.; Riddell, C.

    1996-12-31

We present a new dynamic myocardial phantom designed to evaluate SPECT and PET imaging in systolic and diastolic conditions. The phantom includes a thoracic attenuating medium, and the myocardial wall thickness can be varied during the scan. In this study the phantom was used with three different wall thicknesses characteristic of systolic, end-diastolic, and pathologic end-diastolic conditions. The myocardium was filled with {sup 99m}Tc, {sup 18}F, and Gd and imaged by SPECT, PET, and MRI. SPECT attenuation correction was performed using a modified PET transmission. A bull's-eye image was obtained for all data, and wall ROIs were then drawn for analysis. Using MRI as a reference, errors from PET, SPECT, and attenuation-corrected SPECT were calculated. Systolic PET performance agrees with MRI. A quantitation loss due to wall thickness reduction was observed in diastole compared to systole. Attenuation correction in SPECT leads to a significant decrease of the error both in systole (from 29% to 14%) and diastole (35% to 22%). This is particularly noticeable for the septum and inferior walls. SPECT residual errors (14% in systole and 22% in pathologic end-diastole) are likely caused by scatter, noise, and depth-dependent resolution effects. The results obtained with this dynamic phantom demonstrate the quantitation improvement achieved in SPECT with attenuation correction and also reinforce the need for variable resolution correction in addition to attenuation correction.

  9. Rapid Quantitation of Furanocoumarins and Flavonoids in Grapefruit Juice using Ultra Performance Liquid Chromatography

    PubMed Central

    VanderMolen, Karen M.; Cech, Nadja B.; Paine, Mary F.

    2013-01-01

    Introduction Grapefruit juice can increase or decrease the systemic exposure of myriad oral medications, leading to untoward effects or reduced efficacy. Furanocoumarins in grapefruit juice have been established as inhibitors of cytochrome P450 3A (CYP3A)-mediated metabolism and P-glycoprotein (P-gp)-mediated efflux, while flavonoids have been implicated as inhibitors of organic anion transporting polypeptide (OATP)-mediated absorptive uptake in the intestine. The potential for drug interactions with a food product necessitates an understanding of the expected concentrations of a suite of structurally diverse and potentially bioactive compounds. Objective Develop methods for the rapid quantitation of two furanocoumarins (bergamottin and 6′,7′-dihydroxybergamottin) and four flavonoids (naringin, naringenin, narirutin, and hesperidin) in five grapefruit juice products using ultra performance liquid chromatography (UPLC). Methodology Grapefruit juice products were extracted with ethyl acetate; the concentrated extract was analyzed by UPLC using acetonitrile:water gradients and a C18 column. Analytes were detected using a photodiode array detector, set at 250 nm (furanocoumarins) and 310 nm (flavonoids). Intraday and interday precision and accuracy and limits of detection and quantitation were determined. Results Rapid (<5.0 min) UPLC methods were developed to measure the aforementioned furanocoumarins and flavonoids. R2 values for the calibration curves of all analytes were >0.999. Considerable between-juice variation in the concentrations of these compounds was observed, and the quantities measured were in agreement with the concentrations published in HPLC studies. Conclusion These analytical methods provide an expedient means to quantitate key furanocoumarins and flavonoids in grapefruit juice and other foods used in dietary substance-drug interaction studies. PMID:23780830

  10. LANDSAT-4 horizon scanner performance evaluation

    NASA Technical Reports Server (NTRS)

    Bilanow, S.; Chen, L. C.; Davis, W. M.; Stanley, J. P.

    1984-01-01

    Representative data spans covering a little more than a year since the LANDSAT-4 launch were analyzed to evaluate the flight performance of the satellite's horizon scanner. High frequency noise was filtered out by 128-point averaging. The effects of Earth oblateness and spacecraft altitude variations are modeled, and residual systematic errors are analyzed. A model for the predicted radiance effects is compared with the flight data and deficiencies in the radiance effects modeling are noted. Correction coefficients are provided for a finite Fourier series representation of the systematic errors in the data. Analysis of the seasonal dependence of the coefficients indicates the effects of some early mission problems with the reference attitudes which were computed by the onboard computer using star trackers and gyro data. The effects of sun and moon interference, unexplained anomalies in the data, and sensor noise characteristics and their power spectrum are described. The variability of full orbit data averages is shown. Plots of the sensor data for all the available data spans are included.
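A finite Fourier series representation of a once-per-orbit systematic error, as used for the correction coefficients above, can be sketched with discrete Fourier coefficients over evenly spaced samples of one period. The synthetic "error" samples below are invented, not LANDSAT-4 data.

```python
import math

# Sketch of representing a periodic (once-per-orbit) systematic error as a
# finite Fourier series. Assumes evenly spaced samples over one period;
# the synthetic error samples below are invented.

def fourier_coeffs(y, n_harmonics):
    """Return (a0, [(a_k, b_k), ...]) for evenly spaced samples of one period."""
    N = len(y)
    a0 = sum(y) / N
    coeffs = []
    for k in range(1, n_harmonics + 1):
        a_k = 2.0 / N * sum(y[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
        b_k = 2.0 / N * sum(y[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
        coeffs.append((a_k, b_k))
    return a0, coeffs

def evaluate(a0, coeffs, phase):
    """Reconstruct the series at an orbit phase in [0, 1)."""
    total = a0
    for k, (a_k, b_k) in enumerate(coeffs, start=1):
        total += a_k * math.cos(2 * math.pi * k * phase) + b_k * math.sin(2 * math.pi * k * phase)
    return total

# Synthetic once-per-orbit error: a 0.1-deg bias plus a first-harmonic variation.
N = 64
y = [0.1 + 0.05 * math.cos(2 * math.pi * n / N) - 0.02 * math.sin(2 * math.pi * n / N)
     for n in range(N)]
a0, coeffs = fourier_coeffs(y, 2)
print(round(a0, 3), [(round(a, 3), round(b, 3)) for a, b in coeffs])
```

Fitting such coefficients season by season is one way the reported seasonal dependence of the correction terms could be tracked.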

  11. Coherent lidar airborne windshear sensor: performance evaluation.

    PubMed

    Targ, R; Kavaya, M J; Huffaker, R M; Bowles, R L

    1991-05-20

    National attention has focused on the critical problem of detecting and avoiding windshear since the crash on 2 Aug. 1985 of a Lockheed L-1011 at Dallas/Fort Worth International Airport. As part of the NASA/FAA National Integrated Windshear Program, we have defined a measurable windshear hazard index that can be remotely sensed from an aircraft, to give the pilot information about the wind conditions he will experience at some later time if he continues along the present flight path. A technology analysis and end-to-end performance simulation measuring signal-to-noise ratios and resulting wind velocity errors for competing coherent laser radar (lidar) systems have been carried out. The results show that a Ho:YAG lidar at a wavelength of 2.1 microm and a CO(2) lidar at 10.6 microm can give the pilot information about the line-of-sight component of a windshear threat from his present position to a region extending 2-4 km in front of the aircraft. This constitutes a warning time of 20-40 s, even in conditions of moderately heavy precipitation. Using these results, a Coherent Lidar Airborne Shear Sensor (CLASS) that uses a Q-switched CO(2) laser at 10.6 microm is being designed and developed for flight evaluation in the fall of 1991.

  12. Performance evaluation of an infrared thermocouple.

    PubMed

    Chen, Chiachung; Weng, Yu-Kai; Shen, Te-Ching

    2010-01-01

The measurement of the leaf temperature of forests or agricultural plants is an important technique for the monitoring of the physiological state of crops. The infrared thermometer is a convenient device due to its fast response and nondestructive measurement technique. Nowadays, a novel infrared thermocouple, developed with the same measurement principle as the infrared thermometer but using a different detector, has been commercialized for non-contact temperature measurement. The performance of two kinds of infrared thermocouples was evaluated in this study. The standard temperature was maintained by a temperature calibrator and a special black cavity device. The results indicated that both types of infrared thermocouples had good precision. The error distribution ranged from -1.8 °C to 18 °C when the reading values were taken as the true values. Within the range from 13 °C to 37 °C, the adequate calibration equations were high-order polynomial equations. Within the narrower range from 20 °C to 35 °C, the adequate equation was a linear equation for one sensor and a second-order polynomial equation for the other sensor. The accuracy of the two kinds of infrared thermocouples was improved by nearly 0.4 °C with the calibration equations. These devices could serve as mobile monitoring tools for in situ, real-time, routine estimation of leaf temperatures.
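The narrow-range linear calibration mentioned above can be sketched as an ordinary least-squares fit of sensor reading to reference temperature. The reading/reference pairs below are invented; real data would come from the temperature calibrator and black cavity.

```python
# Sketch of a narrow-range linear calibration: fit sensor reading -> reference
# temperature and apply the correction. The data pairs below are invented.

def fit_line(x, y):
    """Ordinary least-squares straight line y = b0 + b1*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    return my - b1 * mx, b1

readings  = [20.4, 23.1, 26.0, 28.8, 31.5, 34.3]  # infrared thermocouple, deg C
reference = [20.0, 22.7, 25.6, 28.5, 31.3, 34.1]  # calibrator black cavity, deg C

b0, b1 = fit_line(readings, reference)
corrected = [b0 + b1 * r for r in readings]
max_err = max(abs(c, ) if False else abs(c - t) for c, t in zip(corrected, reference))
print(round(b0, 3), round(b1, 3), round(max_err, 3))
```

For a wider working range, the same least-squares machinery extends to the higher-order polynomial calibrations the study found adequate between 13 °C and 37 °C.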

  13. Quantitative evaluation of six graph based semi-automatic liver tumor segmentation techniques using multiple sets of reference segmentation

    NASA Astrophysics Data System (ADS)

    Su, Zihua; Deng, Xiang; Chefd'hotel, Christophe; Grady, Leo; Fei, Jun; Zheng, Dong; Chen, Ning; Xu, Xiaodong

    2011-03-01

Graph based semi-automatic tumor segmentation techniques have demonstrated great potential in efficiently measuring tumor size from CT images. Comprehensive and quantitative validation is essential to ensure the efficacy of graph based tumor segmentation techniques in clinical applications. In this paper, we present a quantitative validation study of six graph based 3D semi-automatic tumor segmentation techniques using multiple sets of expert segmentation. The six segmentation techniques are Random Walk (RW), Watershed based Random Walk (WRW), LazySnapping (LS), GraphCut (GHC), GrabCut (GBC), and GrowCut (GWC) algorithms. The validation was conducted using clinical CT data of 29 liver tumors and four sets of expert segmentation. The performance of the six algorithms was evaluated using accuracy and reproducibility. The accuracy was quantified using the Normalized Probabilistic Rand Index (NPRI), which takes into account the variation of multiple expert segmentations. The reproducibility was evaluated by the change of the NPRI across 10 different sets of user initializations. Our results from the accuracy test demonstrated that RW (0.63) showed the highest NPRI value, compared to WRW (0.61), GWC (0.60), GHC (0.58), LS (0.57), and GBC (0.27). The results from the reproducibility test indicated that GBC is more sensitive to user initialization than the other five algorithms. Compared to previous tumor segmentation validation studies using one set of reference segmentation, our evaluation methods use multiple sets of expert segmentation to address the inter- or intra-rater variability issue in ground truth annotation, and provide quantitative assessment for comparing different segmentation algorithms.
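The multi-expert accuracy idea above can be illustrated with a deliberately simplified stand-in for the NPRI: a plain Rand index between a candidate segmentation and each expert labeling, averaged over experts. The paper's NPRI additionally normalizes for chance agreement, which this sketch omits; the tiny label maps below are invented.

```python
# Simplified stand-in for a multi-expert accuracy measure: the plain Rand
# index (fraction of pixel pairs on which two labelings agree about being in
# the same region), averaged over several expert segmentations. This omits
# the chance normalization of the NPRI; the label maps are invented.

def rand_index(a, b):
    """Fraction of element pairs on which labelings a and b agree."""
    n = len(a)
    agree = pairs = 0
    for i in range(n):
        for j in range(i + 1, n):
            pairs += 1
            if (a[i] == a[j]) == (b[i] == b[j]):
                agree += 1
    return agree / pairs

candidate = [1, 1, 1, 0, 0, 0]   # algorithm output (1 = tumor)
experts = [
    [1, 1, 1, 0, 0, 0],          # expert 1
    [1, 1, 0, 0, 0, 0],          # expert 2
    [1, 1, 1, 1, 0, 0],          # expert 3
]
scores = [rand_index(candidate, e) for e in experts]
print([round(s, 3) for s in scores], round(sum(scores) / len(scores), 3))
```

Averaging over several experts, rather than scoring against a single reference, is exactly how such a measure absorbs inter-rater disagreement in the ground truth.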

  14. Importance of Purity Evaluation and the Potential of Quantitative 1H NMR as a Purity Assay

    PubMed Central

    2015-01-01

    In any biomedical and chemical context, a truthful description of chemical constitution requires coverage of both structure and purity. This qualification affects all drug molecules, regardless of development stage (early discovery to approved drug) and source (natural product or synthetic). Purity assessment is particularly critical in discovery programs and whenever chemistry is linked with biological and/or therapeutic outcome. Compared with chromatography and elemental analysis, quantitative NMR (qNMR) uses nearly universal detection and provides a versatile and orthogonal means of purity evaluation. Absolute qNMR with flexible calibration captures analytes that frequently escape detection (water, sorbents). Widely accepted structural NMR workflows require minimal or no adjustments to become practical 1H qNMR (qHNMR) procedures with simultaneous qualitative and (absolute) quantitative capability. This study reviews underlying concepts, provides a framework for standard qHNMR purity assays, and shows how adequate accuracy and precision are achieved for the intended use of the material. PMID:25295852
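The absolute qHNMR purity assay described above reduces to the conventional internal-calibrant formula relating signal integrals, proton counts, molar masses, and weighed masses. The formula is standard; the numeric values below (integrals, masses, calibrant purity) are invented for illustration.

```python
# Sketch of the standard absolute qHNMR purity calculation with an internal
# calibrant. The formula is the conventional one; all numbers are invented.

def qhnmr_purity(I_a, I_c, N_a, N_c, M_a, M_c, m_a, m_c, P_c):
    """Analyte purity (mass fraction) from signal integrals (I), protons per
    signal (N), molar masses (M), weighed masses (m), and calibrant purity P_c."""
    return (I_a / I_c) * (N_c / N_a) * (M_a / M_c) * (m_c / m_a) * P_c

purity = qhnmr_purity(
    I_a=0.92, I_c=1.00,    # normalized signal integrals
    N_a=1, N_c=1,          # protons giving rise to each signal
    M_a=288.4, M_c=180.2,  # molar masses, g/mol (hypothetical analyte/calibrant)
    m_a=10.0, m_c=6.0,     # weighed masses, mg
    P_c=0.999,             # certified calibrant purity
)
print(round(purity, 4))
```

Because the NMR response per proton is essentially universal, the same relation applies to any analyte/calibrant pair with resolved signals, which is the orthogonality advantage the review emphasizes.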

  15. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. While not considered in this study, patent data are another important indicator that should be integrated in relevant research in the future. PMID:25079489

  16. Aerobic bioremediation of chlorobenzene source-zone soil in flow-through columns: performance assessment using quantitative PCR.

    PubMed

    Dominguez, Rosa F; da Silva, Marcio L B; McGuire, Travis M; Adamson, David; Newell, Charles J; Alvarez, Pedro J J

    2008-07-01

    Flow-through aquifer columns were operated for 12 weeks to evaluate the benefits of aerobic biostimulation for the bioremediation of source-zone soil contaminated with chlorobenzenes (CBs). Quantitative polymerase chain reaction (qPCR) was used to measure the concentration of total bacteria (16S rRNA gene) and of oxygenase genes involved in the biodegradation of aromatic compounds (i.e., toluene dioxygenase, ring-hydroxylating monooxygenase, naphthalene dioxygenase, phenol hydroxylase, and biphenyl dioxygenase). Monochlorobenzene, which is much more soluble than dichlorobenzenes, was primarily removed by flushing, and biostimulation showed little benefit. In contrast, dichlorobenzene removal was primarily due to biodegradation, and the removal efficiency was much higher in oxygen-amended columns than in a control column. To our knowledge, this is the first report that oxygen addition can enhance CB source-zone soil bioremediation. Analysis by qPCR showed that, whereas the biphenyl and toluene dioxygenase biomarkers were most abundant, increases in the concentration of the phenol hydroxylase gene best reflected the higher dichlorobenzene removal due to aerobic biostimulation. This suggests that quantitative molecular microbial ecology techniques could be useful to assess CB source-zone bioremediation performance.
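
    Functional-gene responses measured by qPCR are commonly compared across treatments with a relative fold-change calculation; the sketch below uses the standard delta-delta-Ct (Livak) relation, which is an illustrative assumption and not necessarily the normalization used in this study:

```python
def fold_change(ct_gene_treated, ct_16s_treated, ct_gene_control, ct_16s_control,
                efficiency=2.0):
    """Fold change of a functional gene, normalized to total bacteria (16S rRNA).

    Assumes the same amplification efficiency (default: perfect doubling per
    cycle) for the target and reference assays.
    """
    ddct = (ct_gene_treated - ct_16s_treated) - (ct_gene_control - ct_16s_control)
    return efficiency ** -ddct
```

    A gene whose Ct drops four cycles relative to the reference corresponds to a 16-fold enrichment under the doubling assumption.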

  17. Quantitative Evaluation of Vascularity Using 2-D Power Doppler Ultrasonography May Not Identify Malignancy of the Thyroid.

    PubMed

    Yoon, Jung Hyun; Shin, Hyun Joo; Kim, Eun-Kyung; Moon, Hee Jung; Roh, Yun Ho; Kwak, Jin Young

    2015-11-01

    The purpose of this study was to evaluate the usefulness of a quantitative vascular index in predicting thyroid malignancy. A total of 1309 thyroid nodules in 1257 patients (mean age: 50.2 y, range: 18-83 y) were included. The vascularity pattern and vascular index (VI) measured by quantification software for each nodule were obtained from 2-D power Doppler ultrasonography (US). Gray-scale US + vascularity pattern was compared with gray-scale US + VI with respect to diagnostic performance. Of the 1309 thyroid nodules, 927 (70.8%) were benign and 382 (29.2%) were malignant. The area under the receiver operating characteristics curve (Az) for gray-scale US (0.82) was significantly higher than that for US combined with vascularity pattern (0.77) or VI (0.70, all p < 0.001). Quantified VIs were higher in benign nodules, but did not improve the performance of 2-D US in diagnosing thyroid malignancy.

  18. Evaluation of absolute quantitation by nonlinear regression in probe-based real-time PCR

    PubMed Central

    Goll, Rasmus; Olsen, Trine; Cui, Guanglin; Florholmen, Jon

    2006-01-01

    Background In real-time PCR data analysis, the cycle threshold (CT) method is currently the gold standard. This method is based on an assumption of equal PCR efficiency in all reactions, and precision may suffer if this condition is not met. Nonlinear regression analysis (NLR), or curve fitting, has therefore been suggested as an alternative to the cycle threshold method for absolute quantitation. The advantages of NLR are that the individual sample efficiency is simulated by the model and that absolute quantitation is possible without a standard curve, releasing reaction wells for unknown samples. However, the calculation method has not been evaluated systematically and has not previously been applied to a TaqMan platform. Aim: To develop and evaluate an automated NLR algorithm capable of generating batch production regression analysis. Results Total RNA samples extracted from human gastric mucosa were reverse transcribed and analysed for TNFA, IL18 and ACTB by TaqMan real-time PCR. Fluorescence data were analysed by the regular CT method with a standard curve, and by NLR with a positive control for conversion of fluorescence intensity to copy number; for this purpose an automated algorithm was written in SPSS syntax. Eleven separate regression models were tested, and the output data were subjected to Altman-Bland analysis. The Altman-Bland analysis showed that the best regression model yielded quantitative data with an intra-assay variation of 58% vs. 24% for the CT-derived copy numbers, and with a mean inter-method deviation factor of 0.8. Conclusion NLR can be automated for batch production analysis, but the CT method is more precise for absolute quantitation in the present setting. The observed inter-method deviation is an indication that assessment of the fluorescence conversion factor used in the regression method can be improved. However, the versatility depends on the level of precision required, and in some settings the increased cost effectiveness of NLR
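
    The contrast drawn here, a fixed-threshold CT readout versus per-reaction efficiency, can be illustrated with a minimal sketch (our own simplified helpers, not the authors' SPSS algorithm):

```python
def ct_value(fluorescence, threshold):
    """Cycle threshold: first crossing of `threshold`, linearly interpolated.

    fluorescence[i] is the signal at cycle i + 1 (cycles are 1-based).
    """
    for i in range(1, len(fluorescence)):
        lo, hi = fluorescence[i - 1], fluorescence[i]
        if lo < threshold <= hi:
            return i + (threshold - lo) / (hi - lo)
    return None  # never crossed the threshold

def efficiency(fluorescence, c1, c2):
    """Per-reaction amplification efficiency from the log-linear phase:
    F(c2)/F(c1) = E**(c2 - c1), so E = (F2/F1)**(1/(c2 - c1))."""
    f1, f2 = fluorescence[c1 - 1], fluorescence[c2 - 1]
    return (f2 / f1) ** (1.0 / (c2 - c1))
```

    On an ideal doubling curve the recovered efficiency is exactly 2.0; the CT method implicitly assumes this value holds for every well, which is the assumption NLR relaxes.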

  19. Quantitative evaluation of mask phase defects from through-focus EUV aerial images

    SciTech Connect

    Mochi, Iacopo; Yamazoe, Kenji; Neureuther, Andrew; Goldberg, Kenneth A.

    2011-02-21

    Mask defect inspection and imaging is one of the most important issues for any pattern transfer lithography technology. This is especially true for EUV lithography, where the wavelength-specific properties of masks and defects necessitate actinic inspection for a faithful prediction of defect printability and repair performance. In this paper we present a technique to obtain a quantitative characterization of mask phase defects from EUV aerial images. We apply this technique to measure the aerial image phase of native defects on a blank mask, measured with the SEMATECH Berkeley Actinic Inspection Tool (AIT), an EUV zoneplate microscope that operates at Lawrence Berkeley National Laboratory. The measured phase is compared with predictions made from AFM top-surface measurements of those defects. While amplitude defects are usually easy to recognize and quantify with standard inspection techniques like scanning electron microscopy (SEM), defects or structures that have a phase component can be much more challenging to inspect. A phase defect can originate from the substrate or from any level of the multilayer. In both cases its effect on the reflected field is not directly related to the local topography of the mask surface, but depends on the deformation of the multilayer structure. Using the AIT, we have previously shown that EUV inspection provides a faithful and reliable way to predict the appearance of mask defects on the printed wafer; but to obtain a complete characterization of a defect we need to evaluate its phase component quantitatively. While aerial imaging doesn't provide a direct measurement of the phase of the object, this information is encoded in the through-focus evolution of the image intensity distribution. Recently we developed a technique that allows us to extract the complex amplitude of EUV mask defects using two aerial images from different focal planes. The method for the phase reconstruction is derived from the Gerchberg-Saxton (GS

  20. Identification and quantitation of asparagine and citrulline using high-performance liquid chromatography (HPLC).

    PubMed

    Bai, Cheng; Reilly, Charles C; Wood, Bruce W

    2007-01-01

    High-performance liquid chromatography (HPLC) analysis was used for the identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-eluting asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone elutes earlier than citrulline alone, but when both are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine-to-citrulline ratios [3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (microMol ml(-1)/microMol ml(-1))], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects the relative concentrations of the Urea Cycle intermediates asparagine and citrulline present in sap. Consequently, the HPLC method presented enables qualitative and quantitative analysis of these metabolically important ureides. PMID:19662174

  1. Genetic algorithm based image binarization approach and its quantitative evaluation via pooling

    NASA Astrophysics Data System (ADS)

    Hu, Huijun; Liu, Ya; Liu, Maofu

    2015-12-01

    The binarized image is critical to image visual feature extraction, especially shape features, and image binarization approaches have attracted much attention in the past decades. In this paper, a genetic algorithm is applied to optimize the binarization threshold of strip steel defect images. To evaluate our genetic algorithm based image binarization approach quantitatively, we propose a novel pooling-based evaluation metric, motivated by the information retrieval community, to address the lack of ground-truth binary images. Experimental results show that our genetic algorithm based binarization approach is effective and efficient on strip steel defect images, and that our quantitative evaluation metric for image binarization via pooling is feasible and practical.
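
    A minimal sketch of a genetic algorithm searching for a binarization threshold, using Otsu's between-class variance as the fitness function (an assumed fitness for illustration; the paper's actual objective for strip steel defect images may differ):

```python
import random

def between_class_variance(hist, t):
    """Otsu's criterion: weighted variance between the two classes split at t."""
    total = sum(hist)
    w0 = sum(hist[:t])
    w1 = total - w0
    if w0 == 0 or w1 == 0:
        return 0.0
    mu0 = sum(i * hist[i] for i in range(t)) / w0
    mu1 = sum(i * hist[i] for i in range(t, len(hist))) / w1
    return w0 * w1 * (mu0 - mu1) ** 2

def ga_threshold(hist, pop_size=20, generations=30, seed=0):
    """Evolve a population of candidate thresholds toward maximal fitness."""
    rng = random.Random(seed)
    levels = len(hist)
    pop = [rng.randrange(1, levels) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: between_class_variance(hist, t), reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = (a + b) // 2                  # crossover: parent midpoint
            if rng.random() < 0.3:                # mutation: small random step
                child = min(max(child + rng.randint(-5, 5), 1), levels - 1)
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda t: between_class_variance(hist, t))
```

    On a clearly bimodal histogram the evolved threshold falls in the valley between the two modes, where any choice yields the same between-class variance.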

  2. Quantitative analysis of topoisomerase IIalpha to rapidly evaluate cell proliferation in brain tumors.

    PubMed

    Oda, Masashi; Arakawa, Yoshiki; Kano, Hideyuki; Kawabata, Yasuhiro; Katsuki, Takahisa; Shirahata, Mitsuaki; Ono, Makoto; Yamana, Norikazu; Hashimoto, Nobuo; Takahashi, Jun A

    2005-06-17

    Immunohistochemical cell proliferation analyses have come into wide use for the evaluation of tumor malignancy. Topoisomerase IIalpha (topo IIalpha), an essential nuclear enzyme, is known to have cell cycle-coupled expression. We here show the usefulness of quantitative analysis of topo IIalpha mRNA for rapidly evaluating cell proliferation in brain tumors. A protocol to quantify topo IIalpha mRNA was developed with real-time RT-PCR; quantification from a specimen took only 3 h. A total of 28 brain tumors were analyzed, and the level of topo IIalpha mRNA was significantly correlated with its immunostaining index (p<0.0001, r=0.9077). Furthermore, the assay sharply detected the decrease of topo IIalpha mRNA in growth-inhibited glioma cells. These results suggest that topo IIalpha mRNA may be a good and rapid indicator for evaluating the proliferative potential of brain tumors.

  3. Space Suit Performance: Methods for Changing the Quality of Quantitative Data

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew; Benson, Elizabeth; Rajulu, Sudhakar

    2014-01-01

    NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. To verify that new suits will enable astronauts to perform to their maximum capacity, prototype suits must be built and tested with human subjects. However, engineers and flight surgeons often have difficulty understanding and applying traditional representations of human data without training. To overcome these challenges, NASA is developing modern simulation and analysis techniques that focus on 3D visualization. Understanding actual performance early in the design cycle is extremely advantageous for increasing performance capabilities, reducing the risk of injury, and reducing costs. The primary objective of this project was to test modern simulation and analysis techniques for evaluating the performance of a human operating in extra-vehicular space suits.

  4. 48 CFR 8.406-7 - Contractor Performance Evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Performance Evaluation. Ordering activities must prepare an evaluation of contractor performance for each... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Contractor Performance Evaluation. 8.406-7 Section 8.406-7 Federal Acquisition Regulations System FEDERAL ACQUISITION...

  5. 48 CFR 1552.209-76 - Contractor performance evaluations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 1552.209-76 Contractor performance evaluations. As prescribed in section 1509.170-1, insert the following clause in all applicable solicitations and contracts. Contractor Performance Evaluations (OCT 2002... compliance with safety standards performance categories if deemed appropriate for the evaluation or...

  6. 10 CFR 1045.9 - RD classification performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false RD classification performance evaluation. 1045.9 Section... classification performance evaluation. (a) Heads of agencies shall ensure that RD management officials and those... RD or FRD documents shall have their personnel performance evaluated with respect to...

  7. 24 CFR 570.491 - Performance and evaluation report.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 3 2010-04-01 2010-04-01 false Performance and evaluation report... Development Block Grant Program § 570.491 Performance and evaluation report. The annual performance and evaluation report shall be submitted in accordance with 24 CFR part 91. (Approved by the Office of...

  8. 24 CFR 570.491 - Performance and evaluation report.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Development Block Grant Program § 570.491 Performance and evaluation report. The annual performance and evaluation report shall be submitted in accordance with 24 CFR part 91. (Approved by the Office of Management... 24 Housing and Urban Development 3 2011-04-01 2010-04-01 true Performance and evaluation...

  9. Quantitative determination of tilmicosin in canine serum by high performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Herrera, Michael; Ding, Haiqing; McClanahan, Robert; Owens, Jane G; Hunter, Robert P

    2007-09-15

    A highly sensitive and quantitative LC/MS/MS assay for the determination of tilmicosin in serum has been developed and validated. For sample preparation, 0.2 mL of canine serum was extracted with 3 mL of methyl tert-butyl ether. The organic layer was transferred to a new vessel and dried under nitrogen. The sample was then reconstituted for analysis by high performance liquid chromatography-tandem mass spectrometry. A Phenomenex Luna C8(2) analytical column was used for the chromatographic separation. The eluent was subsequently introduced to the mass spectrometer by electrospray ionization. A single range of 50-5000 ng/mL was validated to support toxicokinetic studies. The inter-day relative error (inaccuracy) for the LLOQ samples ranged from -5.5% to 0.3%. The inter-day relative standard deviations (imprecision) at the respective LLOQ levels were ≤10.1%.
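
    The accuracy and precision figures quoted here follow standard bioanalytical definitions, sketched below (generic helpers, not the study's software):

```python
def relative_error(measured_mean, nominal):
    """Inaccuracy (%RE): bias of the measured mean relative to the nominal value."""
    return 100.0 * (measured_mean - nominal) / nominal

def rsd(values):
    """Imprecision (%RSD): sample standard deviation over the mean, as a percent."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean
```

    For instance, an LLOQ replicate mean of 47.25 ng/mL against a nominal 50 ng/mL gives a relative error of -5.5%, matching the lower end of the range reported above.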

  10. A methodology to quantitatively evaluate the safety of a glazing robot.

    PubMed

    Lee, Seungyeol; Yu, Seungnam; Choi, Junho; Han, Changsoo

    2011-03-01

    A new construction method using robots is spreading widely among construction sites in order to overcome labour shortages and frequent construction accidents. Along with economic efficiency, safety is a very important factor in evaluating the use of construction robots on construction sites; however, safety is difficult to evaluate quantitatively compared with economic efficiency. In this study, we proposed a safety evaluation methodology based on two risk factors: the 'worker' factor, defined as posture load, and the 'work conditions' factor, defined as the work environment and the risk exposure time. The posture load evaluation reflects the risk of musculoskeletal disorders that can be caused by work posture and the risk of accidents that can be caused by reduced concentration. We evaluated the risk factors that may cause accidents such as falling, colliding, capsizing, and squeezing in work environments, and evaluated the operational risk by considering worker exposure time to risky work environments. With the results of the evaluations for each factor, we calculated the overall operational risk and deduced the improvement ratio in operational safety from introducing a construction robot. To verify these results, we compared the safety of the existing manual labour method and the proposed robotic method for manipulating large glass panels.

  11. Evaluating a Performance-Ideal vs. Great Performance

    ERIC Educational Resources Information Center

    Bar-Elli, Gilead

    2004-01-01

    Based on a conception in which a musical composition determines aesthetic-normative properties, a distinction is drawn between two notions of performance: the "autonomous", in which a performance is regarded as a musical work on its own, and the "intentionalistic", in which it is regarded as essentially of a particular work. An ideal…

  12. Flexible pavement performance evaluation using deflection criteria

    NASA Astrophysics Data System (ADS)

    Wedner, R. J.

    1980-04-01

    Flexible pavement projects in Nebraska were monitored for dynamic deflections, roughness, and distress for six consecutive years. Present surface conditions were characterized, and data for evaluating rehabilitation needs, including the amount of overlay, were provided. The data were evaluated and factors were isolated for determining the structural adequacy of flexible pavements, evaluating existing pavement strength and soil subgrade conditions, and determining overlay thickness requirements. Terms for evaluating structural condition for pavement sufficiency ratings were developed, and existing soil support value and subgrade strength province maps were evaluated.

  13. Quantitative determination of triterpenoid glycosides in Fatsia japonica Decne. & Planch. using high performance liquid chromatography.

    PubMed

    Ye, Xuewei; Yu, Siran; Lian, Xiao-Yuan; Zhang, Zhizhen

    2014-01-01

    Fatsia japonica Decne. & Planch. is a triterpenoid glycoside-rich herb with anti-inflammatory activity used for the treatment of rheumatoid arthritis. A method for quantitative analysis of the complex triterpenoid glycosides in this medicinal plant has not been established so far. In this study, a high performance liquid chromatography (HPLC) method was developed for the simultaneous quantification of 11 glycosides in F. japonica. The analysis was performed on an ODS-2 Hypersil column (250 mm × 4.6 mm, 5 μm) with a binary gradient mobile phase of water and acetonitrile. The established HPLC method was validated in terms of linearity, sensitivity, stability, precision, accuracy, and recovery. Results showed that the method had good linearity, with R(2) of 0.99992-0.99999 over the test range of 0.04-9.00 μg/μL. The limits of detection (LOD) and limits of quantification (LOQ) for the standard compounds were 0.013-0.020 μg/μL and 0.040-0.060 μg/μL, respectively. The relative standard deviations (RSD%) of run variations were 0.83-1.40% for intra-day and 0.84-3.59% for inter-day measurements. The analyzed compounds in the samples were stable for at least 36 h, and the spike recoveries of the detected glycosides were 99.67-103.11%. The developed HPLC method was successfully applied to measure the contents of the 11 triterpenoid glycosides in different parts of F. japonica. Taken together, the newly developed HPLC method could be used for qualitative and quantitative analysis of the bioactive triterpenoid glycosides in F. japonica and its products.
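
    LOD and LOQ for a chromatographic method are commonly derived from the calibration curve with the ICH 3.3σ/S and 10σ/S relations; the sketch below assumes that convention, which the abstract does not explicitly state:

```python
def lod_loq(sigma, slope):
    """ICH-style detection limits from a calibration curve.

    sigma: standard deviation of the response (e.g. of the blank or of the
           calibration intercept), slope: calibration-curve slope.
    Returns (LOD, LOQ) in concentration units.
    """
    return 3.3 * sigma / slope, 10.0 * sigma / slope
```

    With a response standard deviation of 1.0 and a calibration slope of 100 (hypothetical numbers), this yields LOD = 0.033 and LOQ = 0.1 in the concentration units of the curve.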

  14. Clinical value of real-time elastography quantitative parameters in evaluating the stage of liver fibrosis and cirrhosis

    PubMed Central

    GE, LAN; SHI, BAOMIN; SONG, YE; LI, YUAN; WANG, SHUO; WANG, XIUYAN

    2015-01-01

    The aim of the present study was to assess the value of real-time elastography (RTE) quantitative parameters, namely the liver fibrosis (LF) index and the ratio of blue area (%AREA), in evaluating the stage of liver fibrosis. RTE quantitative analysis software was used to examine 120 patients with chronic hepatitis in order to obtain the values of 12 quantitative parameters from the elastic images. The diagnostic performance of two such parameters, the LF index and %AREA, was assessed with a receiver operating characteristic (ROC) curve to determine the optimal diagnostic cut-off values for liver cirrhosis and fibrosis. Both the LF index and %AREA correlated well with fibrosis stage. The areas under the ROC curve for the LF index were 0.985 for the diagnosis of liver cirrhosis and 0.790 for liver fibrosis. With regard to %AREA, the areas under the ROC curve for the diagnosis of liver cirrhosis and fibrosis were 0.963 and 0.770, respectively. An LF index of >3.25 and a %AREA of >28.83 for the diagnosis of the cirrhosis stage resulted in sensitivity values of 100 and 100%, specificity values of 88.9 and 85.9% and accuracy values of 90.8 and 88.3%, respectively. Both parameters were more reliable for the diagnosis of liver cirrhosis than for the staging of liver fibrosis, while the two parameters showed similar efficacy to each other in both tasks. Therefore, the quantitative RTE parameters of the LF index and %AREA may be clinically applicable as reliable indices for the early diagnosis of liver cirrhosis, without the requirement of an invasive procedure. PMID:26622426
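
    Diagnostic cut-off values like the LF index >3.25 are typically chosen from the ROC curve; below is a minimal pure-Python sketch of the AUC and a Youden-index cutoff (generic helpers, not the study's statistical software):

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    pos = [s for s, l in zip(scores, labels) if l]
    neg = [s for s, l in zip(scores, labels) if not l]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def youden_cutoff(scores, labels):
    """Cutoff maximizing Youden's J = sensitivity + specificity - 1.

    A case is called positive when its score >= cutoff.
    """
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    best_c, best_j = None, -1.0
    for c in sorted(set(scores)):
        tp = sum(1 for s, l in zip(scores, labels) if l and s >= c)
        fp = sum(1 for s, l in zip(scores, labels) if not l and s >= c)
        j = tp / n_pos - fp / n_neg
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j
```

    On perfectly separated toy data the AUC is 1.0 and the Youden cutoff sits at the lowest positive-class score.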

  15. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  16. Evaluation of ViroCyt® Virus Counter for rapid filovirus quantitation.

    PubMed

    Rossi, Cynthia A; Kearney, Brian J; Olschner, Scott P; Williams, Priscilla L; Robinson, Camenzind G; Heinrich, Megan L; Zovanyi, Ashley M; Ingram, Michael F; Norwood, David A; Schoepp, Randal J

    2015-03-01

    Development and evaluation of medical countermeasures for diagnostics, vaccines, and therapeutics requires production of standardized, reproducible, and well characterized virus preparations. For filoviruses this includes plaque assay for quantitation of infectious virus, transmission electron microscopy (TEM) for morphology and quantitation of virus particles, and real-time reverse transcription PCR for quantitation of viral RNA (qRT-PCR). The ViroCyt® Virus Counter (VC) 2100 (ViroCyt, Boulder, CO, USA) is a flow-based instrument capable of quantifying virus particles in solution. Using a proprietary combination of fluorescent dyes that stain both nucleic acid and protein in a single 30 min step, rapid, reproducible, and cost-effective quantification of filovirus particles was demonstrated. Using a seed stock of Ebola virus variant Kikwit, the linear range of the instrument was determined to be 2.8E+06 to 1.0E+09 virus particles per mL with coefficient of variation ranging from 9.4% to 31.5% for samples tested in triplicate. VC particle counts for various filovirus stocks were within one log of TEM particle counts. A linear relationship was established between the plaque assay, qRT-PCR, and the VC. VC results significantly correlated with both plaque assay and qRT-PCR. These results demonstrated that the VC is an easy, fast, and consistent method to quantify filoviruses in stock preparations. PMID:25710889

  17. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
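
    The stiffness measure described, the slope of the moment-displacement curve, is an ordinary least-squares fit; a minimal sketch (our own helper, with hypothetical data):

```python
def stiffness(displacements, moments):
    """Slope of the moment-displacement curve by ordinary least squares."""
    n = len(displacements)
    mx = sum(displacements) / n
    my = sum(moments) / n
    sxx = sum((x - mx) ** 2 for x in displacements)
    sxy = sum((x - mx) * (y - my) for x, y in zip(displacements, moments))
    return sxy / sxx
```

    A segment whose moment rises 2 N·mm per mm of displacement reports a stiffness of 2.0, so fused (stiffer) specimens can be ranked on a continuous scale rather than a binary palpation call.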

  18. Experimental Evaluation of Quantitative Diagnosis Technique for Hepatic Fibrosis Using Ultrasonic Phantom

    NASA Astrophysics Data System (ADS)

    Koriyama, Atsushi; Yasuhara, Wataru; Hachiya, Hiroyuki

    2012-07-01

    Since clinical diagnosis using ultrasonic B-mode images depends on the skill of the doctor, a quantitative diagnosis method using the ultrasound echo signal is highly desirable. We have been investigating a quantitative diagnosis technique, mainly for hepatic disease. In this paper, we present basic experimental results on the accuracy of the proposed quantitative diagnosis technique for hepatic fibrosis, obtained using a simple ultrasonic phantom. By placing a region of interest across the boundary between two scatterer areas with different densities in the phantom, we can simulate the change of the echo amplitude distribution from normal tissue to fibrotic tissue in liver disease. The probability density function is well approximated by our fibrosis distribution model, a mixture of normal and fibrotic tissue components. The fibrosis parameters of the amplitude distribution model can be estimated relatively well at mixture rates from 0.2 to 0.6. In the inversion processing, the standard deviation of the estimated fibrosis results at mixture ratios below 0.2 and above 0.6 is relatively large. Although the probability density is not large at high amplitudes, the estimated variance ratio and mixture rate of the model are strongly affected by higher-amplitude data.
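
    The amplitude-distribution model described is a two-component mixture; the sketch below assumes Rayleigh-distributed speckle amplitude for each component, a common simplification that may differ from the authors' exact model:

```python
import math

def rayleigh_pdf(x, sigma):
    """Amplitude distribution of fully developed speckle (Rayleigh model)."""
    return (x / sigma ** 2) * math.exp(-x ** 2 / (2 * sigma ** 2))

def mixture_pdf(x, mix_rate, sigma_normal, sigma_fibrotic):
    """Echo-amplitude model for a region spanning normal and fibrotic tissue:
    a weighted sum of the two component distributions, weighted by the
    fibrotic mixture rate."""
    return ((1.0 - mix_rate) * rayleigh_pdf(x, sigma_normal)
            + mix_rate * rayleigh_pdf(x, sigma_fibrotic))
```

    Fitting `mix_rate` to the observed amplitude histogram is then the inversion step the abstract evaluates; at mix_rate = 0 the model collapses to the pure normal-tissue distribution.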

  19. Comparison of Diagnostic Performance of Semi-Quantitative Knee Ultrasound and Knee Radiography with MRI: Oulu Knee Osteoarthritis Study

    PubMed Central

    Podlipská, Jana; Guermazi, Ali; Lehenkari, Petri; Niinimäki, Jaakko; Roemer, Frank W.; Arokoski, Jari P.; Kaukinen, Päivi; Liukkonen, Esa; Lammentausta, Eveliina; Nieminen, Miika T.; Tervonen, Osmo; Koski, Juhani M.; Saarakkala, Simo

    2016-01-01

    Osteoarthritis (OA) is a common degenerative musculoskeletal disease highly prevalent in aging societies worldwide. Traditionally, knee OA is diagnosed using conventional radiography. However, structural changes of articular cartilage or menisci cannot be directly evaluated using this method. On the other hand, ultrasound is a promising tool able to provide direct information on soft tissue degeneration. The aim of our study was to systematically determine the site-specific diagnostic performance of semi-quantitative ultrasound grading of knee femoral articular cartilage, osteophytes and meniscal extrusion, and of radiographic assessment of joint space narrowing and osteophytes, using MRI as a reference standard. Eighty asymptomatic and 79 symptomatic subjects with mean age of 57.7 years were included in the study. Ultrasound performed best in the assessment of femoral medial and lateral osteophytes, and medial meniscal extrusion. In comparison to radiography, ultrasound performed better or at least equally well in identification of tibio-femoral osteophytes, medial meniscal extrusion and medial femoral cartilage morphological degeneration. Ultrasound provides relevant additional diagnostic information on tissue-specific morphological changes not depicted by conventional radiography. Consequently, the use of ultrasound as a complementary imaging tool along with radiography may enable more accurate and cost-effective diagnostics of knee osteoarthritis at the primary healthcare level. PMID:26926836

  1. A genome-wide quantitative trait loci scan of neurocognitive performances in families with schizophrenia.

    PubMed

    Lien, Y-J; Liu, C-M; Faraone, S V; Tsuang, M T; Hwu, H-G; Hsiao, P-C; Chen, W J

    2010-10-01

    Patients with schizophrenia frequently display neurocognitive dysfunction, and genetic studies suggest it to be an endophenotype for schizophrenia. Genetic studies of such traits may thus help elucidate the biological pathways underlying genetic susceptibility to schizophrenia. This study aimed to identify loci influencing neurocognitive performance in schizophrenia. The sample comprised 1207 affected individuals and 1035 unaffected individuals of Han Chinese ethnicity from 557 sib-pair families co-affected with DSM-IV (Diagnostic and Statistical Manual, Fourth Edition) schizophrenia. Subjects completed a face-to-face semi-structured interview, the continuous performance test (CPT) and the Wisconsin card sorting test (WCST), and were genotyped with 386 microsatellite markers across the genome. A series of autosomal genome-wide multipoint nonparametric quantitative trait loci (QTL) linkage analyses was performed in affected individuals only. Genome-wide empirical significance was determined using 1000 simulated genome scans. One linkage peak attaining genome-wide significance was identified: 12q24.32 for the undegraded CPT hit rate [nonparametric linkage z (NPL-Z) score = 3.32, genome-wide empirical P = 0.03]. This signal was higher than the peak linkage signal obtained in the previous genome-wide scan using a dichotomous diagnosis of schizophrenia. The 12q24.32 locus has not been consistently implicated in previous linkage studies of schizophrenia, which suggests that the analysis of endophenotypes provides information beyond that obtained from analyses relying on diagnosis alone. This region, linked to a particular neurocognitive feature, may inform functional hypotheses for further genetic studies of schizophrenia.

  2. EVALUATION OF QUANTITATIVE REAL TIME PCR FOR THE MEASUREMENT OF HELICOBACTER PYLORI AT LOW CONCENTRATIONS IN DRINKING WATER

    EPA Science Inventory

    Aims: To determine the performance of a rapid, real-time polymerase chain reaction (PCR) method for the detection and quantitative analysis of Helicobacter pylori at low concentrations in drinking water.

    Methods and Results: A rapid DNA extraction and quantitative PCR (QPCR)...

  3. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.
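    The "embedded wall clock timers" of the manual approach can be sketched with stdlib tools alone. This is illustrative, not GYRO's actual instrumentation, and the region name is hypothetical:

```python
# Minimal region timer in the spirit of embedded wall-clock timers:
# accumulate elapsed time per named code region, then report totals.
import time
from collections import defaultdict
from contextlib import contextmanager

totals = defaultdict(float)

@contextmanager
def region(name):
    t0 = time.perf_counter()
    try:
        yield
    finally:
        totals[name] += time.perf_counter() - t0

with region("solve"):                      # stand-in for a solver step
    sum(i * i for i in range(100_000))

print(f"solve: {totals['solve']:.4f} s")
```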

  4. Quantitative evaluation study of four-dimensional gated cardiac SPECT reconstruction †

    PubMed Central

    Jin, Mingwu; Yang, Yongyi; Niu, Xiaofeng; Marin, Thibault; Brankov, Jovan G.; Feng, Bing; Pretorius, P. Hendrik; King, Michael A.; Wernick, Miles N.

    2013-01-01

    In practice gated cardiac SPECT images suffer from a number of degrading factors, including distance-dependent blur, attenuation, scatter, and increased noise due to gating. Recently we proposed a motion-compensated approach for four-dimensional (4D) reconstruction for gated cardiac SPECT, and demonstrated that use of motion-compensated temporal smoothing could be effective for suppressing the increased noise due to lowered counts in individual gates. In this work we further develop this motion-compensated 4D approach by also taking into account attenuation and scatter in the reconstruction process, which are two major degrading factors in SPECT data. In our experiments we conducted a thorough quantitative evaluation of the proposed 4D method using Monte Carlo simulated SPECT imaging based on the 4D NURBS-based cardiac-torso (NCAT) phantom. In particular we evaluated the accuracy of the reconstructed left ventricular myocardium using a number of quantitative measures including regional bias-variance analyses and wall intensity uniformity. The quantitative results demonstrate that use of motion-compensated 4D reconstruction can improve the accuracy of the reconstructed myocardium, which in turn can improve the detectability of perfusion defects. Moreover, our results reveal that while traditional spatial smoothing could be beneficial, its merit would become diminished with the use of motion-compensated temporal regularization. As a preliminary demonstration, we also tested our 4D approach on patient data. The reconstructed images from both simulated and patient data demonstrated that our 4D method can improve the definition of the LV wall. PMID:19724094
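    The regional bias-variance analysis mentioned above can be sketched as follows, with illustrative numbers rather than the NCAT results: bias is the error of the mean regional estimate across noise realizations, and variance is its spread.

```python
# Regional bias-variance sketch: given several noise realizations of a
# reconstructed region and the known phantom truth, compute the bias of
# the mean regional estimate and the variance across realizations.
def bias_variance(regional_estimates, truth):
    n = len(regional_estimates)
    mean = sum(regional_estimates) / n
    bias = mean - truth
    variance = sum((x - mean) ** 2 for x in regional_estimates) / n
    return bias, variance

# Hypothetical mean myocardial intensities from five noise realizations
estimates = [0.92, 0.95, 0.91, 0.94, 0.93]
truth = 1.0
b, v = bias_variance(estimates, truth)
print(f"bias={b:.3f}, variance={v:.5f}")
```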

  5. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  6. Evaluation of green coffee beans quality using near infrared spectroscopy: a quantitative approach.

    PubMed

    Santos, João Rodrigo; Sarraguça, Mafalda C; Rangel, António O S S; Lopes, João A

    2012-12-01

    Characterisation of coffee quality based on bean quality assessment is associated with the relative amount of defective beans among non-defective beans. It is therefore important to develop a methodology capable of identifying the presence of defective beans that enables a fast assessment of coffee grade and that can become an analytical tool to standardise coffee quality. In this work, a methodology for quality assessment of green coffee based on near infrared spectroscopy (NIRS) is proposed. NIRS is a green chemistry, low cost, fast response technique without the need of sample processing. The applicability of NIRS was evaluated for Arabica and Robusta varieties from different geographical locations. Partial least squares regression was used to relate the NIR spectrum to the mass fraction of defective and non-defective beans. Relative errors around 5% show that NIRS can be a valuable analytical tool to be used by coffee roasters, enabling a simple and quantitative evaluation of green coffee quality in a fast way.
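    The calibration idea, relating a spectral signal to the mass fraction of defective beans, can be sketched in stdlib Python. The paper uses partial least squares regression on full NIR spectra; a one-band linear calibration on synthetic data stands in here purely to illustrate the workflow:

```python
# Stand-in for the paper's PLS calibration: fit a one-band linear model
# by least squares on synthetic data, then predict a held-out sample.
import random

random.seed(0)
# Synthetic calibration set: absorbance at a defect-sensitive band rises
# linearly with the defect mass fraction, plus measurement noise.
fractions = [0.05 + 0.25 * i / 39 for i in range(40)]
absorbance = [0.1 + 2.0 * f + random.gauss(0, 0.01) for f in fractions]

n = len(fractions)
mean_x = sum(absorbance) / n
mean_y = sum(fractions) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(absorbance, fractions))
         / sum((x - mean_x) ** 2 for x in absorbance))
intercept = mean_y - slope * mean_x

def predict(a):
    return intercept + slope * a

# Relative error on a held-out synthetic sample
true_f = 0.20
est = predict(0.1 + 2.0 * true_f)
print(f"predicted fraction: {est:.3f} (true {true_f})")
```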

  7. Quantitative contribution of resistance sources of components to stack performance for planar solid oxide fuel cells

    NASA Astrophysics Data System (ADS)

    Jin, Le; Guan, Wanbing; Ma, Xiao; Zhai, Huijuan; Wang, Wei Guo

    2014-05-01

    This study identifies the resistance sources that influence the stack performance of SOFCs with the composition Ni-YSZ/YSZ/LSC-YSZ and investigates how the resistances of the stack repeating unit (SRU) vary during operation, together with their quantitative contributions to its performance at 700 °C, 750 °C and 800 °C. The results indicate that when the cell cathode contacts the interconnect well, the cell resistance accounts for 70.1-79.7% of that of the SRU, and the contact resistance (CR) between the cathode current-collecting layer (CCCL) and the interconnect accounts for 20.0-28.9%. The CR between the anode current-collecting layer (ACCL) and the interconnect, together with the resistance of the interconnect, can be neglected during instantaneous I-V testing. When the stack is discharged at constant current for 600 h, the cell resistance increases by 28.3%, accounting for 93.3% of the SRU degradation; the anodic CR increases by 36.4%, accounting for 6.7% of the SRU degradation; and the resistances of the cathode contact and its neighboring interconnect remain unchanged. Therefore, the increase in cell resistance is the main cause of SRU degradation, while the anodic contact is a secondary factor that cannot be neglected during stable operation.
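    The quantitative contributions reported above are shares of the total SRU resistance. With hypothetical component values (a sketch, not the study's data), the bookkeeping looks like this:

```python
# Express each component resistance of a stack repeating unit (SRU) as a
# percentage of the SRU total, as the study does.
def contributions(resistances):
    total = sum(resistances.values())
    return {name: 100.0 * r for name, r in
            ((k, v / total) for k, v in resistances.items())}

# Hypothetical milliohm values chosen so the cell carries 75% of the SRU
sru = {"cell": 150.0, "cathode_contact": 45.0,
       "anode_contact": 3.0, "interconnect": 2.0}
shares = contributions(sru)
print({k: round(v, 1) for k, v in shares.items()})
```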

  8. Quantitative evaluation of an experimental inflammation induced with Freund's complete adjuvant in dogs.

    PubMed

    Botrel, M A; Haak, T; Legrand, C; Concordet, D; Chevalier, R; Toutain, P L

    1994-10-01

    A chronic inflammation model was induced in dogs by intraarticular injection of Freund's Complete Adjuvant into the stifle. After a primary, acute response during the first 24 hr, a secondary subacute response was observed after a delay of approximately 3 weeks and persisted for several weeks. To evaluate the time course of the inflammatory process quantitatively, we tested more than 100 different parameters. Four parameters were finally selected on the basis of practicability and metrological properties, namely body temperature, difference in skin temperature, difference in stifle diameter, and the vertical force exerted by the arthritic hind limb measured using a force plate. The main result of the experiment was the demonstration that these four parameters were sufficiently repeatable, reproducible, and appropriate for quantitative evaluation of the inflammatory process, and that training of both animals and investigators was required. Finally, it was shown that adjuvant periarthritis in dogs can be used to carry out pharmacokinetic/pharmacodynamic modelling of an antiinflammatory drug. PMID:7865864

  9. Highly sensitive and quantitative evaluation of the EGFR T790M mutation by nanofluidic digital PCR.

    PubMed

    Iwama, Eiji; Takayama, Koichi; Harada, Taishi; Okamoto, Isamu; Ookubo, Fumihiko; Kishimoto, Junji; Baba, Eishi; Oda, Yoshinao; Nakanishi, Yoichi

    2015-08-21

    The mutation of T790M in EGFR is a major mechanism of resistance to treatment with EGFR-TKIs. Only qualitative detection (presence or absence) of T790M has been described to date, however. Digital PCR (dPCR) analysis has recently been applied to the quantitative detection of target molecules in cancer with high sensitivity. In the present study, 25 tumor samples (13 obtained before and 12 after EGFR-TKI treatment) from 18 NSCLC patients with activating EGFR mutations were evaluated for T790M with dPCR. The ratio of the number of T790M alleles to that of activating mutation alleles (T/A) was determined. dPCR detected T790M in all 25 samples. Although T790M was present in all pre-TKI samples from 13 patients, 10 of these patients had a low T/A ratio and manifested substantial tumor shrinkage during treatment with EGFR-TKIs. In six of seven patients for whom both pre- and post-TKI samples were available, the T/A ratio increased markedly during EGFR-TKI treatment. Highly sensitive dPCR thus detected T790M in all NSCLC patients harboring activating EGFR mutations whether or not they had received EGFR-TKI treatment. Not only highly sensitive but also quantitative detection of T790M is important for evaluation of the contribution of T790M to EGFR-TKI resistance.
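    The T/A metric is a simple ratio of absolute allele counts obtained from the dPCR partitions. A sketch with made-up counts (not patient data):

```python
# T/A ratio: T790M allele copies divided by activating-mutation allele
# copies, both measured as absolute counts by digital PCR.
def t_over_a(t790m_copies, activating_copies):
    if activating_copies == 0:
        raise ValueError("no activating-mutation alleles detected")
    return t790m_copies / activating_copies

# Hypothetical pre- and post-TKI samples from one patient
pre  = t_over_a(t790m_copies=12, activating_copies=1200)   # low ratio
post = t_over_a(t790m_copies=480, activating_copies=1200)  # ratio rose under treatment
print(f"pre-TKI T/A = {pre:.3f}, post-TKI T/A = {post:.3f}")
```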

  10. Panoramic imaging is not suitable for quantitative evaluation, classification, and follow up in unilateral condylar hyperplasia.

    PubMed

    Nolte, J W; Karssemakers, L H E; Grootendorst, D C; Tuinzing, D B; Becking, A G

    2015-05-01

    Patients with suspected unilateral condylar hyperplasia are often screened radiologically with a panoramic radiograph, but this is not sufficient for routine diagnosis and follow up. We have therefore made a quantitative analysis and evaluation of panoramic radiographs in a large group of patients with the condition. During the period 1994-2011, 132 patients with 113 panoramic radiographs were analysed using a validated method. There was good reproducibility between observers, but the condylar neck and head were the regions reported with least reliability. Although in most patients asymmetry of the condylar head, neck, and ramus was confirmed, the kappa coefficient as an indicator of agreement between two observers was poor (-0.040 to 0.504). Hardly any difference between sides was measured at the gonion angle, and the body appeared to be higher on the affected side in 80% of patients. Panoramic radiographs might be suitable for screening, but are not suitable for the quantitative evaluation, classification, and follow up of patients with unilateral condylar hyperplasia. PMID:25798757

  11. Quantitative evaluation on internal seeing induced by heat-stop of solar telescope.

    PubMed

    Liu, Yangyi; Gu, Naiting; Rao, Changhui

    2015-07-27

    The heat-stop is one of the essential thermal control devices of a solar telescope. The internal seeing induced by its temperature rise degrades imaging quality significantly. For quantitative evaluation of internal seeing, an integrated analysis method based on computational fluid dynamics and geometric optics is proposed in this paper. First, the temperature field of the heat-affected zone induced by the heat-stop temperature rise is obtained by computational fluid dynamics calculation. Second, the temperature field is transformed into a refractive index field by the corresponding equations. Third, the wavefront aberration induced by internal seeing is calculated by geometric optics, based on optical integration through the refractive index field. This integrated method is applied to the heat-stop of the Chinese Large Solar Telescope to quantitatively evaluate its internal seeing. The analytical results show that the maximum acceptable temperature rise of the heat-stop is 5 Kelvin above the ambient air at any telescope pointing direction, under the condition that the root-mean-square of the wavefront aberration induced by internal seeing is less than 25 nm. Furthermore, it is found that the magnitude of the wavefront aberration gradually increases with the heat-stop temperature rise for a given telescope pointing direction. Meanwhile, as the telescope pointing varies from the horizontal to the vertical direction, the magnitude of the wavefront aberration first decreases and then increases for the same heat-stop temperature rise.
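    The chain described above (temperature field to refractive index field to wavefront aberration) can be caricatured in one dimension. The sketch below assumes the ideal-gas Gladstone-Dale behavior (n − 1 proportional to density, i.e. to 1/T at constant pressure) and toy ray paths through a heated zone; it is not the paper's CFD-based method, and all numbers are illustrative:

```python
# 1-D caricature of internal seeing: convert a temperature profile along
# each ray into an optical path difference (OPD) vs ambient air, then take
# the RMS of the OPDs over rays as a stand-in for RMS wavefront error.
import math

N0_MINUS_1 = 2.7e-4   # (n - 1) of air at ambient temperature T0 (assumed)
T0 = 293.0            # ambient temperature, K

def n_air(T):
    # Ideal-gas Gladstone-Dale: (n - 1) scales as 1/T at constant pressure
    return 1.0 + N0_MINUS_1 * T0 / T

def opd(temps, dl):
    """Optical path difference vs ambient along one ray, step length dl (m)."""
    return sum((n_air(T) - n_air(T0)) * dl for T in temps)

# Toy heated zone: rays crossing a 0.2 m plume at different temperature rises
rays = [[T0 + dT] * 20 for dT in (0.0, 1.0, 2.0, 3.0)]   # 20 steps of 1 cm
opds = [opd(r, 0.01) for r in rays]
mean = sum(opds) / len(opds)
rms = math.sqrt(sum((o - mean) ** 2 for o in opds) / len(opds))
print(f"RMS wavefront error: {rms * 1e9:.1f} nm")
```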

  12. An evaluation of ARM radiosonde operational performance

    SciTech Connect

    Lesht, B.M.

    1995-06-01

    Because the ARM (Atmospheric Radiation Measurement) program uses data from radiosondes for real-time quality control and sensitive modeling applications, it is important to have a quantitative measure of the quality of the radiosonde data themselves. Two methods have been tried for estimating the quality of radiosonde data: comparisons with known standards before launch and examination of pseudo-replicate samples by single sensors aloft. The ground check procedure showed that the ARM radiosondes are within manufacturer's specifications for measuring relative humidity; procedural artifacts prevented verification for temperature. Pseudo-replicates from ascent and descent suggest that the temperature measurement is within the specified ±0.2 °C. On average ascent and descent data are similar, but detailed structure may be obscured on descent by loss of sampling density, and the descent involves other uncertainties.

  13. Development and evaluation of quantitative-competitive PCR for quantitation of coxsackievirus B3 RNA in experimentally infected murine tissues.

    PubMed

    Reetoo, K N; Osman, S A; Illavia, S J; Banatvala, J E; Muir, P

    1999-10-01

    A method is described for quantitation of enterovirus RNA in experimentally infected murine tissues. Viral RNA was extracted from tissue samples and amplified by reverse transcriptase PCR in the presence of an internal standard RNA. The ratio of PCR product derived from viral RNA and internal standard RNA was then determined using specific probes in a post-PCR electrochemiluminescent hybridization assay. This provided an estimate of the viral RNA copy number in the original sample, and detection of PCR product derived from internal standard RNA validated sample processing and amplification procedures. RNA copy number correlated with viral infectivity of cell culture-derived virus, and one tissue culture infective dose was found to contain approximately 10(3) genome equivalents. The ratio of RNA copy number to infectivity in myocardial tissue taken from mice during the acute phase of coxsackievirus B3 myocarditis was more variable ranging from 10(4)-10(7), and was dependent on the stage of infection, reflecting differential rates of clearance for viral RNA and viral infectivity. The assay is rapid, and could facilitate investigations which currently rely upon enterovirus quantitation by titration in cell culture. This would be useful for experimental studies of viral pathogenesis, prophylaxis and antiviral therapy.
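    The quantitation step reduces to scaling the known internal-standard (IS) spike-in by the viral-to-IS product ratio. A sketch with illustrative signal values, not the assay's data:

```python
# Competitive-PCR quantitation sketch: estimate viral RNA copy number from
# the ratio of viral to internal-standard (IS) hybridization signal,
# scaled by the known number of IS copies added to the sample.
def viral_copies(viral_signal, is_signal, is_copies_added):
    return is_copies_added * (viral_signal / is_signal)

# Hypothetical electrochemiluminescence signals and IS spike-in
copies = viral_copies(viral_signal=2.4e5, is_signal=1.2e5, is_copies_added=1e4)
print(f"{copies:.0f} genome equivalents")
```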

  14. FLUORESCENT TRACER EVALUATION OF PROTECTIVE CLOTHING PERFORMANCE

    EPA Science Inventory

    Field studies evaluating chemical protective clothing (CPC), which is often employed as a primary control option to reduce occupational exposures during pesticide applications, are limited. This study, supported by the U.S. Environmental Protection Agency (EPA), was designed to...

  15. Evaluating Performances of Solar-Energy Systems

    NASA Technical Reports Server (NTRS)

    Jaffe, L. D.

    1987-01-01

    CONC11 computer program calculates performances of dish-type solar thermal collectors and power systems. Solar thermal power system consists of one or more collectors, power-conversion subsystems, and powerprocessing subsystems. CONC11 intended to aid system designer in comparing performance of various design alternatives. Written in Athena FORTRAN and Assembler.

  16. Building China's municipal healthcare performance evaluation system: a Tuscan perspective.

    PubMed

    Li, Hao; Barsanti, Sara; Bonini, Anna

    2012-08-01

    Regional healthcare performance evaluation systems can help optimize healthcare resources on a regional basis and improve the performance of healthcare services provided. The Tuscany region in Italy is a good example of an institution that meets these requirements. China has yet to build such a system based on international experience. In this paper, based on comparative studies between Tuscany and China, we propose that the managing institutions in China's experimental cities each select and commission a third-party agency to evaluate the performance of their affiliated hospitals and community health service centers. Following some features of the Tuscan experience, a Chinese municipal healthcare performance evaluation system can be built by focusing on the selection of an appropriate performance evaluation agency, the design of an adequate performance evaluation mechanism, and the formulation of a complete set of laws, rules and regulations. Once a performance evaluation system at the city level is formed, the provincial government can extend the successful experience to other cities.

  17. 13 CFR 306.7 - Performance evaluations of University Centers.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Performance evaluations of..., DEPARTMENT OF COMMERCE TRAINING, RESEARCH AND TECHNICAL ASSISTANCE INVESTMENTS University Center Economic Development Program § 306.7 Performance evaluations of University Centers. (a) EDA will: (1) Evaluate...

  18. 48 CFR 1536.201 - Evaluation of contracting performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Contracting for Construction 1536.201 Evaluation of contracting performance. (a) The Contracting Officer will... will file the form in the contractor performance evaluation files which it maintains. (e) The Quality... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Evaluation of...

  19. 48 CFR 2936.201 - Evaluation of contractor performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Construction 2936.201 Evaluation of contractor performance. The HCA must establish procedures to evaluate... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Evaluation of contractor performance. 2936.201 Section 2936.201 Federal Acquisition Regulations System DEPARTMENT OF LABOR...

  20. 48 CFR 36.201 - Evaluation of contractor performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Contracting for Construction 36.201 Evaluation of contractor performance. See 42.1502(e) for the requirements for preparing past performance evaluations for construction contracts. ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Evaluation of...

  1. 13 CFR 306.7 - Performance evaluations of University Centers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Performance evaluations of University Centers. 306.7 Section 306.7 Business Credit and Assistance ECONOMIC DEVELOPMENT ADMINISTRATION... Development Program § 306.7 Performance evaluations of University Centers. (a) EDA will: (1) Evaluate...

  2. Team Primacy Concept (TPC) Based Employee Evaluation and Job Performance

    ERIC Educational Resources Information Center

    Muniute, Eivina I.; Alfred, Mary V.

    2007-01-01

    This qualitative study explored how employees learn from Team Primacy Concept (TPC) based employee evaluation and how they use the feedback in performing their jobs. TPC based evaluation is a form of multirater evaluation, during which the employee's performance is discussed by one's peers in a face-to-face team setting. The study used Kolb's…

  3. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    SciTech Connect

    Gauld, Ian C.; Hu, Jianwei; De Baere, P.; Vaccaro, S.; Schwalbach, P.; Liljenfeldt, Henrik; Tobin, Stephen

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel
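    The agreement figures quoted above summarize relative deviations of measurement from prediction across assemblies. One plausible reading of that statistic, computed on synthetic numbers rather than the Clab data:

```python
# Relative standard deviation of (measured / predicted - 1) across
# assemblies, one way to summarize detector-vs-prediction agreement.
import math

def relative_std_dev(measured, predicted):
    ratios = [m / p - 1.0 for m, p in zip(measured, predicted)]
    mean = sum(ratios) / len(ratios)
    return math.sqrt(sum((r - mean) ** 2 for r in ratios) / len(ratios))

# Hypothetical neutron count rates for five assemblies
measured  = [100.0, 98.0, 105.0, 99.0, 103.0]
predicted = [101.0, 100.0, 100.0, 100.0, 100.0]
rsd = relative_std_dev(measured, predicted)
print(f"relative std dev: {rsd:.1%}")
```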

  4. Quantitative evaluation of regularized phase retrieval algorithms on bone scaffolds seeded with bone cells

    NASA Astrophysics Data System (ADS)

    Weber, L.; Langer, M.; Tavella, S.; Ruggiu, A.; Peyrin, F.

    2016-05-01

    In the field of regenerative medicine, there has been a growing interest in studying the combination of bone scaffolds and cells that can maximize newly formed bone. In-line phase-contrast x-ray tomography was used to image porous bone scaffolds (Skelite©), seeded with bone forming cells. This technique allows the quantification of both mineralized and soft tissue, unlike with classical x-ray micro-computed tomography. Phase contrast images were acquired at four distances. The reconstruction is typically performed in two successive steps: phase retrieval and tomographic reconstruction. In this work, different regularization methods were applied to the phase retrieval process. The application of a priori terms for heterogeneous objects enables quantitative 3D imaging of not only bone morphology, mineralization, and soft tissue formation, but also cells trapped in the pre-bone matrix. A statistical study was performed to derive statistically significant information on the different culture conditions.

  5. EVALUATION OF VENTILATION PERFORMANCE FOR INDOOR SPACE

    EPA Science Inventory

    The paper discusses a personal-computer-based application of computational fluid dynamics that can be used to determine the turbulent flow field and time-dependent/steady-state contaminant concentration distributions within isothermal indoor space. (NOTE: Ventilation performance ...

  6. Evaluation of performance impairment by spacecraft contaminants

    NASA Technical Reports Server (NTRS)

    Geller, I.; Hartman, R. J., Jr.; Mendez, V. M.

    1977-01-01

    The environmental contaminants (isolated as off-gases in Skylab and Apollo missions) were evaluated. Specifically, six contaminants were evaluated for their effects on the behavior of juvenile baboons. The concentrations of contaminants were determined through preliminary range-finding studies with laboratory rats. The contaminants evaluated were acetone, methyl ethyl ketone (MEK), methyl isobutyl ketone (MIBK), trichloroethylene (TCE), heptane and Freon 21. When the studies of the individual gases were completed, the baboons were also exposed to a mixture of MEK and TCE. The data obtained revealed alterations in the behavior of baboons exposed to relatively low levels of the contaminants. These findings were presented at the First International Symposium on Voluntary Inhalation of Industrial Solvents in Mexico City, June 21-24, 1976. A preprint of the proceedings is included.

  7. 24 CFR 968.330 - PHA performance and evaluation report.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false PHA performance and evaluation... 250 or More Public Housing Units) § 968.330 PHA performance and evaluation report. For any FFY in which a PHA has received assistance under this subpart, the PHA shall submit a Performance...

  8. A model for evaluating the social performance of construction waste management

    SciTech Connect

    Yuan Hongping

    2012-06-15

    Highlights: ▶ Scant attention is paid to the social performance of construction waste management (CWM). ▶ We develop a model for assessing the social performance of CWM. ▶ With the model, the social performance of CWM can be quantitatively simulated. - Abstract: Existing literature shows that considerable research effort has been devoted to the economic performance of construction waste management (CWM), but less attention has been paid to its social performance. This study therefore attempts to develop a model for quantitatively evaluating the social performance of CWM by using a system dynamics (SD) approach. Firstly, major variables affecting the social performance of CWM are identified and a holistic system for assessing the social performance of CWM is formulated in line with the feedback relationships underlying these variables. The developed system is then converted into an SD model through the software iThink. An empirical case study is finally conducted to demonstrate application of the model. Results of model validation indicate that the model is robust and reasonable in reflecting the situation of the real system under study. Findings of the case study offer helpful insights into effectively promoting the social performance of CWM for the project investigated. Furthermore, the model exhibits great potential to function as an experimental platform for dynamically evaluating the effects of management measures on improving the social performance of CWM in construction projects.

  9. Reprint of "Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging".

    PubMed

    Oishi, Kenichi; Faria, Andreia V; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2014-02-01

    The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in the clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development of the pediatric population. To interpret the values from these MR modalities, a "growth percentile chart," which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured anatomical structures. To avoid inter- and intra-reader variability in the anatomical boundary definition, and hence to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, has been introduced. Future…

  10. Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging.

    PubMed

    Oishi, Kenichi; Faria, Andreia V; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2013-11-01

    The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in the clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development of the pediatric population. To interpret the values from these MR modalities, a "growth percentile chart," which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured anatomical structures. To avoid inter- and intra-reader variability in the anatomical boundary definition, and hence to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, has been introduced. Future…
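The "growth percentile chart" described above records, per anatomical structure and age, the mean and standard deviation of a metric across normally developing subjects. A minimal sketch of that bookkeeping, using hypothetical structure volumes and age bins (none of these numbers come from the study):

```python
import statistics

def growth_chart(measurements):
    """Summarize a regional metric (e.g. structure volume) per age bin
    as (mean, population SD), the two quantities a percentile chart needs."""
    bins = {}
    for age, value in measurements:
        bins.setdefault(age, []).append(value)
    return {age: (statistics.mean(vals), statistics.pstdev(vals))
            for age, vals in bins.items()}

# Hypothetical volumes (mL) for one structure at ages 1 and 2 years
data = [(1, 10.0), (1, 12.0), (2, 14.0), (2, 16.0)]
print(growth_chart(data))  # {1: (11.0, 1.0), 2: (15.0, 1.0)}
```

A new subject's measurement can then be expressed as a z-score against the (mean, SD) pair for their age bin.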

  11. Quantitative evaluation of proteins in one- and two-dimensional polyacrylamide gels using a fluorescent stain.

    PubMed

    Nishihara, Julie C; Champion, Kathleen M

    2002-07-01

    The characteristics of protein detection and quantitation with SYPRO Ruby protein gel stain in one- and two-dimensional polyacrylamide gels were evaluated. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) analyses of three different purified recombinant proteins showed that the limits of detection were comparable to the limits of detection with ammoniacal silver staining and were protein-specific, ranging from 0.5 to 5 ng. The linearity of the relationship between protein level and SYPRO Ruby staining intensity also depended on the individual protein, with observed linear dynamic ranges of 200-, 500-, and 1000-fold for proteins analyzed by SDS-PAGE. SYPRO Ruby protein gel stain was also evaluated in two-dimensional electrophoretic (2-DE) analysis of Escherichia coli proteins. The experiment involved analysis of replicates of the same sample as well as dilution of the sample from 0.5 to 50 µg total protein across gels. In addition to validating the 2-DE system itself, the experiment was used to evaluate three different image analysis programs: Z3 (Compugen), Progenesis (Nonlinear Dynamics), and PDQuest (Bio-Rad). In each program, we analyzed the 2-DE images with respect to sensitivity and reproducibility of overall protein spot detection, as well as linearity of response for 20 representative proteins of different molecular weights and pI. Across all three programs, coefficients of variation (CV) in total number of spots detected among replicate gels ranged from 4 to 11%. For the 20 representative proteins, spot quantitation was also comparable, with CVs for gel-to-gel reproducibility ranging from 3 to 33%. Using Progenesis and PDQuest, a 1000-fold linear dynamic range of SYPRO Ruby was demonstrated with a single known protein. These two programs were more suitable than Z3 for examining individual protein spot quantity across a series of gels and gave comparable results.
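The gel-to-gel reproducibility figures above are coefficients of variation (CV), i.e. the sample standard deviation expressed as a percentage of the mean. A minimal sketch over hypothetical replicate spot volumes (the values are invented for illustration, not taken from the study):

```python
import statistics

def coefficient_of_variation(values):
    """CV (%) = 100 * sample SD / mean, as used for gel-to-gel reproducibility."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical spot volumes for one protein across four replicate gels
spots = [980.0, 1020.0, 1000.0, 1000.0]
print(round(coefficient_of_variation(spots), 2))  # 1.63
```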

  12. The use of a battery of tracking tests in the quantitative evaluation of neurological function

    NASA Technical Reports Server (NTRS)

    Repa, B. S.; Albers, J. W.; Potvin, A. R.; Tourtellotte, W. W.

    1972-01-01

    A tracking test battery has been applied in a drug trial designed to compare the efficacy of L-DOPA and amantadine to that of L-DOPA and placebo in the treatment of 28 patients with Parkinson's disease. The drug trial provided an ideal opportunity for objectively evaluating the usefulness of tracking tests in assessing changes in neurologic function. Evaluating changes in patient performance resulting from disease progression and controlled clinical trials is of great importance in establishing effective treatment programs.

  13. Reduced short term memory in congenital adrenal hyperplasia (CAH) and its relationship to spatial and quantitative performance.

    PubMed

    Collaer, Marcia L; Hindmarsh, Peter C; Pasterski, Vickie; Fane, Briony A; Hines, Melissa

    2016-02-01

    Girls and women with classical congenital adrenal hyperplasia (CAH) experience elevated androgens prenatally and show increased male-typical development for certain behaviors. Further, individuals with CAH receive glucocorticoid (GC) treatment postnatally, and this GC treatment could have negative cognitive consequences. We investigated two alternative hypotheses, that: (a) early androgen exposure in females with CAH masculinizes (improves) spatial perception and quantitative abilities at which males typically outperform females, or (b) CAH is associated with performance decrements in these domains, perhaps due to reduced short-term-memory (STM). Adolescent and adult individuals with CAH (40 female and 29 male) were compared with relative controls (29 female and 30 male) on spatial perception and quantitative abilities as well as on Digit Span (DS) to assess STM and on Vocabulary to assess general intelligence. Females with CAH did not perform better (more male-typical) on spatial perception or quantitative abilities than control females, failing to support the hypothesis of cognitive masculinization. Rather, in the sample as a whole individuals with CAH scored lower on spatial perception (p ≤ .009), a quantitative composite (p ≤ .036), and DS (p ≤ .001), despite no differences in general intelligence. Separate analyses of adolescent and adult participants suggested the spatial and quantitative effects might be present only in adult patients with CAH; however, reduced DS performance was found in patients with CAH regardless of age group. Separate regression analyses showed that DS predicted both spatial perception and quantitative performance (both p ≤ .001), when age, sex, and diagnosis status were controlled. Thus, reduced STM in CAH patients versus controls may have more general cognitive consequences, potentially reducing spatial perception and quantitative skills. Although hyponatremia or other aspects of salt-wasting crises or additional hormone

  14. Quantitative analysis of real-time tissue elastography for evaluation of liver fibrosis

    PubMed Central

    Shi, Ying; Wang, Xing-Hua; Zhang, Huan-Hu; Zhang, Hai-Qing; Tu, Ji-Zheng; Wei, Kun; Li, Juan; Liu, Xiao-Li

    2014-01-01

    The present study aimed to investigate the feasibility of quantitative analysis of liver fibrosis using real-time tissue elastography (RTE) and its pathological and molecular biological basis. Methods: Fifty-four New Zealand rabbits were subcutaneously injected with thioacetamide (TAA) to induce liver fibrosis as the model group, and another eight New Zealand rabbits served as the normal control group. Four rabbits were randomly taken every two weeks for real-time tissue elastography (RTE) and quantitative analysis of tissue diffusion. The obtained twelve characteristic quantities included relative mean value (MEAN), standard deviation (SD), blue area % (% AREA), complexity (COMP), kurtosis (KURT), skewness (SKEW), contrast (CONT), entropy (ENT), inverse different moment (IDM), angular second moment (ASM), correlation (CORR) and liver fibrosis index (LF Index). Rabbits were sacrificed and liver tissues were taken for pathological staging of liver fibrosis (grouped by pathological stage into S0 group, S1 group, S2 group, S3 group and S4 group). In addition, the collagen I (Col I) and collagen III (Col III) expression levels in liver tissue were detected by Western blot. Results: Except for KURT, there were significant differences among the other eleven characteristic quantities (P < 0.05). LF Index, Col I and Col III expression levels showed a rising trend with increased pathological staging of liver fibrosis, presenting a positive correlation with the pathological staging of liver fibrosis (r = 0.718, r = 0.693, r = 0.611, P < 0.05). Conclusion: RTE quantitative analysis shows promise for noninvasive evaluation of the pathological staging of liver fibrosis. PMID:24955175
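The r values above are correlations between elastography-derived quantities and the pathological stage. As a sketch of how such a coefficient is obtained, here is a plain Pearson correlation over hypothetical LF Index values per stage (the study's actual data and its exact correlation method are not reproduced here):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical LF Index values against pathological stages S0-S4
stages = [0, 1, 2, 3, 4]
lf_index = [1.2, 1.8, 2.1, 3.0, 3.4]
print(round(pearson_r(stages, lf_index), 3))  # 0.99
```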

  15. A quantitative metrology for performance characterization of breast tomosynthesis systems based on an anthropomorphic phantom

    NASA Astrophysics Data System (ADS)

    Ikejimba, Lynda; Chen, Yicheng; Oberhofer, Nadia; Kiarashi, Nooshin; Lo, Joseph Y.; Samei, Ehsan

    2015-03-01

    Purpose: Common methods for assessing image quality of digital breast tomosynthesis (DBT) devices currently utilize simplified or otherwise unrealistic phantoms, which use inserts in a uniform background and gauge performance based on a subjective evaluation of insert visibility. This study proposes a different methodology to assess system performance using a three-dimensional clinically-informed anthropomorphic breast phantom. Methods: The system performance is assessed by imaging the phantom and computationally characterizing the resultant images in terms of several new metrics. These include a contrast index (reflective of local difference between adipose and glandular material), a contrast to noise ratio index (reflective of contrast against local background noise), and a nonuniformity index (reflective of contributions of noise and artifacts within uniform adipose regions). Indices were measured at ROI sizes of 10 mm and 37 mm. The method was evaluated at a fixed dose of 1.5 mGy AGD. Results: Results indicated notable differences between systems. At 10 mm, vendor A had the highest contrast index, followed by B and C in that order. The performance ranking was identical at the largest ROI size. The nonuniformity index similarly exhibited system-dependencies correlated with visual appearance of clutter from out-of-plane artifacts. Vendor A had the greatest nonuniformity index at all ROI sizes, B had the second greatest, and C the least. Conclusions: The findings illustrate that the anthropomorphic phantom can be used as a quality control tool with results that are targeted to be more reflective of clinical performance of breast tomosynthesis systems of multiple manufacturers.
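The contrast-to-noise ratio index compares the glandular/adipose signal difference against local background noise. The paper's exact index definitions are not given in the abstract, so the following is a generic CNR sketch with hypothetical ROI statistics:

```python
def contrast_to_noise(glandular_mean, adipose_mean, adipose_sd):
    """Generic CNR-style index: local glandular-vs-adipose signal difference
    divided by the noise (SD) measured in the adipose background ROI."""
    return abs(glandular_mean - adipose_mean) / adipose_sd

# Hypothetical ROI statistics from a reconstructed tomosynthesis slice
print(contrast_to_noise(120.0, 100.0, 5.0))  # 4.0
```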

  16. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. 
The research also includes a Monte Carlo simulation optimization
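Process-performance modeling of the kind the abstract mentions often uses Monte Carlo simulation to propagate uncertainty through a process. A minimal sketch, assuming three process phases with triangular duration distributions (the distributions and parameters are illustrative, not from the study):

```python
import random

def simulate_duration(n_trials=100_000, seed=1):
    """Monte Carlo sketch: estimate the median and P90 of a total process
    duration composed of three phases with triangular(2, 6, mode=4) days each."""
    rng = random.Random(seed)
    totals = sorted(sum(rng.triangular(2, 6, 4) for _ in range(3))
                    for _ in range(n_trials))
    return totals[int(0.5 * n_trials)], totals[int(0.9 * n_trials)]

median, p90 = simulate_duration()
print(median, p90)  # median near 12 days; P90 somewhat higher
```

The P90 vs. median spread is the kind of quantitative statement (predicted range of outcomes, not a point estimate) that high-maturity CMMI practices call for.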

  17. Enhancement of a virtual reality wheelchair simulator to include qualitative and quantitative performance metrics.

    PubMed

    Harrison, C S; Grant, P M; Conway, B A

    2010-01-01

    The increasing importance of inclusive design and in particular accessibility guidelines established in the U.K. 1996 Disability Discrimination Act (DDA) has been a prime motivation for the work on wheelchair access, a subset of the DDA guidelines, described in this article. The development of these guidelines mirrors the long-standing provisions developed in the U.S. In order to raise awareness of these guidelines and in particular to give architects, building designers, and users a physical sensation of how a planned development could be experienced, a wheelchair virtual reality system was developed. This compares with conventional methods of measuring against drawings and comparing dimensions against building regulations, established in the U.K. under British standards. Features of this approach include the marriage of an electromechanical force-feedback system with high-quality immersive graphics as well as the potential ability to generate a physiological rating of buildings that do not yet exist. The provision of this sense of "feel" augments immersion within the virtual reality environment and also provides the basis from which both qualitative and quantitative measures of a building's access performance can be gained. PMID:20402044

  18. Quantitative determination of steroid acetates in pharmaceutical preparations by high performance liquid chromatography.

    PubMed

    van Dame, H C

    1980-11-01

    A high performance liquid chromatographic method is described for the rapid, quantitative determination of corticosteroid acetates in a variety of pharmaceutical preparations. The method not only separates the steroids from other ingredients in tablets, creams, ointments, lotions, or suspensions, but also from their probable degradation products. The steroid acetate is extracted from a tablet matrix or from a suspension with water and acetonitrile (50 + 50), or from a hexane suspension of a cream, ointment, or lotion with acetonitrile. A liquid chromatograph equipped with an accurate flow controller, a variable wavelength detector capable of measurement at 240 nm, and a 10 µL loop injector was used. Components were separated on a µBondapak C18 column with acetonitrile-water as the mobile phase. K' values of 10 commonly used steroid acetates are shown and compared with Rf values from thin layer chromatographic systems. Results were equivalent for cortisone acetate tablets assayed by both the proposed and the official USP methods. Other pharmaceutical preparations were assayed to test the method, with satisfactory results.

  19. Comparative evaluation of three commercial quantitative cytomegalovirus standards by use of digital and real-time PCR.

    PubMed

    Hayden, R T; Gu, Z; Sam, S S; Sun, Y; Tang, L; Pounds, S; Caliendo, A M

    2015-05-01

    The recent development of the 1st WHO International Standard for human cytomegalovirus (CMV) and the introduction of commercially produced secondary standards have raised hopes of improved agreement among laboratories performing quantitative PCR for CMV. However, data to evaluate the trueness and uniformity of secondary standards and the consistency of results achieved when these materials are run on various assays are lacking. Three concentrations of each of the three commercially prepared secondary CMV standards were tested in quadruplicate by three real-time and two digital PCR methods. The mean results were compared in a pairwise fashion with nominal values provided by each manufacturer. The agreement of results among all methods for each sample and for like concentrations of each standard was also assessed. The relationship between the nominal values of standards and the measured values varied, depending upon the assay used and the manufacturer of the standards, with the degree of bias ranging from +0.6 to -1.0 log10 IU/ml. The mean digital PCR result differed significantly among the secondary standards, as did the results of the real-time PCRs, particularly when plotted against nominal log10 IU values. Commercially available quantitative secondary CMV standards produce variable results when tested by different real-time and digital PCR assays, with various magnitudes of bias compared to nominal values. These findings suggest that the use of such materials may not achieve the intended uniformity among laboratories measuring CMV viral load, as envisioned by adaptation of the WHO standard.
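The bias figures above are differences in log10 IU/ml between a measured viral load and the nominal value assigned to the standard. A small worked sketch with hypothetical values (not the study's measurements):

```python
import math

def log10_bias(measured_iu_per_ml, nominal_iu_per_ml):
    """Bias in log10 IU/ml: log-transformed measured value minus nominal value."""
    return math.log10(measured_iu_per_ml) - math.log10(nominal_iu_per_ml)

# Hypothetical: an assay reports 4.0e4 IU/ml for a standard labeled 1.0e4 IU/ml
print(round(log10_bias(4.0e4, 1.0e4), 2))  # 0.6
```

Working in log10 space is what lets biases as different as +0.6 and -1.0 log10 IU/ml be compared directly across assays and concentrations.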

  20. Performance evaluation of 1 kW PEFC

    SciTech Connect

    Komaki, Hideaki; Tsuchiyama, Syozo

    1996-12-31

    This report covers part of a joint study on a PEFC propulsion system for surface ships, summarized in a presentation to this Seminar, entitled "Study on a PEFC Propulsion System for Surface Ships", and which envisages application to a 1,500 DWT cargo vessel. The aspect treated here concerns the effects brought on PEFC operating performance by conditions particular to shipboard operation. The performance characteristics were examined through tests performed on a 1 kW stack and on a single cell (manufactured by Fuji Electric Co., Ltd.). The tests covered the items (1) to (4) cited in the headings of the sections that follow. Specifications of the stack and single cell are as given.

  1. Performance evaluation of SAR/GMTI algorithms

    NASA Astrophysics Data System (ADS)

    Garber, Wendy; Pierson, William; Mcginnis, Ryan; Majumder, Uttam; Minardi, Michael; Sobota, David

    2016-05-01

    There is a history and understanding of exploiting moving targets within ground moving target indicator (GMTI) data, including methods for modeling performance. However, many assumptions valid for GMTI processing are invalid for synthetic aperture radar (SAR) data. For example, traditional GMTI processing assumes targets are exo-clutter and a system that uses a GMTI waveform, i.e. low bandwidth (BW) and low pulse repetition frequency (PRF). Conversely, SAR imagery is typically formed to focus data at zero Doppler and requires high BW and high PRF. Therefore, many of the techniques used in performance estimation of GMTI systems are not valid for SAR data. However, as demonstrated by papers in the recent literature [1-11], there is interest in exploiting moving targets within SAR data. The techniques employed vary widely, including filter banks to form images at multiple Dopplers, performing smear detection, and attempting to address the issue through waveform design. The above work validates the need for moving target exploitation in SAR data, but it does not represent a theory allowing for the prediction or bounding of performance. This work develops an approach to estimate and/or bound performance for moving target exploitation specific to SAR data. Synthetic SAR data is generated across a range of sensor, environment, and target parameters to test the exploitation algorithms under specific conditions. This provides a design tool allowing radar systems to be tuned for specific moving target exploitation applications. In summary, we derive a set of rules that bound the performance of specific moving target exploitation algorithms under variable operating conditions.

  2. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program

    PubMed Central

    Goodman, Melody S.; Si, Xuemei; Stafford, Jewel D.; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2016-01-01

    Background The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). Objectives We assessed participant knowledge and evaluated participant satisfaction of the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. Methods A baseline assessment was administered before the first training session and a follow-up assessment and evaluation was administered after the final training session. At each training session a pretest was administered before the session and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyze results from quantitative questions on the assessments, pre- and post-tests, and evaluations. Results CARES fellows' knowledge increased at follow-up (75% of questions were answered correctly on average) compared with the baseline assessment (38% of questions were answered correctly on average); post-test scores were higher than pre-test scores in 9 out of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. Conclusions The CARES fellows training program was successful in participant satisfaction and increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community–academic research partnerships. PMID:22982849
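The knowledge gain above is reported as the percentage of assessment questions answered correctly at baseline versus follow-up. A toy sketch with invented responses, chosen only to echo the reported 38% to 75% pattern:

```python
def percent_correct(answers):
    """Share of correctly answered assessment questions, as a percentage.
    Each answer is coded 1 (correct) or 0 (incorrect)."""
    return 100.0 * sum(answers) / len(answers)

# Hypothetical baseline vs follow-up responses for one eight-question assessment
baseline = [1, 0, 0, 1, 0, 0, 1, 0]
followup = [1, 1, 0, 1, 1, 1, 1, 0]
print(percent_correct(baseline), percent_correct(followup))  # 37.5 75.0
```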

  3. Quality consistency evaluation of Melissa officinalis L. commercial herbs by HPLC fingerprint and quantitation of selected phenolic acids.

    PubMed

    Arceusz, Agnieszka; Wesolowski, Marek

    2013-09-01

    To evaluate the quality consistency of commercial medicinal herbs, a simple and reliable HPLC method with UV-vis detector was developed, both for fingerprint analysis and quantitation of some pharmacologically active constituents (marker compounds). Melissa officinalis L. (lemon balm) was chosen for this study because it is widely used as an aromatic, culinary, and medicinal remedy. About fifty peaks were found in each chromatogram of a lemon balm extract, including twelve satisfactorily resolved characteristic peaks. A reference chromatographic fingerprint for the studied medicinal herb was calculated using Matlab 9.1 software as a result of analysing all the 19 lemon balm samples obtained from 12 Polish manufacturers. The similarity values and the results of principal component analysis revealed that all the samples were highly correlated with the reference fingerprint and could be accurately classified in relation to their quality consistency. Next, a quantitation of selected phenolic acids in the studied samples was performed. The results have shown that the levels of phenolic acids, i.e. gallic, chlorogenic, syringic, caffeic, ferulic and rosmarinic were as follows (mg/g of dry weight): 0.001-0.067, 0.010-0.333, 0.007-0.553, 0.047-0.705, 0.006-1.589 and 0.158-48.608, respectively. Statistical analysis indicated that rosmarinic acid occurs in M. officinalis at the highest level, whereas gallic acid in the lowest. A detailed inspection of these data has also revealed that reference chromatographic fingerprints combined with quantitation of pharmacologically active constituents of the plant could be used as an efficient strategy for monitoring of the lemon balm quality consistency. PMID:23770780
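Fingerprint similarity between each sample chromatogram and the reference is commonly scored with a congruence (cosine) coefficient. The paper's exact similarity metric is not specified in the abstract, so this is a generic sketch over hypothetical peak-area vectors:

```python
def cosine_similarity(a, b):
    """Congruence between a sample chromatogram and the reference fingerprint,
    each represented as a vector of aligned peak areas."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm

# Hypothetical peak-area vectors (reference fingerprint vs one commercial sample)
reference = [1.0, 0.5, 2.0, 0.8]
sample = [0.9, 0.6, 1.8, 0.7]
print(round(cosine_similarity(reference, sample), 3))  # 0.998
```

Values near 1.0 indicate a sample whose peak pattern closely matches the reference fingerprint.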

  4. Quality consistency evaluation of Melissa officinalis L. commercial herbs by HPLC fingerprint and quantitation of selected phenolic acids.

    PubMed

    Arceusz, Agnieszka; Wesolowski, Marek

    2013-09-01

    To evaluate the quality consistency of commercial medicinal herbs, a simple and reliable HPLC method with UV-vis detector was developed, both for fingerprint analysis and quantitation of some pharmacologically active constituents (marker compounds). Melissa officinalis L. (lemon balm) was chosen for this study because it is widely used as an aromatic, culinary, and medicinal remedy. About fifty peaks were found in each chromatogram of a lemon balm extract, including twelve satisfactorily resolved characteristic peaks. A reference chromatographic fingerprint for the studied medicinal herb was calculated using Matlab 9.1 software as a result of analysing all the 19 lemon balm samples obtained from 12 Polish manufacturers. The similarity values and the results of principal component analysis revealed that all the samples were highly correlated with the reference fingerprint and could be accurately classified in relation to their quality consistency. Next, a quantitation of selected phenolic acids in the studied samples was performed. The results have shown that the levels of phenolic acids, i.e. gallic, chlorogenic, syringic, caffeic, ferulic and rosmarinic were as follows (mg/g of dry weight): 0.001-0.067, 0.010-0.333, 0.007-0.553, 0.047-0.705, 0.006-1.589 and 0.158-48.608, respectively. Statistical analysis indicated that rosmarinic acid occurs in M. officinalis at the highest level, whereas gallic acid in the lowest. A detailed inspection of these data has also revealed that reference chromatographic fingerprints combined with quantitation of pharmacologically active constituents of the plant could be used as an efficient strategy for monitoring of the lemon balm quality consistency.

  5. Quantitative evaluation of multi-walled carbon nanotube uptake in wheat and rapeseed.

    PubMed

    Larue, Camille; Pinault, Mathieu; Czarny, Bertrand; Georgin, Dominique; Jaillard, Danielle; Bendiab, Nedjma; Mayne-L'Hermite, Martine; Taran, Frédéric; Dive, Vincent; Carrière, Marie

    2012-08-15

    Environmental contamination with carbon nanotubes would lead to plant exposure and particularly exposure of agricultural crops. The only quantitative exposure data available to date which can be used for risk assessment comes from computer modeling. The aim of this study was to provide quantitative data relative to multi-walled carbon nanotube (MWCNT) uptake and distribution in agricultural crops, and to correlate accumulation data with impact on plant development and physiology. Roots of wheat and rapeseed were exposed in hydroponics to uniformly (14)C-radiolabeled MWCNTs. Radioimaging, transmission electron microscopy, and Raman spectroscopy were used to identify CNT distribution. Radioactivity counting made possible the absolute quantification of CNT accumulation in plant leaves. Impact of CNTs on seed germination, root elongation, plant biomass, evapotranspiration, chlorophyll, thiobarbituric acid reactive species and H(2)O(2) contents was evaluated. We demonstrate that less than 0.005‰ of the applied MWCNT dose is taken up by plant roots and translocated to the leaves. This accumulation does not impact plant development and physiology. In addition, it does not induce any modifications in photosynthetic activity or cause oxidative stress in plant leaves. Our results suggest that if environmental contamination occurs and MWCNTs are in the same physico-chemical state as the ones used in the present article, MWCNT transfer to the food chain via food crops would be very low.

  6. Evaluation of a Quantitative Serological Assay for Diagnosing Chronic Pulmonary Aspergillosis

    PubMed Central

    Fujita, Yuka; Suzuki, Hokuto; Doushita, Kazushi; Kuroda, Hikaru; Takahashi, Masaaki; Yamazaki, Yasuhiro; Tsuji, Tadakatsu; Fujikane, Toshiaki; Osanai, Shinobu; Sasaki, Takaaki; Ohsaki, Yoshinobu

    2016-01-01

    The purpose of this study was to evaluate the clinical utility of a quantitative Aspergillus IgG assay for diagnosing chronic pulmonary aspergillosis. We examined Aspergillus-specific IgG levels in patients who met the following criteria: (i) chronic (duration of >3 months) pulmonary or systemic symptoms, (ii) radiological evidence of a progressive (over months or years) pulmonary lesion with surrounding inflammation, and (iii) no major discernible immunocompromising factors. Anti-Aspergillus IgG serum levels were retrospectively analyzed according to defined classifications. Mean Aspergillus IgG levels were significantly higher in the proven group than those in the possible and control groups (P < 0.01). Receiver operating characteristic curve analysis revealed that the Aspergillus IgG cutoff value for diagnosing proven cases was 50 mg of antigen-specific antibodies/liter (area under the curve, 0.94; sensitivity, 0.98; specificity, 0.84). The sensitivity and specificity for diagnosing proven cases using this cutoff were 0.77 and 0.78, respectively. The positive rates of Aspergillus IgG in the proven and possible groups were 97.9% and 39.2%, respectively, whereas that of the control group was 6.6%. The quantitative Aspergillus IgG assay offers reliable sensitivity and specificity for diagnosing chronic pulmonary aspergillosis and may be an alternative to the conventional precipitin test. PMID:27008878
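The ROC-derived cutoff above balances sensitivity against specificity; one common criterion for picking it is Youden's J statistic. A sketch with invented IgG levels (not the study's data, which reported a 50 mg/l cutoff):

```python
def best_cutoff(positives, negatives):
    """Scan candidate cutoffs and return (cutoff, J) maximizing
    Youden's J = sensitivity + specificity - 1."""
    best = None
    for c in sorted(set(positives + negatives)):
        sens = sum(p >= c for p in positives) / len(positives)
        spec = sum(n < c for n in negatives) / len(negatives)
        j = sens + spec - 1
        if best is None or j > best[1]:
            best = (c, j)
    return best

# Hypothetical Aspergillus-specific IgG levels (mg/l): proven cases vs controls
proven = [60, 80, 55, 120, 52]
controls = [10, 20, 45, 30, 15]
print(best_cutoff(proven, controls))  # (52, 1.0)
```

On these toy data the classes separate perfectly, so J reaches 1.0; on real data the best J is below 1 and the corresponding sensitivity/specificity pair is reported, as in the abstract.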

  7. Quantitative evaluation of hidden defects in cast iron components using ultrasound activated lock-in vibrothermography

    SciTech Connect

    Montanini, R.; Freni, F.; Rossi, G. L.

    2012-09-15

    This paper reports one of the first experimental results on the application of ultrasound activated lock-in vibrothermography for quantitative assessment of buried flaws in complex cast parts. The use of amplitude modulated ultrasonic heat generation allowed selective response of defective areas within the part, as the defect itself is turned into a local thermal wave emitter. Quantitative evaluation of hidden damage was accomplished by estimating independently both the area and the depth extension of the buried flaws, while x-ray 3D computed tomography was used as reference for sizing accuracy assessment. To retrieve the flaw area, a simple yet effective histogram-based phase image segmentation algorithm with automatic pixel classification has been developed. A clear correlation was found between the thermal (phase) signature measured by the infrared camera on the target surface and the actual mean cross-section area of the flaw. Due to the very fast cycle time (<30 s/part), the method could potentially be applied for 100% quality control of casting components.
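Histogram-based phase image segmentation, as described, classifies pixels into defective and sound material; the flaw area is then just the count of defect-class pixels (times the pixel footprint). A minimal threshold-based sketch over an invented phase map (the paper's automatic classification is more elaborate than a fixed threshold):

```python
def segment_area(phase_image, threshold):
    """Classify pixels as defect (phase >= threshold) vs sound material
    and return the defect area in pixels."""
    return sum(1 for row in phase_image for px in row if px >= threshold)

# Hypothetical 4x4 phase map (degrees); defect pixels carry a higher phase lag
phase = [
    [2, 3, 2, 1],
    [2, 40, 45, 2],
    [3, 42, 44, 1],
    [1, 2, 3, 2],
]
print(segment_area(phase, threshold=20))  # 4
```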

  8. Proteus mirabilis biofilm - Qualitative and quantitative colorimetric methods-based evaluation

    PubMed Central

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients has been investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the Dr. Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 was used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods are able to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant. PMID:25763050

  9. Quantitative and Qualitative Evaluation of Iranian Researchers’ Scientific Production in Dentistry Subfields

    PubMed Central

    Yaminfirooz, Mousa; Motallebnejad, Mina; Gholinia, Hemmat; Esbakian, Somayeh

    2015-01-01

    Background: As in other fields of medicine, scientific production in the field of dentistry has a significant place. This study aimed at quantitatively and qualitatively evaluating Iranian researchers' scientific output in the field of dentistry and determining their contribution in each of the dentistry subfields and branches. Methods: This research was a scientometric study that applied quantitative and qualitative indices of the Web of Science (WoS). The research population consisted of 927 indexed documents published under the name of Iran in the time span of 1993-2012, extracted from WoS on 10 March 2013. The Mann-Whitney test and Pearson correlation coefficient were used for data analysis in SPSS 19. Results: 777 (83.73%) of the indexed items of all scientific output in WoS were scientific articles. The highest growth rate of scientific production, 90%, belonged to the endodontics subfield. The correlation coefficient test showed a significant positive relationship between the number of documents and their publication age (P < 0.0001). There was a significant difference between the mean number of published articles in the first ten-year period (1993-2003) and that of the second (2004-2013), in favor of the latter (P = 0.001). Conclusions: The distribution frequencies of scientific production in the various subfields of dentistry were very different. The infrastructure needs to be reinforced for more balanced scientific production in the field and its related subfields. PMID:26635439

  10. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study, biofilm formation by P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains, isolated from urine samples of patients of the dr Antoni Jurasz University Hospital No. 1 clinics in Bydgoszcz between 2011 and 2012, were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains did not show this ability. P. mirabilis rods were shown to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in biofilm-forming ability observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  11. Space Shuttle Underside Astronaut Communications Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Dobbins, Justin A.; Loh, Yin-Chung; Kroll, Quin D.; Sham, Catherine C.

    2005-01-01

    The Space Shuttle Ultra High Frequency (UHF) communications system is planned to provide Radio Frequency (RF) coverage for astronauts working on the underside of the Space Shuttle Orbiter (SSO) for thermal tile inspection and repair. This study assesses Space Shuttle UHF communication performance for astronauts in the shadow region without line-of-sight (LOS) to the Space Shuttle and Space Station UHF antennas. To ensure RF coverage performance at anticipated astronaut worksites, the link margin between the UHF antennas and Extravehicular Activity (EVA) astronauts with significant vehicle structure blockage was analyzed. A series of near-field measurements was performed using the NASA/JSC Anechoic Chamber antenna test facilities. Computational investigations were also performed using electromagnetic modeling techniques. A computer simulation tool based on the Geometrical Theory of Diffraction (GTD) was used to compute the signal strengths, obtained by summing the reflected and diffracted fields along the propagation paths between the transmitting and receiving antennas. Based on the results obtained in this study, RF coverage for UHF communication links was determined for the anticipated astronaut worksites in the shadow region underneath the Space Shuttle.
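
    The GTD field computation in the record above is far beyond a snippet, but the link-margin bookkeeping such a coverage analysis feeds into can be sketched. The sketch below uses plain free-space path loss with a lumped blockage term standing in for the shadow-region loss; every number is illustrative, none is from the study.

```python
import math

# Hedged sketch: link-margin arithmetic only. The study's GTD ray tracing is
# replaced here by free-space path loss plus an assumed blockage loss.

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def link_margin_db(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                   distance_m, freq_hz, rx_sensitivity_dbm,
                   blockage_loss_db=0.0):
    """Received power minus receiver sensitivity; > 0 means the link closes."""
    rx_power = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                - fspl_db(distance_m, freq_hz) - blockage_loss_db)
    return rx_power - rx_sensitivity_dbm

# Illustrative UHF numbers (assumed): 30 dBm transmitter, 0 dBi antennas,
# 100 m path at 400 MHz, -100 dBm sensitivity, 20 dB structural blockage.
margin = link_margin_db(30, 0, 0, 100, 400e6, -100, blockage_loss_db=20)
```

In a real shadow-region analysis the blockage term would come from the reflected and diffracted field computation rather than a fixed constant.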

  12. Performance Evaluation Gravity Probe B Design

    NASA Technical Reports Server (NTRS)

    Francis, Ronnie; Wells, Eugene M.

    1996-01-01

    This final report documents the work done to develop a six-degree-of-freedom simulation of the Lockheed Martin Gravity Probe B (GPB) spacecraft. The simulation includes the effects of vehicle flexibility and propellant slosh, and was used to investigate the control performance of the spacecraft when subjected to realistic on-orbit disturbances.

  13. Game Performance Evaluation in Male Goalball Players

    PubMed Central

    Molik, Bartosz; Morgulec-Adamowicz, Natalia; Kosmol, Andrzej; Perkowski, Krzysztof; Bednarczuk, Grzegorz; Skowroński, Waldemar; Gomez, Miguel Angel; Koc, Krzysztof; Rutkowska, Izabela; Szyman, Robert J

    2015-01-01

    Goalball is a Paralympic sport exclusively for athletes who are visually impaired and blind. The aims of this study were twofold: to describe the game performance of elite male goalball players based upon the degree of visual impairment, and to determine whether game performance was related to anthropometric characteristics of elite male goalball players. The study sample consisted of 44 male goalball athletes. A total of 38 games were recorded during the Summer Paralympic Games in London 2012. Observations were reported using the Game Efficiency Sheet for Goalball. Additional anthropometric measurements included body mass (kg), body height (cm), arm span (cm) and length of the body in the defensive position (cm). The results differentiating the two groups showed that the players with total blindness obtained higher means than the players with visual impairment for game indicators such as the sum of defense (p = 0.03) and the sum of good defense (p = 0.04). The players with visual impairment obtained higher results than those with total blindness for attack efficiency (p = 0.04), the sum of penalty defenses (p = 0.01), and fouls (p = 0.01). The study showed that athletes with blindness demonstrated higher game performance in defense, whereas athletes with visual impairment presented higher efficiency in offensive actions. The analyses confirmed that body mass, body height, arm span and length of the body in the defensive position did not differentiate players' performance at the elite level. PMID:26834872

  14. Using Ratio Analysis to Evaluate Financial Performance.

    ERIC Educational Resources Information Center

    Minter, John; And Others

    1982-01-01

    The ways in which ratio analysis can help in long-range planning, budgeting, and asset management to strengthen financial performance and help avoid financial difficulties are explained. Types of ratios considered include balance sheet ratios, net operating ratios, and contribution and demand ratios. (MSE)

  15. Application performance evaluation of the HTMT architecture.

    SciTech Connect

    Hereld, M.; Judson, I. R.; Stevens, R.

    2004-02-23

    In this report we summarize findings from a study of the predicted performance of a suite of application codes taken from the research environment and analyzed against a modeling framework for the HTMT architecture. We find that the inward bandwidth of the data vortex may be a limiting factor for some applications. We also find that available memory in the cryogenic layer is a constraining factor in the partitioning of applications into parcels. In several examples the architecture may be inadequately exploited; in particular, applications typically did not capitalize well on the available computational power or data organizational capability in the PIM layers. The application suite provided significant examples of wide excursions from the accepted (if simplified) program execution model--in particular, by requiring complex in-SPELL synchronization between parcels. The availability of the HTMT-C emulation environment did not contribute significantly to the ability to analyze applications, because of the large gap between the available hardware descriptions and parameters in the modeling framework and the types of data that could be collected via HTMT-C emulation runs. Detailed analysis of application performance, and indeed further credible development of the HTMT-inspired program execution model and system architecture, requires development of much better tools. Chief among them are cycle-accurate simulation tools for computational, network, and memory components. Additionally, there is a critical need for a whole-system simulation tool to allow detailed programming exercises and performance tests to be developed. We address three issues in this report: (1) the landscape for applications of petaflops computing; (2) the performance of applications on the HTMT architecture; and (3) the effectiveness of HTMT-C as a tool for studying and developing the HTMT architecture. We set the scene with observations about the course of application development as petaflops

  16. Evaluating Suit Fit Using Performance Degradation

    NASA Technical Reports Server (NTRS)

    Margerum, Sarah E.; Cowley, Matthew; Harvill, Lauren; Benson, Elizabeth; Rajulu, Sudhakar

    2012-01-01

    The Mark III planetary technology demonstrator space suit can be tailored to an individual by swapping the modular components of the suit, such as the arms, legs, and gloves, as well as adding or removing sizing inserts in key areas. A method was sought to identify the transition from an ideal suit fit to a bad fit and how to quantify this breakdown using a metric of mobility-based human performance data. To this end, the degradation of the range of motion of the elbow and wrist of the suit as a function of suit sizing modifications was investigated to attempt to improve suit fit. The sizing range tested spanned optimal and poor fit and was adjusted incrementally in order to compare each joint angle across five different sizing configurations. Suited range of motion data were collected using a motion capture system for nine isolated and functional tasks utilizing the elbow and wrist joints. A total of four subjects were tested with motions involving both arms simultaneously as well as the right arm by itself. Findings indicated that no single joint drives the performance of the arm as a function of suit size; instead it is based on the interaction of multiple joints along a limb. To determine a size adjustment range where an individual can operate the suit at an acceptable level, a performance detriment limit was set. This user-selected limit reveals the task-dependent tolerance of the suit fit around optimal size. For example, the isolated joint motion indicated that the suit can deviate from optimal by as little as -0.6 in to -2.6 in before experiencing a 10% performance drop in the wrist or elbow joint. The study identified a preliminary method to quantify the impact of size on performance and developed a new way to gauge tolerances around optimal size.

  17. An hierarchical approach to performance evaluation of expert systems

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1985-01-01

    The number and size of expert systems are growing rapidly. Formal evaluation of these systems - which is not performed for many systems - increases their acceptability by the user community and hence their success. Hierarchical evaluation, as has been conducted for computer systems, is applied to expert system performance evaluation. Expert systems are also evaluated by treating them as software systems (or programs). This paper reports many of the basic concepts and ideas in the Performance Evaluation of Expert Systems Study being conducted at the University of Southwestern Louisiana.

  18. PERFORMANCE EVALUATION OF TYPE I MARINE SANITATION DEVICES

    EPA Science Inventory

    This performance test was designed to evaluate the effectiveness of two Type I Marine Sanitation Devices (MSDs): the Electro Scan Model EST 12, manufactured by Raritan Engineering Company, Inc., and the Thermopure-2, manufactured by Gross Mechanical Laboratories, Inc. Performance...

  19. QUANTITATIVE NON-DESTRUCTIVE EVALUATION (QNDE) OF THE ELASTIC MODULI OF POROUS TIAL ALLOYS

    SciTech Connect

    Yeheskel, O.

    2008-02-28

    The elastic moduli of γ-TiAl were studied in porous samples consolidated by various techniques, e.g., cold isostatic pressing (CIP), pressure-less sintering, or hot isostatic pressing (HIP). Porosity linearly affects the dynamic elastic moduli of the samples. The results indicate that the sound wave velocities and the elastic moduli are affected by the processing route and depend not only on the attained density but also on the consolidation temperature. In this paper we show that there is a linear correlation between the shear and the longitudinal sound velocities in porous TiAl. This opens the way to using a single sound velocity as a tool for quantitative non-destructive evaluation (QNDE) of porous TiAl alloys. Here we demonstrate the applicability of an equation derived from elastic theory and used previously for porous cubic metals.
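
    The dynamic moduli mentioned above follow from standard isotropic-elasticity relations between density and the two sound velocities. The sketch below applies those textbook relations; the illustrative density and velocity values are assumptions of the right order of magnitude, not data from the paper, and the paper's porosity corrections are not reproduced.

```python
# Hedged sketch: textbook isotropic relations converting measured longitudinal
# (vl) and shear (vs) sound velocities plus density into dynamic elastic moduli.

def dynamic_moduli(rho, vl, vs):
    """Return (shear modulus G, Young's modulus E, Poisson's ratio nu) in SI."""
    G = rho * vs**2
    nu = (vl**2 - 2 * vs**2) / (2 * (vl**2 - vs**2))
    E = 2 * G * (1 + nu)
    return G, E, nu

# Illustrative values (assumed, not from the paper):
# rho = 3900 kg/m^3, vl = 7000 m/s, vs = 4000 m/s.
G, E, nu = dynamic_moduli(3900, 7000.0, 4000.0)
```

The paper's point is that in porous TiAl vl and vs are linearly correlated, so one measured velocity plus that correlation suffices to feed these relations.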

  20. An experimental method for quantitatively evaluating the elemental processes of indoor radioactive aerosol behavior.

    PubMed

    Yamazawa, H; Yamada, S; Xu, Y; Hirao, S; Moriizumi, J

    2015-11-01

    An experimental method for quantitatively evaluating the elemental processes governing the indoor behaviour of naturally occurring radioactive aerosols was proposed. The method utilises the transient response of aerosol concentrations to an artificial change in the aerosol removal rate caused by turning an air purifier on and off. It was shown that the indoor-outdoor exchange rate and the indoor deposition rate could be estimated from continuous measurements of outdoor and indoor aerosol number concentrations using the method proposed in this study. Although the scatter of the estimated parameters is relatively large, both methods gave consistent results. It was also found that the size distribution of radioactive aerosol particles, and hence the activity median aerodynamic diameter, remained largely unaffected by the operation of the air purifier, implying the predominance of the exchange and deposition processes over other processes that change the size distribution, such as size growth by coagulation and the size dependence of deposition.
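
    The transient-response idea above reduces, in its simplest form, to fitting an exponential decay of the indoor concentration after the removal rate changes; the difference between the fitted rates with the purifier on and off isolates the purifier's contribution. The sketch below shows only that log-linear fitting step, with synthetic data; it is not the paper's full indoor-outdoor balance model.

```python
import math

# Hedged sketch: least-squares fit of ln(C) vs t recovers the total decay
# constant of an exponentially decaying indoor concentration. Numbers are
# synthetic and illustrative.

def fit_decay_rate(times, concentrations):
    """Least-squares slope of ln(C) vs t; returns the decay constant."""
    logs = [math.log(c) for c in concentrations]
    n = len(times)
    t_mean = sum(times) / n
    l_mean = sum(logs) / n
    num = sum((t - t_mean) * (l - l_mean) for t, l in zip(times, logs))
    den = sum((t - t_mean) ** 2 for t in times)
    return -num / den

# Synthetic data: C(t) = 1000 * exp(-0.8 t); the fit should recover ~0.8.
ts = [0.0, 0.5, 1.0, 1.5, 2.0]
cs = [1000 * math.exp(-0.8 * t) for t in ts]
lam = fit_decay_rate(ts, cs)
```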

  1. [Quantitative evaluation of film-screen combinations for x-ray diagnosis].

    PubMed

    Bronder, T; Heinze-Assmann, R

    1988-05-01

    The properties of screen/film combinations for radiographs set a lower limit on the x-ray exposure of the patient and an upper limit on the quality of the x-ray picture. Sensitivity, slope and resolution of different screen/film combinations were determined using a measuring phantom developed at the PTB. For all screens used, the measurements show the same relation between screen sensitivity and resolution, which allows quantitative evaluation of image quality. A classification scheme derived from these results facilitates the selection of screen/film combinations for practical use. In addition, for quality assurance, gross differences in material properties and conditions of film development can be detected with the aid of the measuring phantom. PMID:3399512

  2. Methods for quantitative evaluation of dynamics of repair proteins within irradiated cells

    NASA Astrophysics Data System (ADS)

    Hable, V.; Dollinger, G.; Greubel, C.; Hauptner, A.; Krücken, R.; Dietzel, S.; Cremer, T.; Drexler, G. A.; Friedl, A. A.; Löwe, R.

    2006-04-01

    Living HeLa cells are irradiated in a precisely targeted manner with single 100 MeV oxygen ions at the superconducting ion microprobe SNAKE, the Superconducting Nanoscope for Applied Nuclear (=Kern-) Physics Experiments, at the Munich 14 MV tandem accelerator. Various proteins that are involved directly or indirectly in repair processes accumulate as clusters (so-called foci) at DNA double-strand breaks (DSBs) induced by the ions. The spatiotemporal dynamics of the foci formed by the phosphorylated histone γ-H2AX are studied. For this purpose cells are irradiated in line patterns. The γ-H2AX is made visible under the fluorescence microscope using immunofluorescence techniques. Quantitative analysis methods are developed to evaluate the microscopic image data in order to analyze the movement of the foci and their changing size.

  3. Quantitative non-destructive evaluation of high-temperature superconducting materials

    SciTech Connect

    Achenbach, J.D.

    1990-09-15

    Even though the currently intensive research efforts on high-temperature superconducting materials have not yet converged on a well specified material, the strong indications are that such a material will be brittle, anisotropic, and may contain many flaws such as microcracks and voids at grain boundaries. Consequently, practical applications of high temperature superconducting materials will require a very careful strength analysis based on fracture mechanics considerations. Because of the high sensitivity of the strength of such materials to the presence of defects, methods of quantitative non-destructive evaluation may be expected to play an important role in strength determinations. This proposal is concerned with the use of ultrasonic methods to detect and characterize isolated cracks, clusters of microcracks and microcracks distributed throughout the material. Particular attention will be devoted to relating ultrasonic results directly to deterministic and statistical linear elastic fracture mechanics considerations.

  4. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.
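
    Of the evaluation criteria listed above, the detection rate is the most mechanical: pair each ground-truth emitter with at most one detected localization within a tolerance radius and report the matched fraction. The greedy nearest-first matching below is a hedged simplification of the matching used in such benchmarks (which may use optimal assignment); all coordinates are illustrative.

```python
import math

# Hedged sketch: score localizations against ground truth by one-to-one
# matching within a tolerance radius, then report recall (detection rate).

def detection_rate(truth, detected, tol):
    """Fraction of ground-truth points matched by a detection within tol."""
    pairs = []
    for i, (tx, ty) in enumerate(truth):
        for j, (dx, dy) in enumerate(detected):
            d = math.hypot(tx - dx, ty - dy)
            if d <= tol:
                pairs.append((d, i, j))
    pairs.sort()  # greedy: closest candidate pairs claimed first
    used_t, used_d = set(), set()
    for d, i, j in pairs:
        if i not in used_t and j not in used_d:
            used_t.add(i)
            used_d.add(j)
    return len(used_t) / len(truth)

truth = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
detected = [(0.5, 0.0), (10.2, 0.1), (40.0, 0.0)]  # third truth point missed
rate = detection_rate(truth, detected, tol=1.0)    # 2 of 3 matched
```

Unmatched detections would count as false positives in the companion precision metric.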

  5. Quantitative MR evaluation of body composition in patients with Duchenne muscular dystrophy.

    PubMed

    Pichiecchio, Anna; Uggetti, Carla; Egitto, Maria Grazia; Berardinelli, Angela; Orcesi, Simona; Gorni, Ksenija Olga Tatiana; Zanardi, Cristina; Tagliabue, Anna

    2002-11-01

    The aim of this study was to propose a quantitative MR protocol with very short acquisition time and good reliability in volume construction, for the evaluation of body composition in patients affected by Duchenne muscular dystrophy (DMD). This MR protocol was compared with common anthropometric evaluations of the same patients. Nine boys affected by DMD, ranging in age from 6 to 12 years, were selected to undergo MR examination. Transversal T1-weighted spin-echo sequences (0.5T; TR 300 ms, TE 10 ms, slice thickness 10 mm, slice gap 1 mm) were used for all acquisitions, each consisting of 8 slices and lasting just 54 s. Whole-body examination needed an average of nine acquisitions. Afterwards, images were downloaded to an independent workstation and, through their electronic segmentation with a reference filter, total volume and adipose tissue volumes were calculated manually. This process took up to 2 h for each patient. The MR data were compared with anthropometric evaluations. Affected children have a marked increase in adipose tissue and a decrease in lean tissue compared with reference healthy controls. Mean fat mass calculated by MR is significantly higher than mean fat mass obtained using anthropometric measurements (p < 0.001). Our MR study proved to be accurate and easy to apply, although it was time-consuming. We recommend it in monitoring the progression of the disease and planning DMD patients' diet.

  6. Quantitative MR evaluation of body composition in patients with Duchenne muscular dystrophy.

    PubMed

    Pichiecchio, Anna; Uggetti, Carla; Egitto, Maria Grazia; Berardinelli, Angela; Orcesi, Simona; Gorni, Ksenija Olga Tatiana; Zanardi, Cristina; Tagliabue, Anna

    2002-11-01

    The aim of this study was to propose a quantitative MR protocol with very short acquisition time and good reliability in volume construction, for the evaluation of body composition in patients affected by Duchenne muscular dystrophy (DMD). This MR protocol was compared with common anthropometric evaluations of the same patients. Nine boys affected by DMD, ranging in age from 6 to 12 years, were selected to undergo MR examination. Transversal T1-weighted spin-echo sequences (0.5T; TR 300 ms, TE 10 ms, slice thickness 10 mm, slice gap 1 mm) were used for all acquisitions, each consisting of 8 slices and lasting just 54 s. Whole-body examination needed an average of nine acquisitions. Afterwards, images were downloaded to an independent workstation and, through their electronic segmentation with a reference filter, total volume and adipose tissue volumes were calculated manually. This process took up to 2 h for each patient. The MR data were compared with anthropometric evaluations. Affected children have a marked increase in adipose tissue and a decrease in lean tissue compared with reference healthy controls. Mean fat mass calculated by MR is significantly higher than mean fat mass obtained using anthropometric measurements (p < 0.001). Our MR study proved to be accurate and easy to apply, although it was time-consuming. We recommend it in monitoring the progression of the disease and planning DMD patients' diet. PMID:12386760

  7. Quantitative analysis of topoisomerase IIα to rapidly evaluate cell proliferation in brain tumors

    SciTech Connect

    Oda, Masashi; Arakawa, Yoshiki; Kano, Hideyuki; Kawabata, Yasuhiro; Katsuki, Takahisa; Shirahata, Mitsuaki; Ono, Makoto; Yamana, Norikazu; Hashimoto, Nobuo; Takahashi, Jun A. . E-mail: jat@kuhp.kyoto-u.ac.jp

    2005-06-17

    Immunohistochemical cell proliferation analyses have come into wide use for evaluation of tumor malignancy. Topoisomerase IIα (topo IIα), an essential nuclear enzyme, is known to have cell-cycle-coupled expression. We here show the usefulness of quantitative analysis of topo IIα mRNA to rapidly evaluate cell proliferation in brain tumors. A protocol to quantify topo IIα mRNA was developed with real-time RT-PCR; quantification from a specimen took only 3 h. A total of 28 brain tumors were analyzed, and the level of topo IIα mRNA was significantly correlated with its immunostaining index (p < 0.0001, r = 0.9077). Furthermore, it sharply detected that topo IIα mRNA decreased in growth-inhibited glioma cells. These results support that topo IIα mRNA may be a good and rapid indicator of cell proliferative potential in brain tumors.
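
    The record above quantifies topo IIα mRNA with real-time RT-PCR but does not give its calculation. As a hedged illustration of how cycle thresholds from such an assay become expression ratios, the sketch below uses the widely known 2^-ddCt (Livak) relative method, which may differ from the absolute, standard-curve quantification the authors actually used.

```python
# Hedged sketch: Livak 2^-ddCt relative quantification, shown only to
# illustrate real-time RT-PCR arithmetic; not necessarily the paper's method.

def fold_change(ct_target_sample, ct_ref_sample,
                ct_target_control, ct_ref_control):
    """Relative expression of the target gene, 2^-ddCt."""
    d_ct_sample = ct_target_sample - ct_ref_sample      # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return 2.0 ** (-dd_ct)

# Example: after normalization, the target crosses threshold 2 cycles later
# in the growth-inhibited sample, i.e. ~4-fold lower topo II mRNA.
ratio = fold_change(25.0, 18.0, 23.0, 18.0)  # -> 0.25
```

The method assumes near-100% amplification efficiency for both target and reference genes.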

  8. Quantitative evaluation of optical coherence tomography signal enhancement with gold nanoshells.

    PubMed

    Agrawal, Anant; Huang, Stanley; Wei Haw Lin, Alex; Lee, Min-Ho; Barton, Jennifer K; Drezek, Rebekah A; Pfefer, T Joshua

    2006-01-01

    Nanoshell-enhanced optical coherence tomography (OCT) is a novel technique with the potential for molecular imaging and improved disease detection. However, optimization of this approach will require a quantitative understanding of the influence of nanoshell parameters on detected OCT signals. In this study, OCT was performed at 1310 nm in water and in turbid tissue-simulating phantoms to which nanoshells were added. The effect of nanoshell concentration, core diameter, and shell thickness on signal enhancement was characterized. Experimental results indicated trends consistent with the predicted optical properties: a monotonic increase in signal intensity and attenuation with increasing shell and core size. Threshold concentrations for a 2-dB OCT signal intensity gain were determined for several nanoshell geometries. For the most highly backscattering nanoshells tested (291-nm core diameter, 25-nm shell thickness), a concentration of 10^9 nanoshells/mL was needed to produce this signal increase. Based on these results, we discuss various practical considerations for optimizing nanoshell-enhanced OCT. The quantitative experimental data presented here will facilitate optimization of OCT-based diagnostics and may also be relevant to other reflectance-based approaches. PMID:16965149

  9. Quantitative Ultrasonic Evaluation of Radiation-Induced Late Tissue Toxicity: Pilot Study of Breast Cancer Radiotherapy

    SciTech Connect

    Liu Tian; Zhou Jun; Yoshida, Emi J.; Woodhouse, Shermian A.; Schiff, Peter B.; Wang, Tony J.C.; Lu Zhengfeng; Pile-Spellman, Eliza; Zhang Pengpeng; Kutcher, Gerald J.

    2010-11-01

    Purpose: To investigate the use of advanced ultrasonic imaging to quantitatively evaluate normal-tissue toxicity in breast-cancer radiation treatment. Methods and Materials: Eighteen breast cancer patients who received radiation treatment were enrolled in an institutional review board-approved clinical study. Radiotherapy involved a radiation dose of 50.0 to 50.4 Gy delivered to the entire breast, followed by an electron boost of 10.0 to 16.0 Gy delivered to the tumor bed. Patients underwent scanning with ultrasound during follow-up, which ranged from 6 to 94 months (median, 22 months) postradiotherapy. Conventional ultrasound images and radio-frequency (RF) echo signals were acquired from treated and untreated breasts. Three ultrasound parameters, namely, skin thickness, Pearson coefficient, and spectral midband fit, were computed from RF signals to measure radiation-induced changes in dermis, hypodermis, and subcutaneous tissue, respectively. Ultrasound parameter values of the treated breast were compared with those of the untreated breast. Ultrasound findings were compared with clinical assessment using Radiation Therapy Oncology Group (RTOG) late-toxicity scores. Results: Significant changes were observed in ultrasonic parameter values of the treated vs. untreated breasts. Average skin thickness increased by 27.3%, from 2.05 ± 0.22 mm to 2.61 ± 0.52 mm; Pearson coefficient decreased by 31.7%, from 0.41 ± 0.07 to 0.28 ± 0.05; and midband fit increased by 94.6%, from -0.92 ± 7.35 dB to 0.87 ± 6.70 dB. Ultrasound evaluations were consistent with RTOG scores. Conclusions: Quantitative ultrasound provides a noninvasive, objective means of assessing radiation-induced changes to the skin and subcutaneous tissue. This imaging tool will become increasingly valuable as we continue to improve radiation therapy technique.

  10. Performance Evaluation Method for Dissimilar Aircraft Designs

    NASA Technical Reports Server (NTRS)

    Walker, H. J.

    1979-01-01

    A rationale is presented for using the square of the wingspan rather than the wing reference area as a basis for nondimensional comparisons of the aerodynamic and performance characteristics of aircraft that differ substantially in planform and loading. Working relationships are developed and illustrated through application to several categories of aircraft covering a range of Mach numbers from 0.60 to 2.00. For each application, direct comparisons of drag polars, lift-to-drag ratios, and maneuverability are shown for both nondimensional systems. The inaccuracies that may arise in the determination of aerodynamic efficiency based on reference area are noted. Span loading is introduced independently in comparing the combined effects of loading and aerodynamic efficiency on overall performance. Performance comparisons are made for the NACA research aircraft, lifting bodies, century-series fighter aircraft, F-111A aircraft with conventional and supercritical wings, and a group of supersonic aircraft including the B-58 and XB-70 bomber aircraft. An idealized configuration is included in each category to serve as a standard for comparing overall efficiency.
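
    The report's core idea above is to nondimensionalize forces by the square of the wingspan (b²) instead of the reference area S, so aircraft with very different planforms can be compared on one footing. A minimal sketch of that bookkeeping, with illustrative numbers only:

```python
# Hedged sketch: area-based vs span-square-based force coefficients.
# q is dynamic pressure; all numeric values below are illustrative.

def coefficients(lift, drag, q, S, b):
    """Return (CL, CD) on an area basis and (CLb, CDb) on a b^2 basis."""
    CL, CD = lift / (q * S), drag / (q * S)
    CLb, CDb = lift / (q * b**2), drag / (q * b**2)
    return CL, CD, CLb, CDb

# Two aircraft with equal span, lift, and drag but different wing areas:
# span-based coefficients coincide while area-based ones differ.
q = 5000.0  # dynamic pressure, Pa
CL1, CD1, CLb1, CDb1 = coefficients(1e5, 8e3, q, S=30.0, b=15.0)
CL2, CD2, CLb2, CDb2 = coefficients(1e5, 8e3, q, S=45.0, b=15.0)
```

This matches the rationale that induced drag scales with span loading (lift per b²) rather than with wing area, making b² the more planform-neutral reference.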

  11. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2012-01-01

    This presentation is part of a panel discussion on Evaluating Models of Human Performance. The purpose of the panel is to discuss the increasing use of models in the world today, with a specific focus on how to describe and evaluate models of human performance. My presentation will focus on generating distributions of performance and on evaluating different strategies for humans performing tasks with mixed-initiative (human-automation) systems. I will also discuss how to provide human performance modeling data to support decisions on acceptability and tradeoffs in the design of safety-critical systems. I will conclude with challenges for the future.

  12. Quantitative evaluation of susceptibility effects caused by dental materials in head magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Strocchi, S.; Ghielmi, M.; Basilico, F.; Macchi, A.; Novario, R.; Ferretti, R.; Binaghi, E.

    2016-03-01

    This work quantitatively evaluates the effects induced by the susceptibility characteristics of materials commonly used in dental practice on the quality of head MR images in a clinical 1.5T device. The proposed evaluation procedure measures the image artifacts induced by susceptibility in MR images, providing an index consistent with the global degradation as perceived by experts. Susceptibility artifacts were evaluated in a near-clinical setup, using a phantom with susceptibility and geometric characteristics similar to those of a human head. We tested different dental materials (PAL Keramit, Ti6Al4V-ELI, Keramit NP, ILOR F, and Zirconia) and different clinical MR acquisition sequences, such as "classical" SE and fast, gradient, and diffusion sequences. The evaluation is designed as a matching process between reference and artifact-affected images recording the same scene. The extent of the degradation induced by susceptibility is then measured in terms of similarity with the corresponding reference image. The matching process involves a multimodal registration task and the use of an adequate, psychophysically validated similarity index based on the correlation coefficient. The proposed analyses are integrated within a computer-supported procedure that interactively guides the users through the different phases of the evaluation method. Two-dimensional and three-dimensional indexes were used for each material and each acquisition sequence. From these, we drew a ranking of the materials by averaging the results obtained. Zirconia and ILOR F appear to be the best choices from the susceptibility-artifact point of view, followed, in order, by PAL Keramit, Ti6Al4V-ELI and Keramit NP.
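
    The similarity index in the record above is based on the correlation coefficient between the registered reference image and the artifact-affected image. The sketch below computes a plain Pearson correlation over pixel values, assuming the two images are already co-registered; the paper's full index, registration step, and psychophysical validation are not reproduced.

```python
# Hedged sketch: Pearson correlation between two co-registered images
# (flattened to pixel lists) as a stand-in for the paper's similarity index.

def pearson_similarity(ref_pixels, test_pixels):
    """Pearson correlation coefficient between two equally sized pixel lists."""
    n = len(ref_pixels)
    mr = sum(ref_pixels) / n
    mt = sum(test_pixels) / n
    cov = sum((r - mr) * (t - mt) for r, t in zip(ref_pixels, test_pixels))
    var_r = sum((r - mr) ** 2 for r in ref_pixels)
    var_t = sum((t - mt) ** 2 for t in test_pixels)
    return cov / (var_r * var_t) ** 0.5

ref = [10, 20, 30, 40, 50]
degraded = [12, 19, 33, 38, 52]  # mild distortion -> correlation near 1
r = pearson_similarity(ref, degraded)
```

Lower correlation then maps to stronger susceptibility-induced degradation, which is the basis of the material ranking.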

  13. What Makes a Good Criminal Justice Professor? A Quantitative Analysis of Student Evaluation Forms

    ERIC Educational Resources Information Center

    Gerkin, Patrick M.; Kierkus, Christopher A.

    2011-01-01

    The goal of this research is to understand how students define teaching effectiveness. By using multivariate regression analysis of 8,000+ student evaluations of teaching compiled by a School of Criminal Justice at a Midwestern public university, this paper explores the relationships between individual indicators of instructor performance (e.g.…

  14. Quantitative evaluation of his-tag purification and immunoprecipitation of tristetraprolin and its mutant proteins from transfected human cells

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Histidine (His)-tag is widely used for affinity purification of recombinant proteins, but the yield and purity of expressed proteins are quite different. Little information is available about quantitative evaluation of this procedure. The objective of the current study was to evaluate the His-tag pr...

  15. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... model used by the Center for Biologics Evaluation and Research (CBER) and suggestions for further...: Richard Forshee, Center for Biologics Evaluation and Research (HFM-210), Food and Drug Administration... disease computer simulation models to generate quantitative estimates of the benefits and risks...

  16. Computer-Aided Image Analysis and Fractal Synthesis in the Quantitative Evaluation of Tumor Aggressiveness in Prostate Carcinomas

    PubMed Central

    Waliszewski, Przemyslaw

    2016-01-01

    The subjective evaluation of tumor aggressiveness is a cornerstone of contemporary tumor pathology. Large intra- and interobserver variability is a known limiting factor of this approach. This fundamental weakness influences the statistical deterministic models of progression risk assessment. It is unlikely that the recent modification of tumor grading according to the Gleason criteria for prostate carcinoma will cause a qualitative change and significantly improve accuracy. The Gleason system does not allow the identification of low-aggressive carcinomas by precise criteria. This ontological dichotomy implies the application of an objective, quantitative approach to the evaluation of tumor aggressiveness as an alternative. That novel approach must be developed and validated in a manner independent of the results of any subjective evaluation. For example, computer-aided image analysis can provide information about the geometry of the spatial distribution of cancer cell nuclei. A series of interrelated complexity measures characterizes the complex tumor images unequivocally. Using those measures, carcinomas can be classified into equivalence classes and compared with each other. Furthermore, those measures define quantitative criteria for the identification of low- and high-aggressive prostate carcinomas, information that the subjective approach is not able to provide. The co-application of those complexity measures in cluster analysis leads to the conclusion that the classification of tumor aggressiveness for prostate carcinomas, whether subjective or objective, should comprise at most three grades (or classes). Finally, this set of global fractal dimensions enables a look into the dynamics of the underlying cellular system of interacting cells and the reconstruction of the temporal-spatial attractor based on Takens’ embedding theorem. Both computer-aided image analysis and the subsequent fractal synthesis could be performed

  17. Computer-Aided Image Analysis and Fractal Synthesis in the Quantitative Evaluation of Tumor Aggressiveness in Prostate Carcinomas.

    PubMed

    Waliszewski, Przemyslaw

    2016-01-01

    The subjective evaluation of tumor aggressiveness is a cornerstone of contemporary tumor pathology. Large intra- and interobserver variability is a known limiting factor of this approach. This fundamental weakness influences the statistical deterministic models of progression risk assessment. It is unlikely that the recent modification of tumor grading according to the Gleason criteria for prostate carcinoma will cause a qualitative change and significantly improve accuracy. The Gleason system does not allow the identification of low-aggressive carcinomas by precise criteria. This ontological dichotomy implies the application of an objective, quantitative approach to the evaluation of tumor aggressiveness as an alternative. That novel approach must be developed and validated in a manner independent of the results of any subjective evaluation. For example, computer-aided image analysis can provide information about the geometry of the spatial distribution of cancer cell nuclei. A series of interrelated complexity measures characterizes the complex tumor images unequivocally. Using those measures, carcinomas can be classified into equivalence classes and compared with each other. Furthermore, those measures define quantitative criteria for the identification of low- and high-aggressive prostate carcinomas, information that the subjective approach is not able to provide. The co-application of those complexity measures in cluster analysis leads to the conclusion that the classification of tumor aggressiveness for prostate carcinomas, whether subjective or objective, should comprise at most three grades (or classes). Finally, this set of global fractal dimensions enables a look into the dynamics of the underlying cellular system of interacting cells and the reconstruction of the temporal-spatial attractor based on Takens' embedding theorem. Both computer-aided image analysis and the subsequent fractal synthesis could be performed
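The global fractal dimension mentioned in the abstract is commonly estimated by box counting. The following is a minimal illustrative sketch (not the authors' pipeline): cover point coordinates, e.g. cell-nucleus centroids normalized to the unit square, with grids of increasing resolution and fit the slope of log(occupied boxes) versus log(grid size):

```python
import numpy as np

def box_counting_dimension(points, scales=(2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension of 2-D points in the unit square."""
    pts = np.asarray(points, dtype=float)
    counts = []
    for n in scales:
        # Assign each point to a cell of an n-by-n grid; count occupied cells.
        cells = np.floor(np.clip(pts, 0.0, 1.0 - 1e-12) * n).astype(int)
        counts.append(len({tuple(c) for c in cells}))
    # The dimension is the slope of log(count) against log(scale).
    slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
    return float(slope)

# Sanity checks: points filling the plane give dimension ~2,
# points on a line give dimension ~1.
rng = np.random.default_rng(0)
filled = rng.random((20000, 2))
print(round(box_counting_dimension(filled), 2))
```

A space-filling pattern of nuclei yields a dimension near 2, while clustered or filamentous patterns fall below it, which is what makes such measures usable as objective aggressiveness criteria.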

  18. High-performance piezoelectric nanogenerators for self-powered nanosystems: quantitative standards and figures of merit

    NASA Astrophysics Data System (ADS)

    Wu, Wenzhuo

    2016-03-01

    Harvesting energy from the atmosphere cost-effectively is critical both for addressing worldwide long-term energy needs at the macro-scale and for achieving the sustainable, maintenance-free operation of nanodevices at the micro-scale (Wang and Wu 2012 Angew. Chem. Int. Ed. 51 11700-21). Piezoelectric nanogenerator (NG) technology has demonstrated great application potential for harvesting ubiquitous and abundant mechanical energy. Despite the progress made in this rapidly advancing field, a fundamental understanding and a common standard for consistently quantifying and evaluating the performance of the various types of piezoelectric NGs are still lacking. In their recent study, Crossley and Kar-Narayan (2015 Nanotechnology 26 344001) systematically investigated the dynamical properties of piezoelectric NGs by taking into account the effect of driving mechanism and load frequency on NG performance. They further defined the NGs’ figures of merit as the energy harvested normalized by the applied strain or stress for NGs under strain-driven or stress-driven conditions, respectively, both of which are common in vibrational energy harvesting. This work provides new insight and a feasible approach for consistently evaluating piezoelectric nanomaterials and NG devices, which is important for designing and optimizing nanoscale piezoelectric energy harvesters, as well as for promoting their applications in emerging areas such as the Internet of Things, wearable devices, and self-powered nanosystems.

  19. High-performance piezoelectric nanogenerators for self-powered nanosystems: quantitative standards and figures of merit.

    PubMed

    Wu, Wenzhuo

    2016-03-18

    Harvesting energy from the atmosphere cost-effectively is critical both for addressing worldwide long-term energy needs at the macro-scale and for achieving the sustainable, maintenance-free operation of nanodevices at the micro-scale (Wang and Wu 2012 Angew. Chem. Int. Ed. 51 11700-21). Piezoelectric nanogenerator (NG) technology has demonstrated great application potential for harvesting ubiquitous and abundant mechanical energy. Despite the progress made in this rapidly advancing field, a fundamental understanding and a common standard for consistently quantifying and evaluating the performance of the various types of piezoelectric NGs are still lacking. In their recent study, Crossley and Kar-Narayan (2015 Nanotechnology 26 344001) systematically investigated the dynamical properties of piezoelectric NGs by taking into account the effect of driving mechanism and load frequency on NG performance. They further defined the NGs' figures of merit as the energy harvested normalized by the applied strain or stress for NGs under strain-driven or stress-driven conditions, respectively, both of which are common in vibrational energy harvesting. This work provides new insight and a feasible approach for consistently evaluating piezoelectric nanomaterials and NG devices, which is important for designing and optimizing nanoscale piezoelectric energy harvesters, as well as for promoting their applications in emerging areas such as the Internet of Things, wearable devices, and self-powered nanosystems. PMID:26871611
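The normalisation idea summarised above can be illustrated numerically. This is a hedged sketch of "energy harvested normalized by applied strain or stress"; the exact definitions are in Crossley and Kar-Narayan (2015), and the numbers below are invented for demonstration:

```python
def figure_of_merit(energy_harvested, applied_input):
    """Energy harvested per unit applied strain (strain-driven NG) or per
    unit applied stress (stress-driven NG), following the normalisation
    idea summarised in the abstract. Units depend on the input used."""
    return energy_harvested / applied_input

# Two hypothetical strain-driven nanogenerators under the same 0.1% strain:
fom_a = figure_of_merit(2.0e-9, 1.0e-3)   # device A harvests 2 nJ
fom_b = figure_of_merit(5.0e-10, 1.0e-3)  # device B harvests 0.5 nJ
print(fom_a > fom_b)  # device A performs better at equal applied strain
```

Normalising by the mechanical input is what allows devices tested under different driving conditions to be compared on a common footing.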

  1. HENC performance evaluation and plutonium calibration

    SciTech Connect

    Menlove, H.O.; Baca, J.; Pecos, J.M.; Davidson, D.R.; McElroy, R.D.; Brochu, D.B.

    1997-10-01

    The authors have designed a high-efficiency neutron counter (HENC) to assay the plutonium content of 200-L waste drums. The counter uses totals neutron counting, coincidence counting, and multiplicity counting to determine the plutonium mass. The HENC was developed as part of a Cooperative Research and Development Agreement between the Department of Energy and Canberra Industries. This report presents the results of the detector modifications, the performance tests, the add-a-source calibration, and the plutonium calibration at Los Alamos National Laboratory (TA-35) in 1996.

  2. ATAMM enhancement and multiprocessing performance evaluation

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.

    1994-01-01

    The algorithm to architecture mapping model (ATAMM) is a Petri-net-based model that provides a strategy for periodic execution of a class of real-time algorithms on multicomputer dataflow architectures. The execution of large-grained, decision-free algorithms on homogeneous processing elements is studied. The ATAMM provides an analytical basis for calculating performance bounds on throughput characteristics. Extension of the ATAMM as a strategy for cyclo-static scheduling provides for a truly distributed ATAMM multicomputer operating system. An ATAMM testbed consisting of a centralized graph manager and three processors is described, using embedded firmware on 68HC11 microcontrollers.

  3. Phased array performance evaluation with photoelastic visualization

    SciTech Connect

    Ginzel, Robert; Dao, Gavin

    2014-02-18

    New instrumentation and a widening range of phased array transducer options are affording the industry a greater potential. Visualization of the complex wave components using the photoelastic system can greatly enhance understanding of the generated signals. Diffraction, mode conversion and wave front interaction, together with beam forming for linear, sectorial and matrix arrays, will be viewed using the photoelastic system. Beam focus and steering performance will be shown with a range of embedded and surface targets within glass samples. This paper will present principles and sound field images using this visualization system.

  4. Evaluation of dental enamel caries assessment using Quantitative Light Induced Fluorescence and Optical Coherence Tomography.

    PubMed

    Maia, Ana Marly Araújo; de Freitas, Anderson Zanardi; de L Campello, Sergio; Gomes, Anderson Stevens Leônidas; Karlsson, Lena

    2016-06-01

    An in vitro study of the morphological alterations between sound dental structure and artificially induced white-spot lesions in human teeth was performed, using the loss of fluorescence measured by Quantitative Light-Induced Fluorescence (QLF) and the change in the light attenuation coefficient measured by Optical Coherence Tomography (OCT). The OCT images from a commercially available system were analyzed with a special algorithm, whereas the QLF images were analyzed using the software supplied with the commercial system employed. When the sound region was compared with the white-spot lesion region, a reduction in fluorescence intensity was observed by QLF, while an increase in light attenuation was observed by OCT. Comparison of the percentage of alteration in the optical properties of sound and artificial enamel caries regions showed that OCT images processed for light attenuation revealed the optical alterations of the tooth more clearly than the fluorescence loss detected by the QLF system.

  5. High performance liquid chromatographic assay for the quantitation of total glutathione in plasma

    NASA Technical Reports Server (NTRS)

    Abukhalaf, Imad K.; Silvestrov, Natalia A.; Menter, Julian M.; von Deutsch, Daniel A.; Bayorh, Mohamed A.; Socci, Robin R.; Ganafa, Agaba A.

    2002-01-01

    A simple and widely used homocysteine HPLC procedure was applied to the HPLC identification and quantitation of glutathione in plasma. The method, which uses SBDF as a derivatizing agent, requires only 50 microliters of sample. A linear quantitative response curve was generated for glutathione over a concentration range of 0.3125-62.50 micromol/l. Linear regression analysis of the standard curve yielded a correlation coefficient of 0.999. Limit of detection (LOD) and limit of quantitation (LOQ) values were 5.0 and 15 pmol, respectively. Glutathione recovery using this method was nearly complete (above 96%). Intra-assay and inter-assay precision studies reflected a high level of reliability and reproducibility of the method. The applicability of the method for the quantitation of glutathione was demonstrated successfully using human and rat plasma samples.
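The calibration arithmetic behind such an assay can be sketched as follows. The standard concentrations below lie in the assay's stated range, but the detector responses are hypothetical, not the paper's measurements:

```python
import numpy as np

# Hypothetical calibration standards (micromol/L) and detector peak areas.
conc = np.array([0.3125, 0.625, 1.25, 2.5, 5.0, 10.0])
area = np.array([0.41, 0.80, 1.62, 3.21, 6.40, 12.85])

# Fit the linear response curve and check linearity via the correlation r.
slope, intercept = np.polyfit(conc, area, 1)
r = float(np.corrcoef(conc, area)[0, 1])

def quantify(peak_area):
    """Invert the calibration line to get a concentration in micromol/L."""
    return (peak_area - intercept) / slope

print(round(r, 4))               # near-perfect linearity, as in the assay
print(round(quantify(3.21), 2))  # recovers ~2.5 micromol/L
```

In practice the LOD and LOQ quoted in the abstract would be derived from the noise of blank injections relative to this fitted slope.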

  6. Holistic Evaluation of Quality Consistency of Ixeris sonchifolia (Bunge) Hance Injectables by Quantitative Fingerprinting in Combination with Antioxidant Activity and Chemometric Methods

    PubMed Central

    Yang, Lanping; Sun, Guoxiang; Guo, Yong; Hou, Zhifei; Chen, Shuai

    2016-01-01

    A widely used herbal medicine, Ixeris sonchifolia (Bge.) Hance Injectable (ISHI), was investigated for quality consistency. Characteristic fingerprints of 23 batches of ISHI samples were generated at five wavelengths and evaluated by the systematic quantitative fingerprint method (SQFM), together with simultaneous analysis of the content of seven marker compounds. Chemometric methods, i.e., support vector machine (SVM) and principal component analysis (PCA), were performed to assist in the fingerprint evaluation of the ISHI samples. Qualitative classification of the ISHI samples by SVM was consistent with PCA and in agreement with the quantitative evaluation by SQFM. In addition, the antioxidant activities of the ISHI samples were determined by both off-line and on-line DPPH (2,2-diphenyl-1-picrylhydrazyl) radical scavenging assays. A fingerprint–efficacy relationship linking the chemical components and in vitro antioxidant activity was established and validated using the partial least squares (PLS) and orthogonal projection to latent structures (OPLS) models, and the on-line DPPH assay further revealed the components that made positive contributions to the total antioxidant activity. Therefore, the combined use of chemometric methods, quantitative fingerprint evaluation by SQFM, and multiple marker compound analysis, in conjunction with an assay of antioxidant activity, provides a powerful and holistic approach to evaluating the quality consistency of herbal medicines and their preparations. PMID:26872364

  7. Inclusion and Student Learning: A Quantitative Comparison of Special and General Education Student Performance Using Team and Solo-Teaching

    ERIC Educational Resources Information Center

    Jamison, Joseph A.

    2013-01-01

    This quantitative study sought to determine whether there were significant statistical differences between the performance scores of special education and general education students' scores when in team or solo-teaching environments as may occur in inclusively taught classrooms. The investigated problem occurs because despite education's stated…

  8. Examination of Information Technology (IT) Certification and the Human Resources (HR) Professional Perception of Job Performance: A Quantitative Study

    ERIC Educational Resources Information Center

    O'Horo, Neal O.

    2013-01-01

    The purpose of this quantitative survey study was to test the Leontief input/output theory relating the input of IT certification to the output of the English-speaking U.S. human resource professional perceived IT professional job performance. Participants (N = 104) rated their perceptions of IT certified vs. non-IT certified professionals' job…

  9. Traction contact performance evaluation at high speeds

    NASA Technical Reports Server (NTRS)

    Tevaarwerk, J. L.

    1981-01-01

    The results of traction tests performed on two fluids are presented. These tests covered a pressure range of 1.0 to 2.5 GPa, an inlet temperature range of 30 °C to 70 °C, a speed range of 10 to 80 m/sec, aspect ratios of 0.5 to 5, and spin from 0 to 2.1 percent. The test results are presented in the form of two dimensionless parameters: the initial traction slope and the maximum traction peak. With the use of a suitable rheological fluid model, the measured traction curves can be reconstituted from these two fluid parameters. More importantly, knowledge of these parameters, together with the fluid rheological model, allows the prediction of traction under conditions of spin, slip, and any combination thereof. Comparison between the traction predicted under these conditions and that measured in actual traction tests shows that this method gives good results.
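The abstract states that a traction curve can be rebuilt from just two parameters: the initial slope and the traction peak. As a purely illustrative (assumed) rheological form, not necessarily the model used in the paper, a tanh law reproduces both limits, with slope m0 at zero slip and a plateau at mu_max:

```python
import math

def traction_curve(slip, m0, mu_max):
    """Traction coefficient vs slip ratio: initial slope m0, peak mu_max.

    This tanh form is one simple choice that matches both fluid
    parameters; other saturating functions would serve equally well.
    """
    return mu_max * math.tanh(m0 * slip / mu_max)

# Illustrative parameter values (not from the paper's measurements):
m0, mu_max = 8.0, 0.08
# Small slip recovers the initial slope; large slip saturates at the peak.
print(round(traction_curve(1e-6, m0, mu_max) / 1e-6, 2))  # ≈ m0
print(round(traction_curve(0.05, m0, mu_max), 3))         # ≈ mu_max
```

Any two-parameter saturating curve of this kind makes the paper's point concrete: once slope and peak are known, the whole curve follows from the model.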

  10. Evaluation of quantitative PCR combined with PMA treatment for molecular assessment of microbial water quality.

    PubMed

    Gensberger, Eva Theres; Polt, Marlies; Konrad-Köszler, Marianne; Kinner, Paul; Sessitsch, Angela; Kostić, Tanja

    2014-12-15

    Microbial water quality assessment currently relies on cultivation-based methods. Nucleic acid-based techniques such as quantitative PCR (qPCR) enable more rapid and specific detection of target organisms, and propidium monoazide (PMA) treatment facilitates the exclusion of false-positive results caused by DNA from dead cells. Established molecular assays (qPCR and PMA-qPCR) for legally defined microbial quality parameters (Escherichia coli, Enterococcus spp. and Pseudomonas aeruginosa) and for the indicator organism group of coliforms (implemented as molecular detection of Enterobacteriaceae) were evaluated against conventional microbiological methods. The evaluation of an extended set of drinking and process water samples showed that PMA-qPCR for E. coli, Enterococcus spp. and P. aeruginosa resulted in higher specificity, because a substantial or complete reduction of false-positive signals was obtained in comparison to qPCR. Complete compliance with the reference method was achieved for E. coli PMA-qPCR, and 100% specificity for Enterococcus spp. and P. aeruginosa in the evaluation of process water samples. A major challenge remained in the sensitivity of the assays, exhibited through false-negative results (7-23%), which is presumably due to insufficient sample preparation (i.e. concentration of bacteria and DNA extraction) rather than the qPCR limit of detection. For the detection of the indicator group of coliforms, the evaluation study revealed that alternative molecular assays based on the taxonomic group of Enterobacteriaceae were not adequate. Given careful optimization of the sensitivity, the highly specific PMA-qPCR could be a valuable tool for rapid detection of hygienic parameters such as E. coli, Enterococcus spp. and P. aeruginosa.
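The comparative-evaluation arithmetic implied above reduces to a confusion matrix against the cultivation reference. A minimal sketch with hypothetical counts (not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity and specificity of an assay versus the reference method.

    tp/fn: reference-positive samples the assay did/did not detect;
    tn/fp: reference-negative samples the assay did/did not confirm.
    """
    sensitivity = tp / (tp + fn)  # fraction of true positives detected
    specificity = tn / (tn + fp)  # fraction of true negatives confirmed
    return sensitivity, specificity

# Hypothetical example: 100 samples, 30 positive by cultivation; the
# molecular assay misses 4 (a false-negative rate inside the 7-23% range
# reported above) and produces no false positives.
sens, spec = diagnostic_metrics(tp=26, fp=0, tn=70, fn=4)
print(sens, spec)
```

In this framing, PMA treatment raises specificity (fewer fp from dead-cell DNA), while the sample-preparation losses discussed above depress sensitivity (more fn).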

  11. Evaluation of board performance in Iran’s universities of medical sciences

    PubMed Central

    Sajadi, Haniye Sadat; Maleki, Mohammadreza; Ravaghi, Hamid; Farzan, Homayoun; Aminlou, Hasan; Hadi, Mohammad

    2014-01-01

    Background: The critical role that the board plays in the governance of universities clarifies the necessity of evaluating its performance. This study aimed to evaluate the performance of the boards of medical universities and to provide solutions to enhance their performance. Methods: The first phase of the study was qualitative research in which data were collected through face-to-face semi-structured interviews and analyzed using a thematic approach. The second phase was a mixed qualitative and quantitative study, with the quantitative part in a cross-sectional format and the qualitative part in a content analysis format. In the quantitative part, data were collected through the Ministry of Health and Medical Education (MoHME). In the qualitative part, the content of 2,148 resolutions, selected using a stratified sampling method, was analyzed. Results: Participants believed that the boards had not performed acceptably for a long time. Results also indicated an increasing number of board meetings and resolutions over these 21 years. The boards’ resolutions were mostly operational in domain and administrative in nature. The share of specific resolutions was greater than that of general ones. Conclusion: Given the current pace of change and development and the need to respond to them in a timely manner, it is recommended that the slow improvement process of the boards be accelerated. More delegation and a stronger position for the boards appear to be effective strategies to speed up this process. PMID:25337597

  12. 48 CFR 3052.216-72 - Performance evaluation plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Performance evaluation... CONTRACT CLAUSES Text of Provisions and Clauses 3052.216-72 Performance evaluation plan. As prescribed in (HSAR) 48 CFR 3016.406(e)(i)(ii), insert a clause substantially the same as the following:...

  13. Evaluation of Section Heads' Performance at Kuwait Secondary Schools

    ERIC Educational Resources Information Center

    Al-Hamdan, Jasem M.; Al-Yacoub, Ali M.

    2005-01-01

    Purpose: The study attempts to examine the viewpoints of those involved in evaluating the performance of section heads in Kuwait secondary schools, mainly section heads themselves, supervisors, and principals. It sets out to determine the strengths and weaknesses of the performance evaluation form designed for section heads.…

  14. Sexism and Beautyism in Women's Evaluations of Peer Performance.

    ERIC Educational Resources Information Center

    Cash, Thomas F.; Trimer, Claire A.

    1984-01-01

    Investigated independent and interactive effects of physical attractiveness (PA), sex, and task sex-typing on performance evaluations by 216 college women. Found that the halo effect ("beauty is talent") of PA operated when subjects evaluated both sexes, with the exception of ratings of attractive women in out-of-role ("masculine") performances.…

  15. Genetic variability of oil palm parental genotypes and performance of its progenies as revealed by molecular markers and quantitative traits.

    PubMed

    Abdullah, Norziha; Rafii Yusop, Mohd; Ithnin, Maizura; Saleh, Ghizan; Latif, M A

    2011-04-01

    Studies were conducted to assess the genetic relationships between the parental palms (dura and pisifera) and the performance of their progenies based on nine microsatellite markers and 29 quantitative traits. Correlations between genetic distances and hybrid performance were estimated. The correlation coefficients between genetic distances and hybrid performance were non-significant, except for mean nut weight and leaf number; even for these characters, however, the correlations were too low to be of predictive value. These results indicated that genetic distances based on microsatellite markers may not be useful for predicting hybrid performance. Genetic distance analysis using the UPGMA clustering system generated five genetic clusters at a coefficient of 1.26 based on the quantitative traits of the progenies. The genotypes DP16, DP14, DP4, DP13, DP12, DP15, DP8, DP1, and DP2, belonging to distant clusters with greater genetic distances, could be selected for further breeding programs.

  16. Evaluation of PV Module Field Performance

    SciTech Connect

    Wohlgemuth, John; Silverman, Timothy; Miller, David C.; McNutt, Peter; Kempe, Michael; Deceglie, Michael

    2015-06-14

    This paper describes an effort to inspect and evaluate PV modules in order to determine what failure or degradation modes are occurring in field installations. This paper will report on the results of six site visits, including the Sacramento Municipal Utility District (SMUD) Hedge Array, Tucson Electric Power (TEP) Springerville, Central Florida Utility, Florida Solar Energy Center (FSEC), the TEP Solar Test Yard, and University of Toledo installations. The effort here makes use of a recently developed field inspection data collection protocol, and the results were input into a corresponding database. The results of this work have also been used to develop a draft of the IEC standard for climate and application specific accelerated stress testing beyond module qualification.

  17. Evaluation of ECCS performance for an SBWR

    SciTech Connect

    Abe, Nobuaki; Arai, Kenji; Hamazaki, Ryouichi; Nagasaka, Hideo

    1990-01-01

    A simplified boiling water reactor (SBWR), one of the next generation of light water reactors, is now under development. From the safety viewpoint, the SBWR is characterized by the adoption of a passive emergency core cooling system (ECCS) and a passive containment cooling system (PCCS). The ECCS network for an SBWR consists of depressurization valves (DPVs) and a gravity-driven cooling system (GDCS). The DPVs and GDCS are designed to keep the core covered with water following any loss-of-coolant accident (LOCA), assuming a single failure in the ECCS. The SAPPHIRE code has been developed in order to evaluate the effectiveness of the ECCS of the SBWR; it calculates the short-term thermal-hydraulic phenomena simultaneously inside the containment, including the RPV, drywell, and wetwell. The predictive capability of SAPPHIRE for SBWR LOCA analysis has been demonstrated by a comparison with the best-estimate TRAC code. Both the SAPPHIRE and TRAC codes indicate no core uncovery during a maximum drain line break.

  18. Evaluating hospital performance based on excess cause-specific incidence.

    PubMed

    Van Rompaye, Bart; Eriksson, Marie; Goetghebeur, Els

    2015-04-15

    Formal evaluation of hospital performance in specific types of care is becoming an indispensable tool for quality assurance in the health care system. When the prime concern lies in reducing the risk of a cause-specific event, we propose to evaluate performance in terms of an average excess cumulative incidence, referring to the center's observed patient mix. Its intuitive interpretation helps give meaning to the evaluation results and facilitates the determination of important benchmarks for hospital performance. We apply it to the evaluation of cerebrovascular deaths after stroke in Swedish stroke centers, using data from Riksstroke, the Swedish stroke registry.
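The headline quantity above can be sketched directly: for each patient in the center's own mix, take the difference between the center-specific cumulative incidence and the reference incidence for a comparable patient, then average. A minimal illustration with hypothetical risks (not Riksstroke data):

```python
def average_excess_incidence(observed_risks, reference_risks):
    """Mean of (observed - reference) cause-specific cumulative incidence
    over the center's own patient mix; positive values indicate more
    cause-specific events than expected under reference care."""
    pairs = list(zip(observed_risks, reference_risks))
    return sum(o - r for o, r in pairs) / len(pairs)

# Three hypothetical patients: center-specific vs reference 90-day risks
# of cerebrovascular death after stroke.
obs = [0.12, 0.08, 0.20]
ref = [0.10, 0.07, 0.15]
print(average_excess_incidence(obs, ref))  # positive → worse than reference
```

Referring the comparison to the center's observed patient mix is what gives the measure its intuitive interpretation: it answers "how many excess events per patient did this center's own population experience?"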

  19. A Quantitative and Qualitative Evaluation of Sentence Boundary Detection for the Clinical Domain

    PubMed Central

    Griffis, Denis; Shivade, Chaitanya; Fosler-Lussier, Eric; Lai, Albert M.

    2016-01-01

    Sentence boundary detection (SBD) is a critical preprocessing task for many natural language processing (NLP) applications. However, there has been little work on evaluating how well existing methods for SBD perform in the clinical domain. We evaluate five popular off-the-shelf NLP toolkits on the task of SBD in various kinds of text using a diverse set of corpora, including the GENIA corpus of biomedical abstracts, a corpus of clinical notes used in the 2010 i2b2 shared task, and two general-domain corpora (the British National Corpus and Switchboard). We find that, with the exception of the cTAKES system, the toolkits we evaluate perform noticeably worse on clinical text than on general-domain text. We identify and discuss major classes of errors, and suggest directions for future work to improve SBD methods in the clinical domain. We also make the code used for SBD evaluation in this paper available for download at http://github.com/drgriffis/SBD-Evaluation. PMID:27570656
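SBD output is typically scored by comparing predicted boundary positions with gold-standard positions. The sketch below mirrors standard evaluation practice rather than the exact scorer behind the paper's numbers, and the example offsets are invented:

```python
def boundary_scores(predicted, gold):
    """Precision, recall, and F1 over sets of sentence-boundary offsets."""
    predicted, gold = set(predicted), set(gold)
    tp = len(predicted & gold)  # boundaries the toolkit got right
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical clinical note: gold boundaries at character offsets 40, 85,
# 130; the toolkit missed one boundary and split inside an abbreviation.
p, r, f1 = boundary_scores(predicted={40, 62, 85}, gold={40, 85, 130})
print(p, r, f1)
```

Errors of both kinds occur more often in clinical notes, whose headings, list items, and abbreviations violate the punctuation cues general-domain toolkits rely on.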

  20. A Quantitative and Qualitative Evaluation of Sentence Boundary Detection for the Clinical Domain.

    PubMed

    Griffis, Denis; Shivade, Chaitanya; Fosler-Lussier, Eric; Lai, Albert M

    2016-01-01

    Sentence boundary detection (SBD) is a critical preprocessing task for many natural language processing (NLP) applications. However, there has been little work on evaluating how well existing methods for SBD perform in the clinical domain. We evaluate five popular off-the-shelf NLP toolkits on the task of SBD in various kinds of text using a diverse set of corpora, including the GENIA corpus of biomedical abstracts, a corpus of clinical notes used in the 2010 i2b2 shared task, and two general-domain corpora (the British National Corpus and Switchboard). We find that, with the exception of the cTAKES system, the toolkits we evaluate perform noticeably worse on clinical text than on general-domain text. We identify and discuss major classes of errors, and suggest directions for future work to improve SBD methods in the clinical domain. We also make the code used for SBD evaluation in this paper available for download at http://github.com/drgriffis/SBD-Evaluation. PMID:27570656

  1. Quantitative evaluation of reactive nitrogen emissions with urbanization: a case study in Beijing megacity, China.

    PubMed

    Xian, Chaofan; Ouyang, Zhiyun; Lu, Fei; Xiao, Yang; Li, Yanmin

    2016-09-01

    The rapid increase in anthropogenic nitrogen (N) load in urbanized environments threatens urban sustainability. In this study, we estimated the amount of reactive N (Nr) as an index of N pollution potential caused by human activities, using the megacity of Beijing as a case study. We investigated the temporal changes in Nr emissions in the environment from 2000 to 2012 using a multidisciplinary approach with quantitative evaluation. Nr emissions increased slightly over the study period, with an annual emission of 0.19 Tg N, mainly from fuel combustion. Nevertheless, the Nr output intensity resulting from inhabitants' livelihoods and material production weakened over the study period. The evaluation results showed that the environmental measures to remove Nr in Beijing were efficient in most years, suggesting significant progress in mitigating the growth of the Nr load in this urban environment. Further measures based on N offsetting are suggested that could help alleviate the environmental pressure resulting from anthropogenic Nr emissions and provide theoretical support for the sustainable development of megacities. PMID:27240830

  2. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-08-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve 10-day to 5-month sea ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett ice severity index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.

  3. Regression of white spot enamel lesions. A new optical method for quantitative longitudinal evaluation in vivo.

    PubMed

    Ogaard, B; Ten Bosch, J J

    1994-09-01

    This article describes a new nondestructive optical method for evaluation of lesion regression in vivo. White spot caries lesions were induced with orthodontic bands in two vital premolars of seven patients. The teeth were banded for 4 weeks with special orthodontic bands that allowed plaque accumulation on the buccal surface. The teeth were left in the dentition for 2 or 4 weeks after debanding. Regular oral hygiene with a nonfluoridated toothpaste was applied during the entire experimental period. The optical scattering coefficient of the banded area was measured before banding and at 1-week intervals thereafter. The scattering coefficient returned to the sound value in an exponential manner, with a half-value time of 1.1 weeks for left teeth and 1.8 weeks for right teeth, a significant difference (p = 0.035). At the start of the regression period, the scattering coefficient of left-teeth lesions was 2.5 times as high as that of right-teeth lesions (p = 0.09). It is concluded that regression of initial lesions in the presence of saliva is a relatively rapid process. The new optical method may be of clinical importance for quantitative evaluation of enamel lesion regression developed during fixed appliance therapy.
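
    The exponential-return model behind the half-value times above can be fitted with a simple log-linear least-squares sketch; the function, the synthetic data, and the sound-enamel value of 10 are invented for illustration:

```python
import math

def half_value_time(times, values, sound_value):
    """Fit s(t) = sound + A * exp(-k t) by log-linear least squares."""
    ys = [math.log(v - sound_value) for v in values]
    n = len(times)
    mt, my = sum(times) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(times, ys))
             / sum((t - mt) ** 2 for t in times))
    return math.log(2.0) / -slope       # half-value time = ln 2 / k

# Synthetic weekly scattering coefficients with a true half-value time of
# 1.1 weeks (arbitrary units):
k = math.log(2.0) / 1.1
weeks = [0, 1, 2, 3, 4, 5]
s = [10.0 + 15.0 * math.exp(-k * t) for t in weeks]
t_half = half_value_time(weeks, s, 10.0)
```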

  4. Reduction of motion blurring artifacts using respiratory gated CT in sinogram space: A quantitative evaluation

    SciTech Connect

    Lu Wei; Parikh, Parag J.; Hubenschmidt, James P.; Politte, David G.; Whiting, Bruce R.; Bradley, Jeffrey D.; Mutic, Sasa; Low, Daniel A.

    2005-11-15

    Techniques have been developed for reducing motion blurring artifacts by using respiratory gated computed tomography (CT) in sinogram space and quantitatively evaluating the artifact reduction. A synthetic sinogram was built from multiple scans intercepting a respiratory gating window. A gated CT image was then reconstructed using the filtered back-projection algorithm. Wedge phantoms, developed for quantifying the motion artifact reduction, were scanned while being moved using a computer-controlled linear stage. The resulting artifacts appeared between the high and low density regions as an apparent feature with a Hounsfield value that was the average of the two regions. A CT profile through these regions was fit using two error functions, each modeling the partial-volume averaging characteristics for the unmoving phantom. The motion artifact was quantified by determining the apparent distance between the two functions. The blurring artifact had a linear relationship with both the speed and the tangent of the wedge angles. When gating was employed, the blurring artifact was reduced systematically at the air-phantom interface. The gated image of phantoms moving at 20 mm/s showed similar blurring artifacts as the nongated image of phantoms moving at 10 mm/s. Nine patients were also scanned using the synchronized respiratory motion technique. Image artifacts were evaluated in the diaphragm, where high contrast interfaces intercepted the imaging plane. For patients, this respiratory gating technique reduced the blurring artifacts by 9%-41% at the lung-diaphragm interface.
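
    The artifact metric described above fits a CT profile with two error functions and reads off the apparent distance between them. A self-contained sketch under invented geometry (edge centers at ±3 mm, unit edge width, a 0-1000 HU plateau), recovering the distance from the half-rise points of a synthetic profile:

```python
import math

def edge(x, center, width):
    """Error-function edge modeling partial-volume blur at an interface."""
    return 0.5 * (1.0 + math.erf((x - center) / width))

def profile(x, c1, c2, width, lo=0.0, hi=1000.0):
    """CT profile rising at c1 and falling at c2 (a dense slab in air)."""
    return lo + (hi - lo) * (edge(x, c1, width) - edge(x, c2, width))

def half_crossing(xs, ys, level, rising=True):
    """Linearly interpolated position where the profile crosses `level`."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        hit = (y0 < level <= y1) if rising else (y0 > level >= y1)
        if hit:
            return x0 + (x1 - x0) * (level - y0) / (y1 - y0)
    return None

xs = [i * 0.1 for i in range(-100, 101)]            # positions in mm
ys = [profile(x, c1=-3.0, c2=3.0, width=1.0) for x in xs]
mid = 500.0                                         # average of the two regions
apparent_distance = (half_crossing(xs, ys, mid, rising=False)
                     - half_crossing(xs, ys, mid, rising=True))
```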

  5. Safety evaluation of disposable baby diapers using principles of quantitative risk assessment.

    PubMed

    Rai, Prashant; Lee, Byung-Mu; Liu, Tsung-Yun; Yuhui, Qin; Krause, Edburga; Marsman, Daniel S; Felter, Susan

    2009-01-01

    Baby diapers are complex products consisting of multiple layers of materials, most of which are not in direct contact with the skin. The safety profile of a diaper is determined by the biological properties of individual components and the extent to which the baby is exposed to each component during use. Rigorous evaluation of the toxicological profile and realistic exposure conditions of each material is important to ensure the overall safety of the diaper under normal and foreseeable use conditions. Quantitative risk assessment (QRA) principles may be applied to the safety assessment of diapers and similar products. Exposure to component materials is determined by (1) considering the conditions of product use, (2) the degree to which individual layers of the product are in contact with the skin during use, and (3) the extent to which some components may be extracted by urine and delivered to skin. This assessment of potential exposure is then combined with data from standard safety assessments of components to determine the margin of safety (MOS). This study examined the application of QRA to the safety evaluation of baby diapers, including risk assessments for some diaper ingredient chemicals for which establishment of acceptable and safe exposure levels were demonstrated.
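
    The margin-of-safety step of the QRA described above can be sketched as follows; the function names and all dose figures are invented placeholders, not data from the study:

```python
def skin_exposure(concentration_mg_per_g, product_mass_g,
                  transfer_fraction, body_weight_kg):
    """Daily dermal dose of a component partly delivered to skin."""
    delivered_mg = concentration_mg_per_g * product_mass_g * transfer_fraction
    return delivered_mg / body_weight_kg

def margin_of_safety(noael_mg_per_kg_day, exposure_mg_per_kg_day):
    """MOS = no-observed-adverse-effect level / estimated daily exposure."""
    return noael_mg_per_kg_day / exposure_mg_per_kg_day

# Invented placeholders: 0.5 mg/g of a component in a 40 g product, 1% of
# which reaches the skin of an 8 kg infant per day; NOAEL of 25 mg/kg/day.
exposure = skin_exposure(0.5, 40.0, 0.01, 8.0)
mos = margin_of_safety(25.0, exposure)
```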

  6. Evaluation of green coffee beans quality using near infrared spectroscopy: a quantitative approach.

    PubMed

    Santos, João Rodrigo; Sarraguça, Mafalda C; Rangel, António O S S; Lopes, João A

    2012-12-01

    Characterisation of coffee quality based on bean quality assessment is associated with the relative amount of defective beans among non-defective beans. It is therefore important to develop a methodology capable of identifying the presence of defective beans that enables a fast assessment of coffee grade and that can become an analytical tool to standardise coffee quality. In this work, a methodology for quality assessment of green coffee based on near infrared spectroscopy (NIRS) is proposed. NIRS is a green chemistry, low cost, fast response technique without the need for sample processing. The applicability of NIRS was evaluated for Arabica and Robusta varieties from different geographical locations. Partial least squares regression was used to relate the NIR spectrum to the mass fraction of defective and non-defective beans. Relative errors around 5% show that NIRS can be a valuable analytical tool to be used by coffee roasters, enabling a simple, fast, and quantitative evaluation of green coffee quality. PMID:22953929
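
    The ~5% relative-error figure quoted above corresponds to comparing predicted and reference mass fractions; a sketch with invented numbers standing in for the PLS predictions:

```python
def mean_relative_error(reference, predicted):
    """Mean absolute relative error, in percent."""
    return 100.0 * sum(abs(p - r) / r
                       for r, p in zip(reference, predicted)) / len(reference)

# Invented defective-bean mass fractions: reference vs. model-predicted.
reference = [0.10, 0.20, 0.40]
predicted = [0.105, 0.19, 0.42]
error_pct = mean_relative_error(reference, predicted)
```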

  7. Laboratory design and test procedures for quantitative evaluation of infrared sensors to assess thermal anomalies

    SciTech Connect

    Chang, Y.M.; Grot, R.A.; Wood, J.T.

    1985-06-01

    This report describes the laboratory apparatus and preliminary results of the quantitative evaluation of three high-resolution and two low-resolution infrared imaging systems. These systems, which are commonly used for building diagnostics, are tested under various background temperatures (from -20°C to 25°C) for their minimum resolvable temperature differences (MRTD) at spatial frequencies from 0.03 to 0.25 cycles per milliradian. The calibration curves of absolute and differential temperature measurements are obtained for three systems. The signal transfer function and line spread function at ambient temperature of another three systems are also measured. Comparisons of the measured dependence of the MRTD on background temperature with the predicted values given in ASHRAE Standard 101-83 are also included. The dependence of absolute temperature measurements on background temperature is presented, along with a comparison of measured data and data given by the manufacturer. Horizontal on-axis magnification factors of the geometric transfer function of two systems are also established to calibrate the horizontal axis of the measured line spread function and obtain the modulation transfer function. The variation in horizontal display uniformity of these two sensors is also observed. Included are detailed descriptions of the laboratory design, equipment setup, and evaluation procedures of each test. 10 refs., 38 figs., 12 tabs.

  8. Quantitative Evaluation of Peptide-Material Interactions by a Force Mapping Method: Guidelines for Surface Modification.

    PubMed

    Mochizuki, Masahito; Oguchi, Masahiro; Kim, Seong-Oh; Jackman, Joshua A; Ogawa, Tetsu; Lkhamsuren, Ganchimeg; Cho, Nam-Joon; Hayashi, Tomohiro

    2015-07-28

    Peptide coatings on material surfaces have demonstrated wide application across materials science and biotechnology, facilitating the development of nanobio interfaces through surface modification. A guiding motivation in the field is to engineer peptides with a high and selective binding affinity to target materials. Herein, we introduce a quantitative force mapping method in order to evaluate the binding affinity of peptides to various hydrophilic oxide materials by atomic force microscopy (AFM). Statistical analysis of adhesion forces and probabilities obtained on substrates with a materials contrast enabled us to simultaneously compare the peptide binding affinity to different materials. On the basis of the experimental results and corresponding theoretical analysis, we discuss the role of various interfacial forces in modulating the strength of peptide attachment to hydrophilic oxide solid supports as well as to gold. The results emphasize the precision and robustness of our approach to evaluating the adhesion strength of peptides to solid supports, thereby offering guidelines to improve the design and fabrication of peptide-coated materials.

  9. A quantitative and standardized robotic method for the evaluation of arm proprioception after stroke.

    PubMed

    Simo, Lucia S; Ghez, Claude; Botzer, Lior; Scheidt, Robert A

    2011-01-01

    Stroke often results in both motor and sensory deficits, which may interact in the manifested functional impairment. Proprioception is known to play important roles in the planning and control of limb posture and movement; however, the impact of proprioceptive deficits on motor function has been difficult to elucidate due in part to the qualitative nature of available clinical tests. We present a quantitative and standardized method for evaluating proprioception in tasks directly relevant to those used to assess motor function. Using a robotic manipulandum that exerted controlled displacements of the hand, stroke participants were evaluated, and compared with a control group, in their ability to detect such displacements in a 2-alternative, forced-choice paradigm. A psychometric function parameterized the decision process underlying the detection of the hand displacements. The shape of this function was determined by a signal detection threshold and by the variability of the response about this threshold. Our automatic procedure differentiates between participants with and without proprioceptive deficits and quantifies functional proprioceptive sensation on a magnitude scale that is meaningful for ongoing studies of degraded motor function in comparable horizontal movements.
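
    The psychometric function described above can be sketched as a cumulative-Gaussian model with a detection threshold (the 50% point) and a spread term, fitted here by a maximum-likelihood grid search; the detection counts, units, and grid ranges are invented for illustration, not from the study:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fit_psychometric(displacements, n_detected, n_trials):
    """Maximum-likelihood grid search over (threshold, sigma)."""
    best = None
    grid = [v / 100.0 for v in range(1, 300)]       # 0.01 .. 2.99
    for thr in grid:
        for sig in grid:
            ll = 0.0
            for x, k, n in zip(displacements, n_detected, n_trials):
                p = min(max(phi((x - thr) / sig), 1e-9), 1.0 - 1e-9)
                ll += k * math.log(p) + (n - k) * math.log(1.0 - p)
            if best is None or ll > best[0]:
                best = (ll, thr, sig)
    return best[1], best[2]

# Invented detection counts out of 50 trials at each displacement (cm):
xs = [0.25, 0.5, 1.0, 1.5, 2.0]
detected = [3, 8, 25, 42, 49]
trials = [50, 50, 50, 50, 50]
threshold, sigma = fit_psychometric(xs, detected, trials)
```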

  10. Quantitative evaluation of regular morning meetings aimed at improving work practices associated with effective interdisciplinary communication.

    PubMed

    Aston, Judy; Shi, Edward; Bullôt, Helen; Galway, Robyn; Crisp, Jackie

    2006-04-01

    In 2000, an interdisciplinary surgical morning meeting (SMM) was introduced into the infants' and toddlers' ward of a major paediatric hospital to help overcome a number of communication and work process problems among the health professionals providing care to children/families. The objective of this study was to evaluate the impact of the SMM on a range of work practices. A comparative design including pre- and post-intervention data collection was used. Data were collected on 100 patient records. Twenty children, from each of the five diagnostic-related groups most commonly admitted to the ward, were included. Demographic, medical review, documentation, critical incidents and complaint variables were obtained from three sources: the hospital clinical information system, the children's medical records and the hospital reporting systems for complaints and critical incidents. Children in the postintervention group were significantly more likely to be reviewed regularly by medical staff, to be reviewed in the morning, to have plans for discharge documented regularly throughout their admission and to have admission summary sheets completed at the time of discharge. The findings of the quantitative evaluation add some weight to the arguments for the purposely structured introduction of interdisciplinary teams into acute-care environments.

  11. Quantitative evaluation of reactive nitrogen emissions with urbanization: a case study in Beijing megacity, China.

    PubMed

    Xian, Chaofan; Ouyang, Zhiyun; Lu, Fei; Xiao, Yang; Li, Yanmin

    2016-09-01

    The rapid increase in anthropogenic nitrogen (N) load in urbanized environments threatens urban sustainability. In this study, we estimated the amount of reactive N (Nr) as an index of N pollution potential caused by human activities, using the megacity of Beijing as a case study. We investigated the temporal changes in Nr emissions in the environment from 2000 to 2012 using a multidisciplinary approach with quantitative evaluation. Nr emissions increased slightly over the study period, with an annual emission of 0.19 Tg N, mainly from fuel combustion. Nevertheless, the Nr output intensity resulting from inhabitants' livelihoods and material production weakened over the study period. The evaluation results showed that the environmental measures to remove Nr in Beijing were efficient in most years, suggesting significant progress in mitigating the growth of the Nr load in this urban environment. Further measures based on N offsetting are suggested that could help alleviate the environmental pressure resulting from anthropogenic Nr emissions and provide theoretical support for the sustainable development of megacities.

  12. A quantitative health assessment index for rapid evaluation of fish condition in the field

    SciTech Connect

    Adams, S.M.; Brown, A.M.; Goede, R.W.

    1993-01-01

    The health assessment index (HAI) is an extension and refinement of a previously published field necropsy system. The HAI is a quantitative index that allows statistical comparisons of fish health among data sets. Index variables are assigned numerical values based on the degree of severity or damage incurred by an organ or tissue from environmental stressors. This approach has been used to evaluate the general health status of fish populations in a wide range of reservoir types in the Tennessee River basin (North Carolina, Tennessee, Alabama, Kentucky), in Hartwell Reservoir (Georgia, South Carolina) that is contaminated by polychlorinated biphenyls, and in the Pigeon River (Tennessee, North Carolina) that receives effluents from a bleached kraft mill. The ability of the HAI to accurately characterize the health of fish in these systems was evaluated by comparing this index to other types of fish health measures (contaminant, bioindicator, and reproductive analysis) made at the same time as the HAI. In all cases, the HAI demonstrated the same pattern of fish health status between sites as did each of the other more sophisticated health assessment methods. The HAI has proven to be a simple and inexpensive means of rapidly assessing general fish health in field situations. 29 refs., 5 tabs.
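
    The HAI is essentially a sum of severity scores assigned per organ or tissue; a minimal sketch in which the organ variables and score values are illustrative stand-ins, not the published tables:

```python
# Illustrative severity tables; a real HAI uses the published score values.
HAI_SCORES = {
    "liver":  {"normal": 0, "focal discoloration": 30, "severe": 60},
    "gills":  {"normal": 0, "frayed": 30, "clubbed": 60},
    "spleen": {"normal": 0, "enlarged": 30},
}

def hai(observations):
    """Sum the severity score assigned to each organ's observed condition."""
    return sum(HAI_SCORES[organ][condition]
               for organ, condition in observations.items())

# Two fish from one site; per-fish indices can then be compared statistically
# between sites.
site_scores = [
    hai({"liver": "normal", "gills": "frayed", "spleen": "normal"}),
    hai({"liver": "focal discoloration", "gills": "normal",
         "spleen": "enlarged"}),
]
mean_hai = sum(site_scores) / len(site_scores)
```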

  13. Thrust Stand for Electric Propulsion Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Markusic, T. E.; Jones, J. E.; Cox, M. D.

    2004-01-01

    An electric propulsion thrust stand capable of supporting thrusters with total mass of up to 125 kg and 1 mN to 1 N thrust levels has been developed and tested. The mechanical design features a conventional hanging pendulum arm attached to a balance mechanism that transforms horizontal motion into amplified vertical motion, with accommodation for variable displacement sensitivity. Unlike conventional hanging pendulum thrust stands, the deflection is independent of the length of the pendulum arm, and no reference structure is required at the end of the pendulum. Displacement is measured using a non-contact, optical linear gap displacement transducer. Mechanical oscillations are attenuated using a passive, eddy current damper. An on-board microprocessor-based level control system, which includes a two axis accelerometer and two linear-displacement stepper motors, continuously maintains the level of the balance mechanism - counteracting mechanical zero drift during thruster testing. A thermal control system, which includes heat exchange panels, thermocouples, and a programmable recirculating water chiller, continuously adjusts to varying thermal loads to maintain the balance mechanism temperature, to counteract thermal drifts. An in-situ calibration rig allows for steady state calibration both prior to and during thruster testing. Thrust measurements were carried out on a well-characterized 1 kW Hall thruster; the thrust stand was shown to produce repeatable results consistent with previously published performance data.

  14. Using Business Performance To Evaluate Multimedia Training in Manufacturing.

    ERIC Educational Resources Information Center

    Lachenmaier, Lynn S.; Moor, William C.

    1997-01-01

    Discusses training evaluation and shows how an abbreviated form of Kirkpatrick's four-level evaluation model can be used effectively to evaluate multimedia-based manufacturing training. Topics include trends in manufacturing training, quantifying performance improvement, and statistical comparisons using the Mann-Whitney test and the Tukey Quick…
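
    The Mann-Whitney comparison mentioned above can be computed without any statistics library; a self-contained sketch of the U statistic (rank-sum form, midranks for ties) with made-up before/after performance scores:

```python
def mann_whitney_u(xs, ys):
    """U statistic for sample xs versus ys, using midranks for ties."""
    combined = sorted((v, i) for i, v in enumerate(xs + ys))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1                        # group of tied values
        midrank = (i + j + 1) / 2.0       # average of ranks i+1 .. j
        for k in range(i, j):
            ranks[combined[k][1]] = midrank
        i = j
    r1 = sum(ranks[:len(xs)])             # rank sum of the first sample
    return r1 - len(xs) * (len(xs) + 1) / 2.0

# Invented performance scores before and after a training intervention:
before = [55, 60, 62, 70]
after = [68, 72, 75, 80]
u = mann_whitney_u(before, after)
```

A small U (here, only one before/after pair out of 16 favors "before") is what a table lookup or normal approximation would then convert to a p-value.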

  15. At-Risk Youth Appearance and Job Performance Evaluation

    ERIC Educational Resources Information Center

    Freeburg, Beth Winfrey; Workman, Jane E.

    2008-01-01

    The goal of this study was to identify the relationship of at-risk youth workplace appearance to other job performance criteria. Employers (n = 30; each employing from 1 to 17 youths) evaluated 178 at-risk high school youths who completed a paid summer employment experience. Appearance evaluations were significantly correlated with evaluations of…

  16. The use of a tracking test battery in the quantitative evaluation of neurological function

    NASA Technical Reports Server (NTRS)

    Repa, B. S.

    1973-01-01

    A number of tracking tasks that have proven useful to control engineers and psychologists measuring skilled performance have been evaluated for clinical use. Normal subjects as well as patients with previous diagnoses of Parkinson's disease, multiple sclerosis, and cerebral palsy were used in the evaluation. The tests that were studied included step tracking, random tracking, and critical tracking. The results of the present experiments encourage the continued use of tracking tasks as assessment procedures in a clinical environment. They have proven to be reliable, valid, and sensitive measures of neurological function.

  17. Evaluation of residual antibacterial potency in antibiotic production wastewater using a real-time quantitative method.

    PubMed

    Zhang, Hong; Zhang, Yu; Yang, Min; Liu, Miaomiao

    2015-11-01

    While antibiotic pollution has attracted considerable attention due to its potential in promoting the dissemination of antibiotic resistance genes in the environment, the antibiotic activity of their related substances has been neglected, which may underestimate the environmental impacts of antibiotic wastewater discharge. In this study, a real-time quantitative approach was established to evaluate the residual antibacterial potency of antibiotics and related substances in antibiotic production wastewater (APW) by comparing the growth of a standard bacterial strain (Staphylococcus aureus) in tested water samples with a standard reference substance (e.g. oxytetracycline). Antibiotic equivalent quantity (EQ) was used to express antibacterial potency, which made it possible to assess the contribution of each compound to the antibiotic activity in APW. The real-time quantitative method showed better repeatability (Relative Standard Deviation, RSD 1.08%) compared with the conventional fixed growth time method (RSD 5.62-11.29%). Its quantification limits ranged from 0.20 to 24.00 μg L(-1), depending on the antibiotic. We applied the developed method to analyze the residual potency of water samples from four APW treatment systems, and confirmed a significant contribution from antibiotic transformation products to potent antibacterial activity. Specifically, neospiramycin, a major transformation product of spiramycin, was found to contribute 13.15-22.89% of residual potency in spiramycin production wastewater. In addition, some unknown related substances with antimicrobial activity were indicated in the effluent. This developed approach will be effective for the management of antibacterial potency discharge from antibiotic wastewater and other waste streams.
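
    The equivalent-quantity idea, expressing a sample's measured growth response as an equivalent concentration of a reference antibiotic via a standard curve, can be sketched as follows; all concentrations and responses are invented, and the log-linear curve shape is an assumption for illustration:

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def equivalent_quantity(standard_conc, standard_response, sample_response):
    """Invert a log-linear standard curve to get an equivalent concentration."""
    slope, intercept = fit_line([math.log10(c) for c in standard_conc],
                                standard_response)
    return 10.0 ** ((sample_response - intercept) / slope)

# Invented growth-inhibition responses (%) to a reference-antibiotic dilution
# series (ug/L); a sample inhibiting growth by 40% maps to ~31.6 ug/L EQ.
eq = equivalent_quantity([1.0, 10.0, 100.0, 1000.0],
                         [10.0, 30.0, 50.0, 70.0], 40.0)
```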

  18. Quantitative Evaluation of Stomatal Cytoskeletal Patterns during the Activation of Immune Signaling in Arabidopsis thaliana

    PubMed Central

    Shimono, Masaki; Higaki, Takumi; Kaku, Hanae; Shibuya, Naoto; Hasezawa, Seiichiro

    2016-01-01

    Historically viewed as primarily functioning in the regulation of gas and water vapor exchange, it is now evident that stomata serve an important role in plant immunity. Indeed, in addition to classically defined functions related to cell architecture and movement, the actin cytoskeleton has emerged as a central component of the plant immune system, underpinning not only processes related to cell shape and movement, but also receptor activation and signaling. Using high resolution quantitative imaging techniques, the temporal and spatial changes in the actin microfilament array during diurnal cycling of stomatal guard cells has revealed a highly orchestrated transition from random arrays to ordered bundled filaments. While recent studies have demonstrated that plant stomata close in response to pathogen infection, an evaluation of stimulus-induced changes in actin cytoskeletal dynamics during immune activation in the guard cell, as well as the relationship of these changes to the function of the actin cytoskeleton and stomatal aperture, remains undefined. In the current study, we employed quantitative cell imaging and hierarchical clustering analyses to define the response of the guard cell actin cytoskeleton to pathogen infection and the elicitation of immune signaling. Using this approach, we demonstrate that stomatal-localized actin filaments respond rapidly, and specifically, to both bacterial phytopathogens and purified pathogen elicitors. Notably, we demonstrate that higher order temporal and spatial changes in the filament array show distinct patterns of organization during immune activation, and that changes in the naïve diurnal oscillations of guard cell actin filaments are perturbed by pathogens, and that these changes parallel pathogen-induced stomatal gating. The data presented herein demonstrate the application of a highly tractable and quantifiable method to assign transitions in actin filament organization to the activation of immune signaling in

  19. Quantitative Evaluation of the Environmental Impact Quotient (EIQ) for Comparing Herbicides.

    PubMed

    Kniss, Andrew R; Coburn, Carl W

    2015-01-01

    Various indicators of pesticide environmental risk have been proposed, and one of the most widely known and used is the environmental impact quotient (EIQ). The EIQ has been criticized by others in the past, but it continues to be used regularly in the weed science literature. The EIQ is typically considered an improvement over simply comparing the amount of herbicides applied by weight. Herbicides are treated differently compared to other pesticide groups when calculating the EIQ, and therefore, it is important to understand how different risk factors affect the EIQ for herbicides. The purpose of this work was to evaluate the suitability of the EIQ as an environmental indicator for herbicides. Simulation analysis was conducted to quantify relative sensitivity of the EIQ to changes in risk factors, and actual herbicide EIQ values were used to quantify the impact of herbicide application rate on the EIQ Field Use Rating. Herbicide use rate was highly correlated with the EIQ Field Use Rating (Spearman's rho >0.96, P-value <0.001) for two herbicide datasets. Two important risk factors for herbicides, leaching and surface runoff potential, are included in the EIQ calculation but explain less than 1% of total variation in the EIQ. Plant surface half-life was the risk factor with the greatest relative influence on herbicide EIQ, explaining 26 to 28% of the total variation in EIQ for actual and simulated EIQ values, respectively. For herbicides, the plant surface half-life risk factor is assigned values without any supporting quantitative data, and can result in EIQ estimates that are contrary to quantitative risk estimates for some herbicides. In its current form, the EIQ is a poor measure of herbicide environmental impact.
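
    The EIQ Field Use Rating discussed above is conventionally computed as EIQ x fraction of active ingredient x application rate, which is why it tracks use rate so closely; a sketch with invented herbicide values:

```python
def eiq_field_use_rating(eiq, fraction_active_ingredient, rate_lb_per_acre):
    """Field Use Rating = EIQ x fraction a.i. x application rate."""
    return eiq * fraction_active_ingredient * rate_lb_per_acre

# Invented values: a low-rate herbicide with a higher EIQ still scores far
# below a high-rate herbicide with a lower EIQ, mirroring the rate dominance
# reported above.
low_rate = eiq_field_use_rating(20.0, 0.5, 0.05)
high_rate = eiq_field_use_rating(15.0, 0.5, 2.0)
```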

  20. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    PubMed Central

    SAVAS, Selcuk; KAVRIK, Fevzi; KUCUKYILMAZ, Ebru

    2016-01-01

    ABSTRACT Objective The aim of this study was to evaluate the efficacy of CPP-ACP containing fluoride varnish for remineralizing white spot lesions (WSLs) with four different quantitative methods. Material and Methods Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in an acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF pen). The data were statistically analyzed (α=0.05). Results While the LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05), the difference between the 1- and 4-week measurements was not significant (p>0.05). With regard to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). After the 1- and 4-week treatment periods, the calcium (Ca) and phosphate (P) concentrations and Ca/P ratio were higher compared with those of the demineralized surfaces (p<0.05). Conclusion CPP-ACP containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use. PMID:27383699
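
    Percent surface-microhardness recovery is a common way to summarize remineralization from SMH data like that above; the Vickers hardness numbers here are invented for illustration, not taken from the study:

```python
def percent_smh_recovery(baseline, demineralized, remineralized):
    """Percent recovery of surface microhardness after treatment."""
    return 100.0 * (remineralized - demineralized) / (baseline - demineralized)

# Invented VHN values for sound, demineralized, and varnish-treated enamel:
recovery = percent_smh_recovery(350.0, 150.0, 290.0)
```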