Science.gov

Sample records for quantitative performance evaluation

  1. Study on the performance evaluation of quantitative precipitation estimation and quantitative precipitation forecast

    NASA Astrophysics Data System (ADS)

    Yang, H.; Chang, K.; Suk, M.; Cha, J.; Choi, Y.

    2011-12-01

    Rainfall estimation and short-term (several hours) quantitative precipitation forecasting based on meteorological radar data are intensely studied topics. The Korean Peninsula has a horizontally narrow land area and complex, mountainous topography, so its rainfall systems change frequently. Quantitative precipitation estimation (QPE) and quantitative precipitation forecasts (QPF) provide crucial information for severe weather and water management. We have conducted a performance evaluation of the QPE/QPF of the Korea Meteorological Administration (KMA), which is the first step toward optimizing the QPE/QPF system in South Korea. The real-time adjusted RAR (Radar-AWS-Rainrate) system agrees better with the observed rain rate than the fixed Z-R relation does, and additional bias correction of RAR yields slightly better results. A correlation coefficient of R² = 0.84 is obtained between the daily accumulated observed and RAR-estimated rainfall. RAR will be available for hydrological applications such as water-budget studies. The VSRF (Very Short Range Forecast) performs better than MAPLE (McGill Algorithm for Precipitation nowcasting by Lagrangian Extrapolation) within the first 40 minutes, whereas MAPLE performs better after 40 minutes; for hourly forecasts, MAPLE outperforms the VSRF. QPE and QPF are thought to be most meaningful for nowcasting (1-2 hours), where model forecasts are not yet useful, while forecasts longer than 3 hours from a meteorological model are especially meaningful for applications such as water management.
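    The R² = 0.84 quoted above is straightforward to reproduce from paired daily accumulations. A minimal sketch with hypothetical gauge/radar values (not the KMA data):

    ```python
    import numpy as np

    # Hypothetical daily accumulated rainfall (mm): gauge observations vs. radar estimate
    gauge = np.array([12.0, 3.5, 0.0, 27.1, 8.4, 41.2])
    radar = np.array([10.8, 4.1, 0.3, 24.9, 9.0, 38.7])

    r = np.corrcoef(gauge, radar)[0, 1]   # Pearson correlation coefficient
    print(f"R^2 = {r**2:.2f}")            # squared correlation, as quoted in the abstract
    ```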

  2. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195
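    Repeatability and linearity of the kind evaluated here are typically computed per phosphopeptide from replicate intensities across a series of loading amounts. A minimal sketch with hypothetical numbers (an illustration, not the authors' actual workflow):

    ```python
    import numpy as np

    def cv_percent(replicate_intensities):
        """Repeatability as coefficient of variation (%) over replicate measurements."""
        x = np.asarray(replicate_intensities, dtype=float)
        return 100.0 * x.std(ddof=1) / x.mean()

    def linearity_r2(amounts, responses):
        """R^2 of a straight-line fit of mean response vs. amount loaded."""
        x = np.asarray(amounts, dtype=float)
        y = np.asarray(responses, dtype=float)
        slope, intercept = np.polyfit(x, y, 1)
        resid = y - (slope * x + intercept)
        return 1.0 - np.sum(resid**2) / np.sum((y - y.mean())**2)

    # Hypothetical: one phosphopeptide, 3 replicates at each of 4 loading amounts (pmol)
    amounts = [0.5, 1.0, 2.0, 4.0]
    reps = [[1.1e6, 1.3e6, 1.2e6], [2.3e6, 2.1e6, 2.4e6],
            [4.4e6, 4.0e6, 4.6e6], [7.9e6, 8.6e6, 8.2e6]]
    print([round(cv_percent(r), 1) for r in reps])                   # per-level CVs
    print(round(linearity_r2(amounts, [np.mean(r) for r in reps]), 3))
    ```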

  3. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between practical experience, theoretical knowledge, and skills-lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants in the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator, and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and an oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of what an integrated curriculum for reasonable and cost-effective assessment of the key competencies of an interventional neuroradiologist could look like. In addition to traditional assessment of theoretical knowledge, practical skills are measured using endovascular simulators, yielding objective, quantitative, and constructive data for evaluating participants' current performance status as well as the evolution of their technical competency over time. PMID:26848840

  4. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused on the quantitative performance of this technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, UHPSFC quantitative performance was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of robust methods. In this context, when Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total-error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method gives accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349

  5. Performance evaluation of ExiStation HBV diagnostic system for hepatitis B virus DNA quantitation.

    PubMed

    Cha, Young Joo; Yoo, Soo Jin; Sohn, Yong-Hak; Kim, Hyun Soo

    2013-11-01

    The performance of a recently developed real-time PCR system, the ExiStation HBV diagnostic system, for quantitation of hepatitis B virus (HBV) in human blood was evaluated. The detection limit, reproducibility, cross-reactivity, and interference were evaluated as measures of analytical performance. For the comparison study, 100 HBV-positive blood samples and 100 HBV-negative samples from Korean Blood Bank Serum were used, and the results of the ExiStation HBV system showed good correlation with those obtained using the Cobas TaqMan (r² = 0.9931) and Abbott real-time PCR systems (r² = 0.9894). The lower limit of detection was measured as 9.55 IU/mL using WHO standards, and the dynamic range was linear from 6.68 to 6.68 × 10⁹ IU/mL using cloned plasmids. The within-run coefficient of variation (CV) was 9.4%, 2.1%, and 1.1%, and the total CV was 11.8%, 3.6%, and 1.7%, at concentrations of 1.92, 3.88, and 6.84 log₁₀ IU/mL, respectively. No cross-reactivity or interference was detected. The ExiStation HBV diagnostic system showed satisfactory analytical sensitivity, excellent reproducibility, no cross-reactivity, no interference, and high agreement with the Cobas TaqMan and Abbott real-time PCR systems, and is therefore a useful tool for the detection and monitoring of HBV infection. PMID:23892129

  6. On the quantitative evaluation of edge detection schemes and their comparison with human performance. [image processing of satellite photographs

    NASA Technical Reports Server (NTRS)

    Fram, J. R.; Deutsch, E. S.

    1975-01-01

    A technique for the quantitative evaluation of edge detection schemes is presented. It is used to assess the performance of three such schemes using a specially-generated set of images containing noise. The ability of human subjects to distinguish the edges in the presence of noise is also measured and compared with that of the edge detection schemes. The edge detection schemes are used on a high-resolution satellite photograph with varying degrees of noise added in order to relate the quantitative comparison to real-life imagery.
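    The abstract does not name the evaluation measure used; one classic score for quantitatively comparing a detected edge map against ground truth is Pratt's figure of merit, sketched below on binary edge maps (an illustration of the general idea, not necessarily Fram and Deutsch's technique):

    ```python
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def pratt_fom(detected, ideal, alpha=1.0 / 9.0):
        """Pratt's figure of merit: 1.0 means perfect agreement with the ideal edge map.

        detected, ideal: 2D boolean arrays marking edge pixels.
        alpha: penalty for displacement of detected edges from ideal ones.
        """
        # Distance from every pixel to the nearest ideal edge pixel
        dist = distance_transform_edt(~ideal)
        d = dist[detected]                       # displacement of each detected edge pixel
        n = max(detected.sum(), ideal.sum())     # normalize by the larger edge count
        return np.sum(1.0 / (1.0 + alpha * d**2)) / n
    ```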

  7. Performance evaluation of quantitative adiabatic (13)C NMR pulse sequences for site-specific isotopic measurements.

    PubMed

    Thibaudeau, Christophe; Remaud, Gérald; Silvestre, Virginie; Akoka, Serge

    2010-07-01

    ²H/¹H and ¹³C/¹²C site-specific isotope ratios determined by NMR spectroscopy may be used to discriminate pharmaceutically active ingredients based on the synthetic process used in production. Extending the Site-specific Natural Isotope Fractionation NMR (SNIF-NMR) method to ¹³C is highly beneficial for complex organic molecules when measurements of ²H/¹H ratios lead to poorly defined molecular fingerprints. The current NMR methodology for determining ¹³C/¹²C site-specific isotope ratios suffers from poor sensitivity and long experimental times. In this work, several NMR pulse sequences based on polarization transfer were evaluated and optimized to measure precise quantitative ¹³C NMR spectra within a short time. Adiabatic 180° ¹H and ¹³C pulses were incorporated into distortionless enhancement by polarization transfer (DEPT) and refocused insensitive nuclei enhanced by polarization transfer (INEPT) to minimize the influence of 180° pulse imperfections and of off-resonance effects on the precision of the measured ¹³C peak areas. The adiabatic DEPT sequence was applied to draw up a precise site-specific ¹³C isotope profile of ibuprofen. A modified heteronuclear cross-polarization (HCP) experiment featuring ¹H and ¹³C spin-locks with adiabatic 180° pulses is also introduced. This sequence enables efficient magnetization transfer across a wide ¹³C frequency range, although not wide enough for application to quantitative ¹³C isotopic analysis. PMID:20527737

  8. (Un)awareness of unilateral spatial neglect: a quantitative evaluation of performance in visuo-spatial tasks.

    PubMed

    Ronchi, Roberta; Bolognini, Nadia; Gallucci, Marcello; Chiapella, Laura; Algeri, Lorella; Spada, Maria Simonetta; Vallar, Giuseppe

    2014-12-01

    Right-brain-damaged patients with unilateral spatial neglect are usually unaware (anosognosic) of their spatial deficits. However, the scientific literature lacks a systematic and quantitative evaluation of this kind of unawareness, despite the negative impact of anosognosia on rehabilitation programs. This study investigated anosognosia for neglect-related impairments on different clinical tasks by means of a quantitative assessment. Patients were tested in two different conditions (before and after execution of each task) in order to evaluate changes in the level of awareness of neglect-related behaviours triggered by task execution. Twenty-nine right-brain-damaged patients (17 with left spatial neglect) and 27 neurologically unimpaired controls entered the study. Anosognosia for spatial deficits was not pervasive, with different tasks evoking different degrees of awareness of neglect symptoms. Indeed, patients showed largely preserved awareness of their performance in complex visuo-motor spatial and reading tasks; conversely, they were impaired in evaluating their spatial difficulties in line bisection and drawing from memory, over-estimating their performance. The selectivity of the patients' unawareness of specific manifestations of spatial neglect is further supported by their preserved awareness of performance on a linguistic task, and by the absence of anosognosia for hemiplegia. This evidence indicates that discrete processes are involved in the aware monitoring of cognitive and motor performance, and that these can be selectively compromised by brain damage. Awareness of spatial difficulties is supported by a number of distinct components and influenced by the specific skills required to perform a given task. PMID:25481474

  9. Optimizing experimental procedures for quantitative evaluation of crop plant performance in high throughput phenotyping systems

    PubMed Central

    Junker, Astrid; Muraya, Moses M.; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Klukas, Christian; Melchinger, Albrecht E.; Meyer, Rhonda C.; Riewe, David; Altmann, Thomas

    2015-01-01

    Detailed and standardized protocols for plant cultivation in environmentally controlled conditions are an essential prerequisite for conducting reproducible experiments with precisely defined treatments. Setting up appropriate and well-defined experimental procedures is thus crucial for the generation of solid evidence and indispensable for successful plant research. Non-invasive and high-throughput (HT) phenotyping technologies offer the opportunity to monitor and quantify the performance dynamics of several hundred plants at a time. Compared to small-scale plant cultivation, HT systems place much higher demands, both conceptually and logistically, on experimental design, on the actual plant cultivation conditions, and on the image analysis and statistical methods for data evaluation. Furthermore, cultivation conditions need to be designed to elicit plant performance characteristics corresponding to those under natural conditions. This manuscript describes critical steps in the optimization of procedures for HT plant phenotyping systems. Starting with the model plant Arabidopsis, HT-compatible methods were tested and optimized with regard to growth substrate, soil coverage, watering regime, and experimental design (considering environmental inhomogeneities) in automated plant cultivation and imaging systems. As revealed by metabolite profiling, plant movement did not affect the plants' physiological status. Based on these results, procedures for maize HT cultivation and monitoring were established. Variation of maize vegetative growth in the HT phenotyping system matched well with that observed in the field. The presented results outline important issues to be considered in the design of HT phenotyping experiments for model and crop plants, thereby providing guidelines for the setup of HT experimental procedures, which are required for the generation of reliable and reproducible data on phenotypic variation for a broad range of applications. PMID

  10. Quantitative performance evaluation of a blurring restoration algorithm based on principal component analysis

    NASA Astrophysics Data System (ADS)

    Greco, Mario; Huebner, Claudia; Marchi, Gabriele

    2008-10-01

    In the field of blind image deconvolution, a promising new algorithm based on Principal Component Analysis (PCA) has recently been proposed in the literature. Its main advantages are the following: computational complexity is generally lower than that of other deconvolution techniques (e.g., the widely used Iterative Blind Deconvolution, IBD, method); it is robust to white noise; and only the support of the blurring point spread function is required to perform single-observation deconvolution (i.e., when a single degraded observation of a scene is available), while multiple-observation deconvolution (i.e., when multiple degraded observations of a scene are available) is completely unsupervised. The effectiveness of the PCA-based restoration algorithm has so far been confirmed only by visual inspection and, to the best of our knowledge, no objective image quality assessment has been performed. In this paper a generalization of the original algorithm is proposed; the previously unexplored assessment issue is then considered, and the results achieved are compared with those of the IBD method, which is used as a benchmark.
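    For the objective quality assessment the authors add, a common scalar for deconvolution results is the improvement in signal-to-noise ratio (ISNR) relative to the degraded observation; a minimal sketch of that generic metric (not the paper's specific protocol):

    ```python
    import numpy as np

    def isnr_db(original, degraded, restored):
        """Improvement in SNR (dB) of the restored image over the degraded one.

        Requires a known ground-truth image, so it is typically computed on
        synthetically blurred test images.
        """
        err_degraded = np.sum((np.asarray(original) - degraded) ** 2)
        err_restored = np.sum((np.asarray(original) - restored) ** 2)
        return 10.0 * np.log10(err_degraded / err_restored)
    ```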

  11. Quantitative evaluation of the performance of an industrial benchtop enclosing hood.

    PubMed

    He, Xinjian Kevin; Guffey, Steven E

    2013-01-01

    Plain benchtop enclosing hoods are assumed to be highly effective in protecting workers from airborne contaminants, but there is little published research to support or rebut that assumption. The purpose of this research was to investigate the performance of a 36 in. wide, 30 in. high, and 40 in. deep benchtop enclosing hood. The study consisted of two parts: (1) investigating the effects of hood face velocity (five levels: 111, 140, 170, 200, and 229 ft/min) and wind tunnel cross-draft velocity (five levels: 14, 26, 36, 46, and 57 ft/min) on a plain benchtop enclosing hood, and (2) studying the effects of specific interventions (no intervention, collar flange, bottom flange, cowling, and sash) added onto the same enclosing hood. A tracer gas method was used to study the hood's performance inside a 9 ft high, 12 ft wide, and 40 ft long wind tunnel. Freon-134a concentrations were measured at the mouth and nose of an anthropometrically scaled, heated, breathing manikin holding a source between its hands while standing at the enclosing hood's face. Roughly 3 L/min of pure Freon-134a mixed with 9 L/min of helium was released from the source during all tests. Results showed that hood face velocity, wind tunnel cross-draft velocity, and interventions had statistically significant effects (p < 0.05) on the concentrations measured at the manikin's breathing zone. Lower exposures were associated with higher face velocities and higher cross-draft velocities. The highest exposures occurred when the face velocity was at its lowest test value (111 ft/min) and the cross-draft velocity was at its lowest test value (14 ft/min). For interventions at the hood face, the results showed that flanges and the cowling failed to consistently reduce exposures and often exacerbated them. However, the customized sash reduced exposures to less than the detection limit of 0.1 ppm, so a similar sash should be considered when feasible. The hood face velocity should be at least 150 ft/min.

  12. Three-year randomised clinical trial to evaluate the clinical performance, quantitative and qualitative wear patterns of hybrid composite restorations

    PubMed Central

    Palaniappan, Senthamaraiselvi; Elsen, Liesbeth; Lijnen, Inge; Peumans, Marleen; Van Meerbeek, Bart

    2009-01-01

    The aim of the study was to compare the clinical performance and the quantitative and qualitative wear patterns of conventional hybrid (Tetric Ceram), micro-filled hybrid (Gradia Direct Posterior), and nano-hybrid (Tetric EvoCeram, TEC) posterior composite restorations in a 3-year randomised clinical trial. Sixteen Tetric Ceram, 17 TEC, and 16 Gradia Direct Posterior restorations were placed in human molars and evaluated at baseline and at 6, 12, 24, and 36 months of clinical service according to US Public Health Service criteria. The gypsum replicas at each recall were used for 3D laser scanning to quantify wear, and the epoxy resin replicas were observed under a scanning electron microscope to study the qualitative wear patterns. After 3 years of clinical service, the three hybrid restorative materials performed well clinically in posterior cavities. Within the observation period, the nano-hybrid and micro-hybrid restorations retained their polish and surface gloss better than the conventional hybrid counterpart. The three hybrid composites showed enamel-like vertical wear and cavity-size-dependent volume loss. Qualitatively, while the micro-filled and nano-hybrid composite restorations exhibited signs of fatigue similar to the conventional hybrid restorations at heavy occlusal contact areas, their light occlusal contact areas showed less surface pitting after 3 years of clinical service. PMID:19669176

  13. Community Health Workers to Improve Antenatal Care and PMTCT Uptake in Dar es Salaam, Tanzania: A Quantitative Performance Evaluation

    PubMed Central

    Sando, David; Magesa, Lucy; Machumi, Lameck; Mungure, Esther; Mwanyika Sando, Mary; Geldsetzer, Pascal; Foster, Dawn; Kajoka, Deborah; Naburi, Helga; Ekström, Anna M.; Spiegelman, Donna; Li, Nan; Chalamilla, Guerino; Fawzi, Wafaie; Bärnighausen, Till

    2014-01-01

    Background: Home visits by community health workers (CHW) could be effective in identifying pregnant women in the community before they have presented to the health system. CHW could thus improve the uptake of antenatal care (ANC), HIV testing, and prevention of mother-to-child transmission (PMTCT) services. Methods: Over a 16-month period, we carried out a quantitative evaluation of the performance of CHW in reaching women early in pregnancy and before they have attended ANC in Dar es Salaam, Tanzania. Results: As part of the intervention, 213 CHW conducted more than 45,000 home visits to about 43,000 pregnant women. More than 75% of the pregnant women identified through home visits had not yet attended ANC at the time of the first contact with a CHW and about 40% of those who had not yet attended ANC were in the first trimester of pregnancy. Over time, the number of pregnant women the CHW identified each month increased, as did the proportion of women who had not yet attended ANC. The median gestational age of pregnant women contacted for the first time by a CHW decreased steadily and significantly over time (from 21/22 to 16 weeks, P-value for test of trend <0.0001). Conclusions: A large-scale CHW intervention was effective in identifying pregnant women in their homes early in pregnancy and before they had attended ANC. The intervention thus fulfills some of the conditions that are necessary for CHW to improve timely ANC uptake and early HIV testing and PMTCT enrollment in pregnancy. PMID:25436818

  14. Evaluation and performance of desorption electrospray ionization using a triple quadrupole mass spectrometer for quantitation of pharmaceuticals in plasma.

    PubMed

    Kennedy, Joseph H; Wiseman, Justin M

    2010-02-01

    The present work describes the methodology and investigates the performance of desorption electrospray ionization (DESI) combined with a triple quadrupole mass spectrometer for the quantitation of small drug molecules in human plasma. Amoxepine, atenolol, carbamazepine, clozapine, prazosin, propranolol, and verapamil were selected as target analytes, while terfenadine was selected as the internal standard common to each of the analytes. Protein precipitation of human plasma using acetonitrile was utilized for all samples. Limits of detection were determined for all analytes in plasma and shown to be in the range 0.2-40 ng/mL. Quantitative analysis of amoxepine, prazosin, and verapamil was performed over the range 20-7400 ng/mL and shown to be linear in all cases, with R² > 0.99. In most cases, the precision (relative standard deviation) and accuracy (relative error) of each method were less than or equal to 20%. The performance of the combined techniques made it possible to analyze each sample in 15 s, illustrating DESI tandem mass spectrometry (MS/MS) as a powerful tool for the quantitation of analytes in deproteinized human plasma. PMID:20049888
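    Quantitation against a common internal standard, as described here, typically works by fitting the analyte/IS peak-area ratio against concentration. A hedged sketch with made-up numbers (terfenadine as IS follows the abstract; the values and the back-calculation step do not):

    ```python
    import numpy as np

    # Hypothetical calibration data for one analyte (e.g., verapamil) spiked in plasma
    conc_ng_ml = np.array([20, 100, 500, 1500, 3700, 7400], dtype=float)
    analyte_area = np.array([1.1e3, 5.6e3, 2.7e4, 8.3e4, 2.0e5, 4.1e5])
    is_area = np.array([9.8e4, 1.0e5, 9.9e4, 1.02e5, 9.7e4, 1.0e5])  # internal standard

    ratio = analyte_area / is_area                 # response ratio normalizes run-to-run drift
    slope, intercept = np.polyfit(conc_ng_ml, ratio, 1)
    fit = slope * conc_ng_ml + intercept
    r2 = 1.0 - np.sum((ratio - fit) ** 2) / np.sum((ratio - ratio.mean()) ** 2)
    print(f"R^2 = {r2:.4f}")

    # Back-calculate an unknown from its measured response ratio
    unknown_ratio = 0.55
    print(f"estimated concentration: {(unknown_ratio - intercept) / slope:.0f} ng/mL")
    ```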

  15. Quantitative evaluation of manufacturability and performance for ILT produced mask shapes using a single-objective function

    NASA Astrophysics Data System (ADS)

    Choi, Heon; Wang, Wei-long; Kallingal, Chidam

    2015-03-01

    The continuous scaling of semiconductor devices is quickly outpacing the resolution improvements of lithographic exposure tools and processes. This one-sided progression has pushed optical lithography to its limits, resulting in the use of well-known techniques such as Sub-Resolution Assist Features (SRAFs), Source-Mask Optimization (SMO), and double patterning, to name a few. These techniques, belonging to a larger category of Resolution Enhancement Techniques (RET), have extended the resolution capabilities of optical lithography at the cost of increasing mask complexity, and therefore cost. One such technique, called Inverse Lithography Technique (ILT), has attracted much attention for its ability to produce the best possible theoretical mask design. ILT treats the mask design process as an inverse problem, where the known transformation from mask to wafer is carried out backwards using a rigorous mathematical approach. One practical problem in the application of ILT is that the resulting contour-like mask shapes must be "Manhattanized" (composed of straight edges and 90-deg corners) in order to produce a manufacturable mask. This conversion process inherently degrades the mask quality, as it is a departure from the "optimal mask" represented by the continuously curved shapes produced by ILT. However, simpler masks composed of longer straight edges reduce the mask cost, as they lower the shot count and save mask writing time during mask fabrication, resulting in a conflict between manufacturability and performance for ILT-produced masks [1,2]. In this study, various commonly used metrics are combined into an objective function to produce a single number that quantitatively measures a particular ILT solution's ability to balance mask manufacturability and RET performance. Several metrics that relate to mask manufacturing costs (i.e., mask vertex count, ILT computation runtime) are appropriately weighted against metrics that represent RET capability (i.e., process
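    The single-number objective described can be sketched as a weighted sum of normalized metrics, with cost metrics (vertex count, runtime) weighted against RET-capability metrics. The metric names and weights below are illustrative assumptions, not the authors' actual choices:

    ```python
    def ilt_objective(metrics, weights):
        """Combine normalized mask metrics into one score (higher = better trade-off).

        metrics: dict of metric name -> value normalized to [0, 1], where 1 is best
                 (cost-type metrics such as vertex count are inverted before this step).
        weights: dict of metric name -> relative importance; weights sum to 1.
        """
        return sum(weights[name] * metrics[name] for name in weights)

    # Hypothetical, illustrative values for one ILT solution
    metrics = {
        "process_window": 0.82,   # RET capability
        "edge_placement": 0.77,   # RET capability
        "vertex_count":   0.60,   # mask cost, already inverted/normalized
        "runtime":        0.71,   # computation cost, already inverted/normalized
    }
    weights = {"process_window": 0.35, "edge_placement": 0.25,
               "vertex_count": 0.25, "runtime": 0.15}
    print(f"objective = {ilt_objective(metrics, weights):.3f}")
    ```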

  16. Blind Analysis of Fortified Pesticide Residues in Carrot Extracts using GC-MS to Evaluate Qualitative and Quantitative Performance

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Unlike quantitative analysis, the quality of the qualitative results in the analysis of pesticide residues in food are generally ignored in practice. Instead, chemists tend to rely on advanced mass spectrometric techniques and general subjective guidelines or fixed acceptability criteria when makin...

  17. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single quantitative metric, called the disease evaluation factor (DEF), and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, onto which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance between a given subject's eigencoordinates and the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots and Kolmogorov-Smirnov and χ² tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
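    The pipeline described (project into a reference eigenspace, separate the groups with a linear hyperplane, classify) can be approximated with standard tools. A minimal sketch using scikit-learn on synthetic features, not the authors' MRI appearance model:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Synthetic stand-in for MRI appearance features: 150 subjects x 200 features
    X = rng.normal(size=(150, 200))
    y = np.array([0] * 75 + [1] * 75)           # 0 = CTRL, 1 = probable AD
    X[y == 1, :10] += 0.8                       # inject a group difference

    # Project into an eigenspace, then separate the groups with a linear hyperplane
    X_eig = PCA(n_components=20).fit_transform(X)
    acc = cross_val_score(LinearDiscriminantAnalysis(), X_eig, y, cv=5).mean()
    print(f"cross-validated accuracy: {acc:.2f}")
    ```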

  18. Quantitative evaluation of dermatological antiseptics.

    PubMed

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. PMID:26456933
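    The EN 1276 pass criterion used above reduces to simple arithmetic: a ≥5 log₁₀ drop in viable count within 5 minutes of exposure. A tiny check with hypothetical counts:

    ```python
    import math

    def log10_reduction(cfu_before, cfu_after):
        """Log10 reduction in viable count (CFU/mL) after antiseptic exposure."""
        return math.log10(cfu_before / cfu_after)

    # Hypothetical S. aureus counts before and after 5 min exposure
    print(log10_reduction(3.0e8, 1.2e3) >= 5.0)   # True -> meets the EN 1276 standard
    ```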

  19. Performance evaluation of Laser Induced Breakdown Spectroscopy (LIBS) for quantitative analysis of rare earth elements in phosphate glasses

    NASA Astrophysics Data System (ADS)

    Devangad, Praveen; Unnikrishnan, V. K.; Nayak, Rajesh; Tamboli, M. M.; Muhammed Shameem, K. M.; Santhosh, C.; Kumar, G. A.; Sardar, D. K.

    2016-02-01

    In the current study, we determined the elemental compositions of synthesized rare-earth-doped phosphate glasses using a laboratory Laser-Induced Breakdown Spectroscopy (LIBS) system. LIBS spectra of these rare-earth (samarium (Sm), thulium (Tm), and ytterbium (Yb)) doped glass samples of known composition were recorded using a highly sensitive detector. Major atomic emission lines of Sm, Tm, and Yb found in the LIBS spectra are reported. Taking an atomic emission line of phosphorus as an internal standard, calibration curves were constructed for all rare earth concentrations. Very good linear regression coefficient (R²) values were obtained with this technique. The analytical predictive skill of LIBS was studied further using the leave-one-out method. The low values of the reported correlation uncertainty between the measured LIBS concentration ratio and the certified concentration ratio confirm that the LIBS technique has great potential for quantitative analysis of rare earth elements in a glass matrix.
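    The leave-one-out check mentioned refits the calibration curve with one standard held out and predicts that standard from the remaining points. A hedged sketch with hypothetical intensity ratios (phosphorus as internal standard follows the abstract; the numbers do not):

    ```python
    import numpy as np

    def loo_predictions(certified_ratio, intensity_ratio):
        """Leave-one-out predicted concentrations from a linear calibration curve."""
        x = np.asarray(intensity_ratio, dtype=float)   # rare earth line / P line
        y = np.asarray(certified_ratio, dtype=float)   # certified concentration ratio
        preds = []
        for i in range(len(x)):
            keep = np.arange(len(x)) != i
            slope, intercept = np.polyfit(x[keep], y[keep], 1)
            preds.append(slope * x[i] + intercept)
        return np.array(preds)

    # Hypothetical calibration standards (e.g., Sm/P concentration ratios)
    certified = np.array([0.01, 0.02, 0.05, 0.10, 0.20])
    measured = np.array([0.11, 0.23, 0.54, 1.08, 2.20])
    pred = loo_predictions(certified, measured)
    print(np.round(pred, 3))   # compare against 'certified' to judge predictive skill
    ```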

  20. Imaging Performance of Quantitative Transmission Ultrasound

    PubMed Central

    Lenox, Mark W.; Wiskin, James; Lewis, Matthew A.; Darrouzet, Stephen; Borup, David; Hsieh, Scott

    2015-01-01

    Quantitative Transmission Ultrasound (QTUS) is a tomographic transmission ultrasound modality that is capable of generating 3D speed-of-sound maps of objects in the field of view. It performs this measurement by propagating a plane wave through the medium from a transmitter on one side of a water tank to a high resolution receiver on the opposite side. This information is then used via inverse scattering to compute a speed map. In addition, the presence of reflection transducers allows the creation of a high resolution, spatially compounded reflection map that is natively coregistered to the speed map. A prototype QTUS system was evaluated for measurement and geometric accuracy as well as for the ability to correctly determine speed of sound. PMID:26604918

  1. An approach for relating the results of quantitative nondestructive evaluation to intrinsic properties of high-performance materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1990-01-01

    One of the most difficult problems the manufacturing community has faced during recent years has been to accurately assess the physical state of anisotropic high-performance materials by nondestructive means. In order to advance the design of ultrasonic nondestructive testing systems, a more fundamental understanding of how ultrasonic waves travel and interact within the anisotropic material is needed. The relationship between the ultrasonic and engineering parameters needs to be explored to understand their mutual dependence. One common denominator is provided by the elastic constants. The preparation of specific graphite/epoxy samples to be used in the experimental investigation of the anisotropic properties (through the measurement of the elastic stiffness constants) is discussed. Accurate measurements of these constants will depend upon knowledge of refraction effects as well as the direction of group velocity propagation. The continuing effort for the development of improved visualization techniques for physical parameters is discussed. Group velocity images are presented and discussed. In order to fully understand the relationship between the ultrasonic and the common engineering parameters, the physical interpretation of the linear elastic coefficients (the quantities that relate applied stresses to resulting strains) are discussed. This discussion builds a more intuitional understanding of how the ultrasonic parameters are related to the traditional engineering parameters.

  2. Performance of phalangeal quantitative ultrasound parameters in the evaluation of reduced bone mineral density assessed by DXA in patients with 21 hydroxylase deficiency.

    PubMed

    Gonçalves, Ezequiel M; Sewaybricker, Leticia E; Baptista, Fatima; Silva, Analiza M; Carvalho, Wellington R G; Santos, Allan O; de Mello, Maricilda P; Lemos-Marini, Sofia H V; Guerra, Gil

    2014-07-01

    The purpose of this study was to verify the performance of quantitative ultrasound (QUS) parameters of the proximal phalanges in the evaluation of reduced bone mineral density (BMD) in patients with congenital adrenal hyperplasia due to 21-hydroxylase deficiency (21 OHD). Seventy patients with 21 OHD (41 females and 29 males), aged 6-27 y, were assessed. The QUS measurements, amplitude-dependent speed of sound (AD-SoS), bone transmission time (BTT), and ultrasound bone profile index (UBPI), were obtained using the BMD Sonic device (IGEA, Carpi, Italy) on the last four proximal phalanges of the non-dominant hand. BMD was determined by dual-energy X-ray absorptiometry (DXA) for the total body and lumbar spine (LS). Total body and LS BMD were positively correlated with UBPI, BTT, and AD-SoS (correlation coefficients ranged from 0.59-0.72, p < 0.001). In contrast, when comparing patients with normal and low (Z-score < -2) BMD, no differences were found in the QUS parameters. Furthermore, UBPI, BTT, and AD-SoS measurements were not effective for diagnosing patients with reduced BMD by receiver operating characteristic curve parameters. Although AD-SoS, BTT, and UBPI showed significant correlations with the data obtained by DXA, they were not effective for diagnosing reduced bone mass in patients with 21 OHD. PMID:24726797

  3. Quantitative roadmap of holographic media performance

    NASA Astrophysics Data System (ADS)

    Kowalski, Benjamin A.; McLeod, Robert R.

    2015-09-01

    For holographic photopolymer media, the "formula limit" concept enables facile calculation of the fraction of writing chemistry that is usefully patterned, and the fraction that is wasted. This provides a quantitative context to compare the performance of a diverse range of media formulations from the literature, using only information already reported in the original works. Finally, this analysis is extended to estimate the scope of achievable future performance improvements.

  4. Influence of sulphur-fumigation on the quality of white ginseng: a quantitative evaluation of major ginsenosides by high performance liquid chromatography.

    PubMed

    Jin, Xin; Zhu, Ling-Ying; Shen, Hong; Xu, Jun; Li, Song-Lin; Jia, Xiao-Bin; Cai, Hao; Cai, Bao-Chang; Yan, Ru

    2012-12-01

    White ginseng is reported to be sulphur-fumigated during post-harvest handling. In the present study, the influence of sulphur-fumigation on the quality of white ginseng and its decoction was quantitatively evaluated through simultaneous quantification of 14 major ginsenosides by a validated high-performance liquid chromatography method. A Poroshell 120 EC-C18 (100 mm × 3.0 mm, 2.7 μm) column was chosen for the separation of the major ginsenosides, which were eluted with a water-acetonitrile gradient as mobile phase. The analytes were monitored by UV at 203 nm. The method was validated in terms of linearity, sensitivity, precision, accuracy, and stability. The sulphur-fumigated and non-fumigated white ginseng samples, as well as their respective decoctions, were comparatively analysed with the newly validated method. It was found that the contents of the nine ginsenosides detected in raw materials decreased by about 3-85% after sulphur-fumigation, and their total content decreased by almost 54%. On the other hand, the contents of the 10 ginsenosides detected in decoctions of sulphur-fumigated white ginseng were decreased by about 33-83%, and their total content was decreased by up to 64%, compared with that of non-fumigated white ginseng. In addition, ginsenosides Rh₂ and Rg₅ could be detected in the decoctions of sulphur-fumigated white ginseng but not in those of non-fumigated white ginseng. It is suggested that sulphur-fumigation can significantly influence not only the contents of the original ginsenosides but also the decocting-induced chemical transformation of ginsenosides in white ginseng. PMID:22953836

  5. Quantitative Evaluation of Industrial Components

    NASA Astrophysics Data System (ADS)

    Juptner, Werner

    1987-09-01

    Holographic interferometry has been considered a powerful tool for many applications since it was invented by Stetson and Powell /1/. Although to date only a few industrial applications are known, mainly in Holographic Non-Destructive Testing (HNDT), this assessment remains valid: there are tasks which this technique can solve better and more economically than conventional methods. For example, it is still nearly impossible to calculate the deformation behaviour of complex pressure-vessel parts: the very complex geometry would lead to very long calculation times using, e.g., Finite Element Methods (FEM). Even when a calculation is performed, it must be verified by experimental stress analysis, which for complex objects requires up to 1000 strain gauges. That means several months of preparation time and costs of approximately $100,000 for one result. In this application, holographic interferometry can do the job for less than half that cost.

  6. Combination of quantitative analysis and chemometric analysis for the quality evaluation of three different frankincenses by ultra high performance liquid chromatography and quadrupole time of flight mass spectrometry.

    PubMed

    Zhang, Chao; Sun, Lei; Tian, Run-tao; Jin, Hong-yu; Ma, Shuang-Cheng; Gu, Bing-ren

    2015-10-01

    Frankincense has gained increasing attention in the pharmaceutical industry because of its pharmacologically active components such as boswellic acids. However, the identity and overall quality evaluation of the three different frankincense species in different pharmacopeias and the literature have seldom been reported. In this paper, quantitative analysis and chemometric evaluation were established and applied for the quality control of frankincense, with quantitative and chemometric analysis conducted under the same analytical conditions. In total, 55 samples from four habitats (three species) of frankincense were collected, and six boswellic acids were chosen for quantitative analysis. Chemometric analyses such as similarity analysis, hierarchical cluster analysis, and principal component analysis were used to identify frankincense of the three species and to reveal the correlation between its components and species. In addition, 12 chromatographic peaks were tentatively identified using reference substances and quadrupole time-of-flight mass spectrometry. The results indicated that the total boswellic acid profiles of the three species of frankincense are similar and that their fingerprints can be used to differentiate between them. PMID:26228790

  7. A Program to Evaluate Quantitative Analysis Unknowns

    ERIC Educational Resources Information Center

    Potter, Larry; Brown, Bruce

    1978-01-01

    Reports on a computer batch program that will not only perform routine grading using several grading algorithms, but will also calculate various statistical measures by which the class performance can be evaluated and cumulative data collected. (Author/CP)

  8. Apprentice Performance Evaluation.

    ERIC Educational Resources Information Center

    Gast, Clyde W.

    The Granite City (Illinois) Steel apprentices are under a performance evaluation from entry to graduation. Federally approved, the program is guided by joint apprenticeship committees whose monthly meetings include performance evaluation from three information sources: journeymen, supervisors, and instructors. Journeymen's evaluations are made…

  9. EPICS performance evaluation

    SciTech Connect

    Botlo, M.; Jagielski, M.; Romero, A.

    1993-09-01

    The authors report on the software architecture, some CPU and memory issues, and the performance of the Experimental Physics and Industrial Control System (EPICS). Specifically, they subject each EPICS software layer to a series of tests and extract quantitative results that should be useful to system architects planning to use EPICS for control applications.

  10. Evaluating Teacher Performance Fairly.

    ERIC Educational Resources Information Center

    Sportsman, Michel Allain

    1986-01-01

    Describes foundation and development of a performance-based teacher evaluation method developed in Missouri which makes mastery learning the basis for outcomes of instruction. Eight discrete parts of the teaching act characterizing successful teaching, four criteria important in performance-based evaluation development, and four definable phases…

  11. Influence of processing procedure on the quality of Radix Scrophulariae: a quantitative evaluation of the main compounds obtained by accelerated solvent extraction and high-performance liquid chromatography.

    PubMed

    Cao, Gang; Wu, Xin; Li, Qinglin; Cai, Hao; Cai, Baochang; Zhu, Xuemei

    2015-02-01

    An improved method combining high-performance liquid chromatography with diode-array detection and accelerated solvent extraction was used to simultaneously determine six compounds in crude and processed Radix Scrophulariae samples. Accelerated solvent extraction parameters such as extraction solvent, temperature, number of cycles, and the analysis procedure were systematically optimized. The results indicated that, compared with crude Radix Scrophulariae samples, the processed samples had lower contents of harpagide and harpagoside but higher contents of catalpol, acteoside, angoroside C, and cinnamic acid. The established method was sufficiently rapid and reliable for the global quality evaluation of crude and processed herbal medicines. PMID:25431110

  12. Evaluation of Performance Characteristics of the Aptima HIV-1 Quant Dx Assay for Detection and Quantitation of Human Immunodeficiency Virus Type 1 in Plasma and Cervicovaginal Lavage Samples.

    PubMed

    Sam, Soya S; Kurpewski, Jaclynn R; Cu-Uvin, Susan; Caliendo, Angela M

    2016-04-01

    Quantification of HIV-1 RNA has become the standard of care in the clinical management of HIV-1-infected individuals. The objective of this study was to evaluate the performance characteristics and relative workflow of the Aptima HIV-1 Quant Dx assay in comparison with the Abbott RealTime HIV-1 assay using plasma and cervicovaginal lavage (CVL) specimens. Assay performance was evaluated by using an AcroMetrix HIV-1 panel, AcroMetrix positive controls, Qnostics and SeraCare HIV-1 evaluation panels, 208 clinical plasma samples, and 205 matched CVL specimens on the Panther and m2000 platforms. The Aptima assay demonstrated good linearity over the quantification range tested (2 to 5 log₁₀ copies/mL), and there was a strong linear correlation between the assays (R² = 0.99), with a comparable coefficient of variation of <5.5%. For the plasma samples, Deming regression analyses and Bland-Altman plots showed excellent agreement between the assays, with an interassay concordance of 91.35% (kappa = 0.75; 95% confidence interval [CI], 0.65 to 0.85), and on average, the viral loads determined by the Aptima assay were 0.21 log₁₀ copies/mL higher than those determined by the RealTime assay. The assays differed in their sensitivity for quantifying HIV-1 RNA loads in CVL samples, with the Aptima and RealTime assays detecting 30% and 20%, respectively. Aptima had fewer invalid results, and on average, the viral loads in CVL samples quantified by the Aptima assay were 0.072 log₁₀ copies/mL higher than those of the RealTime assay. Our results demonstrate that the Aptima assay is sensitive and accurate in quantifying viral loads in both plasma and CVL specimens and that the fully automated Panther system has all the necessary features suitable for clinical laboratories demanding high-throughput sample processing. PMID:26842702
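    Bland-Altman agreement between two viral-load assays, as used here, is computed on the paired log₁₀ values. A minimal sketch with hypothetical results (not the study data):

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Mean bias and 95% limits of agreement between paired measurements."""
        a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
        diff = a - b
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # Hypothetical paired viral loads (log10 copies/mL): Aptima vs. RealTime
    aptima   = [2.3, 3.1, 4.0, 4.8, 5.6, 3.7, 2.9]
    realtime = [2.1, 3.0, 3.7, 4.7, 5.3, 3.5, 2.8]
    bias, loa = bland_altman(aptima, realtime)
    print(f"bias = {bias:+.2f} log10 copies/mL, 95% LoA = ({loa[0]:.2f}, {loa[1]:.2f})")
    ```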

  13. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  14. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  15. Performance Evaluation Process.

    ERIC Educational Resources Information Center

    1998

    This document contains four papers from a symposium on the performance evaluation process and human resource development (HRD). "Assessing the Effectiveness of OJT (On the Job Training): A Case Study Approach" (Julie Furst-Bowe, Debra Gates) is a case study of the effectiveness of OJT in one of a high-tech manufacturing company's product lines.…

  16. Assessment beyond Performance: Phenomenography in Educational Evaluation

    ERIC Educational Resources Information Center

    Micari, Marina; Light, Gregory; Calkins, Susanna; Streitwieser, Bernhard

    2007-01-01

    Increasing calls for accountability in education have promoted improvements in quantitative evaluation approaches that measure student performance; however, this has often been to the detriment of qualitative approaches, reducing the richness of educational evaluation as an enterprise. In this article the authors assert that it is not merely…

  17. A Quantitative Evaluation of Dissolved Oxygen Instrumentation

    NASA Technical Reports Server (NTRS)

    Pijanowski, Barbara S.

    1971-01-01

    The implications of the presence of dissolved oxygen in water are discussed in terms of its deleterious or beneficial effects, depending on the functional consequences to those affected, e.g., the industrialist, the oceanographer, and the ecologist. The paper is devoted primarily to an examination of the performance of five commercially available dissolved oxygen meters. The design of each is briefly reviewed and ease or difficulty of use in the field described. Specifically, the evaluation program treated a number of parameters and user considerations including an initial check and trial calibration for each instrument and a discussion of the measurement methodology employed. Detailed test results are given relating to the effects of primary power variation, water-flow sensitivity, response time, relative accuracy of dissolved-oxygen readout, temperature accuracy (for those instruments which included this feature), error and repeatability, stability, pressure and other environmental effects, and test results obtained in the field. Overall instrument performance is summarized comparatively by chart.

  18. Performance of calibration standards for antigen quantitation with flow cytometry.

    PubMed

    Lenkei, R; Gratama, J W; Rothe, G; Schmitz, G; D'hautcourt, J L; Arekrans, A; Mandy, F; Marti, G

    1998-10-01

    Within the framework of the activities initiated by the Task Force for Antigen Quantitation of the European Working Group on Clinical Cell Analysis (EWGCCA), an experiment was conducted to evaluate microbead standards used for quantitative flow cytometry (QFCM). A unified window of analysis (UWA) was established on three different instruments (EPICS XL [Coulter Corporation, Miami, FL], FACScan and FACS Calibur [Becton Dickinson, San Jose, CA]) with QC3 microbeads (FCSC, PR). Using this defined fluorescence intensity scale, the performance of several monoclonal antibodies directed to CD3, CD4, and CD8 (conjugated and unconjugated), from three manufacturers (BDIS, Coulter [Immunotech], and DAKO), was tested. In addition, the QIFI system (DAKO), QuantiBRITE (BDIS), and a method of relative fluorescence intensity (RFI, method of Giorgi) were compared. mAbs reacting with three more antigens, CD16, CD19, and CD38, were tested on the FACScan instrument. Quantitation was carried out using a single batch of cryopreserved peripheral blood leukocytes, and all tests were performed as single-color analyses. Significant correlations were observed between the antibody-binding capacity (ABC) values of the same CD antigen measured with the various calibrators and with antibodies differing with respect to vendor, labeling, and possibly epitope recognition. Despite the significant correlations, the ABC values of most monoclonal antibodies differed by 20-40% when determined with different fluorochrome conjugates and different calibrators. The results of this study indicate that, at the present stage of QFCM, consistent ABC values may be attained between laboratories provided that a specific calibration system is used, including specific calibrators, reagents, and protocols. PMID:9773879

  19. Quantitative comparison of the performance of SAR segmentation algorithms.

    PubMed

    Caves, R; Quegan, S; White, R

    1998-01-01

    Methods to evaluate the performance of segmentation algorithms for synthetic aperture radar (SAR) images are developed, based on known properties of coherent speckle and a scene model in which areas of constant backscatter coefficient are separated by abrupt edges. Local and global measures of segmentation homogeneity are derived and applied to the outputs of two segmentation algorithms developed for SAR data, one based on iterative edge detection and segment growing, the other based on global maximum a posteriori (MAP) estimation using simulated annealing. The quantitative statistically based measures appear consistent with visual impressions of the relative quality of the segmentations produced by the two algorithms. On simulated data meeting algorithm assumptions, both algorithms performed well but MAP methods appeared visually and measurably better. On real data, MAP estimation was markedly the better method and retained performance comparable to that on simulated data, while the performance of the other algorithm deteriorated sharply. Improvements in the performance measures will require a more realistic scene model and techniques to recognize oversegmentation. PMID:18276219
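    A segment-homogeneity test of the kind derived in the paper can exploit a known speckle statistic: for fully developed L-look intensity speckle over constant backscatter, the coefficient of variation should be close to 1/√L. The sketch below flags segments whose empirical CV exceeds that expectation (an illustration of the idea, not the authors' exact measure):

    ```python
    import numpy as np

    def segment_homogeneity(intensity, labels, looks):
        """Per-segment coefficient of variation vs. the speckle-only expectation.

        A CV ratio much larger than 1 suggests the segment mixes different
        backscatter levels (undersegmentation).
        """
        expected_cv = 1.0 / np.sqrt(looks)
        report = {}
        for seg in np.unique(labels):
            pix = intensity[labels == seg]
            cv = pix.std(ddof=1) / pix.mean()
            report[int(seg)] = (cv, cv / expected_cv)   # ratio ~1 indicates homogeneity
        return report

    # Simulated 4-look speckle over two segments of constant backscatter
    rng = np.random.default_rng(1)
    looks = 4
    img = rng.gamma(shape=looks, scale=np.array([[1.0], [3.0]]) / looks, size=(2, 5000))
    labels = np.repeat([[0], [1]], 5000, axis=1)
    print(segment_homogeneity(img, labels, looks))
    ```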

  20. Quantitative assessment of hyperspectral sensor detection performance

    NASA Astrophysics Data System (ADS)

    Sommese, Anthony M.; Shetler, Bruce V.; Billingsley, Frank P.

    1997-10-01

    The ability to differentiate man-made materials from natural materials depends upon exploiting recognizable differences in their respective spectral responses. Of particular interest is whether a given material's signature derived from a laboratory or field measurement can be used as a training vector for discrimination or identification in a given setting. Through the application of a matched filter, we can quantify the performance of training vectors that have been transferred in this way, as well as identify which spectral regions are most diagnostic in a given situation.
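    The matched filter applied here has a standard closed form: for target spectrum t, background mean μ, and background covariance Σ, the filter output for pixel x is proportional to (t−μ)ᵀΣ⁻¹(x−μ). A self-contained sketch on synthetic spectra (not the paper's data or exact normalization):

    ```python
    import numpy as np

    def matched_filter_scores(cube, target):
        """Spectral matched filter scores for an (n_pixels, n_bands) array.

        Background statistics are estimated from the scene itself; scores near 1
        indicate pixels resembling the target spectrum, near 0 the background.
        """
        mu = cube.mean(axis=0)
        cov = np.cov(cube, rowvar=False)
        cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))  # regularize
        d = target - mu
        w = cov_inv @ d / (d @ cov_inv @ d)        # normalized filter vector
        return (cube - mu) @ w

    # Synthetic 50-band scene: 1000 background pixels plus 5 target-bearing pixels
    rng = np.random.default_rng(2)
    bands = 50
    background = rng.normal(0.3, 0.02, size=(1000, bands))
    target = 0.3 + 0.1 * np.sin(np.linspace(0, 3, bands))          # lab-style signature
    scene = np.vstack([background, target + rng.normal(0, 0.02, (5, bands))])
    scores = matched_filter_scores(scene, target)
    print(scores[-5:].round(2))   # target pixels score near 1
    ```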

  1. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis in a systematic manner. It provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparsity of recorded clinical observations, the high dimensionality of movement, and the high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σᵢ wᵢφᵢ(xᵢ), and estimating each attribute normalization function φᵢ(·) by integrating distributions of idealized movement and deviated movement. The weights wᵢ are derived from a therapist's pair-wise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results: the evaluation results are highly correlated with the therapist's observations.
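    The composite score y = Σᵢ wᵢφᵢ(xᵢ) can be sketched directly. The logistic normalizers below are simple stand-ins for the paper's distribution-derived φᵢ, and the attribute names and weights are placeholders for the RankSVM-derived ones:

    ```python
    import numpy as np

    def logistic_normalizer(ideal, deviated):
        """Map a raw kinematic attribute to [0, 1]: ~1 near ideal, ~0 near deviated.

        Stand-in for the paper's phi_i estimated from distributions of idealized
        and deviated movement.
        """
        mid = 0.5 * (ideal + deviated)
        scale = abs(ideal - deviated) / 6.0
        sign = 1.0 if ideal > deviated else -1.0
        return lambda x: 1.0 / (1.0 + np.exp(-sign * (x - mid) / scale))

    # Hypothetical attributes for one reaching movement
    normalizers = {
        "peak_speed_m_s":   logistic_normalizer(ideal=1.0, deviated=0.3),
        "trajectory_error": logistic_normalizer(ideal=0.01, deviated=0.15),
        "smoothness_jerk":  logistic_normalizer(ideal=-4.5, deviated=-2.0),
    }
    weights = {"peak_speed_m_s": 0.3, "trajectory_error": 0.4, "smoothness_jerk": 0.3}
    observed = {"peak_speed_m_s": 0.8, "trajectory_error": 0.04, "smoothness_jerk": -3.8}

    y = sum(weights[k] * normalizers[k](observed[k]) for k in weights)
    print(f"movement quality y = {y:.2f}")   # continuous score in [0, 1]
    ```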

  2. A study on the quantitative evaluation of skin barrier function

    NASA Astrophysics Data System (ADS)

    Maruyama, Tomomi; Kabetani, Yasuhiro; Kido, Michiko; Yamada, Kenji; Oikaze, Hirotoshi; Takechi, Yohei; Furuta, Tomotaka; Ishii, Shoichi; Katayama, Haruna; Jeong, Hieyong; Ohno, Yuko

    2015-03-01

    We propose a quantitative method for evaluating skin barrier function using an Optical Coherence Microscopy (OCM) system based on the coherency of near-infrared light. Many skin problems, such as itching and irritation, are recognized to be caused by impairment of the skin barrier function, which prevents damage from various external stimuli and loss of water. The common strategy for evaluating skin barrier function is to observe the skin surface and ask patients about their skin condition; these are subjective judgements, influenced by differences in the examiner's experience. Microscopy has been used to observe the inner structure of the skin in detail, but in vitro measurements of this kind require tissue sampling. What is needed instead is an objective, quantitative evaluation method that is non-invasive and non-destructive and allows changes to be examined over time; in vivo measurements are therefore crucial for evaluating skin barrier function. In this study, we evaluate changes in the structure of the stratum corneum, which is important for skin barrier function, by comparing water-penetrated skin with normal skin using the OCM system. The proposed method obtains in vivo 3D images of the inner structure of body tissue non-invasively and non-destructively. We formulate the changes in skin ultrastructure after water penetration. Finally, we evaluate the performance limits of the OCM system in this work in order to discuss how to improve it.

  3. Functional Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Greenisen, Michael C.; Hayes, Judith C.; Siconolfi, Steven F.; Moore, Alan D.

    1999-01-01

    The Extended Duration Orbiter Medical Project (EDOMP) was established to address specific issues associated with optimizing the ability of crews to complete mission tasks deemed essential to entry, landing, and egress for spaceflights lasting up to 16 days. The main objectives of this functional performance evaluation were to investigate the physiological effects of long-duration spaceflight on skeletal muscle strength and endurance, as well as aerobic capacity and orthostatic function. Long-duration exposure to a microgravity environment may produce physiological alterations that affect crew ability to complete critical tasks such as extravehicular activity (EVA), intravehicular activity (IVA), and nominal or emergency egress. Ultimately, this information will be used to develop and verify countermeasures. The answers to three specific functional performance questions were sought: (1) What are the performance decrements resulting from missions of varying durations? (2) What are the physical requirements for successful entry, landing, and emergency egress from the Shuttle? and (3) What combination of preflight fitness training and in-flight countermeasures will minimize in-flight muscle performance decrements? To answer these questions, the Exercise Countermeasures Project looked at physiological changes associated with muscle degradation as well as orthostatic intolerance. A means of ensuring motor coordination was necessary to maintain proficiency in piloting skills, EVA, and IVA tasks. In addition, it was necessary to maintain musculoskeletal strength and function to meet the rigors associated with moderate altitude bailout and with nominal or emergency egress from the landed Orbiter. Eight investigations, referred to as Detailed Supplementary Objectives (DSOs) 475, 476, 477, 606, 608, 617, 618, and 624, were conducted to study muscle degradation and the effects of exercise on exercise capacity and orthostatic function (Table 3-1). This chapter is divided into

  4. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high-resolution, capsule, and disposable endoscopes promise significant improvements in effectiveness, as well as in patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and with the commercial software, and the results were used to obtain corrected images. Corrected images based on the ML approach and on the software were then compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality assurance, and assurance of performance during clinical use of endoscopic technologies.
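
    A minimal sketch of what a local-magnification analysis of a square-grid image can look like, assuming a simple radial distortion model and taking ML as the imaged spacing between adjacent grid points normalized by the spacing at the image center; the paper's exact ML definition and grid-detection pipeline are not reproduced here.

    ```python
    import numpy as np

    # Synthetic grid-intersection coordinates: simulate barrel distortion
    # r' = r * (1 + k * r^2) on a 9x9 square grid (in practice these points
    # would come from corner detection on a gastroscope image).
    k = -0.15
    u = np.linspace(-1, 1, 9)
    gx, gy = np.meshgrid(u, u)
    r2 = gx**2 + gy**2
    ix, iy = gx * (1 + k * r2), gy * (1 + k * r2)

    # Local magnification along rows: imaged spacing between adjacent points,
    # normalized by the spacing at the image center.
    dx = np.hypot(np.diff(ix, axis=1), np.diff(iy, axis=1))
    ml = dx / dx[4, 3:5].mean()   # center row, central pair of intervals
    print(ml.round(3))            # ML < 1 toward the edges: barrel distortion
    ```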

  5. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadow maps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the quite unexplored problem of estimating the wavefront's gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the first time that a complete mathematically grounded treatment of this optical phenomenon is presented, complemented by an image-processing algorithm which allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams. PMID:27505659
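
    The Fourier-based integration of partial derivatives mentioned above can be sketched with a standard least-squares (Frankot-Chellappa-style) integrator. This is a generic implementation under periodic boundary assumptions, not necessarily the authors' exact algorithm.

    ```python
    import numpy as np

    def integrate_gradients(gx, gy):
        """Least-squares surface from gradient fields via FFT
        (Frankot-Chellappa style; assumes periodic boundaries)."""
        n, m = gx.shape
        fx = np.fft.fftfreq(m) * 2 * np.pi   # rad per sample
        fy = np.fft.fftfreq(n) * 2 * np.pi
        wx, wy = np.meshgrid(fx, fy)
        d = wx**2 + wy**2
        d[0, 0] = 1.0                        # avoid division by zero at DC
        z_hat = (-1j * wx * np.fft.fft2(gx) - 1j * wy * np.fft.fft2(gy)) / d
        z_hat[0, 0] = 0.0                    # mean height is unrecoverable
        return np.real(np.fft.ifft2(z_hat))

    # Quick self-check on a synthetic periodic surface.
    y, x = np.mgrid[0:128, 0:128] / 128.0
    z = np.sin(2 * np.pi * x) * np.sin(2 * np.pi * y)
    gy_, gx_ = np.gradient(z)                # numerical partial derivatives
    z_rec = integrate_gradients(gx_, gy_)
    print(np.abs((z_rec - z_rec.mean()) - (z - z.mean())).max())
    ```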

  6. Evaluating Economic Performance and Policies.

    ERIC Educational Resources Information Center

    Thurow, Lester C.

    1987-01-01

    Argues that a social welfare approach to evaluating economic performance is inappropriate at the high school level. Provides several historical case studies which could be used to augment instruction aimed at the evaluation of economic performance and policies. (JDH)

  7. Investigations of the relationship between use of in vitro cell culture-quantitative PCR and a mouse-based bioassay for evaluating critical factors affecting the disinfection performance of pulsed UV light for treating Cryptosporidium parvum oocysts in saline.

    PubMed

    Garvey, Mary; Farrell, Hugh; Cormican, Martin; Rowan, Neil

    2010-03-01

    Cryptosporidium parvum is an enteric coccidian parasite that is recognised as a frequent cause of water-borne disease in humans. We report for the first time on the use of the in vitro HCT-8 cell culture-quantitative PCR (qPCR) assay and the in vivo SCID-mouse bioassay for evaluating critical factors that reduce or eliminate the infectivity of C. parvum after irradiating oocysts in saline solution under varying operational conditions with pulsed UV light. Infections post UV treatment were detected by immunofluorescence (IF) microscopy and by quantitative PCR in cell culture, and by IF staining of faeces and by hematoxylin and eosin staining of intestinal villi in mice. There was good agreement between the cell culture-qPCR assay and the mouse assay for determining the reduction or elimination of C. parvum infectivity as a consequence of varying UV operating conditions. Reduction in infectivity depended on the intensity of the lamp discharge energy applied, the number of pulses, and the population size of oocysts (P ≤ 0.05). A conventional radiometer was unable to measure fluence or UV dose in saline samples due to the ultra-short, non-continuous nature of the high-energy light pulses. Incorporation of humic acid at a concentration above that found in surface water (i.e., ≤10 ppm) did not significantly affect PUV disinfection capability irrespective of the parameters tested (P ≤ 0.05). These observations show that use of this HCT-8 cell culture assay is equivalent to using the 'gold standard' mouse-based infectivity assay for determining the disinfection performance of PUV for treating C. parvum in saline solution. PMID:20096310

  8. Evaluating Student Teacher Performance

    ERIC Educational Resources Information Center

    Castolo, Carmencita L.; Dizon, Rosemariebeth R.

    2007-01-01

    Evaluation is a continuous process interwoven into the entire student teaching experience. Preplanning the evaluation process is therefore very important. Without continuous planned evaluation from the co-operating teacher, the value of student teaching is greatly reduced. One of the main purposes of the student teaching experience is to allow…

  9. A Method for Quantitatively Evaluating a University Library Collection

    ERIC Educational Resources Information Center

    Golden, Barbara

    1974-01-01

    The acquisitions department of the University of Nebraska at Omaha library conducted a quantitative evaluation of the library's book collection in relation to the course offerings of the university. (Author/LS)

  10. Quantitative Evaluation of Management Courses: Part 1

    ERIC Educational Resources Information Center

    Cunningham, Cyril

    1973-01-01

    The author describes how he developed a method of evaluating and comparing management courses of different types and lengths by applying an ordinal system of relative values using a process of transmutation. (MS)

  11. CONFOCAL MICROSCOPY SYSTEM PERFORMANCE: QA TESTS, QUANTITATION AND SPECTROSCOPY

    EPA Science Inventory

    Confocal Microscopy System Performance: QA tests, Quantitation and Spectroscopy.

    Robert M. Zucker 1 and Jeremy M. Lerner 2,
    1Reproductive Toxicology Division, National Health and Environmental Effects Research Laboratory, Office of Research Development, U.S. Environmen...

  12. QUANTITATIVE EVALUATION OF FIRE SEPARATION AND BARRIERS

    SciTech Connect

    Coutts, D

    2007-04-17

    Fire barriers and physical separation are key components in managing the fire risk in nuclear facilities. The expected performance of these features has often been predicted using rules of thumb or expert judgment. These approaches often lack the convincing technical bases that exist when addressing other nuclear facility accident events. This paper presents science-based approaches to demonstrate the effectiveness of fire separation methods.

  13. Evaluation (not validation) of quantitative models.

    PubMed Central

    Oreskes, N

    1998-01-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid

  14. Evaluation (not validation) of quantitative models.

    PubMed

    Oreskes, N

    1998-12-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid

  15. Integrating Qualitative and Quantitative Evaluation Methods in Substance Abuse Research.

    ERIC Educational Resources Information Center

    Dennis, Michael L.; And Others

    1994-01-01

    Some specific opportunities and techniques are described for combining and integrating qualitative and quantitative methods from the design stage of a substance abuse program evaluation through implementation and reporting. The multiple problems and requirements of such an evaluation make integrated methods essential. (SLD)

  16. Instrument performance evaluation

    SciTech Connect

    Swinth, K.L.

    1993-03-01

    Deficiencies exist in both the performance and the quality of health physics instruments. Recognizing the implications of such deficiencies for the protection of workers and the public, in the early 1980s the DOE and the NRC encouraged the development of a performance standard and established a program to test a series of instruments against criteria in the standard. The purpose of the testing was to establish the practicality of the criteria in the standard, to determine the performance of a cross section of available instruments, and to establish a testing capability. Over 100 instruments were tested, resulting in a practical standard and an understanding of the deficiencies in available instruments. In parallel with the instrument testing, a value-impact study clearly established the benefits of implementing a formal testing program. An ad hoc committee also met several times to establish recommendations for the voluntary implementation of a testing program based on the studies and the performance standard. For several reasons, a formal program did not materialize. Ongoing tests and studies have supported the development of specific instruments and have helped specific clients understand the performance of their instruments. The purpose of this presentation is to trace the history of instrument testing to date and suggest the benefits of a centralized formal program.

  17. Quantitative code accuracy evaluation of ISP33

    SciTech Connect

    Kalli, H.; Miwrrin, A.; Purhonen, H.

    1995-09-01

    Aiming at quantifying code accuracy, a methodology based on the Fast Fourier Transform has been developed at the University of Pisa, Italy. The paper gives a short presentation of the methodology and its application to pre-test and post-test calculations submitted to the International Standard Problem ISP33. This was a double-blind natural circulation exercise with a stepwise reduced primary coolant inventory, performed in the PACTEL facility in Finland. PACTEL is a 1/305 volumetrically scaled, full-height simulator of the Russian-type VVER-440 pressurized water reactor, with horizontal steam generators and loop seals in both cold and hot legs. Fifteen foreign organizations participated in ISP33, submitting 21 blind calculations and 20 post-test calculations; altogether, 10 different thermal-hydraulic codes and code versions were used. The results of the application of the methodology to nine selected measured quantities are summarized.

  18. Quantitative image quality evaluation for cardiac CT reconstructions

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.; Balhorn, William; Okerlund, Darin R.

    2016-03-01

    Maintaining image quality in the presence of motion is always desirable and challenging in clinical cardiac CT imaging. Different image-reconstruction algorithms are available on current commercial CT systems that attempt to achieve this goal. It is widely accepted that image-quality assessment should be task-based and involve specific tasks, observers, and associated figures of merit. In this work, we developed an observer model that performed the task of estimating the percentage of plaque in a vessel from CT images. We compared task performance for cardiac CT image data reconstructed using a conventional FBP reconstruction algorithm and the SnapShot Freeze (SSF) algorithm, each at the default and optimal reconstruction cardiac phases. The purpose of this work is to design an approach for quantitative image-quality evaluation of temporal resolution for cardiac CT systems. To simulate heart motion, a moving coronary-type phantom synchronized with an ECG signal was used. Three different percentage plaques embedded in a 3 mm vessel phantom were imaged multiple times under motion-free, 60 bpm, and 80 bpm heart rates. Static (motion-free) images of this phantom were taken as reference images for image template generation. Independent ROIs from the 60 bpm and 80 bpm images were generated by vessel tracking. The observer performed estimation tasks using these ROIs. Ensemble mean square error (EMSE) was used as the figure of merit. Results suggest that the quality of SSF images is superior to the quality of FBP images in higher heart-rate scans.
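
    A minimal sketch of the ensemble mean square error figure of merit named above, computed over repeated estimates of plaque percentage; the numbers are fabricated for illustration and do not come from the study.

    ```python
    import numpy as np

    # EMSE over an ensemble of repeated scans/estimates (values hypothetical).
    true_plaque_pct = 50.0
    estimates_fbp = np.array([46.1, 54.3, 41.8, 57.2, 48.0])
    estimates_ssf = np.array([49.0, 51.5, 48.2, 52.1, 49.6])

    def emse(estimates, truth):
        # Ensemble mean square error: averages both bias and variance effects.
        return float(np.mean((estimates - truth) ** 2))

    print(emse(estimates_fbp, true_plaque_pct), emse(estimates_ssf, true_plaque_pct))
    ```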

  19. Performance Objectives: Foundation for Evaluation

    ERIC Educational Resources Information Center

    McKinney, Floyd L.; Mannebach, Alfred J.

    1970-01-01

    Only when agricultural educators and others evaluate agricultural education programs on the basis of student's performance in relation to valid and realistic performance objectives will progress be made in educational program improvement. (Authors)

  20. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, studied using a quantitative method for assessing regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  1. Evaluating and Improving Teacher Performance.

    ERIC Educational Resources Information Center

    Manatt, Richard P.

    This workbook, coordinated with Manatt Teacher Performance Evaluation (TPE) workshops, summarizes the large group presentation in sequence with the transparencies used. The first four modules of the workbook deal with the state of the art of evaluating and improving teacher performance; the development of the TPE system, including selection of…

  2. Performance evaluation soil samples utilizing encapsulation technology

    DOEpatents

    Dahlgran, J.R.

    1999-08-17

    Performance evaluation soil samples, and a method of their preparation, use encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film-forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration. 1 fig.

  3. Performance evaluation soil samples utilizing encapsulation technology

    DOEpatents

    Dahlgran, James R.

    1999-01-01

    Performance evaluation soil samples and method of their preparation using encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration.

  4. Performance Evaluation and Benchmarking of Intelligent Systems

    SciTech Connect

    Madhavan, Raj; Messina, Elena; Tunstel, Edward

    2009-09-01

    To design and develop capable, dependable, and affordable intelligent systems, their performance must be measurable. Scientific methodologies for standardization and benchmarking are crucial for quantitatively evaluating the performance of emerging robotic and intelligent systems technologies. There is currently no accepted standard for quantitatively measuring the performance of these systems against user-defined requirements; and furthermore, there is no consensus on what objective evaluation procedures need to be followed to understand the performance of these systems. The lack of reproducible and repeatable test methods has precluded researchers working towards a common goal from exchanging and communicating results, inter-comparing system performance, and leveraging previous work that could otherwise avoid duplication and expedite technology transfer. Currently, this lack of cohesion in the community hinders progress in many domains, such as manufacturing, service, healthcare, and security. By providing the research community with access to standardized tools, reference data sets, and open source libraries of solutions, researchers and consumers will be able to evaluate the cost and benefits associated with intelligent systems and associated technologies. In this vein, the edited book volume addresses performance evaluation and metrics for intelligent systems, in general, while emphasizing the need and solutions for standardized methods. To the knowledge of the editors, there is not a single book on the market that is solely dedicated to the subject of performance evaluation and benchmarking of intelligent systems. Even books that address this topic do so only marginally or are out of date. The research work presented in this volume fills this void by drawing from the experiences and insights of experts gained both through theoretical development and practical implementation of intelligent systems in a variety of diverse application domains. The book presents

  5. Performance comparison between static and dynamic cardiac CT on perfusion quantitation and patient classification tasks

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2015-03-01

    Cardiac CT acquisitions for perfusion assessment can be performed in a dynamic or static mode. In this simulation study, we evaluate the relative classification and quantification performance of these modes for assessing myocardial blood flow (MBF). In the dynamic method, a series of low dose cardiac CT acquisitions yields data on contrast bolus dynamics over time; these data are fit with a model to give a quantitative MBF estimate. In the static method, a single CT acquisition is obtained, and the relative CT numbers in the myocardium are used to infer perfusion states. The static method does not directly yield a quantitative estimate of MBF, but these estimates can be roughly approximated by introducing assumed linear relationships between CT number and MBF, consistent with the ways such images are typically visually interpreted. Data obtained by either method may be used for a variety of clinical tasks, including 1) stratifying patients into differing categories of ischemia and 2) using the quantitative MBF estimate directly to evaluate ischemic disease severity. Through simulations, we evaluate the performance on each of these tasks. The dynamic method has very low bias in MBF estimates, making it particularly suitable for quantitative estimation. At matched radiation dose levels, ROC analysis demonstrated that the static method, with its high bias but generally lower variance, has superior performance in stratifying patients, especially for larger patients.

  6. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair-long) DNA reads, which are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability to assign quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software package to this genome length bias. Therefore, we have made a simple benchmark for the evaluation of "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on this simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two packages were under-performers.
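
    The benchmark construction lends itself to a short sketch: equal copy numbers of each genome are fragmented into reads and shuffled together. The toy sequences and the fixed read length below are simplifying assumptions (the study fragmented full bacterial genomes randomly, with an average read length of 150 bp).

    ```python
    import random

    def make_benchmark(genomes, copies=100, read_len=150):
        """Fragment equal copy numbers of each genome into fixed-length reads
        at random positions and shuffle them together (simplified)."""
        reads = []
        for name, seq in genomes.items():
            n_reads_per_copy = max(1, len(seq) // read_len)
            for _ in range(copies * n_reads_per_copy):
                start = random.randrange(0, len(seq) - read_len + 1)
                reads.append((name, seq[start:start + read_len]))
        random.shuffle(reads)
        return reads

    # Toy genomes of different lengths (the real benchmark used full genomes).
    genomes = {"short": "ACGT" * 500, "medium": "ACGT" * 1500, "long": "ACGT" * 3000}
    reads = make_benchmark(genomes)
    # Longer genomes yield proportionally more reads -- the genome length bias
    # an ideal taxon-counting tool must correct for.
    print({k: sum(r[0] == k for r in reads) for k in genomes})
    ```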

  7. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
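
    A minimal sketch of a hierarchical weighted average, assuming two levels: criteria are grouped, each group score is a weighted average of its criteria, and the overall score is a weighted average of the groups. The criteria names and weights are invented for illustration; the paper's exact scheme may differ.

    ```python
    def weighted_avg(scores, weights):
        # Weights at each level are assumed to sum to 1.
        assert abs(sum(weights) - 1.0) < 1e-9
        return sum(s * w for s, w in zip(scores, weights))

    # Hypothetical design alternatives scored on grouped criteria (0-10).
    alternatives = {
        "design_A": {"performance": [8, 6], "cost": [5], "risk": [7, 9]},
        "design_B": {"performance": [7, 9], "cost": [8], "risk": [5, 6]},
    }
    crit_weights = {"performance": [0.6, 0.4], "cost": [1.0], "risk": [0.5, 0.5]}
    group_weights = {"performance": 0.5, "cost": 0.3, "risk": 0.2}

    for name, groups in alternatives.items():
        total = sum(group_weights[g] * weighted_avg(v, crit_weights[g])
                    for g, v in groups.items())
        print(name, round(total, 2))   # design_A 6.7, design_B 7.4
    ```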

  8. Evaluating Administrative/Supervisory Performance.

    ERIC Educational Resources Information Center

    Educational Research Service, Arlington, VA.

    This is a report on the third survey conducted on procedures for evaluating the performance of administrators and supervisors in local school systems. A questionnaire was sent to school systems enrolling 25,000 or more pupils, and results indicated that 84 of the 154 responding systems have formal evaluation procedures. Tables and discussions of…

  9. Quantitative autoradiographic microimaging in the development and evaluation of radiopharmaceuticals

    SciTech Connect

    Som, P.; Oster, Z.H.

    1994-04-01

    Autoradiographic (ARG) microimaging is the method that depicts the biodistribution of radiocompounds with the highest spatial resolution. ARG is applicable to gamma, positron and negatron emitting radiotracers. Dual or multiple-isotope studies can be performed using half-lives and energies for discrimination of isotopes. Quantitation can be performed by digital videodensitometry and by newer filmless technologies. ARGs obtained at different time intervals provide the time dimension for determination of kinetics.

  10. Performance Assessment in Fingerprinting and Multi Component Quantitative NMR Analyses.

    PubMed

    Gallo, Vito; Intini, Nicola; Mastrorilli, Piero; Latronico, Mario; Scapicchio, Pasquale; Triggiani, Maurizio; Bevilacqua, Vitoantonio; Fanizzi, Paolo; Acquotti, Domenico; Airoldi, Cristina; Arnesano, Fabio; Assfalg, Michael; Benevelli, Francesca; Bertelli, Davide; Cagliani, Laura R; Casadei, Luca; Cesare Marincola, Flaminia; Colafemmina, Giuseppe; Consonni, Roberto; Cosentino, Cesare; Davalli, Silvia; De Pascali, Sandra A; D'Aiuto, Virginia; Faccini, Andrea; Gobetto, Roberto; Lamanna, Raffaele; Liguori, Francesca; Longobardi, Francesco; Mallamace, Domenico; Mazzei, Pierluigi; Menegazzo, Ileana; Milone, Salvatore; Mucci, Adele; Napoli, Claudia; Pertinhez, Thelma; Rizzuti, Antonino; Rocchigiani, Luca; Schievano, Elisabetta; Sciubba, Fabio; Sobolev, Anatoly; Tenori, Leonardo; Valerio, Mariacristina

    2015-07-01

    An interlaboratory comparison (ILC) was organized with the aim of setting up quality control indicators suitable for multicomponent quantitative analysis by nuclear magnetic resonance (NMR) spectroscopy. A total of 36 NMR data sets (corresponding to 1260 NMR spectra) were produced by 30 participants using 34 NMR spectrometers. The calibration line method was chosen for the quantification of a five-component model mixture. Results show that quantitative NMR is a robust quantification tool and that 26 out of 36 data sets resulted in statistically equivalent calibration lines for all considered NMR signals. The performance of each laboratory was assessed by means of a new performance index (named Qp-score), which is related to the difference between the experimental and the consensus values of the slope of the calibration lines. Laboratories with a Qp-score falling within the acceptability range are qualified to produce NMR spectra that can be considered statistically equivalent in terms of relative signal intensities. In addition, the specific response of nuclei to the experimental excitation/relaxation conditions was addressed by means of a parameter named NR, which is related to the difference between the theoretical and the consensus slopes of the calibration lines and is specific to each signal produced by a well-defined set of acquisition parameters. PMID:26020452
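
    A rough sketch of the slope-based performance check, on synthetic calibration data. The Qp-score here is approximated as a robust standardized deviation of each laboratory's calibration slope from the consensus slope; the paper's exact definition and acceptability range may differ.

    ```python
    import numpy as np

    # Synthetic peak areas vs. concentration for three hypothetical labs.
    conc = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # arbitrary units
    rng = np.random.default_rng(1)
    lab_areas = {
        "lab1": conc * 2.02 + rng.normal(0, 0.05, 5),
        "lab2": conc * 1.95 + rng.normal(0, 0.05, 5),
        "lab3": conc * 2.40 + rng.normal(0, 0.05, 5),   # deliberate outlier
    }

    slopes = {k: np.polyfit(conc, v, 1)[0] for k, v in lab_areas.items()}
    consensus = np.median(list(slopes.values()))
    spread = np.median([abs(s - consensus) for s in slopes.values()]) or 1e-9

    for k, s in slopes.items():
        # Qp-like index: slope deviation from consensus in robust-spread units.
        print(k, round((s - consensus) / spread, 2))
    ```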

  11. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    PubMed Central

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-01-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output. PMID:26430292

  12. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  13. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  14. Quantitatively evaluating the CBM reservoir using logging data

    NASA Astrophysics Data System (ADS)

    Liu, Zhidi; Zhao, Jingzhou

    2016-02-01

    In order to evaluate coal bed methane (CBM) reservoirs, this paper selects five parameters: porosity, permeability, CBM content, the coal structure index, and the effective thickness of the coal seam. Making full use of logging data and laboratory analyses of coal cores, the logging evaluation methods for the five parameters were discussed in detail, and a comprehensive evaluation model of the CBM reservoir was established. The #5 coal seam of the Hancheng mine on the eastern edge of the Ordos Basin in China was quantitatively evaluated using this method. The results show that the CBM reservoir in the study area is better than in the central and northern regions. The actual development of CBM shows that the regions with good reservoirs have high gas production, indicating that the method introduced in this paper can evaluate the CBM reservoir more effectively.

  15. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease

    PubMed Central

    van Gilst, Merel M.; van Mierlo, Petra; Bloem, Bastiaan R.; Overeem, Sebastiaan

    2015-01-01

    Study Objectives: Many people with Parkinson disease experience "sleep benefit": temporarily improved mobility upon awakening. Here we used quantitative motor tasks to assess the influence of sleep on motor functioning in Parkinson disease. Design: Eighteen Parkinson patients with and 20 without subjective sleep benefit and 20 healthy controls participated. Before and directly after a regular night sleep and an afternoon nap, subjects performed the timed pegboard dexterity task and quantified finger tapping task. Subjective ratings of motor functioning and mood/vigilance were included. Sleep was monitored using polysomnography. Results: On both tasks, patients were overall slower than healthy controls (night: F2,55 = 16.938, P < 0.001; nap: F2,55 = 15.331, P < 0.001). On the pegboard task, there was a small overall effect of night sleep (F1,55 = 9.695, P = 0.003); both patients and controls were on average slightly slower in the morning. However, in both tasks there was no sleep*group interaction for nighttime sleep or for the afternoon nap. There was a modest correlation between the score on the pegboard task and self-rated motor symptoms among patients (rho = 0.233, P = 0.004). No correlations between task performance and mood/vigilance or sleep time/efficiency were found. Conclusions: A positive effect of sleep on motor function is commonly reported by Parkinson patients. Here we show that the subjective experience of sleep benefit is not paralleled by an actual improvement in motor functioning. Sleep benefit therefore appears to be a subjective phenomenon and not a Parkinson-specific reduction in symptoms. Citation: van Gilst MM, van Mierlo P, Bloem BR, Overeem S. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease. SLEEP 2015;38(10):1567–1573. PMID:25902811

  16. Protein quantitation using various modes of high performance liquid chromatography.

    PubMed

    Grotefend, Sandra; Kaminski, Lukas; Wroblewitz, Stefanie; Deeb, Sami El; Kühn, Nancy; Reichl, Stephan; Limberger, Markus; Watt, Steven; Wätzig, Hermann

    2012-12-01

    Pharmaceuticals based on proteins (biologicals), such as monoclonal antibodies (mAbs), have become increasingly relevant since being established as potent drugs in anticancer therapy and for the treatment of autoimmune-based diseases. Because of their high efficacy, it is essential to have accurate and precise methods for protein quantitation and for the detection of protein aggregates, which in some cases may lead to adverse effects after application. The selectivity and precision of traditional protein quantification methods such as the Bradford assay or SDS-PAGE are insufficient for quality control (QC) purposes. In this work, several HPLC separation modes, which can significantly improve these important parameters, were compared for their application in this field. High-performance size exclusion (HP-SEC), strong anion exchange (SAX), and weak cation exchange (WCX), as well as reversed-phase chromatography, are already applied successfully in protein analysis. Good precision (day-to-day RSD of peak areas: SEC <1.9%, SAX <5%, RP <2%, WCX <3.5%), high selectivity, and low quantitation limits (<15 μg/ml) could be achieved for the model proteins ovalbumin, myoglobin, and bovine serum albumin (BSA), and, in the cation-exchange mode, for cytochrome c and lysozyme. Subsequently, the four separation modes were compared to each other and to electrophoretic techniques in terms of precision, selectivity, analysis time, effort of sample and mobile phase preparation, and separating capacity. Moreover, the analysis of an IgG1-type antibody was included in this study. PMID:22980318

  17. Evaluation of four genes in rice for their suitability as endogenous reference standards in quantitative PCR.

    PubMed

    Wang, Chong; Jiang, Lingxi; Rao, Jun; Liu, Yinan; Yang, Litao; Zhang, Dabing

    2010-11-24

    Quantification of genetically modified (GM) food/feed depends on reliable detection systems for endogenous reference genes. Currently, four endogenous reference genes of rice, sucrose phosphate synthase (SPS), GOS9, phospholipase D (PLD), and ppi phosphofructokinase (ppi-PPF), have been used in GM rice detection. To compare the applicability of these four rice reference genes in quantitative PCR systems, we analyzed target nucleotide sequence variation in 58 conventional rice varieties from various geographic and phylogenic origins; their quantification performance was also evaluated using quantitative real-time PCR and GeNorm analysis, a series of statistical calculations that yield an "M value" that is negatively correlated with gene stability. The sequencing analysis showed that the reported GOS9 and PLD TaqMan probe regions had detectable single nucleotide polymorphisms (SNPs) among the tested rice cultivars, while no SNPs were observed for the SPS and ppi-PPF amplicons. Poor quantitative performance was also detectable in the cultivars with SNPs when using the GOS9 and PLD quantitative PCR systems. Even though the PCR efficiency of the ppi-PPF system was slightly lower, the comprehensive quantitative PCR comparison and GeNorm analysis showed the SPS and ppi-PPF quantitative PCR systems to be applicable for rice endogenous reference assays, with less variation among Ct values, good reproducibility in quantitative assays, and low M values. PMID:20961039

  18. High-performance quantitative robust switching control for optical telescopes

    NASA Astrophysics Data System (ADS)

    Lounsbury, William P.; Garcia-Sanz, Mario

    2014-07-01

    This paper introduces an innovative robust and nonlinear control design methodology for high-performance servosystems in optical telescopes. The dynamics of optical telescopes typically vary according to azimuth and altitude angles, temperature, friction, speed and acceleration, leading to nonlinearities and plant parameter uncertainty. The methodology proposed in this paper combines robust Quantitative Feedback Theory (QFT) techniques with nonlinear switching strategies that achieve simultaneously the best characteristics of a set of very active (fast) robust QFT controllers and very stable (slow) robust QFT controllers. A general dynamic model and a variety of specifications from several different commercially available amateur Newtonian telescopes are used for the controller design as well as the simulation and validation. It is also proven that the nonlinear/switching controller is stable for any switching strategy and switching velocity, according to described frequency conditions based on common quadratic Lyapunov functions (CQLF) and the circle criterion.

  19. Evaluating Nursing Students' Clinical Performance.

    PubMed

    Koharchik, Linda; Weideman, Yvonne L; Walters, Cynthia A; Hardy, Elaine

    2015-10-01

    This article is one in a series on the roles of adjunct clinical faculty and preceptors, who teach nursing students to apply knowledge in clinical settings. This article describes aspects of the student evaluation process, which should involve regular feedback and clearly stated performance expectations. PMID:26402292

  20. Physics and Performance Evaluation Group

    SciTech Connect

    Donini, Andrea; Pascoli, Silvia; Winter, Walter; Yasuda, Osamu

    2008-02-21

    We summarize the objectives and results of the "international scoping study of a future neutrino factory and superbeam facility" (ISS) physics working group. Furthermore, we discuss how the ISS study should develop into a neutrino factory design study (IDS-NF) from the point of view of physics and performance evaluation.

  1. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the asbestos content of rock matrices is a complex operation which is susceptible to important errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including greater representativeness of the analyzed sample, more effective recognition of chrysotile, and a lower cost. The DIATI LAA internal methodology for PCOM analysis is based on a mild grinding of a rock sample, its subdivision into 5-6 grain size classes smaller than 2 mm, and a subsequent microscopic analysis of a portion of each class. PCOM is based on the optical properties of asbestos and of liquids of known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. For airborne filters, a binomial (Poisson) distribution can be applied, which theoretically defines the variation in the count of fibers resulting from the observation of analysis fields chosen randomly on the filter. The analysis of rock matrices, instead, cannot lean on any statistical distribution, because the most important object of the analysis is the size of the asbestiform fibers and bundles of fibers observed, and the resulting weight ratio of the fibrous component to the granular one. The error estimates generally provided by public and private institutions vary between 50 and 150 percent, but there are, however, no specific studies that discuss the origin of the error or that link it to the asbestos content. Our work aims to provide a reliable estimation of the error in relation to the applied methodologies and to the total asbestos content, especially for values close to the legal limits. The error assessments must

  2. Quantitative projections of a quality measure: Performance of a complex task

    NASA Astrophysics Data System (ADS)

    Christensen, K.; Kleppe, Gisle; Vold, Martin; Frette, Vidar

    2014-12-01

    Complex data series that arise during interaction between humans (operators) and advanced technology in a controlled and realistic setting have been explored. The purpose is to obtain quantitative measures that reflect quality in task performance: on a ship simulator, nine crews have solved the same exercise, and detailed maneuvering histories have been logged. There are many degrees of freedom, some of them connected to the fact that the vessels may be freely moved in any direction. To compare maneuvering histories, several measures were used: the time needed to reach the position of operation, the integrated angle between the hull direction and the direction of motion, and the extent of movement when the vessel is to be manually kept in a fixed position. These measures are expected to reflect quality in performance. We have also obtained expert quality evaluations of the crews. The quantitative measures and the expert evaluations, taken together, allow a ranking of crew performance. However, except for time and integrated angle, there is no correlation between the individual measures. This may indicate that complex situations with social and man-machine interactions need complex measures of quality in task performance. In general terms, we have established a context-dependent and flexible framework with quantitative measures in contact with a social-science concept that is hard to define. This approach may be useful for other (qualitative) concepts in social science that contain important information on the society.
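
    One of the measures, the integrated angle between the hull direction and the direction of motion, can be sketched directly from a logged maneuvering history. The synthetic log and field names below are assumptions for illustration.

    ```python
    import numpy as np

    # Synthetic maneuvering log: logged hull heading vs. course over ground.
    t = np.linspace(0.0, 600.0, 601)                   # time, s
    heading = np.radians(30.0 + 2.0 * np.sin(t / 60))  # hull direction, rad
    cog = np.radians(30.0 + 6.0 * np.sin(t / 60))      # direction of motion, rad

    # Smallest signed angle difference, wrapped to [-pi, pi].
    diff = np.angle(np.exp(1j * (heading - cog)))

    # Integrate |angle difference| over time (trapezoid-free running sum).
    integrated_angle = float(np.sum(np.abs(diff[:-1]) * np.diff(t)))
    print(round(integrated_angle, 1))                  # rad * s; lower is better
    ```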

  3. Reliability and Validity of the Professional Counseling Performance Evaluation

    ERIC Educational Resources Information Center

    Shepherd, J. Brad; Britton, Paula J.; Kress, Victoria E.

    2008-01-01

    The definition and measurement of counsellor trainee competency is an issue that has received increased attention yet lacks quantitative study. This research evaluates item responses, scale reliability and intercorrelations, interrater agreement, and criterion-related validity of the Professional Performance Fitness Evaluation/Professional…

  4. Quantitative analyses of matching-to-sample performance.

    PubMed Central

    Jones, B M

    2003-01-01

    Six pigeons performed a simultaneous matching-to-sample (MTS) task involving patterns of dots on a liquid-crystal display. Two samples and two comparisons differed in terms of the density of pixels visible through pecking keys mounted in front of the display. Selections of Comparison 1 after Sample 1, and of Comparison 2 after Sample 2, produced intermittent access to food, and errors always produced a time-out. The disparity between the samples and between the comparisons varied across sets of conditions. The ratio of food deliveries for the two correct responses varied over a wide range within each set of conditions, and one condition arranged extinction for correct responses following Sample 1. The quantitative models proposed by Davison and Tustin (1978), Alsop (1991), and Davison (1991) failed to predict performance in some extreme reinforcer-ratio conditions because comparison choice approached indifference (and strong position biases emerged) when the sample clearly signaled a low (or zero) rate of reinforcement. An alternative conceptualization of the reinforcement contingencies operating in MTS tasks is advanced and was supported by further analyses of the data. This model relates the differential responding between the comparisons following each sample to the differential reinforcement for correct responses following that sample. PMID:12908761
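
    A hedged sketch of a Davison-Tustin-style analysis: log behavior ratios after each sample are regressed on the log reinforcer ratio, with the common slope estimating sensitivity to reinforcement and the separation between the two lines estimating discriminability (log d). The data are fabricated and the fitting shortcut is a simplification of how such models are actually fit.

    ```python
    import numpy as np

    # Fabricated session data: reinforcer ratios and resulting log behavior
    # ratios after Sample 1 and Sample 2 (exact lines for clarity).
    log_R = np.log10(np.array([8.0, 4.0, 1.0, 0.25, 0.125]))  # log(R1/R2)
    logB_s1 = 0.8 * log_R + 0.9    # log(B1/B2) following Sample 1
    logB_s2 = 0.8 * log_R - 0.9    # log(B1/B2) following Sample 2

    # Common sensitivity: slope fit over both samples' data pooled.
    a = np.polyfit(np.concatenate([log_R, log_R]),
                   np.concatenate([logB_s1, logB_s2]), 1)[0]
    # Discriminability: half the vertical separation between the two lines.
    log_d = 0.5 * (logB_s1.mean() - logB_s2.mean())
    print(round(a, 2), round(log_d, 2))   # 0.8, 0.9
    ```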

  5. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
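
    An empirical, sampled analogue of this mean-square-error comparison can be sketched by reconstructing a band-limited test signal with two interpolants. The signal, sampling rate, and choice of interpolants (linear vs. cubic spline via SciPy) are assumptions for illustration, not the paper's frequency-domain formulation.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    # Band-limited test signal on a dense grid.
    rng = np.random.default_rng(0)
    x_dense = np.linspace(0, 1, 4001)
    f = sum(rng.normal() * np.sin(2 * np.pi * k * x_dense + rng.uniform(0, 6.28))
            for k in range(1, 8))

    # Sample it coarsely, then reconstruct on the dense grid.
    x_samp, f_samp = x_dense[::100], f[::100]          # 41 samples
    f_lin = np.interp(x_dense, x_samp, f_samp)         # linear interpolant
    f_cub = CubicSpline(x_samp, f_samp)(x_dense)       # cubic spline

    for name, rec in [("linear", f_lin), ("cubic spline", f_cub)]:
        # Empirical mean square reconstruction error.
        print(name, float(np.mean((rec - f) ** 2)))
    ```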

  6. Chinese Middle School Teachers' Preferences Regarding Performance Evaluation Measures

    ERIC Educational Resources Information Center

    Liu, Shujie; Xu, Xianxuan; Stronge, James H.

    2016-01-01

    Teacher performance evaluation currently is receiving unprecedented attention from policy makers, scholars, and practitioners worldwide. This study is one of the few studies of teacher perceptions regarding teacher performance measures that focus on China. We employed a quantitative dominant mixed research design to investigate Chinese teachers'…

  7. Performance Criteria and Evaluation System

    Energy Science and Technology Software Center (ESTSC)

    1992-06-18

    The Performance Criteria and Evaluation System (PCES) was developed in order to make a data base of criteria accessible to radiation safety staff. The criteria included in the package are applicable to occupational radiation safety at DOE reactor and nonreactor nuclear facilities, but any data base of criteria may be created using the Criterion Data Base Utility (CDU). PCES assists personnel in carrying out oversight, line, and support activities.

  8. Metrics for Offline Evaluation of Prognostic Performance

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2010-01-01

    Prognostic performance evaluation has gained significant attention in the past few years. Currently, prognostics concepts lack standard definitions and suffer from ambiguous and inconsistent interpretations. This lack of standards is in part due to the varied end-user requirements for different applications, time scales, available information, domain dynamics, etc. to name a few. The research community has used a variety of metrics largely based on convenience and their respective requirements. Very little attention has been focused on establishing a standardized approach to compare different efforts. This paper presents several new evaluation metrics tailored for prognostics that were recently introduced and were shown to effectively evaluate various algorithms as compared to other conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. These metrics have the capability of incorporating probabilistic uncertainty estimates from prognostic algorithms. In addition to quantitative assessment they also offer a comprehensive visual perspective that can be used in designing the prognostic system. Several methods are suggested to customize these metrics for different applications. Guidelines are provided to help choose one method over another based on distribution characteristics. Various issues faced by prognostics and its performance evaluation are discussed followed by a formal notational framework to help standardize subsequent developments.
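
    One of the prognostics-specific metrics from this line of work (after Saxena et al.) is an alpha-lambda-style accuracy check: at a fraction lambda of the way from the first prediction to end of life, the predicted remaining useful life (RUL) should fall within +/- alpha of the true RUL. The time points and tolerance below are illustrative, and the uncertainty-aware variants of the metric are not shown.

    ```python
    # Alpha-lambda accuracy check (point-prediction form; values hypothetical).
    t_start, t_eol = 0.0, 100.0        # first prediction time, true end of life
    alpha, lam = 0.20, 0.5             # tolerance band, evaluation fraction

    t_check = t_start + lam * (t_eol - t_start)   # time of the check
    true_rul = t_eol - t_check
    predicted_rul = 43.0               # hypothetical algorithm output at t_check

    lo, hi = (1 - alpha) * true_rul, (1 + alpha) * true_rul
    print(lo <= predicted_rul <= hi)   # True: 43 lies within [40, 60]
    ```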

  9. Longitudinal flexural mode utility in quantitative guided wave evaluation

    NASA Astrophysics Data System (ADS)

    Li, Jian

    2001-07-01

    The utility of longitudinal non-axisymmetric flexural modes in quantitative guided wave evaluation is examined for pipe and tube inspection. Attention is focused on hollow cylinders. Several source loading problems, such as a partial-loading angle beam, an axisymmetric comb transducer, and an angle beam array, are studied. The Normal Mode Expansion method is employed to simulate the generated guided wave fields. For non-axisymmetric sources, an important angular profile feature is studied. Based on numerical calculations, an angular profile varies with frequency, mode and propagating distance. Since an angular profile determines the energy distribution of the guided waves, it has a great impact on the pipe inspection capability of guided waves. The simulation of non-axisymmetric angular profiles generated by partial loading is verified by experiments. An angular profile is the superposition of harmonic axisymmetric and non-axisymmetric modes with various phase velocities. A simpler equation is derived to calculate the phase velocities of the non-axisymmetric guided waves and is used to discuss the characteristics of non-axisymmetric guided waves. Angular profiles have many applications in practical pipe testing. The procedure for building desired angular profiles, and also angular profile tuning, is discussed. The angular profile tuning process is implemented by a phased transducer array and a special computational algorithm. Since a transducer array plays a critical role in guided wave inspection, the performance of a transducer array is discussed in terms of guided wave mode control ability and excitation sensitivity. With time delay inputs, a transducer array is greatly improved in its mode control ability and sensitivity. The algorithms for setting time delays are derived based on frequency, element spacing and phase velocity. With the help of the conclusions drawn on non-axisymmetric guided waves, a phased circumferential partial-loading array is
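
    The abstract notes that array time delays follow from frequency, element spacing, and phase velocity. A textbook version of that relation, with illustrative numbers, is sketched below; frequency enters through the dispersion of the phase velocity, and the thesis' actual delay algorithm may be more elaborate.

    ```python
    # Linear time delays for a phased array exciting a guided-wave mode with
    # phase velocity cp: each element fires later by the time the wavefront
    # needs to cross one element spacing.
    def element_delays(n_elements, spacing_m, cp_m_per_s):
        dt = spacing_m / cp_m_per_s     # inter-element delay, s
        return [i * dt for i in range(n_elements)]

    # Example: 8 elements, 10 mm pitch, 5400 m/s phase velocity (illustrative;
    # cp would be read from the dispersion curve at the chosen frequency).
    delays = element_delays(8, 0.010, 5400.0)
    print([round(d * 1e6, 3) for d in delays])   # delays in microseconds
    ```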

  10. Optimizing Digital Health Informatics Interventions Through Unobtrusive Quantitative Process Evaluations.

    PubMed

    Gude, Wouter T; van der Veer, Sabine N; de Keizer, Nicolette F; Coiera, Enrico; Peek, Niels

    2016-01-01

    Health informatics interventions such as clinical decision support (CDS) and audit and feedback (A&F) are variably effective at improving care because the underlying mechanisms through which these interventions bring about change are poorly understood. This limits our ability to design better interventions. Process evaluations can be used to improve this understanding by assessing fidelity and quality of implementation, clarifying causal mechanisms, and identifying contextual factors associated with variation in outcomes. Coiera describes the intervention process as a series of stages extending from interactions to outcomes: the "information value chain". However, past process evaluations often did not assess the relationships between those stages. In this paper we argue that the chain can be measured quantitatively and unobtrusively in digital interventions thanks to the availability of electronic data that are a by-product of their use. This provides novel possibilities to study the mechanisms of informatics interventions in detail and inform essential design choices to optimize their efficacy. PMID:27577453
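
    As a rough illustration of measuring the chain unobtrusively, the sketch below tallies how many users of a hypothetical digital intervention reached each successive stage. The stage names loosely paraphrase the value-chain idea; all names and data are invented for illustration.

        from collections import Counter

        # Hypothetical by-product event log of a CDS/A&F intervention; each
        # entry is the furthest stage a clinician reached.
        STAGES = ["delivered", "viewed", "decision_changed",
                  "action_taken", "outcome_improved"]
        log = ["viewed", "delivered", "action_taken", "viewed",
               "decision_changed", "delivered", "outcome_improved", "viewed"]

        reached = Counter(log)
        # Cumulative counts: reaching a late stage implies passing earlier ones.
        cumulative = [sum(reached[s] for s in STAGES[i:]) for i in range(len(STAGES))]
        for stage, n in zip(STAGES, cumulative):
            print(f"{stage:18s} {n:3d}  ({100.0 * n / cumulative[0]:.0f}%)")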

  11. The Nuclear Renaissance — Implications on Quantitative Nondestructive Evaluations

    NASA Astrophysics Data System (ADS)

    Matzie, Regis A.

    2007-03-01

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, and Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that had actually turned away from nuclear energy are reconsidering the advisability of that decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  12. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain, with its queues (storages, shelves) and mechanisms for ordering products, is usually not taken into account. As a consequence, storage times, which are mutually dependent across successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety. PMID:20055976
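
    The following toy sketch conveys the underlying idea, chaining successive waiting times from farm to fork so that the tail of the total storage-time distribution can be inspected. It is a deliberately simplified Monte Carlo stand-in for a full discrete-event scheduler, and all distributions are hypothetical rather than the lettuce-chain parameters of the article.

        import random

        random.seed(1)

        def simulate(n_products):
            """Chain two hypothetical storage delays per product and collect
            the total storage time, whose tail matters for QMRA."""
            storage_times = []
            for _ in range(n_products):
                wait_store = random.expovariate(1 / 12.0)  # cold-store queue (h)
                wait_shelf = random.uniform(0, 48.0)       # shelf time until sale (h)
                storage_times.append(wait_store + wait_shelf)
            return storage_times

        times = sorted(simulate(10_000))
        print("95th percentile storage time (h):", round(times[int(0.95 * len(times))], 1))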

  13. Teaching Quantitative Literacy through a Regression Analysis of Exam Performance

    ERIC Educational Resources Information Center

    Lindner, Andrew M.

    2012-01-01

    Quantitative literacy is increasingly essential for both informed citizenship and a variety of careers. Though regression is one of the most common methods in quantitative sociology, it is rarely taught until late in students' college careers. In this article, the author describes a classroom-based activity introducing students to regression…
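
    A regression of exam performance of the kind described can be run in a few lines; the sketch below fits an ordinary least-squares model to invented class data (the article itself does not specify software or variables).

        import numpy as np

        # Hypothetical class data: hours studied, attendance rate, exam score.
        hours = np.array([2, 5, 1, 8, 4, 6, 3, 7], dtype=float)
        attendance = np.array([0.6, 0.9, 0.5, 1.0, 0.8, 0.7, 0.9, 1.0])
        score = np.array([58, 76, 51, 90, 70, 74, 68, 88], dtype=float)

        # Design matrix with an intercept column, solved by least squares.
        X = np.column_stack([np.ones_like(hours), hours, attendance])
        coef, *_ = np.linalg.lstsq(X, score, rcond=None)
        print("intercept, b_hours, b_attendance:", np.round(coef, 2))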

  14. A review of published quantitative experimental studies on factors affecting laboratory fume hood performance.

    PubMed

    Ahn, Kwangseog; Woskie, Susan; DiBerardinis, Louis; Ellenbecker, Michael

    2008-11-01

    This study attempted to identify the important factors that affect the performance of a laboratory fume hood and the relationship between the factors and hood performance under various conditions by analyzing and generalizing the results from other studies that quantitatively investigated fume hood performance. A literature search identified 43 studies that were published from 1966 to 2006. For each of those studies, information on the type of test methods used, the factors investigated, and the findings were recorded and summarized. Among the 43 quantitative experimental studies, 21 comparable studies were selected, and then a meta-analysis of the comparable studies was conducted. The exposure concentration variable from the resulting 617 independent test conditions was dichotomized into acceptable or unacceptable using the control level of 0.1 ppm tracer gas. Regression analysis using Cox proportional hazards models provided hood failure ratios for potential exposure determinants. The variables that were found to be statistically significant were the presence of a mannequin/human subject, the distance between a source and breathing zone, and the height of sash opening. In summary, performance of laboratory fume hoods was affected mainly by the presence of a mannequin/human subject, distance between a source and breathing zone, and height of sash opening. Presence of a mannequin/human subject in front of the hood adversely affects hood performance. Worker exposures to air contaminants can be greatly reduced by increasing the distance between the contaminant source and breathing zone and by reducing the height of sash opening. Many other factors can also affect hood performance. Checking face velocity by itself is unlikely to be sufficient in evaluating hood performance properly. An evaluation of the performance of a laboratory fume hood should be performed with a human subject or a mannequin in front of the hood and should address the effects of the activities
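
    The study fit Cox proportional hazards models to the dichotomized exposure outcome; as a simpler stand-in, the sketch below fits a logistic regression to hypothetical test-condition data for the three significant determinants. This is a swapped-in technique shown only to illustrate the kind of analysis, not the paper's exact model.

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical test conditions: mannequin present (0/1), source-to-
        # breathing-zone distance (m), sash opening height (m); outcome 1 =
        # unacceptable exposure (> 0.1 ppm tracer gas).
        rng = np.random.default_rng(0)
        n = 200
        mannequin = rng.integers(0, 2, n)
        distance = rng.uniform(0.1, 0.6, n)
        sash = rng.uniform(0.2, 0.7, n)
        logit_p = -1.0 + 1.5 * mannequin - 4.0 * distance + 3.0 * sash
        fail = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        X = sm.add_constant(np.column_stack([mannequin, distance, sash]))
        model = sm.Logit(fail, X).fit(disp=False)
        # exp(coefficients): intercept term plus odds ratios for the three
        # exposure determinants.
        print(np.exp(model.params))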

  15. Evaluation of solar pond performance

    SciTech Connect

    Wittenberg, L.J.

    1981-01-01

    During 1978 the City of Miamisburg constructed a large, salt-gradient solar pond as part of its community park development project. The thermal energy stored in the pond is being used to heat an outdoor swimming pool in the summer and an adjacent recreational building during part of the winter. This solar pond, which occupies an area of 2020 m^2 (22,000 ft^2), was designed from experience obtained at smaller research ponds. This project is directed toward collecting data on and evaluating the thermal performance and operational characteristics of the largest operational salt-gradient solar pond in the United States; gaining firsthand experience with the maintenance, adjustments, and repairs required of a large, operational solar pond facility; and providing technical consultation regarding the operation and optimization of the pond's performance.

  16. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Performance evaluation... Performance evaluation. Preparation of performance evaluation reports. (a) In addition to the requirements of FAR 36.604, performance evaluation reports shall be prepared for indefinite-delivery type...

  17. Quantitative Evaluation and Selection of Reference Genes for Quantitative RT-PCR in Mouse Acute Pancreatitis

    PubMed Central

    Yan, Zhaoping; Gao, Jinhang; Lv, Xiuhe; Yang, Wenjuan; Wen, Shilei; Tong, Huan; Tang, Chengwei

    2016-01-01

    The analysis of differences in gene expression is dependent on normalization using reference genes. However, the expression of many of these reference genes, as evaluated by quantitative RT-PCR, is upregulated in acute pancreatitis, so they cannot be used as the standard for gene expression in this condition. For this reason, we sought to identify a stable reference gene, or a suitable combination, for expression analysis in acute pancreatitis. The expression stability of 10 reference genes (ACTB, GAPDH, 18sRNA, TUBB, B2M, HPRT1, UBC, YWHAZ, EF-1α, and RPL-13A) was analyzed using geNorm, NormFinder, and BestKeeper software and evaluated according to variations in the raw Ct values. These reference genes were evaluated using a comprehensive method, which ranked the expression stability of these genes as follows (from most stable to least stable): RPL-13A, YWHAZ > HPRT1 > GAPDH > UBC > EF-1α > 18sRNA > B2M > TUBB > ACTB. RPL-13A was the most suitable reference gene, and the combination of RPL-13A and YWHAZ was the most stable group of reference genes in our experiments. The expression levels of ACTB, TUBB, and B2M were found to be significantly upregulated during acute pancreatitis, whereas the expression level of 18sRNA was downregulated. Thus, we recommend the use of RPL-13A or a combination of RPL-13A and YWHAZ for normalization in qRT-PCR analyses of gene expression in mouse models of acute pancreatitis. PMID:27069927
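
    A crude stand-in for the stability screens mentioned above (geNorm, NormFinder, BestKeeper) is to rank candidate genes by the spread of their raw Ct values, as in this sketch with invented data; the dedicated tools apply more sophisticated pairwise-variation and model-based criteria.

        import numpy as np

        # Hypothetical raw Ct values (rows = samples, columns = candidate genes).
        genes = ["RPL-13A", "YWHAZ", "GAPDH", "ACTB"]
        ct = np.array([[18.1, 20.3, 19.8, 17.2],
                       [18.3, 20.5, 20.6, 18.9],
                       [18.0, 20.2, 20.1, 19.5],
                       [18.2, 20.4, 19.5, 16.8]])

        # BestKeeper-style screen: smaller SD of Ct = more stable expression.
        sd = ct.std(axis=0, ddof=1)
        for name, s in sorted(zip(genes, sd), key=lambda p: p[1]):
            print(f"{name:8s} SD(Ct) = {s:.2f}")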

  18. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  19. Quantitative genetic activity graphical profiles for use in chemical evaluation

    SciTech Connect

    Waters, M.D.; Stack, H.F.; Garrett, N.E.; Jackson, M.A.

    1990-12-31

    A graphic approach, termed a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or the highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profiles was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiled by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. By examining the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in the evaluation of chemical analogs. GAPs have provided useful data for the development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.
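
    The following sketch illustrates the dose convention described above on invented records: for each agent/bioassay pair it keeps the lowest effective dose when any test was positive, and otherwise the highest ineffective dose. The record format and values are hypothetical, not the GAP software's actual data layout.

        from collections import defaultdict

        # Hypothetical test records: (agent, bioassay, result, dose in ug/mL).
        records = [("chem_A", "Salmonella reverse mutation", "+", 5.0),
                   ("chem_A", "Salmonella reverse mutation", "-", 1.0),
                   ("chem_A", "mouse micronucleus", "-", 200.0),
                   ("chem_B", "Salmonella reverse mutation", "+", 50.0)]

        by_pair = defaultdict(list)
        for agent, assay, result, dose in records:
            by_pair[(agent, assay)].append((result, dose))

        for (agent, assay), tests in sorted(by_pair.items()):
            positives = [d for r, d in tests if r == "+"]
            if positives:
                print(agent, assay, "LED =", min(positives))            # lowest effective dose
            else:
                print(agent, assay, "HID =", max(d for _, d in tests))  # highest ineffective dose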

  20. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 236.604 Performance evaluation. Prepare a separate performance evaluation after... familiar with the architect-engineer contractor's performance....

  1. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 236.604 Performance evaluation. Prepare a separate performance evaluation after... familiar with the architect-engineer contractor's performance....

  2. Evaluation of a virucidal quantitative carrier test for surface disinfectants.

    PubMed

    Rabenau, Holger F; Steinmann, Jochen; Rapp, Ingrid; Schwebke, Ingeborg; Eggers, Maren

    2014-01-01

    Surface disinfectants are part of broader preventive strategies against the transmission of bacteria, fungi and viruses in medical institutions. To evaluate their virucidal efficacy, these products must be tested with appropriate model viruses with different physico-chemical properties under conditions representing practical application in hospitals. The aim of this study was to evaluate a quantitative carrier assay. Furthermore, different putative model viruses like adenovirus type 5 (AdV-5) and different animal parvoviruses were evaluated with respect to their tenacity and practicability in laboratory handling. To evaluate the robustness of the method, some of the viruses were tested in parallel in different laboratories in a multi-center study. Different biocides, which are common active ingredients of surface disinfectants, were used in the test. After drying on stainless steel discs as the carrier, model viruses were exposed to different concentrations of three alcohols, peracetic acid (PAA) or glutaraldehyde (GDA), with a fixed exposure time of 5 minutes. Residual virus was determined after treatment by endpoint titration. All parvoviruses exhibited a similar stability with respect to GDA, while AdV-5 was more susceptible. For PAA, the porcine parvovirus was more sensitive than the other parvoviruses, and again, AdV-5 presented a higher susceptibility than the parvoviruses. All parvoviruses were resistant to alcohols, while AdV-5 was only stable when treated with 2-propanol. The analysis of the results of the multi-center study showed a high reproducibility of this test system. In conclusion, two viruses with different physico-chemical properties can be recommended as appropriate model viruses for the evaluation of the virucidal efficacy of surface disinfectants: AdV-5, which has a high clinical impact, and murine parvovirus (MVM) with the highest practicability among the parvoviruses tested. PMID:24475079
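
    Efficacy in such carrier tests is typically expressed as a log10 reduction factor between control and disinfectant-treated carriers; a minimal sketch with hypothetical titers:

        import math

        def log10_reduction(titer_control, titer_treated):
            """Reduction factor (RF) in log10 steps between control and treated
            carriers. Titers are infectious units per mL (e.g. TCID50/mL)
            recovered by endpoint titration; values below are hypothetical."""
            return math.log10(titer_control) - math.log10(titer_treated)

        rf = log10_reduction(titer_control=10**7.5, titer_treated=10**3.2)
        print(f"RF = {rf:.1f} log10")  # >= 4 log10 is a common efficacy criterion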

  4. Predictive Heterosis in Multibreed Evaluations Using Quantitative and Molecular Approaches

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Heterosis is the extra genetic boost in performance obtained by crossing two cattle breeds. It is an important tool for increasing the efficiency of beef production. It is also important to adjust data used to calculate genetic evaluations for differences in heterosis. Good estimates of heterosis...

  5. Quantitative evaluations of ankle spasticity and stiffness in neurological disorders using manual spasticity evaluator

    PubMed Central

    Peng, Qiyu; Park, Hyung-Soon; Shah, Parag; Wilson, Nicole; Ren, Yupeng; Wu, Yi-Ning; Liu, Jie; Gaebler-Spira, Deborah J.; Zhang, Li-Qun

    2013-01-01

    Spasticity and contracture are major sources of disability in people with neurological impairments and have been evaluated using various instruments: the Modified Ashworth Scale, tendon reflex scale, pendulum test, mechanical perturbations, and passive joint range of motion (ROM). These measures generally are either convenient for clinical use but not quantitative, or quantitative but difficult to use conveniently in clinics. We have developed a manual spasticity evaluator (MSE) to evaluate spasticity/contracture quantitatively and conveniently, with ankle ROM and stiffness measured at a controlled low velocity and joint resistance and Tardieu catch angle measured at several higher velocities. We found that the Tardieu catch angle was linearly related to velocity, indicating that increased resistance at higher velocities was felt at stiffer positions further into the range and, thus, that the velocity dependence of spasticity may also be position dependent. This finding indicates the need to control velocity in spasticity evaluation, which is achieved with the MSE. Quantitative measurements of spasticity, stiffness, and ROM can lead to more accurate characterizations of pathological conditions and outcome evaluations of interventions, potentially contributing to better healthcare services for patients with neurological disorders such as cerebral palsy, spinal cord injury, traumatic brain injury, and stroke. PMID:21674395
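
    The reported linear relation between stretch velocity and Tardieu catch angle can be checked with a simple least-squares fit; the sketch below uses invented MSE-style measurements.

        import numpy as np

        # Hypothetical measurements: stretch velocity (deg/s) vs. Tardieu catch
        # angle (deg of dorsiflexion at which the catch is felt).
        velocity = np.array([90.0, 150.0, 210.0, 270.0])
        catch_angle = np.array([8.0, 3.5, -1.0, -6.0])

        slope, intercept = np.polyfit(velocity, catch_angle, 1)
        r = np.corrcoef(velocity, catch_angle)[0, 1]
        print(f"catch angle = {slope:.3f} * velocity + {intercept:.1f}  (r = {r:.2f})")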

  6. Formative Evaluation in the Performance Context.

    ERIC Educational Resources Information Center

    Dick, Walter; King, Debby

    1994-01-01

    Reviews the traditional formative evaluation model used by instructional designers; summarizes Kirkpatrick's model of evaluation; proposes the integration of part of Kirkpatrick's model with traditional formative evaluation; and discusses performance-context formative evaluation. (three references) (LRW)

  7. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  8. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  9. SEASAT SAR performance evaluation study

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The performance of the SEASAT synthetic aperture radar (SAR) sensor was evaluated using data processed by the MDA digital processor. Two particular aspects are considered: the location accuracy of image data, and the calibration of the measured backscatter amplitude of a set of corner reflectors. The image location accuracy was assessed by selecting identifiable targets in several scenes, converting their image location to UTM coordinates, and comparing the results to map sheets. The error standard deviation is measured to be approximately 30 meters. The amplitude was calibrated by measuring the responses of the Goldstone corner reflector array and comparing the results to theoretical values. A linear regression of the measured against theoretical values results in a slope of 0.954 with a correlation coefficient of 0.970.

  10. Quantitative evaluation of image registration techniques in the case of retinal images

    NASA Astrophysics Data System (ADS)

    Gavet, Yann; Fernandes, Mathieu; Pinoli, Jean-Charles

    2012-04-01

    In human retina observation (with non-mydriatic optical microscopes), an image registration process is often employed to enlarge the field of view, and analyzing all the images takes considerable time. Numerous techniques have been proposed to perform the registration process; evaluating them properly, however, remains a difficult open question. This article presents the use of two quantitative criteria to evaluate and compare some classical feature-based image registration techniques. The images are first segmented, and the resulting binary images are then registered. Registration quality is evaluated with a normalized criterion based on the ɛ dissimilarity criterion, and with the figure of merit criterion (fom), for 25 pairs of images with a manual selection of control points. These criteria are normalized by the results of the affine method (considered the simplest method). Then, for each pair, the influence of the number of points used to perform the registration is evaluated.
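
    A figure of merit of this kind is commonly computed in Pratt's formulation, which scores each detected pixel by its distance to the nearest reference pixel. The sketch below implements that standard formulation on synthetic binary images; the article's exact definition may differ in detail.

        import numpy as np
        from scipy.ndimage import distance_transform_edt

        def pratt_fom(detected, reference, alpha=1.0 / 9.0):
            """Pratt's figure of merit between two binary images; 1.0 means
            perfect agreement with the reference set."""
            detected = np.asarray(detected, bool)
            reference = np.asarray(reference, bool)
            d = distance_transform_edt(~reference)  # distance to reference pixels
            n = max(detected.sum(), reference.sum())
            return np.sum(1.0 / (1.0 + alpha * d[detected] ** 2)) / n

        ref = np.zeros((64, 64), bool); ref[32, 10:54] = True
        det = np.zeros((64, 64), bool); det[33, 12:56] = True  # shifted by one pixel
        print(f"fom = {pratt_fom(det, ref):.2f}")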

  11. Evaluation of squib performance variables

    SciTech Connect

    Munger, A.C.; Woods, C.M.; Phillabaum, M.R.

    1991-01-01

    The use of a Kinetic Energy Device for measuring the output of a pyrotechnic squib or actuator was presented in the proceedings of the Thirteenth Pyrotechnic Seminar, held in Grand Junction, Colorado, in 1988. This device was demonstrated to be a valuable tool for evaluating the interface design between the squib and the next assembly. The thrust of this investigation was to evaluate the amount of containment that the interface provides and its effect on the amount of energy transmitted to a moving piston on the other side of the interface. The experiments repeated tests done with another test device known as the Variable Explosive Chamber; those data were presented in the proceedings of the Twelfth Pyrotechnic Seminar, held in Juan-les-Pins, France, in 1987. A second area of investigation was to determine the effects of variation in the average compaction density and total mass of the pyrotechnic powder load on the performance of the squib. The data shown here are for one specific geometry but may have implications for other geometries and even for other devices such as ignitors or matches. The equations of motion are examined for two geometries of test actuators. Pressure pulse curves are derived from the displacement-versus-time records for the extremes of a constant-density, variable-mass test series. 4 refs.

  12. Quantitative methods for somatosensory evaluation in atypical odontalgia.

    PubMed

    Porporatti, André Luís; Costa, Yuri Martins; Stuginski-Barbosa, Juliana; Bonjardim, Leonardo Rigoldi; Conti, Paulo César Rodrigues; Svensson, Peter

    2015-01-01

    A systematic review was conducted to identify reliable somatosensory evaluation methods for atypical odontalgia (AO) patients. The computerized search included the main databases (MEDLINE, EMBASE, and Cochrane Library). The studies included used the following quantitative sensory testing (QST) methods: mechanical detection threshold (MDT), mechanical pain threshold (MPT) (pinprick), pressure pain threshold (PPT), dynamic mechanical allodynia with a cotton swab (DMA1) or a brush (DMA2), warm detection threshold (WDT), cold detection threshold (CDT), heat pain threshold (HPT), cold pain threshold (CPT), and/or wind-up ratio (WUR). The publications meeting the inclusion criteria revealed that only mechanical allodynia tests (DMA1, DMA2, and WUR) were significantly higher and pain threshold tests to heat stimulation (HPT) were significantly lower in the affected side, compared with the contralateral side, in AO patients; however, for MDT, MPT, PPT, CDT, and WDT, the results were not significant. These data support the presence of central sensitization features, such as allodynia and temporal summation. In contrast, considerable inconsistencies between studies were found when AO patients were compared with healthy subjects. In clinical settings, the most reliable evaluation method for AO in patients with persistent idiopathic facial pain would be intraindividual assessments using HPT or mechanical allodynia tests. PMID:25627886

  13. Improving Student Retention and Performance in Quantitative Courses Using Clickers

    ERIC Educational Resources Information Center

    Liu, Wallace C.; Stengel, Donald N.

    2011-01-01

    Clickers offer instructors of mathematics-related courses an opportunity to involve students actively in class sessions while diminishing the embarrassment of being wrong. This paper reports on the use of clickers in two university-level courses in quantitative analysis and business statistics. Results for student retention and examination…

  14. Qualitative and quantitative evaluation of solvent systems for countercurrent separation.

    PubMed

    Friesen, J Brent; Ahmed, Sana; Pauli, Guido F

    2015-01-16

    Rational solvent system selection for countercurrent chromatography and centrifugal partition chromatography technology (collectively known as countercurrent separation) studies continues to be a scientific challenge as the fundamental questions of comparing polarity range and selectivity within a solvent system family and between putative orthogonal solvent systems remain unanswered. The current emphasis on metabolomic investigations and analysis of complex mixtures necessitates the use of successive orthogonal countercurrent separation (CS) steps as part of complex fractionation protocols. Addressing the broad range of metabolite polarities demands development of new CS solvent systems with appropriate composition, polarity (π), selectivity (σ), and suitability. In this study, a mixture of twenty commercially available natural products, called the GUESSmix, was utilized to evaluate both solvent system polarity and selectively characteristics. Comparisons of GUESSmix analyte partition coefficient (K) values give rise to a measure of solvent system polarity range called the GUESSmix polarity index (GUPI). Solvatochromic dye and electrical permittivity measurements were also evaluated in quantitatively assessing solvent system polarity. The relative selectivity of solvent systems were evaluated with the GUESSmix by calculating the pairwise resolution (αip), the number of analytes found in the sweet spot (Nsw), and the pairwise resolution of those sweet spot analytes (αsw). The combination of these parameters allowed for both intra- and inter-family comparison of solvent system selectivity. Finally, 2-dimensional reciprocal shifted symmetry plots (ReSS(2)) were created to visually compare both the polarities and selectivities of solvent system pairs. This study helps to pave the way to the development of new solvent systems that are amenable to successive orthogonal CS protocols employed in metabolomic studies. PMID:25542704
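
    The basic quantities behind such comparisons are the partition coefficient K of each analyte and the pairwise selectivity between analytes; a minimal sketch with invented partition data (the commonly cited CS "sweet spot" is roughly 0.4 <= K <= 2.5):

        # Hypothetical partition data for three GUESSmix-style analytes:
        # (concentration in stationary phase, concentration in mobile phase).
        analytes = {"caffeine": (0.8, 1.6),
                    "quercetin": (2.4, 1.2),
                    "naringenin": (3.0, 1.0)}

        # Partition coefficient K = C_stationary / C_mobile.
        K = {name: c_stat / c_mob for name, (c_stat, c_mob) in analytes.items()}

        # Pairwise selectivity: alpha = K2 / K1 with K2 >= K1.
        names = list(K)
        for i in range(len(names)):
            for j in range(i + 1, len(names)):
                k1, k2 = sorted((K[names[i]], K[names[j]]))
                print(f"alpha({names[i]}, {names[j]}) = {k2 / k1:.2f}")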

  15. Performance evaluation methodology for historical document image binarization.

    PubMed

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme. PMID:23008259
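
    For orientation, plain (unweighted) pixel-based recall and precision between a binarization result and its ground truth can be computed as below; the paper's contribution is a weighting scheme on top of these measures, which this sketch deliberately omits.

        import numpy as np

        def pixel_scores(binarized, ground_truth):
            """Plain pixel-based recall, precision and F-measure (no weighting)."""
            b = np.asarray(binarized, bool)
            g = np.asarray(ground_truth, bool)
            tp = np.sum(b & g)
            recall = tp / g.sum()
            precision = tp / b.sum()
            f = 2 * recall * precision / (recall + precision)
            return recall, precision, f

        gt = np.zeros((8, 8), bool); gt[2:6, 2:6] = True
        out = np.zeros((8, 8), bool); out[2:6, 2:7] = True
        print(pixel_scores(out, gt))  # recall 1.00, precision 0.80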

  16. A quantitative evaluation of the AVITEWRITE model of handwriting learning.

    PubMed

    Paine, R W; Grossberg, S; Van Gemmert, A W A

    2004-12-01

    Much sensory-motor behavior develops through imitation, as during the learning of handwriting by children. Such complex sequential acts are broken down into distinct motor control synergies, or muscle groups, whose activities overlap in time to generate continuous, curved movements that obey an inverse relation between curvature and speed. The adaptive vector integration to endpoint handwriting (AVITEWRITE) model of Grossberg and Paine (2000) [A neural model of corticocerebellar interactions during attentive imitation and predictive learning of sequential handwriting movements. Neural Networks, 13, 999-1046] addressed how such complex movements may be learned through attentive imitation. The model suggested how parietal and motor cortical mechanisms, such as difference vector encoding, interact with adaptively-timed, predictive cerebellar learning during movement imitation and predictive performance. Key psychophysical and neural data about learning to make curved movements were simulated, including a decrease in writing time as learning progresses; generation of unimodal, bell-shaped velocity profiles for each movement synergy; size scaling with isochrony, and speed scaling with preservation of the letter shape and the shapes of the velocity profiles; an inverse relation between curvature and tangential velocity; and a two-thirds power law relation between angular velocity and curvature. However, the model learned from letter trajectories of only one subject, and only qualitative kinematic comparisons were made with previously published human data. The present work describes a quantitative test of AVITEWRITE through direct comparison of a corpus of human handwriting data with the model's performance when it learns by tracing the human trajectories. The results show that model performance was variable across the subjects, with an average correlation between the model and human data of 0.89 ± 0.10. The present data from simulations using the AVITEWRITE model
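
    The two-thirds power law mentioned above predicts that angular velocity A scales as K * C^(2/3) with curvature C, i.e. a slope near 2/3 in log-log coordinates; a minimal check on synthetic data:

        import numpy as np

        # Hypothetical pen-tip samples from a curved stroke: curvature C and
        # angular velocity A generated to follow A = 3 * C**(2/3) plus noise.
        C = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
        noise = np.exp(np.random.default_rng(0).normal(0, 0.02, 5))
        A = 3.0 * C ** (2.0 / 3.0) * noise

        # Fit the exponent in log-log coordinates.
        beta, log_k = np.polyfit(np.log(C), np.log(A), 1)
        print(f"fitted exponent = {beta:.2f} (expected ~0.67)")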

  17. A quantitative evaluation of models for Aegean crustal deformation

    NASA Astrophysics Data System (ADS)

    Nyst, M.; Thatcher, W.

    2003-04-01

    Modeling studies of eastern Mediterranean tectonics show that Aegean deformation is mainly determined by WSW directed expulsion of Anatolia and SW directed extension due to roll-back of African lithosphere along the Hellenic trench. How motion is transferred across the Aegean remains a subject of debate. The two most widely used hypotheses for Aegean tectonics assert fundamentally different mechanisms. The first model describes deformation as a result of opposing rotations of two rigid microplates separated by a zone of extension. In the second model most motion is accommodated by shear on a series of dextral faults and extension on graben systems. These models make different quantitative predictions for the crustal deformation field that can be tested by a new, spatially dense GPS velocity data set. To convert the GPS data into crustal deformation parameters we use different methods to model complementary aspects of crustal deformation. We parameterize the main fault and plate boundary structures of both models and produce representations for the crustal deformation field that range from purely rigid rotations of microplates, via interacting, elastically deforming blocks separated by crustal faults to a continuous velocity gradient field. Critical evaluation of these models indicates strengths and limitations of each and suggests new measurements for further refining understanding of present-day Aegean tectonics.

  18. [Effects of calcinogenic plants--qualitative and quantitative evaluation].

    PubMed

    Mello, J R; Habermehl, G G

    1998-01-01

    Different research methods have demonstrated the presence of variable quantities of vitamin D, as well as its metabolites, in calcinogenic plants. Most experiments indicated that the active component is most probably the metabolite 1,25(OH)2D3 bound as a glycoside. This research evaluated the presence of substances with vitamin D-like activity in the calcinogenic plants Solanum malacoxylon, Cestrum diurnum, Trisetum flavescens and Nierembergia veitchii by administering different extracts of these plants orally to rachitic chicks within the research model "strontium-added alimentation". After oral administration of the extracts, the serum was analysed to determine the levels of calcium, phosphorus and alkaline phosphatase. The results obtained with chicks demonstrated the presence of substances with vitamin D-like activity in all four plants. Solanum malacoxylon and Cestrum diurnum also contained water-soluble substances with elevated activity, indicated by significantly high levels of calcium and phosphorus combined with a reduced activity of alkaline phosphatase; this indicated the presence of 1,25(OH)2D3 in both plants. The water-soluble character of the active substance in both plants is most probably explained by the metabolite 1,25(OH)2D3 being bound as a glycoside at the O-25 position of the molecule. Nierembergia veitchii and Trisetum flavescens contained only minor concentrations of water-soluble substances. The results for the four analysed plants were evaluated quantitatively as follows: Solanum malacoxylon, 82,800 IU of vitamin D/kg; Cestrum diurnum, 63,200 IU of vitamin D/kg; Nierembergia veitchii, 16,400 IU/kg; and Trisetum flavescens, 12,000 IU/kg. All concentrations are calcinogenic. PMID:9499629

  19. Image performance evaluation of a 3D surgical imaging platform

    NASA Astrophysics Data System (ADS)

    Petrov, Ivailo E.; Nikolov, Hristo N.; Holdsworth, David W.; Drangova, Maria

    2011-03-01

    The O-arm (Medtronic Inc.) is a multi-dimensional surgical imaging platform. The purpose of this study was to perform a quantitative evaluation of the imaging performance of the O-arm in an effort to understand its potential for future nonorthopedic applications. Performance of the reconstructed 3D images was evaluated, using a custom-built phantom, in terms of resolution, linearity, uniformity and geometrical accuracy. Both the standard (SD, 13 s) and high definition (HD, 26 s) modes were evaluated, with the imaging parameters set to image the head (120 kVp, 100 mAs and 150 mAs, respectively). For quantitative noise characterization, the images were converted to Hounsfield units (HU) off-line. Measurement of the modulation transfer function revealed a limiting resolution (at 10% level) of 1.0 mm^-1 in the axial dimension. Image noise varied between 15 and 19 HU for the HD and SD modes, respectively. Image intensities varied linearly over the measured range, up to 1300 HU. Geometric accuracy was maintained in all three dimensions over the field of view. The present study has evaluated the performance characteristics of the O-arm, and demonstrates feasibility for use in interventional applications and quantitative imaging tasks outside those currently targeted by the manufacturer. Further improvements to the reconstruction algorithms may further enhance performance for lower-contrast applications.

  20. 48 CFR 36.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Performance evaluation. 36.604 Section 36.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL... Performance evaluation. See 42.1502(f) for the requirements for preparing past performance evaluations...

  1. Attribution Theory and Academic Library Performance Evaluation.

    ERIC Educational Resources Information Center

    Gedeon, Julie A.; Rubin, Richard E.

    1999-01-01

    Discusses problems with performance evaluations in academic libraries and examines attribution theory, a sociopsychological theory which helps explain how biases may arise in the performance-evaluation process and may be responsible for producing serious and unrecognized inequities. Considers fairness in performance evaluation and differential…

  2. 13 CFR 304.4 - Performance evaluations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Performance evaluations. 304.4... ECONOMIC DEVELOPMENT DISTRICTS § 304.4 Performance evaluations. (a) EDA shall evaluate the management... the District Organization continues to receive Investment Assistance. EDA's evaluation shall...

  3. Range sensors on marble surfaces: quantitative evaluation of artifacts

    NASA Astrophysics Data System (ADS)

    Guidi, Gabriele; Remondino, Fabio; Russo, Michele; Spinetti, Alessandro

    2009-08-01

    While 3D imaging systems are widely available and used, clear statements about the possible influence of material properties on the acquired geometrical data are still rather few. In particular, a material very often used in Cultural Heritage is marble, which is known to produce geometrical errors with range sensor technologies and whose reported magnitude seems to vary considerably across the literature. In this article, a thorough investigation of different types of active range sensors used on four types of marble surfaces has been performed. Two triangulation-based active sensors, employing laser stripe and white light pattern projection respectively, and one PW-TOF laser scanner were used in the experimentation. The analysis gave rather different results for the two categories of instruments. A negligible light penetration was found with the triangulation-based equipment (below 50 microns with the laser stripe and even less with the pattern projection device), while with the TOF system it was two orders of magnitude larger, quantitatively evidencing a source of systematic errors that any surveyor engaged in 3D scanning of Cultural Heritage sites and objects should take into account and correct.

  4. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easy applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easy distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
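
    Step 1 of the algorithm is a Cavalieri-style point-counting estimate, V = t * a_p * sum(P_i), where t is the section spacing, a_p the area associated with one grid point, and P_i the points hitting repair tissue on section i; a minimal sketch with hypothetical counts:

        # Cavalieri-style point-counting volume estimate (hypothetical values).
        t = 0.4          # mm between analysed sections
        a_p = 0.01       # mm^2 of tissue area per grid point
        points_per_section = [98, 134, 151, 142, 120, 77, 58, 46, 38, 32]

        volume = t * a_p * sum(points_per_section)
        print(f"estimated repair tissue volume = {volume:.2f} mm^3")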

  5. Energy performance evaluation of AAC

    NASA Astrophysics Data System (ADS)

    Aybek, Hulya

    The U.S. building industry constitutes the largest consumer of energy (i.e., electricity, natural gas, petroleum) in the world. The building sector uses almost 41 percent of the primary energy and approximately 72 percent of the available electricity in the United States. As global energy-generating resources are being depleted at exponential rates, the amount of energy consumed and wasted cannot be ignored. Professionals concerned about the environment have placed a high priority on finding solutions that reduce energy consumption while maintaining occupant comfort. Sustainable design and the judicious combination of building materials comprise one solution to this problem. A future including sustainable energy may result from using energy simulation software to accurately estimate energy consumption and from applying building materials that achieve the potential results derived through simulation analysis. Energy-modeling tools assist professionals with making informed decisions about energy performance during the early planning phases of a design project, such as determining the most advantageous combination of building materials, choosing mechanical systems, and determining building orientation on the site. By implementing energy simulation software to estimate the effect of these factors on the energy consumption of a building, designers can make adjustments to their designs during the design phase when the effect on cost is minimal. The primary objective of this research consisted of identifying a method with which to properly select energy-efficient building materials and involved evaluating the potential of these materials to earn LEED credits when properly applied to a structure. In addition, this objective included establishing a framework that provides suggestions for improvements to currently available simulation software that enhance the viability of the estimates concerning energy efficiency and the achievements of LEED credits. The primary objective

  6. Evaluating GC/MS Performance

    SciTech Connect

    Alcaraz, A; Dougan, A

    2006-11-26

    and Water Check': By selecting View - Diagnostics/Vacuum Control - Vacuum - Air and Water Check. A Yes/No dialogue box will appear; select No (use current values). It is very important to select No! Otherwise the tune values are drastically altered. The software program will generate a water/air report similar to figure 3. Evaluating the GC/MS system with a performance standard: This procedure should allow the analyst to verify that the chromatographic column and associated components are working adequately to separate the various classes of chemical compounds (e.g., hydrocarbons, alcohols, fatty acids, aromatics, etc.). Use the same GC/MS conditions used to collect the system background and solvent check (part 1 of this document). Figure 5 is an example of a commercial GC/MS column test mixture used to evaluate GC/MS prior to analysis.

  7. Aging rat vestibular ganglion: I. Quantitative light microscopic evaluation.

    PubMed

    Alidina, A; Lyon, M J

    1990-01-01

    This study was undertaken to quantify age-related changes in the rat vestibular ganglion. Cell number, diameter, and proximal-distal distribution based on size were evaluated. Serial 5-micron plastic sections of the vestibular ganglion from 15 female Wistar rats were examined. Rats were divided into three age groups: young (Y, 3 to 5 months, n = 5), old (O, 24 to 26 months, n = 3), and very old (VO, 28 to 31 months, n = 7). Quantitative analysis indicated no significant differences (P < .05) in the estimated number of ganglion cells (mean: Y = 1,690, O = 2,257, VO = 1,678), ganglion cell profile diameters (mean: Y = 22.5 microns, n = 2,886; O = 23.7 microns, n = 2,313; VO = 22.8 microns, n = 4,061), or proximal-distal localization (proximal: 22.3 microns, 24.4 microns, 22.7 microns; middle: 22.6 microns, 23.1 microns, 22.4 microns; distal: 23.3 microns, 23.4 microns, 23.7 microns; Y, O, and VO, respectively). When pooled, the old animals tended to have slightly larger cell profiles than the other groups. We noted a dramatic age-related increase of aging pigment within the ganglion cell profiles, making the old and very old animals easily distinguishable from the young. In most of the cell profiles, the aging pigment was more or less uniformly distributed throughout the cytoplasm. However, in some, aging pigment was accumulated at one pole of the cell profile. While no typical degenerating cellular profiles were found in any of the sections, several of the ganglion cell profiles from the old animals revealed dense cytoplasm, possibly indicating an early stage of degeneration. PMID:2382785

  8. Dual-band infrared thermography for quantitative nondestructive evaluation

    SciTech Connect

    Durbin, P.F.; Del Grande, N.K.; Dolan, K.W.; Perkins, D.E.; Shapiro, A.B.

    1993-04-01

    The authors have developed dual-band infrared (DBIR) thermography that is being applied to quantitative nondestructive evaluation (NDE) of aging aircraft. The DBIR technique resolves 0.2 °C surface temperature differences for inspecting interior flaws in heated aircraft structures. It locates cracks, corrosion sites, disbonds or delaminations in metallic laps and composite patches. By removing clutter from surface roughness effects, the authors clarify interpretation of subsurface flaws. To accomplish this, the authors ratio images recorded at two infrared bands, centered near 5 microns and 10 microns. These image ratios are used to decouple temperature patterns associated with interior flaw sites from spatially varying surface emissivity noise. They also discuss three-dimensional (3D) dynamic thermal imaging of structural flaws using DBIR computed tomography. Conventional thermography provides single-band infrared images which are difficult to interpret. Standard procedures yield imprecise (or qualitative) information about subsurface flaw sites which are typically masked by surface clutter. They use a DBIR imaging technique pioneered at LLNL to capture the time history of surface temperature difference patterns for flash-heated targets. They relate these patterns to the location, size, shape and depth of subsurface flaws. They have demonstrated temperature accuracies of 0.2 °C, timing synchronization of 3 ms (after onset of heat flash), and intervals of 42 ms between images, during an 8 s cooling (and heating) interval characterizing the front (and back) surface temperature-time history of an epoxy-glue disbond site in a flash-heated aluminum lap joint.
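
    The core of the DBIR idea is that gray-body emissivity variations multiply both bands similarly, so a pixel-wise ratio of co-registered band images suppresses surface clutter while preserving temperature-driven contrast. A deliberately simplified sketch of that ratio step (not the authors' processing chain):

        import numpy as np

        def dbir_ratio(img_5um, img_10um, eps=1e-6):
            """Pixel-wise ratio of co-registered 5-um and 10-um band images;
            eps guards against division by zero."""
            return np.asarray(img_5um, float) / (np.asarray(img_10um, float) + eps)

        # Hypothetical co-registered band images of a flash-heated surface.
        ratio = dbir_ratio(np.full((4, 4), 0.8), np.full((4, 4), 0.4))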

  9. Quantitative comparison between crowd models for evacuation planning and evaluation

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vaisagh; Lee, Chong Eu; Lees, Michael Harold; Cheong, Siew Ann; Sloot, Peter M. A.

    2014-02-01

    Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we describe a procedure to quantitatively compare different crowd models or between models and real-world data. We simulated three models: (1) the lattice gas model, (2) the social force model, and (3) the RVO2 model, and obtained the distributions of six observables: (1) evacuation time, (2) zoned evacuation time, (3) passage density, (4) total distance traveled, (5) inconvenience, and (6) flow rate. We then used the DISTATIS procedure to compute the compromise matrix of statistical distances between the three models. Projecting the three models onto the first two principal components of the compromise matrix, we find the lattice gas and RVO2 models are similar in terms of the evacuation time, passage density, and flow rates, whereas the social force and RVO2 models are similar in terms of the total distance traveled. Most importantly, we find the zoned evacuation times of the three models to be very different from each other. Thus we propose to use this variable, if it can be measured, as the key test between different models, and also between models and the real world. Finally, we compared the model flow rates against the flow rate of an emergency evacuation during the May 2008 Sichuan earthquake, and found the social force model agrees best with this real data.
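
    As a lightweight stand-in for the DISTATIS compromise analysis, pairwise statistical distances between models can be computed per observable, e.g. Kolmogorov-Smirnov distances on evacuation times; all samples below are invented, not outputs of the three models.

        import numpy as np
        from scipy.stats import ks_2samp

        # Hypothetical evacuation-time samples (s) from three crowd models.
        rng = np.random.default_rng(42)
        samples = {"lattice_gas": rng.normal(120, 15, 500),
                   "social_force": rng.normal(135, 20, 500),
                   "rvo2": rng.normal(122, 14, 500)}

        names = list(samples)
        dist = np.zeros((3, 3))
        for i in range(3):
            for j in range(3):
                dist[i, j] = ks_2samp(samples[names[i]], samples[names[j]]).statistic
        print(names)
        print(dist.round(2))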

  10. How To Evaluate Teacher Performance.

    ERIC Educational Resources Information Center

    Wilson, Laval S.

    Teacher evaluations tend to be like clothes. Whatever is in vogue at the time is utilized extensively by those who are attempting to remain modern and current. If you stay around long enough, the "hot" methods of today will probably recycle to be the new discovery of the future. In the end, each school district develops an evaluation process that…

  11. Using hybrid method to evaluate the green performance in uncertainty.

    PubMed

    Tseng, Ming-Lang; Lan, Lawrence W; Wang, Ray; Chiu, Anthony; Cheng, Hui-Ping

    2011-04-01

    Green performance measurement is vital for enterprises making continuous improvements to maintain sustainable competitive advantages. Evaluation of green performance, however, is a challenging task due to the complex interdependence of the aspects and criteria and the linguistic vagueness of qualitative information mixed with quantitative data. To deal with this issue, this study proposes a novel approach to evaluate the interdependent aspects and criteria of a firm's green performance. The rationale of the proposed approach, namely the green network balanced scorecard, is to use the balanced scorecard to combine fuzzy set theory with the analytical network process (ANP) and importance-performance analysis (IPA) methods, wherein fuzzy set theory accounts for the linguistic vagueness of qualitative criteria and ANP converts the relations among the interdependent aspects and criteria into an intelligible structural model used in IPA. For the empirical case study, four interdependent aspects and 34 green performance criteria for PCB firms in Taiwan were evaluated. The managerial implications are discussed. PMID:20571885

  12. Quantitative polarization and flow evaluation of choroid and sclera by multifunctional Jones matrix optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Sugiyama, S.; Hong, Y.-J.; Kasaragod, D.; Makita, S.; Miura, M.; Ikuno, Y.; Yasuno, Y.

    2016-03-01

    Quantitative evaluation of the optical properties of the choroid and sclera is performed by multifunctional optical coherence tomography. Five normal eyes, five glaucoma eyes, and one choroidal atrophy eye were examined. Among normal eyes, refractive error was found to be correlated with choroidal birefringence, polarization uniformity, and flow, in addition to scleral birefringence. Significant differences were observed between the normal and glaucoma eyes in choroidal polarization uniformity, flow, and scleral birefringence. An automatic segmentation algorithm for the retinal pigment epithelium and chorioscleral interface based on multifunctional signals is also presented.

  13. Quantitative performance characterization of three-dimensional noncontact fluorescence molecular tomography.

    PubMed

    Favicchio, Rosy; Psycharakis, Stylianos; Schönig, Kai; Bartsch, Dusan; Mamalaki, Clio; Papamatheakis, Joseph; Ripoll, Jorge; Zacharakis, Giannis

    2016-02-01

    Fluorescent proteins and dyes are routine tools for biological research to describe the behavior of genes, proteins, and cells, as well as more complex physiological dynamics such as vessel permeability and pharmacokinetics. The use of these probes in whole body in vivo imaging would allow extending the range and scope of current biomedical applications and would be of great interest. In order to comply with a wide variety of application demands, in vivo imaging platform requirements span from wide spectral coverage to precise quantification capabilities. Fluorescence molecular tomography (FMT) detects and reconstructs in three dimensions the distribution of a fluorophore in vivo. Noncontact FMT allows fast scanning of an excitation source and noninvasive measurement of emitted fluorescent light using a virtual array detector operating in free space. Here, a rigorous process is defined that fully characterizes the performance of a custom-built horizontal noncontact FMT setup. Dynamic range, sensitivity, and quantitative accuracy across the visible spectrum were evaluated using fluorophores with emissions between 520 and 660 nm. These results demonstrate that high-performance quantitative three-dimensional visible light FMT allowed the detection of challenging mesenteric lymph nodes in vivo and the comparison of spectrally distinct fluorescent reporters in cell culture. PMID:26891600

  14. Quantitative performance characterization of three-dimensional noncontact fluorescence molecular tomography

    NASA Astrophysics Data System (ADS)

    Favicchio, Rosy; Psycharakis, Stylianos; Schönig, Kai; Bartsch, Dusan; Mamalaki, Clio; Papamatheakis, Joseph; Ripoll, Jorge; Zacharakis, Giannis

    2016-02-01

    Fluorescent proteins and dyes are routine tools for biological research to describe the behavior of genes, proteins, and cells, as well as more complex physiological dynamics such as vessel permeability and pharmacokinetics. Using these probes in whole-body in vivo imaging would extend the range and scope of current biomedical applications and would be of great interest. In order to comply with a wide variety of application demands, in vivo imaging platform requirements span from wide spectral coverage to precise quantification capabilities. Fluorescence molecular tomography (FMT) detects and reconstructs in three dimensions the distribution of a fluorophore in vivo. Noncontact FMT allows fast scanning of an excitation source and noninvasive measurement of emitted fluorescent light using a virtual array detector operating in free space. Here, a rigorous process is defined that fully characterizes the performance of a custom-built horizontal noncontact FMT setup. Dynamic range, sensitivity, and quantitative accuracy across the visible spectrum were evaluated using fluorophores with emissions between 520 and 660 nm. These results demonstrate that high-performance quantitative three-dimensional visible light FMT allowed the detection of challenging mesenteric lymph nodes in vivo and the comparison of spectrally distinct fluorescent reporters in cell culture.

  15. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONTRACTING REQUIREMENTS CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 2936.604 Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor... reports must be made using Standard Form 1421, Performance Evaluation (Architect-Engineer) as...

  16. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CONTRACTING REQUIREMENTS CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 2936.604 Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor... reports must be made using Standard Form 1421, Performance Evaluation (Architect-Engineer) as...

  17. 48 CFR 2936.604 - Performance evaluation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CONTRACTING REQUIREMENTS CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 2936.604 Performance evaluation. (a) The HCA must establish procedures to evaluate architect-engineer contractor... reports must be made using Standard Form 1421, Performance Evaluation (Architect-Engineer) as...

  18. Quantitative evaluation of noise reduction and vesselness filters for liver vessel segmentation on abdominal CTA images

    NASA Astrophysics Data System (ADS)

    Luu, Ha Manh; Klink, Camiel; Moelker, Adriaan; Niessen, Wiro; van Walsum, Theo

    2015-05-01

    Liver vessel segmentation in CTA images is a challenging task, especially for noisy images. This paper investigates whether pre-filtering improves liver vessel segmentation in 3D CTA images. We introduce a quantitative evaluation of several well-known filters based on a proposed liver vessel segmentation method for CTA images. We compare the effect of different diffusion techniques, i.e. Regularized Perona-Malik, Hybrid Diffusion with Continuous Switch, and Vessel Enhancing Diffusion, as well as the vesselness approaches proposed by Sato, Frangi, and Erdt. Liver vessel segmentation of the pre-processed images is performed using histogram-based region growing with local maxima as seed points. Quantitative measurements (sensitivity, specificity and accuracy) are determined based on manual landmarks inside and outside the vessels, followed by t-tests for statistical comparisons on 51 clinical CTA images. The evaluation demonstrates that all the filters yield significantly higher liver vessel segmentation accuracy than no filtering (p < 0.05), with Hybrid Diffusion with Continuous Switch achieving the best performance. Compared to the diffusion filters, vesselness filters have greater sensitivity but lower specificity. In addition, the proposed liver vessel segmentation method with pre-filtering is shown to perform robustly on a clinical dataset with low contrast-to-noise ratios (up to 3 dB). The results indicate that the pre-filtering step significantly improves liver vessel segmentation on 3D CTA images.
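
    The landmark-based evaluation described above reduces to a confusion-matrix computation. A minimal sketch, with hypothetical landmark labels rather than the study's data:

    ```python
    import numpy as np

    # Hypothetical landmark labels: 1 = inside a vessel, 0 = outside.
    truth = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])
    # Segmentation result sampled at the same landmark positions.
    seg = np.array([1, 1, 0, 0, 0, 1, 1, 0, 1, 0])

    tp = np.sum((seg == 1) & (truth == 1))
    tn = np.sum((seg == 0) & (truth == 0))
    fp = np.sum((seg == 1) & (truth == 0))
    fn = np.sum((seg == 0) & (truth == 1))

    sensitivity = tp / (tp + fn)  # vessel landmarks correctly detected
    specificity = tn / (tn + fp)  # background landmarks correctly rejected
    accuracy = (tp + tn) / truth.size
    print(sensitivity, specificity, accuracy)
    ```

    A paired t-test over per-image accuracies with and without pre-filtering (e.g., scipy.stats.ttest_rel) would then support the significance comparison reported above.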

  19. Segmentation and quantitative evaluation of brain MRI data with a multiphase 3D implicit deformable model

    NASA Astrophysics Data System (ADS)

    Angelini, Elsa D.; Song, Ting; Mensh, Brett D.; Laine, Andrew

    2004-05-01

    Segmentation of three-dimensional anatomical brain images into tissue classes has applications in both clinical and research settings. This paper presents the implementation and quantitative evaluation of a four-phase three-dimensional active contour implemented with a level set framework for automated segmentation of brain MRIs. The segmentation algorithm performs an optimal partitioning of three-dimensional data based on homogeneity measures that naturally evolves to the extraction of different tissue types in the brain. Random seed initialization was used to speed up numerical computation and avoid the need for a priori information. This random initialization ensures robustness of the method to variations in user expertise, biased a priori information, and errors in input information that could be influenced by variations in image quality. Experimentation on three MRI brain data sets showed that an optimal partitioning successfully labeled regions that accurately identified white matter, gray matter and cerebrospinal fluid in the ventricles. Quantitative evaluation of the segmentation was performed through comparison with manually labeled data, computing false positive and false negative voxel assignments for the three tissue classes. We report high accuracy for the two comparison cases. These results demonstrate the efficiency and flexibility of this segmentation framework to perform the challenging task of automatically extracting brain tissue volume contours.
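
    The voxel-level comparison against manual labels can be sketched as follows; the label encoding and synthetic volumes are illustrative, not the paper's data:

    ```python
    import numpy as np

    def fp_fn_fractions(auto, manual, label):
        """False positive/negative voxel fractions for one tissue class."""
        a, m = auto == label, manual == label
        fp = np.sum(a & ~m) / max(m.sum(), 1)  # voxels wrongly assigned
        fn = np.sum(~a & m) / max(m.sum(), 1)  # voxels wrongly missed
        return fp, fn

    # Hypothetical label volumes: 0 = background, 1 = WM, 2 = GM, 3 = CSF.
    rng = np.random.default_rng(0)
    manual = rng.integers(0, 4, size=(16, 16, 16))
    auto = manual.copy()
    gm = manual == 2                           # mislabel ~10% of GM as WM
    auto[gm] = np.where(rng.random(gm.sum()) < 0.1, 1, 2)

    for label, name in [(1, "WM"), (2, "GM"), (3, "CSF")]:
        print(name, fp_fn_fractions(auto, manual, label))
    ```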

  20. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder

    PubMed Central

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-01

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7–11 years (27 males, six females) and twenty-five adult participants aged 21–29 years (19 males, six females) participated in our experiments. Our results suggested that the pronation and supination function in children with ADHD tends to lag behind that of typically developing children by several years. From these results, our system shows potential to objectively evaluate the neurodevelopmental delay of children with ADHD. PMID:26797613

  1. A quantitative evaluation of various deconvolution methods and their applications in the deconvolution of plasma spectra

    NASA Astrophysics Data System (ADS)

    Xiong, Yanwei; Shi, Yuejiang; Li, Yingying; Fu, Jia; Lu, Bo; Zhang, Hongming; Wang, Xiaoguang; Wang, Fudi; Shen, Yongcai

    2013-06-01

    A quantitative evaluation of various deconvolution methods and their applications in processing plasma emitted spectra was performed. The iterative deconvolution algorithms evaluated here include Jansson's method, the Richardson-Lucy method, the maximum a posteriori method, and Gold's method. The evaluation criteria include minimization of the sum of squared errors and of the sum of squared relative errors of parameters, and the rate of convergence. After comparing deconvolved results from these methods, it was concluded that Jansson's and Gold's methods were able to provide good profiles that are visually close to the original spectra. Additionally, Gold's method generally gives the best results when considering all the criteria above. Applications of these methods to actual plasma spectra obtained from the EAST tokamak are also presented in this paper. The deconvolution results with Gold's and Jansson's methods show that instrumental effects can be satisfactorily eliminated and clear spectra recovered.
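
    For reference, the Richardson-Lucy iteration, one of the four methods compared, is compact enough to sketch in a few lines; the Gaussian instrument function and the synthetic two-line spectrum below are illustrative, not EAST data:

    ```python
    import numpy as np

    def richardson_lucy(measured, psf, iterations=50):
        """1-D Richardson-Lucy deconvolution for nonnegative signals."""
        estimate = np.full_like(measured, measured.mean())
        psf_mirror = psf[::-1]
        for _ in range(iterations):
            blurred = np.convolve(estimate, psf, mode="same")
            ratio = measured / np.maximum(blurred, 1e-12)
            estimate = estimate * np.convolve(ratio, psf_mirror, mode="same")
        return estimate

    # Synthetic spectrum: two lines blurred by a Gaussian instrument function.
    x = np.linspace(-5.0, 5.0, 201)
    truth = np.exp(-((x - 1.0) / 0.1) ** 2) + 0.5 * np.exp(-((x + 1.0) / 0.1) ** 2)
    psf = np.exp(-(x / 0.5) ** 2)
    psf /= psf.sum()
    measured = np.convolve(truth, psf, mode="same")
    recovered = richardson_lucy(measured, psf)
    ```

    On such synthetic data, the sum-of-squared-errors criterion is simply np.sum((recovered - truth) ** 2), and tracking it per iteration exposes the rate of convergence.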

  2. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder.

    PubMed

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-01

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty three children with ADHD aged 7-11 years (27 males, six females) and twenty five adults participants aged 21-29 years old (19 males, six females) participated in our experiments. Our results suggested that the pronation and supination function in children with ADHD has a tendency to lag behind that of typically developing children by several years. From these results, our system has a possibility to objectively evaluate the neurodevelopmental delay of children with ADHD. PMID:26797613

  3. 48 CFR 236.604 - Performance evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 236.604 Performance evaluation. (a) Preparation of performance reports. Use DD Form 2631, Performance Evaluation (Architect-Engineer), instead of SF 1421. (2) Prepare a...

  4. Quantitative evaluation of stiffness of commercial suture materials.

    PubMed

    Chu, C C; Kizil, Z

    1989-03-01

    The bending stiffness of 22 commercial suture materials of varying size, chemical structure, and physical form was quantitatively evaluated using a stiffness tester (Taber V-5, model 150B, Teledyne). The commercial sutures were Chromic catgut; Dexon (polyglycolic acid); Vicryl (polyglactin 910); PDS (polydioxanone); Maxon (polyglycolide-trimethylene carbonate); Silk (coated with silicone); Mersilene (polyester fiber); Tycron (polyester fiber); Ethibond (polyethylene terephthalate coated with polybutylene); Nurolon (nylon 66); Surgilon (nylon 66 coated with silicone); Ethilon (coated nylon 66); Prolene (polypropylene); Dermalene (polyethylene); and Gore-tex (polytetrafluoroethylene). These include natural and synthetic, absorbable and nonabsorbable, and monofilament and multifilament sutures. All of these sutures were size 2-0, but Prolene sutures with sizes ranging from 1-0 to 9-0 were also tested to determine the effect of suture size on stiffness. The data showed a wide range of bending stiffness among the 22 commercial sutures. The most flexible 2-0 suture was Gore-tex, followed by Dexon, Silk, Surgilon, Vicryl (uncoated), Tycron, Nurolon, Mersilene, Ethibond, Maxon, PDS, Ethilon, Prolene, Chromic catgut, coated Vicryl, and lastly, Dermalene. The large porous volume inherent in the Gore-tex monofilament suture was the reason for its lowest flexural stiffness. Sutures with a braided structure were generally more flexible than those with a monofilament structure, irrespective of the chemical constituents. Coated sutures had significantly higher stiffness than the corresponding uncoated ones, particularly when polymers rather than wax were used as the coating material. This increase in stiffness is attributable to the loss of mobility under bending force in the fibers and yarns that make up the sutures. An increase in the size of the suture significantly increased the stiffness, and the magnitude of increase

  5. Quantitative Evaluation of Atherosclerotic Plaque Using Ultrasound Tissue Characterization.

    NASA Astrophysics Data System (ADS)

    Yigiter, Ersin

    Evaluation of therapeutic methods directed toward interrupting and/or delaying atherogenesis is impeded by the lack of a reliable, non-invasive means for monitoring progression or regression of disease. The ability to characterize the predominant component of plaque may be very valuable in the study of this disease's natural history. The earlier the lesion, the more likely lipid is to be the predominant component. Progression of plaque is usually by way of overgrowth of fibrous tissues around the fatty pool. Calcification is usually a feature of the older or complicated lesion. To explore the feasibility of using ultrasound to characterize plaque, we have conducted measurements of the acoustical properties of various atherosclerotic lesions found in freshly excised samples of human abdominal aorta. Our objective has been to determine whether or not the acoustical properties of plaque correlate with the type and/or chemical composition of plaque and, if so, to define a measurement scheme which could be done in-vivo and non-invasively. Our current database consists of individual tissue samples from some 200 different aortas. Since each aorta yields between 10 and 30 tissue samples for study, we have data on some 4,468 different lesions or samples. Measurements of the acoustical properties of plaque were found to correlate well with the chemical composition of plaque. In short, measurements of impedance and attenuation seem sufficient to classify plaque as to type and to composition. Based on the in-vitro studies, the parameter of attenuation was selected as a means of classifying the plaque. For these measurements, an intravascular ultrasound scanner was modified according to our specifications. Signal processing algorithms were developed which would analyze the complex ultrasound waveforms and estimate tissue properties such as attenuation. Various methods were tried to estimate the attenuation from the pulse-echo backscattered signal. Best results were obtained by

  6. Quantitative, Notional, and Comprehensive Evaluations of Spontaneous Engaged Speech

    ERIC Educational Resources Information Center

    Molholt, Garry; Cabrera, Maria Jose; Kumar, V. K.; Thompsen, Philip

    2011-01-01

    This study provides specific evidence regarding the extent to which quantitative measures, common sense notional measures, and comprehensive measures adequately characterize spontaneous, although engaged, speech. As such, the study contributes to the growing body of literature describing the current limits of automatic systems for evaluating…

  7. QUANTITATIVE GENETIC ACTIVITY GRAPHICAL PROFILES FOR USE IN CHEMICAL EVALUATION

    EPA Science Inventory

    A graphic approach termed a Genetic Activity Profile (GAP) has been developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each...

  8. INTEGRATED WATER TREATMENT SYSTEM PERFORMANCE EVALUATION

    SciTech Connect

    SEXTON RA; MEEUWSEN WE

    2009-03-12

    This document describes the results of an evaluation of the current Integrated Water Treatment System (IWTS) operation against design performance and a determination of short term and long term actions recommended to sustain IWTS performance.

  9. A new performance evaluation tool

    SciTech Connect

    Kindl, F.H.

    1996-12-31

    The paper describes a Steam Cycle Diagnostic Program (SCDP) that has been specifically designed to respond to the increasing need of electric power generators for periodic performance monitoring and quick identification of the causes of any observed increase in fuel consumption. There is a description of program objectives, modeling and test data inputs, results, underlying program logic, validation of program accuracy by comparison with acceptance-test-quality data, and examples of program usage.

  10. Evaluation of chemotherapy response in ovarian cancer treatment using quantitative CT image biomarkers: a preliminary study

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Tan, Maxine; McMeekin, Scott; Thai, Theresa; Moore, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2015-03-01

    The purpose of this study is to identify and apply quantitative image biomarkers for early prediction of tumor response to chemotherapy among ovarian cancer patients participating in clinical trials of new drugs. In the experiment, we retrospectively selected 30 cases from patients who participated in Phase I clinical trials of new drugs or drug agents for ovarian cancer treatment. Each case comprises two sets of CT images acquired pre- and post-treatment (4-6 weeks after starting treatment). A computer-aided detection (CAD) scheme was developed to extract and analyze the quantitative image features of the metastatic tumors previously tracked by the radiologists using the standard Response Evaluation Criteria in Solid Tumors (RECIST) guideline. The CAD scheme first segmented 3-D tumor volumes from the background using a hybrid tumor segmentation scheme. Then, for each segmented tumor, CAD computed three quantitative image features: the change in tumor volume, tumor CT number (density), and density variance. The feature changes were calculated between the matched tumors tracked on the CT images acquired pre- and post-treatment. Finally, CAD predicted each patient's 6-month progression-free survival (PFS) using a decision-tree-based classifier. The performance of the CAD scheme was compared with the RECIST category. The results show that the CAD scheme achieved a prediction accuracy of 76.7% (23/30 cases) with a Kappa coefficient of 0.493, significantly higher than RECIST prediction, which had an accuracy of 60% (17/30) and a Kappa coefficient of 0.062. This study demonstrated the feasibility of analyzing quantitative image features to improve early prediction of tumor response to new drugs or therapeutic methods in ovarian cancer patients.
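
    The agreement statistic used above, Cohen's kappa, corrects raw accuracy for chance agreement. A minimal sketch on hypothetical binary predictions of 6-month PFS (not the study's data):

    ```python
    import numpy as np

    def cohens_kappa(pred, truth):
        """Cohen's kappa for binary ratings: chance-corrected agreement."""
        pred, truth = np.asarray(pred), np.asarray(truth)
        po = np.mean(pred == truth)                      # observed agreement
        pe = (pred.mean() * truth.mean()
              + (1 - pred.mean()) * (1 - truth.mean()))  # chance agreement
        return (po - pe) / (1 - pe)

    # Hypothetical predictions (1 = progression-free at 6 months) vs. outcomes.
    cad_pred = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])
    outcome = np.array([1, 0, 0, 1, 0, 1, 1, 1, 1, 1])
    print(cohens_kappa(cad_pred, outcome))
    ```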

  11. Technology efficacy in active prosthetic knees for transfemoral amputees: a quantitative evaluation.

    PubMed

    El-Sayed, Amr M; Hamzaid, Nur Azah; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prostheses. Other studies have examined amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights, that is, a technical examination of the components used, with an evaluation of how these improved the gait of their users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prostheses with respect to functional walking performance among above-knee amputee users, to evaluate each system's efficacy in producing close-to-normal user performance. The performances of the actuator, sensory system, and control technique incorporated in each reported system were evaluated separately, and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, the conclusions have limited generalizability due to the small number of studies. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent of each component's contribution to the most functional development. PMID:25110727

  12. Technology Efficacy in Active Prosthetic Knees for Transfemoral Amputees: A Quantitative Evaluation

    PubMed Central

    El-Sayed, Amr M.; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prostheses. Other studies have examined amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights, that is, a technical examination of the components used, with an evaluation of how these improved the gait of their users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prostheses with respect to functional walking performance among above-knee amputee users, to evaluate each system's efficacy in producing close-to-normal user performance. The performances of the actuator, sensory system, and control technique incorporated in each reported system were evaluated separately, and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, the conclusions have limited generalizability due to the small number of studies. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent of each component's contribution to the most functional development. PMID:25110727

  13. S-191 sensor performance evaluation

    NASA Technical Reports Server (NTRS)

    Hughes, C. L.

    1975-01-01

    A final analysis was performed on the Skylab S-191 spectrometer data received from missions SL-2, SL-3, and SL-4. The repeatability and accuracy of the S-191 spectroradiometric internal calibration was determined by correlation to the output obtained from well-defined external targets. These included targets on the moon and earth as well as deep space. In addition, the accuracy of the S-191 short wavelength autocalibration was flight-checked by correlating the Earth Resources Experiment Package S-191 outputs with the Backup Unit S-191 outputs after viewing selected targets on the moon.

  14. Steroidomic Footprinting Based on Ultra-High Performance Liquid Chromatography Coupled with Qualitative and Quantitative High-Resolution Mass Spectrometry for the Evaluation of Endocrine Disrupting Chemicals in H295R Cells.

    PubMed

    Tonoli, David; Fürstenberger, Cornelia; Boccard, Julien; Hochstrasser, Denis; Jeanneret, Fabienne; Odermatt, Alex; Rudaz, Serge

    2015-05-18

    The screening of endocrine disrupting chemicals (EDCs) that may alter steroidogenesis represents a highly important field, mainly due to the numerous pathologies, such as cancer, diabetes, obesity, osteoporosis, and infertility, that have been related to impaired steroid-mediated regulation. The adrenal H295R cell model has been validated for the study of steroidogenesis by the Organization for Economic Co-operation and Development (OECD) guideline. However, this guideline focuses solely on testosterone and estradiol monitoring, hormones not typically produced by the adrenals, hence limiting possible in-depth mechanistic investigations. The present work proposes an untargeted steroidomic footprinting workflow based on ultra-high pressure liquid chromatography (UHPLC) coupled to high-resolution MS for the screening and mechanistic investigation of EDCs in H295R cell supernatants. A suspected EDC, triclocarban (TCC), used in detergents, cosmetics, and personal care products, was selected to demonstrate the efficiency of the reported methodology, which allows the simultaneous assessment of a steroidomic footprint and quantification of a selected subset of steroids in a single analysis. The effects of exposure to increasing TCC concentrations were assessed, and feature selection with database matching followed by multivariate analysis identified the most salient affected steroids. Using correlation analysis, 11 steroids were associated with high, 18 with medium, and 8 with relatively low sensitivity to TCC. Among the candidates, 13 identified steroids were simultaneously quantified, leading to the evaluation and localization of the disruption of steroidogenesis caused by TCC upstream of the formation of pregnenolone. The remaining candidates could be associated with a specific steroid class (progestogens and corticosteroids, or androgens) and represent a specific footprint of steroidogenesis disruption by TCC. This strategy was devised to be

  15. [Quantitative evaluation of the nitroblue tetrazolium reduction test].

    PubMed

    Vagner, V K; Nasonkin, O S; Boriskina, N D

    1989-01-01

    The results of the NBT test were assessed by the visual cytochemical method and by a quantitative spectrophotometric technique developed by the authors for the NBT test. The results demonstrate the higher sensitivity and informative value of the new method when neutrophil tetrazolium activity is relatively high; this recommends the spectrophotometric variant of the NBT test for wide clinical application in studies of the functional and metabolic activity of blood leukocytes. PMID:2483198

  16. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)

    SciTech Connect

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed, including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  17. Evaluation of solar pond performance

    SciTech Connect

    Wittenberg, L.J.

    1980-01-01

    The City of Miamisburg, Ohio, constructed during 1978 a large, salt-gradient solar pond as part of its community park development project. The thermal energy stored in the pond is being used to heat an outdoor swimming pool in the summer and an adjacent recreational building during part of the winter. This solar pond, which occupies an area of 2020 m² (22,000 sq ft), was designed from experience obtained at smaller research ponds located at Ohio State University, the University of New Mexico and similar ponds operated in Israel. During the summer of 1979, the initial heat (40,000 kWh, 136 million Btu) was withdrawn from the solar pond to heat the outdoor swimming pool. All of the data collection systems were installed and functioned as designed so that operational data were obtained. The observed performance of the pond was compared with several of the predicted models for this type of pond. (MHR)

  18. Early Prediction and Evaluation of Breast Cancer Response to Neoadjuvant Chemotherapy Using Quantitative DCE-MRI.

    PubMed

    Tudorica, Alina; Oh, Karen Y; Chui, Stephen Y-C; Roy, Nicole; Troxell, Megan L; Naik, Arpana; Kemmer, Kathleen A; Chen, Yiyi; Holtorf, Megan L; Afzal, Aneela; Springer, Charles S; Li, Xin; Huang, Wei

    2016-02-01

    The purpose is to compare quantitative dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) metrics with imaging tumor size for early prediction of breast cancer response to neoadjuvant chemotherapy (NACT) and evaluation of residual cancer burden (RCB). Twenty-eight patients with 29 primary breast tumors underwent DCE-MRI exams before, after one cycle of, at the midpoint of, and after NACT. MRI tumor size in the longest diameter (LD) was measured according to the RECIST (Response Evaluation Criteria In Solid Tumors) guidelines. Pharmacokinetic analyses of DCE-MRI data were performed with the standard Tofts and Shutter-Speed models (TM and SSM). After one NACT cycle, the percent changes of the DCE-MRI parameters K(trans) (contrast agent plasma/interstitium transfer rate constant), ve (extravascular and extracellular volume fraction), kep (intravasation rate constant), and the SSM-unique τi (mean intracellular water lifetime) are good to excellent early predictors of pathologic complete response (pCR) vs. non-pCR, with univariate logistic regression C statistic values in the range of 0.804 to 0.967. ve values after one cycle and at the NACT midpoint are also good predictors of response, with C ranging from 0.845 to 0.897. However, RECIST LD changes are poor predictors, with C = 0.609 and 0.673, respectively. Post-NACT K(trans), τi, and RECIST LD show statistically significant (P < .05) correlations with RCB. The performances of the TM and SSM analyses for early prediction of response and RCB evaluation are comparable. In conclusion, quantitative DCE-MRI parameters are superior to imaging tumor size for early prediction of therapy response. Both TM and SSM analyses are effective for therapy response evaluation. However, the τi parameter, derived only with SSM analysis, offers the unique opportunity to potentially quantify therapy-induced changes in tumor energetic metabolism. PMID:26947876
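
    For orientation, the standard Tofts model underlying these parameters writes the tissue concentration as Ct(t) = Ktrans ∫ Cp(τ) exp(-kep(t - τ)) dτ. A numerical sketch with an illustrative biexponential arterial input function (not the study's data):

    ```python
    import numpy as np

    def tofts_ct(t, cp, ktrans, kep):
        """Standard Tofts model via discrete causal convolution."""
        dt = t[1] - t[0]
        kernel = np.exp(-kep * t)
        return ktrans * np.convolve(cp, kernel)[: t.size] * dt

    t = np.arange(0.0, 5.0, 0.01)                     # minutes
    cp = 5.0 * (np.exp(-0.3 * t) - np.exp(-3.0 * t))  # illustrative AIF, mM
    ct = tofts_ct(t, cp, ktrans=0.25, kep=0.6)        # Ktrans, kep in 1/min
    ```

    Fitting Ktrans and kep to measured concentration curves (e.g., with scipy.optimize.curve_fit) yields the parameters whose early percent changes serve as the predictors above.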

  19. Theory and Practice on Teacher Performance Evaluation

    ERIC Educational Resources Information Center

    Yonghong, Cai; Chongde, Lin

    2006-01-01

    Teacher performance evaluation plays a key role in educational personnel reform, so it has been an important yet difficult issue in educational reform. Previous evaluations of teachers failed to make a strict distinction among the three dominant types of evaluation, namely, capability, achievement, and effectiveness. Moreover, teacher performance…

  20. A Teacher's Guide to Teaching Performance Evaluation.

    ERIC Educational Resources Information Center

    Armstrong, Harold R.

    What is popularly known in teacher evaluation as "the Redfern Approach" has emerged from almost two decades of experimentation and discussion. This approach involves setting performance standards and job targets, monitoring the data, evaluating, holding the evaluation conference, and related follow-up activities. This guide is intended to fill a gap in…

  1. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment diagnosed painful TMD with a sensitivity of 58.8% and a specificity of 69.3%. The diagnostic ability of the visual analysis was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis thus showed diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456

  2. Quantitative performance of a quadrupole-orbitrap-MS in targeted LC-MS determinations of small molecules.

    PubMed

    Grund, Baptiste; Marvin, Laure; Rochat, Bertrand

    2016-05-30

    High-resolution mass spectrometry (HRMS) has been associated with qualitative and research analysis, and QQQ-MS with quantitative and routine analysis. This view is now challenged, and for this reason we have evaluated the quantitative LC-MS performance of a new high-resolution mass spectrometer (HRMS), a Q-orbitrap-MS, and compared the results with those of a recent triple-quadrupole MS (QQQ-MS). High-resolution full-scan (HR-FS) and MS/MS acquisitions have been tested with real plasma extracts or pure standards. Limits of detection, dynamic range, mass accuracy and false positive or false negative detections have been determined or investigated with protease inhibitors, tyrosine kinase inhibitors, steroids and metanephrines. Our quantitative results show that today's available HRMS instruments are reliable and sensitive quantitative instruments with performance comparable to that of QQQ-MS. Taking into account their versatility, user-friendliness and robustness, we believe that HRMS should be seen more and more as key instruments in quantitative LC-MS analyses. In this scenario, most targeted LC-HRMS analyses should be performed by HR-FS recording virtually "all" ions. In addition to absolute quantifications, HR-FS will allow the relative quantification of hundreds of metabolites in plasma, revealing an individual's metabolome and exposome. This phenotyping of known metabolites should promote HRMS in the clinical environment. A few other LC-HRMS analyses should be performed in single-ion-monitoring or MS/MS mode when increased sensitivity and/or detection selectivity is necessary. PMID:26928213

  3. Quantitative Guidance for Stove Usage and Performance to Achieve Health and Environmental Targets

    PubMed Central

    Chiang, Ranyee A.

    2015-01-01

    Background: Displacing the use of polluting and inefficient cookstoves in developing countries is necessary to achieve the potential health and environmental benefits sought through clean cooking solutions. Yet little quantitative context has been provided on how much displacement of traditional technologies is needed to achieve targets for household air pollutant concentrations or fuel savings. Objectives: This paper provides instructive guidance on the usage of cooking technologies required to achieve health and environmental improvements. Methods: We evaluated different scenarios of displacement of traditional stoves with use of higher performing technologies. The air quality and fuel consumption impacts were estimated for these scenarios using a single-zone box model of indoor air quality and ratios of thermal efficiency. Results: Stove performance and usage should be considered together, as lower performing stoves can result in similar or greater benefits than a higher performing stove if the lower performing stove has considerably higher displacement of the baseline stove. Based on the indoor air quality model, there are multiple performance–usage scenarios for achieving modest indoor air quality improvements. To meet World Health Organization guidance levels, however, three-stone fire and basic charcoal stove usage must be nearly eliminated to achieve the particulate matter target (< 1–3 hr/week), and substantially limited to meet the carbon monoxide guideline (< 7–9 hr/week). Conclusions: Moderate health gains may be achieved with various performance–usage scenarios. The greatest benefits are estimated to be achieved by near-complete displacement of traditional stoves with clean technologies, emphasizing the need to shift in the long term to near exclusive use of clean fuels and stoves. The performance–usage scenarios are also provided as a tool to guide technology selection and prioritize behavior change opportunities to maximize impact.
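
    The single-zone box model mentioned in the methods balances an emission source against air exchange, dC/dt = E/V - AER*C while the stove burns. A sketch with assumed kitchen parameters (illustrative values, not the paper's) converting weekly stove hours into a time-averaged PM2.5 concentration:

    ```python
    import numpy as np

    # Assumed single-zone parameters; illustrative, not the paper's values.
    V = 30.0    # kitchen volume, m^3
    AER = 15.0  # air-exchange rate, 1/h
    E = 100.0   # PM2.5 emission rate while the stove burns, mg/h

    def weekly_mean_pm(hours_per_week):
        """Time-averaged PM2.5 (ug/m^3); dC/dt = E/V - AER*C while burning."""
        dt = 1.0 / 60.0                              # 1-minute steps, hours
        t = np.arange(0.0, 168.0, dt)                # one week
        burning = (t % 24.0) < hours_per_week / 7.0  # one usage block per day
        c = np.zeros_like(t)
        for i in range(1, t.size):
            source = E / V if burning[i] else 0.0
            c[i] = c[i - 1] + dt * (source - AER * c[i - 1])
        return 1000.0 * c.mean()                     # mg/m^3 -> ug/m^3

    print(weekly_mean_pm(7.0))  # roughly one hour of stove use per day
    ```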

  4. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this
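
    One common way a model-based downscatter estimate enters iterative reconstruction is as an additive term in the forward projection of the ML-EM update, so that measured counts are compared against primary-plus-downscatter. A toy one-dimensional sketch (illustrative system matrix and downscatter estimate, not the paper's implementation):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_pix, n_det = 16, 24
    A = rng.random((n_det, n_pix))              # toy system matrix
    A /= A.sum(axis=0, keepdims=True)
    f_true = rng.random(n_pix)                  # true activity distribution
    s = 0.1 * np.ones(n_det)                    # modeled downscatter estimate
    y = rng.poisson(1000.0 * (A @ f_true + s))  # measured photopeak counts

    f = np.ones(n_pix)                          # ML-EM with additive downscatter
    sens = A.sum(axis=0)
    for _ in range(200):
        expected = 1000.0 * (A @ f + s)         # forward projection + scatter
        f *= (A.T @ (y / np.maximum(expected, 1e-9))) / sens
    ```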

  5. Evaluation of reference genes for quantitative RT-PCR in Lolium perenne

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative real-time RT-PCR provides an important tool for analyzing gene expression if proper internal standards are used. The aim of this study was to identify and evaluate reference genes for use in real-time quantitative RT-PCR in perennial ryegrass (Lolium perenne L.) during plant developmen...

  6. Anesthesia and the quantitative evaluation of neurovascular coupling

    PubMed Central

    Masamoto, Kazuto; Kanno, Iwao

    2012-01-01

    Anesthesia has broad actions that include changing neuronal excitability, vascular reactivity, and other baseline physiologies and eventually modifies the neurovascular coupling relationship. Here, we review the effects of anesthesia on the spatial propagation, temporal dynamics, and quantitative relationship between the neural and vascular responses to cortical stimulation. Previous studies have shown that the onset latency of evoked cerebral blood flow (CBF) changes is relatively consistent across anesthesia conditions compared with variations in the time-to-peak. This finding indicates that the mechanism of vasodilation onset is less dependent on anesthesia interference, while vasodilation dynamics are subject to this interference. The quantitative coupling relationship is largely influenced by the type and dosage of anesthesia, including the actions on neural processing, vasoactive signal transmission, and vascular reactivity. The effects of anesthesia on the spatial gap between the neural and vascular response regions are not fully understood and require further attention to elucidate the mechanism of vascular control of CBF supply to the underlying focal and surrounding neural activity. The in-depth understanding of the anesthesia actions on neurovascular elements allows for better decision-making regarding the anesthetics used in specific models for neurovascular experiments and may also help elucidate the signal source issues in hemodynamic-based neuroimaging techniques. PMID:22510601

  7. [Quantitative evaluation of acute myocardial infarction by In-111 antimyosin Fab myocardial imaging].

    PubMed

    Naruse, H; Morita, M; Itano, M; Yamamoto, J; Kawamoto, H; Fukutake, N; Ohyanagi, M; Iwasaki, T; Fukuchi, M

    1991-11-01

    For quantitative evaluation of acute myocardial infarction, In-111 antimyosin Fab myocardial imaging (InAM) was performed in 17 patients with myocardial infarction who underwent Tl-201 (TL) and Tc-99m pyrophosphate (PYP) myocardial imaging in the acute phase. For calculating infarct size, the voxel counter method was used for analysis of PYP and InAM, and extent and severity scores were used on the bull's-eye polar map for TL. In a fundamental experiment using a cardiac phantom, the most appropriate cut-off level ranged from 65 to 80%. Cut-off levels of 0.70 (InAM) and 0.65 (PYP) were used for clinical application of the voxel counter analysis. The infarct size calculated by InAM and PYP was compared with the wall motion abnormality index by echocardiography (WMAI), TL extent score, TL severity score, peak CK and sigma CK. Infarct size by InAM showed the following correlations with the other indices. PYP: r = 0.26 (ns), TL extent score: r = 0.72 (p less than 0.01), TL severity score: r = 0.65 (p less than 0.05), WMAI: r = 0.69 (p less than 0.05). The infarct size by PYP did not show any correlations with these indices. Therefore, infarct size by InAM correlated better with TL and WMAI than that by PYP, and InAM was considered superior to PYP for quantitative evaluation of acute myocardial infarction. PMID:1770642
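
    Read concretely, the voxel counter method counts voxels whose uptake exceeds a fixed fraction of the maximum. A sketch on a synthetic volume (the 0.70 cut-off follows the InAM analysis above; the volume and voxel size are illustrative):

    ```python
    import numpy as np

    def infarct_volume(volume, voxel_ml, cutoff=0.70):
        """Voxel counter method: count voxels above cutoff * max uptake."""
        threshold = cutoff * volume.max()
        return np.count_nonzero(volume > threshold) * voxel_ml

    # Synthetic uptake volume with a hot region standing in for the infarct.
    rng = np.random.default_rng(2)
    vol = 0.3 * rng.random((32, 32, 16))
    vol[10:14, 10:15, 4:8] = 1.0
    print(infarct_volume(vol, voxel_ml=0.05))  # voxel size is illustrative
    ```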

  8. Managing Technological Change by Changing Performance Appraisal to Performance Evaluation.

    ERIC Educational Resources Information Center

    Marquardt, Steve

    1996-01-01

    Academic libraries can improve their management of change by reshaping performance appraisal into performance planning. This article notes problems with traditional employee evaluation as well as benefits of alternatives that focus on the future, on users, on planning and learning, and on skills needed to address problems and enhance individual…

  9. Using quantitative interference phase microscopy for sperm acrosome evaluation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Balberg, Michal; Kalinowski, Ksawery; Levi, Mattan; Shaked, Natan T.

    2016-03-01

    We demonstrate quantitative assessment of sperm cell morphology, primarily acrosomal volume, using quantitative interference phase microscopy (IPM). Normally, the area of the acrosome is assessed using dyes that stain the acrosomal part of the cell. We imaged fixed individual sperm cells using IPM. Subsequently, the sample was stained and the same cells were imaged using bright field microscopy (BFM). We identified the acrosome in the stained BFM image and used it to define the corresponding area in the IPM image and to determine a quantitative threshold for evaluating the volume of the acrosome.
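
    In quantitative phase imaging, a phase map converts to optical path difference and, given an assumed refractive-index difference between cell and medium, to physical thickness and volume. A sketch in that spirit; the wavelength, Δn, pixel area, and mask below are assumptions, not the study's values:

    ```python
    import numpy as np

    def region_volume(phase, mask, wavelength_um=0.633, dn=0.02, pixel_um2=0.01):
        """Volume (um^3) of a masked region from a phase map in radians."""
        opd = phase * wavelength_um / (2.0 * np.pi)  # optical path difference
        height = opd / dn                            # physical thickness, um
        return float(np.sum(height[mask]) * pixel_um2)

    # Hypothetical phase image plus an acrosome mask derived from the
    # registered, stained bright-field image.
    rng = np.random.default_rng(5)
    phase = np.abs(rng.normal(0.5, 0.1, (64, 64)))
    acrosome = np.zeros((64, 64), dtype=bool)
    acrosome[20:30, 20:34] = True
    print(region_volume(phase, acrosome))
    ```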

  10. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and the limit of detection. We introduce a new framework to compare the performance ...

  11. A New Simple Interferometer for Obtaining Quantitatively Evaluable Flow Patterns

    NASA Technical Reports Server (NTRS)

    Erdmann, S F

    1953-01-01

    The method described in the present report makes it possible to obtain interferometer records with the aid of any available Schlieren optics by the addition of very simple expedients; these records fundamentally need not be inferior to those obtained by other methods, such as the Mach-Zehnder interferometer. The method is based on the fundamental concept of the phase-contrast process developed by Zernike, but enlarged in principle to such an extent that it practically represents an independent interference method for general applications. Moreover, the method offers the possibility, in case of necessity, of superposing any apparent wedge field on the density field to be gauged. The theory is explained on a purely physical basis and is illustrated and proved by experimental data. A number of typical cases are cited and some quantitative results reported.

  12. Quantitative Percussion Diagnostics For Evaluating Bond Integrity Between Composite Laminates

    NASA Astrophysics Data System (ADS)

    Poveromo, Scott Leonard

    Conventional nondestructive testing (NDT) techniques used to detect defects in composites are not able to determine intact bond integrity within a composite structure and are costly to use on large and complex shaped surfaces. To overcome current NDT limitations, a new technology was utilized based on quantitative percussion diagnostics (QPD) to better quantify bond quality in fiber reinforced composite materials. Experimental results indicate that this technology is capable of detecting 'kiss' bonds (very low adhesive shear strength), caused by the application of release agents on the bonding surfaces, between flat composite laminates bonded together with epoxy adhesive. Specifically, the local value of the loss coefficient determined from quantitative percussion testing was found to be significantly greater for a release coated panel compared to that for a well bonded sample. Also, the local value of the probe force or force returned to the probe after impact was observed to be lower for the release coated panels. The increase in loss coefficient and decrease in probe force are thought to be due to greater internal friction during the percussion event for poorly bonded specimens. NDT standards were also fabricated by varying the cure parameters of an epoxy film adhesive. Results from QPD for the variable cure NDT standards and lap shear strength measurements taken of mechanical test specimens were compared and analyzed. Finally, experimental results have been compared to a finite element analysis to understand the visco-elastic behavior of the laminates during percussion testing. This comparison shows how a lower quality bond leads to a reduction in the percussion force by biasing strain in the percussion tested side of the panel.

  13. Conductor gestures influence evaluations of ensemble performance

    PubMed Central

    Morrison, Steven J.; Price, Harry E.; Smedley, Eric M.; Meals, Cory D.

    2014-01-01

    Previous research has found that listener evaluations of ensemble performances vary depending on the expressivity of the conductor’s gestures, even when performances are otherwise identical. It was the purpose of the present study to test whether this effect of visual information was evident in the evaluation of specific aspects of ensemble performance: articulation and dynamics. We constructed a set of 32 music performances that combined auditory and visual information and were designed to feature a high degree of contrast along one of two target characteristics: articulation and dynamics. We paired each of four music excerpts recorded by a chamber ensemble in both a high- and low-contrast condition with video of four conductors demonstrating high- and low-contrast gesture specifically appropriate to either articulation or dynamics. Using one of two equivalent test forms, college music majors and non-majors (N = 285) viewed sixteen 30 s performances and evaluated the quality of the ensemble’s articulation, dynamics, technique, and tempo along with overall expressivity. Results showed significantly higher evaluations for performances featuring high rather than low conducting expressivity regardless of the ensemble’s performance quality. Evaluations for both articulation and dynamics were strongly and positively correlated with evaluations of overall ensemble expressivity. PMID:25104944

  14. Conductor gestures influence evaluations of ensemble performance.

    PubMed

    Morrison, Steven J; Price, Harry E; Smedley, Eric M; Meals, Cory D

    2014-01-01

    Previous research has found that listener evaluations of ensemble performances vary depending on the expressivity of the conductor's gestures, even when performances are otherwise identical. It was the purpose of the present study to test whether this effect of visual information was evident in the evaluation of specific aspects of ensemble performance: articulation and dynamics. We constructed a set of 32 music performances that combined auditory and visual information and were designed to feature a high degree of contrast along one of two target characteristics: articulation and dynamics. We paired each of four music excerpts recorded by a chamber ensemble in both a high- and low-contrast condition with video of four conductors demonstrating high- and low-contrast gesture specifically appropriate to either articulation or dynamics. Using one of two equivalent test forms, college music majors and non-majors (N = 285) viewed sixteen 30 s performances and evaluated the quality of the ensemble's articulation, dynamics, technique, and tempo along with overall expressivity. Results showed significantly higher evaluations for performances featuring high rather than low conducting expressivity regardless of the ensemble's performance quality. Evaluations for both articulation and dynamics were strongly and positively correlated with evaluations of overall ensemble expressivity. PMID:25104944

  15. A Quantitative Approach to Evaluating Training Curriculum Content Sampling Adequacy.

    ERIC Educational Resources Information Center

    Bownas, David A.; And Others

    1985-01-01

    Developed and illustrated a technique depicting the fit between training curriculum content and job performance requirements for three Coast Guard schools. Generated a listing of tasks which receive undue emphasis in training, tasks not being taught, and tasks instructors intend to train, but which course graduates are unable to perform.…

  16. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  17. Quantitative sparse array vascular elastography: the impact of tissue attenuation and modulus contrast on performance

    PubMed Central

    Huntzicker, Steven; Nayak, Rohit; Doyley, Marvin M.

    2014-01-01

    Quantitative sparse array vascular elastography visualizes the shear modulus distribution within vascular tissues, information that clinicians could use to reduce the number of strokes each year. However, the low transmit power of sparse array (SA) imaging could hamper the clinical usefulness of the resulting elastograms. In this study, we evaluated the performance of modulus elastograms recovered from simulated and physical vessel phantoms with varying attenuation coefficients (0.6, 1.5, and 3.5 cm−1) and modulus contrasts (−12.04, −6.02, and −2.5 dB) using SA imaging relative to those obtained with conventional linear array (CLA) and plane-wave (PW) imaging techniques. Plaques were visible in all modulus elastograms, but those produced using SA and PW contained fewer artifacts. The modulus contrast-to-noise ratio decreased rapidly with increasing modulus contrast and attenuation coefficient, and more quickly for SA imaging than for CLA or PW. The errors incurred varied from 10.9% to 24% (CLA), 1.8% to 12% (SA), and ≈4% (PW). Modulus elastograms produced with SA and PW imaging were not significantly different (p > 0.05). Despite the low transmit power, SA imaging can produce useful modulus elastograms in superficial organs, such as the carotid artery. PMID:26158040
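
    The modulus contrast-to-noise ratio above is commonly defined as CNRe = 2*(mu_b - mu_t)^2 / (sigma_b^2 + sigma_t^2), computed from the recovered moduli in the plaque (target) and the vessel wall (background). A sketch on hypothetical values:

    ```python
    import numpy as np

    def modulus_cnr(target, background):
        """Elastographic CNR: 2*(mu_b - mu_t)^2 / (sigma_b^2 + sigma_t^2)."""
        mt, mb = np.mean(target), np.mean(background)
        return 2.0 * (mb - mt) ** 2 / (np.var(background) + np.var(target))

    rng = np.random.default_rng(3)
    plaque = rng.normal(10.0, 1.5, 500)  # recovered shear moduli, kPa
    wall = rng.normal(40.0, 4.0, 500)    # hypothetical, not the study's data
    print(10.0 * np.log10(modulus_cnr(plaque, wall)))  # CNR in dB
    ```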

  18. Quantitative pharmaco-EEG and performance after administration of brotizolam to healthy volunteers

    PubMed Central

    Saletu, B.; Grünberger, J.; Linzmayer, L.

    1983-01-01

    1 The activity of brotizolam (0.1, 0.3 and 0.5 mg) was studied in normal subjects using quantitative pharmaco-EEG, psychometric and clinical evaluation. 2 Power spectral density analysis showed no changes after placebo, while brotizolam increased beta-activity, decreased alpha-activity and increased the average frequency (anxiolytic pharmaco-EEG profile). In addition, 0.3 and 0.5 mg brotizolam augmented delta-activity indicating hypnotic activity. 3 The highest dose (0.5 mg) of brotizolam decreased attention, concentration, psychomotor performance and affectivity, and increased reaction time. The lower doses of brotizolam also caused a decrease in attention and concentration, but tended to improve psychomotor performance, shorten reaction time, and did not influence mood or affectivity. 4 Brotizolam (0.1 mg) is the minimal effective psychoactive dose with a tranquillizing effect, while 0.5 mg and to some extent 0.3 mg induce a sedative effect and may be regarded as hypnotic doses. PMID:6661379

  19. Methodology for Evaluation of Diagnostic Performance

    SciTech Connect

    Metz, Charles E.

    2003-02-19

    The proliferation of expensive technology in diagnostic medicine demands objective, meaningful assessments of diagnostic performance. Receiver Operating Characteristic (ROC) analysis is now recognized widely as the best approach to the task of measuring and specifying diagnostic accuracy (Metz, 1978; Swets and Pickett, 1982; Beck and Schultz, 1986; Metz, 1986; Hanley, 1989; Zweig and Campbell, 1993), which is defined as the extent to which diagnoses agree with actual states of health or disease (Fryback and Thornbury, 1991; National Council on Radiation Protection and Measurements, 1995). The primary advantage of ROC analysis over alternative methodologies is that it separates differences among diagnostic decisions that are due to actual differences in discrimination capacity from those that are due to decision-threshold effects (e.g., ''under-reading'' or ''over-reading''). An ROC curve measures diagnostic accuracy by displaying True Positive Fraction (TPF: the fraction of patients actually having the disease in question that is diagnosed correctly as ''positive'') as a function of False Positive Fraction (FPF: the fraction of patients actually without the disease that is diagnosed incorrectly as ''positive''). Different points on the ROC curve--i.e., different compromises between the specificity and the sensitivity of a diagnostic test, for a given inherent accuracy--can be achieved by adopting different critical values of the diagnostic test's ''decision variable'' --e.g., the observer's degree of confidence that each case is positive or negative in a diagnostic image-reading task, or the numerical value of the result of a quantitative diagnostic test. ROC techniques have been used to measure and specify the diagnostic performance of medical imaging systems since the early 1970s, and the needs that arise in this application have spurred a variety of new methodological developments. In particular, substantial progress has been made in ROC curve fitting and in
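
    An empirical ROC curve follows directly from this definition: sweep a decision threshold across the observed values of the decision variable, record the (FPF, TPF) pairs, and summarize accuracy as the area under the curve. A minimal sketch on synthetic scores:

    ```python
    import numpy as np

    def empirical_roc(scores, labels):
        """TPF vs. FPF as the decision threshold sweeps over the scores."""
        order = np.argsort(-np.asarray(scores))
        labels = np.asarray(labels)[order]
        tpf = np.cumsum(labels) / labels.sum()
        fpf = np.cumsum(1 - labels) / (1 - labels).sum()
        return np.concatenate(([0.0], fpf)), np.concatenate(([0.0], tpf))

    rng = np.random.default_rng(4)
    diseased = rng.normal(1.5, 1.0, 200)  # synthetic decision variable values
    healthy = rng.normal(0.0, 1.0, 200)
    scores = np.concatenate([diseased, healthy])
    labels = np.concatenate([np.ones(200), np.zeros(200)])
    fpf, tpf = empirical_roc(scores, labels)
    print(np.trapz(tpf, fpf))             # AUC; about 0.85 here
    ```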

  20. STATISTICAL BASIS FOR LABORATORY PERFORMANCE EVALUATION LIMITS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) conducts studies to evaluate the performance of drinking water and wastewater laboratories that analyze samples for major EPA programs. The studies involve sample concentrates which the participating laboratories dilute to volume wit...

  1. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.604... require a performance evaluation report on the work done by the architect-engineer after the completion...

  2. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.604... require a performance evaluation report on the work done by the architect-engineer after the completion...

  3. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.604... require a performance evaluation report on the work done by the architect-engineer after the completion...

  4. 48 CFR 436.604 - Performance evaluation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.604... require a performance evaluation report on the work done by the architect-engineer after the completion...

  5. Performance Evaluation of Undulator Radiation at CEBAF

    SciTech Connect

    Chuyu Liu, Geoffrey Krafft, Guimei Wang

    2010-05-01

    The performance of undulator radiation (UR) at CEBAF with a 3.5 m helical undulator is evaluated and compared with APS undulator-A radiation in terms of brilliance, peak brilliance, spectral flux, flux density and intensity distribution.

  6. ATAMM enhancement and multiprocessor performance evaluation

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.; Som, Sukhamoy; Obando, Rodrigo; Malekpour, Mahyar R.; Jones, Robert L., III; Mandala, Brij Mohan V.

    1991-01-01

    ATAMM (Algorithm To Architecture Mapping Model) enhancement and multiprocessor performance evaluation is discussed. The following topics are included: the ATAMM model; ATAMM enhancement; ADM (Advanced Development Model) implementation of ATAMM; and ATAMM support tools.

  7. Digital holographic microscopy for quantitative cell dynamic evaluation during laser microsurgery

    PubMed Central

    Yu, Lingfeng; Mohanty, Samarendra; Zhang, Jun; Genc, Suzanne; Kim, Myung K.; Berns, Michael W.; Chen, Zhongping

    2010-01-01

    Digital holographic microscopy allows determination of dynamic changes in the optical thickness profile of a transparent object with subwavelength accuracy. Here, we report a quantitative phase laser microsurgery system for evaluation of cellular/sub-cellular dynamic changes during laser micro-dissection. The proposed method takes advantage of the precise optical manipulation by the laser microbeam and quantitative phase imaging by digital holographic microscopy with high spatial and temporal resolution. This system will permit quantitative evaluation of the damage and/or the repair of the cell or cell organelles in real time. PMID:19582118
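
    The subwavelength sensitivity quoted above follows directly from the generic phase-to-optical-path relation of digital holography. A minimal sketch (the 633 nm wavelength is an assumed example, not this group's configuration):

        import numpy as np

        def optical_path_change(delta_phi, wavelength):
            """Optical path-length change recovered from a measured
            phase change: d(OPL) = wavelength * delta_phi / (2*pi)."""
            return wavelength * delta_phi / (2 * np.pi)

        # Hypothetical: a 0.5 rad phase change at 633 nm is ~50 nm of
        # optical path, i.e. well below the illumination wavelength
        print(optical_path_change(0.5, 633e-9))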

  8. Evaluation of high-performance computing software

    SciTech Connect

    Browne, S.; Dongarra, J.; Rowan, T.

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations that are difficult to find elsewhere.

  9. Improvement of Automotive Part Supplier Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Kongmunee, Chalermkwan; Chutima, Parames

    2016-05-01

    This research investigates the problem of part supplier performance evaluation in a major Japanese automotive plant in Thailand. The plant's current evaluation scheme is based on the experience and personal opinions of the evaluators. As a result, many poorly performing suppliers are still rated as good suppliers and allowed to supply parts to the plant without further improvement obligations. To alleviate this problem, brainstorming sessions among stakeholders and evaluators were formally conducted, yielding an appropriate set of evaluation criteria and sub-criteria. The analytic hierarchy process was then used to find suitable weights for each criterion and sub-criterion. The results show that the newly developed evaluation method is significantly better than the previous one at segregating good suppliers from poor ones.
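
    The weighting step can be sketched concretely. In the standard analytic hierarchy process formulation (Saaty's principal-eigenvector method; the pairwise judgments below are illustrative, not the plant's actual criteria), weights are the normalized principal eigenvector of a reciprocal pairwise-comparison matrix:

        import numpy as np

        def ahp_weights(pairwise):
            """Derive criterion weights from a reciprocal pairwise-comparison
            matrix via its principal eigenvector."""
            vals, vecs = np.linalg.eig(pairwise)
            principal = vecs[:, np.argmax(vals.real)].real
            return principal / principal.sum()

        # Hypothetical comparisons for three criteria: quality, delivery, cost
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        print(ahp_weights(A))  # quality gets the largest weight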

  10. Class diagram based evaluation of software performance

    NASA Astrophysics Data System (ADS)

    Pham, Huong V.; Nguyen, Binh N.

    2013-03-01

    The evaluation of software performance in the early stages of the software life cycle is important and has been widely studied. In software model specification, the class diagram is the key object-oriented specification model. Measures based on class diagrams have been widely studied to evaluate software quality attributes such as complexity, maintainability, and reuse capability. However, software performance evaluation based on the class model has not been widely studied, especially for object-oriented design of embedded software. Therefore, in this paper we propose a new approach to evaluating software performance directly from class diagrams. From a class diagram, we determine the parameters used to evaluate performance and build formulas for measures such as Size of Class Variables, Size of Class Methods, Size of Instance Variables, and Size of Instance Methods. We then analyze the dependence of performance on these measures and build a performance evaluation function from the class diagram. Thereby we can choose the best class diagram based on this evaluation function.
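
    A minimal sketch of such an evaluation function, assuming a hypothetical dictionary representation of a class diagram and purely illustrative weights (the paper's actual formulas are not reproduced here):

        def class_measures(cls):
            """Size measures for one class entry of a class diagram."""
            return {
                "SCV": len(cls["class_variables"]),
                "SCM": len(cls["class_methods"]),
                "SIV": len(cls["instance_variables"]),
                "SIM": len(cls["instance_methods"]),
            }

        def performance_score(diagram, weights):
            """Weighted sum of size measures over all classes; in this toy
            evaluation function a lower score suggests a lighter design."""
            total = 0.0
            for cls in diagram:
                m = class_measures(cls)
                total += sum(weights[k] * v for k, v in m.items())
            return total

        diagram = [{"class_variables": ["count"], "class_methods": ["of"],
                    "instance_variables": ["x", "y"], "instance_methods": ["move"]}]
        print(performance_score(diagram, {"SCV": 1.0, "SCM": 2.0,
                                          "SIV": 1.0, "SIM": 2.0}))  # -> 7.0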

  11. Building Leadership Talent through Performance Evaluation

    ERIC Educational Resources Information Center

    Clifford, Matthew

    2015-01-01

    Most states and districts scramble to provide professional development to support principals, but "principal evaluation" is often lost amid competing priorities. Evaluation is an important method for supporting principal growth, communicating performance expectations to principals, and improving leadership practice. It provides leaders…

  12. Performance-Based Evaluation and School Librarians

    ERIC Educational Resources Information Center

    Church, Audrey P.

    2015-01-01

    Evaluation of instructional personnel is standard procedure in our Pre-K-12 public schools, and its purpose is to document educator effectiveness. With Race to the Top and No Child Left Behind waivers, states are required to implement performance-based evaluations that demonstrate student academic progress. This three-year study describes the…

  13. Evaluation of a quantitative fit testing method for N95 filtering facepiece respirators.

    PubMed

    Janssen, Larry; Luinenburg, Michael D; Mullins, Haskell E; Danisch, Susan G; Nelson, Thomas J

    2003-01-01

    A method for performing quantitative fit tests (QNFT) with N95 filtering facepiece respirators was developed by earlier investigators. The method employs a simple clamping device to allow the penetration of submicron aerosols through N95 filter media to be measured. The measured value is subtracted from total penetration, with the assumption that the remaining penetration represents faceseal leakage. The developers have used the clamp to assess respirator performance. This study evaluated the clamp's ability to measure filter penetration and determine fit factors. In Phase 1, subjects were quantitatively fit-tested with elastomeric half-facepiece respirators using both generated and ambient aerosols. QNFT were done with each aerosol with both P100 and N95 filters without disturbing the facepiece. In Phase 2 of the study, elastomeric half-facepieces were sealed to subjects' faces to eliminate faceseal leakage, and ambient aerosol QNFT were performed with P100 and N95 filters without disturbing the facepiece. In both phases the clamp was used to measure N95 filter penetration, which was then subtracted from total penetration for the N95 QNFT. It was hypothesized that N95 fit factors corrected for filter penetration would equal the P100 fit factors. Mean corrected N95 fit factors were significantly different from the P100 fit factors in each phase of the study. In addition, there was essentially no correlation between corrected N95 fit factors and P100 fit factors. It was concluded that the clamp method should not be used to fit-test N95 filtering facepieces or otherwise assess respirator performance. PMID:12908863
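
    The correction tested in the study is simple arithmetic: faceseal penetration is taken as total penetration minus the clamp-measured filter penetration. A sketch with hypothetical readings:

        def corrected_fit_factor(c_ambient, c_in_mask, filter_penetration):
            """Corrected fit factor under the study's assumption that
            total penetration = filter penetration + faceseal leakage.

            c_ambient, c_in_mask: particle concentrations outside/inside mask
            filter_penetration: fraction measured with the clamp method
            """
            total_penetration = c_in_mask / c_ambient  # uncorrected
            faceseal = total_penetration - filter_penetration
            if faceseal <= 0:
                raise ValueError("filter penetration exceeds total penetration")
            return 1.0 / faceseal

        # Hypothetical readings: 2% total penetration, 1.5% through the filter
        print(corrected_fit_factor(10000, 200, 0.015))  # -> 200.0

    The study's conclusion is precisely that these corrected values did not agree with P100-based fit factors.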

  14. A lighting metric for quantitative evaluation of accent lighting systems

    NASA Astrophysics Data System (ADS)

    Acholo, Cyril O.; Connor, Kenneth A.; Radke, Richard J.

    2014-09-01

    Accent lighting is critical for artwork and sculpture lighting in museums, and for subject lighting in stage, film, and television settings. The research problem of designing effective lighting in such settings has been revived recently with the rise of light-emitting-diode-based solid state lighting. In this work, we propose an easy-to-apply quantitative measure of a scene's visual quality as perceived by human viewers. We consider a well-accent-lit scene as one which maximizes the information about the scene (in an information-theoretic sense) available to the viewer. We propose a metric based on the entropy of the distribution of colors, which are extracted from an image of the scene from the viewer's perspective. We demonstrate that optimizing the metric as a function of illumination configuration (i.e., position, orientation, and spectral composition) results in natural, pleasing accent lighting. We use a photorealistic simulation tool to validate the proposed approach, showing its successful application to two- and three-dimensional scenes.
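
    An entropy-of-colors metric of the kind proposed can be sketched as follows (the coarse RGB histogram is an assumption; the paper's exact color space and binning are not specified here):

        import numpy as np

        def color_entropy(image_rgb, bins=16):
            """Shannon entropy of the color distribution of a scene image,
            a proxy for the information available to the viewer."""
            pixels = image_rgb.reshape(-1, 3)
            hist, _ = np.histogramdd(pixels, bins=(bins, bins, bins),
                                     range=[(0, 256)] * 3)
            p = hist.ravel() / hist.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        # Hypothetical 8-bit image standing in for a render of the lit scene
        img = np.random.randint(0, 256, size=(480, 640, 3))
        print(color_entropy(img))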

  15. African Primary Care Research: Performing a programme evaluation

    PubMed Central

    2014-01-01

    This article is part of a series on Primary Care Research in the African context and focuses on programme evaluation. Different types of programme evaluation are outlined: developmental, process, outcome and impact. Eight steps to follow in designing your programme evaluation are then described in some detail: engage stakeholders; establish what is known; describe the programme; define the evaluation and select a study design; define the indicators; plan and manage data collection and analysis; make judgements and recommendations; and disseminate the findings. Other articles in the series cover related topics such as writing your research proposal, performing a literature review, conducting surveys with questionnaires, qualitative interviewing and approaches to quantitative and qualitative data analysis. PMID:26245440

  16. A Quantitative Investigation of Stakeholder Variation in Training Program Evaluation.

    ERIC Educational Resources Information Center

    Michalski, Greg V.

    A survey was conducted to investigate variation in stakeholder perceptions of training results and evaluation within the context of a high-technology product development firm (the case organization). A scannable questionnaire survey booklet was developed and scanned data were exported and analyzed. Based on an achieved sample of 280 (70% response…

  17. Performance evaluation of video colonoscope systems

    NASA Astrophysics Data System (ADS)

    Picciano, Lawrence D.; Keller, James P.

    1994-05-01

    A comparative engineering performance evaluation was performed on video colonoscope systems from all three of the current U.S. suppliers: Fujinon, Olympus, and Pentax. Video system test methods, results, and conclusions based on their clinical significance are the focus of this paper.

  18. Quantitative Evaluation of 3 DBMS: ORACLE, SEED AND INGRES

    NASA Technical Reports Server (NTRS)

    Sylto, R.

    1984-01-01

    Characteristics required for NASA scientific data base management applications are listed, as well as performance testing objectives. Results obtained for the ORACLE, SEED, and INGRES packages are presented in charts. It is concluded that vendor packages can manage 130 megabytes of data at acceptable load and query rates. Performance tests varying data base designs and various data base management system parameters are valuable to applications for choosing packages and critical to designing effective data bases. An application's productivity increases with the use of a data base management system because of enhanced capabilities such as a screen formatter, a report writer, and a data dictionary.

  19. Photoacoustic microscopy for quantitative evaluation of angiogenesis inhibitor

    NASA Astrophysics Data System (ADS)

    Chen, Sung-Liang; Burnett, Joseph; Sun, Duxin; Xie, Zhixing; Wang, Xueding

    2014-03-01

    We present photoacoustic microscopy (PAM) for the evaluation of angiogenesis inhibitors on a chick embryo model. Microvasculature in the chorioallantoic membrane (CAM) of the chick embryos was imaged by PAM, and optical microscopy (OM) images of the same set of CAMs were also acquired for comparison, serving to validate the results from PAM. The angiogenesis inhibitor Sunitinib, applied to the CAM at different concentrations, changed the microvascular density, which was quantified by both PAM and OM imaging. Similar changes in microvascular density in response to the inhibitor at different doses were observed with PAM and OM imaging, demonstrating that PAM has the potential to provide objective evaluation of anti-angiogenesis medication. Moreover, PAM offers three-dimensional and functional imaging that OM does not, so the emerging PAM technique may provide unique information on the efficacy of angiogenesis inhibitors and could benefit applications related to anti-angiogenesis treatments.

  20. Quantitative vertebral compression fracture evaluation using a height compass

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Burns, Joseph E.; Wiese, Tatjana; Summers, Ronald M.

    2012-03-01

    Vertebral compression fractures can be caused by even minor trauma in patients with pathological conditions such as osteoporosis, and they vary greatly in vertebral body location and compression geometry. The location and morphology of the compression injury can guide decision making for treatment modality (vertebroplasty versus surgical fixation) and can be important for pre-surgical planning. We propose a height compass to evaluate the axial-plane spatial distribution of compression injury (anterior, posterior, lateral, and central) and to distinguish it from physiologic height variations of normal vertebrae. The method includes four steps: spine segmentation and partition, endplate detection, height compass computation, and compression fracture evaluation. A height compass is computed for each vertebra, where the vertebral body is partitioned in the axial plane into 17 cells oriented about concentric rings. In the compass structure, three concentric rings produce a crown-like geometry; rays subtending eight common central angles divide the rings into eight equal-length arcs. The radius of each ring increases multiplicatively, so the resultant structure is a central node and two concentric surrounding bands of cells, each divided into octants. The height value for each octant is calculated and plotted against the corresponding octants in neighboring vertebrae. The height compass gives an intuitive display of the height distribution and can be used to easily identify fracture regions. Our technique was evaluated on 8 thoraco-abdominal CT scans of patients with reported compression fractures and showed statistically significant differences in height values at the sites of the fractures.
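
    The 17-cell partition can be made concrete with a small sketch (the radii and the simple octant rule below are hypothetical; the paper's exact multiplicative radii are not reproduced):

        import numpy as np

        def height_compass_cell(x, y, r1, r2, r3):
            """Map an axial-plane point (relative to the vertebral centroid)
            to one of 17 cells: a central node (cell 0) plus two concentric
            bands of 8 octants each (cells 1-8 inner, 9-16 outer).
            r1 < r2 < r3 are the ring radii."""
            r = np.hypot(x, y)
            if r > r3:
                return None  # outside the vertebral body
            if r <= r1:
                return 0     # central node
            octant = int(((np.arctan2(y, x) % (2 * np.pi)) / (2 * np.pi)) * 8)
            band = 0 if r <= r2 else 1
            return 1 + band * 8 + octant

        print(height_compass_cell(3.0, 1.0, 2.0, 4.0, 8.0))  # -> 1 (inner band)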

  1. Quantitative evaluation of phonetograms in the case of functional dysphonia.

    PubMed

    Airainer, R; Klingholz, F

    1993-06-01

    According to the laryngeal clinical findings, figures making up a scale were assigned to vocally trained and vocally untrained persons suffering from different types of functional dysphonia. The different types of dysphonia--from manifest hypofunctional to extreme hyperfunctional dysphonia--were classified by means of this scale. In addition, the subjects' phonetograms were measured and approximated by three ellipses, which made it possible to define phonetogram parameters. Selected phonetogram parameters were then combined into linear combinations for phonetographic evaluation. The linear combinations were intended to bring the phonetographic and clinical evaluations into correspondence as accurately as possible. Different linear combinations were necessary for male and female singers and nonsingers. Based on the reclassification of 71 patients and the new classification of 89 patients, it was possible to grade the types of functional dysphonia by means of computer-aided phonetogram evaluation with a clinically acceptable error rate. This method proved to be an important supplement to the conventional diagnostics of functional dysphonia. PMID:8353627

  2. Quantitative analysis and purity evaluation of medroxyprogesterone acetate by HPLC.

    PubMed

    Cavina, G; Valvo, L; Alimenti, R

    1985-01-01

    A reversed-phase high-performance liquid chromatographic method was developed for the assay of medroxyprogesterone acetate and for the detection and determination of related steroids present as impurities in the drug. The method was compared with the normal-phase technique of the USP XX and was also applied to the analysis of tablets and injectable suspensions. PMID:16867645

  3. Image Evaluation For Sensor Performance Standards

    NASA Astrophysics Data System (ADS)

    Peck, Lorin C.

    1989-02-01

    The subject of imagery evaluation as it applies to electro-optical (EO) sensor performance testing standards is discussed. Some of the difficulties encountered in the development of these standards for the various aircraft Line Replaceable Units (LRUs) are listed. System performance testing is regarded as a requirement for the depot maintenance program to ensure the integrity of total system performance requirements for EO imaging systems such as the Advanced Tactical Air Reconnaissance System (ATARS). The necessity of tying NATO Essential Elements of Information (EEIs) together with Imagery Interpretation Rating Scale (IIRS) numbers is explained, and the requirements for a field target suitable for EO imagery evaluation are described.

  4. Performability evaluation of the SIFT computer

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Furchtgott, D. G.; Wu, L. T.

    1979-01-01

    Performability modeling and evaluation techniques are applied to the SIFT computer as it might operate in the computational environment of an air transport mission. User-visible performance of the total system (SIFT plus its environment) is modeled as a random variable taking values in a set of levels of accomplishment. These levels are defined in terms of four attributes of total system behavior: safety, no change in mission profile, no operational penalties, and no economic penalties. The base model is a stochastic process whose states describe the internal structure of SIFT as well as relevant conditions of the environment. Base model state trajectories are related to accomplishment levels via a capability function which is formulated in terms of a 3-level model hierarchy. Performability evaluation algorithms are then applied to determine the performability of the total system for various choices of computer and environment parameter values. Numerical results of those evaluations are presented and, in conclusion, some implications of this effort are discussed.

  5. Quantitative image analysis for evaluating the abrasion resistance of nanoporous silica films on glass

    NASA Astrophysics Data System (ADS)

    Nielsen, Karsten H.; Karlsson, Stefan; Limbach, Rene; Wondraczek, Lothar

    2015-12-01

    The abrasion resistance of coated glass surfaces is an important parameter for judging lifetime performance, but practical testing procedures remain overly simplistic and often do not allow for direct conclusions on real-world degradation. Here, we combine quantitative two-dimensional image analysis and mechanical abrasion into a facile tool for probing the abrasion resistance of anti-reflective (AR) coatings. We determine variations in the average coated area, during and after controlled abrasion. Through comparison with other experimental techniques, we show that this method provides a practical, rapid and versatile tool for the evaluation of the abrasion resistance of sol-gel-derived thin films on glass. The method yields informative data, which correlates with measurements of diffuse reflectance and is further supported by qualitative investigations through scanning electron microscopy. In particular, the method directly addresses degradation of coating performance, i.e., the gradual areal loss of antireflective functionality. As an exemplary subject, we studied the abrasion resistance of state-of-the-art nanoporous SiO2 thin films which were derived from 5-6 wt% aqueous solutions of potassium silicates, or from colloidal suspensions of SiO2 nanoparticles. It is shown how abrasion resistance is governed by coating density and film adhesion, defining the trade-off between optimal AR performance and acceptable mechanical performance.

  6. Quantitative image analysis for evaluating the abrasion resistance of nanoporous silica films on glass

    PubMed Central

    Nielsen, Karsten H.; Karlsson, Stefan; Limbach, Rene; Wondraczek, Lothar

    2015-01-01

    The abrasion resistance of coated glass surfaces is an important parameter for judging lifetime performance, but practical testing procedures remain overly simplistic and often do not allow for direct conclusions on real-world degradation. Here, we combine quantitative two-dimensional image analysis and mechanical abrasion into a facile tool for probing the abrasion resistance of anti-reflective (AR) coatings. We determine variations in the average coated area, during and after controlled abrasion. Through comparison with other experimental techniques, we show that this method provides a practical, rapid and versatile tool for the evaluation of the abrasion resistance of sol-gel-derived thin films on glass. The method yields informative data, which correlates with measurements of diffuse reflectance and is further supported by qualitative investigations through scanning electron microscopy. In particular, the method directly addresses degradation of coating performance, i.e., the gradual areal loss of antireflective functionality. As an exemplary subject, we studied the abrasion resistance of state-of-the-art nanoporous SiO2 thin films which were derived from 5–6 wt% aqueous solutions of potassium silicates, or from colloidal suspensions of SiO2 nanoparticles. It is shown how abrasion resistance is governed by coating density and film adhesion, defining the trade-off between optimal AR performance and acceptable mechanical performance. PMID:26656260
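
    The core quantity in the two records above, the average coated area during and after abrasion, reduces to a per-image area fraction. A minimal sketch, under the assumption that coated regions image darker than exposed glass (reverse the comparison if the contrast is inverted):

        import numpy as np

        def coated_area_fraction(gray, threshold=120):
            """Fraction of pixels classified as coated after thresholding a
            grayscale micrograph; coated regions are assumed darker here."""
            return float((np.asarray(gray) < threshold).mean())

        # Hypothetical stack of images after 0, 100, 200 abrasion cycles
        rng = np.random.default_rng(0)
        images = [rng.integers(0, 256, size=(100, 100)) for _ in range(3)]
        print([round(coated_area_fraction(g), 3) for g in images])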

  7. Extending the detectability index to quantitative imaging performance: applications in tomosynthesis and CT

    NASA Astrophysics Data System (ADS)

    Richard, Samuel; Chen, Baiyu; Samei, Ehsan

    2010-04-01

    This study aimed to extend Fourier-based imaging metrics for the modeling of quantitative imaging performance. Breast tomosynthesis was used as a platform for investigating acquisition and processing parameters (e.g., acquisition angle and dose) that can significantly affect 3D signal and noise, and consequently quantitative imaging performance. The detectability index was computed using the modulation transfer function and noise-power spectrum combined with a Fourier description of imaging task. Three imaging tasks were considered: detection, area estimation (in coronal slice), and volume estimation of a 4 mm diameter spherical target. Task functions for size estimation were generated by using measured performance of the maximum-likelihood estimator as training data. The detectability index computed with the size estimation tasks correlated well with precision measurements for area and volume estimation over a fairly broad range of imaging conditions and provided a meaningful figure of merit for quantitative imaging performance. Furthermore, results highlighted that optimal breast tomosynthesis acquisition parameters depend significantly on imaging task. Mass detection was optimal at an acquisition angle of 85° while area and volume estimation for the same mass were optimal at ~100° and 125° acquisition angles, respectively. These findings provide key initial validation that the Fourier-based detectability index extended to estimation tasks can represent a meaningful metric and predictor of quantitative imaging performance.
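
    One standard (non-prewhitening) form of the detectability index built from these Fourier quantities can be sketched as follows (the 1D frequency grid and the stand-in task, MTF, and NPS shapes are assumptions for illustration, not the study's measured data):

        import numpy as np

        def detectability_index(W, MTF, NPS, df):
            """Non-prewhitening detectability index from Fourier metrics.
            W: task function, MTF: modulation transfer function,
            NPS: noise-power spectrum, all sampled on the same grid;
            df: frequency bin width/area."""
            signal = np.sum((W * MTF) ** 2) * df
            noise = np.sum((W * MTF) ** 2 * NPS) * df
            return signal / np.sqrt(noise)

        # Hypothetical 1D example: detection of a 4 mm target
        f = np.linspace(0.01, 2.0, 200)      # cycles/mm
        W = np.abs(np.sinc(4.0 * f))         # stand-in task function
        MTF = np.exp(-f / 1.0)
        NPS = 1e-6 * np.exp(-f / 2.0)
        print(detectability_index(W, MTF, NPS, f[1] - f[0]))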

  8. A method for the quantitative evaluation of SAR distribution in deep regional hyperthermia.

    PubMed

    Baroni, C; Giri, M G; Meliadó, G; Maluta, S; Chierego, G

    2001-01-01

    The visualization of Specific Absorption Rate (SAR) distribution patterns by a matrix of E-field light-emitting sensors has been demonstrated to be a useful tool for evaluating the characteristics of the applicators used in deep regional hyperthermia and for performing a quality assurance programme. A method to quantify the SAR from photographs of the sensor array--the so-called 'Power Stepping Technique'--has already been proposed. This paper presents a new approach to the quantitative determination of SAR profiles in a liquid phantom exposed to electromagnetic fields from the Sigma-60 applicator (BSD-2000 system for deep regional hyperthermia). The method is based on the construction of a 'calibration curve' modelling the light output of an E-field sensor as a function of the supplied voltage, and on the use of a reference light source to 'normalize' the light-output readings from the photos of the sensor array, in order to minimize the errors introduced by the non-uniformity of the photographic process. Once the calibration curve is obtained, it is possible, with only one photo, to obtain the quantitative SAR distribution under operating conditions. For this reason, the method is suitable for equipment characterization and also for controlling the repeatability of power deposition over time. PMID:11587076
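
    A sketch of the calibration-and-normalization idea (the polynomial calibration form and the SAR ∝ E² ∝ V² proportionality are assumptions standing in for the paper's actual curve):

        import numpy as np

        def build_calibration(voltages, light_outputs, deg=3):
            """Fit a polynomial 'calibration curve' mapping sensor light
            output back to supplied voltage."""
            return np.polynomial.Polynomial.fit(light_outputs, voltages, deg)

        def sar_from_photo(light_readings, ref_reading, ref_expected, calib):
            """Normalize readings by a reference light source, convert to
            voltage via the calibration curve, and return relative SAR
            (assumed proportional to the square of the local field)."""
            norm = np.asarray(light_readings) * (ref_expected / ref_reading)
            v = calib(norm)
            return v ** 2 / np.max(v ** 2)

        # Hypothetical calibration data: supplied voltage vs. light output
        v = np.array([10, 20, 30, 40, 50.0])
        lo = np.array([0.5, 1.8, 4.1, 7.2, 11.0])
        calib = build_calibration(v, lo)
        print(sar_from_photo([2.0, 5.0, 9.0], ref_reading=1.0,
                             ref_expected=1.0, calib=calib))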

  9. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model

    PubMed Central

    Yoshioka, S.; Matsuhana, B.; Tanaka, S.; Inouye, Y.; Oshima, N.; Kinoshita, S.

    2011-01-01

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model. PMID:20554565

  10. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model.

    PubMed

    Yoshioka, S; Matsuhana, B; Tanaka, S; Inouye, Y; Oshima, N; Kinoshita, S

    2011-01-01

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model. PMID:20554565
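
    The optics underlying the Venetian blind model in the two records above is ideal two-component multilayer interference; tilting the platelets effectively changes the cytoplasm gap and shifts the reflection peak. A sketch with hypothetical guanine/cytoplasm parameters (illustrative values, not the paper's measurements):

        import numpy as np

        def ideal_multilayer_peak(n_plate, d_plate, n_cyto, d_cyto, theta_deg=0.0):
            """First-order reflection peak of an ideal two-component
            multilayer (platelet + cytoplasm gap):
            lambda = 2 (n_p d_p cos t_p + n_c d_c cos t_c),
            with the in-layer angles from Snell's law."""
            t_c = np.radians(theta_deg)
            t_p = np.arcsin(n_cyto * np.sin(t_c) / n_plate)
            return 2 * (n_plate * d_plate * np.cos(t_p) +
                        n_cyto * d_cyto * np.cos(t_c))

        # Widening the gap (as when platelets tilt apart) red-shifts the peak
        for gap in (170e-9, 200e-9):
            peak = ideal_multilayer_peak(1.83, 8e-9, 1.33, gap)
            print(round(peak * 1e9), "nm")  # ~481 nm -> ~561 nm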

  11. A front-of-pack nutrition logo: a quantitative and qualitative process evaluation in the Netherlands.

    PubMed

    Vyth, Ellis L; Steenhuis, Ingrid H M; Mallant, Sanne F; Mol, Zinzi L; Brug, Johannes; Temminghoff, Marcel; Feunekes, Gerda I; Jansen, Leon; Verhagen, Hans; Seidell, Jacob C

    2009-01-01

    This study aimed to perform a quantitative and qualitative process evaluation of the introduction of the Choices logo, a front-of-pack nutrition logo on products with a favorable product composition, adopted by many food producers, retail and food service organizations, conditionally endorsed by the Dutch government, validated by scientists, and in the process of international dissemination. An online questionnaire was sent to adult consumers 4 months after the introduction of the logo (n = 1,032) and 1 year later (n = 1,127). Additionally, seven consumer focus groups (n = 41) were conducted to provide more insight into the questionnaire responses. Quantitative analyses showed that exposure to the logo had significantly increased. Elderly and obese respondents reported to be more in need of a logo than younger and normal-weight individuals. Women perceived the logo more attractive and credible than men did. Further qualitative analyses indicated that the logo's credibility would improve if it became known that governmental and scientific authorities support it. Elderly respondents indicated that they needed a logo due to health concerns. Consumers interested in health reported that they used the logo. Further research focusing on specific target groups, forming healthful diets, and health outcomes is needed to investigate the effectiveness of the Choices logo. PMID:19851915

  12. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  13. Quantitative Evaluation of the Reticuloendothelial System Function with Dynamic MRI

    PubMed Central

    Liu, Ting; Choi, Hoon; Zhou, Rong; Chen, I-Wei

    2014-01-01

    Purpose To evaluate the reticuloendothelial system (RES) function by real-time imaging of blood clearance as well as hepatic uptake of superparamagnetic iron oxide nanoparticles (SPIO) using dynamic magnetic resonance imaging (MRI) with two-compartment pharmacokinetic modeling. Materials and Methods Kinetics of blood clearance and hepatic accumulation were recorded in young adult male 01b74 athymic nude mice by dynamic T2*-weighted MRI after the injection of different doses of SPIO nanoparticles (0.5, 3 or 10 mg Fe/kg). The association parameter, Kin, dissociation parameter, Kout, and elimination constant, Ke, derived from the dynamic data with a two-compartment model, were used to describe active binding to Kupffer cells and extrahepatic clearance. Clodrosome and liposome treatments were used to deplete macrophages and block RES function, in order to evaluate the ability of the kinetic parameters to probe macrophage function and density. Results The two-compartment model provided a good description for all data and showed a low sum squared residual for all mice (0.27±0.03). A lower Kin, a lower Kout and a lower Ke were found after clodrosome treatment, whereas a lower Kin, a higher Kout and a lower Ke were observed after liposome treatment in comparison to saline treatment (P<0.005). Conclusion Dynamic SPIO-enhanced MR imaging with two-compartment modeling can provide information on RES function at both the cell-number and receptor-function level. PMID:25090653
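
    The kinetic structure described (association Kin, dissociation Kout, elimination Ke) corresponds to a linear two-compartment system. A minimal simulation sketch with hypothetical rate constants (not the fitted values from this study):

        import numpy as np
        from scipy.integrate import solve_ivp

        def two_compartment(t, y, k_in, k_out, k_e):
            """Blood (y[0]) and Kupffer-cell-bound (y[1]) SPIO pools:
            k_in  - association with Kupffer cells,
            k_out - dissociation back into blood,
            k_e   - extrahepatic elimination from blood."""
            c_blood, c_bound = y
            return [-(k_in + k_e) * c_blood + k_out * c_bound,
                    k_in * c_blood - k_out * c_bound]

        # Hypothetical rate constants (1/min), unit initial blood signal
        sol = solve_ivp(two_compartment, (0, 60), [1.0, 0.0],
                        args=(0.15, 0.02, 0.01), dense_output=True)
        t = np.linspace(0, 60, 7)
        print(sol.sol(t))  # blood signal decays as the bound pool fills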

  14. Computerized quantitative evaluation of mammographic accreditation phantom images

    SciTech Connect

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

    Purpose: The objective was to develop and investigate an automated scoring scheme for American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consists of background subtraction, determination of the region of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one image per facility). A medical physicist and two radiologic technologists also scored the images, and the human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fibers, masses, and specks were 90%, 80%, and 98%, respectively. Contingency table analysis revealed a significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may provide a stable assessment of test-object visibility when judging whether a mammographic accreditation phantom image meets the ACR's criteria, although there is room for improvement in the approach for fiber and mass objects.
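
    The fiber/mass classification step relies on Mahalanobis distance to class statistics. A minimal sketch with hypothetical 2-D features (the paper's actual feature set is not specified here):

        import numpy as np

        def mahalanobis(x, mean, cov):
            """Mahalanobis distance of feature vector x from a class with
            the given mean and covariance."""
            d = x - mean
            return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

        def classify(x, classes):
            """Assign x to the nearest class (e.g., 'fiber' vs 'mass')."""
            return min(classes, key=lambda c: mahalanobis(x, *classes[c]))

        # Hypothetical 2D features: (elongation, mean contrast)
        classes = {
            "fiber": (np.array([0.9, 0.2]), np.array([[0.01, 0], [0, 0.02]])),
            "mass":  (np.array([0.2, 0.5]), np.array([[0.02, 0], [0, 0.03]])),
        }
        print(classify(np.array([0.8, 0.25]), classes))  # -> "fiber"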

  15. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
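
    The direct evaluation of a fault tree maps naturally onto objects, one per basic event or gate. A minimal Python illustration of the idea (the original used Flavors on the TI Explorer; independence of basic events is assumed here):

        class Event:
            """Basic event with a failure probability."""
            def __init__(self, p):
                self.p = p
            def prob(self):
                return self.p

        class Gate:
            """AND/OR gate over independent child events."""
            def __init__(self, kind, children):
                self.kind, self.children = kind, children
            def prob(self):
                ps = [c.prob() for c in self.children]
                out = 1.0
                if self.kind == "AND":
                    for p in ps:
                        out *= p
                    return out
                for p in ps:          # OR: complement of all children surviving
                    out *= (1.0 - p)
                return 1.0 - out

        # Hypothetical tree: system fails if (A AND B) OR C
        top = Gate("OR", [Gate("AND", [Event(1e-3), Event(2e-3)]), Event(1e-4)])
        print(top.prob())  # ~1.02e-4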

  16. Effects of Performers' External Characteristics on Performance Evaluations.

    ERIC Educational Resources Information Center

    Bermingham, Gudrun A.

    2000-01-01

    States that fairness has been a major concern in the field of music adjudication. Reviews the research literature to reveal information about three external characteristics (race, gender, and physical attractiveness) that may affect judges' performance evaluations and influence fairness of music adjudication. Includes references. (CMK)

  17. Performance Evaluation of Nano-JASMINE

    NASA Astrophysics Data System (ADS)

    Hatsutori, Y.; Kobayashi, Y.; Gouda, N.; Yano, T.; Murooka, J.; Niwa, Y.; Yamada, Y.

    2011-02-01

    We report the results of the performance evaluation of the first Japanese astrometry satellite, Nano-JASMINE. It is a very small satellite, weighing only 35 kg, that aims to carry out astrometric measurements of nearby bright stars (z ≤ 7.5 mag) with an accuracy of 3 milli-arcseconds. Nano-JASMINE will be launched by a Cyclone-4 rocket in August 2011 from Brazil. A series of performance tests and numerical analyses were conducted. As a result, the engineering model (EM) of the telescope was measured to achieve diffraction-limited performance, confirming that it performs well enough for scientific astrometry.

  18. Quantitative Ultrasonic Evaluation of Mechanical Properties of Engineering Materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength of engineering materials is reviewed. A dormant concept in nondestructive evaluation (NDE) is invoked. The availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions is discussed. It was shown that ultrasonic methods yield measurements of elastic moduli, microstructure, hardness, fracture toughness, tensile strength, yield strength, and shear strength for a wide range of materials (including many types of metals, ceramics, and fiber composites). It was also indicated that although most of these methods were shown feasible in laboratory studies, more work is needed before they can be used on actual parts in processing, assembly, inspection, and maintenance lines.

  19. Towards the quantitative evaluation of visual attention models.

    PubMed

    Bylinskii, Z; DeGennaro, E M; Rajalingham, R; Ruda, H; Zhang, J; Tsotsos, J K

    2015-11-01

    Scores of visual attention models have been developed over the past several decades of research. Differences in implementation, assumptions, and evaluations have made comparison of these models very difficult. Taxonomies have been constructed in an attempt at the organization and classification of models, but are not sufficient at quantifying which classes of models are most capable of explaining available data. At the same time, a multitude of physiological and behavioral findings have been published, measuring various aspects of human and non-human primate visual attention. All of these elements highlight the need to integrate the computational models with the data by (1) operationalizing the definitions of visual attention tasks and (2) designing benchmark datasets to measure success on specific tasks, under these definitions. In this paper, we provide some examples of operationalizing and benchmarking different visual attention tasks, along with the relevant design considerations. PMID:25951756

  20. Improved field experimental designs and quantitative evaluation of aquatic ecosystems

    SciTech Connect

    McKenzie, D.H.; Thomas, J.M.

    1984-05-01

    The paired-station concept and a log-transformed analysis of variance were used to evaluate zooplankton density data collected over five years at an electrical generation station on Lake Michigan. To frame the example and the field design necessary for a valid statistical analysis, considerable background is provided on selecting (1) sampling station pairs, (2) experimentwise error rates for multi-species analyses, (3) levels of Type I and II error rates, and (4) procedures for conducting the field monitoring program, together with a discussion of the consequences of violating statistical assumptions. Details for estimating the sample sizes necessary to detect changes of a specified magnitude are included. Both statistical and biological problems with monitoring programs (as now conducted) are addressed; serial correlation of successive observations in the time series was identified as one principal statistical difficulty. The procedure reduces this problem to a level where statistical methods can be used confidently. 27 references, 4 figures, 2 tables.
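
    The sample-size logic referred to can be illustrated with the standard two-sample normal approximation (the numbers below are illustrative, not the report's tables):

        from scipy.stats import norm

        def n_per_group(sigma, delta, alpha=0.05, power=0.80):
            """Approximate sample size per group to detect a mean difference
            delta with a two-sided two-sample test (normal approximation)."""
            z_a = norm.ppf(1 - alpha / 2)
            z_b = norm.ppf(power)
            n = 2 * (sigma * (z_a + z_b) / delta) ** 2
            return int(n) + 1  # round up

        # Hypothetical: detect a 25% change in (log-transformed) density
        # when the residual SD is 40% of the mean
        print(n_per_group(sigma=0.40, delta=0.25))  # ~41 per group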

  1. Quantitative Evaluation of Strain Near Tooth Fillet by Image Processing

    NASA Astrophysics Data System (ADS)

    Masuyama, Tomoya; Yoshiizumi, Satoshi; Inoue, Katsumi

    The accurate measurement of strain and stress in a tooth is important for the reliable evaluation of gear strength and life. In this research, a strain measurement method based on image processing is applied to the analysis of strain near the tooth fillet. The loaded tooth is photographed using a CCD camera and stored as a digital image. The displacement of points on the tooth flank is tracked by the cross-correlation method, and the strain is then calculated. The interrogation window size of the correlation method and the overlap amount affect the accuracy and resolution. For measurements on structures with complicated profiles such as fillets, the interrogation window must remain large, and the overlap amount should also be large. The surface condition also affects the accuracy; a white-painted surface with fine black particles is suitable for measurement.
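
    The displacement-tracking step is template matching by normalized cross-correlation over an interrogation window. A minimal integer-shift sketch (the window and search sizes are illustrative, and bounds checking is omitted; strain follows as the spatial gradient of the resulting displacement field):

        import numpy as np

        def track_displacement(ref, cur, y, x, win=32, search=8):
            """Track the window centered at (y, x) in 'ref' within 'cur' by
            maximizing normalized cross-correlation over integer shifts."""
            h = win // 2
            template = ref[y - h:y + h, x - h:x + h].astype(float)
            template -= template.mean()
            best, best_dyx = -np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    patch = cur[y - h + dy:y + h + dy,
                                x - h + dx:x + h + dx].astype(float)
                    patch -= patch.mean()
                    denom = np.linalg.norm(template) * np.linalg.norm(patch)
                    score = (template * patch).sum() / denom if denom else -np.inf
                    if score > best:
                        best, best_dyx = score, (dy, dx)
            return best_dyx  # displacement in pixels

        # Hypothetical check: shift a synthetic speckle image by (2, 3) pixels
        rng = np.random.default_rng(1)
        ref = rng.random((128, 128))
        cur = np.roll(ref, (2, 3), axis=(0, 1))
        print(track_displacement(ref, cur, 64, 64))  # -> (2, 3)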

  2. Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment

    NASA Technical Reports Server (NTRS)

    Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.

    1979-01-01

    The paper reviews the scale of fatigue crack phenomena in relation to the size-detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena, including the interaction of domains and inclusions and the influence of stress and magnetic field on domains, are discussed. Experimental results indicate that simplified calculations can be used to predict many of their features, although the predictions of analytic models based on finite element computer analysis do not agree with the experiments in certain respects. Experimental results obtained on rod-type fatigue specimens, which relate magnetic measurements to crack opening displacement, crack volume, and crack depth, should provide methods for improved crack characterization in relation to fracture mechanics and life prediction.

  3. A Quantitative Evaluation of Medication Histories and Reconciliation by Discipline

    PubMed Central

    Stewart, Michael R.; Fogg, Sarah M.; Schminke, Brandon C.; Zackula, Rosalee E.; Nester, Tina M.; Eidem, Leslie A.; Rosendale, James C.; Ragan, Robert H.; Bond, Jack A.; Goertzen, Kreg W.

    2014-01-01

    Background/Objective: Medication reconciliation at transitions of care decreases medication errors, hospitalizations, and adverse drug events. We compared inpatient medication histories and reconciliation across disciplines and evaluated the nature of discrepancies. Methods: We conducted a prospective cohort study of patients admitted from the emergency department at our 760-bed hospital. Eligible patients had their medication histories conducted and reconciled in order by the admitting nurse (RN), certified pharmacy technician (CPhT), and pharmacist (RPh). Discharge medication reconciliation was not altered. Admission and discharge discrepancies were categorized by discipline, error type, and drug class and were assigned a criticality index score. A discrepancy rating system systematically measured discrepancies. Results: Of 175 consented patients, 153 were evaluated. Total admission and discharge discrepancies were 1,461 and 369, respectively. The average number of medications per participant at admission was 8.59 (1,314) with 9.41 (1,374) at discharge. Most discrepancies were committed by RNs: 53.2% (777) at admission and 56.1% (207) at discharge. The majority were omitted or incorrect. RNs had significantly higher admission discrepancy rates per medication (0.59) compared with CPhTs (0.36) and RPhs (0.16) (P < .001). RPhs corrected significantly more discrepancies per participant than RNs (6.39 vs 0.48; P < .001); average criticality index reduction was 79.0%. Estimated prevented adverse drug events (pADEs) cost savings were $589,744. Conclusions: RPhs committed the fewest discrepancies compared with RNs and CPhTs, resulting in more accurate medication histories and reconciliation. RPh involvement also prevented the greatest number of medication errors, contributing to considerable pADE-related cost savings. PMID:25477614

  4. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature identification during diagnosis. A quantitative and simple distortion evaluation method is therefore needed by both the endoscope industry and medical device regulatory agencies, yet no such method is available. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data with complex mathematical models that are difficult to understand. Commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion, and on this basis we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has a clear physical meaning over the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, it can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
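
    The local magnification idea can be sketched on a single row of grid points: the ratio of image-space to object-space neighbor spacing, normalized at the field center (the distortion polynomial below is a hypothetical stand-in for a real endoscope, and the paper's exact ML definition is not reproduced):

        import numpy as np

        def local_magnification(obj_pts, img_pts, center_idx):
            """Local magnification along one grid row: image-space neighbor
            distance divided by object-space neighbor distance, normalized
            by the value at the field center.
            obj_pts, img_pts: (N, 2) matched grid coordinates."""
            d_obj = np.linalg.norm(np.diff(obj_pts, axis=0), axis=1)
            d_img = np.linalg.norm(np.diff(img_pts, axis=0), axis=1)
            m = d_img / d_obj
            return m / m[center_idx]

        # Hypothetical row of grid dots imaged with barrel-like distortion
        obj = np.stack([np.arange(9, dtype=float), np.zeros(9)], axis=1)
        img = obj.copy()
        img[:, 0] = obj[:, 0] - 0.004 * (obj[:, 0] - 4.0) ** 3  # edges compressed
        print(local_magnification(obj, img, center_idx=4))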

  5. Noninvasive Quantitative Evaluation of the Dentin Layer during Dental Procedures Using Optical Coherence Tomography

    PubMed Central

    Sinescu, Cosmin; Negrutiu, Meda Lavinia; Bradu, Adrian; Duma, Virgil-Florin; Podoleanu, Adrian Gh.

    2015-01-01

    A routine cavity preparation of a tooth may lead to opening the pulp chamber. The present study evaluates drilled cavities quantitatively and in real time during dental procedures, for the first time to the best of our knowledge. An established noninvasive imaging technique, Optical Coherence Tomography (OCT), is used, with the main goal of preventing accidental openings of the dental pulp chamber. Six teeth with dental cavities were used in this ex vivo study. The real-time assessment of the distances between the bottom of the drilled cavities and the top of the pulp chamber was performed using an in-house assembled OCT system. Evaluating the remaining dentin thickness (RDT) allowed the drilling tools to be positioned in the cavities in relation to the pulp horns. Estimates were made of the safe and the critical RDT; at the latter, opening of the pulp chamber becomes unavoidable. Also, by following the fractures that can occur when the extent of the decay is too large, the dentist can decide upon the right therapy to follow, endodontic or conventional filling. The study demonstrates the usefulness of OCT imaging in guiding such evaluations during dental procedures. PMID:26078779

  6. Quantitative evaluation of automatic methods for lesions detection in breast ultrasound images

    NASA Astrophysics Data System (ADS)

    Marcomini, Karem D.; Schiabel, Homero; Carneiro, Antonio Adilton O.

    2013-02-01

    Ultrasound (US) is a useful diagnostic tool for distinguishing benign from malignant breast masses, providing more detailed evaluation in dense breasts. Because of the subjectivity of image interpretation, computer-aided diagnosis (CAD) schemes have been developed, extending the mammography analysis process to include ultrasound images as complementary exams. Since the most important tasks in evaluating such images are mass detection and contour interpretation, automated segmentation techniques have been investigated in order to determine a suitable procedure for this analysis. The main goal of this work is thus to investigate the effect of several processing techniques on the detection of suspicious breast lesions and their accurate boundaries in ultrasound images. In tests, 80 phantom and 50 clinical ultrasound images were preprocessed and 5 segmentation techniques were evaluated. Using quantitative evaluation metrics, the results were compared to reference images delineated by an experienced radiologist. A self-organizing map artificial neural network provided the most relevant results, demonstrating high accuracy and a low error rate in representing the lesions, and was hence chosen as the segmentation step for US images in our CAD scheme under test.
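
    Typical quantitative evaluation metrics for such comparisons are overlap measures against the radiologist's delineation. A minimal sketch with synthetic masks (the paper's exact metrics are not specified here):

        import numpy as np

        def dice_and_jaccard(seg, ref):
            """Overlap metrics between a binary segmentation and the
            reference delineation."""
            seg, ref = seg.astype(bool), ref.astype(bool)
            inter = np.logical_and(seg, ref).sum()
            union = np.logical_or(seg, ref).sum()
            dice = 2.0 * inter / (seg.sum() + ref.sum())
            return dice, inter / union

        seg = np.zeros((8, 8), int); seg[2:6, 2:6] = 1
        ref = np.zeros((8, 8), int); ref[3:7, 3:7] = 1
        print(dice_and_jaccard(seg, ref))  # -> (0.5625, ~0.391)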

  7. SAT-M Performance of Women Intending Quantitative Fields of Study.

    ERIC Educational Resources Information Center

    Ethington, Corinna A.

    This study assessed patterns of differences in quantitative performance across groups of intended undergraduate majors consistent with those previously found for students who had completed their undergraduate study. Data were drawn from the College Board Admissions Testing Program's national sample of 10,000 college-bound high school seniors in…

  8. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  9. Quantitative evaluation of proximal contacts in posterior composite restorations. Part I. Methodology.

    PubMed

    Wang, J C; Hong, J M

    1989-07-01

    An in vivo method for quantitatively measuring intertooth distance before and after placement of a Class 2 composite resin restoration has been developed. A Kaman Sciences KD-2611 non-contact displacement measuring system with a 1U unshielded sensor, based upon the variable resistance of eddy currents, was used for the intraoral measurement. Quantitative evaluation of proximal wear can therefore be made preoperatively, postoperatively, and at subsequent recall intervals for posterior composite resin restorations. PMID:2810447

  10. Smith Newton Vehicle Performance Evaluation (Brochure)

    SciTech Connect

    Not Available

    2012-08-01

    The Fleet Test and Evaluation Team at the U.S. Department of Energy's National Renewable Energy Laboratory is evaluating and documenting the performance of electric and plug-in hybrid electric drive systems in medium-duty trucks across the nation. Through this project, Smith Electric Vehicles will build and deploy 500 all-electric medium-duty trucks. The trucks will be deployed in diverse climates across the country.

  11. A new method to evaluate human-robot system performance

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Weisbin, C. R.

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  12. Developmental modeling effects on the quantitative and qualitative aspects of motor performance.

    PubMed

    McCullagh, P; Stiehl, J; Weiss, M R

    1990-12-01

    The purpose of the present experiment was to replicate and extend previous developmental modeling research by examining the qualitative as well as quantitative aspects of motor performance. Eighty females of two age groups (5-0 to 6-6 and 7-6 to 9-0 years) were randomly assigned to conditions within a 2 x 2 x 2 (Age x Model Type x Rehearsal) factorial design. Children received either verbal instructions only (no model) or a visual demonstration with experimenter-given verbal cues (verbal model) of a five-part dance skill sequence. Children were either prompted to verbally rehearse before skill execution or merely asked to reproduce the sequence without prompting. Both quantitative (order) and qualitative (form) performances were assessed. Results revealed a significant age main effect for both order and form performance, with older children performing better than younger children. A model type main effect was also found for both order and form performance. The verbal model condition produced better qualitative performance, whereas the no model condition resulted in better quantitative scores. These results are discussed in terms of differential coding strategies that may influence task components in modeling. PMID:2132893

  13. Quantitative Fundus Autofluorescence for the Evaluation of Retinal Diseases.

    PubMed

    Armenti, Stephen T; Greenberg, Jonathan P; Smith, R Theodore

    2016-01-01

    The retinal pigment epithelium (RPE) is juxtaposed to the overlying sensory retina, and supports the function of the visual system. Among the tasks performed by the RPE are phagocytosis and processing of outer photoreceptor segments through lysosome-derived organelles. These degradation products, stored and referred to as lipofuscin granules, are composed partially of bisretinoids, which have broad fluorescence absorption and emission spectra that can be detected clinically as fundus autofluorescence with confocal scanning laser ophthalmoscopy (cSLO). Lipofuscin accumulation is associated with increasing age, but is also found in various patterns in both acquired and inherited degenerative diseases of the retina. Thus, studying its pattern of accumulation and correlating such patterns with changes in the overlying sensory retina are essential to understanding the pathophysiology and progression of retinal disease. Here, we describe a technique employed by our lab and others that uses cSLO in order to quantify the level of RPE lipofuscin in both healthy and diseased eyes. PMID:27023389

  14. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  15. Smith Newton Vehicle Performance Evaluation - Cumulative (Brochure)

    SciTech Connect

    Not Available

    2014-08-01

    The Fleet Test and Evaluation Team at the U.S. Department of Energy's National Renewable Energy Laboratory is evaluating and documenting the performance of electric and plug-in hybrid electric drive systems in medium-duty trucks across the nation. U.S. companies participating in this evaluation project received funding from the American Recovery and Reinvestment Act to cover part of the cost of purchasing these vehicles. Through this project, Smith Electric Vehicles is building and deploying 500 all-electric medium-duty trucks that will be deployed by a variety of companies in diverse climates across the country.

  16. Performance evaluations of demountable electrical connections

    SciTech Connect

    Niemann, R.C.; Cha, Y.S.; Hull, J.R.; Buckles, W.E.; Daugherty, M.A.

    1993-07-01

    Electrical conductors operating in cryogenic environments can require demountable connections along their lengths. The connections must have low resistance and high reliability and should allow ready assembly and disassembly. In this work, the performance of two types of connections has been evaluated. The first connection type is a clamped surface-to-surface joint. The second connection type is a screwed joint that incorporates male and female machine-thread components. The connections for copper conductors have been evaluated experimentally at 77 K. Experimental variables included thread surface treatment and assembly methods. The results of the evaluations are presented.

  17. A quantitative evaluation of dry-sensor electroencephalography

    NASA Astrophysics Data System (ADS)

    Uy, E. Timothy

    Neurologists, neuroscientists, and experimental psychologists study electrical activity within the brain by recording voltage fluctuations at the scalp. This is electroencephalography (EEG). In conventional or "wet" EEG, scalp abrasion and the use of electrolytic paste are required to ensure a good electrical connection between sensor and skin. Repeated abrasion quickly becomes irritating to subjects, severely limiting the number and frequency of sessions. Several groups have produced "dry" EEG sensors that do not require abrasion or conductive paste. These, in addition to sidestepping the issue of abrasion, promise to reduce setup time from about 30 minutes with a technician to less than 30 seconds without one. The availability of such an instrument would (1) reduce the cost of brain-related medical care, (2) lower the barrier to entry for brain experimentation, and (3) allow individual subjects to contribute substantially more data without fear of abrasion or fatigue. Accuracy of the EEG is paramount in the medical diagnosis of epilepsy, in experimental psychology, and in the burgeoning field of brain-computer interfaces. Without a sufficiently accurate measurement, the advantages of dry sensors remain a moot point. However, even after nearly a decade, demonstrations of dry EEG accuracy with respect to wet EEG have been limited to visual comparisons of short snippets of spontaneous EEG, averaged event-related potentials, or plots of the power spectrum. In this dissertation, I propose a detailed methodology based on single-trial EEG classification for comparing dry EEG sensors to their wet counterparts. Applied to a set of commercially fabricated dry sensors, this work reveals that dry sensors can perform as well as their wet counterparts, given careful screening and attention to the bandwidth of interest.
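
    A minimal sketch of the single-trial classification methodology this dissertation proposes for comparing sensor types, assuming synthetic feature matrices stand in for recorded wet- and dry-sensor epochs; the trial counts, noise scales, and logistic-regression classifier are illustrative assumptions, not the author's actual pipeline.

    ```python
    # Compare "wet" vs. "dry" EEG sensors by cross-validated single-trial
    # classification accuracy (synthetic stand-in data, not real recordings).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_features = 200, 32
    labels = rng.integers(0, 2, n_trials)      # two experimental conditions

    def synthetic_epochs(noise_scale):
        # Class-dependent features plus sensor noise (placeholder for real epochs).
        signal = np.outer(labels, np.linspace(1.0, 0.2, n_features))
        return signal + noise_scale * rng.standard_normal((n_trials, n_features))

    wet = synthetic_epochs(noise_scale=1.0)    # assumed lower-noise reference
    dry = synthetic_epochs(noise_scale=1.3)    # assumed noisier dry sensors

    for name, X in [("wet", wet), ("dry", dry)]:
        acc = cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5)
        print(f"{name} sensors: mean single-trial accuracy = {acc.mean():.3f}")
    ```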

  18. Simulation-based evaluation of the resolution and quantitative accuracy of temperature-modulated fluorescence tomography

    PubMed Central

    Lin, Yuting; Nouizi, Farouk; Kwong, Tiffany C.; Gulsen, Gultekin

    2016-01-01

    Conventional fluorescence tomography (FT) can recover the distribution of fluorescent agents within a highly scattering medium. However, poor spatial resolution remains its foremost limitation. Previously, we introduced a new fluorescence imaging technique termed “temperature-modulated fluorescence tomography” (TM-FT), which provides high-resolution images of fluorophore distribution. TM-FT is a multimodality technique that combines fluorescence imaging with focused ultrasound to locate thermo-sensitive fluorescence probes using a priori spatial information to drastically improve the resolution of conventional FT. In this paper, we present an extensive simulation study to evaluate the performance of the TM-FT technique on complex phantoms with multiple fluorescent targets of various sizes located at different depths. In addition, the performance of the TM-FT is tested in the presence of background fluorescence. The results obtained using our new method are systematically compared with those obtained with the conventional FT. Overall, TM-FT provides higher resolution and superior quantitative accuracy, making it an ideal candidate for in vivo preclinical and clinical imaging. For example, a 4 mm diameter inclusion positioned in the middle of a synthetic slab geometry phantom (D: 40 mm × W: 100 mm) is recovered as an elongated object in the conventional FT (x = 4.5 mm; y = 10.4 mm), while TM-FT recovers it successfully in both directions (x = 3.8 mm; y = 4.6 mm). As a result, the quantitative accuracy of the TM-FT is superior because it recovers the concentration of the agent with a 22% error, which is in contrast with the 83% error of the conventional FT. PMID:26368884

  19. Prospective safety performance evaluation on construction sites.

    PubMed

    Wu, Xianguo; Liu, Qian; Zhang, Limao; Skibniewski, Miroslaw J; Wang, Yanhong

    2015-05-01

    This paper presents a systematic Structural Equation Modeling (SEM) based approach for Prospective Safety Performance Evaluation (PSPE) on construction sites, with causal relationships and interactions between enablers and the goals of PSPE taken into account. Based on a sample of 450 valid questionnaire surveys from 30 Chinese construction enterprises, a SEM model with 26 items for PSPE in the context of the Chinese construction industry is established and then verified through goodness-of-fit testing. Three typical types of construction enterprises, namely the state-owned enterprise, the private enterprise, and the Sino-foreign joint venture, are selected as samples to measure the level of safety performance, given that enterprise scale, ownership, and business strategy differ. Results provide a full understanding of safety performance practice in the construction industry, and indicate that overall safety performance on working sites is rated at level III (Fair) or above. This can be explained by the fact that the construction industry has gradually matured with respect to norms, and construction enterprises must improve their safety performance so as not to be eliminated from the government-led construction industry. The differences in safety performance practice among the construction enterprise categories are compared and analyzed according to the evaluation results. This research provides insights into cause-effect relationships among safety performance factors and goals, which, in turn, can facilitate the improvement of safety performance in the construction industry. PMID:25746166

  1. Hypersonic Interceptor Performance Evaluation Center aero-optics performance predictions

    NASA Astrophysics Data System (ADS)

    Sutton, George W.; Pond, John E.; Snow, Ronald; Hwang, Yanfang

    1993-06-01

    This paper describes the Hypersonic Interceptor Performance Evaluation Center's (HIPEC) aero-optics performance prediction capability. It includes code results for three-dimensional shapes and comparisons to initial experiments. HIPEC consists of a collection of aerothermal and aerodynamic computational codes capable of covering the entire flight regime from subsonic to hypersonic flow, including chemical reactions and turbulence. Heat transfer to the various surfaces is calculated as an input to cooling and ablation processes. HIPEC also has aero-optics codes to determine the effect of the mean flowfield and turbulence on the tracking and imaging capability of on-board optical sensors. The paper concentrates on these latter aspects.

  2. Evaluating Performance Portability of OpenACC

    SciTech Connect

    Sabne, Amit J; Sakdhnagool, Putt; Lee, Seyong; Vetter, Jeffrey S

    2015-01-01

    Accelerator-based heterogeneous computing is gaining momentum in the High Performance Computing arena. However, the increased complexity of accelerator architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle the problem. While the abstraction endowed by OpenACC offers productivity, it raises questions about its portability. This paper evaluates the performance portability obtained by OpenACC on twelve OpenACC programs on NVIDIA CUDA, AMD GCN, and Intel MIC architectures. We study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.

  3. ASBESTOS IN DRINKING WATER PERFORMANCE EVALUATION STUDIES

    EPA Science Inventory

    Performance evaluations of laboratories testing for asbestos in drinking water according to USEPA Test Method 100.1 or 100.2 are complicated by the difficulty of providing stable sample dispersions of asbestos in water. Reference samples of a graduated series of chrysotile asbes...

  4. EVALUATION OF CONFOCAL MICROSCOPY SYSTEM PERFORMANCE

    EPA Science Inventory

    BACKGROUND. The confocal laser scanning microscope (CLSM) has enormous potential in many biological fields. Currently there is a subjective nature in the assessment of a confocal microscope's performance by primarily evaluating the system with a specific test slide provided by ea...

  5. A New Approach to Evaluating Performance.

    PubMed

    Bleich, Michael R

    2016-09-01

    A leadership task is evaluating the performance of individuals for organizational fit. Traditional approaches have included leader-subordinate reviews, self-review, and peer review. A new approach, introduced in this article, is evolving in team-based organizations. J Contin Educ Nurs. 2016;47(9):393-394. PMID:27580504

  6. Optical Storage Performance Modeling and Evaluation.

    ERIC Educational Resources Information Center

    Behera, Bailochan; Singh, Harpreet

    1990-01-01

    Evaluates different types of storage media for long-term archival storage of large amounts of data. Existing storage media are reviewed, including optical disks, optical tape, magnetic storage, and microfilm; three models are proposed based on document storage requirements; performance analysis is considered; and cost effectiveness is discussed.…

  7. GENERAL METHODS FOR REMEDIAL PERFORMANCE EVALUATIONS

    EPA Science Inventory

    This document was developed by an EPA-funded project to explain technical considerations and principles necessary to evaluate the performance of ground-water contamination remediations at hazardous waste sites. This is neither a "cookbook", nor an encyclopedia of recommended fi...

  8. PERFORMANCE EVALUATION OF AN IMPROVED STREET SWEEPER

    EPA Science Inventory

    The report gives results of an extensive evaluation of the Improved Street Sweeper (ISS) in Bellevue, WA, and in San Diego, CA. The cleaning performance of the ISS was compared with that of broom sweepers and a vacuum sweeper. The ISS cleaned streets better than the other sweeper...

  9. Quantitative evaluation of 3D dosimetry for stereotactic volumetric-modulated arc delivery using COMPASS.

    PubMed

    Vikraman, Subramani; Manigandan, Durai; Karrthick, Karukkupalayam Palaniappan; Sambasivaselli, Raju; Senniandavar, Vellaingiri; Ramu, Mahendran; Rajesh, Thiyagarajan; Lutz, Muller; Muthukumaran, Manavalan; Karthikeyan, Nithyanantham; Tejinder, Kataria

    2015-01-01

    The purpose of this study was to quantitatively evaluate the patient-specific 3D dosimetry tool COMPASS with the 2D array MatriXX detector for stereotactic volumetric-modulated arc delivery. Twenty-five patients' CT images and RT structures from different sites (brain, head & neck, thorax, abdomen, and spine) were taken from the CyberKnife Multiplan planning system for this study. All these patients underwent radical stereotactic treatment with CyberKnife. For each patient, linac-based volumetric-modulated arc therapy (VMAT) stereotactic plans were generated in Monaco TPS v3.1 using the Elekta Beam Modulator MLC. Dose prescription was in the range of 5-20 Gy per fraction. Target prescriptions and critical organ constraints were matched to the delivered treatment plans as closely as possible. Plan quality was analyzed using the conformity index (CI), conformity number (CN), gradient index (GI), target coverage (TC), and dose to 95% of volume (D95). Monaco Monte Carlo (MC)-calculated treatment plan delivery accuracy was quantitatively evaluated against the COMPASS-calculated (CCA) dose and the COMPASS indirectly measured (CME) dose based on dose-volume histogram metrics. To ascertain the potential of COMPASS 3D dosimetry for stereotactic plan delivery, 2D fluence verification was performed with MatriXX using the MultiCube phantom. Routine quality assurance of absolute point dose verification was performed to check the overall delivery accuracy. Quantitative analyses of dose delivery verification were compared against pass/fail criteria of 3 mm distance-to-agreement and 3% dose difference. The gamma passing rate was compared with 2D fluence verification from MatriXX with MultiCube. Both the COMPASS dose reconstructed from measured fluence and the COMPASS computed dose showed very good agreement with the TPS-calculated dose. Each plan was evaluated based on dose-volume parameters for target volumes such as dose at 95% of volume (D95) and average dose. For critical organs, dose at 20% of volume (D20), dose
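
    Since this record leans on 3%/3 mm gamma pass rates, a simplified sketch may help readers unfamiliar with the metric. This is a 1-D global gamma computation under stated assumptions (dose difference normalized to the reference maximum, invented Gaussian profiles); clinical COMPASS/MatriXX comparisons are 2-D or 3-D.

    ```python
    # Simplified 1-D global gamma analysis with 3%/3 mm criteria.
    import numpy as np

    def gamma_pass_rate(dose_ref, dose_eval, positions, dd=0.03, dta=3.0):
        norm = dd * dose_ref.max()              # global dose-difference norm
        gammas = []
        for x, d in zip(positions, dose_eval):
            dist2 = ((positions - x) / dta) ** 2
            dose2 = ((dose_ref - d) / norm) ** 2
            gammas.append(np.sqrt((dist2 + dose2).min()))
        return 100.0 * np.mean(np.asarray(gammas) <= 1.0)

    x = np.linspace(-50, 50, 201)                        # position (mm)
    ref = np.exp(-x**2 / (2 * 15**2))                    # reference profile
    meas = 1.02 * np.exp(-(x - 0.5)**2 / (2 * 15**2))    # shifted, rescaled copy
    print(f"gamma pass rate: {gamma_pass_rate(ref, meas, x):.1f}%")
    ```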

  10. Performance evaluation of two personal bioaerosol samplers.

    PubMed

    Tolchinsky, Alexander D; Sigaev, Vladimir I; Varfolomeev, Alexander N; Uspenskaya, Svetlana N; Cheng, Yung S; Su, Wei-Chung

    2011-01-01

    In this study, the performance of two newly developed personal bioaerosol samplers for monitoring the level of environmental and occupational airborne microorganisms was evaluated. These new personal bioaerosol samplers were designed around a swirling cyclone with a recirculating liquid film. The performance evaluation included collection efficiency tests using inert aerosols, a bioaerosol survival test using viable airborne microorganisms, and an evaluation of non-aqueous collection liquid for long-period sampling. The test results showed that the two newly developed personal bioaerosol samplers are capable of high-efficiency aerosol sampling (cutoff diameters around 0.7 μm for both samplers) and provide acceptable survival of the collected bioaerosols. With an appropriate non-aqueous collection liquid, these two personal bioaerosol samplers should permit continuous, long-period bioaerosol sampling while maintaining considerable viability of the captured bioaerosols. PMID:22175872

  11. Performance Evaluation of Dense Gas Dispersion Models.

    NASA Astrophysics Data System (ADS)

    Touma, Jawad S.; Cox, William M.; Thistle, Harold; Zapert, James G.

    1995-03-01

    This paper summarizes the results of a study to evaluate the performance of seven dense gas dispersion models using data from three field experiments. Two models (DEGADIS and SLAB) are in the public domain and the other five (AIRTOX, CHARM, FOCUS, SAFEMODE, and TRACE) are proprietary. The field data used are the Desert Tortoise pressurized ammonia releases, Burro liquefied natural gas spill tests, and the Goldfish anhydrous hydrofluoric acid spill experiments. Desert Tortoise and Goldfish releases were simulated as horizontal jet releases, and Burro as a liquid pool. Performance statistics were used to compare maximum observed concentrations and plume half-width to those predicted by each model. Model performance varied and no model exhibited consistently good performance across all three databases. However, when combined across the three databases, all models performed within a factor of 2. Problems encountered are discussed in order to help future investigators.
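
    For readers outside the dispersion-modeling community, two of the standard performance statistics behind conclusions like "within a factor of 2" can be written in a few lines. The paired values below are invented placeholders, not the Desert Tortoise, Burro, or Goldfish data.

    ```python
    # Fractional bias (FB) and fraction within a factor of two (FAC2).
    import numpy as np

    observed  = np.array([12.0, 30.0, 7.5, 55.0, 20.0])   # max concentrations
    predicted = np.array([10.0, 41.0, 6.0, 80.0, 26.0])

    fb = 2 * (observed.mean() - predicted.mean()) / (observed.mean() + predicted.mean())
    ratio = predicted / observed
    fac2 = np.mean((ratio >= 0.5) & (ratio <= 2.0))
    print(f"FB = {fb:+.2f}, FAC2 = {fac2:.0%}")
    ```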

  12. 40 CFR 63.5850 - How do I conduct performance tests, performance evaluations, and design evaluations?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... procedures in EPA Method 3B of appendix A to 40 CFR part 60 to determine an oxygen correction factor if... performance test, performance evaluation, and design evaluation in 40 CFR part 63, subpart SS, that applies to... requirements in § 63.7(e)(1) and under the specific conditions that 40 CFR part 63, subpart SS, specifies....

  13. 40 CFR 63.5850 - How do I conduct performance tests, performance evaluations, and design evaluations?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... procedures in EPA Method 3B of appendix A to 40 CFR part 60 to determine an oxygen correction factor if... performance test, performance evaluation, and design evaluation in 40 CFR part 63, subpart SS, that applies to... requirements in § 63.7(e)(1) and under the specific conditions that 40 CFR part 63, subpart SS, specifies....

  14. 40 CFR 63.5850 - How do I conduct performance tests, performance evaluations, and design evaluations?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... procedures in EPA Method 3B of appendix A to 40 CFR part 60 to determine an oxygen correction factor if... performance test, performance evaluation, and design evaluation in 40 CFR part 63, subpart SS, that applies to... requirements in § 63.7(e)(1) and under the specific conditions that 40 CFR part 63, subpart SS, specifies....

  15. Quantitative evaluation of a low-cost noninvasive hybrid interface based on EEG and eye movement.

    PubMed

    Kim, Minho; Kim, Byung Hyung; Jo, Sungho

    2015-03-01

    This paper describes a low-cost noninvasive brain-computer interface (BCI) hybridized with eye tracking, and discusses its feasibility through a Fitts' law-based quantitative evaluation method. Noninvasive BCI has recently received a lot of attention. To bring BCI applications into real life, user-friendly and easily portable devices need to be provided. In this work, as an approach to realizing a real-world BCI, an electroencephalography (EEG)-based BCI combined with eye tracking is investigated. The two interfaces can be complementary and attain improved performance. To ensure public availability, a low-cost interface device is intentionally used for testing: a low-cost commercial EEG recording device is integrated with an inexpensive custom-built eye tracker. The developed hybrid interface is evaluated through target pointing and selection experiments. Eye movement is interpreted as cursor movement, and the noninvasive BCI selects a cursor point via two selection confirmation schemes. Using Fitts' law, the proposed interface scheme is compared with other interface schemes such as the mouse, eye tracking with dwell time, and eye tracking with keyboard. In addition, the proposed hybrid BCI system is discussed with respect to a practical interface scheme. Although further advancement is required, the proposed hybrid BCI system has the potential to be practically useful in a natural and intuitive manner. PMID:25376041
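
    A brief sketch of the Fitts' law analysis underlying the paper's comparison, assuming the Shannon formulation ID = log2(D/W + 1); the distances, widths, and movement times are invented placeholders, not the study's measurements.

    ```python
    # Fit MT = a + b*ID and report throughput for a pointing interface.
    import numpy as np

    D  = np.array([100, 200, 400, 100, 200, 400])         # target distance (px)
    W  = np.array([ 20,  20,  20,  40,  40,  40])         # target width (px)
    MT = np.array([0.62, 0.78, 0.95, 0.48, 0.66, 0.80])   # movement time (s)

    ID = np.log2(D / W + 1)                 # Shannon index of difficulty (bits)
    b, a = np.polyfit(ID, MT, 1)            # slope b (s/bit), intercept a (s)
    print(f"a = {a:.3f} s, b = {b:.3f} s/bit, throughput ~ {1 / b:.2f} bits/s")
    ```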

  16. Extended-performance thruster technology evaluation

    NASA Technical Reports Server (NTRS)

    Beattie, J. R.; Poeschel, R. L.; Bechtel, R. T.

    1978-01-01

    Two 30-cm ion thruster technology areas are investigated in support of the extended-performance thruster operation required for the Halley's comet rendezvous mission. These areas include an evaluation of the thruster performance and lifetime characteristics at increased specific impulse and power levels, and the design and evaluation of a high-voltage propellant electrical isolator. Experimental results are presented indicating that all elements of the thruster design function well at the higher specific impulse and power levels. It is shown that the only thruster modifications required for extended-performance operation are a respacing of the ion optics assembly and a redesign of the propellant isolators. Experimental results obtained from three isolator designs are presented, and it is concluded that the design and development of a high-voltage isolator is possible using existing technology.

  17. Blind Source Parameters for Performance Evaluation of Despeckling Filters.

    PubMed

    Biradar, Nagashettappa; Dewal, M L; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on traditional parameters such as peak signal-to-noise ratio, mean square error, and the structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics such as the speckle suppression index (SSI), the speckle suppression and mean preservation index (SMPI), and the beta metric. The need for a noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet-based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable, whereas the median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images. PMID:27298618
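
    The speckle suppression index named above has a compact definition: the coefficient of variation of the filtered image divided by that of the noisy image. A minimal sketch under stated assumptions (synthetic multiplicative speckle, and a 3x3 median filter standing in for the eleven filters studied):

    ```python
    # Speckle suppression index: values below 1 indicate speckle reduction.
    import numpy as np
    from scipy.ndimage import median_filter

    rng = np.random.default_rng(1)
    clean = np.tile(np.linspace(50, 200, 64), (64, 1))
    noisy = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)  # speckle
    filtered = median_filter(noisy, size=3)

    def ssi(filtered_img, noisy_img):
        cv = lambda img: img.std() / img.mean()   # coefficient of variation
        return cv(filtered_img) / cv(noisy_img)

    print(f"SSI = {ssi(filtered, noisy):.3f} (< 1 means speckle suppressed)")
    ```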

  18. PEAPOL (Program Evaluation at the Performance Objective Level) Outside Evaluation.

    ERIC Educational Resources Information Center

    Auvil, Mary S.

    In evaluating this pilot project, which developed a computer system for assessing student progress and cost effectiveness as related to achievement of performance objectives, interviews were conducted with project participants, including project staff, school administrators, and the auto shop instructors. Project documents were reviewed and a…

  1. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

    Atherosclerosis is a primary cause of critical ischemic diseases such as heart infarction and stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that could evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band-pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber with a 0.7 mm outer diameter, and an irradiation fiber consisting of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected in an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of the atherosclerotic plaque phantoms based on lipid volume fraction was demonstrated down to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
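
    The spectral angle mapper step named in this record reduces each pixel's 3-band spectrum to an angle against a reference spectrum. A minimal sketch, assuming invented reflectance values for the 1150/1200/1300 nm bands and an arbitrary angle threshold:

    ```python
    # Spectral angle mapper over a toy 3-band image cube.
    import numpy as np

    def spectral_angle(pixel, reference):
        cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
        return np.arccos(np.clip(cos, -1.0, 1.0))    # angle in radians

    reference_lipid = np.array([0.42, 0.18, 0.35])   # assumed band reflectances
    cube = np.random.default_rng(2).uniform(0.1, 0.6, size=(4, 4, 3))

    angles = np.apply_along_axis(spectral_angle, 2, cube, reference_lipid)
    lipid_mask = angles < 0.15                       # assumed threshold (rad)
    print(angles.round(2), lipid_mask, sep="\n")
    ```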

  2. Quantitative phase evaluation of dynamic changes on cell membrane during laser microsurgery.

    PubMed

    Yu, Lingfeng; Mohanty, Samarendra; Liu, Gangjun; Genc, Suzanne; Chen, Zhongping; Berns, Michael W

    2008-01-01

    The ability to inject exogenous material as well as to alter subcellular structures in a minimally invasive manner using a laser microbeam has been useful for cell biologists studying the structure-function relationship in complex biological systems. We describe a quantitative phase laser microsurgery system that takes advantage of the combination of laser microirradiation and short-coherence interference microscopy. Using this method, quantitative phase images and the dynamic changes of phase during laser microsurgery of red blood cells (RBCs) can be evaluated in real time. This system enables absolute quantitation of localized alteration/damage to transparent phase objects, such as the cell membrane or intracellular structures, exposed to the laser microbeam. Such quantitation was not possible using conventional phase-contrast microscopy. PMID:19021378

  3. Evaluation testbed for ATD performance prediction (ETAPP)

    NASA Astrophysics Data System (ADS)

    Ralph, Scott K.; Eaton, Ross; Snorrason, Magnús; Irvine, John; Vanstone, Steve

    2007-04-01

    Automatic target detection (ATD) systems process imagery to detect and locate targets in support of a variety of military missions. Accurate prediction of ATD performance would assist in system design and trade studies, collection management, and mission planning. A need exists for ATD performance prediction based exclusively on information available from the imagery and its associated metadata. We present a predictor based on image measures quantifying the intrinsic ATD difficulty of an image. The modeling effort consists of two phases: a learning phase, where image measures are computed for a set of test images, ATD performance is measured, and a prediction model is developed; and a second phase to test and validate the performance prediction. The learning phase produces a mapping, valid across various ATR algorithms, which is applicable even when no image truth is available (e.g., when evaluating denied-area imagery). The testbed has plug-in capability to allow rapid evaluation of new ATR algorithms. The image measures employed in the model include statistics derived from a constant false alarm rate (CFAR) processor, the power spectrum signature, and others. We present performance predictors for two trained ATD classifiers, one constructed using GENIE Pro, a tool developed at Los Alamos National Laboratory, and the other using eCognition, developed by Definiens (http://www.definiens.com/products). We present analyses of the two performance predictions and compare the underlying prediction models. The paper concludes with a discussion of future research.
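
    One of the image measures this testbed draws on is a CFAR statistic. A sketch of 1-D cell-averaging CFAR under stated assumptions: the guard/reference window sizes and the threshold factor are illustrative choices, and this is not the testbed's actual implementation.

    ```python
    # Cell-averaging CFAR: compare each cell with the mean of reference cells.
    import numpy as np

    def ca_cfar_stat(signal, guard=2, ref=8):
        stats = np.zeros_like(signal)
        for i in range(len(signal)):
            lo = max(0, i - guard - ref)
            hi = min(len(signal), i + guard + ref + 1)
            window = np.r_[signal[lo:max(0, i - guard)], signal[i + guard + 1:hi]]
            if window.size:
                stats[i] = signal[i] / window.mean()
        return stats

    rng = np.random.default_rng(3)
    clutter = rng.exponential(1.0, 200)
    clutter[100] += 12.0                                      # embedded "target"
    detections = np.flatnonzero(ca_cfar_stat(clutter) > 5.0)  # assumed factor
    print("detections at indices:", detections)
    ```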

  4. Error Reduction Program. [combustor performance evaluation codes

    NASA Technical Reports Server (NTRS)

    Syed, S. A.; Chiappetta, L. M.; Gosman, A. D.

    1985-01-01

    The details of a study to select, incorporate, and evaluate the best available finite difference scheme to reduce numerical error in combustor performance evaluation codes are described. The combustor performance computer programs chosen were the two-dimensional and three-dimensional versions of Pratt & Whitney's TEACH code. The criteria used to select schemes required that the difference equations mirror the properties of the governing differential equation, be more accurate than the current hybrid difference scheme, be stable and economical, be compatible with TEACH codes, use only modest amounts of additional storage, and be relatively simple. The methods of assessment used in the selection process consisted of examination of the difference equation, evaluation of the properties of the coefficient matrix, Taylor series analysis, and performance on model problems. Five schemes from the literature and three schemes developed during the course of the study were evaluated. This effort resulted in the incorporation of a scheme in 3D-TEACH which is usually more accurate than the hybrid differencing method and never less accurate.

  5. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139
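
    Several of the reviewed biomarkers (T2, T1rho) reduce to per-voxel relaxometry fits. A minimal sketch of a mono-exponential T2 fit, assuming synthetic echo times and signals rather than real multi-echo MRI data:

    ```python
    # Fit S(TE) = S0 * exp(-TE / T2) to multi-echo signal samples.
    import numpy as np
    from scipy.optimize import curve_fit

    TE = np.array([10., 20., 30., 40., 50., 60.])   # echo times (ms)
    true_S0, true_T2 = 1200.0, 45.0
    signal = true_S0 * np.exp(-TE / true_T2)
    signal = signal + np.random.default_rng(4).normal(0, 10, TE.size)

    model = lambda te, s0, t2: s0 * np.exp(-te / t2)
    popt, _ = curve_fit(model, TE, signal, p0=(1000.0, 40.0))
    print(f"fitted S0 = {popt[0]:.0f}, T2 = {popt[1]:.1f} ms")
    ```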

  6. Detection and quantitation of HBV DNA in miniaturized samples: multi centre study to evaluate the performance of the COBAS ® AmpliPrep/COBAS ® TaqMan ® hepatitis B virus (HBV) test v2.0 by the use of plasma or serum specimens.

    PubMed

    Berger, Annemarie; Gohl, Peter; Stürmer, Martin; Rabenau, Holger Felix; Nauck, Markus; Doerr, Hans Wilhelm

    2010-11-01

    Laboratory analysis of blood specimens is an increasingly important tool for rapid diagnosis and control of therapy. Miniaturization of test systems is therefore needed, but reduced specimen volumes might impair test quality. For rapid detection and quantitation of HBV DNA, the COBAS(®) AmpliPrep/COBAS(®) TaqMan(®) HBV test has proved a robust instrument in routine diagnostic services. The test system has recently been modified for application to reduced samples of blood plasma and to blood serum. The performance of this modified COBAS(®) AmpliPrep/COBAS(®) TaqMan(®) HBV v2.0 (HBV v2.0; this test is currently not available in the USA) test was evaluated by comparison with the former COBAS(®) AmpliPrep/COBAS(®) TaqMan(®) HBV v1.0 (HBV v1.0) test. In this study, a platform correlation of both assay versions was done, including 275 HBV DNA positive EDTA plasma samples. Comparable results were obtained (R2 = 0.97, mean difference -0.03 log10 IU/ml). The verification of equivalency of the sample matrix (plasma vs. serum samples tested with HBV v2.0 in the same run) showed comparable results for all 278 samples, with R2 = 0.99 and a mean difference of 0.06 log10 IU/ml. In conclusion, the new test version HBV v2.0 is highly specific and reproducible and accurately quantifies HBV DNA in EDTA plasma and serum samples from patients with chronic HBV infection. PMID:20728470

  7. Evaluation of IR technology applied to cooling tower performance

    NASA Astrophysics Data System (ADS)

    MacNamara, Neal A.; Zayicek, Paul A.

    1999-03-01

    Infrared thermography (IR) is widely used by electric utilities as an integral part of their predictive maintenance program. IR is utilized for inspection of a variety of plant mechanical and electrical components. Additionally, IR can be used to provide thermal performance information for other key plant systems, including assessment of cooling towers. Cooling tower performance directly affects availability and heat rate in fossil and nuclear power plants. Optimal tower performance contributes to efficient turbine operation and maximum power output. It is estimated that up to half of the cooling towers installed have failed to meet their design performance specifications. As a result, any additional degradation of tower performance resulting from fouling, valve degradation, unbalanced flow, or a poor maintenance practice has a direct effect on generation output. We have collected infrared thermography images of mechanical draft cooling towers, as part of Evaluation of IR Technology Applied to Cooling Tower Performance. IR images have been analyzed to provide information regarding general performance conditions and identification of operational deficiencies related to thermal performance. Similarly, IR can be implemented for monitoring of tower flow balance activities and for post-maintenance surveillance. To date, IR images have been used to identify areas of general flow imbalance, flooding or limited flow in individual cells, missing or broken tower fill material, fan performance and other problems related to maintenance or operational issues. Additionally, an attempt is being made to use quantitative thermal data, provided by the IR image analysis software, in conjunction with condenser input/output site ambient information, to evaluate and compare individual tower cell performance.

  8. The Class C Passive Performance Evaluation Program

    NASA Astrophysics Data System (ADS)

    1981-09-01

    The Class C passive performance evaluation program, which provides information on the qualities of passive solar features that make them attractive to buyers, is described. The following topics are discussed: design of an audit form; design of regionally specific audit addenda; determination of site selection criteria; identification of sites; selection, training, and management of auditors; and packaging of materials for subcontractor evaluation. Results and findings are presented for the following areas: demographic profile; passive solar home profile; cost, financing, and payback considerations; expectations, realizations, and satisfaction; and decisionmaking.

  9. Evaluation of Fourier Transform Profilometry for Quantitative Waste Volume Determination under Simulated Hanford Tank Conditions

    SciTech Connect

    Etheridge, J.A.; Jang, P.R.; Leone, T.; Long, Z.; Norton, O.P.; Okhuysen, W.P.; Monts, D.L.; Coggins, T.L.

    2008-07-01

    The Hanford Site is currently in the process of an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the chemical makeup of the residue. The objective of Mississippi State University's Institute for Clean Energy Technology's (ICET) efforts is to develop, fabricate, and deploy inspection tools for the Hanford waste tanks that will (1) be remotely operable; (2) provide quantitative information on the amount of wastes remaining; and (3) provide information on the spatial distribution of chemical and radioactive species of interest. A collaborative arrangement has been established with the Hanford Site to develop probe-based inspection systems for deployment in the waste tanks. ICET is currently developing an in-tank inspection system based on Fourier Transform Profilometry, FTP. FTP is a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP is capable of determining the height (depth) distribution (and hence volume distribution) of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. Hence FTP has the potential to be utilized for quantitative determination of residual wastes within Hanford waste tanks. We are conducting a multi-stage performance evaluation of FTP in order to document the accuracy, precision, and operator dependence (minimal) of FTP under conditions similar to those that can be expected to pertain within Hanford waste tanks. The successive stages impose aspects that present increasing difficulty and increasingly more accurate approximations of in-tank environments. In this paper, we report our investigations of the dependence of the analyst upon FTP volume determination results and of the

  10. Performance Evaluation of a Data Validation System

    NASA Technical Reports Server (NTRS)

    Wong, Edmond (Technical Monitor); Sowers, T. Shane; Santi, L. Michael; Bickford, Randall L.

    2005-01-01

    Online data validation is a performance-enhancing component of modern control and health management systems. It is essential that performance of the data validation system be verified prior to its use in a control and health management system. A new Data Qualification and Validation (DQV) Test-bed application was developed to provide a systematic test environment for this performance verification. The DQV Test-bed was used to evaluate a model-based data validation package known as the Data Quality Validation Studio (DQVS). DQVS was employed as the primary data validation component of a rocket engine health management (EHM) system developed under NASA's NGLT (Next Generation Launch Technology) program. In this paper, the DQVS and DQV Test-bed software applications are described, and the DQV Test-bed verification procedure for this EHM system application is presented. Test-bed results are summarized and implications for EHM system performance improvements are discussed.

  11. Evaluation of Performance Management in State Schools: A Case of North Cyprus

    ERIC Educational Resources Information Center

    Atamturk, Hakan; Aksal, Fahriye A.; Gazi, Zehra A.; Atamturk, A. Nurdan

    2011-01-01

    The research study aims to evaluate performance management in state secondary schools in North Cyprus. The study is significant in shedding light on the perceptions of teachers and headmasters regarding quality control of schools through performance management. In this research, quantitative research was employed, and a survey was conducted to…

  12. Performance Evaluation Methods for Assistive Robotic Technology

    NASA Astrophysics Data System (ADS)

    Tsui, Katherine M.; Feil-Seifer, David J.; Matarić, Maja J.; Yanco, Holly A.

    Robots have been developed for several assistive technology domains, including intervention for Autism Spectrum Disorders, eldercare, and post-stroke rehabilitation. Assistive robots have also been used to promote independent living through the use of devices such as intelligent wheelchairs, assistive robotic arms, and external limb prostheses. Work in the broad field of assistive robotic technology can be divided into two major research phases: technology development, in which new devices, software, and interfaces are created; and clinical, in which assistive technology is applied to a given end-user population. Moving from technology development towards clinical applications is a significant challenge. Developing performance metrics for assistive robots poses a related set of challenges. In this paper, we survey several areas of assistive robotic technology in order to derive and demonstrate domain-specific means for evaluating the performance of such systems. We also present two case studies of applied performance measures and a discussion regarding the ubiquity of functional performance measures across the sampled domains. Finally, we present guidelines for incorporating human performance metrics into end-user evaluations of assistive robotic technologies.

  13. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  14. Performance evaluation of an improved street sweeper

    SciTech Connect

    Duncan, M.W.; Jain, R.C.; Yung, S.C.; Patterson, R.G.

    1985-10-01

    The paper gives results of an evaluation of the performance of an improved street sweeper (ISS) and conventional sweepers. Dust emissions from paved roads are a major source of urban airborne particles. These emissions can be controlled by street cleaning, but commonly used sweepers were not designed for fine particle collection. A sweeper was modified to improve its ability to remove fine particles from streets and to contain its dust dispersions. Performance was measured by sampling street solids with a vacuum system before and after sweeping. Sieve analyses were made on these samples. During sampling, cascade impactor subsamples were collected to measure the finer particles. Also, dust dispersions were measured.

  15. Performance evaluation of a dataflow architecture

    SciTech Connect

    Ghosal, D. (Computer Science Center); Bhuyan, L.N. (Dept. of Computer Science)

    1990-05-01

    This paper deals with the formulation and validation of an analytical approach for the performance evaluation of the Manchester dataflow computer. The analytical approach is based on closed queuing network models. The average parallelism of the dataflow graph being executed on the dataflow architecture is shown to be related to the population of the closed network. The model of the dataflow computer has been validated by comparing the analytical results to those obtained from the prototype Manchester dataflow computer and from our simulation. The bottleneck centers in the prototype machine have been identified through the model, and various architectural modifications have been investigated from performance considerations.
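
    The closed queuing network analysis described here is typically carried out with exact mean-value analysis (MVA). A sketch under stated assumptions: the service demands and visit ratios are invented, and the population is taken as the dataflow graph's average parallelism, as the abstract suggests.

    ```python
    # Exact MVA for a single-class closed queuing network.
    import numpy as np

    def mva(service, visits, population):
        q = np.zeros(len(service))              # mean queue lengths
        for n in range(1, population + 1):
            resid = service * (1.0 + q)         # residence time per visit
            X = n / np.dot(visits, resid)       # system throughput
            q = X * visits * resid              # updated queue lengths
        return X, q

    service = np.array([0.05, 0.08, 0.02])      # sec per visit at each center
    visits  = np.array([1.0, 0.7, 1.5])         # visit ratios
    X, q = mva(service, visits, population=12)  # population ~ avg parallelism
    print(f"throughput = {X:.2f} jobs/s, queue lengths = {q.round(2)}")
    ```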

  16. Performance evaluation and clinical applications of 3D plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Decker, Ryan; Shademan, Azad; Opfermann, Justin; Leonard, Simon; Kim, Peter C. W.; Krieger, Axel

    2015-06-01

    The observation and 3D quantification of arbitrary scenes using optical imaging systems is challenging, but increasingly necessary in many fields. This paper provides a technical basis for the application of plenoptic cameras in medical and medical robotics applications, and rigorously evaluates camera integration and performance in the clinical setting. It discusses plenoptic camera calibration and setup, and assesses plenoptic imaging in a clinically relevant context and in the context of other quantitative imaging technologies. We report the methods used for camera calibration, and precision and accuracy results in ideal and simulated surgical settings. Afterwards, we report performance during a surgical task. Test results showed the average precision of the plenoptic camera to be 0.90 mm, increasing to 1.37 mm for tissue across the calibrated FOV. The ideal accuracy was 1.14 mm. The camera showed submillimeter error during a simulated surgical task.

  17. Performance evaluation of two OCR systems

    SciTech Connect

    Chen, S.; Subramaniam, S.; Haralick, R.M.; Phillips, I.T.

    1994-12-31

    An experimental protocol for the performance evaluation of Optical Character Recognition (OCR) algorithms is described. The protocol is intended to serve as a model for using the University of Washington English Document Image Database-I to evaluate OCR systems. The plain text zones (without special symbols) in this database have over 2,300,000 characters. The performances of two UNIX-based OCR systems, namely Caere OCR v109a and Xerox ScanWorX v2.0, are measured. The results suggest that Caere OCR outperforms ScanWorX in terms of recognition accuracy; however, ScanWorX is more robust in the presence of image flaws.
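
    Recognition accuracy in protocols like this one is usually computed from the edit distance between OCR output and ground truth. A minimal sketch with toy strings (not drawn from the University of Washington database):

    ```python
    # Character accuracy = 1 - edit_distance / reference_length.
    def edit_distance(a, b):
        # Classic dynamic-programming Levenshtein distance.
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1,                # deletion
                               cur[j - 1] + 1,             # insertion
                               prev[j - 1] + (ca != cb)))  # substitution
            prev = cur
        return prev[-1]

    reference  = "Performance evaluation of two OCR systems."
    hypothesis = "Perf0rmance evaluatlon of two OCR systems,"
    acc = 1 - edit_distance(reference, hypothesis) / len(reference)
    print(f"character accuracy = {acc:.3f}")
    ```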

  18. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue for preventing the occurrence of cirrhosis and initiating appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness has been studied for detecting and evaluating tumors. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of the fibrosis stage.

  19. Evaluation of reference genes for quantitative RT-PCR in Lolium temulentum under abiotic stress

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Lolium temulentum is a valuable model grass species for the study of stress in forage and turf grasses. Gene expression analysis by quantitative real time RT-PCR relies on the use of proper internal standards. The aim of this study was to identify and evaluate reference genes for use in real-time q...

  20. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  1. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  2. Quantitative Evaluation of a First Year Seminar Program: Relationships to Persistence and Academic Success

    ERIC Educational Resources Information Center

    Jenkins-Guarnieri, Michael A.; Horne, Melissa M.; Wallis, Aaron L.; Rings, Jeffrey A.; Vaughan, Angela L.

    2015-01-01

    In the present study, we conducted a quantitative evaluation of a novel First Year Seminar (FYS) program with a coordinated curriculum implemented at a public, four-year university to assess its potential role in undergraduate student persistence decisions and academic success. Participants were 2,188 first-year students, 342 of whom completed the…

  3. Raman spectral imaging for quantitative contaminant evaluation in skim milk powder

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study uses a point-scan Raman spectral imaging system for quantitative detection of melamine in milk powder. A sample depth of 2 mm and corresponding laser intensity of 200 mW were selected after evaluating the penetration of a 785 nm laser through milk powder. Horizontal and vertical spatial r...

  4. Toward Web-Site Quantitative Evaluation: Defining Quality Characteristics and Attributes.

    ERIC Educational Resources Information Center

    Olsina, L; Rossi, G.

    This paper identifies World Wide Web site characteristics and attributes and groups them in a hierarchy. The primary goal is to classify the elements that might be part of a quantitative evaluation and comparison process. In order to effectively select quality characteristics, different users' needs and behaviors are considered. Following an…

  5. Interdisciplinary program for quantitative nondestructive evaluation. Semi-annual report, October 1, 1982-February 28, 1983

    SciTech Connect

    Not Available

    1983-01-01

    Separate abstracts were prepared for the papers published in the following areas: (1) Application of Ultrasonic Quantitative Nondestructive Evaluation to Radio Frequency System Window Problems, (a) Improvements in Probability of Detection and (b) Sizing of Internal Flaws in Bore and Web Geometries; (2) Electromagnetic Detection and Sizing; (3) New Technical Opportunities; and (4) New Flaw Detection Techniques.

  6. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  7. Evaluation of quantitative planar 90Y bremsstrahlung whole-body imaging

    NASA Astrophysics Data System (ADS)

    Minarik, D.; Ljungberg, M.; Segars, P.; Sjögreen Gleisner, K.

    2009-10-01

    With high-dose administration of 90Y-labeled antibodies, it is possible to image 90Y without an admixture of 111In. We have earlier shown that it is possible to perform quantitative 90Y bremsstrahlung SPECT for dosimetry purposes with reasonable accuracy. However, whole-body (WB) activity quantification with the conjugate view method is less time consuming than SPECT and has been the method of choice for dosimetry. We have investigated the possibility of using a conjugate view method, where scatter, backscatter, and septal-penetration compensations are performed by inverse filtering and attenuation correction is performed with a WB x-ray image, for total-body and organ activity quantification of 90Y. The method was evaluated using both Monte Carlo-simulated scintillation camera images with realistic source distributions and an experimental phantom study, in terms of image quality and accuracy of the activity quantification. The experimental phantom study was performed using the RSD torso phantom with 90Y activity uniformly distributed in the liver insert. A GE Discovery VH/Hawkeye system was used to acquire the image. The simulation study was performed for a realistic activity distribution in the NCAT anthropomorphic phantom, where 90Y bremsstrahlung images were generated using the SIMIND MC program. Two different phantom configurations and two activity distributions were simulated. To mimic the RSD phantom experiment, one simulation study was also made with 90Y activity located only in the liver. The SIMIND program was configured to resemble a GE Discovery VH/Hawkeye system. An x-ray projector program was used to generate whole-body x-ray images from the NCAT phantom for attenuation correction in the conjugate view method. Organ activities were calculated from ROIs that exactly covered the organs. Corrections for background activity, overlapping activity, and source extension in the depth direction were applied to the ROI data. The total

  8. Analytical performance evaluation for autonomous sensor fusion

    NASA Astrophysics Data System (ADS)

    Chang, K. C.

    2008-04-01

    A distributed data fusion system consists of a network of sensors, each capable of local processing and fusion of sensor data. There has been a great deal of work in developing distributed fusion algorithms applicable to a network-centric architecture. Currently there are at least a few approaches, including naïve fusion, cross-correlation fusion, information graph fusion, maximum a posteriori (MAP) fusion, channel filter fusion, and covariance intersection fusion. However, in general, in a distributed system such as an ad hoc sensor network, the communication architecture is not fixed. Each node has knowledge of only its local connectivity but not the global network topology. In those cases, a distributed fusion algorithm based on the information-graph type of approach may not scale, due to its requirement to carry long pedigree information for decorrelation. In this paper, we focus on scalable fusion algorithms and conduct analytical performance evaluation to compare their performance. The goal is to understand the performance of those algorithms under different operating conditions. Specifically, we evaluate the performance of channel filter fusion, Chernoff fusion, Shannon fusion, and Bhattacharyya fusion algorithms. We also compare their results to naïve fusion and "optimal" centralized fusion algorithms under a specific communication pattern.
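
    Of the algorithms listed, covariance intersection is compact enough to sketch. The following is a minimal Python illustration of the standard convex-combination form; the track estimates and the grid search over the weight omega are invented for the example.

    ```python
    import numpy as np

    def covariance_intersection(xa, Pa, xb, Pb, omega):
        """Fuse two estimates whose cross-correlation is unknown.
        Consistent for any omega in (0, 1)."""
        Pa_inv, Pb_inv = np.linalg.inv(Pa), np.linalg.inv(Pb)
        P = np.linalg.inv(omega * Pa_inv + (1.0 - omega) * Pb_inv)
        x = P @ (omega * Pa_inv @ xa + (1.0 - omega) * Pb_inv @ xb)
        return x, P

    # Two track estimates from different nodes (illustrative numbers)
    xa, Pa = np.array([1.0, 0.0]), np.diag([1.0, 4.0])
    xb, Pb = np.array([1.4, 0.3]), np.diag([4.0, 1.0])

    # omega is typically chosen to minimize trace(P); a coarse grid search suffices
    omegas = np.linspace(0.01, 0.99, 99)
    best_w = min(omegas, key=lambda w: np.trace(covariance_intersection(xa, Pa, xb, Pb, w)[1]))
    x_f, P_f = covariance_intersection(xa, Pa, xb, Pb, best_w)
    print(best_w, x_f, np.trace(P_f))
    ```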

  9. Group 3: Performance evaluation and assessment

    NASA Technical Reports Server (NTRS)

    Frink, A.

    1981-01-01

    Line-oriented flight training provides a unique learning experience and an opportunity to look at aspects of performance that other types of training did not provide. Areas such as crew coordination, resource management, leadership, and so forth can be readily evaluated in such a format. While individual performance is of the utmost importance, crew performance deserves equal emphasis; therefore, these areas should be carefully observed by the instructors as an area for discussion in the same way that individual performance is observed. To be effective, it must be accepted by the crew members and administered by the instructors as pure training: learning through experience. To keep open minds, and to benefit most from the experience, both in the doing and in the follow-on discussion, it is essential that it be entered into with a feeling of freedom, openness, and enthusiasm. Reserve or defensiveness arising from concern over failure will inhibit participation.

  10. Evaluating Algorithm Performance Metrics Tailored for Prognostics

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2009-01-01

    Prognostics has taken center stage in Condition-Based Maintenance (CBM), where it is desired to estimate the Remaining Useful Life (RUL) of the system so that remedial measures may be taken in advance to avoid catastrophic events or unwanted downtime. Validation of such predictions is an important but difficult proposition, and a lack of appropriate evaluation methods renders prognostics meaningless. Evaluation methods currently used in the research community are not standardized and in many cases do not sufficiently assess the key performance aspects expected of a prognostics algorithm. In this paper we introduce several new evaluation metrics tailored for prognostics and show that they can evaluate various algorithms more effectively than conventional metrics. Specifically, four algorithms, namely Relevance Vector Machine (RVM), Gaussian Process Regression (GPR), Artificial Neural Network (ANN), and Polynomial Regression (PR), are compared. These algorithms vary in complexity and in their ability to manage uncertainty around predicted estimates. Results show that the new metrics rank these algorithms differently, and depending on the requirements and constraints, suitable metrics may be chosen. Beyond these results, these metrics offer ideas about how metrics suited to prognostics may be designed so that the evaluation procedure can be standardized.
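
    The abstract does not spell out the new metrics, but an alpha-lambda-style accuracy check conveys the flavor of RUL-specific evaluation. A hedged sketch follows; the threshold and trajectories are invented, not the paper's definitions.

    ```python
    import numpy as np

    def alpha_accuracy(rul_true, rul_pred, alpha=0.2):
        """Fraction of prediction instants at which predicted RUL stays within
        +/- alpha * true RUL of the truth (an alpha-lambda-style check)."""
        rul_true = np.asarray(rul_true, float)
        rul_pred = np.asarray(rul_pred, float)
        within = np.abs(rul_pred - rul_true) <= alpha * rul_true
        return within.mean()

    # Illustrative run-to-failure trajectory: truth decays linearly, prediction is noisy
    rul_true = np.array([100, 90, 80, 70, 60, 50, 40, 30, 20, 10], float)
    rul_pred = rul_true + np.array([15, -12, 8, -5, 4, -3, 6, -2, 5, -3])
    print(alpha_accuracy(rul_true, rul_pred))  # 0.8: the late predictions fall outside the cone
    ```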

  11. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions and the contents was divided into three themes-1: Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; 2: Measurement and Estimation of Exposures for Better Extrapolation to Humans and 3: The Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation and approaches to use in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitutes an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  12. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are among the most frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because variations in benzodiazepine concentrations in biological samples during bleeding, postmortem changes, and redistribution can bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane:ethyl acetate, with subsequent detection by high-performance liquid chromatography coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peak due to interfering substances in samples was observed. Conclusion: The present method was selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in a forensic toxicology laboratory.
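
    For readers reproducing this kind of validation, the calibration statistics reported above (linearity, LOD, LOQ) follow from an ordinary linear fit. A sketch with invented peak areas over the paper's 30–3000 ng/mL range, using the common ICH-style 3.3σ/S and 10σ/S formulas (the paper may have derived its limits differently):

    ```python
    import numpy as np

    # Hypothetical calibration data (concentration in ng/mL vs. peak area);
    # the 30-3000 ng/mL range mirrors the abstract, the numbers do not.
    conc = np.array([30, 100, 300, 1000, 3000], dtype=float)
    area = np.array([0.9, 3.1, 9.2, 30.5, 91.0])

    slope, intercept = np.polyfit(conc, area, 1)
    residuals = area - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)            # residual SD of the regression (n - 2 dof)

    r = np.corrcoef(conc, area)[0, 1]
    lod = 3.3 * sigma / slope                # ICH-style limit of detection
    loq = 10.0 * sigma / slope               # ICH-style limit of quantitation
    print(f"r^2={r**2:.4f}  LOD={lod:.1f} ng/mL  LOQ={loq:.1f} ng/mL")
    ```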

  13. Hydroponic isotope labeling of entire plants and high-performance mass spectrometry for quantitative plant proteomics.

    PubMed

    Bindschedler, Laurence V; Mills, Davinia J S; Cramer, Rainer

    2012-01-01

    Hydroponic isotope labeling of entire plants (HILEP) combines hydroponic plant cultivation and metabolic labeling with stable isotopes, using (15)N-containing inorganic salts to label whole, mature plants. Employing (15)N salts as the sole nitrogen source for HILEP leads to the production of healthy-looking plants which contain (15)N proteins labeled to nearly 100%. Therefore, HILEP is suitable for quantitative plant proteomic analysis, where plants are grown in either (14)N- or (15)N-hydroponic media and pooled when the biological samples are collected for relative proteome quantitation. The pooled (14)N-/(15)N-protein extracts can be fractionated in any suitable way and digested with a protease for shotgun proteomics, typically using reverse-phase liquid chromatography nanoelectrospray ionization tandem mass spectrometry (RPLC-nESI-MS/MS). Best results were obtained with a hybrid ion trap/FT-MS mass spectrometer, combining high mass accuracy and sensitivity for the MS data acquisition with fast, high-throughput MS/MS data acquisition, increasing the number of proteins identified and quantified and improving protein quantitation. Peak processing and picking from raw MS data files, protein identification, and quantitation were performed in a highly automated way using integrated MS data analysis software with minimal manual intervention, thus easing the analytical workflow. In this methodology paper, we describe how to grow Arabidopsis plants hydroponically for isotope labeling using (15)N salts and how to quantitate the resulting proteomes using a convenient workflow that does not require extensive bioinformatics skills. PMID:22665301

  14. Standardizing evaluation of pQCT image quality in the presence of subject movement: qualitative versus quantitative assessment.

    PubMed

    Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B

    2014-02-01

    Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determining image viability is visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedures across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires a repeat. PMID:24077875
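
    The decision rule implied by the study reduces to a one-line ratio test. A minimal sketch, assuming %Move is simply movement extent over limb diameter; the paper's exact operational definition may differ.

    ```python
    def percent_move(movement_mm, limb_diameter_mm):
        """Ratio of movement artifact to limb size, per the quantitative approach."""
        return movement_mm / limb_diameter_mm

    def needs_repeat(movement_mm, limb_diameter_mm, threshold=0.25):
        """Flag a diaphyseal scan for repeat at the study's 25% delineation."""
        return percent_move(movement_mm, limb_diameter_mm) >= threshold

    # Hypothetical scan: 8 mm of movement on a 70 mm limb -> ~11.4%, keep the scan
    print(percent_move(8.0, 70.0), needs_repeat(8.0, 70.0))
    ```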

  15. Flexor and extensor muscle tone evaluated using the quantitative pendulum test in stroke and parkinsonian patients.

    PubMed

    Huang, Han-Wei; Ju, Ming-Shaung; Lin, Chou-Ching K

    2016-05-01

    The aim of this study was to evaluate the flexor and extensor muscle tone of the upper limbs in patients with spasticity or rigidity and to investigate the difference in hypertonia between spasticity and rigidity. The two experimental groups consisted of stroke patients and parkinsonian patients. The control group consisted of age- and sex-matched normal subjects. Quantitative upper-limb pendulum tests starting from both flexed and extended joint positions were conducted. System identification with a simple linear model was performed and model parameters were derived. The differences between the three groups and two starting positions were investigated through these model parameters and tested by two-way analysis of variance. In total, 57 subjects were recruited, including 22 controls, 14 stroke patients and 21 parkinsonian patients. While the stiffness coefficient showed no difference among groups, the number of swings, relaxation index and damping coefficient showed changes indicating significant hypertonia in the two patient groups. There was no difference between these two patient groups. The test starting from the extended position consistently manifested higher muscle tone in all three groups. In conclusion, the hypertonia of parkinsonian and stroke patients could not be differentiated by the modified pendulum test; the elbow extensors showed a higher muscle tone in both control and patient groups; and the hypertonia of both parkinsonian and stroke patients is velocity dependent. PMID:26765753
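
    To make the model parameters concrete, here is a hedged sketch of a pendulum test built on one plausible form of the simple linear model, J·θ'' + B·θ' + K·(θ - θ_rest) = 0; the inertia, damping, and stiffness values are invented, and the paper's parameterization may differ.

    ```python
    import numpy as np

    def pendulum_swing(J, B, K, theta0, theta_rest, dt=0.001, t_end=5.0):
        """Semi-implicit Euler integration of J*th'' + B*th' + K*(th - th_rest) = 0."""
        n = int(t_end / dt)
        theta = np.empty(n)
        th, om = theta0, 0.0
        for i in range(n):
            om += -(B * om + K * (th - theta_rest)) / J * dt
            th += om * dt
            theta[i] = th
        return theta

    theta = pendulum_swing(J=0.15, B=0.3, K=2.0,
                           theta0=np.deg2rad(90), theta_rest=np.deg2rad(20))
    # Relaxation index: first-swing excursion relative to the final resting drop;
    # smaller values (less overshoot past rest) suggest hypertonia
    ri = (np.deg2rad(90) - theta.min()) / (np.deg2rad(90) - np.deg2rad(20))
    print(f"relaxation index ~ {ri:.2f}")
    ```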

  16. The Evaluation and Quantitation of Dihydrogen Metabolism Using Deuterium Isotope in Rats

    PubMed Central

    Hyspler, Radomir; Ticha, Alena; Schierbeek, Henk; Galkin, Alexander; Zadak, Zdenek

    2015-01-01

    Purpose Despite the significant interest in molecular hydrogen as an antioxidant in the last eight years, its quantitative metabolic parameters in vivo are still lacking, as is an appropriate method for determination of hydrogen effectivity in the mammalian organism under various conditions. Basic Procedures Intraperitoneally-applied deuterium gas was used as a metabolic tracer and deuterium enrichment was determined in the body water pool. Also, in vitro experiments were performed using bovine heart submitochondrial particles to evaluate superoxide formation in Complex I of the respiratory chain. Main Findings A significant oxidation of about 10% of the applied dose was found under physiological conditions in rats, proving its antioxidant properties. Hypoxia or endotoxin application did not exert any effect, whilst pure oxygen inhalation reduced deuterium oxidation. During in vitro experiments, a significant reduction of superoxide formation by Complex I of the respiratory chain was found under the influence of hydrogen. The possible molecular mechanisms of the beneficial effects of hydrogen are discussed, with an emphasis on the role of iron sulphur clusters in reactive oxygen species generation and on iron species-dihydrogen interaction. Principal Conclusions According to our findings, hydrogen may be an efficient, non-toxic, highly bioavailable and low-cost antioxidant supplement for patients with pathological conditions involving ROS-induced oxidative stress. PMID:26103048

  17. Quantitative evaluation of radiation-induced changes in sperm morphology and chromatin distribution

    SciTech Connect

    Aubele, M.; Juetting, U.R.; Rodenacker, K.; Gais, P.; Burger, G.; Hacker-Klom, U. )

    1990-01-01

    Sperm head cytometry provides a useful assay for the detection of radiation-induced damage in mouse germ cells. Exposure of the gonads to radiation is known to lead to an increase of diploid and higher-polyploid sperm and of sperm with head shape abnormalities. In the pilot studies reported here, quantitative analysis of the total DNA content, the morphology, and the chromatin distribution of mouse sperm was performed. The goal was to evaluate the discriminative power of features derived by high-resolution image cytometry in distinguishing sperm of control and irradiated mice. Our results suggest that besides the induction of the above-mentioned variations in DNA content and shape of the sperm head, changes in the nonhomogeneous chromatin distribution within the sperm may also be used to quantify the radiation effect on sperm cells. Whereas the chromatin distribution features show larger variations for sperm 21 days after exposure (dpr), the shape parameters seem to be more important for discriminating sperm 35 dpr. This may be explained by differentiation processes, which take place at different stages during mouse spermatogenesis.

  18. A Comprehensive Framework for Quantitative Evaluation of Downscaled Climate Predictions and Projections

    NASA Astrophysics Data System (ADS)

    Barsugli, J. J.; Guentchev, G.

    2012-12-01

    The variety of methods used for downscaling climate predictions and projections is large and growing larger. Comparative studies of downscaling techniques to date are often initiated in relation to specific projects, are focused on limited sets of downscaling techniques, and hence do not allow for easy comparison of outcomes. In addition, existing information about the quality of downscaled datasets is not available in digital form. There is a strong need for systematic evaluation of downscaling methods using standard protocols which will allow for a fair comparison of their advantages and disadvantages with respect to specific user needs. The National Climate Predictions and Projections platform, with the contributions of NCPP's Climate Science Advisory Team, is developing community-based standards and a prototype framework for the quantitative evaluation of downscaling techniques and datasets. Certain principles guide the development of this framework. We want the evaluation procedures to be reproducible and transparent, simple to understand, and straightforward to implement. To this end we propose a set of open standards that will include the use of specific data sets, time periods of analysis, evaluation protocols, evaluation tests and metrics. Secondly, we want the framework to be flexible and extensible to downscaling techniques which may be developed in the future, to high-resolution global models, and to evaluations that are meaningful for additional applications and sectors. Collaboration among practitioners who will be using the downscaled data and climate scientists who develop downscaling methods will therefore be essential to the development of this framework. The proposed framework consists of three analysis protocols, along with two tiers of specific metrics and indices that are to be calculated. The protocols describe the following types of evaluation that can be performed: 1) comparison to observations, 2) comparison to a "perfect model" simulation
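
    As an illustration of the first protocol (comparison to observations), the kinds of first-tier metrics such a framework might standardize are straightforward to compute. The metric choices and synthetic series below are assumptions for illustration, not NCPP's published specification.

    ```python
    import numpy as np

    def comparison_metrics(downscaled, observed):
        """Generic comparison-to-observations metrics for a downscaled series."""
        d, o = np.asarray(downscaled, float), np.asarray(observed, float)
        return {"bias": (d - o).mean(),
                "rmse": np.sqrt(((d - o) ** 2).mean()),
                "corr": np.corrcoef(d, o)[0, 1]}

    rng = np.random.default_rng(0)
    obs = 15 + 8 * np.sin(np.linspace(0, 2 * np.pi, 365))  # synthetic daily temperature (C)
    ds = obs + rng.normal(0.5, 1.5, obs.size)              # downscaled: warm bias + noise
    print(comparison_metrics(ds, obs))
    ```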

  19. Performance evaluation of vector-machine architectures

    SciTech Connect

    Tang, Ju-ho.

    1989-01-01

    Vector machines are well known for their high peak performance, but the delivered performance varies greatly over different workloads and depends strongly on compiler optimizations. Recently it has been claimed that several horizontal superscalar architectures, e.g., VLIW and polycyclic architectures, provide a more balanced performance across a wider range of scientific workloads than do vector machines. The purpose of this research is to study the performance of register-register vector processors, such as Cray supercomputers, as a function of their architectural features, scheduling schemes, compiler optimization capabilities, and program parameters. The results of this study also provide a basis for comparing vector machines with horizontal superscalar machines. An evaluation methodology, based on timing parameters, bottlenecks, and run-time bounds, is developed. Cray-1 performance is degraded by the multiple memory loads of index-misaligned vectors and the inability of the Cray Fortran Compiler (CFT) to produce code that hits all the chain slot times. The impact of chaining and two instruction scheduling schemes on one-memory-port vector supercomputers, illustrated by the Cray-1 and Cray-2, is studied. The lack of instruction chaining on the Cray-2 requires a different instruction scheduling scheme from that of the Cray-1. Situations are characterized in which simple vector scheduling can generate code that fully utilizes one functional unit for machines with chaining. Even without chaining, polycyclic scheduling guarantees full utilization of one functional unit, after an initial transient, for loops with acyclic dependence graphs.

  20. 40 CFR 63.5850 - How do I conduct performance tests, performance evaluations, and design evaluations?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... procedures in EPA Method 3B of appendix A to 40 CFR part 60 to determine an oxygen correction factor if... test, performance evaluation, and design evaluation in 40 CFR part 63, subpart SS, that applies to you... requirements in § 63.7(e)(1) and under the specific conditions that 40 CFR part 63, subpart SS, specifies....

  1. 40 CFR 63.5850 - How do I conduct performance tests, performance evaluations, and design evaluations?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... test, performance evaluation, and design evaluation in 40 CFR part 63, subpart SS, that applies to you... requirements in § 63.7(e)(1) and under the specific conditions that 40 CFR part 63, subpart SS, specifies. (c... and under the specific conditions that 40 CFR part 63, subpart SS, specifies. (d) You may not...

  2. Performance evaluation of an automotive thermoelectric generator

    NASA Astrophysics Data System (ADS)

    Dubitsky, Andrei O.

    Around 40% of the total fuel energy in typical internal combustion engines (ICEs) is rejected to the environment in the form of exhaust gas waste heat. Efficient recovery of this waste heat in automobiles promises a fuel economy improvement of 5%. The thermal energy can be harvested through thermoelectric generators (TEGs) utilizing the Seebeck effect. In the present work, a versatile test bench has been designed and built in order to simulate conditions found on test vehicles. This allows experimental performance evaluation and model validation of automotive thermoelectric generators. An electrically heated exhaust gas circuit and a circulator-based coolant loop enable integrated system testing of hot- and cold-side heat exchangers, thermoelectric modules (TEMs), and thermal interface materials at various scales. A transient thermal model of the coolant loop was created in order to design a system which can maintain constant coolant temperature under variable heat input. Additionally, as electrical heaters cannot match the transient response of an ICE, modelling was completed in order to design a relaxed exhaust flow and temperature history utilizing the system thermal lag. This profile reduced the required heating power and gas flow rates by over 50%. The test bench was used to evaluate a DOE/GM initial prototype automotive TEG and validate analytical performance models. The maximum electrical power generation was found to be 54 W with a thermal conversion efficiency of 1.8%. It was found that thermal interface management is critical for achieving maximum system performance, with novel designs being considered for further improvement.
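
    The headline numbers (54 W, 1.8%) come from measurement, but the first-order electrical model behind any such TEG is simple to state. A sketch assuming a Seebeck voltage source behind an internal resistance, with invented module-level parameters:

    ```python
    def teg_power(seebeck_v_per_k, delta_t, r_internal, r_load):
        """Electrical output of a TEG modeled as a Seebeck voltage source
        behind an internal resistance."""
        v_open = seebeck_v_per_k * delta_t            # open-circuit voltage
        current = v_open / (r_internal + r_load)
        return current ** 2 * r_load                  # power delivered to the load

    # A matched load (r_load == r_internal) maximizes power transfer
    print(teg_power(seebeck_v_per_k=0.05, delta_t=250.0, r_internal=1.5, r_load=1.5))
    # ~26 W for these illustrative module-level numbers
    ```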

  3. Performance Assessment of Human and Cattle Associated Quantitative Real-time PCR Assays - slides

    EPA Science Inventory

    The presentation covers (1) a single-laboratory performance assessment of human- and cattle-associated PCR assays and (2) a field study evaluating two human fecal waste management practices in an Ohio watershed.

  4. Evaluating Internet End-to-end Performance

    PubMed Central

    Wood, Fred B.; Cid, Victor H.; Siegel, Elliot R.

    1998-01-01

    Objective: An evaluation of Internet end-to-end performance was conducted for the purpose of better understanding the overall performance of Internet pathways typical of those used to access information in National Library of Medicine (NLM) databases and, by extension, other Internet-based biomedical information resources. Design: The evaluation used a three-level test strategy: 1) user testing to collect empirical data on Internet performance as perceived by users when accessing NLM Web-based databases, 2) technical testing to analyze the Internet paths between the NLM and the user's desktop computer terminal, and 3) technical testing between the NLM and the World Wide Web ("Web") server computer at the user's institution to help characterize the relative performance of Internet pathways. Measurements: Time to download the front pages of NLM Web sites and conduct standardized searches of NLM databases, data transmission capacity between NLM and remote locations (known as the bulk transfer capacity [BTC]), "ping" round-trip time (RTT) as an indication of the latency of the network pathways, and the network routing of the data transmissions (number and sequencing of hops). Results: Based on 347 user tests spread over 16 locations, the median time per location to download the main NLM home page ranged from 2 to 59 seconds, and 1 to 24 seconds for the other NLM Web sites tested. The median time to conduct standardized searches and get search results ranged from 2 to 14 seconds for PubMed and 4 to 18 seconds for Internet Grateful Med. The overall problem rate was about 1 percent; that is, on average, users experienced a problem once every 100 test measurements. The user terminal tests at five locations and Web host tests at 13 locations provided profiles of BTC, RTT, and network routing for both dial-up and fixed Internet connections. Conclusion: The evaluation framework provided a profile of typical Internet performance and insights into network

  5. Evaluating iterative reconstruction performance in computed tomography

    SciTech Connect

    Chen, Baiyu Solomon, Justin; Ramirez Giraldo, Juan Carlos; Samei, Ehsan

    2014-12-15

    Purpose: Iterative reconstruction (IR) offers notable advantages in computed tomography (CT). However, its performance characterization is complicated by its potentially nonlinear behavior, impacting performance in terms of specific tasks. This study aimed to evaluate the performance of IR with both task-specific and task-generic strategies. Methods: The performance of IR in CT was mathematically assessed with an observer model that predicted the detection accuracy in terms of the detectability index (d′). d′ was calculated based on the properties of the image noise and resolution, the observer, and the detection task. The characterizations of image noise and resolution were extended to accommodate the nonlinearity of IR. A library of tasks was mathematically modeled at a range of sizes (radius 1–4 mm), contrast levels (10–100 HU), and edge profiles (sharp and soft). Unique d′ values were calculated for each task with respect to five radiation exposure levels (volume CT dose index, CTDIvol: 3.4–64.8 mGy) and four reconstruction algorithms (filtered backprojection reconstruction, FBP; iterative reconstruction in imaging space, IRIS; and sinogram affirmed iterative reconstruction with strengths of 3 and 5, SAFIRE3 and SAFIRE5; all provided by Siemens Healthcare, Forchheim, Germany). The d′ values were translated into the areas under the receiver operating characteristic curve (AUC) to represent human observer performance. For each task and reconstruction algorithm, a threshold dose was derived as the minimum dose required to achieve a threshold AUC of 0.9. A task-specific dose reduction potential of IR was calculated as the difference between the threshold doses for IR and FBP. A task-generic comparison was further made between IR and FBP in terms of the percent of all tasks yielding an AUC higher than the threshold. Results: IR required less dose than FBP to achieve the threshold AUC. In general, SAFIRE5 showed the most significant dose reduction
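
    The translation from d′ to AUC is, under the equal-variance Gaussian observer model, AUC = Φ(d′/√2). A minimal check in Python; the 0.9 threshold used in the study corresponds to d′ ≈ 1.81.

    ```python
    import math

    def auc_from_dprime(d_prime):
        """AUC = Phi(d'/sqrt(2)) for an equal-variance Gaussian observer model.
        Since Phi(x) = 0.5 * (1 + erf(x / sqrt(2))), this reduces to erf(d'/2)."""
        return 0.5 * (1.0 + math.erf(d_prime / 2.0))

    # A threshold AUC of 0.9 corresponds to d' ~ 1.81
    for d in (1.0, 1.81, 2.5):
        print(d, round(auc_from_dprime(d), 3))
    ```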

  6. Quantitative evaluation of simulated human enamel caries kinetics using photothermal radiometry and modulated luminescence

    NASA Astrophysics Data System (ADS)

    Hellen, Adam; Mandelis, Andreas; Finer, Yoav; Amaechi, Bennett T.

    2011-03-01

    Photothermal radiometry and modulated luminescence (PTR-LUM) is a non-destructive methodology applied toward the detection, monitoring and quantification of dental caries. The purpose of this study was to evaluate the efficacy of PTR-LUM in detecting incipient caries lesions and quantifying opto-thermophysical properties as a function of treatment time. Extracted human molars (n=15) were exposed to an acid demineralization gel (pH 4.5) for 10 or 40 days in order to simulate incipient caries lesions. PTR-LUM frequency scans (1 Hz - 1 kHz) were performed prior to and during demineralization. Transverse Micro-Radiography (TMR) analysis followed at treatment conclusion. A coupled diffuse-photon-density-wave and thermal-wave theoretical model was applied to PTR experimental amplitude and phase data across the frequency range of 4 Hz - 354 Hz, to quantitatively evaluate changes in thermal and optical properties of sound and demineralized enamel. Excellent fits with small residuals were observed between experimental and theoretical data, illustrating the robustness of the computational algorithm. Increased scattering coefficients and poorer thermophysical properties were characteristic of demineralized lesion bodies. Enhanced optical scattering coefficients of demineralized lesions resulted in poorer luminescence yield due to scattering of both incident and converted luminescent photons. Differences in the rate of lesion progression between the 10-day and 40-day samples point to a continuum of surface- and diffusion-controlled mechanisms of lesion formation. PTR-LUM sensitivity to changes in tooth mineralization, coupled with opto-thermophysical property extraction, illustrates the technique's potential for non-destructive quantification of enamel caries.

  7. Performance evaluation of swimmers: scientific tools.

    PubMed

    Smith, David J; Norris, Stephen R; Hogg, John M

    2002-01-01

    The purpose of this article is to provide a critical commentary on the physiological and psychological tools used in the evaluation of swimmers. The first-level evaluation should be the competitive performance itself, since it is at this juncture that all elements interplay and provide the 'highest form' of assessment. Competition video analysis of major swimming events has progressed to the point where it has become an indispensable tool for coaches, athletes, sport scientists, equipment manufacturers, and even the media. The breakdown of each swimming performance at the individual level into its constituent parts allows for comparison with the predicted or sought-after execution, as well as with identified world competition levels. The use of other 'on-going' monitoring protocols to evaluate training efficacy typically involves criterion 'effort' swims and specific training sets where certain aspects are scrutinised in depth. Physiological parameters that are often examined alongside swimming speed and technical aspects include oxygen uptake, heart rate, blood lactate concentration, and blood lactate accumulation and clearance rates. Simple and more complex procedures are available for in-training examination of technical issues. Strength and power may be quantified via several modalities, although tethered swimming and dry-land isokinetic devices are typically used. The availability of a 'swimming flume' does afford coaches and sport scientists a higher degree of flexibility in the type of monitoring and evaluation that can be undertaken. There is convincing evidence that athletes can be distinguished on the basis of their psychological skills and emotional competencies and that these differences become further accentuated as the athlete improves. No matter what test format is used (physiological, biomechanical or psychological), similar criteria of validity must be ensured so that the test provides useful and associative information

  8. Performance evaluation of TCP over ABT protocols

    NASA Astrophysics Data System (ADS)

    Ata, Shingo; Murata, Masayuki; Miyahara, Hideo

    1998-10-01

    ABT is promising for effectively transferring highly bursty data traffic in ATM networks. Most past studies focused on the data transfer capability of ABT within the ATM layer. In practice, however, we need to consider the upper-layer transport protocol, since the transport layer protocol also supports a network congestion control mechanism. One such example is TCP, which is now widely used in the Internet. In this paper, we evaluate the performance of TCP over ABT protocols. Simulation results show that the retransmission mechanism of ABT can effectively overlay the TCP congestion control mechanism, so that TCP operates in a stable fashion and works well only as an error recovery mechanism.

  9. Performance Evaluation of Emerging High Performance Computing Technologies using WRF

    NASA Astrophysics Data System (ADS)

    Newby, G. B.; Morton, D.

    2008-12-01

    The Arctic Region Supercomputing Center (ARSC) has evaluated multicore processors and other emerging processor technologies for a variety of high performance computing applications in the earth and space sciences, especially climate and weather applications. A flagship effort has been to assess dual-core processor nodes on ARSC's Midnight supercomputer, in which two-socket systems were compared to eight-socket systems. Midnight is utilized for ARSC's twice-daily weather research and forecasting (WRF) model runs, available at weather.arsc.edu. Among other findings on Midnight, the HyperTransport system for interconnecting Opteron processors, memory, and other subsystems was found not to scale as well on eight-socket (sixteen-processor) systems as on two-socket (four-processor) systems. A fundamental limitation is the cache snooping operation performed whenever a computational thread accesses main memory. This increases memory latency as the number of processor sockets increases. This is particularly noticeable on applications such as WRF that are primarily CPU-bound, versus applications that are bound by input/output or communication. The new Cray XT5 supercomputer at ARSC features quad-core processors, and will host a variety of scaling experiments for WRF, CCSM4, and other models. Early results will be presented, including a series of WRF runs for Alaska with grid resolutions under 2 km. ARSC will discuss a set of standardized test cases for the Alaska domain, similar to existing test cases for CONUS. These test cases will provide different configuration sizes and resolutions, suitable for single processors up to thousands. Beyond multi-core Opteron-based supercomputers, ARSC has examined WRF and other applications on additional emerging technologies. One such technology is the graphics processing unit, or GPU. The 9800-series nVidia GPU was evaluated with the cuBLAS software library. While in-socket GPUs might be forthcoming in the future, current

  10. Evaluation of three MRI-based anatomical priors for quantitative PET brain imaging.

    PubMed

    Vunckx, Kathleen; Atre, Ameya; Baete, Kristof; Reilhac, Anthonin; Deroose, Christophe M; Van Laere, Koen; Nuyts, Johan

    2012-03-01

    In emission tomography, image reconstruction, and therefore also tracer development and diagnosis, may benefit from the use of anatomical side information obtained with other imaging modalities in the same subject, as it helps to correct for the partial volume effect. One way to implement this is to use the anatomical image for defining the a priori distribution in a maximum-a-posteriori (MAP) reconstruction algorithm. In this contribution, we use the PET-SORTEO Monte Carlo simulator to evaluate the quantitative accuracy reached by three different anatomical priors when reconstructing positron emission tomography (PET) brain images, using volumetric magnetic resonance imaging (MRI) to provide the anatomical information. The priors are: 1) a prior especially developed for FDG PET brain imaging, which relies on a segmentation of the MR image (Baete, 2004); 2) the joint entropy prior (Nuyts, 2007); 3) a prior that encourages smoothness within a position-dependent neighborhood computed from the MR image. The latter prior was recently proposed by our group (Vunckx and Nuyts, 2010) and was based on the prior presented by Bowsher (2004). The latter two priors do not rely on an explicit segmentation, which makes them more generally applicable than a segmentation-based prior. All three priors produced a compromise between noise and bias that was clearly better than that obtained with postsmoothed maximum likelihood expectation maximization (MLEM) or MAP with a relative difference prior. The performance of the joint entropy prior was slightly worse than that of the other two priors. The performance of the segmentation-based prior is quite sensitive to the accuracy of the segmentation. In contrast to the joint entropy prior, the Bowsher prior is easily tuned and does not suffer from convergence problems. PMID:22049363

  11. Style-independent document labeling: design and performance evaluation

    NASA Astrophysics Data System (ADS)

    Mao, Song; Kim, Jong Woo; Thoma, George R.

    2003-12-01

    The Medical Article Records System or MARS has been developed at the U.S. National Library of Medicine (NLM) for automated data entry of bibliographical information from medical journals into MEDLINE, the premier bibliographic citation database at NLM. Currently, a rule-based algorithm (called ZoneCzar) is used for labeling important bibliographical fields (title, author, affiliation, and abstract) on medical journal article page images. While rules have been created for medical journals with regular layout types, new rules have to be manually created for any input journals with arbitrary or new layout types. Therefore, it is of interest to label any journal articles independently of their layout styles. In this paper, we first describe a system (called ZoneMatch) for automated generation of crucial geometric and non-geometric features of important bibliographical fields based on string-matching and clustering techniques. The rule-based algorithm is then modified to use these features to perform style-independent labeling. We then describe a performance evaluation method for quantitatively evaluating our algorithm and characterizing its error distributions. Experimental results show that the labeling performance of the rule-based algorithm is significantly improved when the generated features are used.

  12. Performance evaluation of bound diamond ring tools

    SciTech Connect

    Piscotty, M.A.; Taylor, J.S.; Blaedel, K.L.

    1995-07-14

    LLNL is collaborating with the Center for Optics Manufacturing (COM) and the American Precision Optics Manufacturers Association (APOMA) to optimize bound diamond ring tools for the spherical generation of high-quality optical surfaces. An important element of this work is establishing an experimentally-verified link between tooling properties and workpiece quality indicators such as roughness, subsurface damage and removal rate. In this paper, we report on a standardized methodology for assessing ring tool performance and its preliminary application to a set of commercially-available wheels. Our goals are to (1) assist optics manufacturers (users of the ring tools) in evaluating tools and in assessing their applicability for a given operation, and (2) provide performance feedback to wheel manufacturers to help optimize tooling for the optics industry. Our paper includes measurements of wheel performance for three 2-4 micron diamond bronze-bond wheels that were supplied by different manufacturers to nominally-identical specifications. Preliminary data suggest that the differences in performance levels among the wheels were small.

  13. Qualitative and quantitative evaluation of Simon™, a new CE-based automated Western blot system as applied to vaccine development.

    PubMed

    Rustandi, Richard R; Loughney, John W; Hamm, Melissa; Hamm, Christopher; Lancaster, Catherine; Mach, Anna; Ha, Sha

    2012-09-01

    Many CE-based technologies such as imaged capillary IEF, CE-SDS, CZE, and MEKC are well established for analyzing proteins, viruses, or other biomolecules such as polysaccharides. For example, imaged capillary isoelectric focusing (charge-based protein separation) and CE-SDS (size-based protein separation) are standard replacement methods in the biopharmaceutical industry for the tedious and labor-intensive IEF and SDS-PAGE methods, respectively. Another important analytical tool for protein characterization is the Western blot, where, after size-based separation in SDS-PAGE, the proteins are transferred to a membrane and blotted with specific monoclonal or polyclonal antibodies. Western blotting analysis is applied in many areas such as biomarker research, therapeutic target identification, and vaccine development. Currently, the procedure is very manual, laborious, and time consuming. Here, we evaluate a new technology called Simple Western™ (or Simon™) for performing automated Western analysis. This new technology is based on CE-SDS, where the separated proteins are attached to the wall of the capillary by a proprietary photo-activated chemical crosslink. Subsequent blotting is done automatically by incubating and washing the capillary with primary antibody and a horseradish peroxidase-conjugated secondary antibody, with detection by chemiluminescence. Typically, Western blots are not quantitative; hence, we also evaluated the quantitative aspect of this new technology. We demonstrate that Simon™ can quantitate specific components in one of our vaccine candidates and that it provides good reproducibility and intermediate precision with CV <10%. PMID:22965727

  14. 40 CFR 35.9055 - Evaluation of recipient performance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Evaluation of recipient performance. 35... Evaluation of recipient performance. The Regional Administrator will oversee each recipient's performance... schedule for evaluation in the assistance agreement and will evaluate recipient performance and...

  15. 48 CFR 436.201 - Evaluation of contractor performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Construction 436.201 Evaluation of contractor performance. Preparation of performance evaluation reports. In addition to the requirements of FAR 36.201, performance evaluation reports shall be prepared for indefinite... of services to be ordered exceeds $500,000.00. For these contracts, performance evaluation...

  16. Real-time polymerase chain reaction of immunoglobulin rearrangements for quantitative evaluation of minimal residual disease in myeloma.

    PubMed

    Compagno, Mara; Mantoan, Barbara; Astolfi, Monica; Boccadoro, Mario; Ladetto, Marco

    2005-01-01

    The evaluation of minimal residual disease (MRD) is critical in assessing treatments aimed at maximal cytoreduction in multiple myeloma (MM). Qualitative evaluation of MRD now has a 10-year-long history, but it remains a relatively sophisticated procedure. More recently, real-time quantitative approaches have also been developed. These approaches allow very effective monitoring of disease but add complexity and cost to the procedure. This chapter describes how we currently perform real-time polymerase chain reaction (PCR) in MM. Compared to the first description of the assay in June 2000, significant improvements have been made. Although real-time PCR is the main focus of the chapter, most of the information suitable for a proper setup of a qualitative approach is also provided. PMID:15968100

  17. Performance analysis of quantitative phase retrieval method in Zernike phase contrast X-ray microscopy

    NASA Astrophysics Data System (ADS)

    Heng, Chen; Kun, Gao; Da-Jiang, Wang; Li, Song; Zhi-Li, Wang

    2016-02-01

    Since the invention of the Zernike phase contrast method in 1930, it has been widely used in optical microscopy and, more recently, in X-ray microscopy. Because the image contrast is a mixture of absorption and phase information, we have recently proposed and demonstrated a method for quantitative phase retrieval in Zernike phase contrast X-ray microscopy. In this contribution, we analyze the performance of this method at different photon energies. Intensity images of PMMA samples are simulated at 2.5 keV and 6.2 keV, and phase retrieval is performed using the proposed method. The results demonstrate that the proposed phase retrieval method is applicable over a wide energy range. For weakly absorbing features, the optimal photon energy is 2.5 keV, from the point of view of image contrast and accuracy of phase retrieval. On the other hand, in the case of strongly absorbing objects, a higher photon energy is preferred to reduce the error of phase retrieval. These results can be used as guidelines for performing quantitative phase retrieval in Zernike phase contrast X-ray microscopy with the proposed method. Supported by the State Key Project for Fundamental Research (2012CB825801), National Natural Science Foundation of China (11475170, 11205157 and 11179004) and Anhui Provincial Natural Science Foundation (1508085MA20).

  18. Evaluation of right and left ventricular function by quantitative blood-pool SPECT (QBS): comparison with conventional methods and quantitative gated SPECT (QGS).

    PubMed

    Odagiri, Keiichi; Wakabayashi, Yasushi; Tawarahara, Kei; Kurata, Chinori; Urushida, Tsuyoshi; Katoh, Hideki; Satoh, Hiroshi; Hayashi, Hideharu

    2006-10-01

    Though quantitative ECG-gated blood-pool SPECT (QBS) has become a popular tool in research settings, more verification is necessary for its utilization in clinical medicine. To evaluate the reliability of the measurements of left and right ventricular functions with QBS, we performed QBS, as well as first-pass pool (FPP) and ECG-gated blood-pool (GBP) studies on planar images in 41 patients and 8 healthy volunteers. Quantitative ECG-gated myocardial perfusion SPECT (QGS) was also performed in 30 of 49 subjects. First, we assessed the reproducibility of the measurements of left and right ventricular ejection fraction (LVEF, RVEF) and left and right ventricular end-diastolic volume (LVEDV, RVEDV) with QBS. Second, LVEF and RVEF obtained from QBS were compared with those from FPP and GBP, respectively. Third, LVEF and LVEDV obtained from QBS were compared with those from QGS, respectively. The intra- and inter-observer reproducibilities were excellent for LVEF, LVEDV, RVEF and RVEDV measured with QBS (r = 0.88 to 0.96, p < 0.01), while the biases in the measurements of RVEF and RVEDV were relatively large. LVEF obtained from QBS correlated significantly with those from FPP and GBP, while RVEF from QBS did not. LVEF and LVEDV obtained from QBS were significantly correlated with those from QGS, but the regression lines were not close to the lines of identity. In conclusion, the measurements of LVEF and LVEDV with QBS have good reproducibility and are useful clinically, while those of RVEF and RVEDV are less useful compared with LVEF and LVEDV. The algorithm of QBS for the measurements of RVEF and RVEDV remains to be improved. PMID:17134018

  19. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... HUMAN SERVICES Food and Drug Administration Use of Influenza Disease Models To Quantitatively Evaluate... public workshop entitled: ``Use of Influenza Disease Models to Quantitatively Evaluate the Benefits and... hypothetical influenza vaccine, and to seek from a range of experts, feedback on the current version of...

  20. Quantitative evaluation of ozone and selected climate parameters in a set of EMAC simulations

    NASA Astrophysics Data System (ADS)

    Righi, M.; Eyring, V.; Gottschaldt, K.-D.; Klinger, C.; Frank, F.; Jöckel, P.; Cionni, I.

    2015-03-01

    Four simulations with the ECHAM/MESSy Atmospheric Chemistry (EMAC) model have been evaluated with the Earth System Model Validation Tool (ESMValTool) to identify differences in simulated ozone and selected climate parameters that resulted from (i) different setups of the EMAC model (nudged vs. free-running) and (ii) different boundary conditions (emissions, sea surface temperatures (SSTs) and sea ice concentrations (SICs)). To assess the relative performance of the simulations, quantitative performance metrics are calculated consistently for the climate parameters and ozone. This is important for the interpretation of the evaluation results since biases in climate can impact on biases in chemistry and vice versa. The observational data sets used for the evaluation include ozonesonde and aircraft data, meteorological reanalyses and satellite measurements. The results from a previous EMAC evaluation of a model simulation with nudging towards realistic meteorology in the troposphere have been compared to new simulations with different model setups and updated emission data sets in free-running time slice and nudged quasi chemistry-transport model (QCTM) mode. The latter two configurations are particularly important for chemistry-climate projections and for the quantification of individual sources (e.g., the transport sector) that lead to small chemical perturbations of the climate system, respectively. With the exception of some specific features which are detailed in this study, no large differences that could be related to the different setups (nudged vs. free-running) of the EMAC simulations were found, which offers the possibility to evaluate and improve the overall model with the help of shorter nudged simulations. The main differences between the two setups is a better representation of the tropospheric and stratospheric temperature in the nudged simulations, which also better reproduce stratospheric water vapor concentrations, due to the improved simulation of

  1. Quantitative evaluation of ozone and selected climate parameters in a set of EMAC simulations

    NASA Astrophysics Data System (ADS)

    Righi, M.; Eyring, V.; Gottschaldt, K.-D.; Klinger, C.; Frank, F.; Jöckel, P.; Cionni, I.

    2014-10-01

    Four simulations with the ECHAM/MESSy Atmospheric Chemistry (EMAC) model have been evaluated with the Earth System Model Validation Tool (ESMValTool) to identify differences in simulated ozone and selected climate parameters that resulted from (i) different setups of the EMAC model (nudged vs. free-running) and (ii) different boundary conditions (emissions, sea surface temperatures (SSTs) and sea-ice concentrations (SICs)). To assess the relative performance of the simulations, quantitative performance metrics are calculated consistently for the climate parameters and ozone. This is important for the interpretation of the evaluation results since biases in climate can impact on biases in chemistry and vice versa. The observational datasets used for the evaluation include ozonesonde and aircraft data, meteorological reanalyses and satellite measurements. The results from a previous EMAC evaluation of a model simulation with weak nudging towards realistic meteorology in the troposphere have been compared to new simulations with different model setups and updated emission datasets in free-running timeslice and nudged Quasi Chemistry-Transport Model (QCTM) mode. The latter two configurations are particularly important for chemistry-climate projections and for the quantification of individual sources (e.g. transport sector) that lead to small chemical perturbations of the climate system, respectively. With the exception of some specific features which are detailed in this study, no large differences that could be related to the different setups of the EMAC simulations (nudged vs. free-running) were found, which offers the possibility to evaluate and improve the overall model with the help of shorter nudged simulations. The main differences between the two setups is a better representation of the tropospheric and stratospheric temperature in the nudged simulations, which also better reproduce stratospheric water vapour concentrations, due to the improved simulation of

  2. Establishment and evaluation of event-specific quantitative PCR method for genetically modified soybean MON89788.

    PubMed

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Kitta, Kazumi

    2010-01-01

    A novel real-time PCR-based analytical method was established for the event-specific quantification of the GM soybean event MON89788. The conversion factor (Cf), which is required to calculate the GMO amount, was experimentally determined. The quantitative method was evaluated by a single-laboratory analysis and a blind test in a multi-laboratory trial. The limit of quantitation for the method was estimated to be 0.1% or lower. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), respectively, and the determined bias and RSDR values for the method were both less than 20%. These results suggest that the established method is suitable for practical detection and quantification of MON89788. PMID:21071908
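
    For context, the conversion factor enters event-specific qPCR quantification as a normalizer of the event-to-endogenous copy ratio. A hedged sketch of the usual form of this calculation; the copy numbers and Cf value are placeholders, not the paper's data.

    ```python
    def gmo_percent(event_copies, taxon_copies, cf):
        """Event-specific GMO content: the event/endogenous copy-number ratio,
        normalized by the experimentally determined conversion factor Cf
        (the same ratio measured in a 100% GM reference)."""
        return (event_copies / taxon_copies) / cf * 100.0

    # Illustrative qPCR copy numbers; Cf here is a placeholder value
    print(f"{gmo_percent(event_copies=420.0, taxon_copies=90000.0, cf=0.45):.2f} %")
    ```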

  3. Business Scenario Evaluation Method Using Monte Carlo Simulation on Qualitative and Quantitative Hybrid Model

    NASA Astrophysics Data System (ADS)

    Samejima, Masaki; Akiyoshi, Masanori; Mitsukuni, Koshichiro; Komoda, Norihisa

    We propose a business scenario evaluation method using a qualitative and quantitative hybrid model. In order to evaluate business factors with qualitative causal relations, we introduce statistical values based on the propagation and combination of the effects of business factors by Monte Carlo simulation. In propagating an effect, we divide the range of each factor by landmarks and decide the effect on a destination node based on the divided ranges. In combining effects, we decide the effect of each arc using its contribution degree and sum all effects. Application to practical models confirmed that, at the 5% risk level, there is no difference between results obtained from quantitative relations and results obtained by the proposed method.
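
    A minimal sketch of the propagation-and-combination idea, with an invented two-factor scenario; the landmark ranges, contribution degrees, and node structure are assumptions for illustration, not the authors' model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000  # Monte Carlo trials

    # Hypothetical scenario: two qualitative factors propagate effects to "profit".
    # Each factor's qualitative range is divided by landmarks and sampled;
    # contribution degrees weight how much each arc combines into the destination.
    demand_effect = rng.uniform(-1.0, 1.0, n)        # continuous landmark-divided range
    cost_effect = rng.choice([-0.5, 0.0, 0.5], n)    # discrete landmark levels
    contribution = {"demand": 0.7, "cost": 0.3}

    profit_effect = (contribution["demand"] * demand_effect
                     + contribution["cost"] * cost_effect)

    # Statistical summary of the combined effect at the destination node
    print(f"mean={profit_effect.mean():+.3f}  "
          f"5th pct={np.percentile(profit_effect, 5):+.3f}")
    ```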

  4. Evaluation of quantitative accuracy in CZT-based pre-clinical SPECT for various isotopes

    NASA Astrophysics Data System (ADS)

    Park, S.-J.; Yu, A. R.; Kim, Y.-s.; Kang, W.-S.; Jin, S. S.; Kim, J.-S.; Son, T. J.; Kim, H.-J.

    2015-05-01

    In vivo pre-clinical single-photon emission computed tomography (SPECT) is a valuable tool for functional small-animal imaging, but several physical factors, such as scatter radiation, limit the quantitative accuracy of conventional scintillation-crystal-based SPECT. Semiconductor detectors such as CZT overcome these deficiencies through superior energy resolution. To our knowledge, little scientific information exists regarding the accuracy of quantitative analysis in CZT-based pre-clinical SPECT systems for different isotopes. The aim of this study was to assess the quantitative accuracy of CZT-based pre-clinical SPECT for four isotopes: 201Tl, 99mTc, 123I, and 111In. The quantitative accuracy of the CZT-based Triumph X-SPECT (Gamma-Medica Ideas, Northridge, CA, U.S.A.) was compared with that of a conventional SPECT using GATE simulation. Quantitative errors due to attenuation and scatter effects were evaluated for all four isotopes with energy windows of 5%, 10%, and 20%. A spherical source containing the isotope was placed at the center of an air- or water-filled mouse-sized cylinder phantom. The CZT-based pre-clinical SPECT was more accurate than the conventional SPECT. For example, in the conventional SPECT with an energy window of 10%, scatter effects degraded quantitative accuracy by up to 11.52%, 5.10%, 2.88%, and 1.84% for 201Tl, 99mTc, 123I, and 111In, respectively. However, with the CZT-based pre-clinical SPECT, the degradations were only 9.67%, 5.45%, 2.36%, and 1.24% for 201Tl, 99mTc, 123I, and 111In, respectively. As the energy window was increased, the quantitative errors increased in both SPECT systems. Additionally, isotopes with lower-energy photon emissions had greater quantitative error. Our results demonstrated that the CZT-based pre-clinical SPECT had lower overall quantitative errors due to reduced scatter and high detection efficiency. Furthermore, the results of this systematic assessment quantifying the accuracy of these SPECT

  5. Quantitative Methods for Evaluating the Efficacy of Thalamic Deep Brain Stimulation in Patients with Essential Tremor

    PubMed Central

    Wastensson, Gunilla; Holmberg, Björn; Johnels, Bo; Barregard, Lars

    2013-01-01

    Background Deep brain stimulation (DBS) of the thalamus is a safe and efficient method for treatment of disabling tremor in patients with essential tremor (ET). However, successful tremor suppression after surgery requires careful selection of stimulus parameters. Our aim was to examine the possible use of certain quantitative methods for evaluating the efficacy of thalamic DBS in ET patients in clinical practice, and to compare these methods with traditional clinical tests. Methods We examined 22 patients using the Essential Tremor Rating Scale (ETRS) and quantitative assessment of tremor with the stimulator both activated and deactivated. We used an accelerometer (CATSYS tremor Pen) for quantitative measurement of postural tremor, and a eurythmokinesimeter (EKM) to evaluate kinetic tremor in a rapid pointing task. Results The efficacy of DBS on tremor suppression was prominent irrespective of the method used. The agreement between clinical rating of postural tremor and tremor intensity as measured by the CATSYS tremor pen was relatively high (rs = 0.74). The agreement between kinetic tremor as assessed by the ETRS and the main outcome variable from the EKM test was low (rs = 0.34). The lack of agreement indicates that the EKM test is not comparable with the clinical test. Discussion Quantitative methods, such as the CATSYS tremor pen, could be a useful complement to clinical tremor assessment in evaluating the efficacy of DBS in clinical practice. Future studies should evaluate the precision of these methods and the long-term impact on tremor suppression, activities of daily living (ADL) function, and quality of life. PMID:24255800
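
    As a minimal illustration of the agreement statistic used above, Spearman's rank correlation between a clinical rating and an instrumented measure can be computed directly; the paired values below are fabricated placeholders, not the study's data:

      import numpy as np
      from scipy.stats import spearmanr

      etrs = np.array([3, 1, 2, 0, 4, 2, 1, 3])                    # clinical postural tremor ratings
      accel = np.array([1.8, 0.6, 1.1, 0.2, 2.9, 1.3, 0.4, 2.1])   # accelerometer tremor intensity

      rs, p = spearmanr(etrs, accel)  # rank correlation, as reported in the abstract
      print(f"rs = {rs:.2f}, p = {p:.3f}")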

  6. Manipulator Performance Evaluation Using Fitts' Taping Task

    SciTech Connect

    Draper, J.V.; Jared, B.C.; Noakes, M.W.

    1999-04-25

    Metaphorically, a teleoperator with master controllers projects the user's arms and hands into a remote area. Therefore, human users interact with teleoperators at a more fundamental level than they do with most human-machine systems. Instead of inputting decisions about how the system should function, teleoperator users input the movements they might make if they were truly in the remote area, and the remote machine must recreate their trajectories and impedance. This intense human-machine interaction requires displays and controls more carefully attuned to human motor capabilities than is necessary with most systems. It is important for teleoperated manipulators to be able to recreate human trajectories and impedance in real time. One method for assessing manipulator performance is to observe how well a system behaves while a human user completes human dexterity tasks with it. Fitts' tapping task has been used many times in the past for this purpose. This report describes such a performance assessment. The International Submarine Engineering (ISE) Autonomous/Teleoperated Operations Manipulator (ATOM) servomanipulator system was evaluated using a generic positioning accuracy task. The task is a simple one but has the merits of (1) producing a performance function estimate rather than a point estimate and (2) being widely used in the past for human and servomanipulator dexterity tests. Results of testing using this task may, therefore, allow comparison with other manipulators, and the task is generically representative of a broad class of tasks. Results of the testing indicate that the ATOM manipulator is capable of performing the task. Force reflection had a negative impact on task efficiency in these data. This was most likely caused by the high resistance to movement the master controller exhibited with the force reflection engaged. Measurements of exerted forces were not made, so it is not possible to say whether the force reflection helped participants
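
    For readers unfamiliar with the task, a Fitts analysis reduces each distance/width condition to an index of difficulty (ID) and fits movement time against it, which is what yields a performance function rather than a point estimate. A sketch under assumed data (the Shannon formulation of ID is used here; the trial numbers are invented, not the ATOM results):

      import math
      import numpy as np

      def index_of_difficulty(distance, width):
          # Fitts' index of difficulty, Shannon formulation, in bits.
          return math.log2(distance / width + 1.0)

      # (target distance m, target width m, mean movement time s) -- invented
      trials = [(0.10, 0.02, 0.62), (0.20, 0.02, 0.81),
                (0.20, 0.01, 0.97), (0.40, 0.01, 1.23)]

      ids = np.array([index_of_difficulty(d, w) for d, w, _ in trials])
      mts = np.array([mt for _, _, mt in trials])

      # Linear fit MT = a + b*ID; 1/b (bits per second) is the throughput.
      b, a = np.polyfit(ids, mts, 1)
      print(f"a = {a:.3f} s, b = {b:.3f} s/bit, throughput = {1/b:.2f} bit/s")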

  7. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    In-flight loss of control is the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to identify the quantitative standards for assessing upset recovery performance. This review contains current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, and recovery time, as well as whether the input was correct or incorrect. Other metrics include the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum air speed, maximum bank angle, and maximum g loading, are reviewed as well.

  8. [Quantitative analysis of (-)-epigallocatechin gallate in tea leaves by high-performance liquid chromatography].

    PubMed

    Sakata, I; Ikeuchi, M; Maruyama, I; Okuda, T

    1991-12-01

    The quantitative analysis of (-)-epigallocatechin gallate (EGCG) in tea (Camellia sinensis L.) was performed by high-performance liquid chromatography (HPLC) with a C-18 reversed-phase column. EGCG was eluted within 20 min using methanol-water-acetic acid (20:75:5 (v/v/v)) as the eluent, with tryptophan as an internal standard. The content of EGCG in five kinds of tea (sencha, gyokuro, bancha, matsucha, and oolong tea), and in a cup of each, was determined by both an extraction method with 50% (v/v) methanol and an infusion method with water. The largest amount of EGCG was obtained from matsucha by the extraction method, and from sencha by the infusion method. Furthermore, the EGCG contents in various parts of the tea plant were examined. The first leaf had the highest concentration of EGCG, and the concentration decreased with the aging of the leaf. PMID:1806661

  9. Performance Evaluation of the SPT-140

    NASA Technical Reports Server (NTRS)

    Manzella, David; Sarmiento, Charles; Sankovic, John; Haag, Tom

    1997-01-01

    As part of an on-going cooperative program with industry, an engineering model SPT-140 Hall thruster, which may be suitable for orbit insertion and station-keeping of geosynchronous communication satellites, was evaluated with respect to thrust and radiated electromagnetic interference at the NASA Lewis Research Center. Performance measurements were made using a laboratory model propellant feed system and commercial power supplies. The engine was operated in a space simulation chamber capable of providing background pressures of 4 x 10(exp -6) Torr or less during thruster operation. Thrust was measured at input powers ranging from 1.5 to 5 kilowatts with two different output filter configurations. The broadband electromagnetic emission spectrum generated by the engine was also measured over a range of frequencies from 0.01 to 18,000 MHz. These results are compared to the noise threshold of the measurement system and MIL-STD-461C where appropriate.

  10. A Method for Missile Autopilot Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Eguchi, Hirofumi

    The essential benefit of HardWare-In-the-Loop (HWIL) simulation is that the performance of an autopilot system can be evaluated realistically, without modeling error, by using actual hardware such as seeker systems, autopilot systems, and servo equipment. HWIL simulation, however, requires very expensive facilities, in which the target model generator is an indispensable subsystem. In this paper, one example of an HWIL simulation facility with a target model generator for RF seeker systems is introduced. Like almost all other generators, however, this generator has a functional limitation on the line-of-sight angle; a test method to overcome the line-of-sight angle limitation is therefore proposed.

  11. Performance evaluation of conventional chiller systems

    SciTech Connect

    Beyene, A.

    1995-06-01

    This article describes an optimization technique to reduce chiller energy usage by evaluating energy saving strategies. In most commercial buildings and industrial plants, HVAC systems are the largest energy consumers and offer the owners significant potential for savings. Chillers are also of interest to utility companies because they operate during cooling times that overlap the peak hours of warmer climate zones, thereby contributing to peak energy demands. The key performance parameter in chiller analysis is the kW/ton of refrigeration, which is the ratio of the electrical energy consumed to the cooling energy delivered. To obtain the kW/ton of refrigeration for a chiller, the electric power consumption (kW) of the compressor should be measured, or calculated if the instantaneous current and voltage are known.
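
    A minimal sketch of that calculation, assuming a three-phase compressor and an assumed power factor (both would be measured on the actual machine):

      import math

      def compressor_power_kw(volts, amps, power_factor=0.9, three_phase=True):
          # Electrical input power from instantaneous voltage and current.
          p = volts * amps * power_factor
          if three_phase:
              p *= math.sqrt(3)
          return p / 1000.0

      def kw_per_ton(power_kw, cooling_tons):
          # Chiller efficiency: electrical kW per ton of refrigeration
          # (1 ton of refrigeration = 3.517 kW of cooling delivered).
          return power_kw / cooling_tons

      p = compressor_power_kw(460, 210)      # e.g., 460 V, 210 A (illustrative)
      print(f"{p:.1f} kW -> {kw_per_ton(p, 150):.2f} kW/ton at 150 tons")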

  12. Performance evaluation of mail-scanning cameras

    NASA Astrophysics Data System (ADS)

    Rajashekar, Umesh; Vu, Tony Tuan; Hooning, John E.; Bovik, Alan Conrad

    2010-04-01

    Letter-scanning cameras (LSCs) form the front-end imaging systems for virtually all mail-scanning systems that are currently used to automatically sort mail products. As with any vision-dependent technology, the quality of the images generated by the camera is fundamental to the overall performance of the system. We present novel techniques for objective evaluation of LSCs using comparative imaging, a technique that involves measuring the fidelity of target images produced by a camera with reference to an image of the same target captured at very high quality. Such a framework provides a unique opportunity to directly quantify the camera's ability to capture real-world targets, such as handwritten and printed text. Noncomparative techniques were also used to measure properties such as the camera's modulation transfer function, dynamic range, and signal-to-noise ratio. To simulate real-world imaging conditions, application-specific test samples were designed using actual mail product materials.

  13. A performance evaluation system for photomultiplier tubes

    NASA Astrophysics Data System (ADS)

    Xia, J.; Qian, S.; Wang, W.; Ning, Z.; Cheng, Y.; Wang, Z.; Li, X.; Qi, M.; Heng, Y.; Liu, S.; Lei, X.

    2015-03-01

    A comprehensive performance evaluation system for photomultiplier tubes (PMTs) has been built. The system is able to review diverse cathode and anode properties for PMTs of different sizes and dimensions. Relative and direct methods were developed for the quantum efficiency measurement, and the results are consistent with each other. Two-dimensional and three-dimensional scanning platforms were built to test both the cathode and anode uniformity for either plane-type or spherical-type photocathodes. A Flash Analog-to-Digital Converter module is utilized to achieve high-speed waveform sampling. The entire system is highly automatic and flexible. Details of the system and some typical experimental results are presented in this paper.

  14. Evaluation of Iron Content in Human Cerebral Cavernous Malformation using Quantitative Susceptibility Mapping

    PubMed Central

    Tan, Huan; Liu, Tian; Wu, Ying; Thacker, Jon; Shenkar, Robert; Mikati, Abdul Ghani; Shi, Changbin; Dykstra, Conner; Wang, Yi; Prasad, Pottumarthi V.; Edelman, Robert R.; Awad, Issam A.

    2014-01-01

    Objectives To investigate and validate quantitative susceptibility mapping (QSM) for lesional iron quantification in cerebral cavernous malformations (CCM). Materials and Methods Magnetic resonance imaging (MRI) studies were performed in phantoms and 16 patients on a 3T scanner. QSM, susceptibility weighted imaging (SWI), and R2* maps were reconstructed from in vivo data acquired with a three-dimensional, multi-echo, T2*-weighted gradient echo sequence. Magnetic susceptibility measurements were correlated to SWI and R2* results. In addition, iron concentrations from surgically excised CCM lesion specimens were determined using inductively coupled plasma mass spectrometry and correlated with QSM measurements. Results The QSM images demonstrated excellent image quality for depicting CCM lesions in both sporadic and familial cases. Susceptibility measurements revealed a positive linear correlation with R2* values (R2 = 0.99 for total, R2 = 0.69 for mean; p < 0.01). QSM values of known iron-rich brain regions matched closely with previous studies and showed high interobserver consistency. A strong correlation was found between QSM and the iron concentration of phantoms (0.925, p < 0.01), as well as between QSM and mass spectrometry estimates of iron deposition (0.999 for total iron, 0.86 for iron concentration; p < 0.01) in 18 fragments of 4 excised human CCM lesion specimens. Conclusions The ability of QSM to evaluate iron deposition in CCM lesions was illustrated via phantom, in vivo, and ex vivo validation studies. QSM may be a potential biomarker for monitoring CCM disease activity and response to treatments. PMID:24619210

  15. Evaluation of a quantitative magnetic resonance imaging system for whole body composition analysis in rodents.

    PubMed

    Nixon, Joshua P; Zhang, Minzhi; Wang, ChuanFeng; Kuskowski, Michael A; Novak, Colleen M; Levine, James A; Billington, Charles J; Kotz, Catherine M

    2010-08-01

    We evaluated the EchoMRI-900 combination rat and mouse quantitative magnetic resonance (QMR) body composition method in comparison to traditional whole-body chemical carcass composition analysis (CCA) for measurements of fat and fat-free mass in rodents. Live and postmortem (PM) QMR fat and lean mass measurements were obtained for lean, obese and outbred strains of rats and mice, and compared with measurements obtained using CCA. A second group of rats was measured before and after 18 h food or water deprivation. Significant positive correlations between QMR and CCA fat and lean mass measurements were shown for rats and mice. Although all live QMR fat and lean measurements were more precise than CCA for rats, values obtained for mice significantly differed from CCA for lean mass only. QMR performed PM slightly overestimated fat and lean values relative to live QMR but did not show lower precision than live QMR. Food deprivation reduced values for both fat and lean mass; water deprivation reduced estimates of lean mass only. In summary, all measurements using this QMR system were comparable to those obtained by CCA, but with higher overall precision, similar to previous reports for the murine QMR system. However, PM QMR measurements slightly overestimated live QMR values, and lean and fat mass measurements in this QMR system are influenced by hydration status and animal size, respectively. Despite these caveats, we conclude that the EchoMRI QMR system offers a fast in vivo method of body composition analysis, well correlated to but with greater overall precision than CCA. PMID:20057373

  16. Performance Evaluations of Ceramic Wafer Seals

    NASA Technical Reports Server (NTRS)

    Dunlap, Patrick H., Jr.; DeMange, Jeffrey J.; Steinetz, Bruce M.

    2006-01-01

    Future hypersonic vehicles will require high temperature, dynamic seals in advanced ramjet/scramjet engines and on the vehicle airframe to seal the perimeters of movable panels, flaps, and doors. Seal temperatures in these locations can exceed 2000 F, especially when the seals are in contact with hot ceramic matrix composite sealing surfaces. NASA Glenn Research Center is developing advanced ceramic wafer seals to meet the needs of these applications. High temperature scrub tests performed between silicon nitride wafers and carbon-silicon carbide rub surfaces revealed high friction forces and evidence of material transfer from the rub surfaces to the wafer seals. Stickage between adjacent wafers was also observed after testing. Several design changes to the wafer seals were evaluated as possible solutions to these concerns. Wafers with recessed sides were evaluated as a potential means of reducing friction between adjacent wafers. Alternative wafer materials are also being considered as a means of reducing friction between the seals and their sealing surfaces and because the baseline silicon nitride wafer material (AS800) is no longer commercially available.

  17. QUANTITATIVE EVALUATION OF ANTERIOR SEGMENT PARAMETERS IN THE ERA OF IMAGING

    PubMed Central

    Dorairaj, Syril; Liebmann, Jeffrey M.; Ritch, Robert

    2007-01-01

    Purpose To review the parameters for quantitative assessment of the anterior segment and iridocorneal angle and to develop a comprehensive schematic for the evaluation of angle anatomy and pathophysiology by high-resolution imaging. Methods The published literature of the last 15 years was reviewed, analyzed, and organized into a construct for assessment of anterior segment processes. Results Modern anterior segment imaging techniques have allowed us to devise new quantitative parameters to improve the information obtained. Ultrasound biomicroscopy, slit-lamp optical coherence tomography, and anterior segment optical coherence tomography provide high-resolution images for analysis of physiologic and pathologic processes. These include iridocorneal angle analysis (eg, angle opening distance, angle recess area, trabecular-iris space area), anterior and posterior chamber depth and area, iris and ciliary body cross-sectional area and volume, quantitative anatomic relationships between structures, and videographic analysis of iris movement and accommodative changes under various conditions. Modern devices permit imaging of the entire anterior chamber, allowing calculation of anterior chamber and pupillary diameters and correlating these with measurement of anterior chamber dynamics in light vs dark conditions. We have tabulated all reported anterior segment measurement modalities and devised a construct for assessment of normal and abnormal conditions. Conclusion Quantitative measurement of static and dynamic anterior segment parameters, both normal and abnormal, provides a broad range of parameters for analysis of the numerous aspects of the pathophysiology of the anterior segment of the eye. PMID:18427599

  18. 48 CFR 236.201 - Evaluation of contractor performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONTRACTS Special Aspects of Contracting for Construction 236.201 Evaluation of contractor performance. (a) Preparation of performance evaluation reports. Use DD Form 2626, Performance Evaluation (Construction)...

  19. EXTRACTION AND QUANTITATIVE ANALYSIS OF ELEMENTAL SULFUR FROM SULFIDE MINERAL SURFACES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY. (R826189)

    EPA Science Inventory

    A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...

  20. IDENTIFICATION AND QUANTITATION OF ALKYLATED NUCLEOBASES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY WITH UV PHOTODIODE ARRAY DETECTION

    EPA Science Inventory

    The application of UV diode array detection in high-performance liquid chromatographic (HPLC) identification and quantitation of several classes of synthetic and commercially available alkylated nucleobases is investigated. Quantitative spectral overlays of these compounds to meth...

  1. Evaluating the Performance of Calculus Classes Using Operational Research Tools.

    ERIC Educational Resources Information Center

    Soares de Mello, Joao Carlos C. B.; Lins, Marcos P. E.; Soares de Mello, Maria Helena C.; Gomes, Eliane G.

    2002-01-01

    Compares the efficiency of calculus classes by evaluating two kinds of classes: traditional classes and those that use computational methods in teaching. Applies quantitative evaluation methods using two operational research tools: multicriteria decision aid methods (mainly the MACBETH approach) and data envelopment analysis. (Author/YDS)

  2. Performance Evaluation Modeling of Network Sensors

    NASA Technical Reports Server (NTRS)

    Clare, Loren P.; Jennings, Esther H.; Gao, Jay L.

    2003-01-01

    Substantial benefits are promised by operating many spatially separated sensors collectively. Such systems are envisioned to consist of sensor nodes that are connected by a communications network. A simulation tool is being developed to evaluate the performance of networked sensor systems, incorporating such metrics as target detection probabilities, false alarm rates, and classification confusion probabilities. The tool will be used to determine configuration impacts associated with such aspects as spatial laydown, mixture of different types of sensors (acoustic, seismic, imaging, magnetic, RF, etc.), and fusion architecture. The QualNet discrete-event simulation environment serves as the underlying basis for model development and execution. This platform is recognized for its capabilities in efficiently simulating networking among mobile entities that communicate via wireless media. We are extending QualNet's communications modeling constructs to capture the sensing aspects of multi-target sensing (analogous to multiple access communications), unimodal multi-sensing (broadcast), and multi-modal sensing (multiple channels and correlated transmissions). Methods are also being developed for modeling the sensor signal sources (transmitters), signal propagation through the media, and sensors (receivers) that are consistent with the discrete-event paradigm needed for performance determination of sensor network systems. This work is supported under the Microsensors Technical Area of the Army Research Laboratory (ARL) Advanced Sensors Collaborative Technology Alliance.

  3. Performance evaluations of the ATST secondary mirror

    NASA Astrophysics Data System (ADS)

    Cho, Myung K.; DeVries, Joseph; Hansen, Eric

    2007-09-01

    The Advanced Technology Solar Telescope (ATST) has a 4.24 m off-axis primary mirror designed to deliver diffraction-limited images of the sun. Its baseline secondary mirror (M2) design uses a 0.65 m diameter Silicon Carbide mirror mounted kinematically by a bi-pod flexure mechanism at three equally spaced locations. Unlike in other common telescopes, the ATST M2 is exposed to a significant solar heat load. A thermal management system (TMS) will be developed to accommodate the solar loading and minimize the "mirror seeing" effect by controlling the temperature difference between the M2 optical surface and the ambient air at the site. Thermo-elastic analyses of the steady-state thermal behavior of the ATST secondary mirror were performed using finite element analysis in I-DEAS(TM), with PCFRINGE(TM) used for the optical analysis. We examined extensive heat transfer simulation cases, and their results are discussed. The goal of this study is to evaluate the optical performance of M2 using thermal and mechanical models. Thermal responses from the models enable us to manipulate time-dependent thermal loadings to synthesize the operational environment for the design and development of the TMS.

  4. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  5. Space Shuttle UHF Communications Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Loh, Yin-Chung; Kroll, Quin D.; Sham, Catherine C.

    2004-01-01

    An extension boom is to be installed on the starboard side of the Space Shuttle Orbiter (SSO) payload bay for thermal tile inspection and repair. As a result, the Space Shuttle payload bay Ultra High Frequency (UHF) antenna will be under the boom. This study evaluates the Space Shuttle UHF communication performance for the antenna at a suitable new location. To ensure RF coverage performance at the proposed new locations, the link margin between the UHF payload bay antenna and Extravehicular Activity (EVA) astronauts at a range distance of 160 meters from the payload bay antenna was analyzed. The communication performance between the Space Shuttle Orbiter and the International Space Station (SSO-ISS) during rendezvous was also investigated, as were the multipath effects from payload bay structures surrounding the payload bay antenna. A computer simulation tool based on the Geometrical Theory of Diffraction (GTD) method was used to compute the signal strengths. The total field strength was obtained by summing the direct fields from the antennas and the reflected and diffracted fields from the surrounding structures. The computed signal strengths were compared to the signal strength corresponding to the 0 dB link margin. Based on the results obtained in this study, RF coverage for the SSO-EVA and SSO-ISS communication links was determined for the proposed payload bay UHF antenna locations. The RF radiation to the Orbiter Docking System (ODS) pyros, the payload bay avionics, and the Shuttle Remote Manipulator System (SRMS) from the proposed new UHF antenna location was also investigated to ensure EMC/EMI compliance.
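
    For the direct path, the link-margin idea reduces to a Friis-style budget; the GTD analysis adds the reflected and diffracted field terms to this sum. A sketch with invented numbers (transmit power, antenna gains, frequency, and receiver sensitivity are assumptions, not the study's values):

      import math

      C = 299_792_458.0  # speed of light, m/s

      def fspl_db(freq_hz, dist_m):
          # Free-space path loss in dB.
          return 20.0 * math.log10(4.0 * math.pi * dist_m * freq_hz / C)

      def link_margin_db(tx_dbm, tx_gain_dbi, rx_gain_dbi, freq_hz, dist_m,
                         rx_sensitivity_dbm):
          # Direct-path received power minus the receiver sensitivity.
          rx_dbm = tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(freq_hz, dist_m)
          return rx_dbm - rx_sensitivity_dbm

      # 0.5 W transmitter, unity-gain antennas, 414 MHz, 160 m EVA range,
      # -100 dBm receiver sensitivity -- all illustrative.
      print(f"{link_margin_db(27.0, 0.0, 0.0, 414e6, 160.0, -100.0):.1f} dB margin")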

  6. 48 CFR 1252.216-72 - Performance evaluation plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    1252.216-72 Performance evaluation plan. As prescribed in (TAR) 48 CFR 1216.406(b), insert the following clause: Performance Evaluation Plan (OCT 1994) (a) A Performance Evaluation Plan shall be unilaterally...

  7. Rapid Quantitation of Furanocoumarins and Flavonoids in Grapefruit Juice using Ultra Performance Liquid Chromatography

    PubMed Central

    VanderMolen, Karen M.; Cech, Nadja B.; Paine, Mary F.

    2013-01-01

    Introduction Grapefruit juice can increase or decrease the systemic exposure of myriad oral medications, leading to untoward effects or reduced efficacy. Furanocoumarins in grapefruit juice have been established as inhibitors of cytochrome P450 3A (CYP3A)-mediated metabolism and P-glycoprotein (P-gp)-mediated efflux, while flavonoids have been implicated as inhibitors of organic anion transporting polypeptide (OATP)-mediated absorptive uptake in the intestine. The potential for drug interactions with a food product necessitates an understanding of the expected concentrations of a suite of structurally diverse and potentially bioactive compounds. Objective Develop methods for the rapid quantitation of two furanocoumarins (bergamottin and 6′,7′-dihydroxybergamottin) and four flavonoids (naringin, naringenin, narirutin, and hesperidin) in five grapefruit juice products using ultra performance liquid chromatography (UPLC). Methodology Grapefruit juice products were extracted with ethyl acetate; the concentrated extract was analyzed by UPLC using acetonitrile:water gradients and a C18 column. Analytes were detected using a photodiode array detector, set at 250 nm (furanocoumarins) and 310 nm (flavonoids). Intraday and interday precision and accuracy and limits of detection and quantitation were determined. Results Rapid (<5.0 min) UPLC methods were developed to measure the aforementioned furanocoumarins and flavonoids. R2 values for the calibration curves of all analytes were >0.999. Considerable between-juice variation in the concentrations of these compounds was observed, and the quantities measured were in agreement with the concentrations published in HPLC studies. Conclusion These analytical methods provide an expedient means to quantitate key furanocoumarins and flavonoids in grapefruit juice and other foods used in dietary substance-drug interaction studies. PMID:23780830

  8. Quantitative morphological evaluation of laser ablation on calculus using full-field optical coherence microscopy

    NASA Astrophysics Data System (ADS)

    Xiao, Q.; Lü, T.; Li, Z.; Fu, L.

    2011-10-01

    Quantitative morphological evaluation at high resolution is of significance for the study of laser-tissue interaction. In this paper, a full-field optical coherence microscopy (OCM) system with a high resolution of ~2 μm was developed to investigate the ablation of urinary calculus by a free-running Er:YAG laser. We quantitatively studied the morphological variation corresponding to changes in the energy setting of the Er:YAG laser. The experimental results show that full-field OCM enables quantitative evaluation of the morphological shape of craters and of material removal, and particularly of the fine structure. We also built a heat conduction model to simulate the process of laser-calculus interaction using the finite element method. Through the simulation, the removal region of the calculus was calculated from the temperature distribution. As a result, the depth, width, volume, and cross-sectional profile of the crater in the calculus measured by full-field OCM matched well with the theoretical results based on the heat conduction model. Both experimental and theoretical results confirm that thermal interaction is the dominant effect in the ablation of calculus by the Er:YAG laser, demonstrating the effectiveness of full-field OCM in studying laser-tissue interactions.

  9. Quantitative morphologic evaluation of magnetic resonance imaging during and after treatment of childhood leukemia

    PubMed Central

    Reddick, Wilburn E.; Laningham, Fred H.; Glass, John O.; Pui, Ching-Hon

    2008-01-01

    Introduction Medical advances over the last several decades, including CNS prophylaxis, have greatly increased survival in children with leukemia. As survival rates have increased, clinicians and scientists have been afforded the opportunity to further develop treatments to improve the quality of life of survivors by minimizing the long-term adverse effects. When evaluating the effect of antileukemia therapy on the developing brain, magnetic resonance (MR) imaging has been the preferred modality because it quantifies morphologic changes objectively and noninvasively. Method and results Computer-aided detection of changes on neuroimages enables us to objectively differentiate leukoencephalopathy from normal maturation of the developing brain. Quantitative tissue segmentation algorithms and relaxometry measures have been used to determine the prevalence, extent, and intensity of white matter changes that occur during therapy. More recently, diffusion tensor imaging has been used to quantify microstructural changes in the integrity of the white matter fiber tracts. MR perfusion imaging can be used to noninvasively monitor vascular changes during therapy. Changes in quantitative MR measures have been associated, to some degree, with changes in neurocognitive function during and after treatment Conclusion In this review, we present recent advances in quantitative evaluation of MR imaging and discuss how these methods hold the promise to further elucidate the pathophysiologic effects of treatment for childhood leukemia. PMID:17653705

  10. Evaluating 'good governance': The development of a quantitative tool in the Greater Serengeti Ecosystem.

    PubMed

    Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea

    2016-10-01

    Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring the effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The instrument, a set of 65 statements related to governance principles developed from a literature review, was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem-based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective. PMID:27566933
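
    The Cronbach's alpha reliability cited above has a compact standard form; a sketch with fabricated Likert-scale responses (not the survey's data):

      import numpy as np

      def cronbach_alpha(items):
          # items: (n_respondents x n_items) score matrix for one factor.
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
          total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
          return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

      scores = np.array([[4, 5, 4, 4],
                         [3, 3, 2, 3],
                         [5, 5, 4, 5],
                         [2, 3, 3, 2],
                         [4, 4, 5, 4]])   # five respondents, four statements
      print(f"alpha = {cronbach_alpha(scores):.2f}")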

  11. A new dynamic myocardial phantom for evaluation of SPECT and PET quantitation in systolic and diastolic conditions

    SciTech Connect

    Dreuille, O. de; Bendriem, B.; Riddell, C.

    1996-12-31

    We present a new dynamic myocardial phantom designed to evaluate SPECT and PET imaging in systolic and diastolic conditions. The phantom includes a thoracic attenuating medium, and the myocardial wall thickness can be varied during the scan. In this study the phantom was used with three different wall thicknesses, characteristic of systolic, end-diastolic, and pathologic end-diastolic conditions. The myocardium was filled with {sup 99m}Tc, {sup 18}F, and Gd and imaged by SPECT, PET, and MRI. SPECT attenuation correction was performed using a modified PET transmission. A bull's-eye image was obtained for all data, and wall ROIs were then drawn for analysis. Using MRI as a reference, errors from PET, SPECT, and attenuation-corrected SPECT were calculated. Systolic PET performance agreed with MRI. In diastole, quantitation loss occurred due to the wall thickness reduction compared to systole. Attenuation correction in SPECT led to a significant decrease of the error both in systole (from 29% to 14%) and in diastole (35% to 22%). The improvement is particularly marked for the septal and inferior walls. SPECT residual errors (14% in systole and 22% in pathologic end-diastole) are likely caused by scatter, noise, and depth-dependent resolution effects. The results obtained with this dynamic phantom demonstrate the quantitation improvement achieved in SPECT with attenuation correction and also reinforce the need for variable resolution correction in addition to attenuation correction.

  12. Quantitative evaluation of atherosclerotic plaque phantom by near-infrared multispectral imaging with three wavelengths

    NASA Astrophysics Data System (ADS)

    Nagao, Ryo; Ishii, Katsunori; Awazu, Kunio

    2014-03-01

    Atherosclerosis is a primary cause of critical ischemic disease. The risk of a critical event is related to the lipid content of unstable plaque. The near-infrared (NIR) range is effective for diagnosis of atherosclerotic plaque because of the absorption peaks of lipid. NIR multispectral imaging (NIR-MSI) is suitable for the evaluation of plaque because it can quickly provide spectroscopic information and a spatial image with a simple measurement system. The purpose of this study is to quantitatively evaluate the lipid concentrations in plaque phantoms with a NIR-MSI system. A NIR-MSI system was constructed with a supercontinuum light source, a grating spectrometer, and a MCT camera. Plaque phantoms with different concentrations of lipid were prepared by mixing bovine fat and a biological soft tissue model to mimic the different stages of unstable plaque. We evaluated the phantoms with the NIR-MSI system using three wavelengths in the band at 1200 nm. Multispectral images were processed by the spectral angle mapper method. As a result, the lipid areas of the phantoms were effectively highlighted using the three wavelengths. In addition, the concentrations of the lipid areas were classified according to the similarity between measured spectra and a reference spectrum. These results suggest the possibility of image enhancement and quantitative evaluation of lipid in unstable plaque with NIR-MSI.
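
    The spectral angle mapper step admits a short sketch: each pixel spectrum is scored by its angle to a reference spectrum, and small angles are classified as lipid. The three-band cube and the reference values below are placeholders, not measured spectra:

      import numpy as np

      def spectral_angle(cube, reference):
          # Angle (radians) between each pixel spectrum of an (H x W x bands)
          # cube and a reference spectrum.
          dot = (cube * reference).sum(axis=-1)
          norms = np.linalg.norm(cube, axis=-1) * np.linalg.norm(reference)
          return np.arccos(np.clip(dot / norms, -1.0, 1.0))

      cube = np.random.rand(64, 64, 3)          # stand-in three-band image
      lipid_ref = np.array([0.42, 0.85, 0.51])  # assumed lipid reference spectrum
      angle_map = spectral_angle(cube, lipid_ref)
      lipid_mask = angle_map < 0.1              # smaller angle = closer match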

  13. Quantitative evaluation of six graph based semi-automatic liver tumor segmentation techniques using multiple sets of reference segmentation

    NASA Astrophysics Data System (ADS)

    Su, Zihua; Deng, Xiang; Chefd'hotel, Christophe; Grady, Leo; Fei, Jun; Zheng, Dong; Chen, Ning; Xu, Xiaodong

    2011-03-01

    Graph based semi-automatic tumor segmentation techniques have demonstrated great potential in efficiently measuring tumor size from CT images. Comprehensive and quantitative validation is essential to ensure the efficacy of graph based tumor segmentation techniques in clinical applications. In this paper, we present a quantitative validation study of six graph based 3D semi-automatic tumor segmentation techniques using multiple sets of expert segmentation. The six segmentation techniques are the Random Walk (RW), Watershed based Random Walk (WRW), LazySnapping (LS), GraphCut (GHC), GrabCut (GBC), and GrowCut (GWC) algorithms. The validation was conducted using clinical CT data of 29 liver tumors and four sets of expert segmentation. The performance of the six algorithms was evaluated using accuracy and reproducibility. The accuracy was quantified using the Normalized Probabilistic Rand Index (NPRI), which takes into account the variation of multiple expert segmentations. The reproducibility was evaluated by the change of the NPRI from 10 different sets of user initializations. Our results from the accuracy test demonstrated that RW (0.63) showed the highest NPRI value, compared to WRW (0.61), GWC (0.60), GHC (0.58), LS (0.57), and GBC (0.27). The results from the reproducibility test indicated that GBC is more sensitive to user initialization than the other five algorithms. Compared to previous tumor segmentation validation studies using one set of reference segmentation, our evaluation methods use multiple sets of expert segmentation to address the inter- or intra-rater variability issue in ground truth annotation, and provide a quantitative assessment for comparing different segmentation algorithms.

  14. On the quantitative characterization of human body sway in experiments with long-term performance.

    PubMed

    Seidel, H; Bräuer, D; Bastek, R; Issel, I

    1978-01-01

    Two different conditions of standing were examined in young male subjects with an intact vestibular system by means of stabilography. Low-frequency sways were eliminated by means of a digital high-pass filter. The parameters of the histograms of amplitudes, the autocorrelation functions, and the power-spectral density were estimated. The quantitative parameters of the stabilograms are compared for the two standing positions, both for the frontal and sagittal planes. Quotients of estimates of variance components calculated on the basis of analysis of variance are stated as measures of reliability for the parameters of histograms of amplitudes and power-spectral density in defined frequency bands. The reliability of the standard deviations of amplitudes was much better than the reliability of skewness and excess. The reliability of spectral density differed for frequency ranges investigated. Generally, the quotients of estimates of variance components were higher for the spectral density in the frequency bands of sagittal stabilograms than in those of frontal ones. Performance significantly affected several parameters of histograms and the distributions of power-spectral density. Standard deviation of amplitudes and power-spectral density proved to be suitable for the quantitative characterization of stabilograms. PMID:752208
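
    The spectral estimation described above would today commonly be done with Welch's method after high-pass filtering; a sketch with an assumed 50 Hz sampling rate and placeholder data in place of a recorded stabilogram:

      import numpy as np
      from scipy import signal

      fs = 50.0                                   # assumed sampling rate, Hz
      sway = np.random.randn(int(60 * fs))        # placeholder 60 s stabilogram

      # High-pass filter to remove low-frequency sway components.
      sos = signal.butter(4, 0.1, btype="highpass", fs=fs, output="sos")
      filtered = signal.sosfiltfilt(sos, sway)

      # Power-spectral density; band power can then be integrated over
      # defined frequency bands for the reliability analysis.
      freqs, psd = signal.welch(filtered, fs=fs, nperseg=512)
      band = (freqs >= 0.5) & (freqs < 2.0)       # example band, Hz
      band_power = np.trapz(psd[band], freqs[band])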

  15. Identification and quantitation of asparagine and citrulline using high-performance liquid chromatography (HPLC).

    PubMed

    Bai, Cheng; Reilly, Charles C; Wood, Bruce W

    2007-01-01

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-eluting asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone elutes earlier than citrulline alone, but when both are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine:citrulline ratios (3:1, 1:1, and 1:3, corresponding to 75:25, 50:50, and 25:75 microMol ml(-1)/microMol ml(-1)), the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects the relative concentrations of the Urea Cycle intermediates asparagine and citrulline present in sap. Consequently, the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides. PMID:19662174

  16. Importance of Purity Evaluation and the Potential of Quantitative 1H NMR as a Purity Assay

    PubMed Central

    2015-01-01

    In any biomedical and chemical context, a truthful description of chemical constitution requires coverage of both structure and purity. This qualification affects all drug molecules, regardless of development stage (early discovery to approved drug) and source (natural product or synthetic). Purity assessment is particularly critical in discovery programs and whenever chemistry is linked with biological and/or therapeutic outcome. Compared with chromatography and elemental analysis, quantitative NMR (qNMR) uses nearly universal detection and provides a versatile and orthogonal means of purity evaluation. Absolute qNMR with flexible calibration captures analytes that frequently escape detection (water, sorbents). Widely accepted structural NMR workflows require minimal or no adjustments to become practical 1H qNMR (qHNMR) procedures with simultaneous qualitative and (absolute) quantitative capability. This study reviews underlying concepts, provides a framework for standard qHNMR purity assays, and shows how adequate accuracy and precision are achieved for the intended use of the material. PMID:25295852
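
    For reference, absolute qHNMR purity assays with an internal calibrant commonly rest on a relation of the following form, where I is the integrated signal area, N the number of nuclei producing the signal, M the molar mass, m the weighed mass, and P the purity (subscripts a and cal denote analyte and calibrant). This is the standard relation from the qNMR literature, not a formula quoted from this paper:

      \[ P_a \;=\; \frac{I_a}{I_{cal}} \cdot \frac{N_{cal}}{N_a} \cdot \frac{M_a}{M_{cal}} \cdot \frac{m_{cal}}{m_a} \cdot P_{cal} \]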

  17. Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders

    PubMed Central

    Song, Yu; Kwak, Shin; Yoshida, Sohei; Yamamoto, Yoshiharu

    2014-01-01

    Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as an actigraph enable long-term recording of a patient's movement during activities and they can be used for quantitative assessment of symptoms due to various diseases. We reviewed some applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders such as motor and nonmotor disorders like Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) for vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods to assess more neurological or psychopathic disorders using actigraphy records. PMID:25214709

  18. Early Prediction and Evaluation of Breast Cancer Response to Neoadjuvant Chemotherapy Using Quantitative DCE-MRI1

    PubMed Central

    Tudorica, Alina; Oh, Karen Y; Chui, Stephen Y-C; Roy, Nicole; Troxell, Megan L; Naik, Arpana; Kemmer, Kathleen A; Chen, Yiyi; Holtorf, Megan L; Afzal, Aneela; Springer, Charles S; Li, Xin; Huang, Wei

    2016-01-01

    The purpose is to compare quantitative dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) metrics with imaging tumor size for early prediction of breast cancer response to neoadjuvant chemotherapy (NACT) and evaluation of residual cancer burden (RCB). Twenty-eight patients with 29 primary breast tumors underwent DCE-MRI exams before, after one cycle of, at midpoint of, and after NACT. MRI tumor size in the longest diameter (LD) was measured according to the RECIST (Response Evaluation Criteria In Solid Tumors) guidelines. Pharmacokinetic analyses of DCE-MRI data were performed with the standard Tofts and Shutter-Speed models (TM and SSM). After one NACT cycle the percent changes of the DCE-MRI parameters Ktrans (contrast agent plasma/interstitium transfer rate constant), ve (extravascular extracellular volume fraction), kep (intravasation rate constant), and the SSM-unique τi (mean intracellular water lifetime) are good to excellent early predictors of pathologic complete response (pCR) vs. non-pCR, with univariate logistic regression C statistic values in the range of 0.804 to 0.967. ve values after one cycle and at NACT midpoint are also good predictors of response, with C ranging from 0.845 to 0.897. However, RECIST LD changes are poor predictors, with C = 0.609 and 0.673, respectively. Post-NACT Ktrans, τi, and RECIST LD show statistically significant (P < .05) correlations with RCB. The performances of TM and SSM analyses for early prediction of response and RCB evaluation are comparable. In conclusion, quantitative DCE-MRI parameters are superior to imaging tumor size for early prediction of therapy response. Both TM and SSM analyses are effective for therapy response evaluation. However, the τi parameter derived only with SSM analysis allows the unique opportunity to potentially quantify therapy-induced changes in tumor energetic metabolism. PMID:26947876
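
    For context, the standard Tofts model named above relates the tissue contrast agent concentration Ct(t) to the arterial plasma concentration Cp(t) in the textbook form

      \[ C_t(t) \;=\; K^{\mathrm{trans}} \int_0^t C_p(\tau)\, e^{-k_{ep}(t-\tau)}\, d\tau, \qquad k_{ep} = \frac{K^{\mathrm{trans}}}{v_e} \]

    while the Shutter-Speed model additionally accounts for finite transcytolemmal water exchange through the τi parameter.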

  19. Quantitative evaluation of mask phase defects from through-focus EUV aerial images

    SciTech Connect

    Mochi, Iacopo; Yamazoe, Kenji; Neureuther, Andrew; Goldberg, Kenneth A.

    2011-02-21

    Mask defect inspection and imaging is one of the most important issues for any pattern transfer lithography technology. This is especially true for EUV lithography, where the wavelength-specific properties of masks and defects necessitate actinic inspection for a faithful prediction of defect printability and repair performance. In this paper we present a technique to obtain a quantitative characterization of mask phase defects from EUV aerial images. We apply this technique to measure the aerial image phase of native defects on a blank mask, measured with the SEMATECH Berkeley Actinic Inspection Tool (AIT), an EUV zoneplate microscope that operates at Lawrence Berkeley National Laboratory. The measured phase is compared with predictions made from AFM top-surface measurements of those defects. While amplitude defects are usually easy to recognize and quantify with standard inspection techniques like scanning electron microscopy (SEM), defects or structures that have a phase component can be much more challenging to inspect. A phase defect can originate from the substrate or from any level of the multilayer. In both cases its effect on the reflected field is not directly related to the local topography of the mask surface but depends on the deformation of the multilayer structure. Using the AIT, we have previously shown that EUV inspection provides a faithful and reliable way to predict the appearance of mask defects on the printed wafer; but to obtain a complete characterization of a defect we need to evaluate its phase component quantitatively. While aerial imaging does not provide a direct measurement of the phase of the object, this information is encoded in the through-focus evolution of the image intensity distribution. Recently we developed a technique that allows us to extract the complex amplitude of EUV mask defects using two aerial images from different focal planes. The method for the phase reconstruction is derived from the Gerchberg-Saxton (GS
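
    A two-plane Gerchberg-Saxton-style reconstruction can be sketched as alternating modulus projections between the focal planes. The quadratic-phase defocus operator below is a simplified stand-in for the real imaging model, and all numbers are illustrative:

      import numpy as np

      def make_defocus_pair(shape, defocus_phase):
          # Simplified Fourier-domain defocus operators (quadratic spectral phase).
          fy = np.fft.fftfreq(shape[0])[:, None]
          fx = np.fft.fftfreq(shape[1])[None, :]
          h = np.exp(1j * defocus_phase * (fx**2 + fy**2))
          fwd = lambda u: np.fft.ifft2(np.fft.fft2(u) * h)
          bwd = lambda u: np.fft.ifft2(np.fft.fft2(u) / h)
          return fwd, bwd

      def gs_two_plane(i1, i2, fwd, bwd, n_iter=200):
          # Recover a complex field from two through-focus intensity images.
          a1, a2 = np.sqrt(i1), np.sqrt(i2)
          field = a1.astype(complex)                     # start with flat phase
          for _ in range(n_iter):
              f2 = fwd(field)
              f2 = a2 * np.exp(1j * np.angle(f2))        # impose plane-2 modulus
              field = bwd(f2)
              field = a1 * np.exp(1j * np.angle(field))  # impose plane-1 modulus
          return field                                   # np.angle(field): phase map

      fwd, bwd = make_defocus_pair((128, 128), defocus_phase=40.0)
      truth = np.exp(1j * 0.3 * np.pad(np.ones((8, 8)), 60))   # synthetic phase bump
      i1, i2 = np.abs(truth) ** 2, np.abs(fwd(truth)) ** 2
      estimate = gs_two_plane(i1, i2, fwd, bwd)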

  20. DRACS thermal performance evaluation for FHR

    SciTech Connect

    Lv, Q.; Lin, H. C.; Kim, I. H.; Sun, X.; Christensen, R. N.; Blue, T. E.; Yoder, G. L.; Wilson, D. F.; Sabharwall, P.

    2015-03-01

    The Direct Reactor Auxiliary Cooling System (DRACS) is a passive decay heat removal system proposed for the Fluoride-salt-cooled High-temperature Reactor (FHR), which combines coated particle fuel and a graphite moderator with a liquid fluoride salt as the coolant. The DRACS features three coupled natural circulation/convection loops, relying completely on buoyancy as the driving force. These loops are coupled through two heat exchangers, namely, the DRACS Heat Exchanger and the Natural Draft Heat Exchanger. In addition, a fluidic diode is employed to minimize the parasitic flow into the DRACS primary loop, and correspondingly the heat loss to the DRACS during normal operation of the reactor, and to keep the DRACS ready for activation, if needed, during accidents. To help with the design and thermal performance evaluation of the DRACS, a computer code has been developed in MATLAB. This code is based on a one-dimensional formulation, and its principle is to solve the energy balance and integral momentum equations. By discretizing the DRACS system in the axial direction, a bulk mean temperature is assumed for each mesh cell. The temperatures of all the cells, as well as the mass flow rates in the DRACS loops, are predicted by solving the governing equations, which are obtained by integrating the energy conservation equation over each cell and integrating the momentum conservation equation over each of the DRACS loops. In addition, an intermediate heat transfer loop equipped with a pump has also been modeled in the code. This enables the study of the flow reversal phenomenon in the DRACS primary loop associated with the pump trip process. Experimental data from a High-Temperature DRACS Test Facility (HTDF) are not yet available to benchmark the code. A preliminary code validation was performed using natural circulation experimental data from the literature that are as closely relevant as possible. The code is subsequently applied to the HTDF that is under
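
    At its core, such a one-dimensional formulation balances buoyancy head against friction losses while the energy equation sets the loop temperature rise. A heavily simplified single-loop sketch (all property and geometry values are invented placeholders, not DRACS design data; the actual code discretizes many cells over three coupled loops):

      RHO = 1940.0      # coolant density, kg/m^3 (assumed)
      BETA = 2.5e-4     # thermal expansion coefficient, 1/K (assumed)
      CP = 2386.0       # specific heat, J/(kg K) (assumed)
      G, H = 9.81, 3.0  # gravity; thermal-center height difference, m (assumed)
      A = 0.01          # flow area, m^2 (assumed)
      K_LOSS = 20.0     # lumped loop loss coefficient (assumed)
      Q = 50e3          # decay heat removed, W (assumed)

      m_dot = 1.0       # initial mass flow guess, kg/s
      for _ in range(100):
          dT = Q / (m_dot * CP)                              # energy balance over heater
          dp_buoy = RHO * BETA * dT * G * H                  # buoyancy driving head, Pa
          m_new = A * (2.0 * RHO * dp_buoy / K_LOSS) ** 0.5  # friction balance
          if abs(m_new - m_dot) < 1e-8:
              break
          m_dot = 0.5 * (m_dot + m_new)                      # damped fixed-point update
      print(f"flow ~ {m_dot:.2f} kg/s, loop dT ~ {Q / (m_dot * CP):.1f} K")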

  1. High-Performance Monopropellants and Catalysts Evaluated

    NASA Technical Reports Server (NTRS)

    Reed, Brian D.

    2004-01-01

    The NASA Glenn Research Center is sponsoring efforts to develop advanced monopropellant technology. The focus has been on monopropellant formulations composed of an aqueous solution of hydroxylammonium nitrate (HAN) and a fuel component. HAN-based monopropellants do not have a toxic vapor and do not need the extraordinary procedures for storage, handling, and disposal required of hydrazine (N2H4). Generically, HAN-based monopropellants are denser and have lower freezing points than N2H4. The performance of HAN-based monopropellants depends on the selection of fuel, the HAN-to-fuel ratio, and the amount of water in the formulation. HAN-based monopropellants are not seen as a replacement for N2H4 per se, but rather as a propulsion option in their own right. For example, HAN-based monopropellants would prove beneficial to the orbit insertion of small, power-limited satellites because of this propellant's high performance (reduced system mass), high density (reduced system volume), and low freezing point (elimination of tank and line heaters). Under a Glenn-contracted effort, Aerojet Redmond Rocket Center conducted testing to provide the foundation for the development of monopropellant thrusters with an I(sub sp) goal of 250 sec. A modular, workhorse reactor (representative of a 1-lbf thruster) was used to evaluate HAN formulations with catalyst materials. Stoichiometric, oxygen-rich, and fuel-rich formulations of HAN-methanol and HAN-tris(aminoethyl)amine trinitrate were tested to investigate the effects of stoichiometry on combustion behavior. Aerojet found that fuel-rich formulations degrade the catalyst and reactor faster than oxygen-rich and stoichiometric formulations do. A HAN-methanol formulation with a theoretical Isp of 269 sec (designated HAN269MEO) was selected as the baseline. With a combustion efficiency of at least 93 percent demonstrated for HAN-based monopropellants, HAN269MEO will meet the I(sub sp) 250 sec goal.

  2. Performance and evaluation of real-time multicomputer control systems

    NASA Technical Reports Server (NTRS)

    Shin, K. G.

    1983-01-01

    New performance measures, detailed examples, modeling of error detection process, performance evaluation of rollback recovery methods, experiments on FTMP, and optimal size of an NMR cluster are discussed.

  3. Space Suit Performance: Methods for Changing the Quality of Quantitative Data

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew; Benson, Elizabeth; Rajulu, Sudhakar

    2014-01-01

    NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. To verify that new suits will enable astronauts to perform to their maximum capacity, prototype suits must be built and tested with human subjects. However, engineers and flight surgeons often have difficulty understanding and applying traditional representations of human data without training. To overcome these challenges, NASA is developing modern simulation and analysis techniques that focus on 3D visualization. Understanding actual performance early in the design cycle is extremely advantageous for increasing performance capabilities, reducing the risk of injury, and reducing costs. The primary objective of this project was to test modern simulation and analysis techniques for evaluating the performance of a human operating in extra-vehicular space suits.

  4. Application of Organosilane Monolayer Template to Quantitative Evaluation of Cancer Cell Adhesive Ability

    NASA Astrophysics Data System (ADS)

    Tanii, Takashi; Sasaki, Kosuke; Ichisawa, Kota; Demura, Takanori; Beppu, Yuichi; Vu, Hoan Anh; Thanh Chi, Hoan; Yamamoto, Hideaki; Sato, Yuko

    2011-06-01

    The adhesive ability of two human pancreatic cancer cell lines was evaluated using organosilane monolayer templates (OMTs). Using the OMT, the spreading area of adhered cells can be limited, and this enables us to focus on the initial attachment process of adhesion. Moreover, it becomes possible to arrange the cells in an array and to quantitatively evaluate the number of attached cells. The adhesive ability of the cancer cells cultured on the OMT was controlled by adding (-)-epigallocatechin-3-gallate (EGCG), which blocks a receptor that mediates cell adhesion and is overexpressed in cancer cells. Measurement of the relative ability of the cancer cells to attach to the OMT revealed that the ability for attachment decreased with increasing EGCG concentration. The results agreed well with the western blot analysis, indicating that the OMT can potentially be employed to evaluate the adhesive ability of various cancer cells.

  5. Automatic quantitative evaluation of autoradiographic band films by computerized image analysis

    SciTech Connect

    Masseroli, M.; Messori, A.; Bendotti, C.; Ponti, M.; Forloni, G. )

    1993-01-01

    The present paper describes a new image processing method for automatic quantitative analysis of autoradiographic band films. It was developed in a specific image analysis environment (IBAS 2.0), but the algorithms and methods can be utilized elsewhere. The program is easy to use and presents some particularly useful features for the evaluation of autoradiographic band films, such as the choice of whole-film or single-lane background determination, the ability to evaluate bands with film scratch artifacts, and quantification in absolute terms or relative to reference values. The method was tested by comparison with laser-scanner densitometric quantifications of the same autoradiograms. The results show the full compatibility of the two methods and demonstrate the reliability and sensitivity of image analysis. The method can be used not only to evaluate autoradiographic band films, but also to analyze any type of signal bands on other materials (e.g., electrophoresis gels, chromatographic paper, etc.).
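
    The underlying band quantification can be sketched as lane-profile integration with background subtraction; the random array below stands in for a digitized film, and the background estimate is deliberately crude:

      import numpy as np

      def lane_profile(film, lane_slice):
          # Integrated density profile along one lane (rows = migration axis).
          return film[:, lane_slice].astype(float).sum(axis=1)

      def band_quantity(profile, band_rows, bg_per_row):
          # Background-subtracted, integrated band signal.
          rows = slice(*band_rows)
          return float((profile[rows] - bg_per_row).sum())

      film = np.random.rand(400, 300)        # placeholder for a scanned film
      profile = lane_profile(film, slice(40, 70))
      bg = np.median(profile)                # whole-lane background per row
      print(band_quantity(profile, (120, 150), bg))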

  6. Quantitative determination of triterpenoid glycosides in Fatsia japonica Decne. & Planch. using high performance liquid chromatography.

    PubMed

    Ye, Xuewei; Yu, Siran; Lian, Xiao-Yuan; Zhang, Zhizhen

    2014-01-01

    Fatsia japonica Decne. & Planch. is a triterpenoid glycoside-rich herb with anti-inflammatory activity used for the treatment of rheumatoid arthritis. A method for the quantitative analysis of the complex triterpenoid glycosides in this medicinal plant had not been established so far. In this study, a high performance liquid chromatography (HPLC) method was developed for the simultaneous quantification of 11 glycosides in F. japonica. The analysis was performed on an ODS-2 Hypersil column (250 mm × 4.6 mm, 5 μm) with a binary gradient mobile phase of water and acetonitrile. The established HPLC method was validated in terms of linearity, sensitivity, stability, precision, accuracy, and recovery. Results showed that this method had good linearity, with R² of 0.99992-0.99999 over the test range of 0.04-9.00 μg/μL. The limits of detection (LOD) and limits of quantification (LOQ) for the standard compounds were 0.013-0.020 μg/μL and 0.040-0.060 μg/μL, respectively. The relative standard deviations (RSDs) of run variations were 0.83-1.40% for intra-day and 0.84-3.59% for inter-day runs. The analyzed compounds in the samples were stable for at least 36 h, and the spike recoveries of the detected glycosides were 99.67-103.11%. The developed HPLC method was successfully applied to measure the contents of the 11 triterpenoid glycosides in different parts of F. japonica. Taken together, the HPLC method newly developed in this study could be used for qualitative and quantitative analysis of the bioactive triterpenoid glycosides in F. japonica and its products. PMID:24176752
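
    As a rough illustration of the validation arithmetic reported above (linearity, LOD/LOQ, and intra-day precision), the sketch below reruns the standard formulas on hypothetical calibration data; the concentrations, peak areas, and replicate values are invented, not the study's data.

        # Sketch of HPLC method-validation arithmetic (hypothetical data).
        import numpy as np

        conc = np.array([0.04, 0.5, 1.0, 3.0, 6.0, 9.0])          # ug/uL standards
        area = np.array([1.9, 24.8, 49.6, 149.0, 297.5, 446.8])   # peak areas

        slope, intercept = np.polyfit(conc, area, 1)
        r2 = np.corrcoef(conc, area)[0, 1] ** 2                   # linearity check

        resid_sd = np.std(area - (slope * conc + intercept), ddof=2)
        lod = 3.3 * resid_sd / slope     # common ICH-style estimates
        loq = 10.0 * resid_sd / slope

        runs = np.array([2.98, 3.02, 3.05, 2.97, 3.01])           # intra-day replicates
        rsd = 100 * runs.std(ddof=1) / runs.mean()
        print(f"R2={r2:.5f}, LOD={lod:.3f} ug/uL, LOQ={loq:.3f} ug/uL, RSD={rsd:.2f}%")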

  7. LANDSAT-4 horizon scanner performance evaluation

    NASA Technical Reports Server (NTRS)

    Bilanow, S.; Chen, L. C.; Davis, W. M.; Stanley, J. P.

    1984-01-01

    Representative data spans covering a little more than a year since the LANDSAT-4 launch were analyzed to evaluate the flight performance of the satellite's horizon scanner. High-frequency noise was filtered out by 128-point averaging. The effects of Earth oblateness and spacecraft altitude variations are modeled, and residual systematic errors are analyzed. A model for the predicted radiance effects is compared with the flight data, and deficiencies in the radiance-effects modeling are noted. Correction coefficients are provided for a finite Fourier series representation of the systematic errors in the data. Analysis of the seasonal dependence of the coefficients indicates the effects of some early mission problems with the reference attitudes, which were computed by the onboard computer using star tracker and gyro data. The effects of sun and moon interference, unexplained anomalies in the data, and the sensor noise characteristics and their power spectrum are described. The variability of full-orbit data averages is shown. Plots of the sensor data for all the available data spans are included.
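
    The finite Fourier series representation mentioned above can be fitted by ordinary least squares; the sketch below shows the generic technique on synthetic horizon-scanner-style error data (the harmonic count, amplitudes, and noise level are assumptions, not mission values).

        # Least-squares fit of a finite Fourier series to a systematic error
        # signal sampled around the orbit; the data here are synthetic.
        import numpy as np

        theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)   # orbit angle
        err = 0.05 * np.sin(theta) + 0.02 * np.cos(2 * theta)    # stand-in signal
        err += 0.005 * np.random.default_rng(0).standard_normal(theta.size)

        n_harm = 3
        cols = [np.ones_like(theta)]
        for k in range(1, n_harm + 1):
            cols += [np.cos(k * theta), np.sin(k * theta)]
        A = np.column_stack(cols)
        coeffs, *_ = np.linalg.lstsq(A, err, rcond=None)         # Fourier coefficients
        model = A @ coeffs
        print("residual RMS:", np.sqrt(np.mean((err - model) ** 2)))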

  8. Human performance evaluation in dual-axis critical task tracking

    NASA Technical Reports Server (NTRS)

    Ritchie, M. L.; Nataraj, N. S.

    1975-01-01

    A dual-axis tracking experiment using a multiloop critical task was set up to evaluate human performance. The effects of control stick variation and display format were evaluated. A secondary loading task was used to measure the degradation in tracking performance.

  9. Clinical value of real-time elastography quantitative parameters in evaluating the stage of liver fibrosis and cirrhosis

    PubMed Central

    GE, LAN; SHI, BAOMIN; SONG, YE; LI, YUAN; WANG, SHUO; WANG, XIUYAN

    2015-01-01

    The aim of the present study was to assess the value of real-time elastography (RTE) quantitative parameters, namely the liver fibrosis (LF) index and the ratio of blue area (%AREA), in evaluating the stage of liver fibrosis. RTE quantitative analysis software was used to examine 120 patients with chronic hepatitis in order to obtain values for 12 quantitative parameters from the elastic images. The diagnostic performance of two such parameters, the LF index and %AREA, was assessed with a receiver operating characteristic (ROC) curve to determine the optimal diagnostic cut-off values for liver cirrhosis and fibrosis. A good correlation was observed between the LF index and %AREA and the fibrosis stage. The areas under the ROC curve for the LF index were 0.985 for the diagnosis of liver cirrhosis and 0.790 for liver fibrosis. With regard to %AREA, the areas under the ROC curve for the diagnosis of liver cirrhosis and fibrosis were 0.963 and 0.770, respectively. An LF index of >3.25 and a %AREA of >28.83 for the diagnosis of the cirrhosis stage resulted in sensitivity values of 100 and 100%, specificity values of 88.9 and 85.9%, and accuracy values of 90.8 and 88.3%, respectively. The LF index and %AREA exhibited higher reliability in the diagnosis of liver cirrhosis than in the diagnosis of the liver fibrosis stage, while the two parameters possessed similar efficacy to one another for both diagnoses. Therefore, the quantitative RTE parameters of the LF index and %AREA may be clinically applicable as reliable indices for the early diagnosis of liver cirrhosis, without the requirement for an invasive procedure. PMID:26622426
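
    Cut-off selection of the kind reported above (AUC, then sensitivity and specificity at a threshold) is commonly done by maximizing Youden's J along the ROC curve; the sketch below shows that generic procedure on synthetic index values. The distributions and resulting numbers are invented, not the study's data.

        # ROC analysis of a quantitative index against a binary diagnosis,
        # with a cut-off chosen by the Youden index (synthetic data).
        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(1)
        lf_index = np.concatenate([rng.normal(2.0, 0.8, 60),    # non-cirrhotic
                                   rng.normal(4.0, 0.8, 60)])   # cirrhotic
        truth = np.concatenate([np.zeros(60), np.ones(60)])

        fpr, tpr, thresholds = roc_curve(truth, lf_index)
        auc = roc_auc_score(truth, lf_index)
        best = np.argmax(tpr - fpr)                 # Youden's J statistic
        print(f"AUC={auc:.3f}, cut-off={thresholds[best]:.2f}, "
              f"sensitivity={tpr[best]:.2%}, specificity={1 - fpr[best]:.2%}")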

  10. A quasi-experimental quantitative study of the effect of IB on science performance

    NASA Astrophysics Data System (ADS)

    Healer, Margaret Irene

    The purpose of this quasi-experimental quantitative research study was to investigate the effect of participation in the International Baccalaureate (IB) program on science performance. The findings of the 2×3 mixed ANOVA and eta-squared analysis indicated a significant difference in science CSAP mean scores between the treatment group (IB students, n = 50) and the control group (non-IB students, n = 50) at the 5th through 10th grade levels. The analysis of the data concluded that although scores declined between the 5th, 8th, and 10th grades for both IB and non-IB students, a statistical difference was indicated at each level between the two groups in the area of science performance as measured by the CSAP assessment. Educational leaders can use the findings of this study to maximize student science achievement. Further research is recommended: a mixed study to determine the effectiveness of participation in the IB Program, and a longitudinal study of the specificity of pedagogical strategies used with science performance with a larger sample of IB and non-IB students.

  11. Quantitative local cerebral blood flow measurements with technetium-99m HM-PAO: evaluation using multiple radionuclide digital quantitative autoradiography

    SciTech Connect

    Lear, J.L.

    1988-08-01

    We investigated the d,l-[99mTc]hexamethylpropyleneamine oxime complex (HM-PAO) as a tracer for quantitative measurement of local cerebral blood flow (LCBF) in a series of awake male rats. LCBF measurements with HM-PAO were compared to those of two other tracers, [14C]iodoantipyrine (IAP) and [201Tl]diethyldithiocarbamate (DDC), using quantitative double- and triple-tracer digital autoradiography. LCBF values with HM-PAO averaged 64% of those of IAP and were generally linearly related. Detailed analysis suggested that the underestimation of LCBF by HM-PAO was related to blood constituent binding and/or rapid conversion to a noncerebrophilic compound, as well as noninstantaneous cerebral trapping, rather than to diffusion limitation.

  12. Quantitative evaluation of unrestrained human gait on change in walking velocity.

    PubMed

    Makino, Yuta; Tsujiuchi, Nobutaka; Ito, Akihito; Koizumi, Takayuki; Nakamura, Shota; Matsuda, Yasushi; Tsuchiya, Youtaro; Hayashi, Yuichiro

    2014-01-01

    In human gait motion analysis, a useful method for efficient physical rehabilitation that defines various quantitative evaluation indices, ground reaction forces, joint angles, and joint loads are measured during gait. To obtain these data from unrestrained gait measurements, a novel gait motion analysis system using mobile force plates and attitude sensors has been developed. A human maintains a high correlation among the motions of all joints during gait: analysis of the correlations in recorded joint motion extracts a few simultaneously activating segmental coordination patterns, and the structure of this intersegmental coordination is attracting attention for its expected relationship with a control strategy. However, when the evaluation method using singular value decomposition has been applied to the joint angles of the lower limb as representative kinematic parameters, the joint moments related to the rotational motion of the joints have not been considered. In this paper, joint moments, as kinetic parameters applied to the lower limb during gait, are analyzed for a normal subject and a trans-femoral amputee under changes in walking velocity using the wearable gait motion analysis system, and the effectiveness of quantitatively evaluating the rotational motion patterns of the lower-limb joints using joint moments is validated. PMID:25570503
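
    The coordination-pattern extraction referred to above rests on singular value decomposition of the recorded waveforms; the sketch below applies the same generic decomposition to synthetic joint-moment curves (the three-joint model, waveform shapes, and noise level are assumptions, not the paper's data).

        # Extracting dominant intersegmental coordination patterns from
        # joint-moment waveforms by singular value decomposition (synthetic).
        import numpy as np

        rng = np.random.default_rng(2)
        t = np.linspace(0, 1, 101)                  # one normalized gait cycle
        base = np.sin(2 * np.pi * t)
        # rows: hip, knee, ankle moments; columns: % of gait cycle
        moments = np.vstack([1.0 * base,
                             0.6 * np.roll(base, 10),
                             0.3 * np.roll(base, 25)])
        moments += 0.02 * rng.standard_normal(moments.shape)

        U, s, Vt = np.linalg.svd(moments - moments.mean(axis=1, keepdims=True),
                                 full_matrices=False)
        explained = s**2 / np.sum(s**2)
        print("variance captured by first pattern: %.1f%%" % (100 * explained[0]))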

  13. The quantitative evaluation of the correlation between the magnification and the visibility-contrast value

    NASA Astrophysics Data System (ADS)

    Okubo, Shohei; Shibata, Takayuki; Kodera, Yoshie

    2015-03-01

    The Talbot-Lau interferometer, which consists of a conventional x-ray tube, an x-ray detector, and three gratings arranged between them, is a new x-ray imaging system that uses a phase-contrast method for excellent visualization of soft tissue. It is therefore expected to be applied to soft-tissue imaging in the medical field, such as mammography. The visibility-contrast image, one of the reconstruction images produced by the Talbot-Lau interferometer, is known to reflect the reduction of coherence caused by x-ray small-angle scattering and x-ray refraction due to the object's structures. These two phenomena were not distinguished in our previous quantitative evaluations of the visibility signal; however, we consider that they should be distinguished for a proper quantitative evaluation. In this study, to evaluate how much magnification affects the visibility signal, we investigated the variability rate of the visibility signal as the object position was varied from 0 to 50 cm above the diffraction grating, examining the scattering signal and the refraction signal separately. We measured the edge signal of a glass sphere to examine the scattering signal, and the internal signal of a glass sphere and several kinds of sheet to examine the refraction signal. We found a difference in the variability rate between the edge signal and the internal signal, and we propose an estimation method that uses magnification.

  14. Comparison of Diagnostic Performance of Semi-Quantitative Knee Ultrasound and Knee Radiography with MRI: Oulu Knee Osteoarthritis Study.

    PubMed

    Podlipská, Jana; Guermazi, Ali; Lehenkari, Petri; Niinimäki, Jaakko; Roemer, Frank W; Arokoski, Jari P; Kaukinen, Päivi; Liukkonen, Esa; Lammentausta, Eveliina; Nieminen, Miika T; Tervonen, Osmo; Koski, Juhani M; Saarakkala, Simo

    2016-01-01

    Osteoarthritis (OA) is a common degenerative musculoskeletal disease highly prevalent in aging societies worldwide. Traditionally, knee OA is diagnosed using conventional radiography. However, structural changes of articular cartilage or menisci cannot be directly evaluated using this method. On the other hand, ultrasound is a promising tool able to provide direct information on soft tissue degeneration. The aim of our study was to systematically determine the site-specific diagnostic performance of semi-quantitative ultrasound grading of knee femoral articular cartilage, osteophytes and meniscal extrusion, and of radiographic assessment of joint space narrowing and osteophytes, using MRI as a reference standard. Eighty asymptomatic and 79 symptomatic subjects with mean age of 57.7 years were included in the study. Ultrasound performed best in the assessment of femoral medial and lateral osteophytes, and medial meniscal extrusion. In comparison to radiography, ultrasound performed better or at least equally well in identification of tibio-femoral osteophytes, medial meniscal extrusion and medial femoral cartilage morphological degeneration. Ultrasound provides relevant additional diagnostic information on tissue-specific morphological changes not depicted by conventional radiography. Consequently, the use of ultrasound as a complementary imaging tool along with radiography may enable more accurate and cost-effective diagnostics of knee osteoarthritis at the primary healthcare level. PMID:26926836

  15. MANUAL FOR THE EVALUATION OF LABORATORIES PERFORMING AQUATIC TOXICITY TESTS

    EPA Science Inventory

    This manual describes guidelines and standardized procedures for conducting on-site audits and evaluations of laboratories performing toxicity tests. Included are pre-survey information activities, on-site evaluation activities, evaluation criteria, organizational history and labo...

  16. Quantitative evaluation and visualization of cracking process in reinforced concrete by a moment tensor analysis of acoustic emission

    SciTech Connect

    Yuyama, Shigenori; Okamoto, Takahisa; Shigeishi, Mitsuhiro; Ohtsu, Masayasu

    1995-06-01

    Fracture tests are conducted on two types of reinforced concrete specimens under cyclic loading. The cracking process is quantitatively evaluated and visualized by applying a moment tensor analysis to the AE waveforms detected during fracture. First, bending tests are performed on reinforced concrete beams. It is found that both tensile and shear cracks are generated around the reinforcement in the low loading stages; however, shear cracks become dominant as the cracking process progresses. In the final stages, shear cracks are generated near the interface between the reinforcement and the concrete even during unloading. A bond strength test, performed second, shows that tensile cracks are produced around the reinforcement in the early stages and spread away from the reinforcement to wider areas in the later stages. An intense AE cluster due to shear cracks is observed along the interface between the reinforcement and the concrete. A previous result from an engineering structure is also presented for comparison. All these results demonstrate the great promise of the analysis for quantitative evaluation and visualization of the cracking process in reinforced concrete. The relationship between the opening width of surface cracks and the Kaiser effect is studied intensively. It is shown that a breakdown of the Kaiser effect and high AE activity during unloading can be effective indices for estimating the level of deterioration in concrete structures.

  17. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology is reviewed from the perspective of nondestructive evaluation approaches to material strength prediction and property verification. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  18. A quantitative method evaluating the selective adsorption of molecularly imprinted polymer.

    PubMed

    Zhang, Z B; Hu, J Y

    2012-01-01

    Adsorption isotherms of four estrogenic compounds (estrone, 17β-estradiol, 17α-ethinylestradiol, and bisphenol A) on a molecularly imprinted polymer (MIP) were studied. The isotherms can be described by the Langmuir model. Based on the adsorption isotherms and the template's mass balance, an experimental measure, the selective adsorption ratio (SAR), was proposed to assess how many of the template molecules extracted out of the MIP created selective binding sites. The SAR of the molecularly imprinted polymer was 74.3% for E2. This measure can be used to quantitatively evaluate selective adsorption. PMID:22423989
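
    For reference, the Langmuir model invoked above is q = qmax·K·C/(1 + K·C); a minimal fitting sketch follows, with invented concentration and loading data (the study's own isotherm values are not reproduced here).

        # Fitting the Langmuir isotherm to binding data with scipy;
        # concentrations and loadings are hypothetical.
        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(C, qmax, K):
            return qmax * K * C / (1.0 + K * C)

        C = np.array([0.5, 1, 2, 5, 10, 20, 50])                 # umol/L in solution
        q = np.array([4.1, 7.5, 12.6, 21.0, 27.2, 31.5, 34.8])   # umol/g bound

        (qmax, K), _ = curve_fit(langmuir, C, q, p0=(35, 0.1))
        print(f"qmax={qmax:.1f} umol/g, K={K:.3f} L/umol")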

  19. Quantitative and qualitative evaluation of PERCEPT indoor navigation system for visually impaired users.

    PubMed

    Ganz, Aura; Schafer, James; Puleo, Elaine; Wilson, Carole; Robertson, Meg

    2012-01-01

    In this paper we introduce a qualitative and quantitative evaluation of the PERCEPT system, an indoor navigation system for the blind and visually impaired. PERCEPT system trials with 24 blind and visually impaired users in a multi-story building show the system's effectiveness in providing appropriate navigation instructions to these users. The uniqueness of our system is that it is affordable and that its design follows Orientation and Mobility principles. These results encourage us to generalize the solution to large indoor spaces and test it with a significantly larger visually impaired population in diverse settings. We hope that PERCEPT will become a standard deployed in all indoor public spaces. PMID:23367251

  1. Interdisciplinary program for quantitative nondestructive evaluation. Semiannual report, 1 October 1981-31 March 1982

    SciTech Connect

    Not Available

    1982-01-01

    This report constitutes the semiannual report of the Air Force/Defense Advanced Research Project Agency research program in quantitative nondestructive evaluation covering the period October 1, 1981 to March 31, 1982. It is organized by projects, each of which contains the reports of individual investigations. Because the goals of the projects are largely such that strong interdisciplinary interactions are necessary in order to achieve them, the individual reports reflect a close cooperation between various investigators. Projects included in this year's effort are: application of ultrasonic QNDE to RFC window problems; electromagnetic detection and sizing; new technical opportunities; and new flaw detection techniques. Twenty-three project reports are presented.

  2. 48 CFR 2452.216-73 - Performance evaluation plan.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 Federal Acquisition Regulations System 6 (2010-10-01). Performance evaluation plan. As prescribed in 2416.406(e)(3), insert the following clause in all award fee contracts: Performance Evaluation Plan (AUG 1987) (a) The Government...

  3. 48 CFR 8.406-7 - Contractor Performance Evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 Federal Acquisition Regulations System 1 (2010-10-01). Contractor Performance Evaluation, Section 8.406-7. Ordering activities must prepare an evaluation of contractor performance for each...

  4. 24 CFR 570.491 - Performance and evaluation report.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    24 Housing and Urban Development 3 (2011-04-01). Community Development Block Grant Program, § 570.491 Performance and evaluation report. The annual performance and evaluation report shall be submitted in accordance with 24 CFR part 91. (Approved by the Office of Management...

  5. 24 CFR 570.491 - Performance and evaluation report.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    24 Housing and Urban Development 3 (2010-04-01). Community Development Block Grant Program, § 570.491 Performance and evaluation report. The annual performance and evaluation report shall be submitted in accordance with 24 CFR part 91. (Approved by the Office of Management...

  6. Evaluation of ViroCyt® Virus Counter for rapid filovirus quantitation.

    PubMed

    Rossi, Cynthia A; Kearney, Brian J; Olschner, Scott P; Williams, Priscilla L; Robinson, Camenzind G; Heinrich, Megan L; Zovanyi, Ashley M; Ingram, Michael F; Norwood, David A; Schoepp, Randal J

    2015-03-01

    Development and evaluation of medical countermeasures for diagnostics, vaccines, and therapeutics requires production of standardized, reproducible, and well characterized virus preparations. For filoviruses this includes the plaque assay for quantitation of infectious virus, transmission electron microscopy (TEM) for morphology and quantitation of virus particles, and real-time reverse transcription PCR (qRT-PCR) for quantitation of viral RNA. The ViroCyt® Virus Counter (VC) 2100 (ViroCyt, Boulder, CO, USA) is a flow-based instrument capable of quantifying virus particles in solution. Using a proprietary combination of fluorescent dyes that stain both nucleic acid and protein in a single 30 min step, rapid, reproducible, and cost-effective quantification of filovirus particles was demonstrated. Using a seed stock of Ebola virus variant Kikwit, the linear range of the instrument was determined to be 2.8E+06 to 1.0E+09 virus particles per mL, with coefficients of variation ranging from 9.4% to 31.5% for samples tested in triplicate. VC particle counts for various filovirus stocks were within one log of TEM particle counts. A linear relationship was established between the plaque assay, qRT-PCR, and the VC, and VC results significantly correlated with both plaque assay and qRT-PCR. These results demonstrate that the VC is an easy, fast, and consistent method for quantifying filoviruses in stock preparations. PMID:25710889

  7. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
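
    Stiffness in a test like the one above is the slope of the moment-displacement curve; the sketch below shows that computation for a generic 4-point bend. The span geometry, loads, and displacements are invented, and M = F·a/2 assumes the total actuator load F is split between two inner loading points offset by a from the outer supports.

        # Stiffness from 4-point bending: the moment is constant between the
        # inner loading points; fit the linear moment-displacement slope.
        import numpy as np

        a = 0.005                                   # m, outer-to-inner spacing (assumed)
        force = np.linspace(0, 10, 21)              # N, actuator load
        moment = force * a / 2.0                    # N*m on the motion segment
        disp = 0.4e-3 * moment + 1e-6 * np.random.default_rng(3).standard_normal(21)

        stiffness = np.polyfit(disp, moment, 1)[0]  # N*m per m of displacement
        print(f"flexion stiffness ~ {stiffness:.3g} N*m/m")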

  8. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior

    NASA Astrophysics Data System (ADS)

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-11-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then, in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. The construction of the DPDs method and results illustrating the approach are presented. The advantage of applying DPDs is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
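
    A minimal sketch of the dynamic-phase-difference idea follows: subtract consecutive phase maps and convert the change to dry mass in picograms. The wavelength, pixel size, and the commonly used specific refraction increment of ~0.18 μm³/pg are illustrative assumptions, not parameters from the paper.

        # Dynamic phase differences: per-pixel phase change between frames,
        # converted to dry-mass change (synthetic stand-in images).
        import numpy as np

        wavelength = 0.65          # um, illumination wavelength (assumed)
        alpha = 0.18               # um^3/pg, specific refraction increment (assumed)
        pixel_area = 0.25          # um^2 per pixel (assumed)

        rng = np.random.default_rng(4)
        phase_t0 = rng.uniform(0, 2.0, (64, 64))    # rad, stand-in phase image
        phase_t1 = phase_t0 + rng.normal(0, 0.05, (64, 64))

        dpd = phase_t1 - phase_t0                   # rad, per-pixel change
        dmass = dpd * wavelength / (2 * np.pi * alpha) * pixel_area   # pg/pixel
        print(f"net dry-mass change: {dmass.sum():.2f} pg")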

  9. Experimental Evaluation of Quantitative Diagnosis Technique for Hepatic Fibrosis Using Ultrasonic Phantom

    NASA Astrophysics Data System (ADS)

    Koriyama, Atsushi; Yasuhara, Wataru; Hachiya, Hiroyuki

    2012-07-01

    Since clinical diagnosis using ultrasonic B-mode images depends on the skill of the doctor, the realization of a quantitative diagnosis method using the ultrasound echo signal is highly desired. We have been investigating a quantitative diagnosis technique, mainly for hepatic disease. In this paper, we present basic experimental results evaluating the accuracy of the proposed quantitative diagnosis technique for hepatic fibrosis using a simple ultrasonic phantom. With a region of interest crossing the boundary between two scatterer areas of different densities in the phantom, we can simulate the change of the echo amplitude distribution from normal tissue to fibrotic tissue in liver disease. The probability density function is well approximated by our fibrosis distribution model, which is a mixture of normal and fibrotic tissue. The fibrosis parameters of the amplitude distribution model can be estimated relatively well at mixture rates from 0.2 to 0.6. In the inversion processing, the standard deviation of the estimated fibrosis results at mixture ratios below 0.2 and above 0.6 is relatively large. Although the probability density is not large at high amplitudes, the estimated variance ratio and mixture rate of the model are strongly affected by higher-amplitude data.

  10. 40 CFR 35.515 - Evaluation of performance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    40 Protection of Environment 1 (2010-07-01). Evaluation of performance, § 35.515. (a) Joint evaluation process. The applicant and the... Administrator's decision under the dispute processes in 40 CFR 31.70. (d) Evaluation reports. The Regional...

  11. [Quantitative evaluation of growth-promoting properties of selected culture media used for isolation of Salmonella and Shigella strains].

    PubMed

    Kałuzewski, S; Sienicka, J

    1990-01-01

    The growth-promoting properties and selectivity of 11 commercially produced media recommended for the isolation of Salmonella and Shigella were evaluated. The following media were tested: EMB (eosin methylene blue agar), Endo, Płoskiriew, MacConkey, DC (deoxycholate citrate agar), SS (Salmonella-Shigella agar), BS (bismuth sulfite agar), and Mueller-Hinton as a medium with no selective properties. The media were produced in Czechoslovakia, East Germany, West Germany, Poland, and the Soviet Union. Quantitative studies were performed on 71 strains representing 8 genera of the family Enterobacteriaceae; both reference strains and wild strains newly isolated from clinical material were included. It was found that none of the DC and BS media provided suitable growth conditions for Shigella strains, in particular S. dysenteriae, S. boydii, and S. flexneri. It was also found that the same medium (by name and composition) obtained from different producers can vary significantly with respect to growth promotion and selectivity, especially for Shigella strains. Media with selective, differentiating properties for Salmonella and Shigella isolation should not be used without a prior quantitative control test of their selective and growth-promoting properties performed by the user. The need for such control, performed on both reference and freshly isolated strains, was shown in this study; the set of control strains should represent all species of Shigella. PMID:2087133

  12. Rapid method for glutathione quantitation using high-performance liquid chromatography with coulometric electrochemical detection.

    PubMed

    Bayram, Banu; Rimbach, Gerald; Frank, Jan; Esatbeyoglu, Tuba

    2014-01-15

    A rapid, sensitive, and direct method (without derivatization) was developed for the detection of reduced glutathione (GSH) in cultured hepatocytes (HepG2 cells) using high-performance liquid chromatography with electrochemical detection (HPLC-ECD). The method was validated according to the guidelines of the U.S. Food and Drug Administration in terms of linearity, lower limit of quantitation (LOQ), lower limit of detection (LOD), precision, accuracy, recovery, and the stabilities of GSH standards and quality control samples. The total analysis time was 5 min, and the retention time of GSH was 1.78 min. Separation was carried out isocratically using 50 mM sodium phosphate (pH 3.0) as the mobile phase with a fused-core column. The detector response was linear between 0.01 and 80 μmol/L, and the regression coefficient (R²) was >0.99. The LOD for GSH was 15 fmol, and the intra- and inter-day recoveries ranged between 100.7 and 104.6%. This method also enabled the rapid detection (in 4 min) of other compounds involved in GSH metabolism such as uric acid, ascorbic acid, and glutathione disulfide. The optimized and validated HPLC-ECD method was successfully applied for the determination of GSH levels in HepG2 cells treated with buthionine sulfoximine (BSO), an inhibitor, and α-lipoic acid (α-LA), an inducer of GSH synthesis. As expected, the GSH content decreased concentration-dependently with BSO and increased with α-LA treatment in HepG2 cells. This method could also be useful for the quantitation of GSH, uric acid, ascorbic acid, and glutathione disulfide in other biological matrices such as tissue homogenates and blood. PMID:24328299
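
    Recovery figures like those above come from spiked-versus-unspiked comparisons; here is the generic arithmetic on invented numbers (the units and values are placeholders, not the study's measurements).

        # Spike-recovery and precision arithmetic used in method validation
        # (hypothetical data).
        import numpy as np

        unspiked = np.array([10.2, 10.4, 10.1])     # umol/L GSH measured
        spiked = np.array([20.5, 20.9, 20.3])       # after adding 10 umol/L
        spike_added = 10.0

        recovery = 100 * (spiked.mean() - unspiked.mean()) / spike_added
        rsd = 100 * spiked.std(ddof=1) / spiked.mean()
        print(f"recovery={recovery:.1f}%, RSD={rsd:.2f}%")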

  13. Quantitative susceptibility mapping of striatum in children and adults, and its association with working memory performance.

    PubMed

    Darki, Fahimeh; Nemmi, Federico; Möller, Annie; Sitnikov, Rouslan; Klingberg, Torkel

    2016-08-01

    Quantitative susceptibility mapping (QSM) is a magnetic resonance imaging (MRI) technique in which the magnetic susceptibility characteristic of molecular and cellular components, including iron and myelin, is quantified. Rapid iron accumulation in subcortical nuclei and myelination of white matter tracts are two important developmental processes that contribute to cognitive functions. Both also contribute to the magnetic susceptibility of brain tissues. Here, we used QSM as an indirect measure of iron in subcortical nuclei and of myelin in caudo-frontal white matter pathways. We included two groups of participants: 21 children aged 6-7 years and 25 adults aged 21-40 years. All subjects also performed tests estimating their visuo-spatial working memory capacity. Adults had higher magnetic susceptibility in all subcortical nuclei compared with children. The magnetic susceptibility of these nuclei correlated highly with their previously reported iron content. Moreover, working memory performance correlated significantly with the magnetic susceptibility of the caudate nucleus in both children and adults, while the correlation was not significant for gray matter density. QSM of white matter in the caudo-frontal tract also differed between children and adults but did not correlate with working memory scores. These results indicate that QSM is a feasible technique for measuring developmental aspects of changes in the striatum, possibly related to iron content, that are relevant to cognition. PMID:27132546
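
    When two age groups differ this much, susceptibility-behavior correlations are often checked after removing the age effect; the sketch below shows one generic way to do so (residualizing both variables on age) with simulated data. This is an illustration of the technique, not necessarily the analysis the authors performed.

        # Partial correlation of QSM with working-memory score, controlling
        # for age by residualizing both variables (synthetic data).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        age = np.concatenate([rng.uniform(6, 7, 21), rng.uniform(21, 40, 25)])
        qsm = 0.002 * age + rng.normal(0, 0.01, age.size)    # ppm, caudate QSM
        wm = 0.5 * age + 5 * qsm / qsm.std() + rng.normal(0, 2, age.size)

        def residualize(y, x):
            slope, intercept = np.polyfit(x, y, 1)
            return y - (slope * x + intercept)

        r, p = stats.pearsonr(residualize(qsm, age), residualize(wm, age))
        print(f"partial r={r:.2f}, p={p:.3g}")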

  14. Unsupervised Performance Evaluation of Image Segmentation

    NASA Astrophysics Data System (ADS)

    Chabrier, Sebastien; Emile, Bruno; Rosenberger, Christophe; Laurent, Helene

    2006-12-01

    We present in this paper a study of unsupervised evaluation criteria that enable the quantification of the quality of an image segmentation result. These evaluation criteria compute statistics for each region or class in a segmentation result. Such criteria can be useful for different applications: the comparison of segmentation results, the automatic choice of the best-fitted parameters of a segmentation method for a given image, or the definition of new segmentation methods by optimization. We first present the state of the art of unsupervised evaluation and then compare six unsupervised evaluation criteria. For this comparative study, we use a database composed of 8400 synthetic gray-level images segmented in four different ways. Vinet's measure (correct classification rate) is used as an objective criterion to compare the behavior of the different criteria. Finally, we present experimental results on the segmentation evaluation of a few gray-level natural images.
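
    As a concrete example of the kind of statistic such criteria compute, here is a minimal region-wise criterion (size-weighted intra-region gray-level variance, lower being better) on a synthetic two-class image; it illustrates the idea and is not one of the six criteria compared in the paper.

        # A simple unsupervised segmentation criterion: mean intra-region
        # variance over the labeled regions (illustrative only).
        import numpy as np

        def intra_region_variance(image, labels):
            total = 0.0
            for lab in np.unique(labels):
                region = image[labels == lab]
                total += region.size * region.var()
            return total / image.size

        rng = np.random.default_rng(6)
        image = rng.normal(0, 1, (32, 32))
        image[:, 16:] += 4.0                         # two-class synthetic image
        labels = (np.arange(32) >= 16)[None, :] * np.ones((32, 1), dtype=int)
        print(f"criterion: {intra_region_variance(image, labels):.3f}")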

  15. EVALUATION OF QUANTITATIVE REAL TIME PCR FOR THE MEASUREMENT OF HELICOBATER PYLORI AT LOW CONCENTRATIONS IN DRINKING WATER

    EPA Science Inventory

    Aims: To determine the performance of a rapid, real time polymerase chain reaction (PCR) method for the detection and quantitative analysis Helicobacter pylori at low concentrations in drinking water.

    Methods and Results: A rapid DNA extraction and quantitative PCR (QPCR)...

  16. Towards a Quantitative Performance Measurement Framework to Assess the Impact of Geographic Information Standards

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.

    2012-12-01

    Over the last decades, the use of Geographic Information (GI) has gained importance, in the public as well as the private sector. But even though many spatial data and related information exist, data sets are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain, and prepare them for use in applications. Therefore, Spatial Data Infrastructures (SDI) have been developed to enhance the access, use, and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the nineties, many SDI initiatives have seen the light of day. Ultimately, all these initiatives aim to enhance the flow of spatial data between organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes require technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. Therefore, the objective of the research is to develop a quantitative framework to assess the impact of GI standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research builds upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques, which are frequently used to measure the performance of production processes (Anupindi et al., 2005). Key to reach the research objectives

  17. Performance evaluation of ground based radar systems

    NASA Astrophysics Data System (ADS)

    Grant, Stanley E.

    1994-06-01

    Ground-based radar systems are a critical resource in the command, control, and communications system. This thesis provides the tools and methods to better understand the actual performance of an operational ground-based radar system. It defines two measurable performance standards: (1) the baseline performance, which is based on the sensor's internal characteristics, and (2) the theoretical performance, which considers not only the sensor's internal characteristics but also the effects of the surrounding terrain and atmosphere on the sensor's performance. The baseline radar system performance, often used by operators, contractors, and radar modeling software to determine expected system performance, is a simplistic and unrealistic means of predicting actual radar system performance. The theoretical radar system performance is more complex, but the results are much more indicative of the actual performance of an operational radar system. The AN/UPS-1 at the Naval Postgraduate School was used as the system under test to illustrate baseline and theoretical radar system performance. The terrain effects are shown by performing a multipath study and producing coverage diagrams. The key variables used to construct the multipath study and coverage diagrams are discussed in detail. The atmospheric effects are illustrated by using the Integrated Refractive Effects Prediction System (IREPS) and the Engineer's Refractive Effects Prediction System (EREPS) software tools to produce propagation condition summaries and coverage displays.

  18. A quantitative evaluation study of four-dimensional gated cardiac SPECT reconstruction.

    PubMed

    Jin, Mingwu; Yang, Yongyi; Niu, Xiaofeng; Marin, Thibault; Brankov, Jovan G; Feng, Bing; Pretorius, P Hendrik; King, Michael A; Wernick, Miles N

    2009-09-21

    In practice, gated cardiac SPECT images suffer from a number of degrading factors, including distance-dependent blur, attenuation, scatter and increased noise due to gating. Recently, we proposed a motion-compensated approach for four-dimensional (4D) reconstruction for gated cardiac SPECT and demonstrated that use of motion-compensated temporal smoothing could be effective for suppressing the increased noise due to lowered counts in individual gates. In this work, we further develop this motion-compensated 4D approach by also taking into account attenuation and scatter in the reconstruction process, which are two major degrading factors in SPECT data. In our experiments, we conducted a thorough quantitative evaluation of the proposed 4D method using Monte Carlo simulated SPECT imaging based on the 4D NURBS-based cardiac-torso (NCAT) phantom. In particular, we evaluated the accuracy of the reconstructed left ventricular myocardium using a number of quantitative measures including regional bias-variance analyses and wall intensity uniformity. The quantitative results demonstrate that use of motion-compensated 4D reconstruction can improve the accuracy of the reconstructed myocardium, which in turn can improve the detectability of perfusion defects. Moreover, our results reveal that while traditional spatial smoothing could be beneficial, its merit would become diminished with the use of motion-compensated temporal regularization. As a preliminary demonstration, we also tested our 4D approach on patient data. The reconstructed images from both simulated and patient data demonstrated that our 4D method can improve the definition of the LV wall. PMID:19724094

  19. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  20. Quantitation of aortic and mitral regurgitation in the pediatric population: evaluation by radionuclide angiocardiography

    SciTech Connect

    Hurwitz, R.A.; Treves, S.; Freed, M.; Girod, D.A.; Caldwell, R.L.

    1983-01-15

    The ability to quantitate aortic regurgitation (AR), mitral regurgitation (MR), or both by radionuclide angiocardiography was evaluated in children and young adults at rest and during isometric exercise. Regurgitation was estimated by determining the ratio of left ventricular stroke volume to right ventricular stroke volume obtained during equilibrium ventriculography. The radionuclide measurement was compared with the results of cineangiography, with good correlation between the two studies in 47 of 48 patients. The radionuclide stroke volume ratio was used to classify severity: the group with equivocal regurgitation differed from the group with mild regurgitation (p < 0.02); patients with mild regurgitation differed from those with moderate regurgitation (p < 0.001); and those with moderate regurgitation differed from those with severe regurgitation (p < 0.01). The stroke volume ratio was responsive to isometric exercise, remaining constant or increasing in 16 of 18 patients. After surgery to correct regurgitation, the stroke volume ratio decreased significantly from preoperative measurements in all 7 patients evaluated. Results from the present study demonstrate that a stroke volume ratio greater than 2.0 is compatible with moderately severe regurgitation and that a ratio greater than 3.0 suggests the presence of severe regurgitation. Thus, radionuclide angiocardiography should be useful for noninvasive quantitation of AR, MR, or both, helping define the course of young patients with left-sided valvular regurgitation.
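
    The stroke-volume-ratio computation and the severity thresholds quoted above translate directly into code; the background-corrected counts below are invented for illustration.

        # Stroke-volume-ratio quantitation from gated ventriculography counts,
        # graded with the thresholds stated in the abstract (counts invented).
        lv_end_diastolic, lv_end_systolic = 48000.0, 18000.0   # LV counts
        rv_end_diastolic, rv_end_systolic = 30000.0, 19000.0   # RV counts

        lv_stroke = lv_end_diastolic - lv_end_systolic
        rv_stroke = rv_end_diastolic - rv_end_systolic
        ratio = lv_stroke / rv_stroke

        if ratio > 3.0:
            grade = "severe regurgitation"
        elif ratio > 2.0:
            grade = "moderately severe regurgitation"
        else:
            grade = "mild/equivocal"
        print(f"stroke volume ratio = {ratio:.2f}: {grade}")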

  1. Panoramic imaging is not suitable for quantitative evaluation, classification, and follow up in unilateral condylar hyperplasia.

    PubMed

    Nolte, J W; Karssemakers, L H E; Grootendorst, D C; Tuinzing, D B; Becking, A G

    2015-05-01

    Patients with suspected unilateral condylar hyperplasia are often screened radiologically with a panoramic radiograph, but this is not sufficient for routine diagnosis and follow up. We have therefore made a quantitative analysis and evaluation of panoramic radiographs in a large group of patients with the condition. During the period 1994-2011, 132 patients with 113 panoramic radiographs were analysed using a validated method. There was good reproducibility between observers, but the condylar neck and head were the regions reported with least reliability. Although in most patients asymmetry of the condylar head, neck, and ramus was confirmed, the kappa coefficient as an indicator of agreement between two observers was poor (-0.040 to 0.504). Hardly any difference between sides was measured at the gonion angle, and the body appeared to be higher on the affected side in 80% of patients. Panoramic radiographs might be suitable for screening, but are not suitable for the quantitative evaluation, classification, and follow up of patients with unilateral condylar hyperplasia. PMID:25798757

  2. Quantitative evaluation on internal seeing induced by heat-stop of solar telescope.

    PubMed

    Liu, Yangyi; Gu, Naiting; Rao, Changhui

    2015-07-27

    The heat-stop is one of the essential thermal-control devices of a solar telescope. The internal seeing induced by its temperature rise will degrade the imaging quality significantly. For quantitative evaluation of internal seeing, an integrated analysis method based on computational fluid dynamics and geometric optics is proposed in this paper. First, the temperature field of the heat-affected zone induced by the heat-stop temperature rise is obtained by computational fluid dynamics calculation. Second, the temperature field is transformed into a refractive index field by the corresponding equations. Third, the wavefront aberration induced by internal seeing is calculated by geometric optics, based on optical integration through the refractive index field. This integrated method is applied to the heat-stop of the Chinese Large Solar Telescope to quantitatively evaluate its internal seeing. The analytical results show that the maximum acceptable temperature rise of the heat-stop is 5 K above the ambient air at any telescope pointing direction, under the condition that the root-mean-square wavefront aberration induced by internal seeing is less than 25 nm. Furthermore, it is found that the magnitude of the wavefront aberration gradually increases with increasing heat-stop temperature rise for a given telescope pointing direction. Meanwhile, as the telescope pointing varies from the horizontal to the vertical direction, the magnitude of the wavefront aberration first decreases and then increases for the same heat-stop temperature rise. PMID:26367657
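
    One generic way to go from a temperature field to a seeing-induced wavefront, loosely following the chain above (temperature → refractive index → optical path integration), is sketched below. The Gladstone-Dale-style index scaling, the plume-shaped temperature field, and the geometry are all stand-ins, not the paper's CFD result.

        # Optical path difference accumulated by rays crossing a heated air
        # volume; n - 1 scales with air density, i.e. inversely with T.
        import numpy as np

        n0, T0 = 1.000277, 293.15                   # reference index and K
        z = np.linspace(0, 1.0, 200)                # m, along each ray
        x = np.linspace(-0.5, 0.5, 64)              # m, across the pupil

        # stand-in temperature field: a warm plume above the heat-stop
        T = T0 + 5.0 * np.exp(-(x[:, None] ** 2) / 0.02) * np.exp(-z[None, :] / 0.3)
        n = 1 + (n0 - 1) * T0 / T                   # Gladstone-Dale-style scaling

        opd = np.trapz(n - n0, z, axis=1)           # m, per pupil position
        opd -= opd.mean()                           # remove piston term
        print(f"RMS wavefront error: {np.sqrt(np.mean(opd**2)) * 1e9:.1f} nm")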

  3. A systematic review of diagnostic performance of quantitative tests to assess musculoskeletal disorders in hand-arm vibration syndrome

    PubMed Central

    MAHBUB, MH; KUROZAWA, Youichi; ISHITAKE, Tatsuya; KUME, Yukinori; MIYASHITA, Kazuhisa; SAKAKIBARA, Hisataka; SATO, Shuji; TOIBANA, Norikuni; HARADA, Noriaki

    2015-01-01

    The purpose was to systematically review published reports on the clinical utility of quantitative objective tests commonly used for diagnosing musculoskeletal disorders in hand-arm vibration syndrome (HAVS). Two reviewers independently conducted a computerized literature search in PubMed and Scopus using predefined criteria, and relevant papers were identified. The articles were screened in several stages and considered for final inclusion. The quality of the selected papers was evaluated with a modified QUADAS tool, and relevant data were extracted as necessary. Only 4 relevant studies could be identified for detailed examination. Grip strength, pinch strength, and Purdue pegboard tests were commonly used, with reported sensitivities and specificities ranging between 1.7 and 65.7% and 65.2 and 100%, 1.7 and 40% and 94 and 100%, and 44.8 and 85% and 78 and 95%, respectively. Considerable differences across the studies were observed with respect to patient and control populations, diagnostic performance, and cut-off values of the different tests. Overall, the limited English-language literature currently available does not provide enough evidence in favour of applying grip strength and pinch strength tests for diagnosing musculoskeletal injuries in HAVS; the Purdue pegboard test seems to have some diagnostic value in evaluating impaired dexterity in HAVS. PMID:26051288

  4. A novel integrated approach to quantitatively evaluate the efficiency of extracellular polymeric substances (EPS) extraction process.

    PubMed

    Sun, Min; Li, Wen-Wei; Yu, Han-Qing; Harada, Hideki

    2012-12-01

    A novel integrated approach is developed to quantitatively evaluate the efficiency of extracellular polymeric substances (EPS) extraction after taking into account EPS yield, EPS damage, and cell lysis. This approach incorporates grey relational analysis and fuzzy logic analysis, in which the evaluation procedure is established on the basis of grey relational coefficient generation, membership function construction, and fuzzy rule description. The flocculation activity and DNA content of the EPS are chosen as the two evaluation responses. To verify the feasibility and effectiveness of this integrated approach, EPS from Bacillus megaterium TF10 were extracted using five different extraction methods, and their extraction efficiencies were evaluated as a real case study. Based on the evaluation results, the maximal extraction grades and corresponding optimal extraction times of the five extraction methods are ordered as EDTA, 10 h > formaldehyde + NaOH, 60 min > heating, 120 min > ultrasonication, 30 min > H₂SO₄, 30 min > control. The approach proposed here offers an effective tool for selecting appropriate EPS extraction methods and determining optimal extraction conditions. PMID:23064456
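
    The grey relational coefficients mentioned above follow the standard form ξ(k) = (Δmin + ζΔmax)/(Δ(k) + ζΔmax); a minimal sketch follows with a made-up normalized response series (the resolution coefficient ζ = 0.5 is the conventional default, not a value from the paper).

        # Grey relational coefficients of one response series against an
        # ideal reference, as used in grey relational analysis (illustrative).
        import numpy as np

        def grey_relational_coeff(series, reference, zeta=0.5):
            delta = np.abs(series - reference)
            return (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

        # normalized flocculation activity of EPS from five extraction methods
        response = np.array([0.95, 0.80, 0.70, 0.60, 0.40])
        reference = 1.0                              # ideal (no EPS damage)
        print(grey_relational_coeff(response, reference))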

  5. Development and evaluation of an improved quantitative 90Y bremsstrahlung SPECT method

    PubMed Central

    Rong, Xing; Du, Yong; Ljungberg, Michael; Rault, Erwann; Vandenberghe, Stefaan; Frey, Eric C.

    2012-01-01

    Purpose: Yttrium-90 (90Y) is one of the most commonly used radionuclides in targeted radionuclide therapy (TRT). Since it decays with essentially no gamma photon emissions, surrogate radionuclides (e.g., 111In) or imaging agents (e.g., 99mTc MAA) are typically used for treatment planning. It would, however, be useful to image 90Y directly in order to confirm that the distributions measured with these other radionuclides or agents are the same as for the 90Y labeled agents. As a result, there has been a great deal of interest in quantitative imaging of 90Y bremsstrahlung photons using single photon emission computed tomography (SPECT) imaging. The continuous and broad energy distribution of bremsstrahlung photons, however, imposes substantial challenges on accurate quantification of the activity distribution. The aim of this work was to develop and evaluate an improved quantitative 90Y bremsstrahlung SPECT reconstruction method appropriate for these imaging applications. Methods: Accurate modeling of image degrading factors such as object attenuation and scatter and the collimator-detector response is essential to obtain quantitatively accurate images. All of the image degrading factors are energy dependent. Thus, the authors separated the modeling of the bremsstrahlung photons into multiple categories and energy ranges. To improve the accuracy, the authors used a bremsstrahlung energy spectrum previously estimated from experimental measurements and incorporated a model of the distance between 90Y decay location and bremsstrahlung emission location into the SIMIND code used to generate the response functions and kernels used in the model. This improved Monte Carlo bremsstrahlung simulation was validated by comparison to experimentally measured projection data of a 90Y line source. The authors validated the accuracy of the forward projection model for photons in the various categories and energy ranges using the validated Monte Carlo (MC) simulation method. The

  6. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    SciTech Connect

    Gauld, Ian C.; Hu, Jianwei; De Baere, P.; Vaccaro, S.; Schwalbach, P.; Liljenfeldt, Henrik; Tobin, Stephen

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel

  7. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  8. Quantitative microscopic evaluation of mucin areas and its percentage in mucinous carcinoma of the breast using tissue histological images.

    PubMed

    Saha, Monjoy; Arun, Indu; Basak, Bijan; Agarwal, Sanjit; Ahmed, Rosina; Chatterjee, Sanjoy; Bhargava, Rohit; Chakraborty, Chandan

    2016-06-01

    Mucinous carcinoma (MC) of the breast is a very rare (∼1-7% of all breast cancers) type of invasive ductal carcinoma. The presence of pools of extracellular mucin is one of the most important histological features of MC. This paper aims at developing a quantitative computer-aided methodology for automated identification of mucin areas and their percentage using tissue histological images. The proposed method includes pre-processing (i.e., colour space transformation and colour normalization), mucin region segmentation, post-processing, and performance evaluation. The proposed algorithm achieved 97.74% segmentation accuracy in comparison to ground truths. In addition, the percentage of mucin present in the tissue regions is calculated by the mucin index (MI) for grading MC (pure, moderately, or minimally mucinous). PMID:26971129
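
    A minimal sketch of how a mucin index and a pixel-wise segmentation accuracy can be computed from binary masks. The record does not give the paper's exact formulas, so the definitions below (mucin area as a percentage of tissue area, plain pixel agreement against ground truth) are assumptions, and the masks are synthetic.

      import numpy as np

      def mucin_index(mucin_mask, tissue_mask):
          """Mucin area as a percentage of total tissue area (assumed definition)."""
          tissue_px = np.count_nonzero(tissue_mask)
          return 100.0 * np.count_nonzero(mucin_mask & tissue_mask) / max(tissue_px, 1)

      def segmentation_accuracy(pred_mask, truth_mask):
          """Fraction of pixels on which prediction and ground truth agree."""
          return float(np.mean(pred_mask == truth_mask))

      rng = np.random.default_rng(0)
      tissue = np.ones((100, 100), dtype=bool)        # stand-in tissue region
      truth = rng.random((100, 100)) > 0.7            # stand-in mucin ground truth
      pred = truth ^ (rng.random((100, 100)) > 0.98)  # prediction with ~2% pixel errors
      print(f"MI = {mucin_index(pred, tissue):.1f}%, "
            f"accuracy = {100 * segmentation_accuracy(pred, truth):.2f}%")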

  9. An evaluation of ARM radiosonde operational performance

    SciTech Connect

    Lesht, B.M.

    1995-06-01

    Because the ARM (Atmospheric Radiation Measurement) program uses data from radiosondes for real-time quality control and sensitive modeling applications, it is important to have a quantitative measure of the quality of the radiosonde data themselves. Two methods have been tried for estimating the quality of radiosonde data: comparisons with known standards before launch and examination of pseudo-replicate samples by single sensors aloft. The ground check procedure showed that the ARM radiosondes are within the manufacturer's specifications for measuring relative humidity; procedural artifacts prevented verification for temperature. Pseudo-replicates from ascent and descent suggest that the temperature measurement is within the specified ±0.2 °C. On average, ascent and descent data are similar, but detailed structure may be obscured on descent by loss of sampling density, and the descent involves other uncertainties.

  10. FLUORESCENT TRACER EVALUATION OF PROTECTIVE CLOTHING PERFORMANCE

    EPA Science Inventory

    Field studies evaluating chemical protective clothing (CPC), which is often employed as a primary control option to reduce occupational exposures during pesticide applications, are limited. This study, supported by the U.S. Environmental Protection Agency (EPA), was designed to...

  11. Quantitative evaluation of regularized phase retrieval algorithms on bone scaffolds seeded with bone cells

    NASA Astrophysics Data System (ADS)

    Weber, L.; Langer, M.; Tavella, S.; Ruggiu, A.; Peyrin, F.

    2016-05-01

    In the field of regenerative medicine, there has been a growing interest in studying the combination of bone scaffolds and cells that can maximize newly formed bone. In-line phase-contrast x-ray tomography was used to image porous bone scaffolds (Skelite©), seeded with bone forming cells. This technique allows the quantification of both mineralized and soft tissue, unlike with classical x-ray micro-computed tomography. Phase contrast images were acquired at four distances. The reconstruction is typically performed in two successive steps: phase retrieval and tomographic reconstruction. In this work, different regularization methods were applied to the phase retrieval process. The application of a priori terms for heterogeneous objects enables quantitative 3D imaging of not only bone morphology, mineralization, and soft tissue formation, but also cells trapped in the pre-bone matrix. A statistical study was performed to derive statistically significant information on the different culture conditions.

  12. Quantitative evaluation of regularized phase retrieval algorithms on bone scaffolds seeded with bone cells.

    PubMed

    Weber, L; Langer, M; Tavella, S; Ruggiu, A; Peyrin, F

    2016-05-01

    In the field of regenerative medicine, there has been a growing interest in studying the combination of bone scaffolds and cells that can maximize newly formed bone. In-line phase-contrast x-ray tomography was used to image porous bone scaffolds (Skelite©), seeded with bone forming cells. This technique allows the quantification of both mineralized and soft tissue, unlike with classical x-ray micro-computed tomography. Phase contrast images were acquired at four distances. The reconstruction is typically performed in two successive steps: phase retrieval and tomographic reconstruction. In this work, different regularization methods were applied to the phase retrieval process. The application of a priori terms for heterogeneous objects enables quantitative 3D imaging of not only bone morphology, mineralization, and soft tissue formation, but also cells trapped in the pre-bone matrix. A statistical study was performed to derive statistically significant information on the different culture conditions. PMID:27054380

  13. Evaluating Performances of Solar-Energy Systems

    NASA Technical Reports Server (NTRS)

    Jaffe, L. D.

    1987-01-01

    CONC11 computer program calculates performances of dish-type solar thermal collectors and power systems. A solar thermal power system consists of one or more collectors, power-conversion subsystems, and power-processing subsystems. CONC11 is intended to aid the system designer in comparing the performance of various design alternatives. Written in Athena FORTRAN and Assembler.

  14. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process-performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  15. Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging

    PubMed Central

    Oishi, Kenichi; Faria, Andreia V.; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2013-01-01

    The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development in the pediatric population. To interpret the values from these MR modalities, a “growth percentile chart,” which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured anatomical structures. To avoid inter- and intra-reader variability in the anatomical boundary definition, and hence to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, is introduced.

  16. Investigation of the performance characteristics of a plasma synthetic jet actuator based on a quantitative Schlieren method

    NASA Astrophysics Data System (ADS)

    Zong, Hao-hua; Wu, Yun; Song, Hui-min; Jia, Min; Liang, Hua; Li, Ying-hong; Zhang, Zhi-bo

    2016-05-01

    A quantitative Schlieren method is developed to calculate the density field of axisymmetric flows. With this method, the flow field structures of plasma synthetic jets are analysed in detail. Major performance parameters, including the maximum density increase behind the shock wave, the expelled mass per pulse and the impulse, are obtained to evaluate the intensity of the shock wave and the jet. A high-density but low-velocity jet issues out of the cavity after the precursor shock wave, with a vortex ring at the wave front. The vortex ring gradually lags behind the center jet during the propagation, and its profile resembles a pair of kidneys in shape. After the jet terminates, the vortex ring breaks down and the whole density field is separated into two regions. In one period, the jet front velocity first increases and then decreases, with a maximum value of 270 m/s. The precursor shock wave velocity decays quickly from 370 m/s to 340 m/s in the first 50 μs. The variation in the maximum density rise behind the precursor shock wave is similar to that of the jet front velocity. The averaged exit density drops sharply at around 50 μs and then gradually rises. The maximum mass flow rate is about 0.35 g/s, and the total expelled mass in one period occupies 26% of the initial cavity gas mass. The impulse produced in the jet stage is estimated to be 5 μN·s. The quantitative Schlieren method developed can also be used in research on other compressible axisymmetric flows.

  17. Efficacy of quantitative hepatobiliary scintigraphy and fatty-meal sonography for evaluating patients with suspected partial common duct obstruction.

    PubMed

    Darweesh, R M; Dodds, W J; Hogan, W J; Geenen, J E; Collier, B D; Shaker, R; Kishk, S M; Stewart, E T; Lawson, T L; Hassanein, E H

    1988-03-01

    In this study we evaluated by blinded design the diagnostic efficacy of two noninvasive techniques, quantitative hepatobiliary scintigraphy (QHS) and fatty-meal sonography (FMS), for evaluating patients with suspected partial common duct obstruction. Quantitative hepatobiliary scintigraphy was performed on 56 cholecystectomized individuals (22 asymptomatic controls, 28 patients with suspected partial common duct obstruction, and 6 nonjaundiced cirrhotics) and FMS was done in 51 cases. For QHS, time-activity curves were generated for regions of interest over the liver, hepatic hilum, and common duct. For FMS, we measured common duct diameter before and 45 min after a fatty meal (Lipomul, 1.5 ml/kg). Each of the 28 patients with suspected partial common duct obstruction and 6 cirrhotic patients underwent endoscopic retrograde cholangiography, often accompanied by sphincter of Oddi manometry. Findings from these examinations were taken as the gold standard to determine the presence or absence of conditions that could account for intermittent symptomatic partial common duct obstruction. The most sensitive indicators for a positive test were a 45-min isotope clearance of less than 63% for QHS and a common duct increase of greater than or equal to 2 mm after the fatty meal for FMS. Of 28 patients with suspected partial common duct obstruction, 15 were judged to be true-positive and 13 true-negative. The 6 cirrhotic patients were without common duct obstruction. The study findings showed that each test had a 67% sensitivity that improved to 80% when the findings from both test results were combined. The specificity of QHS was 85% and that of FMS was 100%. All 6 cirrhotic patients had negative findings on FMS and 4 were false-positive on QHS. The true-positives included 8 patients with a small common duct stone and 6 with obstructive sphincter of Oddi dysfunction (4 stenosis, 2 dyskinesia). We conclude that noninvasive QHS and FMS afford good sensitivity and specificity

  18. Control and quantitation of voluntary weight-lifting performance of rats.

    PubMed

    Wirth, Oliver; Gregory, Erik W; Cutlip, Robert G; Miller, G Roger

    2003-07-01

    The present paper describes an exercise model that produces a voluntary hindlimb weight-lifting response. Each rat was operantly conditioned to enter a vertical tube, insert its head into a weighted ring (either 70 g or 700 g), lift the ring until its nose interrupted an infrared detector, and then lower the ring. Load cells measured the external force generated, and displacement transducers measured the vertical displacement of the ring during each lifting and lowering movement. The apparatus and training procedures were computer automated. Peak force, velocity, work, and power were calculated for each movement. Rats in both groups easily acquired the task after 12-15 training sessions, on average, conducted 5 days/wk. Once rats were trained, the lifting patterns were quite stable during several more weeks of posttraining exercise; however, the lighter 70-g load gave rise to more variable performances across rats. Results demonstrate the utility of quantitating the biomechanics of volitional movements and suggest that the present model can establish and maintain controlled repetitive movements necessary for studies of adaptation and/or injury in muscles, tendon, and bone. PMID:12665538

  19. Quantitative analysis of tivantinib in rat plasma using ultra performance liquid chromatography with tandem mass spectrometry.

    PubMed

    Bai, Yan-Li; Yuan, Hong-Chang; Zhang, Dong-Tao; Liu, Yuan; Zhang, Yin

    2016-07-15

    In this work, a simple, sensitive and fast ultra performance liquid chromatography with tandem mass spectrometry (UPLC-MS/MS) method was developed and validated for the quantitative determination of tivantinib in rat plasma. Plasma samples were processed with a protein precipitation. The separation was achieved on an Acquity UPLC BEH C18 (2.1 mm × 50 mm, 1.7 μm) column with a gradient mobile phase consisting of 0.1% formic acid in water and acetonitrile. Detection was carried out using positive-ion electrospray tandem mass spectrometry via multiple reaction monitoring (MRM). The validated method had excellent linearity in the range of 1.0-100 ng/mL (r² > 0.9967) with a lower limit of quantification of 1.0 ng/mL. The extraction recovery was in the range of 79.4-84.2% for tivantinib and 80.3% for carbamazepine (internal standard, IS). The intra- and inter-day precision was below 8.9% and accuracy was from -7.2% to 9.5%. No notable matrix effect or instability was observed for tivantinib. The method has been successfully applied to a pharmacokinetic study of tivantinib in rats for the first time, which provides the basis for the further development and application of tivantinib. PMID:27179187

  20. 48 CFR 2936.201 - Evaluation of contractor performance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 Federal Acquisition Regulations System, Chapter 7 (2010-10-01), DEPARTMENT OF LABOR... Construction, Section 2936.201, Evaluation of contractor performance: The HCA must establish procedures to evaluate...

  1. Team Primacy Concept (TPC) Based Employee Evaluation and Job Performance

    ERIC Educational Resources Information Center

    Muniute, Eivina I.; Alfred, Mary V.

    2007-01-01

    This qualitative study explored how employees learn from Team Primacy Concept (TPC) based employee evaluation and how they use the feedback in performing their jobs. TPC based evaluation is a form of multirater evaluation, during which the employee's performance is discussed by one's peers in a face-to-face team setting. The study used Kolb's…

  2. Guidelines for Performance Based Evaluation: Teachers, Counselors, Librarians. [New Edition.

    ERIC Educational Resources Information Center

    Missouri State Dept. of Elementary and Secondary Education, Jefferson City.

    Guidelines for the performance-based evaluation of teachers, counselors, and librarians in the Missouri public schools are provided in this manual. Performance-based evaluation of school staff, mandated by state law, is described in terms of its philosophy and procedures, suggested evaluation criteria, and descriptors for each of the three job…

  3. A quantitative metrology for performance characterization of breast tomosynthesis systems based on an anthropomorphic phantom

    NASA Astrophysics Data System (ADS)

    Ikejimba, Lynda; Chen, Yicheng; Oberhofer, Nadia; Kiarashi, Nooshin; Lo, Joseph Y.; Samei, Ehsan

    2015-03-01

    Purpose: Common methods for assessing the image quality of digital breast tomosynthesis (DBT) devices currently utilize simplified or otherwise unrealistic phantoms, which use inserts in a uniform background and gauge performance based on a subjective evaluation of insert visibility. This study proposes a different methodology to assess system performance using a three-dimensional, clinically informed anthropomorphic breast phantom. Methods: The system performance is assessed by imaging the phantom and computationally characterizing the resultant images in terms of several new metrics. These include a contrast index (reflective of the local difference between adipose and glandular material), a contrast-to-noise ratio index (reflective of contrast against local background noise), and a nonuniformity index (reflective of contributions of noise and artifacts within uniform adipose regions). Indices were measured at ROI sizes of 10 mm and 37 mm. The method was evaluated at a fixed dose of 1.5 mGy AGD. Results: Results indicated notable differences between systems. At 10 mm, vendor A had the highest contrast index, followed by B and C, in that order. The performance ranking was identical at the largest ROI size. The nonuniformity index similarly exhibited system dependencies correlated with the visual appearance of clutter from out-of-plane artifacts. Vendor A had the greatest nonuniformity index at all ROI sizes, B had the second greatest, and C the least. Conclusions: The findings illustrate that the anthropomorphic phantom can be used as a quality control tool with results that are targeted to be more reflective of the clinical performance of breast tomosynthesis systems from multiple manufacturers.
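
    A sketch of plausible implementations of the three indices, computed from paired regions of interest. The record does not spell out the paper's formulas, so the definitions below (mean difference, mean difference over background noise, relative fluctuation in a uniform region) are assumptions, and the ROI statistics are simulated.

      import numpy as np

      rng = np.random.default_rng(0)
      gland   = rng.normal(120.0, 8.0, size=(64, 64))  # hypothetical glandular ROI
      adipose = rng.normal(100.0, 8.0, size=(64, 64))  # neighbouring adipose ROI

      contrast_index = gland.mean() - adipose.mean()        # local material contrast
      cnr_index = contrast_index / adipose.std()            # contrast vs. local noise
      nonuniformity_index = adipose.std() / adipose.mean()  # noise + artifact content

      print(f"CI = {contrast_index:.1f}, CNR = {cnr_index:.2f}, "
            f"NI = {nonuniformity_index:.3f}")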

  4. A model for evaluating the social performance of construction waste management

    SciTech Connect

    Yuan Hongping

    2012-06-15

    Highlights: • Scant attention is paid to the social performance of construction waste management (CWM). • We develop a model for assessing the social performance of CWM. • With the model, the social performance of CWM can be quantitatively simulated. - Abstract: Existing literature shows that considerable research effort has been devoted to the economic performance of construction waste management (CWM), but less attention has been paid to the social performance of CWM. This study therefore attempts to develop a model for quantitatively evaluating the social performance of CWM by using a system dynamics (SD) approach. Firstly, major variables affecting the social performance of CWM are identified, and a holistic system for assessing the social performance of CWM is formulated in line with the feedback relationships underlying these variables. The developed system is then converted into an SD model through the software iThink. An empirical case study is finally conducted to demonstrate application of the model. Results of model validation indicate that the model is robust and reasonable in reflecting the situation of the real system under study. Findings of the case study offer helpful insights into effectively promoting the social performance of CWM for the project investigated. Furthermore, the model exhibits great potential to function as an experimental platform for dynamically evaluating the effects of management measures on improving the social performance of CWM in construction projects.

  5. Quantitative analysis of real-time tissue elastography for evaluation of liver fibrosis

    PubMed Central

    Shi, Ying; Wang, Xing-Hua; Zhang, Huan-Hu; Zhang, Hai-Qing; Tu, Ji-Zheng; Wei, Kun; Li, Juan; Liu, Xiao-Li

    2014-01-01

    The present study aimed to investigate the feasibility of quantitative analysis of liver fibrosis using real-time tissue elastography (RTE) and its pathological and molecular biological basis. Methods: Fifty-four New Zealand rabbits were subcutaneously injected with thioacetamide (TAA) to induce liver fibrosis as the model group, and another eight New Zealand rabbits served as the normal control group. Four rabbits were randomly taken every two weeks for real-time tissue elastography (RTE) and quantitative analysis of tissue diffusion. The twelve characteristic quantities obtained included relative mean value (MEAN), standard deviation (SD), blue area % (% AREA), complexity (COMP), kurtosis (KURT), skewness (SKEW), contrast (CONT), entropy (ENT), inverse difference moment (IDM), angular second moment (ASM), correlation (CORR) and liver fibrosis index (LF Index). Rabbits were sacrificed and liver tissues were taken for pathological staging of liver fibrosis (grouped by pathological stage into S0, S1, S2, S3 and S4 groups). In addition, the collagen I (Col I) and collagen III (Col III) expression levels in liver tissue were detected by Western blot. Results: Except for KURT, there were significant differences among the other eleven characteristic quantities (P < 0.05). LF Index, Col I and Col III expression levels showed a rising trend with increased pathological staging of liver fibrosis, presenting a positive correlation with the pathological staging of liver fibrosis (r = 0.718, r = 0.693, r = 0.611, P < 0.05). Conclusion: RTE quantitative analysis holds promise for noninvasive evaluation of the pathological staging of liver fibrosis. PMID:24955175
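
    Several of the characteristic quantities listed above are standard first-order statistics and gray-level co-occurrence matrix (GLCM) texture features. A sketch of how such features can be computed with scikit-image on a stand-in ROI (the LF Index is the vendor's composite score and is omitted; in scikit-image versions before 0.19 the functions are spelled greycomatrix/greycoprops):

      import numpy as np
      from scipy import stats
      from skimage.feature import graycomatrix, graycoprops

      rng = np.random.default_rng(1)
      roi = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)  # stand-in elastogram ROI

      # First-order statistics (MEAN, SD, SKEW, KURT)
      mean, sd = roi.mean(), roi.std()
      skew, kurt = stats.skew(roi.ravel()), stats.kurtosis(roi.ravel())

      # Second-order co-occurrence features (CONT, IDM, ASM, CORR, ENT)
      glcm = graycomatrix(roi, distances=[1], angles=[0], levels=64, normed=True)
      cont = graycoprops(glcm, "contrast")[0, 0]
      idm = graycoprops(glcm, "homogeneity")[0, 0]  # inverse difference moment
      asm = graycoprops(glcm, "ASM")[0, 0]
      corr = graycoprops(glcm, "correlation")[0, 0]
      p = glcm[:, :, 0, 0]
      ent = -np.sum(p[p > 0] * np.log2(p[p > 0]))   # co-occurrence entropy

      print(f"MEAN={mean:.1f} SD={sd:.1f} SKEW={skew:.2f} KURT={kurt:.2f}")
      print(f"CONT={cont:.1f} IDM={idm:.3f} ASM={asm:.5f} CORR={corr:.3f} ENT={ent:.2f}")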

  6. Quantitative determination of fluoxetine in human serum by high performance thin layer chromatography.

    PubMed

    Mennickent, Sigrid; Fierro, Ricardo; Vega, Mario; De Diego, Marta; Godoy, C Gloria

    2010-07-01

    A high performance thin layer chromatographic method was developed and validated for the quantification of fluoxetine in human serum. Fluoxetine was extracted by a liquid-liquid extraction method with diethyl ether as the extraction solvent. Imipramine was used as the internal standard. The chromatographic separation was achieved on precoated silica gel F 254 high performance thin layer chromatographic plates using a mixture of toluene/glacial acetic acid (4:5 v/v) as the mobile phase. 4-Dimethylamino-azobenzene-4-sulphonyl chloride was used as the derivatization reagent. Densitometric detection was done at 272 nm. The method was linear between 12.5 and 87.5 ng/spot, corresponding to 0.05 and 0.35 ng/μL of fluoxetine in human serum after the extraction process and applying 25 μL to the chromatographic plates. The method correlation coefficient was 0.999. The intra-assay and inter-assay precisions, expressed as the RSD, were in the range of 0.70-2.01% (n=3) and 0.81-3.90% (n=9), respectively. The LOD was 0.23 ng, and the LOQ was 0.70 ng. The method proved to be accurate, with a recovery between 94.75 and 98.95% and an RSD not higher than 3.61%, and was selective for the active principle tested. This method was successfully applied to quantify fluoxetine in patient serum samples. In conclusion, the method is useful for quantitative determination of fluoxetine in human serum. PMID:20533339
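
    The linearity and LOD/LOQ figures above come from standard calibration arithmetic. A sketch using ICH-style estimates (LOD = 3.3·s/slope, LOQ = 10·s/slope, with s the residual standard deviation of the regression) on an invented calibration series; the paper's raw data and its actual LOD/LOQ procedure are not given in the record.

      import numpy as np

      # Hypothetical calibration: amount per spot (ng) vs. densitometric peak area
      amount = np.array([12.5, 25.0, 37.5, 50.0, 62.5, 75.0, 87.5])
      area = np.array([310.0, 602.0, 895.0, 1180.0, 1490.0, 1775.0, 2060.0])

      slope, intercept = np.polyfit(amount, area, 1)
      pred = slope * amount + intercept
      r = np.corrcoef(amount, area)[0, 1]

      # Residual standard deviation of the regression, then ICH-style limits
      s_res = np.sqrt(np.sum((area - pred) ** 2) / (len(area) - 2))
      lod = 3.3 * s_res / slope
      loq = 10.0 * s_res / slope
      print(f"r = {r:.4f}, LOD = {lod:.2f} ng, LOQ = {loq:.2f} ng")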

  7. The use of a battery of tracking tests in the quantitative evaluation of neurological function

    NASA Technical Reports Server (NTRS)

    Repa, B. S.; Albers, J. W.; Potvin, A. R.; Tourtellotte, W. W.

    1972-01-01

    A tracking test battery has been applied in a drug trial designed to compare the efficacy of L-DOPA and amantadine to that of L-DOPA and placebo in the treatment of 28 patients with Parkinson's disease. The drug trial provided an ideal opportunity for objectively evaluating the usefulness of tracking tests in assessing changes in neurologic function. Evaluating changes in patient performance resulting from disease progression and controlled clinical trials is of great importance in establishing effective treatment programs.

  8. Quantitative evaluation of the memory bias effect in ROC studies with PET/CT

    NASA Astrophysics Data System (ADS)

    Kallergi, Maria; Pianou, Nicoletta; Georgakopoulos, Alexandros; Kafiri, Georgia; Pavlou, Spiros; Chatziioannou, Sofia

    2012-02-01

    PURPOSE. The purpose of the study was to evaluate the memory bias effect in ROC experiments with tomographic data and, specifically, in the evaluation of two different PET/CT protocols for the detection and diagnosis of recurrent thyroid cancer. MATERIALS AND METHODS. Two readers participated in an ROC experiment that evaluated tomographic images from 43 patients followed up for thyroid cancer recurrence. Readers first evaluated whole-body PET/CT scans of the patients and then a combination of whole-body and high-resolution head and neck scans of the same patients. The second set was read twice: once within 48 hours of the first set, and again at least a month later. The detection and diagnostic performances of the readers in the three reading sessions were assessed with the DBMMRMC and LABMRMC software using the area under the ROC curve as a performance index. Performances were also evaluated by comparing the number and the size of the detected abnormal foci among the three readings. RESULTS. There was no performance difference between the first and second treatments. There were statistically significant differences between the first and third, and between the second and third treatments, showing that memory can seriously affect the outcome of ROC studies. CONCLUSION. Despite the fact that tomographic data involve numerous image slices per patient, the memory bias effect is present and substantial, and it should be carefully eliminated from analogous ROC experiments.

  9. 24 CFR 968.330 - PHA performance and evaluation report.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    24 Housing and Urban Development, Volume 4 (2010-04-01), § 968.330 (... 250 or More Public Housing Units), PHA performance and evaluation report: For any FFY in which a PHA has received assistance under this subpart, the PHA shall submit a Performance...

  10. A Performance Evaluation System for Professional Support Personnel.

    ERIC Educational Resources Information Center

    Stronge, James H.; Helm, Virginia M.

    1992-01-01

    Provides a conceptual framework for a professional support personnel (e.g., counselors, deans, librarians) performance evaluation system. Outlines steps in evaluating support personnel (identifying system needs, relating program expectations to job responsibilities, selecting performance indicators, setting job performance standards, documenting…

  11. Quality consistency evaluation of Melissa officinalis L. commercial herbs by HPLC fingerprint and quantitation of selected phenolic acids.

    PubMed

    Arceusz, Agnieszka; Wesolowski, Marek

    2013-09-01

    To evaluate the quality consistency of commercial medicinal herbs, a simple and reliable HPLC method with a UV-vis detector was developed, both for fingerprint analysis and for quantitation of some pharmacologically active constituents (marker compounds). Melissa officinalis L. (lemon balm) was chosen for this study because it is widely used as an aromatic, culinary and medicinal remedy. About fifty peaks were found in each chromatogram of a lemon balm extract, including twelve satisfactorily resolved characteristic peaks. A reference chromatographic fingerprint for the studied medicinal herb was calculated using Matlab 9.1 software from all 19 lemon balm samples obtained from 12 Polish manufacturers. The similarity values and the results of principal component analysis revealed that all the samples were highly correlated with the reference fingerprint and could be accurately classified in relation to their quality consistency. Next, a quantitation of selected phenolic acids in the studied samples was performed. The results showed that the levels of phenolic acids, i.e. gallic, chlorogenic, syringic, caffeic, ferulic and rosmarinic, were as follows (mg/g of dry weight): 0.001-0.067, 0.010-0.333, 0.007-0.553, 0.047-0.705, 0.006-1.589 and 0.158-48.608, respectively. Statistical analysis indicated that rosmarinic acid occurs in M. officinalis at the highest level, whereas gallic acid at the lowest. A detailed inspection of these data also revealed that reference chromatographic fingerprints combined with quantitation of pharmacologically active constituents of the plant could be used as an efficient strategy for monitoring the lemon balm quality consistency. PMID:23770780
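
    A minimal sketch of a correlation-based similarity score between one sample chromatogram and the reference fingerprint, the kind of comparison described above; the signals are synthetic stand-ins sampled on a common time axis, and the paper's exact similarity formula is an assumption.

      import numpy as np

      def fingerprint_similarity(sample, reference):
          """Pearson-type similarity between a sample chromatogram and the
          reference fingerprint (both resampled onto a common time axis)."""
          s = (sample - sample.mean()) / sample.std()
          r = (reference - reference.mean()) / reference.std()
          return float(np.mean(s * r))

      rng = np.random.default_rng(2)
      reference = np.abs(rng.normal(size=500))           # stand-in mean fingerprint
      sample = reference + 0.05 * rng.normal(size=500)   # one batch, small deviation
      print(f"similarity = {fingerprint_similarity(sample, reference):.3f}")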

  12. Evaluation of performance impairment by spacecraft contaminants

    NASA Technical Reports Server (NTRS)

    Geller, I.; Hartman, R. J., Jr.; Mendez, V. M.

    1977-01-01

    The environmental contaminants (isolated as off-gases in Skylab and Apollo missions) were evaluated. Specifically, six contaminants were evaluated for their effects on the behavior of juvenile baboons. The concentrations of contaminants were determined through preliminary range-finding studies with laboratory rats. The contaminants evaluated were acetone, methyl ethyl ketone (MEK), methyl isobutyl ketone (MIBK), trichloroethylene (TCE), heptane and Freon 21. When the studies of the individual gases were completed, the baboons were also exposed to a mixture of MEK and TCE. The data obtained revealed alterations in the behavior of baboons exposed to relatively low levels of the contaminants. These findings were presented at the First International Symposium on Voluntary Inhalation of Industrial Solvents in Mexico City, June 21-24, 1976. A preprint of the proceedings is included.

  13. EVALUATION OF VENTILATION PERFORMANCE FOR INDOOR SPACE

    EPA Science Inventory

    The paper discusses a personal-computer-based application of computational fluid dynamics that can be used to determine the turbulent flow field and time-dependent/steady-state contaminant concentration distributions within isothermal indoor space. (NOTE: Ventilation performance ...

  14. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program

    PubMed Central

    Goodman, Melody S.; Si, Xuemei; Stafford, Jewel D.; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2016-01-01

    Background The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). Objectives We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. Methods A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyze results from quantitative questions on the assessments, pre- and post-tests, and evaluations. Results CARES fellows' knowledge increased at follow-up (75% of questions were answered correctly on average) compared with the baseline assessment (38% of questions were answered correctly on average); post-test scores were higher than pre-test scores in 9 out of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. Conclusions The CARES fellows training program was successful in participant satisfaction and in increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community-academic research partnerships. PMID:22982849

  15. Evaluation of a Quantitative Serological Assay for Diagnosing Chronic Pulmonary Aspergillosis

    PubMed Central

    Fujita, Yuka; Suzuki, Hokuto; Doushita, Kazushi; Kuroda, Hikaru; Takahashi, Masaaki; Yamazaki, Yasuhiro; Tsuji, Tadakatsu; Fujikane, Toshiaki; Osanai, Shinobu; Sasaki, Takaaki; Ohsaki, Yoshinobu

    2016-01-01

    The purpose of this study was to evaluate the clinical utility of a quantitative Aspergillus IgG assay for diagnosing chronic pulmonary aspergillosis. We examined Aspergillus-specific IgG levels in patients who met the following criteria: (i) chronic (duration of >3 months) pulmonary or systemic symptoms, (ii) radiological evidence of a progressive (over months or years) pulmonary lesion with surrounding inflammation, and (iii) no major discernible immunocompromising factors. Anti-Aspergillus IgG serum levels were retrospectively analyzed according to defined classifications. Mean Aspergillus IgG levels were significantly higher in the proven group than those in the possible and control groups (P < 0.01). Receiver operating characteristic curve analysis revealed that the Aspergillus IgG cutoff value for diagnosing proven cases was 50 mg of antigen-specific antibodies/liter (area under the curve, 0.94; sensitivity, 0.98; specificity, 0.84). The sensitivity and specificity for diagnosing proven cases using this cutoff were 0.77 and 0.78, respectively. The positive rates of Aspergillus IgG in the proven and possible groups were 97.9% and 39.2%, respectively, whereas that of the control group was 6.6%. The quantitative Aspergillus IgG assay offers reliable sensitivity and specificity for diagnosing chronic pulmonary aspergillosis and may be an alternative to the conventional precipitin test. PMID:27008878

  16. Quantitative and Qualitative Evaluation of Iranian Researchers’ Scientific Production in Dentistry Subfields

    PubMed Central

    Yaminfirooz, Mousa; Motallebnejad, Mina; Gholinia, Hemmat; Esbakian, Somayeh

    2015-01-01

    Background: As in other fields of medicine, scientific production in the field of dentistry holds a significant place. This study aimed at quantitatively and qualitatively evaluating Iranian researchers’ scientific output in the field of dentistry and determining their contribution in each of its subfields and branches. Methods: This research was a scientometric study that applied quantitative and qualitative indices of Web of Science (WoS). The research population consisted of 927 indexed documents published under the name of Iran in the time span of 1993-2012, extracted from WoS on 10 March 2013. The Mann-Whitney test and Pearson correlation coefficient were used for data analysis in SPSS 19. Results: 777 (83.73%) of all indexed items of scientific output in WoS were scientific articles. The highest growth rate of scientific production, 90%, belonged to the endodontic subfield. The correlation coefficient test showed a significant positive relationship between the number of documents and their publication age (P < 0.0001). There was a significant difference between the mean number of published articles in the first ten-year period (1993-2003) and that of the second one (2004-2013), in favor of the latter (P = 0.001). Conclusions: The distribution frequencies of scientific production in the various subfields of dentistry were very different. The infrastructure needs to be reinforced for more balanced scientific production in the field and its related subfields. PMID:26635439

  17. Evaluation of a Quantitative Serological Assay for Diagnosing Chronic Pulmonary Aspergillosis.

    PubMed

    Fujiuchi, Satoru; Fujita, Yuka; Suzuki, Hokuto; Doushita, Kazushi; Kuroda, Hikaru; Takahashi, Masaaki; Yamazaki, Yasuhiro; Tsuji, Tadakatsu; Fujikane, Toshiaki; Osanai, Shinobu; Sasaki, Takaaki; Ohsaki, Yoshinobu

    2016-06-01

    The purpose of this study was to evaluate the clinical utility of a quantitative Aspergillus IgG assay for diagnosing chronic pulmonary aspergillosis. We examined Aspergillus-specific IgG levels in patients who met the following criteria: (i) chronic (duration of >3 months) pulmonary or systemic symptoms, (ii) radiological evidence of a progressive (over months or years) pulmonary lesion with surrounding inflammation, and (iii) no major discernible immunocompromising factors. Anti-Aspergillus IgG serum levels were retrospectively analyzed according to defined classifications. Mean Aspergillus IgG levels were significantly higher in the proven group than those in the possible and control groups (P < 0.01). Receiver operating characteristic curve analysis revealed that the Aspergillus IgG cutoff value for diagnosing proven cases was 50 mg of antigen-specific antibodies/liter (area under the curve, 0.94; sensitivity, 0.98; specificity, 0.84). The sensitivity and specificity for diagnosing proven cases using this cutoff were 0.77 and 0.78, respectively. The positive rates of Aspergillus IgG in the proven and possible groups were 97.9% and 39.2%, respectively, whereas that of the control group was 6.6%. The quantitative Aspergillus IgG assay offers reliable sensitivity and specificity for diagnosing chronic pulmonary aspergillosis and may be an alternative to the conventional precipitin test. PMID:27008878

  18. NEUROBEHAVIORAL EVALUATION SYSTEM (NES) AND SCHOOL PERFORMANCE

    EPA Science Inventory

    The aims of this study were to explore the validity of a set of computerized tests, and to explore the validity of reaction time variability as an index of sustained attention. n Phase I, 105 7- to 10-year-old children were presented with five tests from the Neurobehavioral Evalu...

  19. The performance environment of the England youth soccer teams: a quantitative investigation.

    PubMed

    Pain, Matthew A; Harwood, Chris G

    2008-09-01

    We examined the performance environment of the England youth soccer teams. Using a conceptually grounded questionnaire developed from the themes identified by Pain and Harwood (2007), 82 players and 23 national coaches and support staff were surveyed directly following international tournaments regarding the factors that positively and negatively influenced performance. The survey enabled data to be captured regarding both the extent and magnitude of the impact of the factors comprising the performance environment. Overall, team and social factors were generally perceived to have the greatest positive impact, with players and staff showing high levels of consensus in their evaluations. Team leadership and strong team cohesion were identified by both groups as having the greatest positive impact. Overall, far fewer variables were perceived to have a negative impact on performance, especially for players. The main negatives common to both groups were players losing composure during games, player boredom, and a lack of available activities in the hotel. The major findings support those of Pain and Harwood (2007) and in using a larger sample helped to corroborate and strengthen the generalizability of the findings. PMID:18720205

  20. [Quantitative evaluation of film-screen combinations for x-ray diagnosis].

    PubMed

    Bronder, T; Heinze-Assmann, R

    1988-05-01

    The properties of screen/film combinations for radiographs set a lower limit on the x-ray exposure of the patient and an upper limit on the quality of the x-ray picture. Sensitivity, slope, and resolution of different screen/film combinations were determined using a measuring phantom developed at the PTB. For all screens used, the measurements show the same relation between screen sensitivity and resolution. This allows a quantitative evaluation of image quality. A classification scheme derived from these results facilitates the selection of screen/film combinations for practical use. In addition, for quality assurance, gross differences in material properties and conditions of film development can be detected with the aid of the measuring phantom. PMID:3399512

  1. Methods for quantitative evaluation of dynamics of repair proteins within irradiated cells

    NASA Astrophysics Data System (ADS)

    Hable, V.; Dollinger, G.; Greubel, C.; Hauptner, A.; Krücken, R.; Dietzel, S.; Cremer, T.; Drexler, G. A.; Friedl, A. A.; Löwe, R.

    2006-04-01

    Living HeLa cells are irradiated in a precisely targeted manner with single 100 MeV oxygen ions by the superconducting ion microprobe SNAKE, the Superconducting Nanoscope for Applied Nuclear (=Kern-) Physics Experiments, at the Munich 14 MV tandem accelerator. Various proteins, which are involved directly or indirectly in repair processes, accumulate as clusters (so-called foci) at DNA double-strand breaks (DSBs) induced by the ions. The spatiotemporal dynamics of the foci formed by the phosphorylated histone γ-H2AX are studied. For this purpose, cells are irradiated in line patterns. The γ-H2AX is made visible under the fluorescence microscope using immunofluorescence techniques. Quantitative analysis methods are developed to evaluate the data of the microscopic images in order to analyze the movement of the foci and their changing size.

  2. Evaluating the Performance of Administrators: The Process and the Tools.

    ERIC Educational Resources Information Center

    Herman, Jerry J.

    1991-01-01

    Describes the various roles (monitor, information gatherer, communicator and feedback provider, clarifier, coanalyzer, assister, resource provider, and motivator) played by the supervisor when evaluating administrators. Presents a sample evaluation instrument assessing five major performance areas (management, professionalism, leadership,…

  3. Performance evaluation of 1 kw PEFC

    SciTech Connect

    Komaki, Hideaki; Tsuchiyama, Syozo

    1996-12-31

    This report covers part of a joint study on a PEFC propulsion system for surface ships, summarized in a presentation to this Seminar, entitled "Study on a PEFC Propulsion System for Surface Ships", and which envisages application to a 1,500 DWT cargo vessel. The aspect treated here concerns the effects brought on PEFC operating performance by conditions particular to shipboard operation. The performance characteristics were examined through tests performed on a 1 kW stack and on a single cell (manufactured by Fuji Electric Co., Ltd.). The tests covered the items (1) to (4) cited in the headings of the sections that follow. Specifications of the stack and single cell are as given.

  4. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs. PMID:26076424
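
    A simplified sketch of two of the criteria named above, detection rate and localization accuracy, via nearest-neighbour matching of reported to ground-truth positions. The 250-nm tolerance and the data are invented, and a full benchmark would enforce one-to-one pairing between found and true emitters.

      import numpy as np
      from scipy.spatial import cKDTree

      def evaluate_localizations(found, truth, tol=250.0):
          """Detection rate and RMSE from nearest-neighbour matches within tol (nm)."""
          dist, _ = cKDTree(truth).query(found, distance_upper_bound=tol)
          matched = np.isfinite(dist)                  # unmatched points get inf
          rate = np.count_nonzero(matched) / len(truth)
          rmse = float(np.sqrt(np.mean(dist[matched] ** 2))) if matched.any() else float("nan")
          return rate, rmse

      rng = np.random.default_rng(4)
      truth = rng.uniform(0, 10000, size=(200, 2))         # ground-truth emitters (nm)
      found = truth + rng.normal(0, 30, size=truth.shape)  # localizations, 30-nm error
      rate, rmse = evaluate_localizations(found, truth)
      print(f"detection rate = {rate:.2f}, accuracy (RMSE) = {rmse:.1f} nm")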

  5. Quantitative MR evaluation of body composition in patients with Duchenne muscular dystrophy.

    PubMed

    Pichiecchio, Anna; Uggetti, Carla; Egitto, Maria Grazia; Berardinelli, Angela; Orcesi, Simona; Gorni, Ksenija Olga Tatiana; Zanardi, Cristina; Tagliabue, Anna

    2002-11-01

    The aim of this study was to propose a quantitative MR protocol with very short acquisition time and good reliability in volume construction, for the evaluation of body composition in patients affected by Duchenne muscular dystrophy (DMD). This MR protocol was compared with common anthropometric evaluations of the same patients. Nine boys affected by DMD, ranging in age from 6 to 12 years, were selected to undergo MR examination. Transversal T1-weighted spin-echo sequences (0.5 T; TR 300 ms, TE 10 ms, slice thickness 10 mm, slice gap 1 mm) were used for all acquisitions, each consisting of 8 slices and lasting just 54 s. Whole-body examination needed an average of nine acquisitions. Afterwards, images were downloaded to an independent workstation and, through electronic segmentation with a reference filter, total volume and adipose tissue volume were calculated manually. This process took up to 2 h for each patient. The MR data were compared with the anthropometric evaluations. Affected children have a marked increase in adipose tissue and a decrease in lean tissue compared with reference healthy controls. Mean fat mass calculated by MR was significantly higher than mean fat mass obtained using anthropometric measurements (p<0.001). Our MR study proved to be accurate and easy to apply, although it was time-consuming. We recommend it for monitoring the progression of the disease and planning DMD patients' diets. PMID:12386760

  6. Quantitative analysis of topoisomerase II{alpha} to rapidly evaluate cell proliferation in brain tumors

    SciTech Connect

    Oda, Masashi; Arakawa, Yoshiki; Kano, Hideyuki; Kawabata, Yasuhiro; Katsuki, Takahisa; Shirahata, Mitsuaki; Ono, Makoto; Yamana, Norikazu; Hashimoto, Nobuo; Takahashi, Jun A. . E-mail: jat@kuhp.kyoto-u.ac.jp

    2005-06-17

    Immunohistochemical cell proliferation analyses have come into wide use for the evaluation of tumor malignancy. Topoisomerase IIα (topo IIα), an essential nuclear enzyme, is known to have cell-cycle-coupled expression. We here show the usefulness of quantitative analysis of topo IIα mRNA to rapidly evaluate cell proliferation in brain tumors. A protocol to quantify topo IIα mRNA was developed with real-time RT-PCR; quantification took only 3 h per specimen. A total of 28 brain tumors were analyzed, and the level of topo IIα mRNA was significantly correlated with its immunostaining index (p < 0.0001, r = 0.9077). Furthermore, it sharply detected that topo IIα mRNA decreased in growth-inhibited glioma cells. These results support that topo IIα mRNA may be a good and rapid indicator for evaluating cell proliferative potential in brain tumors.

  7. Quantitative evaluation of linear and nonlinear methods characterizing interdependencies between brain signals

    PubMed Central

    Ansari-Asl, Karim; Senhadji, Lotfi; Bellanger, Jean-Jacques; Wendling, Fabrice

    2006-01-01

    Brain functional connectivity can be characterized by the temporal evolution of correlation between signals recorded from spatially-distributed regions. It is aimed at explaining how different brain areas interact within networks involved during normal (as in cognitive tasks) or pathological (as in epilepsy) situations. Numerous techniques were introduced for assessing this connectivity. Recently, some efforts were made to compare methods performances but mainly qualitatively and for a special application. In this paper, we go further and propose a comprehensive comparison of different classes of methods (linear and nonlinear regressions, phase synchronization (PS), and generalized synchronization (GS)) based on various simulation models. For this purpose, quantitative criteria are used: in addition to mean square error (MSE) under null hypothesis (independence between two signals) and mean variance (MV) computed over all values of coupling degree in each model, we introduce a new criterion for comparing performances. Results show that the performances of the compared methods are highly depending on the hypothesis regarding the underlying model for the generation of the signals. Moreover, none of them outperforms the others in all cases and the performance hierarchy is model-dependent. PMID:17025676
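
    A small illustration of the comparison logic for one method class (linear regression): estimate the coupling index on independent white-noise signals to characterize its error under the null hypothesis, then on a weakly coupled pair. The signal model and numbers are illustrative only, not the simulation models used in the paper.

      import numpy as np

      def r2_linear(x, y):
          """Linear regression index: squared Pearson correlation of two signals."""
          return np.corrcoef(x, y)[0, 1] ** 2

      rng = np.random.default_rng(3)
      n, trials = 1024, 200

      # Error under H0 (independent signals): the true coupling is zero,
      # so the mean squared estimate serves as an MSE-type criterion.
      null_r2 = [r2_linear(rng.normal(size=n), rng.normal(size=n)) for _ in range(trials)]
      print(f"MSE under H0: {np.mean(np.square(null_r2)):.6f}")

      # A coupled pair: y shares a scaled component of x.
      x = rng.normal(size=n)
      y = 0.7 * x + 0.3 * rng.normal(size=n)
      print(f"coupled r2: {r2_linear(x, y):.3f}")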

  8. Quantitative Ultrasonic Evaluation of Radiation-Induced Late Tissue Toxicity: Pilot Study of Breast Cancer Radiotherapy

    SciTech Connect

    Liu Tian; Zhou Jun; Yoshida, Emi J.; Woodhouse, Shermian A.; Schiff, Peter B.; Wang, Tony J.C.; Lu Zhengfeng; Pile-Spellman, Eliza; Zhang Pengpeng; Kutcher, Gerald J.

    2010-11-01

    Purpose: To investigate the use of advanced ultrasonic imaging to quantitatively evaluate normal-tissue toxicity in breast-cancer radiation treatment. Methods and Materials: Eighteen breast cancer patients who received radiation treatment were enrolled in an institutional review board-approved clinical study. Radiotherapy involved a radiation dose of 50.0 to 50.4 Gy delivered to the entire breast, followed by an electron boost of 10.0 to 16.0 Gy delivered to the tumor bed. Patients underwent scanning with ultrasound during follow-up, which ranged from 6 to 94 months (median, 22 months) postradiotherapy. Conventional ultrasound images and radio-frequency (RF) echo signals were acquired from treated and untreated breasts. Three ultrasound parameters, namely, skin thickness, Pearson coefficient, and spectral midband fit, were computed from RF signals to measure radiation-induced changes in dermis, hypodermis, and subcutaneous tissue, respectively. Ultrasound parameter values of the treated breast were compared with those of the untreated breast. Ultrasound findings were compared with clinical assessment using Radiation Therapy Oncology Group (RTOG) late-toxicity scores. Results: Significant changes were observed in ultrasonic parameter values of the treated vs. untreated breasts. Average skin thickness increased by 27.3%, from 2.05 ± 0.22 mm to 2.61 ± 0.52 mm; Pearson coefficient decreased by 31.7%, from 0.41 ± 0.07 to 0.28 ± 0.05; and midband fit increased by 94.6%, from -0.92 ± 7.35 dB to 0.87 ± 6.70 dB. Ultrasound evaluations were consistent with RTOG scores. Conclusions: Quantitative ultrasound provides a noninvasive, objective means of assessing radiation-induced changes to the skin and subcutaneous tissue. This imaging tool will become increasingly valuable as we continue to improve radiation therapy technique.
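
    Of the three ultrasound parameters, the midband fit is obtained from a straight-line fit to the power spectrum (in dB) of the RF echo signal within the analysis band, evaluated at the band centre. A rough sketch under stated assumptions: the band, sampling rate, and RF data are invented, and the reference-phantom calibration used in real spectral-parameter imaging is omitted.

      import numpy as np

      def midband_fit(rf_segment, fs, band=(5e6, 15e6)):
          """Line fit to the windowed power spectrum (dB) inside the band;
          returns the fitted value at the band centre (calibration omitted)."""
          win = np.hanning(len(rf_segment))
          power_db = 20 * np.log10(np.abs(np.fft.rfft(rf_segment * win)) + 1e-12)
          freqs = np.fft.rfftfreq(len(rf_segment), d=1.0 / fs) / 1e6   # MHz
          lo, hi = band[0] / 1e6, band[1] / 1e6
          sel = (freqs >= lo) & (freqs <= hi)
          slope, intercept = np.polyfit(freqs[sel], power_db[sel], 1)
          return slope * (lo + hi) / 2 + intercept

      rng = np.random.default_rng(5)
      rf = rng.normal(size=2048)                # stand-in RF segment
      print(f"midband fit = {midband_fit(rf, fs=40e6):.1f} dB")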

  9. Quantitative evaluation of interaction force between functional groups in protein and polymer brush surfaces.

    PubMed

    Sakata, Sho; Inoue, Yuuki; Ishihara, Kazuhiko

    2014-03-18

    To understand interactions between polymer surfaces and different functional groups in proteins, interaction forces were quantitatively evaluated by force-versus-distance curve measurements using atomic force microscopy with a functional-group-functionalized cantilever. Various polymer brush surfaces were systematically prepared by surface-initiated atom transfer radical polymerization as well-defined model surfaces to understand protein adsorption behavior. The polymer brush layers consisted of phosphorylcholine groups (zwitterionic/hydrophilic), trimethylammonium groups (cationic/hydrophilic), sulfonate groups (anionic/hydrophilic), hydroxyl groups (nonionic/hydrophilic), and n-butyl groups (nonionic/hydrophobic) in their side chains. The interaction forces between these polymer brush surfaces and different functional groups (carboxyl groups, amino groups, and methyl groups, which are typical functional groups existing in proteins) were quantitatively evaluated by force-versus-distance curve measurements using atomic force microscopy with a functional-group-functionalized cantilever. Furthermore, the amount of adsorbed protein on the polymer brush surfaces was quantified by surface plasmon resonance using albumin with a negative net charge and lysozyme with a positive net charge under physiological conditions. The amount of proteins adsorbed on the polymer brush surfaces corresponded to the interaction forces generated between the functional groups on the cantilever and the polymer brush surfaces. The weakest interaction force and least amount of protein adsorbed were observed in the case of the polymer brush surface with phosphorylcholine groups in the side chain. On the other hand, positive and negative surfaces generated strong forces against the oppositely charged functional groups. In addition, they showed significant adsorption with albumin and lysozyme, respectively. These results indicated that the interaction force at the functional group level might be

  10. Quantitative evaluation of optical coherence tomography signal enhancement with gold nanoshells.

    PubMed

    Agrawal, Anant; Huang, Stanley; Wei Haw Lin, Alex; Lee, Min-Ho; Barton, Jennifer K; Drezek, Rebekah A; Pfefer, T Joshua

    2006-01-01

    Nanoshell-enhanced optical coherence tomography (OCT) is a novel technique with the potential for molecular imaging and improved disease detection. However, optimization of this approach will require a quantitative understanding of the influence of nanoshell parameters on detected OCT signals. In this study, OCT was performed at 1310 nm in water and turbid tissue-simulating phantoms to which nanoshells were added. The effect of nanoshell concentration, core diameter, and shell thickness on signal enhancement was characterized. Experimental results indicated trends that were consistent with predicted optical properties: a monotonic increase in signal intensity and attenuation with increasing shell and core size. Threshold concentrations for a 2-dB OCT signal intensity gain were determined for several nanoshell geometries. For the most highly backscattering nanoshells tested (291-nm core diameter, 25-nm shell thickness), a concentration of 10⁹ nanoshells/mL was needed to produce this signal increase. Based on these results, we discuss various practical considerations for optimizing nanoshell-enhanced OCT. The quantitative experimental data presented here will facilitate optimization of OCT-based diagnostics and may also be relevant to other reflectance-based approaches as well. PMID:16965149
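
    A toy computation of the 2-dB threshold-concentration logic described above; the concentration-signal pairs are invented, not the study's data.

      import numpy as np

      # Hypothetical mean OCT signal (linear, relative to baseline) vs. concentration
      concentration = np.array([1e7, 1e8, 1e9, 1e10])   # nanoshells/mL
      signal = np.array([1.02, 1.10, 1.62, 3.30])

      gain_db = 10 * np.log10(signal)                   # intensity gain in dB
      # First concentration reaching a 2-dB gain (assumes one exists in the series)
      threshold = concentration[np.argmax(gain_db >= 2.0)]
      print(f"threshold concentration: {threshold:.1e} nanoshells/mL")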

  11. Quantitative evaluation of his-tag purification and immunoprecipitation of tristetraprolin and its mutant proteins from transfected human cells

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Histidine (His)-tag is widely used for affinity purification of recombinant proteins, but the yield and purity of expressed proteins are quite different. Little information is available about quantitative evaluation of this procedure. The objective of the current study was to evaluate the His-tag pr...

  12. Evaluating the effect of a reader worker program on team performance

    SciTech Connect

    Hahn, H.A.; Alvarez, Y.P.

    1994-03-01

    When safety, security, or other logistical concerns prevent direct objective assessment of team performance, other evaluation techniques become necessary. In this paper, the effect of a Department of Energy-mandated reader worker program on team performance at a particular DOE facility was evaluated using unstructured observations, informal discussions with technicians, and human reliability analysis. The reader worker program is intended to enhance nuclear explosive safety by improving the reliability of team performance. The three methods used for the evaluation combine to provide a strong indication that team performance is in fact enhanced by a properly implemented reader worker procedure. Because direct quantitative data on dependent variables particular to the task of interest are not available, however, there has been some skepticism among facility staff regarding the results.

  13. Quantitative evaluation of susceptibility effects caused by dental materials in head magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Strocchi, S.; Ghielmi, M.; Basilico, F.; Macchi, A.; Novario, R.; Ferretti, R.; Binaghi, E.

    2016-03-01

    This work quantitatively evaluates the effects induced by the susceptibility characteristics of materials commonly used in dental practice on the quality of head MR images in a clinical 1.5 T scanner. The proposed evaluation procedure measures the image artifacts induced by susceptibility in MR images, providing an index consistent with the global degradation as perceived by experts. Susceptibility artifacts were evaluated in a near-clinical setup, using a phantom with susceptibility and geometric characteristics similar to those of a human head. We tested several dental materials (PAL Keramit, Ti6Al4V-ELI, Keramit NP, ILOR F, and Zirconia) and different clinical MR acquisition sequences, including "classical" spin-echo (SE) as well as fast, gradient, and diffusion sequences. The evaluation is designed as a matching process between reference and artifact-affected images recording the same scene. The extent of the degradation induced by susceptibility is then measured in terms of similarity with the corresponding reference image. The matching process involves a multimodal registration task and the use of a psychophysically validated similarity index based on the correlation coefficient. The proposed analyses are integrated within a computer-supported procedure that interactively guides users through the different phases of the evaluation method. Two-dimensional and three-dimensional indices were computed for each material and each acquisition sequence. From these, a ranking of the materials was drawn by averaging the results obtained. Zirconia and ILOR F appear to be the best choices from the susceptibility-artifact point of view, followed, in order, by PAL Keramit, Ti6Al4V-ELI and Keramit NP.
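
    The similarity scoring described above reduces, in essence, to a correlation coefficient between co-registered images; below is a minimal sketch, assuming registration has already been done and using our own function and variable names.

        import numpy as np

        def similarity_index(reference, distorted):
            """Correlation-coefficient similarity between a reference image
            and the co-registered image acquired with a dental material in
            place. Values near 1 indicate little susceptibility-induced
            degradation; lower values indicate stronger artifacts."""
            ref = reference.astype(float).ravel()
            dis = distorted.astype(float).ravel()
            ref -= ref.mean()
            dis -= dis.mean()
            return float(ref @ dis / (np.linalg.norm(ref) * np.linalg.norm(dis)))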

  14. Computer-Aided Image Analysis and Fractal Synthesis in the Quantitative Evaluation of Tumor Aggressiveness in Prostate Carcinomas

    PubMed Central

    Waliszewski, Przemyslaw

    2016-01-01

    The subjective evaluation of tumor aggressiveness is a cornerstone of contemporary tumor pathology. Large intra- and interobserver variability is a known limiting factor of this approach. This fundamental weakness influences the statistical deterministic models of progression risk assessment. It is unlikely that the recent modification of tumor grading according to Gleason criteria for prostate carcinoma will cause a qualitative change or significantly improve accuracy. The Gleason system does not allow the identification of low-aggressive carcinomas by precise criteria. This ontological dichotomy implies the application of an objective, quantitative approach to the evaluation of tumor aggressiveness as an alternative. That novel approach must be developed and validated in a manner that is independent of the results of any subjective evaluation. For example, computer-aided image analysis can provide information about the geometry of the spatial distribution of cancer cell nuclei. A series of interrelated complexity measures characterizes the complex tumor images unequivocally. Using those measures, carcinomas can be classified into equivalence classes and compared with each other. Furthermore, those measures define quantitative criteria for the identification of low- and high-aggressive prostate carcinomas, information that the subjective approach is not able to provide. The co-application of those complexity measures in cluster analysis leads to the conclusion that the classification of tumor aggressiveness for prostate carcinomas, whether subjective or objective, should comprise at most three grades (or classes). Finally, this set of global fractal dimensions enables a look into the dynamics of the underlying cellular system of interacting cells and the reconstruction of the temporal-spatial attractor based on Takens' embedding theorem. Both computer-aided image analysis and the subsequent fractal synthesis could be performed

  15. Computer-Aided Image Analysis and Fractal Synthesis in the Quantitative Evaluation of Tumor Aggressiveness in Prostate Carcinomas.

    PubMed

    Waliszewski, Przemyslaw

    2016-01-01

    The subjective evaluation of tumor aggressiveness is a cornerstone of contemporary tumor pathology. Large intra- and interobserver variability is a known limiting factor of this approach. This fundamental weakness influences the statistical deterministic models of progression risk assessment. It is unlikely that the recent modification of tumor grading according to Gleason criteria for prostate carcinoma will cause a qualitative change or significantly improve accuracy. The Gleason system does not allow the identification of low-aggressive carcinomas by precise criteria. This ontological dichotomy implies the application of an objective, quantitative approach to the evaluation of tumor aggressiveness as an alternative. That novel approach must be developed and validated in a manner that is independent of the results of any subjective evaluation. For example, computer-aided image analysis can provide information about the geometry of the spatial distribution of cancer cell nuclei. A series of interrelated complexity measures characterizes the complex tumor images unequivocally. Using those measures, carcinomas can be classified into equivalence classes and compared with each other. Furthermore, those measures define quantitative criteria for the identification of low- and high-aggressive prostate carcinomas, information that the subjective approach is not able to provide. The co-application of those complexity measures in cluster analysis leads to the conclusion that the classification of tumor aggressiveness for prostate carcinomas, whether subjective or objective, should comprise at most three grades (or classes). Finally, this set of global fractal dimensions enables a look into the dynamics of the underlying cellular system of interacting cells and the reconstruction of the temporal-spatial attractor based on Takens' embedding theorem. Both computer-aided image analysis and the subsequent fractal synthesis could be performed

  16. High-performance piezoelectric nanogenerators for self-powered nanosystems: quantitative standards and figures of merit.

    PubMed

    Wu, Wenzhuo

    2016-03-18

    Harvesting energy from the atmosphere cost-effectively is critical both for addressing worldwide long-term energy needs at the macro-scale and for achieving the sustainable, maintenance-free operation of nanodevices at the micro-scale (Wang and Wu 2012 Angew. Chem. Int. Ed. 51 11700-21). Piezoelectric nanogenerator (NG) technology has demonstrated great application potential in harvesting the ubiquitous and abundant mechanical energy. Despite the progress made in this rapidly advancing field, a fundamental understanding and a common standard for consistently quantifying and evaluating the performance of the various types of piezoelectric NGs are still lacking. In their recent study, Crossley and Kar-Narayan (2015 Nanotechnology 26 344001) systematically investigated the dynamical properties of piezoelectric NGs, taking into account the effect of the driving mechanism and load frequency on NG performance. They further defined the NGs' figures of merit as the energy harvested normalized by the applied strain or stress, for NGs under strain-driven or stress-driven conditions respectively, as commonly encountered in vibrational energy harvesting. This work provides new insight and a feasible approach for consistently evaluating piezoelectric nanomaterials and NG devices, which is important for designing and optimizing nanoscale piezoelectric energy harvesters, as well as for promoting their applications in emerging areas such as the Internet of Things, wearable devices, and self-powered nanosystems. PMID:26871611

  17. High-performance piezoelectric nanogenerators for self-powered nanosystems: quantitative standards and figures of merit

    NASA Astrophysics Data System (ADS)

    Wu, Wenzhuo

    2016-03-01

    Harvesting energy from the atmosphere cost-effectively is critical both for addressing worldwide long-term energy needs at the macro-scale and for achieving the sustainable, maintenance-free operation of nanodevices at the micro-scale (Wang and Wu 2012 Angew. Chem. Int. Ed. 51 11700-21). Piezoelectric nanogenerator (NG) technology has demonstrated great application potential in harvesting the ubiquitous and abundant mechanical energy. Despite the progress made in this rapidly advancing field, a fundamental understanding and a common standard for consistently quantifying and evaluating the performance of the various types of piezoelectric NGs are still lacking. In their recent study, Crossley and Kar-Narayan (2015 Nanotechnology 26 344001) systematically investigated the dynamical properties of piezoelectric NGs, taking into account the effect of the driving mechanism and load frequency on NG performance. They further defined the NGs' figures of merit as the energy harvested normalized by the applied strain or stress, for NGs under strain-driven or stress-driven conditions respectively, as commonly encountered in vibrational energy harvesting. This work provides new insight and a feasible approach for consistently evaluating piezoelectric nanomaterials and NG devices, which is important for designing and optimizing nanoscale piezoelectric energy harvesters, as well as for promoting their applications in emerging areas such as the Internet of Things, wearable devices, and self-powered nanosystems.
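
    A minimal sketch of the strain-normalized figure of merit described above, computing harvested energy as the time integral of the output power; the waveform and drive values are illustrative assumptions, not data from either paper.

        import numpy as np

        def ng_figure_of_merit(voltage, current, dt, drive):
            """Energy harvested per unit applied strain (strain-driven NG)
            or per unit applied stress (stress-driven NG).

            voltage, current : sampled NG output (V, A)
            dt               : sampling interval (s)
            drive            : peak applied strain (-) or stress (Pa)
            """
            energy = np.sum(voltage * current) * dt   # harvested energy (J)
            return energy / drive

        # Illustrative 50 Hz output under a 0.1% peak strain (assumed values)
        t = np.arange(0.0, 1.0, 1e-4)
        v = 0.5 * np.sin(2 * np.pi * 50 * t)          # V
        i = 1e-6 * np.sin(2 * np.pi * 50 * t)         # A
        print(f"FoM ~ {ng_figure_of_merit(v, i, 1e-4, drive=1e-3):.2e} J per unit strain")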

  18. Using Ratio Analysis to Evaluate Financial Performance.

    ERIC Educational Resources Information Center

    Minter, John; And Others

    1982-01-01

    The ways in which ratio analysis can help in long-range planning, budgeting, and asset management to strengthen financial performance and help avoid financial difficulties are explained. Types of ratios considered include balance sheet ratios, net operating ratios, and contribution and demand ratios. (MSE)

  19. Performance Evaluation Gravity Probe B Design

    NASA Technical Reports Server (NTRS)

    Francis, Ronnie; Wells, Eugene M.

    1996-01-01

    This final report documents the work done to develop a 6 degree-of-freedom simulation of the Lockheed Martin Gravity Probe B (GPB) Spacecraft. This simulation includes the effects of vehicle flexibility and propellant slosh. The simulation was used to investigate the control performance of the spacecraft when subjected to realistic on orbit disturbances.

  20. Game Performance Evaluation in Male Goalball Players.

    PubMed

    Molik, Bartosz; Morgulec-Adamowicz, Natalia; Kosmol, Andrzej; Perkowski, Krzysztof; Bednarczuk, Grzegorz; Skowroński, Waldemar; Gomez, Miguel Angel; Koc, Krzysztof; Rutkowska, Izabela; Szyman, Robert J

    2015-11-22

    Goalball is a Paralympic sport exclusively for athletes who are visually impaired and blind. The aims of this study were twofold: to describe game performance of elite male goalball players based upon the degree of visual impairment, and to determine if game performance was related to anthropometric characteristics of elite male goalball players. The study sample consisted of 44 male goalball athletes. A total of 38 games were recorded during the Summer Paralympic Games in London 2012. Observations were reported using the Game Efficiency Sheet for Goalball. Additional anthropometric measurements included body mass (kg), body height (cm), the arm span (cm) and length of the body in the defensive position (cm). The results differentiating both groups showed that the players with total blindness obtained higher means than the players with visual impairment for game indicators such as the sum of defense (p = 0.03) and the sum of good defense (p = 0.04). The players with visual impairment obtained higher results than those with total blindness for attack efficiency (p = 0.04), the sum of penalty defenses (p = 0.01), and fouls (p = 0.01). The study showed that athletes with blindness demonstrated higher game performance in defense. However, athletes with visual impairment presented higher efficiency in offensive actions. The analyses confirmed that body mass, body height, the arm span and length of the body in the defensive position did not differentiate players' performance at the elite level. PMID:26834872

  1. Space Shuttle Underside Astronaut Communications Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Dobbins, Justin A.; Loh, Yin-Chung; Kroll, Quin D.; Sham, Catherine C.

    2005-01-01

    The Space Shuttle Ultra High Frequency (UHF) communications system is planned to provide Radio Frequency (RF) coverage for astronauts working on the underside of the Space Shuttle Orbiter (SSO) for thermal tile inspection and repair. This study assesses the Space Shuttle UHF communication performance for astronauts in the shadow region without line-of-sight (LOS) to the Space Shuttle and Space Station UHF antennas. To ensure RF coverage performance at anticipated astronaut worksites, the link margin between the UHF antennas and Extravehicular Activity (EVA) astronauts with significant vehicle structure blockage was analyzed. A series of near-field measurements was performed using the NASA/JSC Anechoic Chamber antenna test facilities. Computational investigations were also performed using electromagnetic modeling techniques. A computer simulation tool based on the Geometrical Theory of Diffraction (GTD) was used to compute the signal strengths. The signal strength was obtained by computing the reflected and diffracted fields along the propagation paths between the transmitting and receiving antennas. Based on the results obtained in this study, RF coverage for the UHF communication links was determined for the anticipated astronaut worksites in the shadow region underneath the Space Shuttle.
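
    For readers unfamiliar with link budgeting, the sketch below shows the basic bookkeeping behind a link-margin estimate; it uses a free-space path loss plus a lumped blockage term, whereas the study itself obtained signal strengths from GTD simulation and chamber measurements, and every number below is an assumed placeholder.

        import math

        def link_margin_db(p_tx_dbm, g_tx_dbi, g_rx_dbi, freq_mhz, dist_m,
                           blockage_loss_db, rx_sensitivity_dbm):
            """Link margin (dB) = received power minus receiver sensitivity."""
            # Free-space path loss: 20 log10(d_km) + 20 log10(f_MHz) + 32.44
            fspl_db = (20.0 * math.log10(dist_m / 1000.0)
                       + 20.0 * math.log10(freq_mhz) + 32.44)
            p_rx_dbm = p_tx_dbm + g_tx_dbi + g_rx_dbi - fspl_db - blockage_loss_db
            return p_rx_dbm - rx_sensitivity_dbm

        # e.g. a 400 MHz link over 30 m with 20 dB structural blockage (assumed)
        print(f"margin ~ {link_margin_db(30, 3, 0, 400, 30, 20, -100):.1f} dB")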

  2. Game Performance Evaluation in Male Goalball Players

    PubMed Central

    Molik, Bartosz; Morgulec-Adamowicz, Natalia; Kosmol, Andrzej; Perkowski, Krzysztof; Bednarczuk, Grzegorz; Skowroński, Waldemar; Gomez, Miguel Angel; Koc, Krzysztof; Rutkowska, Izabela; Szyman, Robert J

    2015-01-01

    Goalball is a Paralympic sport exclusively for athletes who are visually impaired and blind. The aims of this study were twofold: to describe game performance of elite male goalball players based upon the degree of visual impairment, and to determine if game performance was related to anthropometric characteristics of elite male goalball players. The study sample consisted of 44 male goalball athletes. A total of 38 games were recorded during the Summer Paralympic Games in London 2012. Observations were reported using the Game Efficiency Sheet for Goalball. Additional anthropometric measurements included body mass (kg), body height (cm), the arm span (cm) and length of the body in the defensive position (cm). The results differentiating both groups showed that the players with total blindness obtained higher means than the players with visual impairment for game indicators such as the sum of defense (p = 0.03) and the sum of good defense (p = 0.04). The players with visual impairment obtained higher results than those with total blindness for attack efficiency (p = 0.04), the sum of penalty defenses (p = 0.01), and fouls (p = 0.01). The study showed that athletes with blindness demonstrated higher game performance in defence. However, athletes with visual impairment presented higher efficiency in offensive actions. The analyses confirmed that body mass, body height, the arm span and length of the body in the defensive position did not differentiate players’ performance at the elite level. PMID:26834872

  3. An hierarchical approach to performance evaluation of expert systems

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1985-01-01

    The number and size of expert systems are growing rapidly. Formal evaluation of these systems, which is not performed for many of them, increases their acceptability by the user community and hence their success. Hierarchical evaluation, as previously conducted for computer systems, is applied here to expert system performance evaluation. Expert systems are also evaluated by treating them as software systems (or programs). This paper reports many of the basic concepts and ideas in the Performance Evaluation of Expert Systems Study being conducted at the University of Southwestern Louisiana.

  4. Application performance evaluation of the HTMT architecture.

    SciTech Connect

    Hereld, M.; Judson, I. R.; Stevens, R.

    2004-02-23

    In this report we summarize findings from a study of the predicted performance of a suite of application codes taken from the research environment and analyzed against a modeling framework for the HTMT architecture. We find that the inward bandwidth of the data vortex may be a limiting factor for some applications. We also find that available memory in the cryogenic layer is a constraining factor in the partitioning of applications into parcels. In several examples the architecture may be inadequately exploited; in particular, applications typically did not capitalize well on the available computational power or data organizational capability in the PIM layers. The application suite provided significant examples of wide excursions from the accepted (if simplified) program execution model, in particular by requiring complex in-SPELL synchronization between parcels. The availability of the HTMT-C emulation environment did not contribute significantly to the ability to analyze applications, because of the large gap between the available hardware descriptions and parameters in the modeling framework and the types of data that could be collected via HTMT-C emulation runs. Detailed analysis of application performance, and indeed further credible development of the HTMT-inspired program execution model and system architecture, requires development of much better tools. Chief among them are cycle-accurate simulation tools for computational, network, and memory components. Additionally, there is a critical need for a whole-system simulation tool to allow detailed programming exercises and performance tests to be developed. We address three issues in this report: (1) The landscape for applications of petaflops computing; (2) The performance of applications on the HTMT architecture; and (3) The effectiveness of HTMT-C as a tool for studying and developing the HTMT architecture. We set the scene with observations about the course of application development as petaflops

  5. What Makes a Good Criminal Justice Professor? A Quantitative Analysis of Student Evaluation Forms

    ERIC Educational Resources Information Center

    Gerkin, Patrick M.; Kierkus, Christopher A.

    2011-01-01

    The goal of this research is to understand how students define teaching effectiveness. By using multivariate regression analysis of 8,000+ student evaluations of teaching compiled by a School of Criminal Justice at a Midwestern public university, this paper explores the relationships between individual indicators of instructor performance (e.g.…

  6. Evaluating Suit Fit Using Performance Degradation

    NASA Technical Reports Server (NTRS)

    Margerum, Sarah E.; Cowley, Matthew; Harvill, Lauren; Benson, Elizabeth; Rajulu, Sudhakar

    2011-01-01

    The Mark III suit has multiple sizes of suit components (arm, leg, and gloves) as well as sizing inserts to tailor the fit of the suit to an individual. This study sought to determine a way to identify the point at which an ideal suit fit degrades into a poor fit, and how to quantify this breakdown using mobility-based physical performance data. The study examined changes in human physical performance via degradation of the elbow and wrist range of motion of the planetary suit prototype (Mark III) with respect to changes in sizing, as well as how to apply that knowledge to suit sizing options and improvements in suit fit. The methods implemented in this study focused on changes in elbow and wrist mobility due to incremental suit sizing modifications. This incremental sizing spanned a range that included both optimum and poor fit. Suited range of motion data were collected using a motion analysis system for nine isolated and functional tasks encompassing the elbow and wrist joints. A total of four subjects were tested, with motions involving both arms simultaneously as well as the right arm only. The results were then compared across sizing configurations. The results of this study indicate that range of motion may be used as a viable parameter to quantify at what stage suit sizing causes a detriment in performance; however, the human performance decrement appeared to be based on the interaction of multiple joints along a limb, not a single joint angle. The study identified a preliminary method to quantify the impact of size on performance and developed a means to gauge tolerances around optimal size. More work is needed to improve the assessment of optimal fit and to compensate for multiple joint interactions.

  7. Evaluating Suit Fit Using Performance Degradation

    NASA Technical Reports Server (NTRS)

    Margerum, Sarah E.; Cowley, Matthew; Harvill, Lauren; Benson, Elizabeth; Rajulu, Sudhakar

    2012-01-01

    The Mark III planetary technology demonstrator space suit can be tailored to an individual by swapping the modular components of the suit, such as the arms, legs, and gloves, as well as adding or removing sizing inserts in key areas. A method was sought to identify the transition from an ideal suit fit to a bad fit and how to quantify this breakdown using a metric of mobility-based human performance data. To this end, the degradation of the range of motion of the elbow and wrist of the suit as a function of suit sizing modifications was investigated to attempt to improve suit fit. The sizing range tested spanned optimal and poor fit and was adjusted incrementally in order to compare each joint angle across five different sizing configurations. Suited range of motion data were collected using a motion capture system for nine isolated and functional tasks utilizing the elbow and wrist joints. A total of four subjects were tested with motions involving both arms simultaneously as well as the right arm by itself. Findings indicated that no single joint drives the performance of the arm as a function of suit size; instead it is based on the interaction of multiple joints along a limb. To determine a size adjustment range where an individual can operate the suit at an acceptable level, a performance detriment limit was set. This user-selected limit reveals the task-dependent tolerance of the suit fit around optimal size. For example, the isolated joint motion indicated that the suit can deviate from optimal by as little as -0.6 in to -2.6 in before experiencing a 10% performance drop in the wrist or elbow joint. The study identified a preliminary method to quantify the impact of size on performance and developed a new way to gauge tolerances around optimal size.

  8. High performance liquid chromatographic assay for the quantitation of total glutathione in plasma

    NASA Technical Reports Server (NTRS)

    Abukhalaf, Imad K.; Silvestrov, Natalia A.; Menter, Julian M.; von Deutsch, Daniel A.; Bayorh, Mohamed A.; Socci, Robin R.; Ganafa, Agaba A.

    2002-01-01

    A simple and widely used homocysteine HPLC procedure was applied to the identification and quantitation of glutathione in plasma. The method, which uses SBDF as a derivatizing agent, requires only 50 microl of sample. A linear quantitative response was obtained for glutathione over a concentration range of 0.3125-62.50 micromol/l. Linear regression analysis of the standard curve exhibited a correlation coefficient of 0.999. Limit of detection (LOD) and limit of quantitation (LOQ) values were 5.0 and 15 pmol, respectively. Glutathione recovery using this method was nearly complete (above 96%). Intra-assay and inter-assay precision studies reflected a high level of reliability and reproducibility. The applicability of the method for the quantitation of glutathione was demonstrated successfully using human and rat plasma samples.
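
    The validation quantities reported (linearity, correlation coefficient, LOD, LOQ) follow directly from the calibration fit; below is a minimal sketch using simulated peak areas and the common 3.3*sigma/S and 10*sigma/S estimates, so the numbers are illustrative rather than the study's.

        import numpy as np

        # Simulated calibration: peak area vs concentration (umol/L); the
        # study's real curve spanned 0.3125-62.50 umol/L with r = 0.999.
        conc = np.array([0.3125, 0.625, 1.25, 2.5, 5.0, 10.0, 20.0, 62.5])
        area = 1000.0 * conc + np.random.normal(0.0, 50.0, conc.size)

        slope, intercept = np.polyfit(conc, area, 1)
        pred = slope * conc + intercept
        r2 = 1.0 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

        # ICH-style estimates from the residual standard deviation sigma:
        sigma = np.sqrt(np.sum((area - pred) ** 2) / (conc.size - 2))
        lod, loq = 3.3 * sigma / slope, 10.0 * sigma / slope
        print(f"R^2 = {r2:.4f}, LOD = {lod:.3f} umol/L, LOQ = {loq:.3f} umol/L")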

  9. Relating Performance Evaluation to Compensation of Public Sector Employees

    ERIC Educational Resources Information Center

    Van Adelsberg, Henri

    1978-01-01

    Provides a variety of approaches to administering individual salaries on the basis of evaluated performance. Describes methods of precalculating and controlling salary expenditures while simultaneously administering salaries on a "relative" rather than "absolute" performance rating system. (Author)

  10. PERFORMANCE EVALUATION OF TYPE I MARINE SANITATION DEVICES

    EPA Science Inventory

    This performance test was designed to evaluate the effectiveness of two Type I Marine Sanitation Devices (MSDs): the Electro Scan Model EST 12, manufactured by Raritan Engineering Company, Inc., and the Thermopure-2, manufactured by Gross Mechanical Laboratories, Inc. Performance...

  11. Multispectral colour analysis for quantitative evaluation of pseudoisochromatic color deficiency tests

    NASA Astrophysics Data System (ADS)

    Ozolinsh, Maris; Fomins, Sergejs

    2010-11-01

    Multispectral color analysis was used for spectral scanning of Ishihara and Rabkin color deficiency test book images. This was done using the tunable liquid-crystal (LC) filters built into the Nuance II analyzer. Multispectral analysis preserves both the spatial and the spectral content of the tests. Images were taken in the range of 420-720 nm with a 10 nm step. We calculated retinal neural activity charts taking into account cone sensitivity functions, and processed the charts to determine the visibility of latent symbols in the color deficiency plates using a cross-correlation technique. In this way, a quantitative measure is obtained for each diagnostic plate for three types of color deficiency: protanopia, deuteranopia, and tritanopia. Multispectral color analysis also makes it possible to determine the CIE xyz color coordinates of pseudoisochromatic plate design elements and to perform statistical analysis of these data to compare the color quality of available color deficiency test books.
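
    A minimal sketch of the cross-correlation visibility measure described above; the chart and template arrays, the normalization, and the function name are our own illustrative choices, not the authors' exact procedure.

        import numpy as np
        from scipy.signal import correlate2d

        def symbol_visibility(activity_chart, symbol_template):
            """Peak normalized cross-correlation between a retinal activity
            chart of a test plate and a template of its latent symbol.
            Higher values mean the symbol is more visible to that observer
            type (protanope, deuteranope, or tritanope)."""
            a = activity_chart - activity_chart.mean()
            t = symbol_template - symbol_template.mean()
            xc = correlate2d(a, t, mode="same")
            return float(xc.max() / (np.linalg.norm(a) * np.linalg.norm(t)))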

  12. A quantitative evaluation of the dynamic cathodoluminescence contrast of gliding dislocations in semiconductor crystals

    NASA Astrophysics Data System (ADS)

    Vasnyov, S.; Schreiber, J.; Hoering, L.

    2004-01-01

    Dark cathodoluminescence (CL) defect contrasts observed in CL video movies taken on GaAs and ZnO samples disclose the intrinsic recombination properties of glide dislocations during their slip motion. In this way, kinematical SEM CL microscopy provides, for the first time, direct information on the possible relationship between the dynamics and electronic activity of glide dislocations, as expected from structural alterations or kink processes related to defect movement. The dark CL defect contrasts observed for various dislocation types in both materials indicate defect-bound non-radiative excess carrier recombination. Quantitative CL contrast analysis is performed to uncover differences in the recombination strength of distinct dislocation structures resulting from the type and dynamic state of the glide dislocations studied.

  13. Evaluation of dental enamel caries assessment using Quantitative Light Induced Fluorescence and Optical Coherence Tomography.

    PubMed

    Maia, Ana Marly Araújo; de Freitas, Anderson Zanardi; de L Campello, Sergio; Gomes, Anderson Stevens Leônidas; Karlsson, Lena

    2016-06-01

    An in vitro study of the morphological alterations between sound dental structure and artificially induced white spot lesions in human teeth was performed, using the loss of fluorescence measured by Quantitative Light-Induced Fluorescence (QLF) and the change in the light attenuation coefficient measured by Optical Coherence Tomography (OCT). The OCT images from a commercially available system were analyzed with a purpose-built algorithm, whereas the QLF images were analyzed using the software supplied with the commercial system. Comparing sound regions with white spot lesion regions, QLF showed a reduction in fluorescence intensity, while OCT showed an increase in light attenuation. Comparison of the percentage change in optical properties between sound and artificial enamel caries regions showed that OCT images processed for light attenuation revealed the optical alterations of the tooth more strongly than the fluorescence loss detected by the QLF system. PMID:26351155
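
    As a sketch of the OCT side of the comparison, the attenuation coefficient can be estimated by fitting a single-scattering exponential decay to an averaged A-scan; the model, depth window, and names below are illustrative assumptions, not the algorithm actually applied in the study.

        import numpy as np

        def attenuation_coefficient(depth_mm, a_scan):
            """Fit I(z) = I0 * exp(-2 * mu * z) over a chosen depth window
            and return mu (1/mm). Linearize with a log transform, then do
            an ordinary least-squares line fit."""
            log_i = np.log(np.clip(a_scan, 1e-12, None))
            slope, _ = np.polyfit(depth_mm, log_i, 1)
            return -slope / 2.0

        # Demineralized (white spot) enamel scatters more strongly, so mu
        # from a lesion region should exceed mu from sound enamel.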

  14. The Effects of Problem-Based Learning Instruction on University Students' Performance of Conceptual and Quantitative Problems in Gas Concepts

    ERIC Educational Resources Information Center

    Bilgin, Ibrahim; Senocak, Erdal; Sozbilir, Mustafa

    2009-01-01

    This study aimed at investigating effects of Problem-Based Learning (PBL) on pre-service teachers' performance on conceptual and quantitative problems about concepts of gases. The subjects of this study were 78 second year undergraduates from two different classes enrolled to General Chemistry course in the Department of Primary Mathematics…

  15. Examination of Information Technology (IT) Certification and the Human Resources (HR) Professional Perception of Job Performance: A Quantitative Study

    ERIC Educational Resources Information Center

    O'Horo, Neal O.

    2013-01-01

    The purpose of this quantitative survey study was to test the Leontief input/output theory relating the input of IT certification to the output of the English-speaking U.S. human resource professional perceived IT professional job performance. Participants (N = 104) rated their perceptions of IT certified vs. non-IT certified professionals' job…

  16. Inclusion and Student Learning: A Quantitative Comparison of Special and General Education Student Performance Using Team and Solo-Teaching

    ERIC Educational Resources Information Center

    Jamison, Joseph A.

    2013-01-01

    This quantitative study sought to determine whether there were significant statistical differences between the performance scores of special education and general education students' scores when in team or solo-teaching environments as may occur in inclusively taught classrooms. The investigated problem occurs because despite education's stated…

  17. Holistic Evaluation of Quality Consistency of Ixeris sonchifolia (Bunge) Hance Injectables by Quantitative Fingerprinting in Combination with Antioxidant Activity and Chemometric Methods

    PubMed Central

    Yang, Lanping; Sun, Guoxiang; Guo, Yong; Hou, Zhifei; Chen, Shuai

    2016-01-01

    A widely used herbal medicine, Ixeris sonchifolia (Bge.) Hance Injectable (ISHI), was investigated for quality consistency. Characteristic fingerprints of 23 batches of ISHI samples were generated at five wavelengths and evaluated by the systematic quantitative fingerprint method (SQFM), together with simultaneous analysis of the content of seven marker compounds. Chemometric methods, i.e., support vector machine (SVM) and principal component analysis (PCA), were applied to assist in fingerprint evaluation of the ISHI samples. Qualitative classification of the ISHI samples by SVM was consistent with PCA and in agreement with the quantitative evaluation by SQFM. In addition, the antioxidant activities of the ISHI samples were determined by both off-line and on-line DPPH (2,2-diphenyl-1-picrylhydrazyl) radical scavenging assays. A fingerprint-efficacy relationship linking the chemical components and in vitro antioxidant activity was established and validated using the partial least squares (PLS) and orthogonal projection to latent structures (OPLS) models, and the on-line DPPH assay further revealed the components that made a positive contribution to the total antioxidant activity. Therefore, the combined use of chemometric methods, quantitative fingerprint evaluation by SQFM, and multiple marker compound analysis, in conjunction with the assay of antioxidant activity, provides a powerful and holistic approach to evaluating the quality consistency of herbal medicines and their preparations. PMID:26872364

  18. Holistic Evaluation of Quality Consistency of Ixeris sonchifolia (Bunge) Hance Injectables by Quantitative Fingerprinting in Combination with Antioxidant Activity and Chemometric Methods.

    PubMed

    Yang, Lanping; Sun, Guoxiang; Guo, Yong; Hou, Zhifei; Chen, Shuai

    2016-01-01

    A widely used herbal medicine, Ixeris sonchifolia (Bge.) Hance Injectable (ISHI), was investigated for quality consistency. Characteristic fingerprints of 23 batches of ISHI samples were generated at five wavelengths and evaluated by the systematic quantitative fingerprint method (SQFM), together with simultaneous analysis of the content of seven marker compounds. Chemometric methods, i.e., support vector machine (SVM) and principal component analysis (PCA), were applied to assist in fingerprint evaluation of the ISHI samples. Qualitative classification of the ISHI samples by SVM was consistent with PCA and in agreement with the quantitative evaluation by SQFM. In addition, the antioxidant activities of the ISHI samples were determined by both off-line and on-line DPPH (2,2-diphenyl-1-picrylhydrazyl) radical scavenging assays. A fingerprint-efficacy relationship linking the chemical components and in vitro antioxidant activity was established and validated using the partial least squares (PLS) and orthogonal projection to latent structures (OPLS) models, and the on-line DPPH assay further revealed the components that made a positive contribution to the total antioxidant activity. Therefore, the combined use of chemometric methods, quantitative fingerprint evaluation by SQFM, and multiple marker compound analysis, in conjunction with the assay of antioxidant activity, provides a powerful and holistic approach to evaluating the quality consistency of herbal medicines and their preparations. PMID:26872364

  19. Toward a quantitative visual noise evaluation of sensors and image processing pipes

    NASA Astrophysics Data System (ADS)

    Mornet, Clémence; Baxter, Donald; Vaillant, Jérôme; Decroux, Thomas; Herault, Didier; Schanen, Isabelle

    2011-01-01

    The evaluation of sensor performance in terms of signal-to-noise ratio (SNR) is a big challenge for both camera phone manufacturers and their customers. The former want to predict and assess the performance of their pixels, while the latter need to benchmark raw sensors and processing pipes. The reference SNR metric is very sensitive to crosstalk, whereas for low-light performance the weight of sensitivity should be increased. To evaluate noise in the final image, an analytical calculation of the SNR on the luminance channel was performed, taking into account the noise correlation introduced by the processing pipe. However, this luminance noise does not match the perception of the human eye, which is also sensitive to chromatic noise. Alternative metrics have therefore been investigated to find a visual noise metric closer to the human visual system. They have been computed on five pixel technology nodes with different sensor resolutions and viewing conditions.
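
    For reference, the conventional luminance-channel SNR that the paper starts from can be sketched as below (Rec. 709 luma weights assumed); the point of the paper is precisely that this scalar misses the chromatic noise the eye also perceives, motivating the visual-noise metrics it investigates.

        import numpy as np

        def luminance_snr_db(rgb_patch, weights=(0.2126, 0.7152, 0.0722)):
            """SNR (dB) of the luminance channel of a uniform test patch.

            rgb_patch : H x W x 3 array from the processed pipe output.
            """
            y = rgb_patch @ np.array(weights)        # luminance image
            return 20.0 * np.log10(y.mean() / y.std())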

  20. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    NASA Technical Reports Server (NTRS)

    Feary, Michael S.

    2012-01-01

    This presentation is part of a panel discussion on Evaluating Models of Human Performance. The purpose of the panel is to discuss the increasing use of models in the world today and specifically to focus on how to describe and evaluate models of human performance. My presentation focuses on generating distributions of performance and on evaluating different strategies for humans performing tasks with mixed-initiative (human-automation) systems. I will also discuss how to provide human performance modeling data to support decisions on acceptability and tradeoffs in the design of safety-critical systems. I will conclude with challenges for the future.

  1. Diagnostics and performance evaluation of multikilohertz capacitors

    SciTech Connect

    McDuff, G.; Nunnally, W.C.; Rust, K.; Sarjeant, J.

    1980-01-01

    The observed performance of nanofarad polypropylene-silicone oil, mica paper, and polytetrafluoroethylene-silicone oil capacitors discharged in a 100-ns, 1-kA pulse with a pulse repetition frequency of 1 kHz is presented. The test facility circuit, diagnostic parameters, and the preliminary test schedule are outlined as a basis for discussion of the observed failure locations and proposed failure mechanisms. Most of the test data and discussion presented involves the polypropylene-silicone oil units.

  2. Chiller performance evaluation report. Final report

    SciTech Connect

    Wylie, D.

    1998-12-01

    The Electric Power Research Institute (EPRI) directed ASW Engineering Management to analyze the performance of a new package chiller manufactured by VaCom, Inc. The chiller was operated for approximately 22 months using three different refrigerants (R-407C, R-22 and R-507). The objective was to identify the chiller's energy efficiency with each of the three refrigerants. This report presents ASW's findings and associated backup information.

  3. Performance Evaluation of a Clinical PACS Module

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Cho, Paul S.; Huang, H. K.; Mankovich, Nicholas J.; Boechat, Maria I.

    1989-05-01

    Picture archiving and communication systems (PACS) are now clinically available in limited radiologic applications. The benefits, acceptability, and reliability of these systems have thus far been mainly speculative and anecdotal. This paper discusses the evaluation of a PACS module implemented in the pediatric radiology section of a 700-bed teaching hospital. The PACS manages all pediatric inpatient images including conventional x-rays and contrast studies (obtained with a computed radiography system), magnetic resonance images, and relevant ultrasound images. A six-monitor workstation is available for image review.

  4. Performance Evaluation of Hyperspectral Chemical Detection Systems

    NASA Astrophysics Data System (ADS)

    Truslow, Eric

    Remote sensing of chemical vapor plumes is a difficult but important task with many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared (LWIR) regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis-testing problem that standard detection metrics do not fully describe. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and an identification metric based on the Dice index. Using the developed metrics, we demonstrate that a detector bank followed by an identifier can achieve superior performance relative to either algorithm individually. Performance of the cascaded system relies on the first pass reliably detecting the plume. However, detection performance is severely hampered by the inclusion of plume pixels in estimates of background quantities. We demonstrate that this problem, known as contamination, can be mitigated by iteratively applying a spatial filter to the detected pixels. Multiple detection and filtering passes can remove nearly all contamination from the background estimates, a vast improvement over single-pass techniques.
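
    The Dice index underlying the identification metric is simple set arithmetic; here is a minimal sketch with made-up chemical names purely for illustration.

        def dice_index(identified, truth):
            """Dice index between the chemicals reported by the identifier
            and those actually present in the plume."""
            identified, truth = set(identified), set(truth)
            if not identified and not truth:
                return 1.0
            return 2.0 * len(identified & truth) / (len(identified) + len(truth))

        print(dice_index({"SF6", "NH3"}, {"SF6"}))   # 2*1/(2+1) = 0.667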

  5. Performance evaluation of blind steganalysis classifiers

    NASA Astrophysics Data System (ADS)

    Hogan, Mark T.; Silvestre, Guenole C. M.; Hurley, Neil J.

    2004-06-01

    Steganalysis is the art of detecting and/or decoding secret messages embedded in multimedia contents. The topic has received considerable attention in recent years due to the malicious use of multimedia documents for covert communication. Steganalysis algorithms can be classified as either blind or non-blind depending on whether or not the method assumes knowledge of the embedding algorithm. In general, blind methods involve the extraction of a feature vector that is sensitive to embedding and is subsequently used to train a classifier. This classifier can then be used to determine the presence of a stego-object, subject to an acceptable probability of false alarm. In this work, the performance of three classifiers, namely Fisher linear discriminant (FLD), neural network (NN) and support vector machines (SVM), is compared using a recently proposed feature extraction technique. It is shown that the NN and SVM classifiers exhibit similar performance, exceeding that of the FLD. However, steganographers may be able to circumvent such steganalysis algorithms by preserving the statistical transparency of the feature vector during embedding. This motivates the use of classification algorithms based on the entire document. Such a strategy is applied using SVM classification for DCT, FFT and DWT representations of an image. The performance is compared to that of the feature-extraction-based technique.

  6. Performance Evaluation Method for Dissimilar Aircraft Designs

    NASA Technical Reports Server (NTRS)

    Walker, H. J.

    1979-01-01

    A rationale is presented for using the square of the wingspan rather than the wing reference area as a basis for nondimensional comparisons of the aerodynamic and performance characteristics of aircraft that differ substantially in planform and loading. Working relationships are developed and illustrated through application to several categories of aircraft covering a range of Mach numbers from 0.60 to 2.00. For each application, direct comparisons of drag polars, lift-to-drag ratios, and maneuverability are shown for both nondimensional systems. The inaccuracies that may arise in the determination of aerodynamic efficiency based on reference area are noted. Span loading is introduced independently in comparing the combined effects of loading and aerodynamic efficiency on overall performance. Performance comparisons are made for the NACA research aircraft, lifting bodies, century-series fighter aircraft, F-111A aircraft with conventional and supercritical wings, and a group of supersonic aircraft including the B-58 and XB-70 bomber aircraft. An idealized configuration is included in each category to serve as a standard for comparing overall efficiency.
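
    The change of nondimensional basis described above amounts to rescaling conventional coefficients by S/b^2 (the inverse aspect ratio), since q*S*CL = q*b^2*CLb; a one-function sketch with our own symbol names:

        def to_span_basis(cl, cd, wing_area, span):
            """Convert lift/drag coefficients referenced to wing area S to
            coefficients referenced to span squared b^2.

            Because q*S*CL equals q*b^2*CLb, the conversion factor is
            S / b^2, i.e. the inverse of the aspect ratio."""
            ratio = wing_area / span ** 2
            return cl * ratio, cd * ratio

        # Two dissimilar planforms can then be compared on a common basis:
        print(to_span_basis(cl=0.5, cd=0.025, wing_area=20.0, span=10.0))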

  7. Genetic variability of oil palm parental genotypes and performance of its progenies as revealed by molecular markers and quantitative traits.

    PubMed

    Abdullah, Norziha; Rafii Yusop, Mohd; Ithnin, Maizura; Saleh, Ghizan; Latif, M A

    2011-04-01

    Studies were conducted to assess the genetic relationships between the parental palms (dura and pisifera) and the performance of their progenies based on nine microsatellite markers and 29 quantitative traits. Correlation analyses between genetic distances and hybrid performance were carried out. The correlation coefficients between genetic distance and hybrid performance were non-significant, except for mean nut weight and leaf number; even for these characters, however, the correlations were too low to be of predictive value. These results indicated that genetic distances based on the microsatellite markers may not be useful for predicting hybrid performance. Genetic distance analysis using the UPGMA clustering system generated five genetic clusters at a coefficient of 1.26 based on the quantitative traits of the progenies. The genotypes DP16, DP14, DP4, DP13, DP12, DP15, DP8, DP1 and DP2, belonging to distant clusters with greater genetic distances, could be selected for further breeding programs. PMID:21513898

  8. Performance evaluation of video on ethernet

    SciTech Connect

    Pihlman, M.; Farrell, R.

    1993-08-01

    The purpose of this project was to determine the feasibility of using an ethernet local area network (LAN) to support videoconferencing connections between CAMEO Macintosh desktop videoconferencing systems. The specific goals were: (1) to ensure that CAMEO video could be transported, without protocol modification, via existing ethernet networks, and would do so without "bringing down" the network; (2) to measure the effect of CAMEO video connections on ethernet traffic; (3) to evaluate qualitatively how generated ethernet traffic affects the CAMEO video; and (4) to evaluate qualitatively how multiple CAMEO connections work between two routered ethernet networks via a backbone. High-quality CAMEO video can be transported on an ethernet network and between routered networks via a backbone. The number of simultaneous video connections possible on an ethernet segment would probably be less than 45, since each connection uses 2.2% of the network and errors increase rapidly as video connections are made. However, the actual number of simultaneous video connections possible will depend upon your network implementation and the amount of "normal" traffic present. The remainder of this report discusses the effect of CAMEO video on our networks.

  9. Phased array performance evaluation with photoelastic visualization

    SciTech Connect

    Ginzel, Robert; Dao, Gavin

    2014-02-18

    New instrumentation and a widening range of phased array transducer options are affording the industry greater potential. Visualization of the complex wave components using the photoelastic system can greatly enhance understanding of the generated signals. Diffraction, mode conversion and wave front interaction, together with beam forming for linear, sectorial and matrix arrays, will be viewed using the photoelastic system. Beam focus and steering performance will be shown with a range of embedded and surface targets within glass samples. This paper will present principles and sound field images using this visualization system.

  10. ATAMM enhancement and multiprocessing performance evaluation

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.

    1994-01-01

    The Algorithm To Architecture Mapping Model (ATAMM) is a Petri-net-based model that provides a strategy for periodic execution of a class of real-time algorithms on multicomputer dataflow architectures. The execution of large-grained, decision-free algorithms on homogeneous processing elements is studied. The ATAMM provides an analytical basis for calculating performance bounds on throughput characteristics. Extension of the ATAMM as a strategy for cyclo-static scheduling provides for a truly distributed ATAMM multicomputer operating system. An ATAMM testbed consisting of a centralized graph manager and three processors is described, using embedded firmware on 68HC11 microcontrollers.

  11. HENC performance evaluation and plutonium calibration

    SciTech Connect

    Menlove, H.O.; Baca, J.; Pecos, J.M.; Davidson, D.R.; McElroy, R.D.; Brochu, D.B.

    1997-10-01

    The authors have designed a high-efficiency neutron counter (HENC) to assay the plutonium content of 200-L waste drums. The counter uses totals neutron counting, coincidence counting, and multiplicity counting to determine the plutonium mass. The HENC was developed as part of a Cooperative Research and Development Agreement between the Department of Energy and Canberra Industries. This report presents the results of the detector modifications, the performance tests, the add-a-source calibration, and the plutonium calibration at Los Alamos National Laboratory (TA-35) in 1996.

  12. Performance evaluation of the Balcomb solar house

    SciTech Connect

    Balcomb, J.D.; Hedstrom, J.C.; Perry, J.E. Jr.

    1980-01-01

    Additional instrumentation was added to the Balcomb solar house for a six-week period and up to 85 channels were recorded hourly. Some new findings based on an evaluation of these data are presented. (1) The thermal comfort characteristics of four rooms are documented. (2) Relative humidity in the living room varies from 30 to 50%; these data are used to infer an evaporation rate in the house of about 25 kg of water/day. The evaporation rate correlates reasonably well with greenhouse temperature. (3) Heat storage in the greenhouse floor is estimated at about 0.30 kWh/day-m² based on temperatures measured at four depths. (4) Several thermal characteristics of the rock bed are deduced but it is evident that the heat flow is not yet completely understood.

  13. Quantitative evaluation of hand cranking a roller pump in a crisis management drill.

    PubMed

    Tomizawa, Yasuko; Tokumine, Asako; Ninomiya, Shinji; Momose, Naoki; Matayoshi, Toru

    2008-01-01

    The heart-lung machines for open-heart surgery have improved over the past 50 years; they rarely break down and are almost always equipped with backup batteries. The hand-cranking procedure only becomes necessary when a pump breaks down during perfusion or after the batteries have run out. In this study, the performance of hand cranking a roller pump was quantitatively assessed by an objective method using the ECCSIM-Lite educational simulator system. A roller pump connected to an extracorporeal circuit with an oxygenator and with gravity venous drainage was used. A flow sensor unit consisting of electromagnetic sensors was used to measure arterial and venous flow rates, and a built-in pressure sensor was used to measure the water level in the reservoir. A preliminary study of continuous cranking by a team of six people was conducted as a surprise drill. This system was then used at a perfusion seminar. At the seminar, 1-min hand-cranking drills were conducted by volunteers according to a prepared scenario. The data were calculated on site and trend graphs of individual performances were given to the participants as a handout. Preliminary studies showed that each person's performance was different. Results from 1-min drills showed that good performance was not related to the number of clinical cases experienced, years of practice, or experience in hand cranking. Hand cranking to maintain the target flow rate could be achieved without practice; however, manipulating the venous return clamp requires practice. While the necessity of performing hand cranking during perfusion due to pump failure is rare, we believe that it is beneficial for perfusionists and patients to include hand-cranking practice in periodic extracorporeal circulation crisis management drills because a drill allows perfusionists to mentally rehearse the procedures should such a crisis occur. PMID:18836871

  14. Evaluation of board performance in Iran’s universities of medical sciences

    PubMed Central

    Sajadi, Haniye Sadat; Maleki, Mohammadreza; Ravaghi, Hamid; Farzan, Homayoun; Aminlou, Hasan; Hadi, Mohammad

    2014-01-01

    Background: The critical role that the board plays in the governance of universities makes it necessary to evaluate its performance. This study aimed to evaluate the performance of the boards of medical universities and to provide solutions to enhance their performance. Methods: The first phase of the present study was qualitative research in which data were collected through face-to-face semi-structured interviews and analyzed by a thematic approach. The second phase was a mixed qualitative and quantitative study, with the quantitative part in cross-sectional format and the qualitative part in content analysis format. In the quantitative part, data were collected through the Ministry of Health and Medical Education (MoHME). In the qualitative part, the content of 2,148 resolutions, selected using a stratified sampling method, was analyzed. Results: Participants believed that the boards had not performed acceptably for a long time. Results also indicated an increasing number of meetings and resolutions of the boards over these 21 years. The boards' resolutions were mostly operational in domain and administrative in nature. The share of specific resolutions was greater than that of general ones. Conclusion: Given the current pace of change and development and the need to respond to them in a timely manner, it is recommended to accelerate the slow improvement process of the boards. More delegation and strengthening of the boards' position appear to be effective strategies to speed up this process. PMID:25337597

  15. Performance Evaluation of the NEXT Ion Engine

    NASA Technical Reports Server (NTRS)

    Soulas, George C.; Domonkos, Matthew T.; Patterson, Michael J.

    2003-01-01

    The performance test results of three NEXT ion engines are presented. These ion engines exhibited peak specific impulse and thrust efficiency ranges of 4060-4090 s and 0.68-0.69, respectively, at the full power point of the NEXT throttle table. The performance of the ion engines satisfied all project requirements. Beam flatness parameters were significantly improved over the NSTAR ion engine, which is expected to improve accelerator grid service life. The results of engine inlet pressure and temperature measurements are also presented. Maximum main plenum, cathode, and neutralizer pressures were 12,000 Pa, 3110 Pa, and 8540 Pa, respectively, at the full power point of the NEXT throttle table. Main plenum and cathode inlet pressures required about 6 hours to reach steady state, while the neutralizer required only about 0.5 hour. Steady-state engine operating temperature ranges throughout the power throttling range examined were 179-303 C for the discharge chamber magnet rings and 132-213 C for the ion optics mounting ring.

  16. Evaluation of green coffee beans quality using near infrared spectroscopy: a quantitative approach.

    PubMed

    Santos, João Rodrigo; Sarraguça, Mafalda C; Rangel, António O S S; Lopes, João A

    2012-12-01

    Characterisation of coffee quality based on bean quality assessment involves determining the relative amount of defective beans among non-defective beans. It is therefore important to develop a methodology capable of identifying the presence of defective beans that enables fast assessment of coffee grade and can become an analytical tool to standardise coffee quality. In this work, a methodology for the quality assessment of green coffee based on near infrared spectroscopy (NIRS) is proposed. NIRS is a green-chemistry, low-cost, fast-response technique that requires no sample processing. The applicability of NIRS was evaluated for Arabica and Robusta varieties from different geographical locations. Partial least squares regression was used to relate the NIR spectrum to the mass fraction of defective and non-defective beans. Relative errors of around 5% show that NIRS can be a valuable analytical tool for coffee roasters, enabling simple, fast, and quantitative evaluation of green coffee quality. PMID:22953929
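
    A minimal sketch of the kind of PLS calibration such a NIRS method relies on, using scikit-learn and random stand-in data in place of real spectra and defect fractions; shapes, component count, and the error figure are illustrative assumptions only.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        # Stand-in data: 60 "spectra" of 700 wavelengths each, and the
        # mass fraction (%) of defective beans in each sample.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 700))
        y = rng.uniform(0.0, 50.0, size=60)

        pls = PLSRegression(n_components=8)
        y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
        rel_err = np.mean(np.abs(y_cv - y) / np.maximum(y, 1e-9)) * 100.0
        print(f"mean relative error ~ {rel_err:.1f}%")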

  17. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-08-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve 10-day to 5-month sea ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett ice severity index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.
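
    In the linear-Gaussian setting that underlies QND, a candidate network with observation operator H and observation-error covariance R updates a prior covariance B to the posterior A = (Hᵀ R⁻¹ H + B⁻¹)⁻¹, so networks can be ranked by the uncertainty reduction they would deliver before any flights take place. The toy numpy sketch below illustrates that calculation; the three-variable control vector and all matrices are invented for illustration and are unrelated to the actual sea ice-ocean assimilation system.

```python
import numpy as np

def posterior_covariance(B, H, R):
    """A = (H^T R^-1 H + B^-1)^-1 for a linear-Gaussian analysis."""
    HtRinvH = H.T @ np.linalg.inv(R) @ H
    return np.linalg.inv(HtRinvH + np.linalg.inv(B))

# Toy control vector: three unknowns (e.g. ice thickness, snow depth, bias).
B = np.diag([0.5**2, 0.2**2, 0.1**2])   # prior variances (assumed)

# Candidate "flight transect": two observations, each a linear combination
# of the control variables, with 0.1 observation-error standard deviation.
H = np.array([[1.0, 0.5, 0.0],
              [0.8, 0.0, 0.3]])
R = np.diag([0.1**2, 0.1**2])

A = posterior_covariance(B, H, R)
prior_sigma = np.sqrt(np.diag(B))
post_sigma = np.sqrt(np.diag(A))
print("uncertainty reduction per variable:",
      np.round(1.0 - post_sigma / prior_sigma, 2))
```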

  18. Exploring the utility of quantitative network design in evaluating Arctic sea-ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-03-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve ten-day to five-month sea-ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett Ice Severity Index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea-ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.

  19. Quantitative Evaluation of Peptide-Material Interactions by a Force Mapping Method: Guidelines for Surface Modification.

    PubMed

    Mochizuki, Masahito; Oguchi, Masahiro; Kim, Seong-Oh; Jackman, Joshua A; Ogawa, Tetsu; Lkhamsuren, Ganchimeg; Cho, Nam-Joon; Hayashi, Tomohiro

    2015-07-28

    Peptide coatings on material surfaces have demonstrated wide application across materials science and biotechnology, facilitating the development of nanobio interfaces through surface modification. A guiding motivation in the field is to engineer peptides with a high and selective binding affinity to target materials. Herein, we introduce a quantitative force mapping method in order to evaluate the binding affinity of peptides to various hydrophilic oxide materials by atomic force microscopy (AFM). Statistical analysis of adhesion forces and probabilities obtained on substrates with a materials contrast enabled us to simultaneously compare the peptide binding affinity to different materials. On the basis of the experimental results and corresponding theoretical analysis, we discuss the role of various interfacial forces in modulating the strength of peptide attachment to hydrophilic oxide solid supports as well as to gold. The results emphasize the precision and robustness of our approach to evaluating the adhesion strength of peptides to solid supports, thereby offering guidelines to improve the design and fabrication of peptide-coated materials. PMID:26125092
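
    As a rough sketch of the statistics behind a force map: each AFM retraction curve yields a pull-off (adhesion) force, and collecting many curves across a substrate gives, per material, a distribution of adhesion forces and a probability of observing an adhesion event, which is what makes the per-material comparison above quantitative. Everything in the snippet below (the detection threshold, force values, and material labels) is an invented stand-in, not data or code from the study.

```python
import numpy as np

def adhesion_stats(pulloff_forces_nN, detection_threshold_nN=0.05):
    """Summarize a force map: mean/std of adhesion events and the
    probability that a curve shows a detectable adhesion event."""
    forces = np.asarray(pulloff_forces_nN)
    events = forces[forces > detection_threshold_nN]
    return {
        "mean_force_nN": float(events.mean()) if events.size else 0.0,
        "std_force_nN": float(events.std()) if events.size else 0.0,
        "adhesion_probability": events.size / forces.size,
    }

rng = np.random.default_rng(1)
# Hypothetical maps: one peptide probed on two oxide regions of a substrate.
sio2_map = rng.normal(0.4, 0.1, size=1024).clip(min=0)   # stronger binding
tio2_map = rng.normal(0.1, 0.08, size=1024).clip(min=0)  # weaker binding

for name, fmap in [("SiO2", sio2_map), ("TiO2", tio2_map)]:
    print(name, adhesion_stats(fmap))
```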

  20. A quantitative and standardized robotic method for the evaluation of arm proprioception after stroke.

    PubMed

    Simo, Lucia S; Ghez, Claude; Botzer, Lior; Scheidt, Robert A

    2011-01-01

    Stroke often results in both motor and sensory deficits, which may interact in the manifested functional impairment. Proprioception is known to play important roles in the planning and control of limb posture and movement; however, the impact of proprioceptive deficits on motor function has been difficult to elucidate, due in part to the qualitative nature of available clinical tests. We present a quantitative and standardized method for evaluating proprioception in tasks directly relevant to those used to assess motor function. Using a robotic manipulandum that exerted controlled displacements of the hand, stroke participants were evaluated, and compared with a control group, in their ability to detect such displacements in a two-alternative forced-choice paradigm. A psychometric function parameterized the decision process underlying the detection of the hand displacements. The shape of this function was determined by a signal detection threshold and by the variability of the response about this threshold. Our automated procedure differentiates between participants with and without proprioceptive deficits and quantifies functional proprioceptive sensation on a magnitude scale that is meaningful for ongoing studies of degraded motor function in comparable horizontal movements. PMID:22256252
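
    A minimal sketch of the curve-fitting step described above, assuming the psychometric function is a cumulative Gaussian whose mean is the detection threshold and whose standard deviation captures response variability about it, with a 0.5 guess rate for the two-alternative forced-choice task. The trial data and parameter choices are illustrative assumptions rather than the study's actual procedure.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(displacement_mm, threshold, sigma):
    """P(correct) modeled as a cumulative Gaussian with a 0.5 guess
    rate for the two-alternative forced-choice detection task."""
    return 0.5 + 0.5 * norm.cdf(displacement_mm, loc=threshold, scale=sigma)

# Hypothetical detection data: displacement magnitudes (mm) and the
# fraction of trials on which the displacement was correctly reported.
x = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
p_correct = np.array([0.52, 0.55, 0.68, 0.85, 0.97, 1.00])

params, _ = curve_fit(psychometric, x, p_correct, p0=[2.0, 1.0])
threshold, sigma = params
print(f"detection threshold = {threshold:.2f} mm, "
      f"response variability = {sigma:.2f} mm")
```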