Sample records for quantitative performance measurement

  1. Quantitative performance measurements of bent crystal Laue analyzers for X-ray fluorescence spectroscopy.

    PubMed

    Karanfil, C; Bunker, G; Newville, M; Segre, C U; Chapman, D

    2012-05-01

    Third-generation synchrotron radiation sources pose difficult challenges for energy-dispersive detectors for XAFS because of their count rate limitations. One solution to this problem is the bent crystal Laue analyzer (BCLA), which removes most of the undesired scatter and fluorescence before it reaches the detector, effectively eliminating detector saturation due to background. In this paper, experimental measurements of BCLA performance in conjunction with a 13-element germanium detector and a quantitative analysis of the signal-to-noise improvement of BCLAs are presented. The performance of BCLAs is compared with that of filters and slits.

  2. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Cognitive schemes of plant anatomy concepts are formed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with a quantitative literacy test scored using the rubric from the Association of American Colleges and Universities; complex thinking in plant anatomy was assessed with a test based on Marzano and the questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students is better than that of biology students.

  3. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult, if not impossible, to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014.

  4. Smile line assessment comparing quantitative measurement and visual estimation.

    PubMed

    Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie

    2011-02-01

    Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
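
    Illustrative sketch (not from the record above): the agreement statistics reported there are kappa values; for two raters grading the same items on a categorical scale, Cohen's kappa can be computed as below. The rater arrays and grading example are hypothetical.

```python
# Minimal sketch (assumed illustration, not the study's analysis code):
# unweighted Cohen's kappa for two raters scoring the same teeth on a
# 3-grade smile-line scale (0 = low, 1 = average, 2 = high).
import numpy as np

def cohens_kappa(rater_a, rater_b, n_categories):
    """Unweighted Cohen's kappa for two raters over the same items."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    confusion = np.zeros((n_categories, n_categories))
    for i, j in zip(a, b):
        confusion[i, j] += 1
    confusion /= confusion.sum()
    p_observed = np.trace(confusion)                          # observed agreement
    p_expected = confusion.sum(axis=1) @ confusion.sum(axis=0)  # chance agreement
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical example: two raters grading 8 teeth
print(cohens_kappa([0, 1, 2, 1, 1, 0, 2, 2], [0, 1, 2, 1, 0, 0, 2, 1], 3))
```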

  5. Towards a Quantitative Performance Measurement Framework to Assess the Impact of Geographic Information Standards

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.

    2012-12-01

    Over the last decades, the use of Geographic Information (GI) has gained importance, in the public as well as the private sector. But even though many spatial data sets and much related information exist, the data are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain, and prepare them for use in applications. Therefore, Spatial Data Infrastructures (SDI) have been developed to enhance the access, use, and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the nineties, many SDI initiatives have seen the light. Ultimately, all these initiatives aim to enhance the flow of spatial data between organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes requires technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. Therefore, the objective of the research is to develop a quantitative framework to assess the impact of GI standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research will build upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques, which are frequently used to measure the performance of production processes (Anupindi et al., 2005). Key to reach the research objectives

  6. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Technical Performance Assessment

    PubMed Central

    2017-01-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a QIB for clinical use. It is therefore difficult, if not impossible, to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA), with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted, or combined. PMID:24919831

  7. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters compared with multiple reference standards. PubMed, Web of Science, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard, and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory-, and patient-based analyses showed good diagnostic performance, with sensitivities of 0.88, 0.82, and 0.83, specificities of 0.72, 0.83, and 0.76, and AUCs of 0.90, 0.84, and 0.87, respectively. In per territory
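
    Illustrative sketch under stated assumptions (not the authors' statistical model): per-study sensitivity and specificity follow directly from the extracted 2x2 counts, and a naive pooled estimate can be formed by summing counts across studies; formal meta-analyses typically use bivariate random-effects models instead. The study counts below are hypothetical.

```python
# Minimal sketch: sensitivity/specificity from (TP, FP, TN, FN) counts and a
# naive pooled estimate by summing counts across studies (an assumed
# simplification, not the paper's pooling method).
studies = [  # hypothetical (TP, FP, TN, FN) counts per study
    (45, 10, 80, 5),
    (30, 12, 60, 8),
    (55, 20, 95, 10),
]

def sens_spec(tp, fp, tn, fn):
    return tp / (tp + fn), tn / (tn + fp)

for s in studies:
    print("per-study sens/spec:", sens_spec(*s))

tp, fp, tn, fn = (sum(col) for col in zip(*studies))
print("naive pooled sens/spec:", sens_spec(tp, fp, tn, fn))
```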

  8. Imaging Performance of Quantitative Transmission Ultrasound

    PubMed Central

    Lenox, Mark W.; Wiskin, James; Lewis, Matthew A.; Darrouzet, Stephen; Borup, David; Hsieh, Scott

    2015-01-01

    Quantitative Transmission Ultrasound (QTUS) is a tomographic transmission ultrasound modality that is capable of generating 3D speed-of-sound maps of objects in the field of view. It performs this measurement by propagating a plane wave through the medium from a transmitter on one side of a water tank to a high resolution receiver on the opposite side. This information is then used via inverse scattering to compute a speed map. In addition, the presence of reflection transducers allows the creation of a high resolution, spatially compounded reflection map that is natively coregistered to the speed map. A prototype QTUS system was evaluated for measurement and geometric accuracy as well as for the ability to correctly determine speed of sound. PMID:26604918

  9. Measurements in quantitative research: how to select and report on research instruments.

    PubMed

    Hagan, Teresa L

    2014-07-01

    Measures exist to numerically represent degrees of attributes. Quantitative research is based on measurement and is conducted in a systematic, controlled manner. These measures enable researchers to perform statistical tests, analyze differences between groups, and determine the effectiveness of treatments. If something is not measurable, it cannot be tested.

  10. Performance of biometric quality measures.

    PubMed

    Grother, Patrick; Tabassi, Elham

    2007-04-01

    We document methods for the quantitative evaluation of systems that produce a scalar summary of a biometric sample's quality. We are motivated by a need to test claims that quality measures are predictive of matching performance. We regard a quality measurement algorithm as a black box that converts an input sample to an output scalar. We evaluate it by quantifying the association between those values and observed matching results. We advance detection error trade-off and error-versus-reject characteristics as metrics for the comparative evaluation of sample quality measurement algorithms. We precede this with a definition of sample quality and a description of the operational use of quality measures. We emphasize the performance goal by including a procedure for annotating the samples of a reference corpus with quality values derived from empirical recognition scores.
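
    Illustrative sketch (an assumed implementation, not the authors' code): an error-versus-reject characteristic plots the false non-match rate of genuine comparisons against the fraction of lowest-quality samples rejected; if quality predicts matching performance, the error rate should fall as more low-quality samples are discarded. All data below are synthetic.

```python
# Minimal sketch of an error-versus-reject curve (assumed illustration).
import numpy as np

def error_vs_reject(quality, genuine_scores, match_threshold, reject_fractions):
    quality = np.asarray(quality, dtype=float)
    genuine_scores = np.asarray(genuine_scores, dtype=float)
    order = np.argsort(quality)                 # lowest-quality samples first
    curve = []
    for frac in reject_fractions:
        keep = order[int(frac * len(order)):]   # discard the worst fraction
        fnmr = np.mean(genuine_scores[keep] < match_threshold)
        curve.append((frac, fnmr))
    return curve

# Synthetic data: quality loosely correlated with genuine comparison score
rng = np.random.default_rng(0)
q = rng.uniform(0, 100, 1000)
scores = 0.5 * q + rng.normal(0, 20, 1000)
print(error_vs_reject(q, scores, match_threshold=40, reject_fractions=[0.0, 0.1, 0.2]))
```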

  11. Diagnostic performance of different measurement methods for lung nodule enhancement at quantitative contrast-enhanced computed tomography

    NASA Astrophysics Data System (ADS)

    Wormanns, Dag; Klotz, Ernst; Dregger, Uwe; Beyer, Florian; Heindel, Walter

    2004-05-01

    Lack of angiogenesis virtually excludes malignancy of a pulmonary nodule; assessment with quantitative contrast-enhanced CT (QECT) requires a reliable enhancement measurement technique. The diagnostic performance of different measurement methods in the distinction between malignant and benign nodules was evaluated. QECT (an unenhanced scan and 4 post-contrast scans) was performed in 48 pulmonary nodules (12 malignant, 12 benign, 24 indeterminate). Nodule enhancement was the difference between the highest nodule density on any post-contrast scan and the unenhanced scan. Enhancement was determined with (A) the standard 2D method and (B) a 3D method consisting of segmentation, removal of peripheral structures, and density averaging. Enhancement curves were evaluated for their plausibility using a predefined set of criteria. Using a threshold of 20 HU, sensitivity and specificity were 100% and 33% for the 2D method, and 92% and 55% for the 3D method. One malignant nodule did not show significant enhancement with method B due to adjacent atelectasis, which disappeared within the few minutes of the QECT examination. Better discrimination between benign and malignant lesions was achieved with a slightly higher threshold than proposed in the literature. Application of plausibility criteria to the enhancement curves revealed fewer plausibility faults with the 3D method. The new 3D method for analysis of QECT scans yielded fewer artefacts and better specificity in the discrimination between benign and malignant pulmonary nodules when using an appropriate enhancement threshold. Nevertheless, QECT results must be interpreted with care.
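
    Illustrative sketch (assumed, not the authors' software): the enhancement definition used in the record above, the difference between the highest post-contrast nodule density and the unenhanced density, compared against a decision threshold such as 20 HU.

```python
# Minimal sketch: nodule enhancement as max post-contrast density minus
# unenhanced density; values below the threshold are treated as non-enhancing.
def nodule_enhancement(unenhanced_hu, post_contrast_hu_series):
    """unenhanced_hu: mean nodule density (HU) on the unenhanced scan;
    post_contrast_hu_series: mean densities at each post-contrast time point."""
    return max(post_contrast_hu_series) - unenhanced_hu

def is_enhancing(unenhanced_hu, post_contrast_hu_series, threshold_hu=20.0):
    return nodule_enhancement(unenhanced_hu, post_contrast_hu_series) >= threshold_hu

# Hypothetical nodule: 35 HU unenhanced, peaking at 62 HU after contrast
print(nodule_enhancement(35.0, [48.0, 55.0, 62.0, 58.0]))   # 27.0
print(is_enhancing(35.0, [48.0, 55.0, 62.0, 58.0]))         # True
```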

  12. Adipose tissue MRI for quantitative measurement of central obesity.

    PubMed

    Poonawalla, Aziz H; Sjoberg, Brett P; Rehm, Jennifer L; Hernando, Diego; Hines, Catherine D; Irarrazaval, Pablo; Reeder, Scott B

    2013-03-01

    To validate adipose tissue magnetic resonance imaging (atMRI) for rapid, quantitative volumetry of visceral adipose tissue (VAT) and total adipose tissue (TAT). Data were acquired on normal adults and clinically overweight girls with Institutional Review Board (IRB) approval/parental consent using sagittal 6-echo 3D-spoiled gradient-echo (SPGR) (26-sec single breath-hold) at 3T. Fat-fraction images were reconstructed with quantitative corrections, permitting measurement of a physiologically based fat-fraction threshold in normals to identify adipose tissue, for automated measurement of TAT and semiautomated measurement of VAT. TAT accuracy was validated using oil phantoms, and in vivo TAT/VAT measurements were validated with manual segmentation. Group comparisons were performed between normals and overweight girls using TAT, VAT, VAT-TAT ratio (VTR), body mass index (BMI), waist circumference, and waist-hip ratio (WHR). Oil phantom measurements were highly accurate (<3% error). The measured adipose fat-fraction threshold was 96% ± 2%. VAT and TAT correlated strongly with manual segmentation (normals r² ≥ 0.96, overweight girls r² ≥ 0.99). VAT segmentation required 30 ± 11 minutes/subject (14 ± 5 sec/slice) using atMRI, versus 216 ± 73 minutes/subject (99 ± 31 sec/slice) manually. Group discrimination was significant using WHR (P < 0.001) and VTR (P = 0.004). The atMRI technique permits rapid, accurate measurements of TAT, VAT, and VTR. Copyright © 2012 Wiley Periodicals, Inc.
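
    Illustrative sketch (assumed, not the validated atMRI pipeline): total adipose tissue can be estimated by thresholding a quantitative fat-fraction map (the record reports a measured threshold near 96%), and the VAT-to-TAT ratio follows from a separately defined visceral mask. The arrays and voxel volume below are hypothetical.

```python
# Minimal sketch: TAT, VAT, and VTR from a fat-fraction map and a visceral mask.
import numpy as np

def adipose_volumes(fat_fraction, visceral_mask, voxel_volume_ml, threshold=0.96):
    """fat_fraction: 3D array of fat fraction in [0, 1];
    visceral_mask: boolean 3D array marking the visceral compartment."""
    adipose = fat_fraction >= threshold
    tat_ml = adipose.sum() * voxel_volume_ml
    vat_ml = (adipose & visceral_mask).sum() * voxel_volume_ml
    return tat_ml, vat_ml, (vat_ml / tat_ml if tat_ml else float("nan"))

# Hypothetical toy volume
ff = np.random.default_rng(1).uniform(0, 1, (16, 16, 16))
visceral = np.zeros_like(ff, dtype=bool)
visceral[4:12, 4:12, 4:12] = True
print(adipose_volumes(ff, visceral, voxel_volume_ml=0.02))
```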

  13. Development and Measurement of Preschoolers' Quantitative Knowledge

    ERIC Educational Resources Information Center

    Geary, David C.

    2015-01-01

    The collection of studies in this special issue make an important contribution to our understanding and measurement of the core cognitive and noncognitive factors that influence children's emerging quantitative competencies. The studies also illustrate how the field has matured, from a time when the quantitative competencies of infants and young…

  14. Performance Evaluation and Quantitative Accuracy of Multipinhole NanoSPECT/CT Scanner for Theranostic Lu-177 Imaging

    NASA Astrophysics Data System (ADS)

    Gupta, Arun; Kim, Kyeong Yun; Hwang, Donghwi; Lee, Min Sun; Lee, Dong Soo; Lee, Jae Sung

    2018-06-01

    SPECT plays an important role in peptide receptor-targeted radionuclide therapy using theranostic radionuclides such as Lu-177 for the treatment of various cancers. However, SPECT studies must be quantitatively accurate because reliable assessment of tumor uptake and tumor-to-normal tissue ratios can only be performed using quantitatively accurate images. Hence, it is important to evaluate the performance parameters and quantitative accuracy of preclinical SPECT systems for therapeutic radioisotopes before conducting pre- and post-therapy SPECT imaging or dosimetry studies. In this study, we evaluated the system performance and quantitative accuracy of the NanoSPECT/CT scanner for Lu-177 imaging using point source and uniform phantom studies. We measured the recovery coefficient, uniformity, spatial resolution, system sensitivity, and calibration factor for the mouse whole-body standard aperture. We also performed the experiments using Tc-99m to compare the results with those of Lu-177. We found a recovery coefficient of more than 70% for Lu-177 at the optimum noise level when nine iterations were used. The spatial resolution of Lu-177, with and without a uniform background added, was comparable to that of Tc-99m in the axial, radial, and tangential directions. The system sensitivity measured for Lu-177 was almost three times lower than that of Tc-99m.

  15. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
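
    Illustrative sketch of the relationship described above (an assumed illustration, not the patented hardware or firmware): each measured voltage difference divided by the known antenna separation approximates the electric-field component along that baseline.

```python
# Minimal sketch: E-field component estimates from antenna-pair voltage
# differences and known separations, as described in the record above.
def field_components(voltage_pairs):
    """voltage_pairs: iterable of (delta_v_volts, separation_m) tuples, one per
    selected antenna pair. Returns electric-field estimates in V/m."""
    return [dv / d for dv, d in voltage_pairs]

# Hypothetical array: three pairs spaced 0.05 m apart along one dimension
print(field_components([(1.2, 0.05), (1.1, 0.05), (1.3, 0.05)]))  # [24.0, 22.0, 26.0]
```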

  16. Quantitative phase measurement for wafer-level optics

    NASA Astrophysics Data System (ADS)

    Qu, Weijuan; Wen, Yongfu; Wang, Zhaomin; Yang, Fang; Huang, Lei; Zuo, Chao

    2015-07-01

    Wafer-level optics are now widely used in smartphone cameras, mobile video conferencing, and medical equipment that requires tiny cameras. Extracting quantitative phase information has received increased interest in order to quantify the quality of manufactured wafer-level optics, detect defective devices before packaging, and provide feedback for manufacturing process control, all at the wafer level for high-throughput microfabrication. We demonstrate two phase imaging methods, digital holographic microscopy (DHM) and the Transport-of-Intensity Equation (TIE), to measure the phase of wafer-level lenses. DHM is a laser-based interferometric method based on the interference of two wavefronts and can perform a phase measurement in a single shot. In contrast, direct phase retrieval with the TIE requires a minimum of two measurements of the spatial intensity of the optical wave in closely spaced planes perpendicular to the direction of propagation; the phase is then recovered by solving a second-order differential equation with a non-iterative, deterministic algorithm. Because the TIE is a non-interferometric method, it can be applied to partially coherent light. We demonstrate the capabilities and limitations of the two phase measurement methods for wafer-level optics inspection.

  17. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    The dissertation consists of three parts. The first part involves a comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. Both theories are applied to Force Concept Inventory data obtained from students enrolled in The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on measures of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity. The current popular measures of association fail under some extremely unbalanced conditions; however, the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages, and special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misuses of EFA are explored. The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills. The

  18. Quantitative measurements of electromechanical response with a combined optical beam and interferometric atomic force microscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Labuda, Aleksander; Proksch, Roger

    An ongoing challenge in atomic force microscope (AFM) experiments is the quantitative measurement of cantilever motion. The vast majority of AFMs use the optical beam deflection (OBD) method to infer the deflection of the cantilever. The OBD method is easy to implement, has impressive noise performance, and tends to be mechanically robust. However, it represents an indirect measurement of the cantilever displacement, since it is fundamentally an angular rather than a displacement measurement. Here, we demonstrate a metrological AFM that combines an OBD sensor with a laser Doppler vibrometer (LDV) to enable accurate measurements of the cantilever velocity and displacement. The OBD/LDV AFM allows a host of quantitative measurements to be performed, including in-situ measurements of cantilever oscillation modes in piezoresponse force microscopy. As an example application, we demonstrate how this instrument can be used for accurate quantification of piezoelectric sensitivity, a longstanding goal in the electromechanical community.

  19. Performance and Maqasid al-Shari'ah's Pentagon-Shaped Ethical Measurement.

    PubMed

    Bedoui, Houssem Eddine; Mansour, Walid

    2015-06-01

    Business performance is traditionally viewed from the one-dimensional financial angle. This paper develops a new approach that links performance to the ethical vision of Islam based on maqasid al-shari'ah (i.e., the objectives of Islamic law). The approach involves a Pentagon-shaped performance scheme structure via five pillars, namely wealth, posterity, intellect, faith, and human self. Such a scheme ensures that any firm or organization can ethically contribute to the promotion of human welfare, prevent corruption, and enhance social and economic stability and not merely maximize its own performance in terms of its financial return. A quantitative measure of ethical performance is developed. It surprisingly shows that a firm or organization following only the financial aspect at the expense of the others performs poorly. This paper discusses further the practical instances of the quantitative measurement of the ethical aspects of the system taken at an aggregate level.

  20. Chinese Middle School Teachers' Preferences Regarding Performance Evaluation Measures

    ERIC Educational Resources Information Center

    Liu, Shujie; Xu, Xianxuan; Stronge, James H.

    2016-01-01

    Teacher performance evaluation currently is receiving unprecedented attention from policy makers, scholars, and practitioners worldwide. This study is one of the few studies of teacher perceptions regarding teacher performance measures that focus on China. We employed a quantitative dominant mixed research design to investigate Chinese teachers'…

  1. Quantitative research regarding performance measures for intermodal freight transportation : executive summary

    DOT National Transportation Integrated Search

    1995-10-01

    The primary objective of this study is to provide information relative to the development of a set of performance measures for intermodal freight transportation. To accomplish this objective, data was collected, processed, and analyzed on the basis o...

  2. Qualitative pattern classification of shear wave elastography for breast masses: how it correlates to quantitative measurements.

    PubMed

    Yoon, Jung Hyun; Ko, Kyung Hee; Jung, Hae Kyoung; Lee, Jong Tae

    2013-12-01

    To determine the correlation of qualitative shear wave elastography (SWE) pattern classification to quantitative SWE measurements and whether it is representative of quantitative SWE values with similar performance. From October 2012 to January 2013, 267 breast masses of 236 women (mean age: 45.12 ± 10.54 years, range: 21-88 years) who had undergone ultrasonography (US), SWE, and subsequent biopsy were included. US BI-RADS final assessments and qualitative and quantitative SWE measurements were recorded. Correlations between pattern classification and mean elasticity, maximum elasticity, elasticity ratio, and standard deviation were evaluated. Diagnostic performances of grayscale US, SWE parameters, and US combined with SWE values were calculated and compared. Of the 267 breast masses, 208 (77.9%) were benign and 59 (22.1%) were malignant. Pattern classification correlated significantly with all quantitative SWE measurements, showing the highest correlation with maximum elasticity, r = 0.721 (P<0.001). Compared with grayscale US, sensitivity was significantly decreased when US was combined with SWE measurements (100.0% to 69.5-89.8%), while specificity was significantly improved (13.9% to 62.5-81.7%) (P<0.001). The area under the ROC curve (Az) did not show significant differences between grayscale US and US combined with SWE (P>0.05). Pattern classification shows high correlation with maximum stiffness and may be representative of quantitative SWE values. When combined with grayscale US, SWE improves the specificity of US. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  3. NASA Intellectual Property Negotiation Practices and their Relationship to Quantitative Measures of Technology Transfer

    NASA Technical Reports Server (NTRS)

    Bush, Lance B.

    1997-01-01

    In the current political climate, NASA must be able to show reliable measures demonstrating successful technology transfer. The currently available quantitative data on intellectual property technology transfer efforts portray a less than successful performance. In this paper, the use of only quantitative values for measurement of technology transfer is shown to undervalue the effort. In addition, NASA's current policy in negotiating intellectual property rights results in undervalued royalty rates. NASA has maintained that its position of providing a public good precludes it from negotiating fair market value for its technology and has instead negotiated for reasonable cost in order to recover processing fees. This measurement issue is examined and recommendations are made, including a new policy regarding intellectual property rights negotiation and two measures to supplement the intellectual property measures.

  4. Quantitative measures for redox signaling.

    PubMed

    Pillay, Ché S; Eagling, Beatrice D; Driscoll, Scott R E; Rohwer, Johann M

    2016-07-01

    Redox signaling is now recognized as an important regulatory mechanism for a number of cellular processes including the antioxidant response, phosphokinase signal transduction, and redox metabolism. While there has been considerable progress in identifying the cellular machinery involved in redox signaling, quantitative measures of redox signals have been lacking, limiting efforts aimed at understanding and comparing redox signaling under normoxic and pathogenic conditions. Here we have outlined some of the accepted principles for redox signaling, including the description of hydrogen peroxide as a signaling molecule and the role of kinetics in conferring specificity to these signaling events. Based on these principles, we then develop a working definition for redox signaling and review a number of quantitative methods that have been employed to describe signaling in other systems. Using computational modeling and published data, we show how time- and concentration-dependent analyses, in particular, could be used to quantitatively describe redox signaling and therefore provide important insights into the functional organization of redox networks. Finally, we consider some of the key challenges with implementing these methods. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Hematocrit Measurement with R2* and Quantitative Susceptibility Mapping in Postmortem Brain.

    PubMed

    Walsh, A J; Sun, H; Emery, D J; Wilman, A H

    2018-05-24

    Noninvasive venous oxygenation quantification with MR imaging will improve the neurophysiologic investigation and the understanding of the pathophysiology in neurologic diseases. Available MR imaging methods are limited by sensitivity to flow and often require assumptions of the hematocrit level. In situ postmortem imaging enables evaluation of methods in a fully deoxygenated environment without flow artifacts, allowing direct calculation of hematocrit. This study compares 2 venous oxygenation quantification methods in in situ postmortem subjects. Transverse relaxation (R2*) mapping and quantitative susceptibility mapping were performed on a whole-body 4.7T MR imaging system. Intravenous measurements in major draining intracranial veins were compared between the 2 methods in 3 postmortem subjects. The quantitative susceptibility mapping technique was also applied in 10 healthy control subjects and compared with reference venous oxygenation values. In 2 early postmortem subjects, R2* mapping and quantitative susceptibility mapping measurements within intracranial veins had a significant and strong correlation (R² = 0.805, P = .004 and R² = 0.836, P = .02). Higher R2* and susceptibility values were consistently demonstrated within gravitationally dependent venous segments during the early postmortem period. Hematocrit ranged from 0.102 to 0.580 in postmortem subjects, with R2* and susceptibility as large as 291 s⁻¹ and 1.75 ppm, respectively. Measurements of R2* and quantitative susceptibility mapping within large intracranial draining veins have a high correlation in early postmortem subjects. This study supports the use of quantitative susceptibility mapping for evaluation of in vivo venous oxygenation and postmortem hematocrit concentrations. © 2018 by American Journal of Neuroradiology.

  6. The Relationship between Quantitative and Qualitative Measures of Writing Skills.

    ERIC Educational Resources Information Center

    Howerton, Mary Lou P.; And Others

    The relationships of quantitative measures of writing skills to overall writing quality as measured by the E.T.S. Composition Evaluation Scale (CES) were examined. Quantitative measures included indices of language productivity, vocabulary diversity, spelling, and syntactic maturity. Power of specific indices to account for variation in overall…

  7. Measuring the performance of visual to auditory information conversion.

    PubMed

    Tan, Shern Shiou; Maul, Tomás Henrique Bode; Mennie, Neil Russell

    2013-01-01

    Visual to auditory conversion systems have been in existence for several decades. Besides being among the front runners in providing visual capabilities to blind users, the auditory cues generated from image sonification systems are still easier to learn and adapt to compared with other similar techniques. Other advantages include low cost, easy customizability, and universality. However, every system developed so far has its own set of strengths and weaknesses. In order to improve these systems further, we propose an automated and quantitative method to measure the performance of such systems. With these quantitative measurements, it is possible to gauge the relative strengths and weaknesses of different systems and rank the systems accordingly. Performance is measured by both the interpretability and the information preservation of visual to auditory conversions. Interpretability is measured by computing the correlation of inter-image distance (IID) and inter-sound distance (ISD), whereas information preservation is computed by applying information theory to measure the entropy of both visual and corresponding auditory signals. These measurements provide a basis and some insights on how the systems work. With an automated interpretability measure as a standard, more image sonification systems can be developed, compared, and then improved. Even though the measure does not test systems as thoroughly as carefully designed psychological experiments, a quantitative measurement like the one proposed here can compare systems to a certain degree without incurring much cost. Underlying this research is the hope that a major breakthrough in image sonification systems will allow blind users to cost-effectively regain enough visual function to lead secure and productive lives.
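
    Illustrative sketch (an assumed implementation, not the authors' code): interpretability as the Pearson correlation between inter-image distances (IID) and inter-sound distances (ISD), and a histogram-based Shannon entropy as a simple stand-in for the information content of a signal. The images and "sonified" signals below are synthetic.

```python
# Minimal sketch of IID/ISD correlation and a histogram entropy estimate.
import numpy as np
from itertools import combinations

def pairwise_distances(items):
    return np.array([np.linalg.norm(a - b) for a, b in combinations(items, 2)])

def iid_isd_correlation(images, sounds):
    iid, isd = pairwise_distances(images), pairwise_distances(sounds)
    return np.corrcoef(iid, isd)[0, 1]

def shannon_entropy(signal, bins=64):
    counts, _ = np.histogram(signal, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# Synthetic data: 5 "images" (flattened) and their "sonified" versions
rng = np.random.default_rng(2)
imgs = [rng.uniform(0, 1, 64) for _ in range(5)]
snds = [img + rng.normal(0, 0.1, 64) for img in imgs]   # stand-in conversion
print(iid_isd_correlation(imgs, snds), shannon_entropy(snds[0]))
```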

  8. Quantitative echocardiographic measures in the assessment of single ventricle function post-Fontan: Incorporation into routine clinical practice.

    PubMed

    Rios, Rodrigo; Ginde, Salil; Saudek, David; Loomba, Rohit S; Stelter, Jessica; Frommelt, Peter

    2017-01-01

    Quantitative echocardiographic measurements of single ventricular (SV) function have not been incorporated into routine clinical practice. A clinical protocol, which included quantitative measurements of SV deformation (global circumferential and longitudinal strain and strain rate), standard deviation of time to peak systolic strain, myocardial performance index (MPI), dP/dT from an atrioventricular valve regurgitant jet, and superior mesenteric artery resistance index, was instituted for all patients with a history of the Fontan procedure undergoing echocardiography. All measures were performed in real time during clinically indicated studies and were included in clinical reports. A total of 100 consecutive patients (mean age = 11.95±6.8 years, range 17 months-31.3 years) completed the protocol between September 1, 2014 and April 29, 2015. Deformation measures were completed in 100% of the studies, MPI in 93%, dP/dT in 55%, and superior mesenteric artery Doppler in 82%. The studies were reviewed to assess efficiency in completing the protocol. The average time for image acquisition was 27.4±8.8 minutes (range 10-62 minutes). The average time to perform deformation measures was 10.8±5.5 minutes (range 5-35 minutes), and the time from the beginning of imaging to report completion was 53.4±13.7 minutes (range 27-107 minutes). There was excellent inter-observer reliability when deformation indices were blindly repeated. Patients with a single left ventricle had significantly higher circumferential strain and strain rate, longitudinal strain and strain rate, and dP/dT compared with a single right ventricle. There were no differences in quantitative indices of ventricular function between patients <10 vs. >10 years post-Fontan. Advanced quantitative assessment of SV function post-Fontan can be consistently and efficiently performed in real time during clinically indicated echocardiograms with excellent reliability. © 2016, Wiley Periodicals, Inc.

  9. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    NASA Astrophysics Data System (ADS)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional approach using a square wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance of m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, in which the identified Earth impulse response is obtained by measuring the system output with the voltage response as input. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by a field experiment. The quantitative analysis method proposed in this paper provides new insight into the anti-noise mechanism of the m-sequence and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
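
    Illustrative sketch (not the authors' transmitter code): an m-sequence generated by a linear-feedback shift register and its periodic autocorrelation, which equals the sequence length at zero lag and -1 elsewhere; this sharp autocorrelation is the property that underlies the noise-rejection advantage over a square wave. The register length and taps below are arbitrary choices.

```python
# Minimal sketch: maximal-length sequence from an LFSR and its periodic
# autocorrelation (N at zero lag, -1 at all other lags).
import numpy as np

def m_sequence(taps=(5, 3), length=None):
    """Fibonacci LFSR with feedback taps at the given stages (here degree 5,
    polynomial x^5 + x^3 + 1, period 2**5 - 1 = 31). Output is +/-1 valued."""
    n = max(taps)
    state = [1] * n
    period = 2 ** n - 1
    out = []
    for _ in range(length or period):
        out.append(state[-1])                 # output the last register stage
        feedback = 0
        for t in taps:
            feedback ^= state[t - 1]
        state = [feedback] + state[:-1]       # shift register
    return np.array(out) * 2 - 1

seq = m_sequence()
corr = np.array([np.sum(seq * np.roll(seq, k)) for k in range(len(seq))])
print(corr[:5])   # [31, -1, -1, -1, -1]
```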

  10. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
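
    Illustrative sketch of the small-amplitude limit only (the paper derives globally valid formulae for arbitrary oscillation amplitude; this simplification is stated here as an assumption): for oscillation amplitudes much smaller than the interaction length scale, the frequency shift is proportional to the tip-sample force gradient.

```python
# Minimal sketch of the standard small-amplitude FM-AFM relation
# (assumed illustration): dF/dz = -2 * k * (delta_f / f0).
def force_gradient_from_frequency_shift(delta_f_hz, f0_hz, k_n_per_m):
    """Force gradient (N/m) from the observed frequency shift."""
    return -2.0 * k_n_per_m * delta_f_hz / f0_hz

# Hypothetical cantilever: f0 = 300 kHz, k = 40 N/m, observed shift -15 Hz
print(force_gradient_from_frequency_shift(-15.0, 300e3, 40.0))  # 0.004 N/m
```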

  11. Quantitative Measurement of Oxygen in Microgravity Combustion

    NASA Technical Reports Server (NTRS)

    Silver, Joel A.

    1997-01-01

    A low-gravity environment, in space or in ground-based facilities such as drop towers, provides a unique setting for studying combustion mechanisms. Understanding the physical phenomena controlling the ignition and spread of flames in microgravity has importance for space safety as well as for better characterization of dynamical and chemical combustion processes which are normally masked by buoyancy and other gravity-related effects. Due to restrictions associated with performing measurements in reduced gravity, diagnostic methods which have been applied to microgravity combustion studies have generally been limited to capture of flame emissions on film or video, laser Schlieren imaging and (intrusive) temperature measurements using thermocouples. Given the development of detailed theoretical models, more sophisticated diagnostic methods are needed to provide the kind of quantitative data necessary to characterize the properties of microgravity combustion processes as well as provide accurate feedback to improve the predictive capabilities of the models. When the demands of space flight are considered, the need for improved diagnostic systems which are rugged, compact, reliable, and operate at low power becomes apparent. The objective of this research is twofold. First, we want to develop a better understanding of the relative roles of diffusion and reaction of oxygen in microgravity combustion. As the primary oxidizer species, oxygen plays a major role in controlling the observed properties of flames, including flame front speed (in solid or liquid flames), extinguishment characteristics, flame size and flame temperature. The second objective is to develop better diagnostics based on diode laser absorption which can be of real value in both microgravity combustion research and as a sensor on-board Spacelab as either an air quality monitor or as part of a fire detection system. In our prior microgravity work, an eight line-of-sight fiber optic system measured

  12. Quantitative criteria for assessment of gamma-ray imager performance

    NASA Astrophysics Data System (ADS)

    Gottesman, Steve; Keller, Kristi; Malik, Hans

    2015-08-01

    In recent years, gamma ray imagers such as the GammaCam™ and Polaris have demonstrated good imaging performance in the field. Imager performance is often summarized as "resolution", either angular or spatial at some distance from the imager; however, the definition of resolution is not always related to the ability to image an object. It is difficult to quantitatively compare imagers without a common definition of image quality. This paper examines three categories of definition: point source, line source, and area source. It discusses the details of those definitions and which ones are more relevant for different situations. Metrics such as Full Width at Half Maximum (FWHM), variations on the Rayleigh criterion, and some analogous to the National Imagery Interpretability Rating Scale (NIIRS) are discussed. The performance against these metrics is evaluated for a high-resolution coded aperture imager modeled using Monte Carlo N-Particle (MCNP), and for a medium-resolution imager measured in the lab.

  13. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    PubMed

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  14. Quantitative endoscopy: initial accuracy measurements.

    PubMed

    Truitt, T O; Adelman, R A; Kelly, D H; Willging, J P

    2000-02-01

    The geometric optics of an endoscope can be used to determine the absolute size of an object in an endoscopic field without knowing the actual distance from the object. This study explores the accuracy of a technique that estimates absolute object size from endoscopic images. Quantitative endoscopy involves calibrating a rigid endoscope to produce size estimates from 2 images taken with a known traveled distance between the images. The heights of 12 samples, ranging in size from 0.78 to 11.80 mm, were estimated with this calibrated endoscope. Backup distances of 5 mm and 10 mm were used for comparison. The mean percent error for all estimated measurements when compared with the actual object sizes was 1.12%. The mean errors for 5-mm and 10-mm backup distances were 0.76% and 1.65%, respectively. The mean errors for objects <2 mm and ≥2 mm were 0.94% and 1.18%, respectively. Quantitative endoscopy estimates endoscopic image size to within 5% of the actual object size. This method remains promising for quantitatively evaluating object size from endoscopic images. It does not require knowledge of the absolute distance of the endoscope from the object, only the distance traveled by the endoscope between images.
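
    Illustrative sketch of the geometry described above (an assumed pinhole-style model, not the authors' calibration procedure): if apparent size scales as s = c*H/z, two images separated by a known backup distance d determine the absolute height H without knowing the working distance z. The calibration constant and pixel sizes below are hypothetical.

```python
# Minimal sketch: absolute object height from two apparent sizes and a known
# backup distance, assuming s = c*H/z, so H = d*s1*s2 / (c*(s1 - s2)).
def object_height(s_near_px, s_far_px, backup_mm, calibration_px_mm):
    """s_near_px / s_far_px: apparent sizes before/after backing up by backup_mm;
    calibration_px_mm: apparent size (px) of a 1 mm object at unit distance,
    from a one-time calibration of the rigid endoscope (hypothetical value)."""
    return backup_mm * s_near_px * s_far_px / (calibration_px_mm * (s_near_px - s_far_px))

# Hypothetical example: a 4 mm object imaged, then re-imaged after a 5 mm backup
print(object_height(s_near_px=120.0, s_far_px=80.0, backup_mm=5.0,
                    calibration_px_mm=300.0))   # 4.0
```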

  15. Indium adhesion provides quantitative measure of surface cleanliness

    NASA Technical Reports Server (NTRS)

    Krieger, G. L.; Wilson, G. J.

    1968-01-01

    Indium tipped probe measures hydrophobic and hydrophilic contaminants on rough and smooth surfaces. The force needed to pull the indium tip, which adheres to a clean surface, away from the surface provides a quantitative measure of cleanliness.

  16. Quantitative Measurements of Nitric Oxide Concentration in High-Pressure, Swirl-Stabilized Spray Flames

    NASA Technical Reports Server (NTRS)

    Cooper, Clayton S.; Laurendeau, Normand M.; Hicks, Yolanda R. (Technical Monitor)

    2000-01-01

    Lean direct-injection (LDI) spray flames offer the possibility of reducing NOx emissions from gas turbines by rapid mixing of the liquid fuel and air so as to drive the flame structure toward partially premixed conditions. We consider the technical approaches required to utilize laser-induced fluorescence methods for quantitatively measuring NO concentrations in high-pressure LDI spray flames. In the progression from atmospheric to high-pressure measurements, the LIF method requires a shift from the saturated to the linear regime of fluorescence measurements. As such, we discuss quantitative, spatially resolved laser-saturated fluorescence (LSF), linear laser-induced fluorescence (LIF), and planar laser-induced fluorescence (PLIF) measurements of NO concentration in LDI spray flames. Spatially resolved LIF measurements of NO concentration (ppm) are reported for preheated LDI spray flames at pressures of two to five atmospheres. The spray is produced by a hollow-cone, pressure-atomized nozzle supplied with liquid heptane. NO is excited via the Q2(26.5) transition of the γ(0,0) band. Detection is performed in a two-nanometer region centered on the γ(0,1) band. A complete scheme is developed by which quantitative NO concentrations in high-pressure LDI spray flames can be measured by applying linear LIF. NO is doped into the reactants and convected through the flame with no apparent destruction, thus allowing a NO fluorescence calibration to be taken inside the flame environment. The in-situ calibration scheme is validated by comparisons to a reference flame. Quantitative NO profiles are presented and analyzed so as to better understand the operation of lean-direct injectors for gas turbine combustors. Moreover, parametric studies are provided for variations in pressure, air-preheat temperature, and equivalence ratio. Similar parametric studies are performed for lean, premixed-prevaporized flames to permit comparisons to those for LDI flames

  17. Investigation of PACE™ software and VeriFax's Impairoscope device for quantitatively measuring the effects of stress

    NASA Astrophysics Data System (ADS)

    Morgenthaler, George W.; Nuñez, German R.; Botello, Aaron M.; Soto, Jose; Shrairman, Ruth; Landau, Alexander

    1998-01-01

    Many reaction time experiments have been conducted over the years to observe human responses. However, most of the experiments that were performed did not have quantitatively accurate instruments for measuring change in reaction time under stress. There is a great need for quantitative instruments to measure neuromuscular reaction responses under stressful conditions such as distraction, disorientation, disease, alcohol, drugs, etc. The two instruments used in the experiments reported in this paper are such devices. Their accuracy, portability, ease of use, and biometric character are what makes them very special. PACE™ is a software model used to measure reaction time. VeriFax's Impairoscope measures the deterioration of neuromuscular responses. During the 1997 Summer Semester, various reaction time experiments were conducted on University of Colorado faculty, staff, and students using the PACE™ system. The tests included both two-eye and one-eye unstressed trials and trials with various stresses such as fatigue, distractions in which subjects were asked to perform simple arithmetic during the PACE™ tests, and stress due to rotating-chair dizziness. Various VeriFax Impairoscope tests, both stressed and unstressed, were conducted to determine the Impairoscope's ability to quantitatively measure this impairment. In the 1997 Fall Semester, a Phase II effort was undertaken to increase test sample sizes in order to provide statistical precision and stability. More sophisticated statistical methods remain to be applied to better interpret the data.

  18. Quantitative tomographic measurements of opaque multiphase flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    GEORGE,DARIN L.; TORCZYNSKI,JOHN R.; SHOLLENBERGER,KIM ANN

    2000-03-01

    An electrical-impedance tomography (EIT) system has been developed for quantitative measurements of radial phase distribution profiles in two-phase and three-phase vertical column flows. The EIT system is described along with the computer algorithm used for reconstructing phase volume fraction profiles. EIT measurements were validated by comparison with a gamma-densitometry tomography (GDT) system. The EIT system was used to accurately measure average solid volume fractions up to 0.05 in solid-liquid flows, and radial gas volume fraction profiles in gas-liquid flows with gas volume fractions up to 0.15. In both flows, average phase volume fractions and radial volume fraction profiles from GDT and EIT were in good agreement. A minor modification to the formula used to relate conductivity data to phase volume fractions was found to improve agreement between the methods. GDT and EIT were then applied together to simultaneously measure the solid, liquid, and gas radial distributions within several vertical three-phase flows. For average solid volume fractions up to 0.30, the gas distribution for each gas flow rate was approximately independent of the amount of solids in the column. Measurements made with this EIT system demonstrate that EIT may be used successfully for noninvasive, quantitative measurements of dispersed multiphase flows.

  19. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR such as simplicity, speed, and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
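
    Illustrative sketch (a generic assumed construction; the paper defines its own isothermal doubling time metric): a doubling time can be estimated from the exponential phase of a real-time amplification curve by a log-linear fit, analogous to deriving efficiency from a qPCR standard curve. The curve below is synthetic.

```python
# Minimal sketch: doubling time from a log-linear fit over the exponential phase.
import numpy as np

def doubling_time(times_min, signal, exp_phase):
    """Fit log2(signal) vs time over the boolean mask exp_phase; the doubling
    time is the reciprocal of the fitted slope (minutes per doubling)."""
    t = np.asarray(times_min)[exp_phase]
    y = np.log2(np.asarray(signal)[exp_phase])
    slope, _ = np.polyfit(t, y, 1)
    return 1.0 / slope

# Synthetic curve: signal doubling every 1.5 min between minutes 10 and 20
t = np.arange(0, 30, 0.5)
sig = 1e3 * 2 ** ((t - 10) / 1.5)
print(doubling_time(t, sig, exp_phase=(t >= 10) & (t <= 20)))  # ~1.5
```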

  20. Quantitative elasticity measurement of urinary bladder wall using laser-induced surface acoustic waves.

    PubMed

    Li, Chunhui; Guan, Guangying; Zhang, Fan; Song, Shaozhen; Wang, Ruikang K; Huang, Zhihong; Nabi, Ghulam

    2014-12-01

    The maintenance of urinary bladder elasticity is essential to its functions, including the storage and voiding phases of the micturition cycle. Bladder stiffness can be changed by various pathophysiological conditions. Quantitative measurement of bladder elasticity is an essential step toward understanding various urinary bladder disease processes and improving patient care. As a nondestructive and noncontact method, laser-induced surface acoustic waves (SAWs) can accurately characterize the elastic properties of different layers of organs such as the urinary bladder. This initial investigation evaluates the feasibility of a noncontact, all-optical method of generating and measuring the elasticity of the urinary bladder. Quantitative elasticity measurements of ex vivo porcine urinary bladder were made using the laser-induced SAW technique. A pulsed laser was used to excite SAWs that propagated on the bladder wall surface. A dedicated phase-sensitive optical coherence tomography (PhS-OCT) system remotely recorded the SAWs, from which the elastic properties of different layers of the bladder were estimated. During the experiments, a series of measurements was performed at five precisely controlled bladder volumes, filled with water, to estimate changes in elasticity in relation to bladder content. The results, validated by optical coherence elastography, show that the laser-induced SAW technique combined with PhS-OCT can be a feasible method for quantitative estimation of biomechanical properties.
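
    Illustrative sketch (an assumed textbook relation, not necessarily the authors' inversion): for soft, nearly incompressible tissue, Young's modulus can be estimated from the measured surface acoustic wave speed via the Viktorov approximation for the Rayleigh wave velocity. The density, Poisson ratio, and wave speed below are assumed values.

```python
# Minimal sketch: Young's modulus from SAW (Rayleigh wave) speed, assuming
# c_R ~= c_s * (0.87 + 1.12*nu) / (1 + nu) and c_s = sqrt(E / (2*rho*(1 + nu))).
def youngs_modulus_from_saw(c_r_m_per_s, density_kg_m3=1000.0, poisson=0.49):
    c_s = c_r_m_per_s * (1.0 + poisson) / (0.87 + 1.12 * poisson)  # shear speed
    return 2.0 * density_kg_m3 * (1.0 + poisson) * c_s ** 2        # Pascals

# Hypothetical bladder-wall SAW speed of 3 m/s
print(youngs_modulus_from_saw(3.0))   # roughly 3.0e4 Pa
```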

  1. Quantitative performance characterization of three-dimensional noncontact fluorescence molecular tomography

    NASA Astrophysics Data System (ADS)

    Favicchio, Rosy; Psycharakis, Stylianos; Schönig, Kai; Bartsch, Dusan; Mamalaki, Clio; Papamatheakis, Joseph; Ripoll, Jorge; Zacharakis, Giannis

    2016-02-01

    Fluorescent proteins and dyes are routine tools for biological research to describe the behavior of genes, proteins, and cells, as well as more complex physiological dynamics such as vessel permeability and pharmacokinetics. The use of these probes in whole body in vivo imaging would allow extending the range and scope of current biomedical applications and would be of great interest. In order to comply with a wide variety of application demands, in vivo imaging platform requirements span from wide spectral coverage to precise quantification capabilities. Fluorescence molecular tomography (FMT) detects and reconstructs in three dimensions the distribution of a fluorophore in vivo. Noncontact FMT allows fast scanning of an excitation source and noninvasive measurement of emitted fluorescent light using a virtual array detector operating in free space. Here, a rigorous process is defined that fully characterizes the performance of a custom-built horizontal noncontact FMT setup. Dynamic range, sensitivity, and quantitative accuracy across the visible spectrum were evaluated using fluorophores with emissions between 520 and 660 nm. These results demonstrate that high-performance quantitative three-dimensional visible light FMT allowed the detection of challenging mesenteric lymph nodes in vivo and the comparison of spectrally distinct fluorescent reporters in cell culture.

  2. Quantitative measurement of marginal disintegration of ceramic inlays.

    PubMed

    Hayashi, Mikako; Tsubakimoto, Yuko; Takeshige, Fumio; Ebisu, Shigeyuki

    2004-01-01

    The objectives of this study include establishing a method for quantitative measurement of marginal change in ceramic inlays and clarifying their marginal disintegration in vivo. An accurate CCD optical laser scanner system was used for morphological measurement of the marginal change of ceramic inlays. The accuracy of the CCD measurement was assessed by comparing it with microscopic measurement. Replicas of 15 premolars restored with Class II ceramic inlays at the time of placement and eight years after restoration were used for morphological measurement by means of the CCD laser scanner system. Occlusal surfaces of the restored teeth were scanned and cross-sections of marginal areas were computed with software. Marginal change was defined as the area enclosed by the two profiles obtained by superimposing cross-sections of the same location at the two different times, and was expressed as the maximum depth and mean area of the enclosed region. The accuracy of this method of measurement was 4.3 +/- 3.2 microm in distance and 2.0 +/- 0.6% in area. Quantitative marginal changes for the eight-year period were 10 x 10 microm in depth and 50 x 10(3) microm2 in area at the functional cusp area and 7 x 10 microm in depth and 28 x 10(3) microm2 in area at the non-functional cusp area. Marginal disintegration at the functional cusp area was significantly greater than at the non-functional cusp area (Wilcoxon signed-ranks test, p < 0.05). This study constitutes a quantitative measurement of in vivo deterioration in marginal adaptation of ceramic inlays and indicates that occlusal force may accelerate marginal disintegration.
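    As a rough illustration of the superimposition step described above, the hedged sketch below integrates the absolute difference between two aligned cross-sectional profiles to obtain the enclosed area and maximum depth; the function name, sampling interval, and profile values are hypothetical, not taken from the study.

        import numpy as np

        def marginal_change(profile_before_um, profile_after_um, dx_um):
            """Area (um^2) enclosed between two superimposed margin profiles
            and the maximum gap depth (um). Profiles are depth values sampled
            every dx_um along the same cross-section."""
            before = np.asarray(profile_before_um, dtype=float)
            after = np.asarray(profile_after_um, dtype=float)
            gap = np.abs(after - before)
            # Trapezoidal integration of the gap along the cross-section
            area_um2 = float(np.sum((gap[:-1] + gap[1:]) * 0.5 * dx_um))
            return area_um2, float(gap.max())

        # Hypothetical profiles at placement and after eight years (um)
        area, depth = marginal_change([0, 1, 2, 1, 0], [0, 15, 60, 20, 0], dx_um=10)
        print(area, depth)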

  3. Quantitative Evaluation of Performance during Robot-assisted Treatment.

    PubMed

    Peri, E; Biffi, E; Maghini, C; Servodio Iammarrone, F; Gagliardi, C; Germiniasi, C; Pedrocchi, A; Turconi, A C; Reni, G

    2016-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The great potential of robots in extracting quantitative and meaningful data is not always exploited in clinical practice. The aim of the present work is to describe a simple parameter to assess the performance of subjects during upper limb robotic training, exploiting data automatically recorded by the robot, with no additional effort for patients and clinicians. Fourteen children affected by cerebral palsy (CP) underwent training with Armeo®Spring. Each session was evaluated with P, a simple parameter that depends on the overall performance recorded, and median and interquartile values were computed to perform a group analysis. Median (interquartile) values of P significantly increased from 0.27 (0.21) at T0 to 0.55 (0.27) at T1. This improvement was functionally validated by a significant increase in the Melbourne Assessment of Unilateral Upper Limb Function. The parameter described here was able to show variations in performance over time and enabled a quantitative evaluation of motion abilities in a way that is consistent with a well-known clinical scale.

  4. Quantitative fundus autofluorescence in mice: correlation with HPLC quantitation of RPE lipofuscin and measurement of retina outer nuclear layer thickness.

    PubMed

    Sparrow, Janet R; Blonska, Anna; Flynn, Erin; Duncker, Tobias; Greenberg, Jonathan P; Secondi, Roberta; Ueda, Keiko; Delori, François C

    2013-04-17

    Our study was conducted to establish procedures and protocols for quantitative autofluorescence (qAF) measurements in mice, and to report changes in qAF, A2E bisretinoid concentration, and outer nuclear layer (ONL) thickness in mice of different genotypes and age. Fundus autofluorescence (AF) images (55° lens, 488 nm excitation) were acquired in albino Abca4(-/-), Abca4(+/-), and Abca4(+/+) mice (ages 2-12 months) with a confocal scanning laser ophthalmoscope (cSLO). Gray levels (GLs) in each image were calibrated to an internal fluorescence reference. The bisretinoid A2E was measured by quantitative high performance liquid chromatography (HPLC). Histometric analysis of ONL thicknesses was performed. The Bland-Altman coefficient of repeatability (95% confidence interval) was ±18% for between-session qAF measurements. Mean qAF values increased with age (2-12 months) in all groups of mice. qAF was approximately 2-fold higher in Abca4(-/-) mice than in Abca4(+/+) mice and approximately 20% higher in heterozygous mice. HPLC measurements of the lipofuscin fluorophore A2E also revealed age-associated increases, and the fold difference between Abca4(-/-) and wild-type mice was more pronounced (approximately 3-4-fold) than measurable by qAF. Moreover, A2E levels declined after 8 months of age, a change not observed with qAF. The decline in A2E levels in the Abca4(-/-) mice corresponded to reduced photoreceptor cell viability as reflected in ONL thinning beginning at 8 months of age. The qAF method enables measurement of in vivo lipofuscin and the detection of genotype and age-associated differences. The use of this approach has the potential to aid in understanding retinal disease processes and will facilitate preclinical studies.
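    The key calibration step, referencing fundus gray levels to the internal fluorescence standard, can be sketched minimally as below. This toy ratio omits the detector-sensitivity, magnification, and ocular-media corrections that a full qAF computation includes, and the function name, parameters, and example numbers are assumptions.

        def quantitative_autofluorescence(gl_fundus, gl_reference, gl_zero, ref_factor=1.0):
            """Toy gray-level calibration against an internal reference.

            gl_zero is the dark (zero-light) gray level; ref_factor stands in
            for the reference calibration constant. Real qAF also corrects for
            detector sensitivity, magnification and media opacity."""
            return ref_factor * (gl_fundus - gl_zero) / (gl_reference - gl_zero)

        # Hypothetical gray levels
        print(quantitative_autofluorescence(gl_fundus=120.0, gl_reference=200.0, gl_zero=10.0))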

  5. Quantitative force measurements in liquid using frequency modulation atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Uchihashi, Takayuki; Higgins, Michael J.; Yasuda, Satoshi; Jarvis, Suzanne P.; Akita, Seiji; Nakayama, Yoshikazu; Sader, John E.

    2004-10-01

    The measurement of short-range forces with the atomic force microscope (AFM) typically requires implementation of dynamic techniques to maintain sensitivity and stability. While frequency modulation atomic force microscopy (FM-AFM) is used widely for high-resolution imaging and quantitative force measurements in vacuum, quantitative force measurements using FM-AFM in liquids have proven elusive. Here we demonstrate that the formalism derived for operation in vacuum can also be used in liquids, provided certain modifications are implemented. To facilitate comparison with previous measurements taken using surface forces apparatus, we choose a model system (octamethylcyclotetrasiloxane) that is known to exhibit short-ranged structural ordering when confined between two surfaces. Force measurements obtained are found to be in excellent agreement with previously reported results. This study therefore establishes FM-AFM as a powerful tool for the quantitative measurement of forces in liquid.

  6. A novel semi-quantitative method for measuring tissue bleeding.

    PubMed

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples.
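    Because the percentage formula is stated explicitly, a direct Python translation may help; the sample areas below are hypothetical and the function name is an assumption.

        def bleeding_percentage(sample_areas, bleeding_areas):
            """N = Bt x 100 / At, where At is the summed area of all tissue
            samples and Bt the summed area of all bleeding regions. Any
            consistent area unit from the AutoCAD polylines works, since the
            units cancel."""
            a_total = sum(sample_areas)
            b_total = sum(bleeding_areas)
            return b_total * 100.0 / a_total

        # Hypothetical polyline areas (mm^2) for three samples
        print(f"{bleeding_percentage([12.4, 10.8, 11.9], [1.1, 0.6, 2.0]):.1f}% of surface affected")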

  7. A general way for quantitative magnetic measurement by transmitted electrons

    NASA Astrophysics Data System (ADS)

    Song, Dongsheng; Li, Gen; Cai, Jianwang; Zhu, Jing

    2016-01-01

    EMCD (electron magnetic circular dichroism) opens a new door to exploring magnetic properties with transmitted electrons. The recently developed site-specific EMCD technique makes it possible to obtain rich magnetic information from Fe atoms located on nonequivalent crystallographic planes in NiFe2O4; however, it places strict demands on the crystallographic structure of the sample under test. Here, we have further improved and tested the method for quantitative site-specific magnetic measurement so that it applies to more complex crystallographic structures, by exploiting dynamical diffraction effects (a general routine for selecting proper diffraction conditions, use of the asymmetry of dynamical diffraction in designing the experimental geometry and the quantitative measurement, etc.), and we take yttrium iron garnet (Y3Fe5O12, YIG), which has a more complex crystallographic structure, as an example to demonstrate its applicability. As a result, the intrinsic magnetic circular dichroism signals and the site-specific spin and orbital magnetic moments of iron are quantitatively determined. The method will further promote the development of quantitative magnetic measurement with high spatial resolution by transmitted electrons.

  8. The calibration of video cameras for quantitative measurements

    NASA Technical Reports Server (NTRS)

    Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.

    1993-01-01

    Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.

  9. Quantitative measurement of oxygen in microgravity combustion

    NASA Technical Reports Server (NTRS)

    Silver, Joel A.

    1995-01-01

    This research combines two innovations in an experimental system which should result in a new capability for quantitative, nonintrusive measurement of major combustion species. Using a newly available vertical cavity surface-emitting diode laser (VCSEL) and an improved spatial scanning method, we plan to measure the temporal and spatial profiles of the concentrations and temperatures of molecular oxygen in a candle flame and in a solid fuel (cellulose sheet) system. The required sensitivity for detecting oxygen is achieved by the use of high frequency wavelength modulation spectroscopy (WMS). Measurements will be performed in the NASA Lewis 2.2-second Drop Tower Facility. The objective of this research is twofold. First, we want to develop a better understanding of the relative roles of diffusion and reaction of oxygen in microgravity combustion. As the primary oxidizer species, oxygen plays a major role in controlling the observed properties of flames, including flame front speed (in solid or liquid flames), extinguishment characteristics, flame size, and flame temperature. The second objective is to develop better diagnostics based on diode laser absorption which can be of real value in microgravity combustion research. We will also demonstrate diode lasers' potential usefulness for compact, intrinsically-safe monitoring sensors aboard spacecraft. Such sensors could be used to monitor any of the major cabin gases as well as important pollutants.

  10. Quantitative angle-insensitive flow measurement using relative standard deviation OCT

    NASA Astrophysics Data System (ADS)

    Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping

    2017-10-01

    Incorporating different data processing methods, optical coherence tomography (OCT) has the ability for high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information of flow velocities, and the velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information for flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify the blood flow velocities as well as map the vascular network in vivo.
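    A minimal sketch of the core computation may clarify the idea: at each pixel, the standard deviation of OCT intensity across repeated frames is divided by the mean, so flow-induced fluctuation yields high values regardless of beam angle. The array layout and function name are assumptions; the published pipeline includes further filtering and registration steps.

        import numpy as np

        def rsd_map(frames):
            """Relative standard deviation across repeated OCT frames.

            frames: array of shape (n_repeats, depth, width) of intensity.
            Returns std/mean per pixel (zero where the mean is zero)."""
            frames = np.asarray(frames, dtype=float)
            mean = frames.mean(axis=0)
            std = frames.std(axis=0)
            return np.divide(std, mean, out=np.zeros_like(mean), where=mean > 0)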

  11. Quantitative angle-insensitive flow measurement using relative standard deviation OCT.

    PubMed

    Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping

    2017-10-30

    Incorporating different data processing methods, optical coherence tomography (OCT) has the ability for high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information of flow velocities, and the velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information for flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify the blood flow velocities as well as map the vascular network in vivo.

  12. Quantitative Species Measurements In Microgravity Combustion Flames

    NASA Technical Reports Server (NTRS)

    Chen, Shin-Juh; Pilgrim, Jeffrey S.; Silver, Joel A.; Piltch, Nancy D.

    2003-01-01

    The capability of models and theories to accurately predict and describe the behavior of low gravity flames can only be verified by quantitative measurements. Although video imaging, simple temperature measurements, and velocimetry methods have provided useful information in many cases, there is still a need for quantitative species measurements. Over the past decade, we have been developing high sensitivity optical absorption techniques to permit in situ, non-intrusive, absolute concentration measurements for both major and minor flames species using diode lasers. This work has helped to establish wavelength modulation spectroscopy (WMS) as an important method for species detection within the restrictions of microgravity-based measurements. More recently, in collaboration with Prof. Dahm at the University of Michigan, a new methodology combining computed flame libraries with a single experimental measurement has allowed us to determine the concentration profiles for all species in a flame. This method, termed ITAC (Iterative Temperature with Assumed Chemistry) was demonstrated for a simple laminar nonpremixed methane-air flame at both 1-g and at 0-g in a vortex ring flame. In this paper, we report additional normal and microgravity experiments which further confirm the usefulness of this approach. We also present the development of a new type of laser. This is an external cavity diode laser (ECDL) which has the unique capability of high frequency modulation as well as a very wide tuning range. This will permit the detection of multiple species with one laser while using WMS detection.

  13. Comparative evaluation of performance measures for shading correction in time-lapse fluorescence microscopy.

    PubMed

    Liu, L; Kan, A; Leckie, C; Hodgkin, P D

    2017-04-01

    Time-lapse fluorescence microscopy is a valuable technology in cell biology, but it suffers from the inherent problem of intensity inhomogeneity due to uneven illumination or camera nonlinearity, known as shading artefacts. This will lead to inaccurate estimates of single-cell features such as average and total intensity. Numerous shading correction methods have been proposed to remove this effect. In order to compare the performance of different methods, many quantitative performance measures have been developed. However, there is little discussion about which performance measure should be generally applied for evaluation on real data, where the ground truth is absent. In this paper, the state-of-the-art shading correction methods and performance evaluation methods are reviewed. We implement 10 popular shading correction methods on two artificial datasets and four real ones. In order to make an objective comparison between those methods, we employ a number of quantitative performance measures. Extensive validation demonstrates that the coefficient of joint variation (CJV) is the most applicable measure in time-lapse fluorescence images. Based on this measure, we have proposed a novel shading correction method that performs better compared to well-established methods for a range of real data tested. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
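    The coefficient of joint variation is commonly defined as the sum of the standard deviations of two pixel populations divided by the absolute difference of their means, with lower values indicating better correction. Assuming that common definition applies here (the exact formulation used in the paper is not restated in this abstract), a minimal sketch is:

        import numpy as np

        def cjv(foreground_pixels, background_pixels):
            """Coefficient of joint variation between two pixel populations,
            e.g. cell foreground vs. image background after shading
            correction: (std_f + std_b) / |mean_f - mean_b|."""
            f = np.asarray(foreground_pixels, dtype=float)
            b = np.asarray(background_pixels, dtype=float)
            return (f.std() + b.std()) / abs(f.mean() - b.mean())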

  14. Impact of image quality on OCT angiography based quantitative measurements.

    PubMed

    Al-Sheikh, Mayss; Ghasemi Falavarjani, Khalil; Akil, Handan; Sadda, SriniVas R

    2017-01-01

    To study the impact of image quality on quantitative measurements and the frequency of segmentation error with optical coherence tomography angiography (OCTA). Seventeen eyes of 10 healthy individuals were included in this study. OCTA was performed using a swept-source device (Triton, Topcon). Each subject underwent three scanning sessions 1-2 min apart; the first two scans were obtained under standard conditions and for the third session, the image quality index was reduced using application of a topical ointment. En face OCTA images of the retinal vasculature were generated using the default segmentation for the superficial and deep retinal layer (SRL, DRL). Intraclass correlation coefficient (ICC) was used as a measure for repeatability. The frequency of segmentation error, motion artifact, banding artifact and projection artifact was also compared among the three sessions. The frequency of segmentation error, and motion artifact was statistically similar between high and low image quality sessions (P = 0.707, and P = 1 respectively). However, the frequency of projection and banding artifact was higher with a lower image quality. The vessel density in the SRL was highly repeatable in the high image quality sessions (ICC = 0.8), however, the repeatability was low, comparing the high and low image quality measurements (ICC = 0.3). In the DRL, the repeatability of the vessel density measurements was fair in the high quality sessions (ICC = 0.6 and ICC = 0.5, with and without automatic artifact removal, respectively) and poor comparing high and low image quality sessions (ICC = 0.3 and ICC = 0.06, with and without automatic artifact removal, respectively). The frequency of artifacts is higher and the repeatability of the measurements is lower with lower image quality. The impact of image quality index should be always considered in OCTA based quantitative measurements.

  15. A quantitative measure for degree of automation and its relation to system performance and mental load.

    PubMed

    Wei, Z G; Macwan, A P; Wieringa, P A

    1998-06-01

    In this paper we quantitatively model degree of automation (DofA) in supervisory control as a function of the number and nature of tasks to be performed by the operator and automation. This model uses a task weighting scheme in which weighting factors are obtained from task demand load, task mental load, and task effect on system performance. The computation of DofA is demonstrated using an experimental system. Based on controlled experiments using operators, analyses of the task effect on system performance, the prediction and assessment of task demand load, and the prediction of mental load were performed. Each experiment had a different DofA. The effect of a change in DofA on system performance and mental load was investigated. It was found that system performance became less sensitive to changes in DofA at higher levels of DofA. The experimental data showed that when the operator controlled a partly automated system, perceived mental load could be predicted from the task mental load for each task component, as calculated by analyzing a situation in which all tasks were manually controlled. Actual or potential applications of this research include a methodology to balance and optimize the automation of complex industrial systems.
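    A simplified illustration of a weighted degree of automation follows; the published model derives the weights from task demand load, task mental load, and task effect on system performance, whereas a single combined weight per task is assumed here, and all task values are invented.

        def degree_of_automation(tasks):
            """DofA as the automated share of total task weight.

            tasks: iterable of (weight, handled_by_automation) pairs, where
            weight is a combined importance factor for the task."""
            total = sum(weight for weight, _ in tasks)
            automated = sum(weight for weight, auto in tasks if auto)
            return automated / total

        # (weight, handled_by_automation)
        tasks = [(0.4, True), (0.3, False), (0.2, True), (0.1, False)]
        print(f"DofA = {degree_of_automation(tasks):.2f}")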

  16. Quantitative and Isolated Measurement of Far-Field Light Scattering by a Single Nanostructure

    NASA Astrophysics Data System (ADS)

    Kim, Donghyeong; Jeong, Kwang-Yong; Kim, Jinhyung; Ee, Ho-Seok; Kang, Ju-Hyung; Park, Hong-Gyu; Seo, Min-Kyo

    2017-11-01

    Light scattering by nanostructures has facilitated research on various optical phenomena and applications by interfacing the near fields and free-propagating radiation. However, direct quantitative measurement of far-field scattering by a single nanostructure on the wavelength scale or less is highly challenging. Conventional back-focal-plane imaging covers only a limited solid angle determined by the numerical aperture of the objectives and suffers from optical aberration and distortion. Here, we present a quantitative measurement of the differential far-field scattering cross section of a single nanostructure over the full hemisphere. In goniometer-based far-field scanning with a high signal-to-noise ratio of approximately 27.4 dB, weak scattering signals are efficiently isolated and detected under total-internal-reflection illumination. Systematic measurements reveal that the total and differential scattering cross sections of a Au nanorod are determined by the plasmonic Fabry-Perot resonances and the phase-matching conditions to the free-propagating radiation, respectively. We believe that our angle-resolved far-field measurement scheme provides a way to investigate and evaluate the physical properties and performance of nano-optical materials and phenomena.

  17. Quantitative measures of gingival recession and the influence of gender, race, and attrition.

    PubMed

    Handelman, Chester S; Eltink, Anthony P; BeGole, Ellen

    2018-01-29

    Gingival recession in dentitions with otherwise healthy periodontium is a common occurrence in adults. Recession is clinically measured using a periodontal probe to the nearest millimeter. The aim of this study is to establish quantitative measures of recession, the clinical crown height, and a new measure, the gingival margin-papillae measurement. The latter is defined as the shortest apico-coronal distance measured from the depth of the gingival margin to a line connecting the tips of the two adjacent papillae. Measurements on all teeth up to and including the first molar were performed on pretreatment study models of 120 adult Caucasian and African-American subjects divided into four groups of 30 by gender and race. Both the clinical crown height and the gingival margin-papillae measurements gave a true positive result for changes associated with gingival recession. Tooth wear shortens the clinical crown, and therefore the measure of clinical crown height can give a false negative result when gingival recession is present. However, the gingival margin-papillae measurement was not affected by tooth wear and gave a true positive result for gingival recession. Tooth wear (attrition) was not associated with an increase in gingival recession. These measures are also useful in detecting recession prior to cemental exposure. Measures for recession and tooth wear differed among the four demographic groups studied. These measures can be used as quantitative standards in clinical dentistry, research, and epidemiological studies.

  18. Quantitative measures of walking and strength provide insight into brain corticospinal tract pathology in multiple sclerosis.

    PubMed

    Fritz, Nora E; Keller, Jennifer; Calabresi, Peter A; Zackowski, Kathleen M

    2017-01-01

    At least 85% of individuals with multiple sclerosis report walking dysfunction as their primary complaint. Walking and strength measures are common clinical measures to mark increasing disability or improvement with rehabilitation. Previous studies have shown an association between strength or walking ability and spinal cord MRI measures, and strength measures with brainstem corticospinal tract magnetization transfer ratio. However, the relationship between walking performance and brain corticospinal tract magnetization transfer imaging measures and the contribution of clinical measurements of walking and strength to the underlying integrity of the corticospinal tract has not been explored in multiple sclerosis. The objectives of this study were to explore the relationship of quantitative measures of walking and strength to whole-brain corticospinal tract-specific MRI measures and to determine the contribution of quantitative measures of function in addition to basic clinical measures (age, gender, symptom duration and Expanded Disability Status Scale) to structural imaging measures of the corticospinal tract. We hypothesized that quantitative walking and strength measures would be related to brain corticospinal tract-specific measures, and would provide insight into the heterogeneity of brain pathology. Twenty-nine individuals with relapsing-remitting multiple sclerosis (mean(SD) age 48.7 (11.5) years; symptom duration 11.9(8.7); 17 females; median[range] Expanded Disability Status Scale 4.0 [1.0-6.5]) and 29 age and gender-matched healthy controls (age 50.8(11.6) years; 20 females) participated in clinical tests of strength and walking (Timed Up and Go, Timed 25 Foot Walk, Two Minute Walk Test) as well as 3 T imaging including diffusion tensor imaging and magnetization transfer imaging. Individuals with multiple sclerosis were weaker (p = 0.0024) and walked slower (p = 0.0013) compared to controls. Quantitative measures of walking and strength were

  19. Introduction of an automated user-independent quantitative volumetric magnetic resonance imaging breast density measurement system using the Dixon sequence: comparison with mammographic breast density assessment.

    PubMed

    Wengert, Georg Johannes; Helbich, Thomas H; Vogl, Wolf-Dieter; Baltzer, Pascal; Langs, Georg; Weber, Michael; Bogner, Wolfgang; Gruber, Stephan; Trattnig, Siegfried; Pinker, Katja

    2015-02-01

    The purposes of this study were to introduce and assess an automated user-independent quantitative volumetric (AUQV) breast density (BD) measurement system on the basis of magnetic resonance imaging (MRI) using the Dixon technique as well as to compare it with qualitative and quantitative mammographic (MG) BD measurements. Forty-three women with normal mammogram results (Breast Imaging Reporting and Data System 1) were included in this institutional review board-approved prospective study. All participants underwent BD assessment with MRI using a Dixon-technique sequence (repetition time/first echo time/second echo time, 6 milliseconds/2.45 milliseconds/2.67 milliseconds; 1-mm isotropic; 3 minutes 38 seconds). To test the reproducibility, a second MRI after patient repositioning was performed. The AUQV magnetic resonance (MR) BD measurement system automatically calculated percentage (%) BD. The qualitative BD assessment was performed using the American College of Radiology Breast Imaging Reporting and Data System BD categories. Quantitative BD was estimated semiautomatically using the thresholding technique Cumulus4. Appropriate statistical tests were used to assess the agreement between the AUQV MR measurements and to compare them with qualitative and quantitative MG BD estimations. The AUQV MR BD measurements were successfully performed in all 43 women. There was a nearly perfect agreement of AUQV MR BD measurements between the 2 MR examinations for % BD (P < 0.001; intraclass correlation coefficient, 0.998) with no significant differences (P = 0.384). The AUQV MR BD measurements were significantly lower than quantitative and qualitative MG BD assessment (P < 0.001). The AUQV MR BD measurement system allows a fully automated, user-independent, robust, reproducible, as well as radiation- and compression-free volumetric quantitative BD assessment through different levels of BD. The AUQV MR BD measurements were significantly lower than the currently used qualitative

  20. Quantitative chest computed tomography as a means of predicting exercise performance in severe emphysema.

    PubMed

    Crausman, R S; Ferguson, G; Irvin, C G; Make, B; Newell, J D

    1995-06-01

    We assessed the value of quantitative high-resolution computed tomography (CT) as a diagnostic and prognostic tool in smoking-related emphysema. We performed an inception cohort study of 14 patients referred with emphysema. The diagnosis of emphysema was based on a compatible history, physical examination, chest radiograph, CT scan of the lung, and pulmonary physiologic evaluation. As a group, those who underwent exercise testing were hyperinflated (percentage predicted total lung capacity +/- standard error of the mean = 133 +/- 9%), and there was evidence of air trapping (percentage predicted residual volume = 318 +/- 31%) and airflow limitation (forced expiratory volume in 1 sec [FEV1] = 40 +/- 7%). The exercise performance of the group was severely limited (maximum achievable workload = 43 +/- 6%) and was characterized by prominent ventilatory, gas exchange, and pulmonary vascular abnormalities. The quantitative CT index was markedly elevated in all patients (76 +/- 9; n = 14; normal < 4). There were correlations between this quantitative CT index and measures of airflow limitation (FEV1: r2 = .34, p = .09; FEV1/forced vital capacity: r2 = .46, p = .04), and between the index and both maximum workload achieved (r2 = .93, p = .0001) and maximum oxygen utilization (r2 = .83, p = .0007). Quantitative chest CT assessment of disease severity is correlated with the degree of airflow limitation and exercise impairment in pulmonary emphysema.

  1. On Measuring Quantitative Interpretations of Reasonable Doubt

    ERIC Educational Resources Information Center

    Dhami, Mandeep K.

    2008-01-01

    Beyond reasonable doubt represents a probability value that acts as the criterion for conviction in criminal trials. I introduce the membership function (MF) method as a new tool for measuring quantitative interpretations of reasonable doubt. Experiment 1 demonstrated that three different methods (i.e., direct rating, decision theory based, and…

  2. Prediction of Coronal Mass Ejections From Vector Magnetograms: Quantitative Measures as Predictors

    NASA Technical Reports Server (NTRS)

    Falconer, D. A.; Moore, R. L.; Gary, G. A.; Rose, M. Franklin (Technical Monitor)

    2001-01-01

    We derived two quantitative measures of an active region's global nonpotentiality from the region's vector magnetogram, 1) the net current (I(sub N)), and 2) the length of the strong-shear, strong-field main neutral line (L(sub ss)), and used these two measures in a pilot study of the CME productivity of 4 active regions. We compared the global nonpotentiality measures to the active regions' CME productivity determined from GOES and Yohkoh/SXT observations. We found that two of the active regions were highly globally nonpotential and were CME productive, while the other two active regions had little global nonpotentiality and produced no CMEs. At the Fall 2000 AGU, we reported on an expanded study (12 active regions and 17 magnetograms) in which we evaluated four quantitative global measures of an active region's magnetic field and compared these measures with the CME productivity. The four global measures (all derived from MSFC vector magnetograms) included our two previous measures (I(sub N) and L(sub ss)) as well as two new ones, the total magnetic flux (PHI) (a measure of an active region's size), and the normalized twist (alpha(bar) = mu I(sub N)/PHI). We found that the three quantitative measures of global nonpotentiality (I(sub N), L(sub ss), alpha(bar)) were all well correlated (greater than 99% confidence level) with an active region's CME productivity within plus or minus 2 days of the day of the magnetogram. We will now report on our findings of how good our quantitative measures are as predictors of active-region CME productivity, using only CMEs that occurred after the magnetogram. We report the preliminary skill test of these quantitative measures as predictors. We compare the CME prediction success of our quantitative measures to the CME prediction success based on an active region's past CME productivity. We examine the cases of the handful of false positives and false negatives to look for improvements to our predictors. This work is funded by NSF through the Space

  3. Management-by-Results and Performance Measurement in Universities--Implications for Work Motivation

    ERIC Educational Resources Information Center

    Kallio, Kirsi-Mari; Kallio, Tomi J.

    2014-01-01

    The article focuses on the effects of management-by-results from the perspective of the work motivation of university employees. The study is based on extensive survey data among employees at Finnish universities. According to the results, performance measurement is based on quantitative rather than qualitative measures, and the current…

  4. Quantitative impedance measurements for eddy current model validation

    NASA Astrophysics Data System (ADS)

    Khan, T. A.; Nakagawa, N.

    2000-05-01

    This paper reports on a series of laboratory-based impedance measurement data, collected by the use of a quantitatively accurate, mechanically controlled measurement station. The purpose of the measurement is to validate a BEM-based eddy current model against experiment. We have therefore selected two "validation probes," which are both split-D differential probes. Their internal structures and dimensions are extracted from x-ray CT scan data, and thus known within the measurement tolerance. A series of measurements was carried out, using the validation probes and two Ti-6Al-4V block specimens, one containing two 1-mm long fatigue cracks, and the other containing six EDM notches of a range of sizes. A motor-controlled XY scanner performed raster scans over the cracks, with the probe riding on the surface with a spring-loaded mechanism to maintain the lift off. Both an impedance analyzer and a commercial EC instrument were used in the measurement. The probes were driven in both differential and single-coil modes for the specific purpose of model validation. The differential measurements were done exclusively by the eddyscope, while the single-coil data were taken with both the impedance analyzer and the eddyscope. From the single-coil measurements, we obtained the transfer function to translate the voltage output of the eddyscope into impedance values, and then used it to translate the differential measurement data into impedance results. The presentation will highlight the schematics of the measurement procedure, representative raw data, an explanation of the post-processing procedure, and a series of resulting 2D flaw impedance results. A noise estimate will also be given, in order to quantify the accuracy of these measurements and for use in probability-of-detection estimation. This work was supported by the NSF Industry/University Cooperative Research Program.

  5. The measurement of liver fat from single-energy quantitative computed tomography scans

    PubMed Central

    Cheng, Xiaoguang; Brown, J. Keenan; Guo, Zhe; Zhou, Jun; Wang, Fengzhe; Yang, Liqiang; Wang, Xiaohong; Xu, Li

    2017-01-01

    Background Studies of soft tissue composition using computed tomography (CT) scans are often semi-quantitative and based on Hounsfield units (HU) measurements that have not been calibrated with a quantitative CT (QCT) phantom. We describe a study to establish the water (H2O) and dipotassium hydrogen phosphate (K2HPO4) basis set equivalent densities of fat and fat-free liver tissue. With this information liver fat can be accurately measured from any abdominal CT scan calibrated with a suitable phantom. Methods Liver fat content was measured by comparing single-energy QCT (SEQCT) HU measurements of the liver with predicted HU values for fat and fat-free liver tissue calculated from their H2O and K2HPO4 equivalent densities and calibration data from a QCT phantom. The equivalent densities of fat were derived from a listing of its constituent fatty acids, and those of fat-free liver tissue from a dual-energy QCT (DEQCT) study performed in 14 healthy Chinese subjects. This information was used to calculate liver fat from abdominal SEQCT scans performed in a further 541 healthy Chinese subjects (mean age 62 years; range, 31–95 years) enrolled in the Prospective Urban Rural Epidemiology (PURE) Study. Results The equivalent densities of fat were 941.75 mg/cm3 H2O and –43.72 mg/cm3 K2HPO4, and for fat-free liver tissue 1,040.13 mg/cm3 H2O and 21.34 mg/cm3 K2HPO4. Liver fat in the 14 subjects in the DEQCT study varied from 0–17.9% [median: 4.5%; interquartile range (IQR): 3.0–7.9%]. Liver fat in the 541 PURE study subjects varied from –0.3–29.9% (median: 4.9%; IQR: 3.4–6.9%). Conclusions We have established H2O and K2HPO4 equivalent densities for fat and fat-free liver tissue that allow a measurement of liver fat to be obtained from any abdominal CT scan acquired with a QCT phantom. Although radiation dose considerations preclude the routine use of QCT to measure liver fat, the method described here facilitates its measurement in patients having CT scans
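    The underlying idea, expressing the measured liver value as a linear mixture of fat and fat-free reference tissue, can be sketched in one dimension. The real method works in the two-component H2O/K2HPO4 basis with phantom calibration, so the following is only an assumed simplification that reuses the reported H2O-equivalent densities; the measured value in the example is hypothetical.

        def liver_fat_percent(measured, fat_free_ref, fat_ref):
            """Liver fat (%) from a linear fat / fat-free mixture model.

            All three inputs must be phantom-calibrated values on the same
            scale, e.g. H2O-equivalent density in mg/cm^3."""
            return 100.0 * (fat_free_ref - measured) / (fat_free_ref - fat_ref)

        # H2O-equivalent densities reported in the abstract; measured value is hypothetical
        print(f"{liver_fat_percent(1035.2, fat_free_ref=1040.13, fat_ref=941.75):.1f}% fat")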

  6. Quantitative color measurement for black walnut wood.

    Treesearch

    Ali A. Moslemi

    1967-01-01

    Black walnut (Juglans nigra L.) veneer specimens with wide variations in color were evaluated by a quantitative method of color measurement. The internationally adopted CIE system of colorimetry was used to analyze the data. These data were converted to also show them in the Munsell system. Color differences among the walnut veneer specimens were also numerically...

  7. SERS quantitative urine creatinine measurement of human subject

    NASA Astrophysics Data System (ADS)

    Wang, Tsuei Lian; Chiang, Hui-hua K.; Lu, Hui-hsin; Hung, Yung-da

    2005-03-01

    The SERS method for biomolecular analysis has several potential advantages over traditional biochemical approaches, including less specimen contact, being non-destructive to the specimen, and allowing multiple-component analysis. Urine is an easily available body fluid for monitoring the metabolites and renal function of the human body. We developed a surface-enhanced Raman scattering (SERS) technique using 50 nm gold colloidal particles for quantitative human urine creatinine measurements. This paper shows that the SERS band of creatinine (104 mg/dl) in artificial urine lies between 1400 cm(-1) and 1500 cm(-1), and this region was analyzed for quantitative creatinine measurement. Ten human urine samples were obtained from ten healthy persons and analyzed by the SERS technique. A partial least squares cross-validation (PLSCV) method was utilized to estimate creatinine concentration in the clinically relevant range (55.9 mg/dl to 208 mg/dl). The root-mean-square error of cross-validation (RMSECV) is 26.1 mg/dl. This research demonstrates the feasibility of using SERS for human subject urine creatinine detection, and establishes the SERS platform technique for bodily fluids measurement.
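    The RMSECV figure reported above comes from partial least squares regression with cross-validation; a generic scikit-learn sketch of that evaluation is shown below. The component count, leave-one-out scheme, and array shapes are assumptions for illustration rather than the paper's exact protocol.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        def rmsecv(spectra, concentrations, n_components=3):
            """Root-mean-square error of leave-one-out cross-validation for a
            PLS calibration of SERS spectra against known concentrations.

            spectra: (n_samples, n_wavenumbers) array; concentrations: (n_samples,)."""
            pls = PLSRegression(n_components=n_components)
            predicted = cross_val_predict(pls, spectra, concentrations, cv=LeaveOneOut())
            residuals = predicted.ravel() - np.asarray(concentrations, dtype=float)
            return float(np.sqrt(np.mean(residuals ** 2)))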

  8. Quantitative Measurement of Trans-Fats by Infrared Spectroscopy

    ERIC Educational Resources Information Center

    Walker, Edward B.; Davies, Don R.; Campbell, Mike

    2007-01-01

    Trans-fat is a general term, which is mainly used to describe the various trans geometric isomers present in unsaturated fatty acids. Various techniques are now used for a quantitative measurement of the amount of trans-fats present in foods and cooking oil.

  9. A workload model and measures for computer performance evaluation

    NASA Technical Reports Server (NTRS)

    Kerner, H.; Kuemmerle, K.

    1972-01-01

    A generalized workload definition is presented which constructs measurable workloads of unit size from workload elements, called elementary processes. An elementary process makes almost exclusive use of one of the processors, CPU, I/O processor, etc., and is measured by the cost of its execution. Various kinds of user programs can be simulated by quantitative composition of elementary processes into a type. The character of the type is defined by the weights of its elementary processes and its structure by the amount and sequence of transitions between its elementary processes. A set of types is batched to a mix. Mixes of identical cost are considered as equivalent amounts of workload. These formalized descriptions of workloads allow investigators to compare the results of different studies quantitatively. Since workloads of different composition are assigned a unit of cost, these descriptions enable determination of cost effectiveness of different workloads on a machine. Subsequently performance parameters such as throughput rate, gain factor, internal and external delay factors are defined and used to demonstrate the effects of various workload attributes on the performance of a selected large scale computer system.
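    A toy illustration of composing elementary processes into types and mixes follows; the process names, costs, and weights are invented and only mirror the structure described above.

        from dataclasses import dataclass

        # Cost per unit of each elementary process (arbitrary cost units)
        ELEMENTARY_COSTS = {"cpu": 1.0, "io": 2.5}

        @dataclass
        class WorkloadType:
            """A user-program type: weights give the share of each elementary
            process in the type's composition."""
            weights: dict

            def cost(self, scale=1.0):
                return scale * sum(w * ELEMENTARY_COSTS[p] for p, w in self.weights.items())

        # A mix batches several types; mixes of equal total cost are treated
        # as equivalent amounts of workload.
        mix = [WorkloadType({"cpu": 0.7, "io": 0.3}), WorkloadType({"cpu": 0.2, "io": 0.8})]
        print(sum(t.cost() for t in mix))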

  10. Comparison of MPEG-1 digital videotape with digitized sVHS videotape for quantitative echocardiographic measurements

    NASA Technical Reports Server (NTRS)

    Garcia, M. J.; Thomas, J. D.; Greenberg, N.; Sandelski, J.; Herrera, C.; Mudd, C.; Wicks, J.; Spencer, K.; Neumann, A.; Sankpal, B.

    2001-01-01

    Digital format is rapidly emerging as a preferred method for displaying and retrieving echocardiographic studies. The qualitative diagnostic accuracy of Moving Pictures Experts Group (MPEG-1) compressed digital echocardiographic studies has been previously reported. The goals of the present study were to compare quantitative measurements derived from MPEG-1 recordings with the super-VHS (sVHS) videotape clinical standard. Six reviewers performed blinded measurements from still-frame images selected from 20 echocardiographic studies that were simultaneously acquired in sVHS and MPEG-1 formats. Measurements were obtainable in 1401 (95%) of 1486 MPEG-1 variables compared with 1356 (91%) of 1486 sVHS variables (P <.001). Excellent agreement existed between MPEG-1 and sVHS 2-dimensional linear measurements (r = 0.97; MPEG-1 = 0.95[sVHS] + 1.1 mm; P <.001; Delta = 9% +/- 10%), 2-dimensional area measurements (r = 0.89), color jet areas (r = 0.87, p <.001), and Doppler velocities (r = 0.92, p <.001). Interobserver variability was similar for both sVHS and MPEG-1 readings. Our results indicate that quantitative off-line measurements from MPEG-1 digitized echocardiographic studies are feasible and comparable to those obtained from sVHS.

  11. Quantitative fluorescence measurements performed on typical matrix molecules in matrix-assisted laser desorption/ionisation

    NASA Astrophysics Data System (ADS)

    Allwood, D. A.; Dyer, P. E.

    2000-11-01

    Fundamental photophysical parameters have been determined for several molecules that are commonly used as matrices, e.g. ferulic acid, within matrix-assisted laser desorption/ionization (MALDI) mass spectrometry. Fluorescence quantum efficiencies (φqe), singlet decay rates (kl), vibrationless ground-singlet transition energies and average fluorescence wavelengths have been obtained from solid and solution samples by quantitative optical measurements. This new data will assist in modelling calculations of MALDI processes and in highlighting desirable characteristics of MALDI matrices. φqe may be as high as 0.59 whilst the radiative decay rate (kf) appears to be within the (0.8-4)×10^8 s^-1 range. Interestingly, α-cyano-4-hydroxycinnamic acid (α-CHC) has a very low φqe and fast non-radiative decay rate which would imply a rapid and efficient thermalisation of electronic excitation. This is in keeping with observations that α-CHC exhibits low threshold fluences for ion detection and the low fluences at which α-CHC tends to fragment.

  12. Scanning transmission ion microscopy mass measurements for quantitative trace element analysis within biological samples and validation using atomic force microscopy thickness measurements

    NASA Astrophysics Data System (ADS)

    Devès, Guillaume; Cohen-Bouhacina, Touria; Ortega, Richard

    2004-10-01

    We used the nuclear microprobe techniques micro-PIXE (particle-induced X-ray emission), micro-RBS (Rutherford backscattering spectrometry) and scanning transmission ion microscopy (STIM) in order to characterize trace element content and spatial distribution within biological samples (dehydrated cultured cells, tissues). The normalization of PIXE results was usually expressed in terms of sample dry mass as determined by micro-RBS recorded simultaneously with micro-PIXE. However, the main limit of RBS mass measurement is the sample mass loss occurring during irradiation, which could be up to 30% of the initial sample mass. We present here a new methodology for PIXE normalization and quantitative analysis of trace elements within biological samples based on dry mass measurement performed by means of STIM. The validation of STIM cell mass measurements was obtained by comparison with AFM sample thickness measurements. Results indicated the reliability of STIM mass measurement performed on biological samples and suggested that STIM should be performed for PIXE normalization. Further information deriving from the direct confrontation of AFM and STIM analysis could also be obtained, such as in situ measurement of specific gravity within cell compartments (nucleolus and cytoplasm).

  13. Measurement Invariance: A Foundational Principle for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    This article describes why measurement invariance is a critical issue to quantitative theory building within the field of human resource development. Readers will learn what measurement invariance is and how to test for its presence using techniques that are accessible to applied researchers. Using data from a LibQUAL+[TM] study of user…

  14. Quantitative measurement of feline colonic transit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krevsky, B.; Somers, M.B.; Maurer, A.H.

    1988-10-01

    Colonic transit scintigraphy, a method for quantitatively evaluating the movement of the fecal stream in vivo, was employed to evaluate colonic transit in the cat. Scintigraphy was performed in duplicate in five cats and repeated four times in one cat. After instillation of an 111In marker into the cecum through a surgically implanted silicone cecostomy tube, colonic movement of the instillate was quantitated for 24 h using gamma scintigraphy. Antegrade and retrograde motion of radionuclide was observed. The cecum and ascending colon emptied rapidly, with a half-emptying time of 1.68 +/- 0.56 h (mean +/- SE). After 24 h, 25.1 +/- 5.2% of the activity remained in the transverse colon. The progression of the geometric center was initially rapid, followed later by a delayed phase. Geometric center reproducibility was found to be high when analyzed using simple linear regression (slope = 0.92; r = 0.73; P less than 0.01). Atropine (0.1 mg/kg im) was found to delay cecum and ascending colon emptying and delay progression of the geometric center. These results demonstrate both 1) the ability of colonic transit scintigraphy to detect changes in transit induced by pharmacological manipulation and 2) the fact that muscarinic blockade inhibits antegrade transit of the fecal stream. We conclude that feline colonic transit may be studied in a quantitative and reproducible manner with colonic transit scintigraphy.
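    The geometric center used above is conventionally the activity-weighted mean of numbered colonic regions, from proximal to distal. Assuming that convention, a minimal sketch is given below; the region labels and count fractions are hypothetical.

        def geometric_center(region_fractions):
            """Activity-weighted mean region number; regions are numbered 1
            (most proximal) to N, and fractions are decay-corrected counts
            normalized to sum to 1."""
            return sum(i * f for i, f in enumerate(region_fractions, start=1))

        # Hypothetical distribution: cecum/ascending, transverse, descending,
        # rectosigmoid/excreted
        print(geometric_center([0.20, 0.25, 0.35, 0.20]))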

  15. Racial Differences in Quantitative Measures of Area and Volumetric Breast Density

    PubMed Central

    McCarthy, Anne Marie; Keller, Brad M.; Pantalone, Lauren M.; Hsieh, Meng-Kang; Synnestvedt, Marie; Conant, Emily F.; Armstrong, Katrina; Kontos, Despina

    2016-01-01

    Background: Increased breast density is a strong risk factor for breast cancer and also decreases the sensitivity of mammographic screening. The purpose of our study was to compare breast density for black and white women using quantitative measures. Methods: Breast density was assessed among 5282 black and 4216 white women screened using digital mammography. Breast Imaging-Reporting and Data System (BI-RADS) density was obtained from radiologists’ reports. Quantitative measures for dense area, area percent density (PD), dense volume, and volume percent density were estimated using validated, automated software. Breast density was categorized as dense or nondense based on BI-RADS categories or based on values above and below the median for quantitative measures. Logistic regression was used to estimate the odds of having dense breasts by race, adjusted for age, body mass index (BMI), age at menarche, menopause status, family history of breast or ovarian cancer, parity and age at first birth, and current hormone replacement therapy (HRT) use. All statistical tests were two-sided. Results: There was a statistically significant interaction of race and BMI on breast density. After accounting for age, BMI, and breast cancer risk factors, black women had statistically significantly greater odds of high breast density across all quantitative measures (eg, PD nonobese odds ratio [OR] = 1.18, 95% confidence interval [CI] = 1.02 to 1.37, P = .03, PD obese OR = 1.26, 95% CI = 1.04 to 1.53, P = .02). There was no statistically significant difference in BI-RADS density by race. Conclusions: After accounting for age, BMI, and other risk factors, black women had higher breast density than white women across all quantitative measures previously associated with breast cancer risk. These results may have implications for risk assessment and screening. PMID:27130893

  16. CONFOCAL MICROSCOPY SYSTEM PERFORMANCE: QA TESTS, QUANTITATION AND SPECTROSCOPY

    EPA Science Inventory

    Confocal Microscopy System Performance: QA tests, Quantitation and Spectroscopy.

    Robert M. Zucker (1) and Jeremy M. Lerner (2)
    (1) Reproductive Toxicology Division, National Health and Environmental Effects Research Laboratory, Office of Research Development, U.S. Environmen...

  17. Piezoelectric tuning fork biosensors for the quantitative measurement of biomolecular interactions

    NASA Astrophysics Data System (ADS)

    Gonzalez, Laura; Rodrigues, Mafalda; Benito, Angel Maria; Pérez-García, Lluïsa; Puig-Vidal, Manel; Otero, Jorge

    2015-12-01

    The quantitative measurement of biomolecular interactions is of great interest in molecular biology. Atomic force microscopy (AFM) has proved its capacity to act as a biosensor and determine the affinity between biomolecules of interest. Nevertheless, the detection scheme presents certain limitations when it comes to developing a compact biosensor. Recently, piezoelectric quartz tuning forks (QTFs) have been used as laser-free detection sensors for AFM. However, only a few studies along these lines have considered soft biological samples, and even fewer constitute quantified molecular recognition experiments. Here, we demonstrate the capacity of QTF probes to perform specific interaction measurements between biotin-streptavidin complexes in buffer solution. We propose in this paper a variant of dynamic force spectroscopy based on representing adhesion energies E (aJ) against pulling rates v (nm s-1). Our results are compared with conventional AFM measurements and show the great potential of these sensors in molecular interaction studies.

  18. Development of an exposure measurement database on five lung carcinogens (ExpoSYN) for quantitative retrospective occupational exposure assessment.

    PubMed

    Peters, Susan; Vermeulen, Roel; Olsson, Ann; Van Gelder, Rainer; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Williams, Nick; Woldbæk, Torill; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Dahmann, Dirk; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans

    2012-01-01

    SYNERGY is a large pooled analysis of case-control studies on the joint effects of occupational carcinogens and smoking in the development of lung cancer. A quantitative job-exposure matrix (JEM) will be developed to assign exposures to five major lung carcinogens [asbestos, chromium, nickel, polycyclic aromatic hydrocarbons (PAH), and respirable crystalline silica (RCS)]. We assembled an exposure database, called ExpoSYN, to enable such a quantitative exposure assessment. Existing exposure databases were identified and European and Canadian research institutes were approached to identify pertinent exposure measurement data. Results of individual air measurements were entered anonymized according to a standardized protocol. The ExpoSYN database currently includes 356 551 measurements from 19 countries. In total, 140 666 personal and 215 885 stationary data points were available. Measurements were distributed over the five agents as follows: RCS (42%), asbestos (20%), chromium (16%), nickel (15%), and PAH (7%). The measurement data cover the time period from 1951 to present. However, only a small portion of measurements (1.4%) were performed prior to 1975. The major contributing countries for personal measurements were Germany (32%), UK (22%), France (14%), and Norway and Canada (both 11%). ExpoSYN is a unique occupational exposure database with measurements from 18 European countries and Canada covering a time period of >50 years. This database will be used to develop a country-, job-, and time period-specific quantitative JEM. This JEM will enable data-driven quantitative exposure assessment in a multinational pooled analysis of community-based lung cancer case-control studies.

  19. Sensitive and quantitative measurement of gene expression directly from a small amount of whole blood.

    PubMed

    Zheng, Zhi; Luo, Yuling; McMaster, Gary K

    2006-07-01

    Accurate and precise quantification of mRNA in whole blood is made difficult by gene expression changes during blood processing, and by variations and biases introduced by sample preparations. We sought to develop a quantitative whole-blood mRNA assay that eliminates blood purification, RNA isolation, reverse transcription, and target amplification while providing high-quality data in an easy assay format. We performed single- and multiplex gene expression analysis with multiple hybridization probes to capture mRNA directly from blood lysate and used branched DNA to amplify the signal. The 96-well plate singleplex assay uses chemiluminescence detection, and the multiplex assay combines Luminex-encoded beads with fluorescent detection. The single- and multiplex assays could quantitatively measure as few as 6000 and 24,000 mRNA target molecules (0.01 and 0.04 amoles), respectively, in up to 25 microL of whole blood. Both formats had CVs < 10% and dynamic ranges of 3-4 logs. Assay sensitivities allowed quantitative measurement of gene expression in the minority of cells in whole blood. The signals from whole-blood lysate correlated well with signals from purified RNA of the same sample, and absolute mRNA quantification results from the assay were similar to those obtained by quantitative reverse transcription-PCR. Both single- and multiplex assay formats were compatible with common anticoagulants and PAXgene-treated samples; however, PAXgene preparations induced expression of known antiapoptotic genes in whole blood. Both the singleplex and the multiplex branched DNA assays can quantitatively measure mRNA expression directly from small volumes of whole blood. The assay offers an alternative to current technologies that depend on RNA isolation and is amenable to high-throughput gene expression analysis of whole blood.

  20. Leadership Strategies of Performance Measures Impacts in Public Sector Management: A National Content Analysis.

    ERIC Educational Resources Information Center

    Kubala, James Joseph

    A quantitative and qualitative study examined three leadership strategies found in performance-based management (human resource, scientific management and political strategies used in public sector management); a framework by which performance measurement (PM) supports leadership strategies; and how the strategies impact PM. It examined leadership…

  1. Effects of performance measure implementation on clinical manager and provider motivation.

    PubMed

    Damschroder, Laura J; Robinson, Claire H; Francis, Joseph; Bentley, Douglas R; Krein, Sarah L; Rosland, Ann-Marie; Hofer, Timothy P; Kerr, Eve A

    2014-12-01

    Clinical performance measurement has been a key element of efforts to transform the Veterans Health Administration (VHA). However, there are a number of signs that current performance measurement systems used within and outside the VHA may be reaching the point of maximum benefit to care and, in some settings, may be resulting in negative consequences to care, including overtreatment and diminished attention to patient needs and preferences. Our research group has been involved in a long-standing partnership with the office responsible for clinical performance measurement in the VHA to understand and develop potential strategies to mitigate the unintended consequences of measurement. Our aim was to understand how the implementation of diabetes performance measures (PMs) influences management actions and day-to-day clinical practice. This is a mixed methods study design based on quantitative administrative data to select study facilities and qualitative data from semi-structured interviews. Sixty-two network-level and facility-level executives, managers, front-line providers and staff participated in the study. Qualitative content analyses were guided by a team-based consensus approach using verbatim interview transcripts. A published interpretive motivation theory framework is used to describe potential contributions of local implementation strategies to unintended consequences of PMs. Implementation strategies used by management affect providers' response to PMs, which in turn potentially undermines provision of high-quality patient-centered care. These include: 1) feedback reports to providers that are dissociated from a realistic capability to address performance gaps; 2) evaluative criteria set by managers that are at odds with patient-centered care; and 3) pressure created by managers' narrow focus on gaps in PMs that is viewed as more punitive than motivating. Next steps include working with VHA leaders to develop and test implementation approaches to help

  2. Quantitative Measures for Evaluation of Ultrasound Therapies of the Prostate

    NASA Astrophysics Data System (ADS)

    Kobelevskiy, Ilya; Burtnyk, Mathieu; Bronskill, Michael; Chopra, Rajiv

    2010-03-01

    Development of non-invasive techniques for prostate cancer treatment requires implementation of quantitative measures for evaluation of the treatment results. In this paper, we introduce measures that estimate spatial targeting accuracy and potential thermal damage to the structures surrounding the prostate. The measures were developed for the technique of treating prostate cancer with transurethral ultrasound heating applicators guided by active MR temperature feedback. Variations of ultrasound element length and related MR imaging parameters such as MR slice thickness and update time were investigated by performing numerical simulations of the treatment on a database of ten patient prostate geometries segmented from clinical MR images. Susceptibility of each parameter configuration to uncertainty in MR temperature measurements was studied by adding noise to the temperature measurements. Gaussian noise with zero mean and standard deviation of 0, 1, 3 and 5° C was used to model different levels of uncertainty in MR temperature measurements. Results of simulations for each parameter configuration were averaged over the database of the ten prostate patient geometries studied. Results have shown that for an update time of 5 seconds both 3- and 5-mm elements achieve appropriate performance for temperature uncertainty up to 3° C, while temperature uncertainty of 5° C leads to noticeable reduction in spatial accuracy and increased risk of damaging the rectal wall. Ten-mm elements lacked spatial accuracy and had a higher risk of damaging the rectal wall compared to 3- and 5-mm elements, but were less sensitive to the level of temperature uncertainty. The effect of changing update time was studied for 5-mm elements. Simulations showed that update time had minor effects on all aspects of treatment for temperature uncertainty of 0° C and 1° C, while temperature uncertainties of 3° C and 5° C led to reduced spatial accuracy, increased potential damage to the rectal wall, and
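
    A minimal sketch of the noise model described above: zero-mean Gaussian noise with a chosen standard deviation is added to a simulated MR temperature map before it is used for treatment feedback. The array shape and the uniform 43 °C slice are hypothetical placeholders; only the noise levels (0, 1, 3 and 5 °C) come from the study.

    ```python
    import numpy as np

    # Noise levels (standard deviation in degrees C) evaluated in the study.
    NOISE_LEVELS_C = [0.0, 1.0, 3.0, 5.0]

    def add_mr_temperature_noise(temperature_map, sigma_c, rng=None):
        """Add zero-mean Gaussian noise to a simulated MR temperature map.

        temperature_map : ndarray of temperatures (degrees C) on the MR slice grid
        sigma_c         : standard deviation of the measurement uncertainty (degrees C)
        """
        rng = rng or np.random.default_rng()
        return temperature_map + rng.normal(loc=0.0, scale=sigma_c, size=temperature_map.shape)

    # Example: a hypothetical 64x64 temperature slice at a uniform 43 C.
    slice_c = np.full((64, 64), 43.0)
    for sigma in NOISE_LEVELS_C:
        noisy = add_mr_temperature_noise(slice_c, sigma)
        print(f"sigma={sigma:.0f} C -> observed std {noisy.std():.2f} C")
    ```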

  3. Coming up short on nonfinancial performance measurement.

    PubMed

    Ittner, Christopher D; Larcker, David F

    2003-11-01

    Companies in increasing numbers are measuring customer loyalty, employee satisfaction, and other nonfinancial areas of performance that they believe affect profitability. But they've failed to relate these measures to their strategic goals or establish a connection between activities undertaken and financial outcomes achieved. Failure to make such connections has led many companies to misdirect their investments and reward ineffective managers. Extensive field research now shows that businesses make some common mistakes when choosing, analyzing, and acting on their nonfinancial measures. Among these mistakes: They set the wrong performance targets because they focus too much on short-term financial results, and they use metrics that lack strong statistical validity and reliability. As a result, the companies can't demonstrate that improvements in nonfinancial measures actually affect their financial results. The authors lay out a series of steps that will allow companies to realize the genuine promise of nonfinancial performance measures. First, develop a model that proposes a causal relationship between the chosen nonfinancial drivers of strategic success and specific outcomes. Next, take careful inventory of all the data within your company. Then use established statistical methods for validating the assumed relationships and continue to test the model as market conditions evolve. Finally, base action plans on analysis of your findings, and determine whether those plans and their investments actually produce the desired results. Nonfinancial measures will offer little guidance unless you use a process for choosing and analyzing them that relies on sophisticated quantitative and qualitative inquiries into the factors actually contributing to economic results.
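
    The validation step the authors call for, testing whether a proposed nonfinancial driver actually explains a financial outcome, can be illustrated with an ordinary least-squares fit. The variable names and synthetic data below are purely illustrative assumptions; the article itself prescribes no particular software or dataset.

    ```python
    import numpy as np

    # Illustrative only: monthly customer-satisfaction scores (proposed driver)
    # and revenue growth (financial outcome) for a hypothetical business unit.
    rng = np.random.default_rng(0)
    satisfaction = rng.uniform(60, 95, size=36)                          # survey score, 0-100
    revenue_growth = 0.05 * satisfaction + rng.normal(0, 1.5, size=36)   # percent

    # Fit outcome = a + b * driver and report the strength of the relationship.
    b, a = np.polyfit(satisfaction, revenue_growth, deg=1)
    r = np.corrcoef(satisfaction, revenue_growth)[0, 1]
    print(f"slope = {b:.3f} per point of satisfaction, r^2 = {r**2:.2f}")
    ```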

  4. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  5. Prediction of Coronal Mass Ejections from Vector Magnetograms: Quantitative Measures as Predictors

    NASA Astrophysics Data System (ADS)

    Falconer, D. A.; Moore, R. L.; Gary, G. A.

    2001-05-01

    In a pilot study of 4 active regions (Falconer, D.A. 2001, JGR, in press), we derived two quantitative measures of an active region's global nonpotentiality from the region's vector magnetogram, 1) the net current (IN), and 2) the length of the strong-shear, strong-field main neutral line (LSS), and used these two as measures of the CME productivity of the active regions. We compared the global nonpotentiality measures to the active regions' CME productivity determined from GOES and Yohkoh/SXT observations. We found that two of the active regions were highly globally nonpotential and were CME productive, while the other two active regions had little global nonpotentiality and produced no CMEs. At the Fall 2000 AGU (Falconer, Moore, & Gary, 2000, EOS 81, 48 F998), we reported on an expanded study (12 active regions and 17 magnetograms) in which we evaluated four quantitative global measures of an active region's magnetic field and compared these measures with the CME productivity. The four global measures (all derived from MSFC vector magnetograms) included our two previous measures (IN and LSS) as well as two new ones, the total magnetic flux (Φ) (a measure of an active region's size), and the normalized twist (α = μIN/Φ). We found that the three measures of global nonpotentiality (IN, LSS, α) were all well correlated (>99% confidence level) with an active region's CME productivity within ±2 days of the day of the magnetogram. We will now report on our findings of how good our quantitative measures are as predictors of active-region CME productivity, using only CMEs that occurred after the magnetogram. We report the preliminary skill test of these quantitative measures as predictors. We compare the CME prediction success of our quantitative measures to the CME prediction success based on an active region's past CME productivity. We examine the cases of the handful of false positives and false negatives to look for improvements to our predictors. This work is
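
    A minimal sketch of the normalized-twist measure named above, assuming the net current IN and total flux Φ have already been derived from the vector magnetogram. The numerical values are hypothetical examples for a single active region, not values from the abstract.

    ```python
    import math

    MU_0 = 4.0e-7 * math.pi  # vacuum permeability, H/m

    def normalized_twist(net_current_amps, total_flux_wb):
        """alpha = mu * I_N / Phi, the normalized-twist measure named in the abstract."""
        return MU_0 * net_current_amps / total_flux_wb

    # Hypothetical magnetogram-derived values for a single active region.
    i_n = 5.0e12   # net current I_N, amperes
    phi = 1.0e14   # total magnetic flux Phi, webers (1e22 Mx)
    alpha = normalized_twist(i_n, phi)
    print(f"alpha = {alpha:.2e} 1/m")   # ~6e-8 1/m for these example numbers
    ```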

  6. MR-ARFI-based method for the quantitative measurement of tissue elasticity: application for monitoring HIFU therapy

    NASA Astrophysics Data System (ADS)

    Vappou, Jonathan; Bour, Pierre; Marquet, Fabrice; Ozenne, Valery; Quesson, Bruno

    2018-05-01

    Monitoring thermal therapies through medical imaging is essential in order to ensure that they are safe, efficient and reliable. In this paper, we propose a new approach, halfway between MR acoustic radiation force imaging (MR-ARFI) and MR elastography (MRE), allowing for the quantitative measurement of the elastic modulus of tissue in a highly localized manner. It relies on the simulation of the MR-ARFI profile, which depends on tissue biomechanical properties, and on the identification of tissue elasticity through the fitting of experimental displacement images measured using rapid MR-ARFI. This method was specifically developed to monitor MR-guided high intensity focused ultrasound (MRgHIFU) therapy. Elasticity changes were followed during HIFU ablations (N  =  6) performed ex vivo in porcine muscle samples, and were compared to temperature changes measured by MR-thermometry. Shear modulus was found to increase consistently and steadily a few seconds after the heating started, and such changes were found to be irreversible. The shear modulus was found to increase from 1.49  ±  0.48 kPa (before ablation) to 3.69  ±  0.93 kPa (after ablation and cooling). Thanks to its ability to perform quantitative elasticity measurements in a highly localized manner around the focal spot, this method proved to be particularly attractive for monitoring HIFU ablations.
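
    The identification step described above, fitting a simulated displacement profile to the measured MR-ARFI displacement image to recover the shear modulus, can be sketched as a generic least-squares fit. The Gaussian forward model, the fixed push-amplitude scaling, and all numerical values are illustrative assumptions, not the biomechanical model used in the paper.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    PUSH_SCALE_UM_KPA = 30.0   # assumed known radiation-force scaling (illustrative)

    def displacement_model(x_mm, shear_modulus_kpa):
        """Toy forward model: focal displacement amplitude scales inversely with
        the shear modulus, with a fixed Gaussian profile around the focal spot."""
        return (PUSH_SCALE_UM_KPA / shear_modulus_kpa) * np.exp(-(x_mm / 2.0) ** 2)

    # Synthetic "measured" MR-ARFI profile generated with G = 3.7 kPa plus noise.
    x = np.linspace(-6.0, 6.0, 61)                      # mm across the focal spot
    rng = np.random.default_rng(1)
    measured = displacement_model(x, 3.7) + rng.normal(0.0, 0.2, x.size)

    (g_fit,), _ = curve_fit(displacement_model, x, measured, p0=[1.0])
    print(f"identified shear modulus ~ {g_fit:.2f} kPa")
    ```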

  7. The predictive value of quantitative fibronectin testing in combination with cervical length measurement in symptomatic women.

    PubMed

    Bruijn, Merel M C; Kamphuis, Esme I; Hoesli, Irene M; Martinez de Tejada, Begoña; Loccufier, Anne R; Kühnert, Maritta; Helmer, Hanns; Franz, Marie; Porath, Martina M; Oudijk, Martijn A; Jacquemyn, Yves; Schulzke, Sven M; Vetter, Grit; Hoste, Griet; Vis, Jolande Y; Kok, Marjolein; Mol, Ben W J; van Baaren, Gert-Jan

    2016-12-01

    The combination of the qualitative fetal fibronectin test and cervical length measurement has a high negative predictive value for preterm birth within 7 days; however, positive prediction is poor. A new bedside quantitative fetal fibronectin test showed potential additional value over the conventional qualitative test, but there is limited evidence on the combination with cervical length measurement. The purpose of this study was to compare quantitative fetal fibronectin and qualitative fetal fibronectin testing in the prediction of spontaneous preterm birth within 7 days in symptomatic women who undergo cervical length measurement. We performed a European multicenter cohort study in 10 perinatal centers in 5 countries. Women between 24 and 34 weeks of gestation with signs of active labor and intact membranes underwent quantitative fibronectin testing and cervical length measurement. We assessed the risk of preterm birth within 7 days in predefined strata based on fibronectin concentration and cervical length. Of 455 women who were included in the study, 48 women (11%) delivered within 7 days. A combination of cervical length and qualitative fibronectin resulted in the identification of 246 women who were at low risk: 164 women with a cervix between 15 and 30 mm and a negative fibronectin test (<50 ng/mL; preterm birth rate, 2%) and 82 women with a cervix at >30 mm (preterm birth rate, 2%). Use of quantitative fibronectin alone resulted in a predicted risk of preterm birth within 7 days that ranged from 2% in the group with the lowest fibronectin level (<10 ng/mL) to 38% in the group with the highest fibronectin level (>500 ng/mL), with similar accuracy as that of the combination of cervical length and qualitative fibronectin. Combining cervical length and quantitative fibronectin resulted in the identification of an additional 19 women at low risk (preterm birth rate, 5%), using a threshold of 10 ng/mL in women with a cervix at <15 mm, and 6 women at high risk
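
    A schematic of the risk stratification reported above, combining cervical length with fibronectin. Only the thresholds explicitly quoted in the abstract (cervix >30 mm; 15-30 mm with qualitative fFN <50 ng/mL; <15 mm with quantitative fFN <10 ng/mL) are encoded; the labels and the fallback category are simplifications of the study's full set of predefined strata.

    ```python
    def risk_stratum(cervical_length_mm, fibronectin_ng_ml):
        """Classify a symptomatic woman as low or elevated risk of spontaneous
        preterm birth within 7 days, following the thresholds quoted in the abstract."""
        if cervical_length_mm > 30:
            return "low risk"        # ~2% preterm birth rate
        if 15 <= cervical_length_mm <= 30 and fibronectin_ng_ml < 50:
            return "low risk"        # ~2% preterm birth rate
        if cervical_length_mm < 15 and fibronectin_ng_ml < 10:
            return "low risk"        # ~5% preterm birth rate
        return "elevated risk"       # manage as higher risk

    print(risk_stratum(32, 120))   # low risk
    print(risk_stratum(12, 600))   # elevated risk
    ```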

  8. Quantitative measures detect sensory and motor impairments in multiple sclerosis.

    PubMed

    Newsome, Scott D; Wang, Joseph I; Kang, Jonathan Y; Calabresi, Peter A; Zackowski, Kathleen M

    2011-06-15

    Sensory and motor dysfunction in multiple sclerosis (MS) is often assessed with rating scales which rely heavily on clinical judgment. Quantitative devices may be more precise than rating scales. To quantify lower extremity sensorimotor measures in individuals with MS, evaluate the extent to which they can detect functional systems impairments, and determine their relationship to global disability measures. We tested 145 MS subjects and 58 controls. Vibration thresholds were quantified using a Vibratron-II device. Strength was quantified by a hand-held dynamometer. We also recorded Expanded Disability Status Scale (EDSS) and Timed 25-Foot Walk (T25FW). t-tests and Wilcoxon-rank sum were used to compare group data. Spearman correlations were used to assess relationships between each measure. We also used a step-wise linear regression model to determine how much the quantitative measures explain the variance in the respective functional systems scores (FSS). EDSS scores ranged from 0-7.5, mean disease duration was 10.4 ± 9.6 years, and 66% were female. In relapsing-remitting MS, but not progressive MS, poorer vibration sensation correlated with a worse EDSS score, whereas progressive groups' ankle/hip strength changed significantly with EDSS progression. Interestingly, not only did sensorimotor measures significantly correlate with global disability measures (i.e., EDSS), but they had improved sensitivity, as they detected impairments in up to 32% of MS subjects with normal sensory and pyramidal FSS. Sensory and motor deficits in MS can be quantified using clinically accessible tools and distinguish differences among MS subtypes. We show that quantitative sensorimotor measures are more sensitive than FSS from the EDSS. These tools have the potential to be used as clinical outcome measures in practice and for future MS clinical trials of neurorehabilitative and neuroreparative interventions. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Quantitative measures detect sensory and motor impairments in multiple sclerosis

    PubMed Central

    Newsome, Scott D.; Wang, Joseph I.; Kang, Jonathan Y.; Calabresi, Peter A.; Zackowski, Kathleen M.

    2011-01-01

    Background Sensory and motor dysfunction in multiple sclerosis (MS) is often assessed with rating scales which rely heavily on clinical judgment. Quantitative devices may be more precise than rating scales. Objective To quantify lower extremity sensorimotor measures in individuals with MS, evaluate the extent to which they can detect functional systems impairments, and determine their relationship to global disability measures. Methods We tested 145 MS subjects and 58 controls. Vibration thresholds were quantified using a Vibratron-II device. Strength was quantified by a hand-held dynamometer. We also recorded Expanded Disability Status Scale (EDSS) and timed 25-foot walk (T25FW). T-tests and Wilcoxon-rank sum were used to compare group data. Spearman correlations were used to assess relationships between each measure. We also used a step-wise linear regression model to determine how much the quantitative measures explain the variance in the respective functional systems scores (FSS). Results EDSS scores ranged from 0-7.5, mean disease duration was 10.4±9.6 years, and 66% were female. In RRMS, but not progressive MS, poorer vibration sensation correlated with a worse EDSS score, whereas progressive groups’ ankle/hip strength changed significantly with EDSS progression. Interestingly, not only did sensorimotor measures significantly correlate with global disability measures (EDSS), but they had improved sensitivity, as they detected impairments in up to 32% of MS subjects with normal sensory FSS. Conclusions Sensory and motor deficits can be quantified using clinically accessible tools and distinguish differences among MS subtypes. We show that quantitative sensorimotor measures are more sensitive than FSS from the EDSS. These tools have the potential to be used as clinical outcome measures in practice and for future MS clinical trials of neurorehabilitative and neuroreparative interventions. PMID:21458828

  10. A Quantitative Assessment of Student Performance and Examination Format

    ERIC Educational Resources Information Center

    Davison, Christopher B.; Dustova, Gandzhina

    2017-01-01

    This research study describes the correlations between student performance and examination format in a higher education teaching and research institution. The researchers employed a quantitative, correlational methodology utilizing linear regression analysis. The data was obtained from undergraduate student test scores over a three-year time span.…

  11. Quantitative Ultrasound for Measuring Obstructive Severity in Children with Hydronephrosis.

    PubMed

    Cerrolaza, Juan J; Peters, Craig A; Martin, Aaron D; Myers, Emmarie; Safdar, Nabile; Linguraru, Marius George

    2016-04-01

    We define sonographic biomarkers for hydronephrotic renal units that can predict the necessity of diuretic nuclear renography. We selected a cohort of 50 consecutive patients with hydronephrosis of varying severity in whom 2-dimensional sonography and diuretic mercaptoacetyltriglycine renography had been performed. A total of 131 morphological parameters were computed using quantitative image analysis algorithms. Machine learning techniques were then applied to identify ultrasound based safety thresholds that agreed with the t½ for washout. A best fit model was then derived for each threshold level of t½ that would be clinically relevant at 20, 30 and 40 minutes. Receiver operating characteristic curve analysis was performed. Sensitivity, specificity and area under the receiver operating characteristic curve were determined. Improvement obtained by the quantitative imaging method compared to the Society for Fetal Urology grading system and the hydronephrosis index was statistically verified. For the 3 thresholds considered and at 100% sensitivity the specificities of the quantitative imaging method were 94%, 70% and 74%, respectively. Corresponding area under the receiver operating characteristic curve values were 0.98, 0.94 and 0.94, respectively. Improvement obtained by the quantitative imaging method over the Society for Fetal Urology grade and hydronephrosis index was statistically significant (p <0.05 in all cases). Quantitative imaging analysis of renal sonograms in children with hydronephrosis can identify thresholds of clinically significant washout times with 100% sensitivity to decrease the number of diuretic renograms in up to 62% of children. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
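
    The threshold selection described above, choosing an ultrasound-based score cut-off that keeps 100% sensitivity for a prolonged washout t½ while maximizing specificity, can be sketched with a standard ROC analysis. The score and label arrays below are synthetic placeholders; the 131 morphological parameters and the fitted model from the study are not reproduced here.

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    # Synthetic example: higher score = more obstructed; label 1 = t1/2 beyond threshold.
    rng = np.random.default_rng(2)
    labels = np.concatenate([np.zeros(30, dtype=int), np.ones(20, dtype=int)])
    scores = np.concatenate([rng.normal(0.3, 0.1, 30), rng.normal(0.7, 0.1, 20)])

    fpr, tpr, thresholds = roc_curve(labels, scores)
    auc = roc_auc_score(labels, scores)

    # Most specific cut-off that still catches every obstructed unit (sensitivity = 1).
    mask = tpr >= 1.0
    best = np.argmax(1 - fpr[mask])
    print(f"AUC={auc:.2f}, threshold={thresholds[mask][best]:.2f}, "
          f"specificity at 100% sensitivity={1 - fpr[mask][best]:.2f}")
    ```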

  12. Quantitative Measures of Swallowing Deficits in Patients With Parkinson's Disease.

    PubMed

    Ellerston, Julia K; Heller, Amanda C; Houtz, Daniel R; Kendall, Katherine A

    2016-05-01

    Dysphagia and associated aspiration pneumonia are commonly reported sequelae of Parkinson's disease (PD). Previous studies of swallowing in patients with PD have described prolonged pharyngeal transit time, delayed onset of pharyngeal transit, cricopharyngeal (CP) achalasia, reduced pharyngeal constriction, and slowed hyolaryngeal elevation. These studies were completed using inconsistent evaluation methodology, reliance on qualitative analysis, and a lack of a large control group, resulting in concerns regarding diagnostic precision. The purpose of this study was to investigate swallowing function in patients with PD using a norm-referenced, quantitative approach. This retrospective study includes 34 patients with a diagnosis of PD referred to a multidisciplinary voice and swallowing clinic. Modified barium swallow studies were performed using quantitative measures of pharyngeal transit time, hyoid displacement, CP sphincter opening, area of the pharynx at maximal constriction, and timing of laryngeal vestibule closure relative to bolus arrival at the CP sphincter. Reduced pharyngeal constriction was found in 30.4%, and a delay in airway closure relative to arrival of the bolus at the CP sphincter was the most common abnormality, present in 62% of patients. Previously reported findings of prolonged pharyngeal transit, poor hyoid elevation, and CP achalasia were not identified as prominent features. © The Author(s) 2015.

  13. Cleavage Entropy as Quantitative Measure of Protease Specificity

    PubMed Central

    Fuchs, Julian E.; von Grafenstein, Susanne; Huber, Roland G.; Margreiter, Michael A.; Spitzer, Gudrun M.; Wallnoefer, Hannes G.; Liedl, Klaus R.

    2013-01-01

    A purely information theory-guided approach to quantitatively characterize protease specificity is established. We calculate an entropy value for each protease subpocket based on sequences of cleaved substrates extracted from the MEROPS database. We compare our results with known subpocket specificity profiles for individual proteases and protease groups (e.g. serine proteases, metallo proteases) and reflect them quantitatively. Summation of subpocket-wise cleavage entropy contributions yields a measure for overall protease substrate specificity. This total cleavage entropy allows ranking of different proteases with respect to their specificity, separating unspecific digestive enzymes showing high total cleavage entropy from specific proteases involved in signaling cascades. The development of a quantitative cleavage entropy score allows an unbiased comparison of subpocket-wise and overall protease specificity. Thus, it enables assessment of relative importance of physicochemical and structural descriptors in protease recognition. We present an exemplary application of cleavage entropy in tracing substrate specificity in protease evolution. This highlights the wide range of substrate promiscuity within homologue proteases and hence the heavy impact of a limited number of mutations on individual substrate specificity. PMID:23637583
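
    A minimal sketch of the subpocket-wise entropy described above: for each position (subpocket) in a set of aligned cleavage-site sequences, a Shannon entropy is computed from the observed amino-acid frequencies, and the positional values are summed into a total cleavage entropy. The toy substrate list and the base-2 logarithm are assumptions for illustration; they are not taken from the MEROPS-based analysis itself.

    ```python
    from collections import Counter
    from math import log

    def subpocket_entropies(cleavage_sites):
        """Shannon entropy (in bits) at each aligned position of the cleavage sites."""
        length = len(cleavage_sites[0])
        entropies = []
        for pos in range(length):
            counts = Counter(seq[pos] for seq in cleavage_sites)
            total = sum(counts.values())
            h = -sum((c / total) * log(c / total, 2) for c in counts.values())
            entropies.append(h)
        return entropies

    # Toy aligned P4-P4' cleavage-site sequences for a hypothetical protease.
    sites = ["IETDSGVA", "LETDAGVA", "VETDSGLA", "IETDTGVA"]
    per_pocket = subpocket_entropies(sites)
    total_cleavage_entropy = sum(per_pocket)
    print(per_pocket, total_cleavage_entropy)
    ```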

  14. Quantitation of absorbed or deposited materials on a substrate that measures energy deposition

    DOEpatents

    Grant, Patrick G.; Bakajin, Olgica; Vogel, John S.; Bench, Graham

    2005-01-18

    This invention provides a system and method for measuring an energy differential that correlates to quantitative measurement of the amount (mass) of an applied localized material. Such a system and method remains compatible with other methods of analysis, such as, for example, quantitating the elemental or isotopic content, identifying the material, or using the material in biochemical analysis.

  15. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause for fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to determine and identify the quantitative standards for assessing upset recovery performance. This review contains current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time and recovery time and whether that input was correct or incorrect. Other metrics included are: the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum air speed, maximum bank angle and maximum g loading are reviewed as well.

  16. Reproducibility of Quantitative Structural and Physiological MRI Measurements

    DTIC Science & Technology

    2017-08-09

    [Report documentation page residue; abstract not available. Journal article covering January 2015 – July 2017; performing organization: USAF School of Aerospace Medicine, Aeromedical Research Department.]

  17. Novel method for quantitative ANA measurement using near-infrared imaging.

    PubMed

    Peterson, Lisa K; Wells, Daniel; Shaw, Laura; Velez, Maria-Gabriela; Harbeck, Ronald; Dragone, Leonard L

    2009-09-30

    Antinuclear antibodies (ANA) have been detected in patients with systemic rheumatic diseases and are used in the screening and/or diagnosis of autoimmunity in patients as well as mouse models of systemic autoimmunity. Indirect immunofluorescence (IIF) on HEp-2 cells is the gold standard for ANA screening. However, its usefulness is limited in diagnosis, prognosis and monitoring of disease activity due to the lack of standardization in performing the technique, subjectivity in interpreting the results and the fact that it is only semi-quantitative. Various immunological techniques have been developed in an attempt to improve upon the method to quantify ANA, including enzyme-linked immunosorbent assays (ELISAs), line immunoassays (LIAs), multiplexed bead immunoassays and IIF on substrates other than HEp-2 cells. Yet IIF on HEp-2 cells remains the most common screening method for ANA. In this study, we describe a simple quantitative method to detect ANA which combines IIF on HEp-2 coated slides with analysis using a near-infrared imaging (NII) system. Using NII to determine ANA titer, 86.5% (32 of 37) of the titers for human patient samples were within 2 dilutions of those determined by IIF, which is the acceptable range for proficiency testing. Combining an initial screening for nuclear staining using microscopy with titration by NII resulted in 97.3% (36 of 37) of the titers detected to be within two dilutions of those determined by IIF. The NII method for quantitative ANA measurements using serum from both patients and mice with autoimmunity provides a fast, relatively simple, objective, sensitive and reproducible assay, which could easily be standardized for comparison between laboratories.

  18. Distance measures and optimization spaces in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Rode, Karyn D.; Budge, Suzanne M.; Thiemann, Gregory W.

    2015-01-01

    Quantitative fatty acid signature analysis has become an important method of diet estimation in ecology, especially marine ecology. Controlled feeding trials to validate the method and estimate the calibration coefficients necessary to account for differential metabolism of individual fatty acids have been conducted with several species from diverse taxa. However, research into potential refinements of the estimation method has been limited. We compared the performance of the original method of estimating diet composition with that of five variants based on different combinations of distance measures and calibration-coefficient transformations between prey and predator fatty acid signature spaces. Fatty acid signatures of pseudopredators were constructed using known diet mixtures of two prey data sets previously used to estimate the diets of polar bears Ursus maritimus and gray seals Halichoerus grypus, and their diets were then estimated using all six variants. In addition, previously published diets of Chukchi Sea polar bears were re-estimated using all six methods. Our findings reveal that the selection of an estimation method can meaningfully influence estimates of diet composition. Among the pseudopredator results, which allowed evaluation of bias and precision, differences in estimator performance were rarely large, and no one estimator was universally preferred, although estimators based on the Aitchison distance measure tended to have modestly superior properties compared to estimators based on the Kullback-Leibler distance measure. However, greater differences were observed among estimated polar bear diets, most likely due to differential estimator sensitivity to assumption violations. Our results, particularly the polar bear example, suggest that additional research into estimator performance and model diagnostics is warranted.
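
    The two distance measures compared above can be written compactly for a pair of fatty acid signatures (compositional vectors that sum to 1). The Kullback-Leibler form shown is the symmetrised variant commonly used in the QFASA literature, and the small example signatures are invented; the full diet-estimation optimization in which these distances are embedded is not reproduced here.

    ```python
    import numpy as np

    def kullback_leibler_distance(x, y):
        """Symmetrised KL-type distance between two fatty acid signatures."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        return np.sum((x - y) * np.log(x / y))

    def aitchison_distance(x, y):
        """Aitchison distance between two compositional signatures."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        lx = np.log(x) - np.mean(np.log(x))
        ly = np.log(y) - np.mean(np.log(y))
        return np.sqrt(np.sum((lx - ly) ** 2))

    sig_a = [0.40, 0.30, 0.20, 0.10]   # hypothetical predator signature
    sig_b = [0.35, 0.33, 0.22, 0.10]   # hypothetical modeled prey mixture
    print(aitchison_distance(sig_a, sig_b), kullback_leibler_distance(sig_a, sig_b))
    ```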

  19. Initial Description of a Quantitative, Cross-Species (Chimpanzee-Human) Social Responsiveness Measure

    ERIC Educational Resources Information Center

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E.; Constantino, John N.; Povinelli, Daniel J.; Pruett, John R., Jr.

    2011-01-01

    Objective: Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species…

  20. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauld, Ian C.; Hu, Jianwei; De Baere, P.

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative

  1. Clinical performance of the LCx HCV RNA quantitative assay.

    PubMed

    Bertuzis, Rasa; Hardie, Alison; Hottentraeger, Barbara; Izopet, Jacques; Jilg, Wolfgang; Kaesdorf, Barbara; Leckie, Gregor; Leete, Jean; Perrin, Luc; Qiu, Chunfu; Ran, Iris; Schneider, George; Simmonds, Peter; Robinson, John

    2005-02-01

    This study was conducted to assess the performance of the Abbott laboratories LCx HCV RNA Quantitative Assay (LCx assay) in the clinical setting. Four clinical laboratories measured LCx assay precision, specificity, and linearity. In addition, a method comparison was conducted between the LCx assay and the Roche HCV Amplicor Monitor, version 2.0 (Roche Monitor 2.0) and the Bayer VERSANT HCV RNA 3.0 Assay (Bayer bDNA 3.0) quantitative assays. For precision, the observed LCx assay intra-assay standard deviation (S.D.) was 0.060-0.117 log IU/ml, the inter-assay S.D. was 0.083-0.133 log IU/ml, the inter-lot S.D. was 0.105-0.177 log IU/ml, the inter-site S.D. was 0.099-0.190 log IU/ml, and the total S.D. was 0.113-0.190 log IU/ml. The specificity of the LCx assay was 99.4% (542/545; 95% CI, 98.4-99.9%). For linearity, the mean pooled LCx assay results were linear (r=0.994) over the range of the panel (2.54-5.15 log IU/ml). A method comparison demonstrated a correlation coefficient of 0.881 between the LCx assay and Roche Monitor 2.0, 0.872 between the LCx assay and Bayer bDNA 3.0, and 0.870 between Roche Monitor 2.0 and Bayer bDNA 3.0. The mean LCx assay result was 0.04 log IU/ml (95% CI, -0.08, 0.01) lower than the mean Roche Monitor 2.0 result, but 0.57 log IU/ml (95% CI, 0.53, 0.61) higher than the mean Bayer bDNA 3.0 result. The mean Roche Monitor 2.0 result was 0.60 log IU/ml (95% CI, 0.56, 0.65) higher than the mean Bayer bDNA 3.0 result. The LCx assay quantitated genotypes 1-4 with statistical equivalency. The vast majority (98.9%, 278/281) of paired LCx assay-Roche Monitor 2.0 specimen results were within 1 log IU/ml. Similarly, 86.6% (240/277) of paired LCx assay and Bayer bDNA 3.0 specimen results were within 1 log, as were 85.6% (237/277) of paired Roche Monitor 2.0 and Bayer specimen results. These data demonstrate that the LCx assay may be used for quantitation of HCV RNA in HCV-infected individuals.

  2. Academic Performance of Business Students in Quantitative Courses: A Study in the Faculty of Business and Economics at the UAE University

    ERIC Educational Resources Information Center

    Yousef, Darwish Abdulrahman

    2011-01-01

    This article aims to investigate the academic performance (measured by quality points) of the business students in quantitative courses. It also explores the impact of a number of factors on the academic performance of business students in these courses. A random sample of 750 third- and fourth-level business students at the United Arab Emirates…

  3. Quantitative and Qualitative Change in Children's Mental Rotation Performance

    ERIC Educational Resources Information Center

    Geiser, Christian; Lehmann, Wolfgang; Corth, Martin; Eid, Michael

    2008-01-01

    This study investigated quantitative and qualitative changes in mental rotation performance and solution strategies with a focus on sex differences. German children (N = 519) completed the Mental Rotations Test (MRT) in the 5th and 6th grades (interval: one year; age range at time 1: 10-11 years). Boys on average outperformed girls on both…

  4. Improvement of Quantitative Measurements in Multiplex Proteomics Using High-Field Asymmetric Waveform Spectrometry.

    PubMed

    Pfammatter, Sibylle; Bonneil, Eric; Thibault, Pierre

    2016-12-02

    Quantitative proteomics using isobaric reagent tandem mass tags (TMT) or isobaric tags for relative and absolute quantitation (iTRAQ) provides a convenient approach to compare changes in protein abundance across multiple samples. However, the analysis of complex protein digests by isobaric labeling can be undermined by the relatively large proportion of co-selected peptide ions that lead to distorted reporter ion ratios and affect the accuracy and precision of quantitative measurements. Here, we investigated the use of high-field asymmetric waveform ion mobility spectrometry (FAIMS) in proteomic experiments to reduce sample complexity and improve protein quantification using TMT isobaric labeling. LC-FAIMS-MS/MS analyses of human and yeast protein digests led to significant reductions in interfering ions, which increased the number of quantifiable peptides by up to 68% while significantly improving the accuracy of abundance measurements compared to that with conventional LC-MS/MS. The improvement in quantitative measurements using FAIMS is further demonstrated for the temporal profiling of protein abundance of HEK293 cells following heat shock treatment.

  5. A potential quantitative method for assessing individual tree performance

    Treesearch

    Lance A. Vickers; David R. Larsen; Daniel C. Dey; John M. Kabrick; Benjamin O. Knapp

    2014-01-01

    By what standard should a tree be judged? This question, perhaps unknowingly, is posed almost daily by practicing foresters. Unfortunately, there are few cases in which clearly defined quantitative (i.e., directly measurable) references have been established in forestry. A lack of common references may be an unnecessary source of error in silvicultural application and...

  6. Business Performance Measurements in Asset Management with the Support of Big Data Technologies

    NASA Astrophysics Data System (ADS)

    Campos, Jaime; Sharma, Pankaj; Jantunen, Erkki; Baglee, David; Fumagalli, Luca

    2017-09-01

    The paper reviews performance measurement in the domain of interest, and the important data in asset management are further discussed. The importance and characteristics of today's ICT capabilities are also described. The role of new concepts such as big data and data mining analytical technologies in managing performance measurements in asset management is discussed in detail. The authors consequently suggest the use of a modified Balanced Scorecard methodology highlighting both quantitative and qualitative aspects, which is crucial for optimal use of the big data approach and technologies.

  7. Quantitative approach for optimizing e-beam condition of photoresist inspection and measurement

    NASA Astrophysics Data System (ADS)

    Lin, Chia-Jen; Teng, Chia-Hao; Cheng, Po-Chung; Sato, Yoshishige; Huang, Shang-Chieh; Chen, Chu-En; Maruyama, Kotaro; Yamazaki, Yuichiro

    2018-03-01

    Tight process margins in advanced semiconductor technology nodes are controlled by e-beam metrology and e-beam inspection systems based on scanning electron microscopy (SEM) images. With SEM, large-area images of high quality are required to collect massive amounts of data for metrology and to detect defects over a large area for inspection. Although photoresist patterning is one of the critical processes in semiconductor device manufacturing, observing photoresist patterns by SEM is crucial yet troublesome, especially for large images. The charging effect caused by e-beam irradiation of the photoresist pattern degrades image quality; it affects CD variation in the metrology system and makes it difficult to continue defect inspection over a long time for a large area. In this study, we established a quantitative approach for optimizing the e-beam condition on photoresist patterns with the "Die to Database" algorithm of NGR3500 to minimize the charging effect, and we enhanced the performance of measurement and inspection on photoresist patterns by using the optimized e-beam condition. NGR3500 is a geometry verification system based on a "Die to Database" algorithm that compares SEM images with design data [1]. By comparing the SEM image and the design data, key performance indicators (KPIs) of the SEM image such as "Sharpness", "S/N", "Gray level variation in FOV", and "Image shift" can be retrieved. These KPIs were analyzed for different e-beam conditions, which consist of "Landing Energy", "Probe Current", "Scanning Speed", and "Scanning Method", and the best e-beam condition could be achieved with maximum image quality, maximum scanning speed, and minimum image shift. With this quantitative approach to optimizing the e-beam condition, we could observe the dependency of photoresist charging on the SEM condition. By using the optimized e-beam condition, measurement could be continued stably on photoresist patterns for over 24 hours. KPIs of the SEM image proved image quality during measurement and
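
    For illustration only, the sketch below computes generic stand-ins for the image KPIs named above (sharpness from the mean gradient magnitude, S/N from mean over standard deviation, gray-level variation, and image shift from a cross-correlation peak). These definitions are assumptions for a self-contained example and are not the "Die to Database" KPI formulas of NGR3500.

    ```python
    import numpy as np

    def image_kpis(img, ref=None):
        """Generic illustrations of SEM-image KPIs: sharpness, S/N, gray-level
        variation across the FOV, and image shift relative to a reference frame."""
        img = img.astype(float)
        gy, gx = np.gradient(img)
        kpis = {
            "sharpness": float(np.mean(np.hypot(gx, gy))),
            "snr": float(img.mean() / img.std()),
            "gray_level_variation": float(img.max() - img.min()),
        }
        if ref is not None:
            # Shift estimate from the peak of the FFT-based cross-correlation.
            corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            kpis["image_shift"] = (int(dy), int(dx))
        return kpis

    frame = np.random.default_rng(3).normal(128, 10, (256, 256))
    print(image_kpis(frame, ref=frame))   # shift of a frame against itself is (0, 0)
    ```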

  8. Correlation of visual in vitro cytotoxicity ratings of biomaterials with quantitative in vitro cell viability measurements.

    PubMed

    Bhatia, Sujata K; Yetter, Ann B

    2008-08-01

    Medical devices and implanted biomaterials are often assessed for biological reactivity using visual scores of cell-material interactions. In such testing, biomaterials are assigned cytotoxicity ratings based on visual evidence of morphological cellular changes, including cell lysis, rounding, spreading, and proliferation. For example, ISO 10993 cytotoxicity testing of medical devices allows the use of a visual grading scale. The present study compared visual in vitro cytotoxicity ratings to quantitative in vitro cytotoxicity measurements for biomaterials to determine the level of correlation between visual scoring and a quantitative cell viability assay. Biomaterials representing a spectrum of biological reactivity levels were evaluated, including organo-tin polyvinylchloride (PVC; a known cytotoxic material), ultra-high molecular weight polyethylene (a known non-cytotoxic material), and implantable tissue adhesives. Each material was incubated in direct contact with mouse 3T3 fibroblast cell cultures for 24 h. Visual scores were assigned to the materials using a 5-point rating scale; the scorer was blinded to the material identities. Quantitative measurements of cell viability were performed using a 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) colorimetric assay; again, the assay operator was blinded to material identities. The investigation revealed a high degree of correlation between visual cytotoxicity ratings and quantitative cell viability measurements; a Pearson's correlation gave a correlation coefficient of 0.90 between the visual cytotoxicity score and the percent viable cells. An equation relating the visual cytotoxicity score and the percent viable cells was derived. The results of this study are significant for the design and interpretation of in vitro cytotoxicity studies of novel biomaterials.
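
    The correlation analysis described above reduces to a Pearson correlation and a linear fit between the blinded visual scores and the MTT viability percentages. The paired data points below are invented placeholders; the reported coefficient of 0.90 came from the study's own measurements.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    # Hypothetical paired observations: visual cytotoxicity score (0-4 scale)
    # and percent viable cells from the MTT assay for several materials.
    visual_score = np.array([0, 0, 1, 2, 3, 4, 4, 2])
    percent_viable = np.array([98, 95, 85, 70, 45, 15, 10, 65])

    r, p_value = pearsonr(visual_score, percent_viable)
    slope, intercept = np.polyfit(visual_score, percent_viable, deg=1)
    print(f"r={r:.2f} (p={p_value:.3f}); percent viable ~ {intercept:.1f} {slope:+.1f} * score")
    ```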

  9. Evaluation of patients with painful total hip arthroplasty using combined single photon emission tomography and conventional computerized tomography (SPECT/CT) - a comparison of semi-quantitative versus 3D volumetric quantitative measurements.

    PubMed

    Barthassat, Emilienne; Afifi, Faik; Konala, Praveen; Rasch, Helmut; Hirschmann, Michael T

    2017-05-08

    It was the primary purpose of our study to evaluate the inter- and intra-observer reliability of a standardized SPECT/CT algorithm for evaluating patients with painful primary total hip arthroplasty (THA). The secondary purpose was a comparison of semi-quantitative and 3D volumetric quantification method for assessment of bone tracer uptake (BTU) in those patients. A novel SPECT/CT localization scheme consisting of 14 femoral and 4 acetabular regions on standardized axial and coronal slices was introduced and evaluated in terms of inter- and intra-observer reliability in 37 consecutive patients with hip pain after THA. BTU for each anatomical region was assessed semi-quantitatively using a color-coded Likert type scale (0-10) and volumetrically quantified using a validated software. Two observers interpreted the SPECT/CT findings in all patients two times with six weeks interval between interpretations in random order. Semi-quantitative and quantitative measurements were compared in terms of reliability. In addition, the values were correlated using Pearson`s correlation. A factorial cluster analysis of BTU was performed to identify clinically relevant regions, which should be grouped and analysed together. The localization scheme showed high inter- and intra-observer reliabilities for all femoral and acetabular regions independent of the measurement method used (semiquantitative versus 3D volumetric quantitative measurements). A high to moderate correlation between both measurement methods was shown for the distal femur, the proximal femur and the acetabular cup. The factorial cluster analysis showed that the anatomical regions might be summarized into three distinct anatomical regions. These were the proximal femur, the distal femur and the acetabular cup region. The SPECT/CT algorithm for assessment of patients with pain after THA is highly reliable independent from the measurement method used. Three clinically relevant anatomical regions (proximal femoral

  10. Quantitative measurement of intervertebral disc signal using MRI.

    PubMed

    Niemeläinen, R; Videman, T; Dhillon, S S; Battié, M C

    2008-03-01

    To investigate the spinal cord as an alternative intra-body reference to cerebrospinal fluid (CSF) in evaluating thoracic disc signal intensity. T2-weighted magnetic resonance imaging (MRI) images of T6-T12 were obtained using 1.5 T machines for a population-based sample of 523 men aged 35-70 years. Quantitative data on the signal intensities were acquired using an image analysis program (SpEx). A random sample of 30 subjects and intraclass correlation coefficients (ICC) were used to examine the repeatability of the spinal cord measurements. The validity of using the spinal cord as a reference was examined by correlating cord and CSF samples. Finally, thoracic disc signal was validated by correlating it with age without adjustment and adjusting for either cord or CSF. Pearson's r was used for correlational analyses. The repeatability of the spinal cord signal measurements was extremely high (≥0.99). The correlations between the signals of spinal cord and CSF by level were all above 0.9. The spinal cord-adjusted disc signal and age correlated similarly with CSF-adjusted disc signal and age (r=-0.30 to -0.40 versus r=-0.26 to -0.36). Adjacent spinal cord is a good alternative reference to the current reference standard, CSF, for quantitative measurements of disc signal intensity. Clearly fewer levels were excluded when using spinal cord as compared to CSF due to missing reference samples.

  11. A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems

    DOE PAGES

    Shin, Sangmin; Lee, Seungyub; Judi, David; ...

    2018-02-07

    Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.

  12. A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Sangmin; Lee, Seungyub; Judi, David

    Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.

  13. The relationship between quantitative measures of disc height and disc signal intensity with Pfirrmann score of disc degeneration.

    PubMed

    Salamat, Sara; Hutchings, John; Kwong, Clemens; Magnussen, John; Hancock, Mark J

    2016-01-01

    To assess the relationship between quantitative measures of disc height and signal intensity with the Pfirrmann disc degeneration scoring system and to test the inter-rater reliability of the quantitative measures. Participants were 76 people who had recently recovered from their last episode of acute low back pain and underwent MRI scan on a single 3T machine. At all 380 lumbar discs, quantitative measures of disc height and signal intensity were made by 2 independent raters and compared to Pfirrmann scores from a single radiologist. For quantitative measures of disc height and signal intensity a "raw" score and 2 adjusted ratios were calculated and the relationship with Pfirrmann scores was assessed. The inter-tester reliability of quantitative measures was also investigated. There was a strong linear relationship between quantitative disc signal intensity and Pfirrmann scores for grades 1-4, but not for grades 4 and 5. For disc height only, Pfirrmann grade 5 had significantly reduced disc height compared to all other grades. Results were similar regardless of whether raw or adjusted scores were used. Inter-rater reliability for the quantitative measures was excellent (ICC > 0.97). Quantitative measures of disc signal intensity were strongly related to Pfirrmann scores from grade 1 to 4; however disc height only differentiated between grade 4 and 5 Pfirrmann scores. Using adjusted ratios for quantitative measures of disc height or signal intensity did not significantly alter the relationship with Pfirrmann scores.

  14. Quantitative Morphology Measures in Galaxies: Ground-Truthing from Simulations

    NASA Astrophysics Data System (ADS)

    Narayanan, Desika T.; Abruzzo, Matthew W.; Dave, Romeel; Thompson, Robert

    2017-01-01

    The process of galaxy assembly is a prevalent question in astronomy; there are a variety of potentially important effects, including baryonic accretion from the intergalactic medium, as well as major galaxy mergers. Recent years have ushered in the development of quantitative measures of morphology such as the Gini coefficient (G), the second-order moment of the brightest quintile of a galaxy’s light (M20), and the concentration (C), asymmetry (A), and clumpiness (S) of galaxies. To investigate the efficacy of these observational methods at identifying major mergers, we have run a series of very high resolution cosmological zoom simulations, and coupled these with 3D Monte Carlo dust radiative transfer. Our methodology is powerful in that it allows us to “observe” the simulation as an observer would, while maintaining detailed knowledge of the true merger history of the galaxy. In this presentation, we will present our main results from our analysis of these quantitative morphology measures, with a particular focus on high-redshift (z>2) systems.
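
    Of the morphology statistics listed above, the Gini coefficient has a particularly compact definition over the sorted pixel fluxes of a galaxy image. The sketch below implements the standard sorted-pixel formula on an arbitrary flux array; segmentation of the galaxy pixels and the companion M20 and CAS statistics are left out, and the toy light distributions are invented.

    ```python
    import numpy as np

    def gini_coefficient(fluxes):
        """Gini coefficient of a galaxy's pixel flux distribution (0 = uniform,
        1 = all light in one pixel), using the standard sorted-pixel formula."""
        x = np.sort(np.abs(np.ravel(fluxes)))
        n = x.size
        i = np.arange(1, n + 1)
        return np.sum((2 * i - n - 1) * x) / (x.mean() * n * (n - 1))

    # Toy example: a smooth disk-like light profile vs. a concentrated one.
    smooth = np.full(1000, 1.0)
    concentrated = np.r_[np.full(990, 0.1), np.full(10, 50.0)]
    print(gini_coefficient(smooth), gini_coefficient(concentrated))
    ```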

  15. Quantitative head ultrasound measurements to determine thresholds for preterm neonates requiring interventional therapies following intraventricular hemorrhage

    NASA Astrophysics Data System (ADS)

    Kishimoto, Jessica; Fenster, Aaron; Salehi, Fateme; Romano, Walter; Lee, David S. C.; de Ribaupierre, Sandrine

    2016-04-01

    Dilation of the cerebral ventricles is a common condition in preterm neonates with intraventricular hemorrhage (IVH). This post hemorrhagic ventricle dilation (PHVD) can lead to lifelong neurological impairment through ischemic injury due to increased intracranial pressure and, without treatment, can lead to death. Clinically, 2D ultrasound (US) images through the fontanelles ('soft spots') of the patients are serially acquired to monitor the progression of the ventricle dilation. These images are used to determine when interventional therapies such as needle aspiration of the built-up cerebrospinal fluid (CSF) ('ventricle tap', VT) might be indicated for a patient; however, quantitative measurements of the growth of the ventricles are often not performed. There is no consensus on when a neonate with PHVD should have an intervention and often interventions are performed after the potential for brain damage is quite high. Previously we have developed and validated a 3D US system to monitor the progression of ventricle volumes (VV) in IVH patients. We will describe the potential utility of quantitative 2D and 3D US to monitor and manage PHVD in neonates. Specifically, we will look to determine image-based measurement thresholds for patients who will require VT in comparison to patients with PHVD who resolve without intervention. Additionally, since many patients who have an initial VT will require subsequent interventions, we look at the potential for US to determine which PHVD patients will require additional VT after the initial one has been performed.

  16. Boron concentration measurements by alpha spectrometry and quantitative neutron autoradiography in cells and tissues treated with different boronated formulations and administration protocols.

    PubMed

    Bortolussi, Silva; Ciani, Laura; Postuma, Ian; Protti, Nicoletta; Reversi, Luca; Bruschi, Piero; Ferrari, Cinzia; Cansolino, Laura; Panza, Luigi; Ristori, Sandra; Altieri, Saverio

    2014-06-01

    The ability to measure boron concentration with high precision in tissues that will be irradiated is a fundamental requirement for a safe and effective BNCT treatment. In Pavia, two techniques have been used for this purpose: a quantitative method based on charged-particle spectrometry and boron biodistribution imaging based on neutron autoradiography. A quantitative method to determine boron concentration by neutron autoradiography has recently been set up and calibrated for the measurement of biological samples, both solid and liquid, within the framework of a BNCT feasibility study. This technique was calibrated and the results were cross-checked against those of α spectrometry in order to validate them. The comparisons were performed using tissues taken from animals treated with different boron administration protocols. Subsequently, quantitative neutron autoradiography was employed to measure osteosarcoma cell samples treated with BPA and with new boronated formulations. © 2013 Published by Elsevier Ltd.

  17. Detecting Genetic Interactions for Quantitative Traits Using m-Spacing Entropy Measure

    PubMed Central

    Yee, Jaeyong; Kwon, Min-Seok; Park, Taesung; Park, Mira

    2015-01-01

    A number of statistical methods for detecting gene-gene interactions have been developed in genetic association studies with binary traits. However, many phenotype measures are intrinsically quantitative, and categorizing continuous traits may not always be straightforward and meaningful. Association of gene-gene interactions with an observed distribution of such phenotypes needs to be investigated directly without categorization. Information gain based on an entropy measure has previously been successful in identifying genetic associations with binary traits. We extend the usefulness of this information gain by proposing a nonparametric method for evaluating the conditional entropy of a quantitative phenotype given a genotype. Hence, the information gain can be obtained for any phenotype distribution. Because no functional form, such as a Gaussian, is assumed for the distribution of a trait, either overall or for a given genotype, this method is expected to be robust enough to be applied to any phenotypic association data. Here, we show its use to successfully identify the main effect, as well as the genetic interactions, associated with a quantitative trait. PMID:26339620
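
    The proposed evaluation rests on a nonparametric (m-spacing) estimate of differential entropy and the resulting information gain. As a hedged sketch of the general idea, using the classical Vasicek-style m-spacing estimator rather than the authors' exact formulation (the function names and the heuristic choice of m are assumptions), one could write:

```python
import numpy as np

def m_spacing_entropy(y, m=None):
    """Vasicek-style m-spacing estimator of differential entropy (simplified sketch)."""
    y = np.sort(np.asarray(y, dtype=float))
    n = y.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))           # a common heuristic choice of spacing order
    upper = y[np.minimum(np.arange(n) + m, n - 1)]   # X_(i+m), clipped at the sample edges
    lower = y[np.maximum(np.arange(n) - m, 0)]       # X_(i-m), clipped at the sample edges
    spacings = np.maximum(upper - lower, 1e-12)      # guard against ties
    return np.mean(np.log(n / (2.0 * m) * spacings))

def information_gain(phenotype, genotype):
    """IG = H(Y) - sum_g p(g) H(Y | G = g), with each entropy from the m-spacing estimator."""
    phenotype = np.asarray(phenotype, dtype=float)
    genotype = np.asarray(genotype)
    h_cond = 0.0
    for g in np.unique(genotype):
        mask = genotype == g
        h_cond += mask.mean() * m_spacing_entropy(phenotype[mask])
    return m_spacing_entropy(phenotype) - h_cond
```

    Genotype combinations (for interaction testing) can be passed in simply by encoding each multi-locus genotype as one categorical label.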

  18. Gender, Math Confidence, and Grit: Relationships with Quantitative Skills and Performance in an Undergraduate Biology Course

    PubMed Central

    Flanagan, K. M.; Einarson, J.

    2017-01-01

    In a world filled with big data, mathematical models, and statistics, the development of strong quantitative skills is becoming increasingly critical for modern biologists. Teachers in this field must understand how students acquire quantitative skills and explore barriers experienced by students when developing these skills. In this study, we examine the interrelationships among gender, grit, and math confidence for student performance on a pre–post quantitative skills assessment and overall performance in an undergraduate biology course. Here, we show that females significantly underperformed relative to males on a quantitative skills assessment at the start of term. However, females showed significantly higher gains over the semester, such that the gender gap in performance was nearly eliminated by the end of the semester. Math confidence plays an important role in the performance on both the pre and post quantitative skills assessments and overall performance in the course. The effect of grit on student performance, however, is mediated by a student’s math confidence; as math confidence increases, the positive effect of grit decreases. Consequently, the positive impact of a student’s grittiness is observed most strongly for those students with low math confidence. We also found grit to be positively associated with the midterm score and the final grade in the course. Given the relationships established in this study among gender, grit, and math confidence, we provide “instructor actions” from the literature that can be applied in the classroom to promote the development of quantitative skills in light of our findings. PMID:28798209

  19. Comparison of the quantitative analysis performance between pulsed voltage atom probe and pulsed laser atom probe.

    PubMed

    Takahashi, J; Kawakami, K; Raabe, D

    2017-04-01

    The difference in quantitative analysis performance between the voltage-mode and laser-mode of a local electrode atom probe (LEAP3000X HR) was investigated using a Fe-Cu binary model alloy. Solute copper atoms in ferritic iron preferentially field evaporate because of their significantly lower evaporation field than the matrix iron, and thus, the apparent concentration of solute copper tends to be lower than the actual concentration. However, in voltage-mode, the apparent concentration was higher than the actual concentration at 40 K or less due to a detection loss of matrix iron, and the concentration decreased with increasing specimen temperature due to the preferential evaporation of solute copper. On the other hand, in laser-mode, the apparent concentration never exceeded the actual concentration, even at lower temperatures (20 K), and this mode showed better quantitative performance over a wide range of specimen temperatures. These results indicate that the pulsed laser atom probe prevents both detection loss and preferential evaporation under a wide range of measurement conditions. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Quantitative performance evaluation of 124I PET/MRI lesion dosimetry in differentiated thyroid cancer

    NASA Astrophysics Data System (ADS)

    Wierts, R.; Jentzen, W.; Quick, H. H.; Wisselink, H. J.; Pooters, I. N. A.; Wildberger, J. E.; Herrmann, K.; Kemerink, G. J.; Backes, W. H.; Mottaghy, F. M.

    2018-01-01

    The aim was to investigate the quantitative performance of 124I PET/MRI for pre-therapy lesion dosimetry in differentiated thyroid cancer (DTC). Phantom measurements were performed on a PET/MRI system (Biograph mMR, Siemens Healthcare) using 124I and 18F. The PET calibration factor and the influence of radiofrequency coil attenuation were determined using a cylindrical phantom homogeneously filled with radioactivity. The calibration factor was 1.00  ±  0.02 for 18F and 0.88  ±  0.02 for 124I. Near the radiofrequency surface coil an underestimation of less than 5% in radioactivity concentration was observed. Soft-tissue sphere recovery coefficients were determined using the NEMA IEC body phantom. Recovery coefficients were systematically higher for 18F than for 124I. In addition, the six spheres of the phantom were segmented using a PET-based iterative segmentation algorithm. For all 124I measurements, the deviations in segmented lesion volume and mean radioactivity concentration relative to the actual values were smaller than 15% and 25%, respectively. The effect of MR-based attenuation correction (three- and four-segment µ-maps) on bone lesion quantification was assessed using radioactive spheres filled with a K2HPO4 solution mimicking bone lesions. The four-segment µ-map resulted in an underestimation of the imaged radioactivity concentration of up to 15%, whereas the three-segment µ-map resulted in an overestimation of up to 10%. For twenty lesions identified in six patients, a comparison of 124I PET/MRI to PET/CT was performed with respect to segmented lesion volume and radioactivity concentration. The intraclass correlation coefficients showed excellent agreement in segmented lesion volume and radioactivity concentration (0.999 and 0.95, respectively). In conclusion, accurate quantitative 124I PET/MRI appears feasible and could be used to perform radioiodine pre-therapy lesion dosimetry in DTC.

  1. New ways to analyze word generation performance in brain injury: A systematic review and meta-analysis of additional performance measures.

    PubMed

    Thiele, Kristina; Quinting, Jana Marie; Stenneken, Prisca

    2016-09-01

    The investigation of word generation performance is an accepted, widely used, and well-established method for examining cognitive, language, or communication impairment due to brain damage. The performance measure traditionally applied in the investigation of word generation is the number of correct responses. Previous studies, however, have suggested that this measure does not capture all potentially relevant aspects of word generation performance and hence its underlying processes, so its power to analyze and explain word generation performance might be rather limited. Therefore, additional qualitative or quantitative performance measures have been introduced to gain information that goes beyond the deficit and allows for therapeutic implications. We undertook a systematic review and meta-analysis of original research that focused on the application of additional measures of word generation performance in adult clinical populations with acquired brain injury. Word generation tasks are an integral part of many different tests, but only a few analyses use measures beyond the number of correct responses when evaluating word generation performance. Additional measures, which showed increased or similar diagnostic utility relative to the traditional performance measure, concerned clustering and switching, error types, and temporal characteristics. The potential of additional performance measures is not yet fully exhausted in patients with brain injury. The temporal measure of response latencies in particular is not adequately represented, though it may be a reliable measure especially for identifying subtle impairments. Unfortunately, there is as yet no general consensus on which additional measures are best suited to characterizing word generation performance. Further research is needed to specify the additional parameters that are best qualified for identifying and characterizing impaired word generation performance.

  2. Challenge in Enhancing the Teaching and Learning of Variable Measurements in Quantitative Research

    ERIC Educational Resources Information Center

    Kee, Chang Peng; Osman, Kamisah; Ahmad, Fauziah

    2013-01-01

    Statistical analysis is one component that cannot be avoided in quantitative research. Initial observations noted that students in higher education institutions faced difficulty analysing quantitative data, which was attributed to confusion over the various levels of variable measurement. This paper aims to compare the outcomes of two approaches applied in…

  3. Semi-quantitative estimation of cellular SiO2 nanoparticles using flow cytometry combined with X-ray fluorescence measurements.

    PubMed

    Choi, Seo Yeon; Yang, Nuri; Jeon, Soo Kyung; Yoon, Tae Hyun

    2014-09-01

    In this study, we demonstrated the feasibility of a semi-quantitative approach for the estimation of cellular SiO2 nanoparticles (NPs), based on flow cytometry measurements of their normalized side scattering intensity. In order to improve our understanding of the quantitative aspects of cell-nanoparticle interactions, flow cytometry, transmission electron microscopy, and X-ray fluorescence experiments were carefully performed on HeLa cells exposed to SiO2 NPs with different core diameters, hydrodynamic sizes, and surface charges. Based on the observed relationships among the experimental data, a semi-quantitative method for estimating cellular SiO2 NPs from their normalized side scattering intensity and core diameters was proposed, which can be applied to determine cellular SiO2 NPs within their size-dependent linear ranges. © 2014 International Society for Advancement of Cytometry.

  4. Imaging and quantitative measurement of corrosion in painted automotive and aircraft structures

    NASA Astrophysics Data System (ADS)

    Sun, G.; Wang, Xun; Feng, Z. J.; Jin, Huijia; Sui, Hua; Ouyang, Zhong; Han, Xiaoyan; Favro, L. D.; Thomas, R. L.; Bomback, J. L.

    2000-05-01

    Some of the authors have shown that it is possible to image and make rapid, quantitative measurements of metal thickness loss due to corrosion on the rear surface of a single layer structure, with an accuracy better than one percent. These measurements are complicated by the presence of thick and/or uneven layers of paint on either the front surface, the back surface, or both. We will discuss progress in overcoming these complications. Examples from both automotive and aircraft structures will be presented. This material is based in part upon work performed at the FAA Center for Aviation Systems Reliability operated at Iowa State University and supported by the Federal Aviation Administration Technical Center, Atlantic City, New Jersey, under Grant number 95-G-025. It is also supported in part by the Institute for Manufacturing Research, Wayne State University, and by a grant from Ford Motor Company.

  5. Performance analysis of a film dosimetric quality assurance procedure for IMRT with regard to the employment of quantitative evaluation methods.

    PubMed

    Winkler, Peter; Zurl, Brigitte; Guss, Helmuth; Kindl, Peter; Stuecklschweiger, Georg

    2005-02-21

    A system for dosimetric verification of intensity-modulated radiotherapy (IMRT) treatment plans using absolute calibrated radiographic films is presented. At our institution, this verification procedure is performed for all IMRT treatment plans prior to patient irradiation. Therefore, clinical treatment plans are transferred to a phantom and recalculated. Composite treatment plans are irradiated to a single film. Film density to absolute dose conversion is performed automatically based on a single calibration film. A software application encompassing film calibration, 2D registration of measured and calculated distributions, image fusion, and a number of visual and quantitative evaluation utilities was developed. The main topic of this paper is a performance analysis for this quality assurance procedure, with regard to the specification of tolerance levels for quantitative evaluations. Spatial and dosimetric precision and accuracy were determined for the entire procedure, comprising all possible sources of error. The overall dosimetric and spatial measurement uncertainties obtained thereby were 1.9% and 0.8 mm, respectively. Based on these results, we specified 5% dose difference and 3 mm distance-to-agreement as our tolerance levels for patient-specific quality assurance for IMRT treatments.
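
    The 5% dose-difference and 3 mm distance-to-agreement tolerances can be turned into a simple per-pixel pass/fail check. The sketch below is an illustrative, simplified composite criterion (not a full gamma analysis and not necessarily the evaluation implemented in the software described above); the array names and matching thresholds are assumptions:

```python
import numpy as np

def pass_rate(measured, calculated, pixel_mm, dose_tol=0.05, dta_mm=3.0):
    """Simplified per-pixel dose-difference / distance-to-agreement check.

    A pixel passes if the local dose difference is within dose_tol (as a fraction of
    the calculated dose) or if a calculated pixel with matching dose lies within dta_mm.
    """
    ny, nx = measured.shape
    r = int(np.ceil(dta_mm / pixel_mm))              # search radius in pixels
    passed = 0
    for j in range(ny):
        for i in range(nx):
            d_meas, d_calc = measured[j, i], calculated[j, i]
            if abs(d_meas - d_calc) <= dose_tol * max(d_calc, 1e-9):
                passed += 1
                continue
            # distance-to-agreement: look for a nearby calculated pixel matching the measured dose
            js = slice(max(0, j - r), min(ny, j + r + 1))
            is_ = slice(max(0, i - r), min(nx, i + r + 1))
            yy, xx = np.mgrid[js, is_]
            dist = np.hypot((yy - j) * pixel_mm, (xx - i) * pixel_mm)
            close = dist <= dta_mm
            if np.any(np.abs(calculated[js, is_][close] - d_meas) <= dose_tol * max(d_meas, 1e-9)):
                passed += 1
    return passed / (nx * ny)
```

    In practice, dose grids are interpolated to a finer resolution before such a search so that the distance-to-agreement criterion is not limited by the pixel size.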

  6. Performance testing accountability measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oldham, R.D.; Mitchell, W.G.; Spaletto, M.I.

    The New Brunswick Laboratory (NBL) provides assessment support to the DOE Operations Offices in the area of Material Control and Accountability (MC and A). During surveys of facilities, the Operations Offices have begun to request from NBL either assistance in providing materials for performance testing of accountability measurements or both materials and personnel to do performance testing. To meet these needs, NBL has developed measurement and measurement control performance test procedures and materials. The present NBL repertoire of performance tests includes the following: (1) mass measurement performance testing procedures using calibrated and traceable test weights, (2) uranium elemental concentration (assay) measurement performance tests which use ampulated solutions of normal uranyl nitrate containing approximately 7 milligrams of uranium per gram of solution, and (3) uranium isotopic measurement performance tests which use ampulated uranyl nitrate solutions with enrichments ranging from 4% to 90% U-235. The preparation, characterization, and packaging of the uranium isotopic and assay performance test materials were done in cooperation with the NBL Safeguards Measurements Evaluation Program since these materials can be used for both purposes.

  7. Quantitative NDA measurements of advanced reprocessing product materials containing uranium, neptunium, plutonium, and americium

    NASA Astrophysics Data System (ADS)

    Goddard, Braden

    The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first-principles methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron-induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code and by one measurement campaign using the Active Well Coincidence Counter (AWCC) and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first-principles technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still faces several challenges, the largest being the need for high-precision active and passive measurements that produce results with acceptably small uncertainties.

  8. Probe colorimeter for quantitating enzyme-linked immunosorbent assays and other colorimetric assays performed with microplates.

    PubMed Central

    Ackerman, S B; Kelley, E A

    1983-01-01

    The performance of a fiberoptic probe colorimeter (model PC800; Brinkmann Instruments, Inc., Westbury, N.Y.) for quantitating enzymatic or colorimetric assays in 96-well microtiter plates was compared with the performances of a spectrophotometer (model 240; Gilford Instrument Laboratories, Inc., Oberlin, Ohio) and a commercially available enzyme immunoassay reader (model MR590; Dynatech Laboratories, Inc., Alexandria, Va.). Alkaline phosphatase-p-nitrophenyl phosphate in 3 M NaOH was used as the chromophore source. Six types of plates were evaluated for use with the probe colorimeter; they generated reproducibility values (100% minus the coefficient of variation) ranging from 91 to 98% when one individual made 24 independent measurements on the same dilution of chromophore on each plate. Eleven individuals each performed 24 measurements with the colorimeter on either a visually light (absorbance of 0.10 at 420 nm) or a dark (absorbance of 0.80 at 420 nm) dilution of chromophore; reproducibilities averaged 87% for the light dilution and 97% for the dark dilution. When one individual measured the same chromophore sample at least 20 times in the colorimeter, in the spectrophotometer or in the enzyme immunoassay reader, reproducibility for each instrument was greater than 99%. Measurements of a dilution series of chromophore in a fixed volume indicated that the optical responses of each instrument were linear in a range of 0.05 to 1.10 absorbance units. PMID:6341399
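
    Reading the reported values as reproducibility = 100% minus the coefficient of variation of repeated readings (an interpretation of the abstract's wording, not a formula stated in the record), the calculation is straightforward; the example data below are hypothetical:

```python
import numpy as np

def reproducibility_percent(readings):
    """Reproducibility expressed as 100% minus the coefficient of variation of repeated readings."""
    readings = np.asarray(readings, dtype=float)
    cv = 100.0 * readings.std(ddof=1) / readings.mean()
    return 100.0 - cv

# e.g. 24 repeated absorbance readings of one chromophore dilution (invented values)
example = np.random.default_rng(0).normal(0.80, 0.02, size=24)
print(round(reproducibility_percent(example), 1))
```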

  9. Probe colorimeter for quantitating enzyme-linked immunosorbent assays and other colorimetric assays performed with microplates.

    PubMed

    Ackerman, S B; Kelley, E A

    1983-03-01

    The performance of a fiberoptic probe colorimeter (model PC800; Brinkmann Instruments, Inc., Westbury, N.Y.) for quantitating enzymatic or colorimetric assays in 96-well microtiter plates was compared with the performances of a spectrophotometer (model 240; Gilford Instrument Laboratories, Inc., Oberlin, Ohio) and a commercially available enzyme immunoassay reader (model MR590; Dynatech Laboratories, Inc., Alexandria, Va.). Alkaline phosphatase-p-nitrophenyl phosphate in 3 M NaOH was used as the chromophore source. Six types of plates were evaluated for use with the probe colorimeter; they generated reproducibility values (100% minus the coefficient of variation) ranging from 91 to 98% when one individual made 24 independent measurements on the same dilution of chromophore on each plate. Eleven individuals each performed 24 measurements with the colorimeter on either a visually light (absorbance of 0.10 at 420 nm) or a dark (absorbance of 0.80 at 420 nm) dilution of chromophore; reproducibilities averaged 87% for the light dilution and 97% for the dark dilution. When one individual measured the same chromophore sample at least 20 times in the colorimeter, in the spectrophotometer or in the enzyme immunoassay reader, reproducibility for each instrument was greater than 99%. Measurements of a dilution series of chromophore in a fixed volume indicated that the optical responses of each instrument were linear in a range of 0.05 to 1.10 absorbance units.

  10. Rapid Quantitation of Furanocoumarins and Flavonoids in Grapefruit Juice using Ultra Performance Liquid Chromatography

    PubMed Central

    VanderMolen, Karen M.; Cech, Nadja B.; Paine, Mary F.

    2013-01-01

    Introduction: Grapefruit juice can increase or decrease the systemic exposure of myriad oral medications, leading to untoward effects or reduced efficacy. Furanocoumarins in grapefruit juice have been established as inhibitors of cytochrome P450 3A (CYP3A)-mediated metabolism and P-glycoprotein (P-gp)-mediated efflux, while flavonoids have been implicated as inhibitors of organic anion transporting polypeptide (OATP)-mediated absorptive uptake in the intestine. The potential for drug interactions with a food product necessitates an understanding of the expected concentrations of a suite of structurally diverse and potentially bioactive compounds. Objective: Develop methods for the rapid quantitation of two furanocoumarins (bergamottin and 6′,7′-dihydroxybergamottin) and four flavonoids (naringin, naringenin, narirutin, and hesperidin) in five grapefruit juice products using ultra performance liquid chromatography (UPLC). Methodology: Grapefruit juice products were extracted with ethyl acetate; the concentrated extract was analyzed by UPLC using acetonitrile:water gradients and a C18 column. Analytes were detected using a photodiode array detector, set at 250 nm (furanocoumarins) and 310 nm (flavonoids). Intraday and interday precision and accuracy and limits of detection and quantitation were determined. Results: Rapid (<5.0 min) UPLC methods were developed to measure the aforementioned furanocoumarins and flavonoids. R2 values for the calibration curves of all analytes were >0.999. Considerable between-juice variation in the concentrations of these compounds was observed, and the quantities measured were in agreement with the concentrations published in HPLC studies. Conclusion: These analytical methods provide an expedient means to quantitate key furanocoumarins and flavonoids in grapefruit juice and other foods used in dietary substance-drug interaction studies. PMID:23780830

  11. Rapid Quantitation of Furanocoumarins and Flavonoids in Grapefruit Juice using Ultra-Performance Liquid Chromatography.

    PubMed

    Vandermolen, Karen M; Cech, Nadja B; Paine, Mary F; Oberlies, Nicholas H

    2013-01-01

    Grapefruit juice can increase or decrease the systemic exposure of myriad oral medications, leading to untoward effects or reduced efficacy. Furanocoumarins in grapefruit juice have been established as inhibitors of cytochrome P450 3A (CYP3A)-mediated metabolism and P-glycoprotein (P-gp)-mediated efflux, while flavonoids have been implicated as inhibitors of organic anion transporting polypeptide (OATP)-mediated absorptive uptake in the intestine. The potential for drug interactions with a food product necessitates an understanding of the expected concentrations of a suite of structurally diverse and potentially bioactive compounds. Develop methods for the rapid quantitation of two furanocoumarins (bergamottin and 6',7'-dihydroxybergamottin) and four flavonoids (naringin, naringenin, narirutin and hesperidin) in five grapefruit juice products using ultra-performance liquid chromatography (UPLC). Grapefruit juice products were extracted with ethyl acetate; the concentrated extract was analysed by UPLC using acetonitrile:water gradients and a C18 column. Analytes were detected using a photodiode array detector, set at 250 nm (furanocoumarins) and 310 nm (flavonoids). Intraday and interday precision and accuracy and limits of detection and quantitation were determined. Rapid (< 5.0 min) UPLC methods were developed to measure the aforementioned furanocoumarins and flavonoids. R2 values for the calibration curves of all analytes were >0.999. Considerable between-juice variation in the concentrations of these compounds was observed, and the quantities measured were in agreement with the concentrations published in HPLC studies. These analytical methods provide an expedient means to quantitate key furanocoumarins and flavonoids in grapefruit juice and other foods used in dietary substance-drug interaction studies. Copyright © 2013 John Wiley & Sons, Ltd.
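
    For context, the reported R2 values and limits of detection and quantitation follow from an ordinary least-squares calibration line. The sketch below uses the common ICH-style conventions LOD = 3.3 s/slope and LOQ = 10 s/slope, with s the residual standard deviation of the regression; this is an assumed convention, since the record does not state the exact formulas used:

```python
import numpy as np

def calibration_stats(conc, peak_area):
    """Least-squares calibration line with R^2 and ICH-style LOD/LOQ estimates."""
    conc = np.asarray(conc, dtype=float)
    peak_area = np.asarray(peak_area, dtype=float)
    slope, intercept = np.polyfit(conc, peak_area, 1)
    fitted = slope * conc + intercept
    ss_res = np.sum((peak_area - fitted) ** 2)
    ss_tot = np.sum((peak_area - peak_area.mean()) ** 2)
    s = np.sqrt(ss_res / (conc.size - 2))            # residual standard deviation
    return {"slope": slope, "intercept": intercept,
            "R2": 1.0 - ss_res / ss_tot,
            "LOD": 3.3 * s / slope, "LOQ": 10.0 * s / slope}
```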

  12. 5-Aminolevulinic acid-induced protoporphyrin IX fluorescence in meningioma: qualitative and quantitative measurements in vivo.

    PubMed

    Valdes, Pablo A; Bekelis, Kimon; Harris, Brent T; Wilson, Brian C; Leblond, Frederic; Kim, Anthony; Simmons, Nathan E; Erkmen, Kadir; Paulsen, Keith D; Roberts, David W

    2014-03-01

    The use of 5-aminolevulinic acid (ALA)-induced protoporphyrin IX (PpIX) fluorescence has shown promise as a surgical adjunct for maximizing the extent of surgical resection in gliomas. To date, the clinical utility of 5-ALA in meningiomas is not fully understood, with most descriptive studies using qualitative approaches to 5-ALA-PpIX. To assess the diagnostic performance of 5-ALA-PpIX fluorescence during surgical resection of meningioma. ALA was administered to 15 patients with meningioma undergoing PpIX fluorescence-guided surgery at our institution. At various points during the procedure, the surgeon performed qualitative, visual assessments of fluorescence by using the surgical microscope, followed by a quantitative fluorescence measurement by using an intraoperative probe. Specimens were collected at each point for subsequent neuropathological analysis. Clustered data analysis of variance was used to ascertain a difference between groups, and receiver operating characteristic analyses were performed to assess diagnostic capabilities. Red-pink fluorescence was observed in 80% (12/15) of patients, with visible fluorescence generally demonstrating a strong, homogenous character. Quantitative fluorescence measured diagnostically significant PpIX concentrations (cPpIx) in both visibly and nonvisibly fluorescent tissues, with significantly higher cPpIx in both visibly fluorescent (P < .001) and tumor tissue (P = .002). Receiver operating characteristic analyses also showed diagnostic accuracies up to 90% for differentiating tumor from normal dura. ALA-induced PpIX fluorescence guidance is a potential and promising adjunct in accurately detecting neoplastic tissue during meningioma resective surgery. These results suggest a broader reach for PpIX as a biomarker for meningiomas than was previously noted in the literature.
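
    The reported diagnostic accuracies come from receiver operating characteristic analysis of the measured PpIX concentrations against tissue labels. A minimal sketch of the two quantities involved, the AUC via the rank (Mann-Whitney) formulation and the accuracy at a fixed threshold, with function names chosen here for illustration, might be:

```python
import numpy as np

def roc_auc(values, is_tumor):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation."""
    values = np.asarray(values, dtype=float)
    is_tumor = np.asarray(is_tumor, dtype=bool)
    pos, neg = values[is_tumor], values[~is_tumor]
    # fraction of tumor/normal pairs in which the tumor sample shows the higher value
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (pos.size * neg.size)

def accuracy_at_threshold(values, is_tumor, threshold):
    """Diagnostic accuracy when calling a sample 'tumor' at or above a concentration threshold."""
    pred = np.asarray(values, dtype=float) >= threshold
    return np.mean(pred == np.asarray(is_tumor, dtype=bool))
```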

  13. 5-Aminolevulinic Acid-Induced Protoporphyrin IX Fluorescence in Meningioma: Qualitative and Quantitative Measurements In Vivo

    PubMed Central

    Valdes, Pablo A.; Bekelis, Kimon; Harris, Brent T.; Wilson, Brian C.; Leblond, Frederic; Kim, Anthony; Simmons, Nathan E.; Erkmen, Kadir; Paulsen, Keith D.; Roberts, David W.

    2014-01-01

    BACKGROUND The use of 5-aminolevulinic acid (ALA)-induced protoporphyrin IX (PpIX) fluorescence has shown promise as a surgical adjunct for maximizing the extent of surgical resection in gliomas. To date, the clinical utility of 5-ALA in meningiomas is not fully understood, with most descriptive studies using qualitative approaches to 5-ALA-PpIX. OBJECTIVE To assess the diagnostic performance of 5-ALA-PpIX fluorescence during surgical resection of meningioma. METHODS ALA was administered to 15 patients with meningioma undergoing PpIX fluorescence-guided surgery at our institution. At various points during the procedure, the surgeon performed qualitative, visual assessments of fluorescence by using the surgical microscope, followed by a quantitative fluorescence measurement by using an intra-operative probe. Specimens were collected at each point for subsequent neuropathological analysis. Clustered data analysis of variance was used to ascertain a difference between groups, and receiver operating characteristic analyses were performed to assess diagnostic capabilities. RESULTS Red-pink fluorescence was observed in 80% (12/15) of patients, with visible fluorescence generally demonstrating a strong, homogenous character. Quantitative fluorescence measured diagnostically significant PpIX concentrations (CPpIx) in both visibly and nonvisibly fluorescent tissues, with significantly higher CPpIx in both visibly fluorescent (P < .001) and tumor tissue (P = .002). Receiver operating characteristic analyses also showed diagnostic accuracies up to 90% for differentiating tumor from normal dura. CONCLUSION ALA-induced PpIX fluorescence guidance is a potential and promising adjunct in accurately detecting neoplastic tissue during meningioma resective surgery. These results suggest a broader reach for PpIX as a biomarker for meningiomas than was previously noted in the literature. PMID:23887194

  14. Measuring teamwork in primary care: Triangulation of qualitative and quantitative data.

    PubMed

    Brown, Judith Belle; Ryan, Bridget L; Thorpe, Cathy; Markle, Emma K R; Hutchison, Brian; Glazier, Richard H

    2015-09-01

    This article describes the triangulation of qualitative dimensions, reflecting high functioning teams, with the results of standardized teamwork measures. The study used a mixed methods design using qualitative and quantitative approaches to assess teamwork in 19 Family Health Teams in Ontario, Canada. This article describes dimensions from the qualitative phase using grounded theory to explore the issues and challenges to teamwork. Two quantitative measures were used in the study, the Team Climate Inventory (TCI) and the Providing Effective Resources and Knowledge (PERK) scale. For the triangulation analysis, the mean scores of these measures were compared with the qualitatively derived ratings for the dimensions. The final sample for the qualitative component was 107 participants. The qualitative analysis identified 9 dimensions related to high team functioning such as common philosophy, scope of practice, conflict resolution, change management, leadership, and team evolution. From these dimensions, teams were categorized numerically as high, moderate, or low functioning. Three hundred seventeen team members completed the survey measures. Mean site scores for the TCI and PERK were 3.87 and 3.88, respectively (out of 5). The TCI was associated with all dimensions except for team location, space allocation, and executive director leadership. The PERK was associated with all dimensions except team location. Data triangulation provided qualitative and quantitative evidence of what constitutes teamwork. Leadership was pivotal in forging a common philosophy and encouraging team collaboration. Teams used conflict resolution strategies and adapted to the changes they encountered. These dimensions advanced the team's evolution toward a high functioning team. (c) 2015 APA, all rights reserved.

  15. A proteomics performance standard to support measurement quality in proteomics.

    PubMed

    Beasley-Green, Ashley; Bunk, David; Rudnick, Paul; Kilpatrick, Lisa; Phinney, Karen

    2012-04-01

    The emergence of MS-based proteomic platforms as a prominent technology utilized in biochemical and biomedical research has increased the need for high-quality MS measurements. To address this need, National Institute of Standards and Technology (NIST) reference material (RM) 8323 yeast protein extract is introduced as a proteomics quality control material for benchmarking the preanalytical and analytical performance of proteomics-based experimental workflows. RM 8323 yeast protein extract is based upon the well-characterized eukaryote Saccharomyces cerevisiae and can be utilized in the design and optimization of proteomics-based methodologies from sample preparation to data analysis. To demonstrate its utility as a proteomics quality control material, we coupled LC-MS/MS measurements of RM 8323 with the NIST MS Quality Control (MSQC) performance metrics to quantitatively assess the LC-MS/MS instrumentation parameters that influence measurement accuracy, repeatability, and reproducibility. Due to the complexity of the yeast proteome, we also demonstrate how NIST RM 8323, along with the NIST MSQC performance metrics, can be used in the evaluation and optimization of proteomics-based sample preparation methods. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Measures of rowing performance.

    PubMed

    Smith, T Brett; Hopkins, Will G

    2012-04-01

    Accurate measures of performance are important for assessing competitive athletes in practical and research settings. We present here a review of rowing performance measures, focusing on the errors in these measures and the implications for testing rowers. The yardstick for assessing error in a performance measure is the random variation (typical or standard error of measurement) in an elite athlete's competitive performance from race to race: ∼1.0% for time in 2000 m rowing events. There has been little research interest in on-water time trials for assessing rowing performance, owing to logistic difficulties and environmental perturbations in performance time with such tests. Mobile ergometry via instrumented oars or rowlocks should reduce these problems, but the associated errors have not yet been reported. Measurement of boat speed to monitor on-water training performance is common; one device based on global positioning system (GPS) technology contributes negligible extra random error (0.2%) in speed measured over 2000 m, but extra error is substantial (1-10%) with other GPS devices or with an impeller, especially over shorter distances. The problems with on-water testing have led to widespread use of the Concept II rowing ergometer. The standard error of the estimate of on-water 2000 m time predicted by 2000 m ergometer performance was 2.6% and 7.2% in two studies, reflecting different effects of skill, body mass and environment in on-water versus ergometer performance. However, well trained rowers have a typical error in performance time of only ∼0.5% between repeated 2000 m time trials on this ergometer, so such trials are suitable for tracking changes in physiological performance and factors affecting it. Many researchers have used the 2000 m ergometer performance time as a criterion to identify other predictors of rowing performance. Standard errors of the estimate vary widely between studies even for the same predictor, but the lowest
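
    The "typical error" used as the yardstick above is usually computed from repeated trials in the same athletes. A minimal sketch following the log-transform convention often attributed to Hopkins (the example times are invented for illustration):

```python
import numpy as np

def typical_error_percent(trial1, trial2):
    """Typical (standard) error of measurement from two repeated trials, as a percentage.

    Differences of 100*ln(time) are used so the result behaves like a coefficient of
    variation in percent; typical error = SD(differences) / sqrt(2) (assumed convention).
    """
    diff = 100.0 * (np.log(np.asarray(trial2, dtype=float)) - np.log(np.asarray(trial1, dtype=float)))
    return diff.std(ddof=1) / np.sqrt(2.0)

# hypothetical 2000-m ergometer times (s) for the same rowers on two occasions
t1 = np.array([388.0, 402.5, 395.2, 410.8, 399.7])
t2 = np.array([389.6, 401.1, 396.0, 409.5, 401.3])
print(round(typical_error_percent(t1, t2), 2))   # roughly 0.3% for these invented times
```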

  17. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  18. Longitudinal change in quantitative meniscus measurements in knee osteoarthritis--data from the Osteoarthritis Initiative.

    PubMed

    Bloecker, Katja; Wirth, W; Guermazi, A; Hitzl, W; Hunter, D J; Eckstein, F

    2015-10-01

    We aimed to apply 3D MRI-based measurement technology to studying 2-year change in quantitative measurements of meniscus size and position. Forty-seven knees from the Osteoarthritis Initiative with medial radiographic joint space narrowing had baseline and 2-year follow-up MRIs. Quantitative measures were obtained from manual segmentation of the menisci and tibia using coronal DESSwe images. The standardized response mean (SRM = mean/SD change) was used as measure of sensitivity to longitudinal change. Medial tibial plateau coverage decreased from 34.8% to 29.9% (SRM -0.82; p < 0.001). Change in medial meniscus extrusion in a central image (SRM 0.18) and in the central five slices (SRM 0.22) did not reach significance, but change in extrusion across the entire meniscus (SRM 0.32; p = 0.03) and in the relative area of meniscus extrusion (SRM 0.56; p < 0.001) did. There was a reduction in medial meniscus volume (10%; p < 0.001), width (7%; p < 0.001), and height (2%; p = 0.08); meniscus substance loss was strongest in the posterior (SRM -0.51; p = 0.001) and weakest in the anterior horn (SRM -0.15; p = 0.31). This pilot study reports, for the first time, longitudinal change in quantitative 3D meniscus measurements in knee osteoarthritis. It provides evidence of improved sensitivity to change of 3D measurements compared with single slice analysis. • First longitudinal MRI-based measurements of change of meniscus position and size. • Quantitative longitudinal evaluation of meniscus change in knee osteoarthritis. • Improved sensitivity to change of 3D measurements compared with single slice analysis.
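
    The standardized response mean used above is simply the mean of the individual 2-year changes divided by the standard deviation of those changes; a one-function sketch (array names assumed):

```python
import numpy as np

def standardized_response_mean(baseline, follow_up):
    """SRM = mean of the individual changes divided by the SD of those changes."""
    change = np.asarray(follow_up, dtype=float) - np.asarray(baseline, dtype=float)
    return change.mean() / change.std(ddof=1)
```

    Larger absolute SRM values indicate greater sensitivity to longitudinal change for a given measurement.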

  19. Quantitative colorectal cancer perfusion measurement by multidetector-row CT: does greater tumour coverage improve measurement reproducibility?

    PubMed

    Goh, V; Halligan, S; Gartner, L; Bassett, P; Bartram, C I

    2006-07-01

    The purpose of this study was to determine if greater z-axis tumour coverage improves the reproducibility of quantitative colorectal cancer perfusion measurements using CT. A 65 s perfusion study was acquired following intravenous contrast administration in 10 patients with proven colorectal cancer using a four-detector row scanner. This was repeated within 48 h using identical technical parameters to allow reproducibility assessment. Quantitative tumour blood volume, blood flow, mean transit time and permeability measurements were determined using commercially available software (Perfusion 3.0; GE Healthcare, Waukesha, WI) for data obtained from a 5 mm z-axis tumour coverage, and from a 20 mm z-axis tumour coverage. Measurement reproducibility was assessed using Bland-Altman statistics, for a 5 mm z-axis tumour coverage, and 20 mm z-axis tumour coverage, respectively. The mean difference (95% limits of agreement) for blood volume, blood flow, mean transit time and permeability were 0.04 (-2.50 to +2.43) ml/100 g tissue; +8.80 (-50.5 to +68.0) ml/100 g tissue/min; -0.99 (-8.19 to +6.20) seconds; and +1.20 (-5.42 to +7.83) ml/100 g tissue/min, respectively, for a 5 mm coverage, and -0.04 (-2.61 to +2.53) ml/100 g tissue; +7.40 (-50.3 to +65.0) ml/100 g tissue/min; -2.46 (-12.61 to +7.69) seconds; and -0.23 (-8.31 to +7.85) ml/100 g tissue/min, respectively, for a 20 mm coverage, indicating similar levels of agreement. In conclusion, increasing z-axis coverage does not improve reproducibility of quantitative colorectal cancer perfusion measurements.
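
    Bland-Altman agreement statistics of the kind reported here reduce to the mean of the paired differences and its 95% limits of agreement (mean difference plus or minus 1.96 standard deviations). A minimal sketch, assuming two arrays of repeated perfusion values for the same tumours:

```python
import numpy as np

def bland_altman(day1, day2):
    """Mean difference and 95% limits of agreement between repeated measurements."""
    diff = np.asarray(day2, dtype=float) - np.asarray(day1, dtype=float)
    mean_diff = diff.mean()
    sd_diff = diff.std(ddof=1)
    return mean_diff, (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)
```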

  20. Performing Repeated Quantitative Small-Animal PET with an Arterial Input Function Is Routinely Feasible in Rats.

    PubMed

    Huang, Chi-Cheng; Wu, Chun-Hu; Huang, Ya-Yao; Tzen, Kai-Yuan; Chen, Szu-Fu; Tsai, Miao-Ling; Wu, Hsiao-Ming

    2017-04-01

    Performing quantitative small-animal PET with an arterial input function has been considered technically challenging. Here, we introduce a catheterization procedure that keeps a rat physiologically stable for 1.5 mo. We demonstrated the feasibility of quantitative small-animal 18F-FDG PET in rats by performing it repeatedly to monitor the time course of variations in the cerebral metabolic rate of glucose (CMRglc). Methods: Aseptic surgery was performed on 2 rats. Each rat underwent catheterization of the right femoral artery and left femoral vein. The catheters were sealed with microinjection ports and then implanted subcutaneously. Over the next 3 wk, each rat underwent 18F-FDG quantitative small-animal PET 6 times. The CMRglc of each brain region was calculated using a 3-compartment model and an operational equation that included a k*4. Results: On 6 mornings, we completed 12 18F-FDG quantitative small-animal PET studies on 2 rats. The rats grew steadily before and after the 6 quantitative small-animal PET studies. The CMRglc of the conscious brain (e.g., right parietal region, 99.6 ± 10.2 μmol/100 g/min; n = 6) was comparable to that for 14C-deoxyglucose autoradiographic methods. Conclusion: Maintaining good blood patency in catheterized rats is not difficult. Longitudinal quantitative small-animal PET imaging with an arterial input function can be performed routinely. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  1. Gender, Math Confidence, and Grit: Relationships with Quantitative Skills and Performance in an Undergraduate Biology Course.

    PubMed

    Flanagan, K M; Einarson, J

    2017-01-01

    In a world filled with big data, mathematical models, and statistics, the development of strong quantitative skills is becoming increasingly critical for modern biologists. Teachers in this field must understand how students acquire quantitative skills and explore barriers experienced by students when developing these skills. In this study, we examine the interrelationships among gender, grit, and math confidence for student performance on a pre-post quantitative skills assessment and overall performance in an undergraduate biology course. Here, we show that females significantly underperformed relative to males on a quantitative skills assessment at the start of term. However, females showed significantly higher gains over the semester, such that the gender gap in performance was nearly eliminated by the end of the semester. Math confidence plays an important role in the performance on both the pre and post quantitative skills assessments and overall performance in the course. The effect of grit on student performance, however, is mediated by a student's math confidence; as math confidence increases, the positive effect of grit decreases. Consequently, the positive impact of a student's grittiness is observed most strongly for those students with low math confidence. We also found grit to be positively associated with the midterm score and the final grade in the course. Given the relationships established in this study among gender, grit, and math confidence, we provide "instructor actions" from the literature that can be applied in the classroom to promote the development of quantitative skills in light of our findings. © 2017 K. M. Flanagan and J. Einarson. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http

  2. Quantitative Thermochemical Measurements in High-Pressure Gaseous Combustion

    NASA Technical Reports Server (NTRS)

    Kojima, Jun J.; Fischer, David G.

    2012-01-01

    We present our strategic experiment and thermochemical analyses of combustion flow using subframe burst gating (SBG) Raman spectroscopy. This unconventional laser diagnostic technique has a promising ability to enhance the accuracy of quantitative scalar measurements in a point-wise, single-shot fashion. In the presentation, we briefly describe an experimental methodology that generates a transferable calibration standard for the routine implementation of the diagnostics in hydrocarbon flames. The diagnostic technology was applied to simultaneous measurements of temperature and chemical species in a swirl-stabilized turbulent flame with gaseous methane fuel at elevated pressure (17 atm). Statistical analyses of the space-/time-resolved thermochemical data provide insights into the nature of the mixing process and its impact on the subsequent combustion process in the model combustor.

  3. Quantitative research on critical thinking and predicting nursing students' NCLEX-RN performance.

    PubMed

    Romeo, Elizabeth M

    2010-07-01

    The concept of critical thinking has been influential in several disciplines. Both education and nursing in general have been attempting to define, teach, and measure this concept for decades. Nurse educators realize that critical thinking is the cornerstone of the objectives and goals for nursing students. The purpose of this article is to review and analyze quantitative research findings relevant to the measurement of critical thinking abilities and skills in undergraduate nursing students and the usefulness of critical thinking as a predictor of National Council Licensure Examination-Registered Nurse (NCLEX-RN) performance. The specific issues that this integrative review examined include assessment and analysis of the theoretical and operational definitions of critical thinking, theoretical frameworks used to guide the studies, instruments used to evaluate critical thinking skills and abilities, and the role of critical thinking as a predictor of NCLEX-RN outcomes. A list of key assumptions related to critical thinking was formulated. The limitations and gaps in the literature were identified, as well as the types of future research needed in this arena. Copyright 2010, SLACK Incorporated.

  4. Quadratic elongation: A quantitative measure of distortion in coordination polyhedra

    USGS Publications Warehouse

    Robinson, Kelly F.; Gibbs, G.V.; Ribbe, P.H.

    1971-01-01

    Quadratic elongation and the variance of bond angles are linearly correlated for distorted octahedral and tetrahedral coordination complexes, both of which show variations in bond length and bond angle. The quadratic elongation is dimensionless, giving a quantitative measure of polyhedral distortion which is independent of the effective size of the polyhedron.
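
    For reference, the quadratic elongation of an n-vertex coordination polyhedron is conventionally written as (symbols as assumed here)

    \[
    \langle \lambda \rangle \;=\; \frac{1}{n} \sum_{i=1}^{n} \left( \frac{l_i}{l_0} \right)^{2},
    \]

    where the l_i are the observed center-to-vertex distances and l_0 is the center-to-vertex distance of a regular polyhedron of the same volume (n = 6 for octahedra, n = 4 for tetrahedra). The quantity equals 1 for an undistorted polyhedron and increases with distortion, which is what makes it a dimensionless, size-independent distortion measure.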

  5. Quantitative measures of healthy aging and biological age

    PubMed Central

    Kim, Sangkyu; Jazwinski, S. Michal

    2015-01-01

    Numerous genetic and non-genetic factors contribute to aging. To facilitate the study of these factors, various descriptors of biological aging, including ‘successful aging’ and ‘frailty’, have been put forth as integrative functional measures of aging. A separate but related quantitative approach is the ‘frailty index’, which has been operationalized and frequently used. Various frailty indices have been constructed. Although based on different numbers and types of health variables, frailty indices possess several common properties that make them useful across different studies. We have been using a frailty index termed FI34 based on 34 health variables. Like other frailty indices, FI34 increases non-linearly with advancing age and is a better indicator of biological aging than chronological age. FI34 has a substantial genetic basis. Using FI34, we found elevated levels of resting metabolic rate linked to declining health in nonagenarians. Using FI34 as a quantitative phenotype, we have also found a genomic region on chromosome 12 that is associated with healthy aging and longevity. PMID:26005669
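
    A frailty index of the FI34 type is, at its core, the proportion of assessed health deficits that an individual exhibits. A minimal sketch (the variable names and the three-item example are invented; FI34 itself uses 34 health variables):

```python
def frailty_index(deficits):
    """Frailty index: proportion of assessed health deficits that are present.

    `deficits` maps each health variable to a value in [0, 1]
    (0 = deficit absent, 1 = deficit fully present).
    """
    return sum(deficits.values()) / len(deficits)

# hypothetical three-item example
print(frailty_index({"adl_difficulty": 1, "self_rated_health_poor": 0.5, "diabetes": 0}))  # 0.5
```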

  6. Quantitative fiber-optic Raman spectroscopy for tissue Raman measurements

    NASA Astrophysics Data System (ADS)

    Duraipandian, Shiyamala; Bergholt, Mads; Zheng, Wei; Huang, Zhiwei

    2014-03-01

    Molecular profiling of tissue using near-infrared (NIR) Raman spectroscopy has shown great promise for in vivo detection and prognostication of cancer. The Raman spectra measured from the tissue generally contain fundamental information about the absolute biomolecular concentrations in tissue and its changes associated with disease transformation. However, producing analogues tissue Raman spectra present a great technical challenge. In this preliminary study, we propose a method to ensure the reproducible tissue Raman measurements and validated with the in vivo Raman spectra (n=150) of inner lip acquired using different laser powers (i.e., 30 and 60 mW). A rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe was utilized for tissue Raman measurements. The investigational results showed that the variations between the spectra measured with different laser powers are almost negligible, facilitating the quantitative analysis of tissue Raman measurements in vivo.

  7. Quantitative PET/CT scanner performance characterization based upon the society of nuclear medicine and molecular imaging clinical trials network oncology clinical simulator phantom.

    PubMed

    Sunderland, John J; Christian, Paul E

    2015-01-01

    The Clinical Trials Network (CTN) of the Society of Nuclear Medicine and Molecular Imaging (SNMMI) operates a PET/CT phantom imaging program using the CTN's oncology clinical simulator phantom, designed to validate scanners at sites that wish to participate in oncology clinical trials. Since its inception in 2008, the CTN has collected 406 well-characterized phantom datasets from 237 scanners at 170 imaging sites covering the spectrum of commercially available PET/CT systems. The combined and collated phantom data describe a global profile of quantitative performance and variability of PET/CT data used in both clinical practice and clinical trials. Individual sites filled and imaged the CTN oncology PET phantom according to detailed instructions. Standard clinical reconstructions were requested and submitted. The phantom itself contains uniform regions suitable for scanner calibration assessment, lung fields, and 6 hot spheric lesions with diameters ranging from 7 to 20 mm at a 4:1 contrast ratio with primary background. The CTN Phantom Imaging Core evaluated the quality of the phantom fill and imaging and measured background standardized uptake values to assess scanner calibration and maximum standardized uptake values of all 6 lesions to review quantitative performance. Scanner make-and-model-specific measurements were pooled and then subdivided by reconstruction to create scanner-specific quantitative profiles. Different makes and models of scanners predictably demonstrated different quantitative performance profiles including, in some cases, small calibration bias. Differences in site-specific reconstruction parameters increased the quantitative variability among similar scanners, with postreconstruction smoothing filters being the most influential parameter. Quantitative assessment of this intrascanner variability over this large collection of phantom data gives, for the first time, estimates of reconstruction variance introduced into trials from allowing
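
    The standardized uptake values assessed from the phantom data follow the usual body-weight normalization: tissue activity concentration divided by injected activity per gram of body weight. A minimal sketch, assuming decay correction and scanner cross-calibration are already applied and that 1 mL of tissue weighs about 1 g:

```python
def suv_bw(activity_kbq_per_ml, injected_mbq, body_weight_kg):
    """Body-weight-normalized SUV = tissue activity concentration / (injected activity / body weight)."""
    injected_kbq = injected_mbq * 1000.0
    body_weight_g = body_weight_kg * 1000.0
    return activity_kbq_per_ml / (injected_kbq / body_weight_g)

# e.g. 5.3 kBq/mL in a lesion, 370 MBq injected, 70 kg patient
print(round(suv_bw(5.3, 370.0, 70.0), 2))   # about 1.0
```

    In a uniform phantom, the background SUV should be close to 1.0, which is why background SUV measurements serve as a calibration check in this program.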

  8. Promoting the safety performance of industrial radiography using a quantitative assessment system.

    PubMed

    Kardan, M R; Mianji, F A; Rastkhah, N; Babakhani, A; Azad, S Borhan

    2006-12-01

    The increasing number of industrial radiographers and their considerable occupational exposure has been one of the main concerns of the Iran Nuclear Regulatory Authority (INRA) in recent years. In 2002, a quantitative system of evaluating the safety performance of licensees and a complementary enforcement system was introduced by the National Radiation Protection Department (NRPD). Each parameter of the practice is given a weighting factor according to its importance to safety. Assessment of the licensees is done quantitatively by summing up their scores using prepared tables. Implementing this system of evaluation showed a considerable decrease in deficiencies in the various centres. Tables are updated regularly as a result of findings during the inspections. This system is used in addition to enforcement to promote safety performance and to increase the culture of safety in industrial radiography.

  9. Using Technical Performance Measures

    NASA Technical Reports Server (NTRS)

    Garrett, Christopher J.; Levack, Daniel J. H.; Rhodes, Russel E.

    2011-01-01

    All programs have requirements. For these requirements to be met, there must be a means of measurement. A Technical Performance Measure (TPM) is defined to produce a measured quantity that can be compared to the requirement. In practice, the TPM is often expressed as a maximum or minimum and a goal. Example TPMs for a rocket program are: vacuum or sea level specific impulse (Isp), weight, reliability (often expressed as a failure rate), schedule, operability (turn-around time), design and development cost, production cost, and operating cost. Program status is evaluated by comparing the TPMs against specified values of the requirements. During the program, many design decisions are made and most of them affect some or all of the TPMs. Often, the same design decision changes some TPMs favorably while affecting other TPMs unfavorably. The problem then becomes how to compare the effects of a design decision on different TPMs. How much failure rate is one second of specific impulse worth? How many days of schedule is one pound of weight worth? In other words, how does one compare dissimilar quantities in order to trade and manage the TPMs to meet all requirements? One method that has been used successfully and has a mathematical basis is Utility Analysis. Utility Analysis enables quantitative comparison among dissimilar attributes. It uses a mathematical model that maps decision maker preferences over the tradeable range of each attribute. It is capable of modeling both independent and dependent attributes. Utility Analysis is well supported in the literature on Decision Theory. It has been used at Pratt & Whitney Rocketdyne for internal programs and for contracted work such as the J-2X rocket engine program. This paper describes the construction of TPMs and describes Utility Analysis. It then discusses the use of TPMs in design trades and to manage margin during a program using Utility Analysis.
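
    As a rough illustration of the additive multi-attribute form of utility analysis described here, U = sum of w_i * u_i(x_i) over the tradeable range of each attribute, a simplified sketch with linear single-attribute utilities and invented weights and ranges (real applications typically elicit nonlinear preference curves and may model dependent attributes) might be:

```python
def linear_utility(value, worst, best):
    """Single-attribute utility scaled to [0, 1] over the tradeable range (linear for simplicity)."""
    u = (value - worst) / (best - worst)
    return min(max(u, 0.0), 1.0)

def design_utility(attributes, weights, ranges):
    """Additive multi-attribute utility U = sum_i w_i * u_i(x_i) for independent attributes."""
    return sum(weights[name] * linear_utility(value, *ranges[name])
               for name, value in attributes.items())

# hypothetical trade between specific impulse (s) and engine weight (lb)
ranges = {"isp": (440.0, 465.0), "weight": (6500.0, 5500.0)}   # (worst, best) per attribute
weights = {"isp": 0.6, "weight": 0.4}
print(round(design_utility({"isp": 452.0, "weight": 6100.0}, weights, ranges), 2))  # 0.45
```

    Scoring competing designs on one common utility scale is what allows dissimilar TPMs, such as seconds of Isp against pounds of weight, to be traded directly.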

  10. Quantitative diagnostic performance of myocardial perfusion SPECT with attenuation correction in women.

    PubMed

    Wolak, Arik; Slomka, Piotr J; Fish, Mathews B; Lorenzo, Santiago; Berman, Daniel S; Germano, Guido

    2008-06-01

    Attenuation correction (AC) for myocardial perfusion SPECT (MPS) had not been evaluated separately in women despite specific considerations in this group because of breast photon attenuation. We aimed to evaluate the performance of AC in women by using automated quantitative analysis of MPS to avoid any bias. Consecutive female patients--134 with a low likelihood (LLk) of coronary artery disease (CAD) and 114 with coronary angiography performed within less than 3 mo of MPS--who were referred for rest-stress electrocardiography-gated 99mTc-sestamibi MPS with AC were considered. Imaging data were evaluated for contour quality control. An additional 50 LLk studies in women were used to create equivalent normal limits for studies with AC and with no correction (NC). An experienced technologist unaware of the angiography and other results performed the contour quality control. All other processing was performed in a fully automated manner. Quantitative analysis was performed with the Cedars-Sinai myocardial perfusion analysis package. All automated segmental analyses were performed with the 17-segment, 5-point American Heart Association model. Summed stress scores (SSS) of > or =3 were considered abnormal. CAD (> or =70% stenosis) was present in 69 of 114 patients (60%). The normalcy rates were 93% for both NC and AC studies. The SSS for patients with CAD and without CAD for NC versus AC were 10.0 +/- 9.0 (mean +/- SD) versus 10.2 +/- 8.5 and 1.6 +/- 2.3 versus 1.8 +/- 2.5, respectively; P was not significant (NS) for all comparisons of NC versus AC. The SSS for LLk patients for NC versus AC were 0.51 +/- 1.0 versus 0.6 +/- 1.1, respectively; P was NS. The specificity for both NC and AC was 73%. The sensitivities for NC and AC were 80% and 81%, respectively, and the accuracies for NC and AC were 77% and 78%, respectively; P was NS for both comparisons. There are no significant diagnostic differences between automated quantitative MPS analyses performed in studies

  11. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges.

    PubMed

    Ernst, Marielle; Kriston, Levente; Romero, Javier M; Frölich, Andreas M; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time.

  12. Phosphorescent nanoparticles for quantitative measurements of oxygen profiles in vitro and in vivo

    PubMed Central

    Choi, Nak Won; Verbridge, Scott S.; Williams, Rebecca M.; Chen, Jin; Kim, Ju-Young; Schmehl, Russel; Farnum, Cornelia E.; Zipfel, Warren R.; Fischbach, Claudia; Stroock, Abraham D.

    2012-01-01

    We present the development and characterization of nanoparticles loaded with a custom phosphor; we exploit these nanoparticles to perform quantitative measurements of the concentration of oxygen within three-dimensional (3-D) tissue cultures in vitro and blood vessels in vivo. We synthesized a customized ruthenium (Ru)-phosphor and incorporated it into polymeric nanoparticles via self-assembly. We demonstrate that the encapsulated phosphor is non-toxic with and without illumination. We evaluated two distinct modes of employing the phosphorescent nanoparticles for the measurement of concentrations of oxygen: 1) in vitro, in a 3-D microfluidic tumor model via ratiometric measurements of intensity with an oxygen-insensitive fluorophore as a reference, and 2) in vivo, in mouse vasculature using measurements of phosphorescence lifetime. With both methods, we demonstrated micrometer-scale resolution and absolute calibration to the dissolved oxygen concentration. Based on the ease and customizability of the synthesis of the nanoparticles and the flexibility of their application, these oxygen-sensing polymeric nanoparticles will find a natural home in a range of biological applications, benefiting studies of physiological as well as pathological processes in which oxygen availability and concentration play a critical role. PMID:22240511

  13. Quantitative single-photon emission computed tomography/computed tomography for technetium pertechnetate thyroid uptake measurement

    PubMed Central

    Lee, Hyunjong; Kim, Ji Hyun; Kang, Yeon-koo; Moon, Jae Hoon; So, Young; Lee, Won Woo

    2016-01-01

    Abstract Objectives: Technetium pertechnetate (99mTcO4) is a radioactive tracer used to assess thyroid function by thyroid uptake system (TUS). However, the TUS often fails to deliver accurate measurements of the percent of thyroid uptake (%thyroid uptake) of 99mTcO4. Here, we investigated the usefulness of quantitative single-photon emission computed tomography/computed tomography (SPECT/CT) after injection of 99mTcO4 in detecting thyroid function abnormalities. Materials and methods: We retrospectively reviewed data from 50 patients (male:female = 15:35; age, 46.2 ± 16.3 years; 17 Graves disease, 13 thyroiditis, and 20 euthyroid). All patients underwent 99mTcO4 quantitative SPECT/CT (185 MBq = 5 mCi), which yielded %thyroid uptake and standardized uptake value (SUV). Twenty-one (10 Graves disease and 11 thyroiditis) of the 50 patients also underwent conventional %thyroid uptake measurements using a TUS. Results: Quantitative SPECT/CT parameters (%thyroid uptake, SUVmean, and SUVmax) were the highest in Graves disease, second highest in euthyroid, and lowest in thyroiditis (P < 0.0001, Kruskal–Wallis test). TUS significantly overestimated the %thyroid uptake compared with SPECT/CT (P < 0.0001, paired t test) because other 99mTcO4 sources in addition to thyroid, such as salivary glands and saliva, contributed to the %thyroid uptake result by TUS, whereas %thyroid uptake, SUVmean and SUVmax from the SPECT/CT were associated with the functional status of thyroid. Conclusions: Quantitative SPECT/CT is more accurate than conventional TUS for measuring 99mTcO4 %thyroid uptake. Quantitative measurements using SPECT/CT may facilitate more accurate assessment of thyroid tracer uptake. PMID:27399139

  14. Refractive index variance of cells and tissues measured by quantitative phase imaging.

    PubMed

    Shan, Mingguang; Kandel, Mikhail E; Popescu, Gabriel

    2017-01-23

    The refractive index distribution of cells and tissues governs their interaction with light and can report on morphological modifications associated with disease. Through intensity-based measurements, refractive index information can be extracted only via scattering models that approximate light propagation. As a result, current knowledge of refractive index distributions across various tissues and cell types remains limited. Here we use quantitative phase imaging and the statistical dispersion relation (SDR) to extract information about the refractive index variance in a variety of specimens. Because the measurement is phase-resolved in three dimensions, our approach yields refractive index results without prior knowledge about the tissue thickness. With the recent progress in quantitative phase imaging systems, we anticipate that using SDR will become routine in assessing tissue optical properties.

  15. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  16. Quantitative Velocity Field Measurements in Reduced-Gravity Combustion Science and Fluid Physics Experiments

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.; Wernet, Mark P.

    1999-01-01

    Systems have been developed and demonstrated for performing quantitative velocity measurements in reduced gravity combustion science and fluid physics investigations. The unique constraints and operational environments inherent to reduced-gravity experimental facilities pose special challenges to the development of hardware and software systems. Both point and planar velocimetric capabilities are described, with particular attention being given to the development of systems to support the International Space Station laboratory. Emphasis has been placed on optical methods, primarily arising from the sensitivity of the phenomena of interest to intrusive probes. Limitations on available power, volume, data storage, and attendant expertise have motivated the use of solid-state sources and detectors, as well as efficient analysis capabilities emphasizing interactive data display and parameter control.

  17. Strategic Measures of Teacher Performance

    ERIC Educational Resources Information Center

    Milanowski, Anthony

    2011-01-01

    Managing the human capital in education requires measuring teacher performance. To measure performance, administrators need to combine measures of practice with measures of outcomes, such as value-added measures, and three measurement systems are needed: classroom observations, performance assessments or work samples, and classroom walkthroughs.…

  18. MTF measurements on real time for performance analysis of electro-optical systems

    NASA Astrophysics Data System (ADS)

    Stuchi, Jose Augusto; Signoreto Barbarini, Elisa; Vieira, Flavio Pascoal; dos Santos, Daniel, Jr.; Stefani, Mário Antonio; Yasuoka, Fatima Maria Mitsue; Castro Neto, Jarbas C.; Linhari Rodrigues, Evandro Luis

    2012-06-01

    The need for methods and tools that assist in determining the performance of optical systems is increasing. One of the most widely used methods for analyzing optical systems is measurement of the Modulation Transfer Function (MTF). The MTF provides a direct and quantitative verification of image quality. This paper presents the implementation of software to calculate the MTF of electro-optical systems. The software was used to calculate the MTF of a digital fundus camera, a thermal imager and an ophthalmologic surgery microscope. The MTF information aids the analysis of alignment and the measurement of optical quality, and also defines the limiting resolution of optical systems. The results obtained with the fundus camera and thermal imager were compared with theoretical values. For the microscope, the results were compared with the MTF measured on a Zeiss microscope, which is the quality standard for ophthalmological microscopes.
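
    As context for how an MTF can be computed, the sketch below estimates it from a synthetic edge profile by differentiating the edge spread function into a line spread function and taking the normalized magnitude of its Fourier transform. This is a generic, simplified approach under assumed sampling parameters, not the algorithm implemented in the software described above.

    ```python
    # Minimal sketch of one common MTF estimate: differentiate an edge spread
    # function (ESF) to get the line spread function (LSF), then take the
    # magnitude of its Fourier transform and normalize at zero frequency.
    # The synthetic ESF below is illustrative, not data from the paper.
    import numpy as np

    pixel_pitch_mm = 0.01                    # assumed detector sampling
    x = np.arange(256) * pixel_pitch_mm
    esf = 1.0 / (1.0 + np.exp(-(x - x.mean()) / 0.02))   # synthetic blurred edge

    lsf = np.gradient(esf, pixel_pitch_mm)   # LSF = derivative of ESF
    lsf /= lsf.sum()                         # normalize area to 1

    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                            # MTF(0) = 1 by definition
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)   # cycles per mm

    # e.g. report the frequency at which modulation drops below 50%
    f50 = freqs[np.argmax(mtf < 0.5)]
    print(f"MTF50 ~ {f50:.1f} cycles/mm")
    ```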

  19. Development of a novel nanoscratch technique for quantitative measurement of ice adhesion strength

    NASA Astrophysics Data System (ADS)

    Loho, T.; Dickinson, M.

    2018-04-01

    The mechanism for the way that ice adheres to surfaces is still not well understood. Currently there is no standard method to quantitatively measure how ice adheres to surfaces, which makes ice surface studies difficult to compare. A novel quantitative lateral force adhesion measurement at the micro-nano scale for ice was created that shears micro-nano sized ice droplets (less than 3 μm in diameter and 100 nm in height) using a nanoindenter. By using small ice droplets, the variables associated with bulk ice measurements were minimised, which increased data repeatability compared to bulk testing. The technique provided post-testing surface scans to confirm that the ice had been removed and that measurements were of ice adhesion strength. Results show that the ice adhesion strength of a material is greatly affected by the nano-scale surface roughness of the material, with rougher surfaces having higher ice adhesion strength.

  20. Large-Scale Interlaboratory Study to Develop, Analytically Validate and Apply Highly Multiplexed, Quantitative Peptide Assays to Measure Cancer-Relevant Proteins in Plasma*

    PubMed Central

    Abbatiello, Susan E.; Schilling, Birgit; Mani, D. R.; Zimmerman, Lisa J.; Hall, Steven C.; MacLean, Brendan; Albertolle, Matthew; Allen, Simon; Burgess, Michael; Cusack, Michael P.; Gosh, Mousumi; Hedrick, Victoria; Held, Jason M.; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kinsinger, Christopher R.; Lyssand, John; Makowski, Lee; Mesri, Mehdi; Rodriguez, Henry; Rudnick, Paul; Sadowski, Pawel; Sedransk, Nell; Shaddox, Kent; Skates, Stephen J.; Kuhn, Eric; Smith, Derek; Whiteaker, Jeffery R.; Whitwell, Corbin; Zhang, Shucha; Borchers, Christoph H.; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel C.; MacCoss, Michael J.; Neubert, Thomas A.; Paulovich, Amanda G.; Regnier, Fred E.; Tempst, Paul; Carr, Steven A.

    2015-01-01

    There is an increasing need in biology and clinical medicine to robustly and reliably measure tens to hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility, and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here, we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and seven control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data, we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to subnanogram/ml sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and interlaboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy-isotope-labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an interlaboratory clinical study of patient samples. Our study further establishes that LC

  1. How quantitative measures unravel design principles in multi-stage phosphorylation cascades.

    PubMed

    Frey, Simone; Millat, Thomas; Hohmann, Stefan; Wolkenhauer, Olaf

    2008-09-07

    We investigate design principles of linear multi-stage phosphorylation cascades by using quantitative measures for signaling time, signal duration and signal amplitude. We compare alternative pathway structures by varying the number of phosphorylations and the length of the cascade. We show that a model for a weakly activated pathway does not reflect the biological context well, unless it is restricted to certain parameter combinations. Focusing therefore on a more general model, we compare alternative structures with respect to a multivariate optimization criterion. We test the hypothesis that the structure of a linear multi-stage phosphorylation cascade is the result of an optimization process aiming for a fast response, defined by the minimum of the product of signaling time and signal duration. It is then shown that certain pathway structures minimize this criterion. Several popular models of MAPK cascades form the basis of our study. These models represent different levels of approximation, which we compare and discuss with respect to the quantitative measures.
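
    For concreteness, the sketch below computes signaling time, signal duration, and signal amplitude for a synthetic time course using the moment-based definitions common in this literature (signaling time as the mean time of the signal, duration as its temporal spread, amplitude as the average height of a rectangle of equal area and duration). The time course and parameters are illustrative, not taken from the MAPK models compared in the paper.

    ```python
    # Minimal sketch of moment-based quantitative measures of a signaling
    # time course. The signal is synthetic, not one of the MAPK models.
    import numpy as np

    t = np.linspace(0.0, 100.0, 2001)
    signal = t * np.exp(-t / 10.0)          # synthetic activation time course

    T0 = np.trapz(signal, t)                # area under the signal
    T1 = np.trapz(t * signal, t)
    T2 = np.trapz(t**2 * signal, t)

    signaling_time = T1 / T0                # "center of mass" in time
    signal_duration = np.sqrt(T2 / T0 - signaling_time**2)
    signal_amplitude = T0 / (2.0 * signal_duration)

    print(signaling_time, signal_duration, signal_amplitude)
    # The optimization criterion discussed above would then be
    # signaling_time * signal_duration (smaller = faster response).
    ```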

  2. Measuring comparative hospital performance.

    PubMed

    Griffith, John R; Alexander, Jeffrey A; Jelinek, Richard C

    2002-01-01

    Leading healthcare provider organizations now use a "balanced scorecard" of performance measures, expanding information reviewed at the governance level to include financial, customer, and internal performance information, as well as providing an opportunity to learn and grow to provide better strategic guidance. The approach, successfully used by other industries, uses competitor data and benchmarks to identify opportunities for improved mission achievement. This article evaluates one set of nine multidimensional hospital performance measures derived from Medicare reports (cash flow, asset turnover, mortality, complications, length of inpatient stay, cost per case, occupancy, change in occupancy, and percent of revenue from outpatient care). The study examines the content validity, reliability and sensitivity, validity of comparison, and independence and concludes that seven of the nine measures (all but the two occupancy measures) represent a potentially useful set for evaluating most U.S. hospitals. This set reflects correctable differences in performance between hospitals serving similar populations, that is, the measures reflect relative performance and identify opportunities to make the organization more successful.

  3. Effects of Computer Programming on Students' Cognitive Performance: A Quantitative Synthesis.

    ERIC Educational Resources Information Center

    Liao, Yuen-Kuang Cliff

    A meta-analysis was performed to synthesize existing data concerning the effects of computer programming on cognitive outcomes of students. Sixty-five studies were located from three sources, and their quantitative data were transformed into a common scale--Effect Size (ES). The analysis showed that 58 (89%) of the study-weighted ESs were positive…

  4. Guidelines for improving the reproducibility of quantitative multiparameter immunofluorescence measurements by laser scanning cytometry on fixed cell suspensions from human solid tumors.

    PubMed

    Shackney, Stanley; Emlet, David R; Pollice, Agnese; Smith, Charles; Brown, Kathryn; Kociban, Deborah

    2006-01-01

    Laser scanning Cytometry (LSC) is a versatile technology that makes it possible to perform multiple measurements on individual cells and correlate them cell by cell with other cellular features. It would be highly desirable to be able to perform reproducible, quantitative, correlated cell-based immunofluorescence studies on individual cells from human solid tumors. However, such studies can be challenging because of the presence of large numbers of cell aggregates and other confounding factors. Techniques have been developed to deal with cell aggregates in data sets collected by LSC. Experience has also been gained in addressing other key technical and methodological issues that can affect the reproducibility of such cell-based immunofluorescence measurements. We describe practical aspects of cell sample collection, cell fixation and staining, protocols for performing multiparameter immunofluorescence measurements by LSC, use of controls and reference samples, and approaches to data analysis that we have found useful in improving the accuracy and reproducibility of LSC data obtained in human tumor samples. We provide examples of the potential advantages of LSC in examining quantitative aspects of cell-based analysis. Improvements in the quality of cell-based multiparameter immunofluorescence measurements make it possible to extract useful information from relatively small numbers of cells. This, in turn, permits the performance of multiple multicolor panels on each tumor sample. With links among the different panels that are provided by overlapping measurements, it is possible to develop increasingly more extensive profiles of intracellular expression of multiple proteins in clinical samples of human solid tumors. Examples of such linked panels of measurements are provided. Advances in methodology can improve cell-based multiparameter immunofluorescence measurements on cell suspensions from human solid tumors by LSC for use in prognostic and predictive clinical

  5. Semi-automated quantitative Drosophila wings measurements.

    PubMed

    Loh, Sheng Yang Michael; Ogawa, Yoshitaka; Kawana, Sara; Tamura, Koichiro; Lee, Hwee Kuan

    2017-06-28

    Drosophila melanogaster is an important organism used in many fields of biological research such as genetics and developmental biology. Drosophila wings have been widely used to study the genetics of development, morphometrics and evolution. Therefore there is much interest in quantifying wing structures of Drosophila. Advancement in technology has increased the ease in which images of Drosophila can be acquired. However such studies have been limited by the slow and tedious process of acquiring phenotypic data. We have developed a system that automatically detects and measures key points and vein segments on a Drosophila wing. Key points are detected by performing image transformations and template matching on Drosophila wing images while vein segments are detected using an Active Contour algorithm. The accuracy of our key point detection was compared against key point annotations of users. We also performed key point detection using different training data sets of Drosophila wing images. We compared our software with an existing automated image analysis system for Drosophila wings and showed that our system performs better than the state of the art. Vein segments were manually measured and compared against the measurements obtained from our system. Our system was able to detect specific key points and vein segments from Drosophila wing images with high accuracy.
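
    A minimal sketch of the template-matching step is shown below using OpenCV's normalized cross-correlation; the image paths and template are hypothetical, and the published system additionally applies image transformations before matching and an Active Contour algorithm for vein segments.

    ```python
    # Minimal sketch of template matching for landmark (key point) detection
    # on a wing image. File names and template size are hypothetical.
    import cv2

    wing = cv2.imread("wing_image.png", cv2.IMREAD_GRAYSCALE)          # hypothetical path
    template = cv2.imread("keypoint_template.png", cv2.IMREAD_GRAYSCALE)

    # Normalized cross-correlation between the template and every image location.
    response = cv2.matchTemplate(wing, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(response)

    # Take the center of the best-matching window as the detected key point.
    h, w = template.shape
    keypoint = (max_loc[0] + w // 2, max_loc[1] + h // 2)
    print(f"key point at {keypoint}, correlation {max_val:.2f}")
    ```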

  6. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease

    PubMed Central

    van Gilst, Merel M.; van Mierlo, Petra; Bloem, Bastiaan R.; Overeem, Sebastiaan

    2015-01-01

    Study Objectives: Many people with Parkinson disease experience “sleep benefit”: temporarily improved mobility upon awakening. Here we used quantitative motor tasks to assess the influence of sleep on motor functioning in Parkinson disease. Design: Eighteen Parkinson patients with and 20 without subjective sleep benefit and 20 healthy controls participated. Before and directly after a regular night sleep and an afternoon nap, subjects performed the timed pegboard dexterity task and quantified finger tapping task. Subjective ratings of motor functioning and mood/vigilance were included. Sleep was monitored using polysomnography. Results: On both tasks, patients were overall slower than healthy controls (night: F2,55 = 16.938, P < 0.001; nap: F2,55 = 15.331, P < 0.001). On the pegboard task, there was a small overall effect of night sleep (F1,55 = 9.695, P = 0.003); both patients and controls were on average slightly slower in the morning. However, in both tasks there was no sleep*group interaction for either nighttime sleep or the afternoon nap. There was a modest correlation between the score on the pegboard task and self-rated motor symptoms among patients (rho = 0.233, P = 0.004). No correlations between task performance and mood/vigilance or sleep time/efficiency were found. Conclusions: A positive effect of sleep on motor function is commonly reported by Parkinson patients. Here we show that the subjective experience of sleep benefit is not paralleled by an actual improvement in motor functioning. Sleep benefit therefore appears to be a subjective phenomenon and not a Parkinson-specific reduction in symptoms. Citation: van Gilst MM, van Mierlo P, Bloem BR, Overeem S. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease. SLEEP 2015;38(10):1567–1573. PMID:25902811

  7. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time. PMID:26848840

  8. Single-case synthesis tools II: Comparing quantitative outcome measures.

    PubMed

    Zimmerman, Kathleen N; Pustejovsky, James E; Ledford, Jennifer R; Barton, Erin E; Severini, Katherine E; Lloyd, Blair P

    2018-03-07

    Varying methods for evaluating the outcomes of single case research designs (SCD) are currently used in reviews and meta-analyses of interventions. Quantitative effect size measures are often presented alongside visual analysis conclusions. Six measures across two classes, overlap measures (percentage non-overlapping data, improvement rate difference, and Tau) and parametric within-case effect sizes (standardized mean difference and log response ratio [increasing and decreasing]), were compared to determine if the choice of synthesis method within and across classes impacts conclusions regarding effectiveness. The effectiveness of sensory-based interventions (SBI), a commonly used class of treatments for young children, was evaluated. Separately from evaluations of rigor and quality, the authors evaluated behavior change between baseline and SBI conditions. SBI were unlikely to result in positive behavior change across all measures except IRD. However, subgroup analyses resulted in variable conclusions, indicating that the choice of measures for SCD meta-analyses can impact conclusions. Suggestions for using the log response ratio in SCD meta-analyses and considerations for understanding variability in SCD meta-analysis conclusions are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Quantitative magnetic resonance (QMR) measurement of changes in body composition of neonatal pigs

    USDA-ARS?s Scientific Manuscript database

    The survival of low birth weight pigs in particular may depend on energy stores in the body. QMR (quantitative magnetic resonance) is a new approach to measuring total body fat, lean and water. These measurements are based on quantifying protons associated with lipid and water molecules in the body...

  10. CONFOCAL MICROSCOPY SYSTEM PERFORMANCE: FOUNDATIONS FOR MEASUREMENTS, QUANTITATION AND SPECTROSCOPY

    EPA Science Inventory

    Confocal laser-scanning microscopy (CLSM) has enormous potential in many biological fields. The goal of a CLSM is to acquire and quantify fluorescence and, in some instruments, to acquire spectral characterization of the emitted signal. The accuracy of these measurements demands t...

  11. Issues to Consider When Measuring and Applying Socioeconomic Position Quantitatively in Immigrant Health Research

    PubMed Central

    Nielsen, Signe Smith; Hempler, Nana Folmann; Krasnik, Allan

    2013-01-01

    The relationship between migration and health is complex, yet, immigrant-related inequalities in health are largely influenced by socioeconomic position. Drawing upon previous findings, this paper discusses issues to consider when measuring and applying socioeconomic position in quantitative immigrant health research. When measuring socioeconomic position, it is important to be aware of four aspects: (1) there is a lack of clarity about how socioeconomic position should be measured; (2) different types of socioeconomic position may be relevant to immigrants compared with the native-born population; (3) choices of measures of socioeconomic position in quantitative analyses often rely on data availability; and (4) different measures of socioeconomic position have different effects in population groups. Therefore, caution should be used in the collection, presentation, analyses, and interpretation of data and researchers need to display their proposed conceptual models and data limitations as well as apply different approaches for analyses. PMID:24287857

  12. Point-of-Care Quantitative Measure of Glucose-6-Phosphate Dehydrogenase Enzyme Deficiency.

    PubMed

    Bhutani, Vinod K; Kaplan, Michael; Glader, Bertil; Cotten, Michael; Kleinert, Jairus; Pamula, Vamsee

    2015-11-01

    Widespread newborn screening on a point-of-care basis could prevent bilirubin neurotoxicity in newborns with glucose-6-phosphate dehydrogenase (G6PD) deficiency. We evaluated a quantitative G6PD assay on a digital microfluidic platform by comparing its performance with standard clinical methods. G6PD activity was measured quantitatively by using digital microfluidic fluorescence and the gold standard fluorescence biochemical test on a convenience sample of 98 discarded blood samples. Twenty-four samples were designated as G6PD deficient. Mean ± SD G6PD activity for normal samples was 9.7 ± 2.8 U/g hemoglobin (Hb) with the digital microfluidic method and 11.1 ± 3.0 U/g Hb with the standard method; for G6PD-deficient samples, it was 0.8 ± 0.7 and 1.4 ± 0.9 U/g Hb, respectively. Bland-Altman analysis determined a mean difference of -0.96 ± 1.8 U/g Hb between the digital microfluidic fluorescence results and the standard biochemical test results. The lower and upper limits for the digital microfluidic platform were 4.5 to 19.5 U/g Hb for normal samples and 0.2 to 3.7 U/g Hb for G6PD-deficient samples. The lower and upper limits for the Stanford method were 5.5 to 20.7 U/g Hb for normal samples and 0.1 to 2.8 U/g Hb for G6PD-deficient samples. The measured activity discriminated between G6PD-deficient samples and normal samples with no overlap. Pending further validation, a digital microfluidics platform could be an accurate point-of-care screening tool for rapid newborn G6PD screening. Copyright © 2015 by the American Academy of Pediatrics.
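
    For readers unfamiliar with the statistic quoted above, the sketch below computes a Bland-Altman bias and 95% limits of agreement between two paired sets of G6PD activity measurements. The values are placeholders, not the study data.

    ```python
    # Minimal sketch of a Bland-Altman comparison: mean difference (bias) and
    # 95% limits of agreement between two paired measurement methods.
    import numpy as np

    microfluidic = np.array([9.1, 10.4, 8.7, 0.9, 1.2])   # U/g Hb, hypothetical
    standard = np.array([10.2, 11.5, 9.9, 1.5, 1.8])      # U/g Hb, hypothetical

    diff = microfluidic - standard
    bias = diff.mean()
    sd = diff.std(ddof=1)
    limits = (bias - 1.96 * sd, bias + 1.96 * sd)          # 95% limits of agreement
    print(f"bias = {bias:.2f} U/g Hb, limits of agreement = {limits}")
    ```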

  13. A peptide-retrieval strategy enables significant improvement of quantitative performance without compromising confidence of identification.

    PubMed

    Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun

    2017-01-30

    Reliable quantification of low-abundance proteins in complex proteomes is challenging largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently-identified proteins were recovered, and the peptide-spectrum-match FDR was re-calculated and controlled at a confident level of ≤1%, while the protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. An increase of >60% in total quantified spectra/peptides was achieved for both a spike-in sample set and a public dataset from CPTAC. Incorporating the peptide retrieval strategy significantly improved the quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity of confidently discovering significantly-altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, which can be readily implemented in a broad range of quantitative proteomics techniques including label-free or labeling approaches. We hypothesize that more quantifiable spectra and peptides in a protein, even including less confident peptides, could help reduce variations and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches. The list of confidently identified proteins using the standard target-decoy search strategy was fixed and more spectra/peptides with less

  14. Measurement and Evaluation of Quantitative Performance of PET/CT Images before a Multicenter Clinical Trial.

    PubMed

    Zhu, Yanjia; Geng, Caizheng; Huang, Jia; Liu, Juzhen; Wu, Ning; Xin, Jun; Xu, Hao; Yu, Lijuan; Geng, Jianhua

    2018-06-13

    To ensure the reliability of the planned multi-center clinical trial, we assessed the consistency and comparability of the quantitative parameters of the eight PET/CT units that will be used in this trial. PET/CT images were acquired using a NEMA image quality phantom (Biodex) on eight Discovery PET/CT 690 units from GE Healthcare. The scanning parameters were the same as those to be used in the planned trial. The 18F-NaF concentration in the background was 5.3 kBq/ml; the spheres of diameter 37 mm, 22 mm, 17 mm and 10 mm were filled at 8:1 relative to the background, and the spheres of diameter 28 mm and 13 mm were filled with 0 kBq/ml. The consistency of the hot sphere recovery coefficient (HRC), cold sphere recovery coefficient (CRC), hot sphere contrast (QH) and cold sphere contrast (QC) among these eight PET/CT systems was analyzed. The variation of the main quantitative parameters across the eight PET/CT systems was within 10%, which is acceptable for the clinical trial.
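
    As an illustration of the quantities being compared, the sketch below computes a hot sphere recovery coefficient (measured divided by true activity concentration) and expresses between-scanner variation as a coefficient of variation. The measured values are placeholders, and treating "variation" as a coefficient of variation is an assumption; the abstract does not state its exact definition.

    ```python
    # Minimal sketch of a hot sphere recovery coefficient and between-scanner
    # variation. All measured values are placeholders, not data from the study.
    import numpy as np

    true_conc = 8 * 5.3            # kBq/ml, hot sphere filled at 8:1 of the 5.3 kBq/ml background
    measured_conc = 36.5           # kBq/ml, hypothetical value recovered from the image
    hrc = measured_conc / true_conc
    print(f"HRC = {hrc:.2f}")

    hrc_per_scanner = np.array([0.86, 0.88, 0.84, 0.87, 0.85, 0.89, 0.86, 0.83])  # hypothetical
    cv_percent = 100.0 * hrc_per_scanner.std(ddof=1) / hrc_per_scanner.mean()
    print(f"between-scanner CV = {cv_percent:.1f}%")
    ```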

  15. Quantitative stress measurement of elastic deformation using mechanoluminescent sensor: An intensity ratio model

    NASA Astrophysics Data System (ADS)

    Cai, Tao; Guo, Songtao; Li, Yongzeng; Peng, Di; Zhao, Xiaofeng; Liu, Yingzheng

    2018-04-01

    The mechanoluminescent (ML) sensor is a newly developed non-invasive technique for stress/strain measurement. However, its application has been mostly restricted to qualitative measurement due to the lack of a well-defined relationship between ML intensity and stress. To achieve accurate stress measurement, an intensity ratio model was proposed in this study to establish a quantitative relationship between the stress condition and its ML intensity in elastic deformation. To verify the proposed model, experiments were carried out on a ML measurement system using resin samples mixed with the sensor material SrAl2O4:Eu2+, Dy3+. The ML intensity ratio was found to be dependent on the applied stress and strain rate, and the relationship acquired from the experimental results agreed well with the proposed model. The current study provided a physical explanation for the relationship between ML intensity and its stress condition. The proposed model was applicable in various SrAl2O4:Eu2+, Dy3+-based ML measurement in elastic deformation, and could provide a useful reference for quantitative stress measurement using the ML sensor in general.

  16. Medical student attitudes towards older people: a critical review of quantitative measures.

    PubMed

    Wilson, Mark A G; Kurrle, Susan; Wilson, Ian

    2018-01-24

    Further research into medical student attitudes towards older people is important, and requires accurate and detailed evaluative methodology. The two objectives for this paper are: (1) From the literature, to critically review instruments of measure for medical student attitudes towards older people, and (2) To recommend the most appropriate quantitative instrument for future research into medical student attitudes towards older people. A SCOPUS and Ovid cross search was performed using the keywords Attitude and medical student and aged or older or elderly. This search was supplemented by manual searching, guided by citations in articles identified by the initial literature search, using the SCOPUS and PubMed databases. International studies quantifying medical student attitudes have demonstrated neutral to positive attitudes towards older people, using various instruments. The most commonly used instruments are the Ageing Semantic Differential (ASD) and the University of California Los Angeles Geriatric Attitudes Scale, with several other measures occasionally used. All instruments used to date have inherent weaknesses. A reliable and valid instrument with which to quantify modern medical student attitudes towards older people has not yet been developed. Adaptation of the ASD for contemporary usage is recommended.

  17. Porous Silicon Antibody Microarrays for Quantitative Analysis: Measurement of Free and Total PSA in Clinical Plasma Samples

    PubMed Central

    Tojo, Axel; Malm, Johan; Marko-Varga, György; Lilja, Hans; Laurell, Thomas

    2014-01-01

    Antibody microarrays have become widespread, but their use for quantitative analyses in clinical samples has not yet been established. We investigated an immunoassay based on nanoporous silicon antibody microarrays for quantification of total prostate-specific antigen (PSA) in 80 clinical plasma samples, and provide quantitative data from a duplex microarray assay that simultaneously quantifies free and total PSA in plasma. To further develop the assay, the porous silicon chips were placed into a standard 96-well microtiter plate for higher throughput analysis. The samples analyzed by this quantitative microarray were 80 plasma samples obtained from men undergoing clinical PSA testing (dynamic range: 0.14-44 ng/ml, LOD: 0.14 ng/ml). The second dataset, measuring free PSA (dynamic range: 0.40-74.9 ng/ml, LOD: 0.47 ng/ml) and total PSA (dynamic range: 0.87-295 ng/ml, LOD: 0.76 ng/ml), was also obtained from the clinical routine. The reference for the quantification was a commercially available assay, the ProStatus PSA Free/Total DELFIA. In an analysis of 80 plasma samples, the microarray platform performed well across the range of total PSA levels. This assay might have the potential to substitute for the large-scale microtiter plate format in diagnostic applications. The duplex assay paves the way for a future quantitative multiplex assay, which analyses several prostate cancer biomarkers simultaneously. PMID:22921878

  18. Utility of DWI with quantitative ADC values in ovarian tumors: a meta-analysis of diagnostic test performance.

    PubMed

    Pi, Shan; Cao, Rong; Qiang, Jin Wei; Guo, Yan Hui

    2018-01-01

    Background Diffusion-weighted imaging (DWI) and quantitative apparent diffusion coefficient (ADC) values are widely used in the differential diagnosis of ovarian tumors. Purpose To assess the diagnostic performance of quantitative ADC values in ovarian tumors. Material and Methods PubMed, Embase, the Cochrane Library, and local databases were searched for studies assessing ovarian tumors using quantitative ADC values. We quantitatively analyzed the diagnostic performances for two clinical problems: benign vs. malignant tumors and borderline vs. malignant tumors. We evaluated diagnostic performances by the pooled sensitivity and specificity values and by summary receiver operating characteristic (SROC) curves. Subgroup analyses were used to analyze study heterogeneity. Results From the 742 studies identified in the search results, 16 studies met our inclusion criteria. A total of ten studies evaluated malignant vs. benign ovarian tumors and six studies assessed malignant vs. borderline ovarian tumors. Regarding the diagnostic accuracy of quantitative ADC values for distinguishing between malignant and benign ovarian tumors, the pooled sensitivity and specificity values were 0.91 and 0.91, respectively. The area under the SROC curve (AUC) was 0.96. For differentiating borderline from malignant tumors, the pooled sensitivity and specificity values were 0.89 and 0.79, and the AUC was 0.91. The methodological quality of the included studies was moderate. Conclusion Quantitative ADC values could serve as useful preoperative markers for predicting the nature of ovarian tumors. Nevertheless, prospective trials focused on standardized imaging parameters are needed to evaluate the clinical value of quantitative ADC values in ovarian tumors.

  19. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented for a group of subjects with significant coronary artery stenosis and a group of controls, obtained using a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed on the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the subjective assessment method used clinically.

  20. Tophaceous gout: quantitative evaluation by direct physical measurement.

    PubMed

    Schumacher, H Ralph; Becker, Michael A; Palo, William A; Streit, Janet; MacDonald, Patricia A; Joseph-Ridge, Nancy

    2005-12-01

    The absence of accepted standardized methods for monitoring tophaceous gout limits the ability to track tophus progression or regression. This multicenter study assessed intra- and interrater reproducibility of a simple and direct physical measurement. The quantitative evaluation was the area (mm2) of each measurable tophus and was determined independently by 2 raters on 2 occasions within 10 days. Intra- and interrater reproducibilities were determined by calculating mean differences and average percentage differences (APD) in measurements of areas for the same tophus at each of 2 visits and by each rater, respectively. Fifty-two tophi were measured in 13 subjects: 22 on the hand/wrist, 16 on the elbow, and 14 on the foot/ankle. The mean (+/- SD) difference in tophus areas between visits was -0.2 +/- 835 mm2 (95% CI -162 to 162 mm2) and the mean (+/- SD) APD was 29% +/- 33%. The mean (+/- SD) APD between raters was 32% +/- 27%. The largest variations in measurements were noted for elbow tophi and variations were least for well demarcated tophi on the hands. This simple and reproducible method can be easily utilized in clinical trials and in practice as a measure of efficacy of urate-lowering treatment in tophaceous gout. Among factors contributing to variability in these measurements were the anatomic site of tophi and rater experience with the method. Restriction of measurements to well circumscribed hand or foot tophi could improve reliability, but major changes, as expected with effective therapy, can clearly be documented with this simple technique.
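
    The sketch below illustrates an average percentage difference (APD) calculation of the kind used above to summarize reproducibility. The tophus areas are placeholders, and the normalization (dividing each absolute difference by the mean of the paired measurements) is an assumed convention rather than the study's stated formula.

    ```python
    # Minimal sketch of an average percentage difference (APD) between two
    # visits' measurements of the same tophi. Values are placeholders.
    import numpy as np

    areas_visit1 = np.array([310.0, 1250.0, 87.0, 540.0])   # mm2, hypothetical tophi
    areas_visit2 = np.array([295.0, 1400.0, 95.0, 470.0])

    pct_diff = 100.0 * np.abs(areas_visit1 - areas_visit2) / ((areas_visit1 + areas_visit2) / 2.0)
    apd = pct_diff.mean()
    print(f"intrarater APD = {apd:.0f}%")
    ```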

  1. Intentional Movement Performance Ability (IMPA): a method for robot-aided quantitative assessment of motor function.

    PubMed

    Shin, Sung Yul; Kim, Jung Yoon; Lee, Sanghyeop; Lee, Junwon; Kim, Seung-Jong; Kim, ChangHwan

    2013-06-01

    The purpose of this paper is to propose a new assessment method for evaluating the motor function of patients who suffer from physical weakness after stroke, incomplete spinal cord injury (iSCI) or other diseases. In this work, we use a robotic device to obtain information on the interaction that occurs between patient and robot, and use it as a measure for assessing the patients. The Intentional Movement Performance Ability (IMPA) is defined as the root mean square of the interactive torque while the subject performs a given periodic movement with the robot. IMPA is proposed to quantitatively determine the level of the subject's impaired motor function. The method is indirectly tested by asking healthy subjects to lift a barbell to disturb their motor function. The experimental results show that the IMPA has potential for providing proper information on the subject's motor function level.
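
    Since IMPA is defined as a root mean square of the interaction torque, a minimal sketch of that computation on a synthetic torque trace is given below; the sampling rate and signal are illustrative, not data from the robotic device used in the study.

    ```python
    # Minimal sketch of an RMS interaction-torque computation (IMPA-style).
    # The torque trace is synthetic.
    import numpy as np

    dt = 0.01                                  # s, assumed sampling interval
    t = np.arange(0.0, 20.0, dt)
    interactive_torque = 0.8 * np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.randn(t.size)  # N*m

    impa = np.sqrt(np.mean(interactive_torque**2))
    print(f"IMPA = {impa:.3f} N*m")
    ```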

  2. Developing Effective Performance Measures

    DTIC Science & Technology

    2014-10-14

    Slide excerpt from "Developing Effective Performance Measures" (Kasunic, Carnegie Mellon University, October 14, 2014). Under the heading "When Performance Measurement Goes Bad", the slides list common pitfalls: laziness, vanity, narcissism (measuring performance from the organization's own point of view), too many measures, pettiness, and inanity.

  3. JMorph: Software for performing rapid morphometric measurements on digital images of fossil assemblages

    NASA Astrophysics Data System (ADS)

    Lelièvre, Peter G.; Grey, Melissa

    2017-08-01

    Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses but there is not much available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages that are designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.
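
    As an example of the kind of size measurement such software derives from a digitized outline, the sketch below computes the area (shoelace formula) and perimeter of a closed polygon of outline vertices. The coordinates are placeholders, and this is a generic computation, not JMorph's implementation.

    ```python
    # Minimal sketch of basic morphometric measurements from a digitized
    # outline given as (x, y) vertices in calibrated units. Placeholder data.
    import numpy as np

    outline = np.array([[0.0, 0.0], [4.0, 0.5], [5.0, 3.0], [2.5, 4.5], [0.5, 3.0]])

    x, y = outline[:, 0], outline[:, 1]
    # Shoelace formula for the area of a simple closed polygon.
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    # Perimeter as the sum of edge lengths, closing the outline.
    closed = np.vstack([outline, outline[:1]])
    perimeter = np.sum(np.linalg.norm(np.diff(closed, axis=0), axis=1))
    print(f"area = {area:.2f}, perimeter = {perimeter:.2f}")
    ```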

  4. Quantitative CT Measures of Bronchiectasis in Smokers.

    PubMed

    Diaz, Alejandro A; Young, Thomas P; Maselli, Diego J; Martinez, Carlos H; Gill, Ritu; Nardelli, Pietro; Wang, Wei; Kinney, Gregory L; Hokanson, John E; Washko, George R; San Jose Estepar, Raul

    2017-06-01

    Bronchiectasis is frequent in smokers with COPD; however, there are only limited data on objective assessments of this process. The objective was to assess bronchovascular morphology, calculate the ratio of the diameters of bronchial lumen and adjacent artery (BA ratio), and identify those measurements able to discriminate bronchiectasis. We collected quantitative CT (QCT) measures of BA ratios, peak wall attenuation, wall thickness (WT), wall area, and wall area percent (WA%) at matched fourth through sixth airway generations in 21 ever smokers with bronchiectasis (cases) and 21 never-smoking control patients (control airways). In cases, measurements were collected at both bronchiectatic and nonbronchiectatic airways. Logistic analysis and the area under receiver operating characteristic curve (AUC) were used to assess the predictive ability of QCT measurements for bronchiectasis. The whole-lung and fourth through sixth airway generation BA ratio, WT, and WA% were significantly greater in bronchiectasis cases than control patients. The AUCs for the BA ratio to predict bronchiectasis ranged from 0.90 (whole lung) to 0.79 (fourth-generation). AUCs for WT and WA% ranged from 0.72 to 0.75 and from 0.71 to 0.75. The artery diameters but not bronchial diameters were smaller in bronchiectatic than both nonbronchiectatic and control airways (P < .01 for both). Smoking-related increases in the BA ratio appear to be driven by reductions in vascular caliber. QCT measures of BA ratio, WT, and WA% may be useful to objectively identify and quantify bronchiectasis in smokers. ClinicalTrials.gov; No.: NCT00608764; URL: www.clinicaltrials.gov. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  5. Phase calibration target for quantitative phase imaging with ptychography.

    PubMed

    Godden, T M; Muñiz-Piniella, A; Claverley, J D; Yacoot, A; Humphry, M J

    2016-04-04

    Quantitative phase imaging (QPI) utilizes refractive index and thickness variations that lead to optical phase shifts. This gives contrast to images of transparent objects. In quantitative biology, phase images are used to accurately segment cells and calculate properties such as dry mass, volume and proliferation rate. The fidelity of the measured phase shifts is of critical importance in this field. However to date, there has been no standardized method for characterizing the performance of phase imaging systems. Consequently, there is an increasing need for protocols to test the performance of phase imaging systems using well-defined phase calibration and resolution targets. In this work, we present a candidate for a standardized phase resolution target, and measurement protocol for the determination of the transfer of spatial frequencies, and sensitivity of a phase imaging system. The target has been carefully designed to contain well-defined depth variations over a broadband range of spatial frequencies. In order to demonstrate the utility of the target, we measure quantitative phase images on a ptychographic microscope, and compare the measured optical phase shifts with Atomic Force Microscopy (AFM) topography maps and surface profile measurements from coherence scanning interferometry. The results show that ptychography has fully quantitative nanometer sensitivity in optical path differences over a broadband range of spatial frequencies for feature sizes ranging from micrometers to hundreds of micrometers.

  6. Separation and quantitation of polyethylene glycols 400 and 3350 from human urine by high-performance liquid chromatography.

    PubMed

    Ryan, C M; Yarmush, M L; Tompkins, R G

    1992-04-01

    Polyethylene glycol 3350 (PEG 3350) is useful as an orally administered probe to measure in vivo intestinal permeability to macromolecules. Previous methods to detect polyethylene glycol (PEG) excreted in the urine have been hampered by inherent inaccuracies associated with liquid-liquid extraction and turbidimetric analysis. For accurate quantitation by previous methods, radioactive labels were required. This paper describes a method to separate and quantitate PEG 3350 and PEG 400 in human urine that is independent of radioactive labels and is accurate in clinical practice. The method uses sized regenerated cellulose membranes and mixed ion-exchange resin for sample preparation and high-performance liquid chromatography with refractive index detection for analysis. The 24-h excretion for normal individuals after an oral dose of 40 g of PEG 3350 and 5 g of PEG 400 was 0.12 +/- 0.04% of the original dose of PEG 3350 and 26.3 +/- 5.1% of the original dose of PEG 400.

  7. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  8. Measuring iron in the brain using quantitative susceptibility mapping and X-ray fluorescence imaging

    PubMed Central

    Zheng, Weili; Nichol, Helen; Liu, Saifeng; Cheng, Yu-Chung N.; Haacke, E. Mark

    2013-01-01

    Measuring iron content in the brain has important implications for a number of neurodegenerative diseases. Quantitative susceptibility mapping (QSM), derived from magnetic resonance images, has been used to measure total iron content in vivo and in post mortem brain. In this paper, we show how magnetic susceptibility from QSM correlates with total iron content measured by X-ray fluorescence (XRF) imaging and by inductively coupled plasma mass spectrometry (ICPMS). The relationship between susceptibility and ferritin iron was estimated at 1.10 ± 0.08 ppb susceptibility per μg iron/g wet tissue, similar to that of iron in fixed (frozen/thawed) cadaveric brain and previously published data from unfixed brains. We conclude that magnetic susceptibility can provide a direct and reliable quantitative measurement of iron content and that it can be used clinically at least in regions with high iron content. PMID:23591072
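
    The reported calibration implies a simple conversion from a regional susceptibility value to an iron concentration estimate, sketched below; the susceptibility value is a placeholder, and applying the slope this way assumes a zero intercept and a region where the ferritin-iron relationship holds.

    ```python
    # Minimal sketch converting a QSM susceptibility value into an iron content
    # estimate using the slope reported above (1.10 ppb susceptibility per
    # microgram of iron per gram wet tissue). Placeholder measurement; zero
    # intercept assumed.
    susceptibility_ppb = 120.0                 # hypothetical regional QSM value, in ppb
    slope_ppb_per_ug_per_g = 1.10              # calibration slope from the XRF/ICPMS comparison
    iron_ug_per_g = susceptibility_ppb / slope_ppb_per_ug_per_g
    print(f"estimated iron content ~ {iron_ug_per_g:.0f} ug iron/g wet tissue")
    ```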

  9. Measuring the Beginning: A Quantitative Study of the Transition to Higher Education

    ERIC Educational Resources Information Center

    Brooman, Simon; Darwent, Sue

    2014-01-01

    This quantitative study measures change in certain factors known to influence success of first-year students during the transition to higher education: self-efficacy, autonomous learning and social integration. A social integration scale was developed with three subscales: "sense of belonging", "relationship with staff" and…

  10. Quantitative measures with WREX usage.

    PubMed

    Shank, Tracy M; Wee, Jinyong; Ty, Jennifer; Rahman, Tariq

    2017-07-01

    This paper presents the results of two surveys conducted with users of a functional upper extremity orthosis called the Wilmington Robotic EXoskeleton (WREX). The WREX is a passive anti-gravity arm orthosis that allows people with neuromuscular disabilities to move their arms in three dimensions. An online user survey with 55 patients was conducted to determine the benefits of the WREX. The survey asked 10 questions related to upper extremity function with and without the WREX as well as subjective impressions of the device. A second survey used a phone interview based on the Canadian Occupational Performance Measure (COPM). Parents rated their child's performance and satisfaction while partaking in important activities both with and without the exoskeleton device. Scores were assessed for change between the two conditions. Twenty-five families responded to this survey. Twenty-four out of 25 subjects reported greater levels of performance and satisfaction when they were wearing the WREX. The mean change in performance score was 3.61 points, and the mean change in satisfaction score was 4.44 points. Results show a statistically significant improvement in arm function for everyday tasks with the WREX.

  11. Technical note: quantitative measures of iris color using high resolution photographs.

    PubMed

    Edwards, Melissa; Gozdzik, Agnes; Ross, Kendra; Miles, Jon; Parra, Esteban J

    2012-01-01

    Our understanding of the genetic architecture of iris color is still limited. This is partly related to difficulties associated with obtaining quantitative measurements of eye color. Here we introduce a new automated method for measuring iris color using high resolution photographs. This method extracts color measurements in the CIE 1976 L*a*b* (CIELAB) color space from a 256 by 256 pixel square sampled from the 9:00 meridian of the iris. Color is defined across three dimensions: L* (the lightness coordinate), a* (the red-green coordinate), and b* (the blue-yellow coordinate). We applied this method to a sample of individuals of diverse ancestry (East Asian, European and South Asian) that was genotyped for the HERC2 rs12913832 polymorphism, which is strongly associated with blue eye color. We identified substantial variation in the CIELAB color space, not only in the European sample, but also in the East Asian and South Asian samples. As expected, rs12913832 was significantly associated with quantitative iris color measurements in subjects of European ancestry. However, this SNP was also strongly associated with iris color in the South Asian sample, although there were no participants with blue irides in this sample. The usefulness of this method is not restricted only to the study of iris pigmentation. High-resolution pictures of the iris will also make it possible to study the genetic variation involved in iris textural patterns, which show substantial heritability in human populations. Copyright © 2011 Wiley Periodicals, Inc.
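
    The sketch below illustrates the kind of patch-based CIELAB extraction described above, assuming an RGB photograph and using scikit-image for the color-space conversion; the patch coordinates standing in for the 9:00 meridian of the iris and the file name are hypothetical.

```python
# A minimal sketch of extracting a mean CIELAB color from a fixed 256 x 256 patch of an
# iris photograph, in the spirit of the method described above. The patch location
# (row0, col0) and the image path are placeholders, not the paper's exact sampling rule.
from skimage import io, color, img_as_float

def mean_cielab(image_path: str, row0: int, col0: int, size: int = 256):
    rgb = img_as_float(io.imread(image_path))            # H x W x 3, values in [0, 1]
    patch = rgb[row0:row0 + size, col0:col0 + size, :]   # 256 x 256 sample square
    lab = color.rgb2lab(patch)                           # convert to CIE 1976 L*a*b*
    return lab.reshape(-1, 3).mean(axis=0)               # mean L*, a*, b*

# L, a, b = mean_cielab("iris_right_eye.jpg", row0=1200, col0=400)  # hypothetical inputs
```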

  12. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.

  13. Quantitative measurement of indomethacin crystallinity in indomethacin-silica gel binary system using differential scanning calorimetry and X-ray powder diffractometry.

    PubMed

    Pan, Xiaohong; Julian, Thomas; Augsburger, Larry

    2006-02-10

    Differential scanning calorimetry (DSC) and X-ray powder diffractometry (XRPD) methods were developed for the quantitative analysis of the crystallinity of indomethacin (IMC) in an IMC-silica gel (SG) binary system. The DSC calibration curve exhibited better linearity than that of XRPD. No phase transformation occurred in the IMC-SG mixtures during DSC measurement. The major sources of error in DSC measurements were inhomogeneous mixing and sampling. Analyzing the amount of IMC in the mixtures using high-performance liquid chromatography (HPLC) could reduce the sampling error. DSC demonstrated greater sensitivity and had less variation in measurement than XRPD in quantifying crystalline IMC in the IMC-SG binary system.
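
    A hedged sketch of the linear calibration implied above: DSC responses measured for mixtures of known crystallinity are fit to a straight line, which is then inverted to estimate the crystallinity of an unknown sample. All numerical values are invented for illustration and are not from the study.

```python
# Hedged sketch of a linear calibration curve for crystallinity: fit measured DSC
# melting enthalpy (J/g) against known % crystallinity, then invert the line to
# estimate the crystallinity of an unknown. Numbers are illustrative only.
import numpy as np

known_crystallinity = np.array([10.0, 25.0, 50.0, 75.0, 100.0])   # % crystalline IMC
measured_enthalpy   = np.array([11.2, 27.5, 55.1, 81.9, 109.0])   # J/g (invented)

slope, intercept = np.polyfit(known_crystallinity, measured_enthalpy, 1)

def crystallinity_from_enthalpy(dh: float) -> float:
    """Invert the calibration line to estimate % crystallinity from enthalpy (J/g)."""
    return (dh - intercept) / slope

print(round(crystallinity_from_enthalpy(60.0), 1))
```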

  14. Practicable methods for histological section thickness measurement in quantitative stereological analyses.

    PubMed

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger; Blutke, Andreas

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measurement of histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness measurement values due to unintentional slightly oblique (non-orthogonal) positioning of the re-embedded sample-section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section, and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation of the measured section thickness of the calibration foil and its factual thickness, using basic geometry. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm thick semi-thin Epon (glycid ether)-sections and of 1-3 μm thick plastic sections (glycolmethacrylate/methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could be precisely measured within a few seconds. Compared to the measured section thicknesses determined by ORE, SR measures displayed less than 1% deviation. Our results prove the applicability
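
    The geometric correction described above can be sketched as follows, assuming that oblique sectioning inflates the apparent thickness by a factor of 1/cos(angle); the foil and section thickness values are illustrative only.

```python
# A minimal sketch of the geometric correction described above, under the assumption
# that oblique sectioning inflates apparent thickness by 1 / cos(angle).
# Thickness values are illustrative, not from the study.
import math

def section_angle(foil_true_um: float, foil_measured_um: float) -> float:
    """Section-plane angle (radians) inferred from the calibration foil."""
    return math.acos(foil_true_um / foil_measured_um)

def corrected_thickness(section_measured_um: float, angle_rad: float) -> float:
    """Correct the measured section thickness for oblique (non-orthogonal) cutting."""
    return section_measured_um * math.cos(angle_rad)

angle = section_angle(foil_true_um=25.0, foil_measured_um=26.1)
print(round(corrected_thickness(section_measured_um=1.60, angle_rad=angle), 3))
```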

  15. Practicable methods for histological section thickness measurement in quantitative stereological analyses

    PubMed Central

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measurement of histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness measurement values due to unintentional slightly oblique (non-orthogonal) positioning of the re-embedded sample-section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section, and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation of the measured section thickness of the calibration foil and its factual thickness, using basic geometry. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm thick semi-thin Epon (glycid ether)-sections and of 1–3 μm thick plastic sections (glycolmethacrylate/methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could be precisely measured within a few seconds. Compared to the measured section thicknesses determined by ORE, SR measures displayed less than 1% deviation. Our results prove the applicability

  16. Using Performance Tasks to Improve Quantitative Reasoning in an Introductory Mathematics Course

    ERIC Educational Resources Information Center

    Kruse, Gerald; Drews, David

    2013-01-01

    A full-cycle assessment of our efforts to improve quantitative reasoning in an introductory math course is described. Our initial iteration substituted more open-ended performance tasks for the active learning projects than had been used. Using a quasi-experimental design, we compared multiple sections of the same course and found non-significant…

  17. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  18. Exploring the state of health and safety management system performance measurement in mining organizations.

    PubMed

    Haas, Emily Joy; Yorio, Patrick

    2016-03-01

    Complex arguments continue to be articulated regarding the theoretical foundation of health and safety management system (HSMS) performance measurement. The culmination of these efforts has begun to enhance a collective understanding. Despite this enhanced theoretical understanding, however, there are still continuing debates and little consensus. The goal of the current research effort was to empirically explore common approaches to HSMS performance measurement in mining organizations. The purpose was to determine whether value and insight could be added to ongoing discussions of the best ways to engage in health and safety performance measurement. Nine site-level health and safety management professionals were provided with 133 practices corresponding to 20 HSMS elements, each fitting into the plan, do, check, act phases common to most HSMS. Participants were asked to supply detailed information as to how they (1) assess the performance of each practice in their organization, or (2) would assess each practice if it were an identified strategic imperative. Qualitative content analysis indicated that the approximately 1200 responses provided could be described and categorized into interventions, organizational performance, and worker performance. A discussion of how these categories relate to existing indicator frameworks is provided. The analysis also revealed divergence in two important measurement issues: (1) quantitative vs qualitative measurement and reporting; and (2) the primary use of objective or subjective metrics. In light of these findings, we ultimately recommend a balanced measurement and reporting approach within the three metric categories and conclude with suggestions for future research.

  19. Quantitative Laser-Saturated Fluorescence Measurements of Nitric Oxide in a Heptane Spray Flame

    NASA Technical Reports Server (NTRS)

    Cooper, Clayton S.; Laurendeau, Normand M.; Lee, Chi (Technical Monitor)

    1997-01-01

    We report spatially resolved laser-saturated fluorescence measurements of NO concentration in a pre-heated, lean-direct injection (LDI) spray flame at atmospheric pressure. The spray is produced by a hollow-cone, pressure-atomized nozzle supplied with liquid heptane. NO is excited via the Q2(26.5) transition of the gamma(0,0) band. Detection is performed in a 2-nm region centered on the gamma(0,1) band. Because of the relatively close spectral spacing between the excitation (226 nm) and detection wavelengths (236 nm), the gamma(0,1) band of NO cannot be isolated from the spectral wings of the Mie scattering signal produced by the spray. To account for the resulting superposition of the fluorescence and scattering signals, a background subtraction method has been developed that utilizes a nearby non-resonant wavelength. Excitation scans have been performed to locate the optimum off-line wavelength. Detection scans have been performed at problematic locations in the flame to determine possible fluorescence interferences from UHCs and PAHs at both the on-line and off-line excitation wavelengths. Quantitative radial NO profiles are presented and analyzed so as to better understand the operation of lean-direct injectors for gas turbine combustors.

  20. Reduced short term memory in congenital adrenal hyperplasia (CAH) and its relationship to spatial and quantitative performance.

    PubMed

    Collaer, Marcia L; Hindmarsh, Peter C; Pasterski, Vickie; Fane, Briony A; Hines, Melissa

    2016-02-01

    Girls and women with classical congenital adrenal hyperplasia (CAH) experience elevated androgens prenatally and show increased male-typical development for certain behaviors. Further, individuals with CAH receive glucocorticoid (GC) treatment postnatally, and this GC treatment could have negative cognitive consequences. We investigated two alternative hypotheses, that: (a) early androgen exposure in females with CAH masculinizes (improves) spatial perception and quantitative abilities at which males typically outperform females, or (b) CAH is associated with performance decrements in these domains, perhaps due to reduced short-term-memory (STM). Adolescent and adult individuals with CAH (40 female and 29 male) were compared with relative controls (29 female and 30 male) on spatial perception and quantitative abilities as well as on Digit Span (DS) to assess STM and on Vocabulary to assess general intelligence. Females with CAH did not perform better (more male-typical) on spatial perception or quantitative abilities than control females, failing to support the hypothesis of cognitive masculinization. Rather, in the sample as a whole individuals with CAH scored lower on spatial perception (p ≤ .009), a quantitative composite (p ≤ .036), and DS (p ≤ .001), despite no differences in general intelligence. Separate analyses of adolescent and adult participants suggested the spatial and quantitative effects might be present only in adult patients with CAH; however, reduced DS performance was found in patients with CAH regardless of age group. Separate regression analyses showed that DS predicted both spatial perception and quantitative performance (both p ≤ .001), when age, sex, and diagnosis status were controlled. Thus, reduced STM in CAH patients versus controls may have more general cognitive consequences, potentially reducing spatial perception and quantitative skills. Although hyponatremia or other aspects of salt-wasting crises or additional hormone

  1. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are among the drugs most frequently screened for in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because benzodiazepine concentrations in biological samples can vary with bleeding, postmortem changes, and redistribution, potentially biasing forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this important drug class. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane:ethyl acetate, with subsequent detection by a high-performance liquid chromatography method coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peak due to interfering substances in samples was observed. Conclusion: The present method was selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory. PMID:27635251

  2. Measurements of morphology and refractive indexes on human downy hairs using three-dimensional quantitative phase imaging.

    PubMed

    Lee, SangYun; Kim, Kyoohyun; Lee, Yuhyun; Park, Sungjin; Shin, Heejae; Yang, Jongwon; Ko, Kwanhong; Park, HyunJoo; Park, YongKeun

    2015-01-01

    We present optical measurements of morphology and refractive indexes (RIs) of human downy arm hairs using three-dimensional (3-D) quantitative phase imaging techniques. 3-D RI tomograms and high-resolution two-dimensional synthetic aperture images of individual downy arm hairs were measured using a Mach–Zehnder laser interferometric microscopy equipped with a two-axis galvanometer mirror. From the measured quantitative images, the RIs and morphological parameters of downy hairs were noninvasively quantified including the mean RI, volume, cylinder, and effective radius of individual hairs. In addition, the effects of hydrogen peroxide on individual downy hairs were investigated.

  3. Confirmatory Factor Analytic Structure and Measurement Invariance of Quantitative Autistic Traits Measured by the Social Responsiveness Scale-2

    ERIC Educational Resources Information Center

    Frazier, Thomas W.; Ratliff, Kristin R.; Gruber, Chris; Zhang, Yi; Law, Paul A.; Constantino, John N.

    2014-01-01

    Understanding the factor structure of autistic symptomatology is critical to the discovery and interpretation of causal mechanisms in autism spectrum disorder. We applied confirmatory factor analysis and assessment of measurement invariance to a large ("N" = 9635) accumulated collection of reports on quantitative autistic traits using…

  4. Comparing the MRI-based Goutallier Classification to an experimental quantitative MR spectroscopic fat measurement of the supraspinatus muscle.

    PubMed

    Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk

    2016-08-22

    The Goutallier Classification is a semi-quantitative classification system to determine the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computed tomography scans, it is currently applied to magnetic resonance imaging (MRI) scans. The role for its clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semi-quantitative MRI-based Goutallier Classification applied by 5 different raters to experimental MR spectroscopic quantitative fat measurement, in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and graded according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally, the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semi-quantitative grading according to the Goutallier Classification was statistically correlated with the quantitatively measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation between the Goutallier Classification system and the quantitative fat/water ratio, with R = 0.35 (p < 0.05). By dichotomizing the scale, the correlation was 0.72. The interobserver and intraobserver reliabilities were substantial, with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semi-quantitative MRI-based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care in regard to the amount of fatty degeneration. Spectroscopic MR measurement may increase the accuracy of
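
    For readers who want to reproduce this kind of comparison, the sketch below computes a Spearman rank correlation between ordinal grades and a continuous fat/water ratio using SciPy; the data values are invented.

```python
# Hedged example of a Spearman rank correlation between ordinal Goutallier grades and
# spectroscopic fat/water ratios, as used in the comparison above. Data are invented.
import numpy as np
from scipy.stats import spearmanr

goutallier_grade = np.array([0, 1, 1, 2, 2, 3, 3, 4])                           # rater grades
fat_water_ratio  = np.array([0.04, 0.09, 0.06, 0.18, 0.11, 0.35, 0.22, 0.55])   # MRS ratios

rho, p_value = spearmanr(goutallier_grade, fat_water_ratio)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```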

  5. Quantitative nuclear magnetic resonance to measure body composition in infants and children

    USDA-ARS?s Scientific Manuscript database

    Quantitative Nuclear Magnetic Resonance (QMR) is being used in human adults to obtain measures of total body fat (FM) with high precision. The current study assessed a device specially designed to accommodate infants and children between 3 and 50 kg (EchoMRI-AH™). Body composition of 113 infants and...

  6. Quantitative wound healing measurement and monitoring system based on an innovative 3D imaging system

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Yang, Arthur; Yin, Gongjie; Wen, James

    2011-03-01

    In this paper, we report a novel three-dimensional (3D) wound imaging system (hardware and software) under development at Technest Inc. The system is designed to perform accurate 3D measurement and modeling of a wound and track its healing status over time. Accurate measurement and tracking of wound healing enables physicians to assess, document, improve, and individualize the treatment plan given to each wound patient. In current wound care practice, physicians often visually inspect or roughly measure the wound to evaluate the healing status. This is not an optimal practice, since human vision lacks precision and consistency. In addition, quantifying slow or subtle changes through perception is very difficult. As a result, an instrument that quantifies both skin color and geometric shape variations would be particularly useful in helping clinicians to assess healing status and judge the effect of hyperemia, hematoma, local inflammation, secondary infection, and tissue necrosis. Once fully developed, our 3D imaging system will have several unique advantages over traditional methods for monitoring wound care: (a) non-contact measurement; (b) fast and easy to use; (c) up to 50 micron measurement accuracy; (d) 2D/3D quantitative measurements; (e) a handheld device; and (f) reasonable cost (< $1,000).

  7. Comparative Performance of Reagents and Platforms for Quantitation of Cytomegalovirus DNA by Digital PCR

    PubMed Central

    Gu, Z.; Sam, S. S.; Sun, Y.; Tang, L.; Pounds, S.; Caliendo, A. M.

    2016-01-01

    A potential benefit of digital PCR is a reduction in result variability across assays and platforms. Three sets of PCR reagents were tested on two digital PCR systems (Bio-Rad and RainDance) for quantitation of cytomegalovirus (CMV). Both commercial quantitative viral standards and patient samples (n = 16) were tested. Quantitative accuracy (compared to nominal values) and variability were determined based on viral standard testing results. Quantitative correlation and variability were assessed with pairwise comparisons across all reagent-platform combinations for clinical plasma sample results. The three reagent sets, when used to assay quantitative standards on the Bio-Rad system, all showed a high degree of accuracy, low variability, and close agreement with one another. When used on the RainDance system, one of the three reagent sets appeared to have a much better correlation to nominal values than did the other two. Quantitative results for patient samples showed good correlation in most pairwise comparisons, with some showing poorer correlations when testing samples with low viral loads. Digital PCR is a robust method for measuring CMV viral load. Some degree of result variation may be seen, depending on platform and reagents used; this variation appears to be greater in samples with low viral load values. PMID:27535685

  8. Quantitative thickness measurement of polarity-inverted piezoelectric thin-film layer by scanning nonlinear dielectric microscopy

    NASA Astrophysics Data System (ADS)

    Odagawa, Hiroyuki; Terada, Koshiro; Tanaka, Yohei; Nishikawa, Hiroaki; Yanagitani, Takahiko; Cho, Yasuo

    2017-10-01

    A quantitative measurement method for a polarity-inverted layer in ferroelectric or piezoelectric thin film is proposed. It is performed nondestructively by scanning nonlinear dielectric microscopy (SNDM). In SNDM, linear and nonlinear dielectric constants are measured using a probe that converts the variation of capacitance related to these constants into the variation of electrical oscillation frequency. In this paper, we describe a principle for determining the layer thickness and some calculation results of the output signal, which are related to the radius of the probe tip and the thickness of the inverted layer. Moreover, we derive an equation that represents the relationship between the output signal and the oscillation frequency of the probe and explain how to determine the thickness from the measured frequency. Experimental results in Sc-doped AlN piezoelectric thin films that have a polarity-inverted layer with a thickness of 1.5 µm fabricated by radio frequency magnetron sputtering showed a fairly good value of 1.38 µm for the thickness of the polarity-inverted layer.

  9. An Optimized Method for the Measurement of Acetaldehyde by High-Performance Liquid Chromatography

    PubMed Central

    Guan, Xiangying; Rubin, Emanuel; Anni, Helen

    2011-01-01

    Background Acetaldehyde is produced during ethanol metabolism predominantly in the liver by alcohol dehydrogenase, and rapidly eliminated by oxidation to acetate via aldehyde dehydrogenase. Assessment of circulating acetaldehyde levels in biological matrices is performed by headspace gas chromatography and reverse phase high-performance liquid chromatography (RP-HPLC). Methods We have developed an optimized method for the measurement of acetaldehyde by RP-HPLC in hepatoma cell culture medium, blood and plasma. After sample deproteinization, acetaldehyde was derivatized with 2,4-dinitrophenylhydrazine (DNPH). The reaction was optimized for pH, amount of derivatization reagent, time, and temperature. Extraction methods of the acetaldehyde-hydrazone (AcH-DNP) stable derivative and product stability studies were carried out. Acetaldehyde was identified by its retention time in comparison to the AcH-DNP standard, using a new chromatography gradient program, and quantitated based on external reference standards and standard addition calibration curves in the presence and absence of ethanol. Results Derivatization of acetaldehyde was performed at pH 4.0 with an 80-fold molar excess of DNPH. The reaction was completed in 40 min at ambient temperature, and the product was stable for 2 days. A clear separation of AcH-DNP from DNPH was obtained with a new 11-min chromatography program. Acetaldehyde detection was linear up to 80 μM. The recovery of acetaldehyde was >88% in culture media, and >78% in plasma. We quantitatively determined the ethanol-derived acetaldehyde in hepatoma cells, rat blood and plasma with a detection limit around 3 μM. The accuracy of the method was <9% for intraday and <15% for interday measurements, in small volume (70 μl) plasma sampling. Conclusions An optimized method for the quantitative determination of acetaldehyde in biological systems was developed using derivatization with DNPH, followed by a short RP-HPLC separation of AcH-DNP. The method has
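
    A minimal sketch of the standard-addition quantitation mentioned above: known amounts of acetaldehyde are spiked into sample aliquots, peak area is regressed on the added concentration, and the magnitude of the x-intercept estimates the endogenous concentration. The numbers are illustrative, not the paper's data.

```python
# Hedged sketch of standard-addition quantitation: fit HPLC peak area vs spiked
# acetaldehyde concentration, then take the x-intercept magnitude as the endogenous
# concentration. All values are invented for illustration.
import numpy as np

added_uM  = np.array([0.0, 10.0, 20.0, 40.0])      # spiked acetaldehyde (uM)
peak_area = np.array([5.1, 9.8, 14.6, 24.3])       # HPLC peak area (arbitrary units)

slope, intercept = np.polyfit(added_uM, peak_area, 1)
endogenous_uM = intercept / slope                   # magnitude of the x-intercept
print(round(endogenous_uM, 1))
```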

  10. An optimized method for the measurement of acetaldehyde by high-performance liquid chromatography.

    PubMed

    Guan, Xiangying; Rubin, Emanuel; Anni, Helen

    2012-03-01

    Acetaldehyde is produced during ethanol metabolism predominantly in the liver by alcohol dehydrogenase and rapidly eliminated by oxidation to acetate via aldehyde dehydrogenase. Assessment of circulating acetaldehyde levels in biological matrices is performed by headspace gas chromatography and reverse phase high-performance liquid chromatography (RP-HPLC). We have developed an optimized method for the measurement of acetaldehyde by RP-HPLC in hepatoma cell culture medium, blood, and plasma. After sample deproteinization, acetaldehyde was derivatized with 2,4-dinitrophenylhydrazine (DNPH). The reaction was optimized for pH, amount of derivatization reagent, time, and temperature. Extraction methods of the acetaldehyde-hydrazone (AcH-DNP) stable derivative and product stability studies were carried out. Acetaldehyde was identified by its retention time in comparison with AcH-DNP standard, using a new chromatography gradient program, and quantitated based on external reference standards and standard addition calibration curves in the presence and absence of ethanol. Derivatization of acetaldehyde was performed at pH 4.0 with an 80-fold molar excess of DNPH. The reaction was completed in 40 minutes at ambient temperature, and the product was stable for 2 days. A clear separation of AcH-DNP from DNPH was obtained with a new 11-minute chromatography program. Acetaldehyde detection was linear up to 80 μM. The recovery of acetaldehyde was >88% in culture media and >78% in plasma. We quantitatively determined the ethanol-derived acetaldehyde in hepatoma cells, rat blood and plasma with a detection limit around 3 μM. The accuracy of the method was <9% for intraday and <15% for interday measurements, in small volume (70 μl) plasma sampling. An optimized method for the quantitative determination of acetaldehyde in biological systems was developed using derivatization with DNPH, followed by a short RP-HPLC separation of AcH-DNP. The method has an extended linear range, is

  11. Freight performance measures : approach analysis.

    DOT National Transportation Integrated Search

    2010-05-01

    This report reviews the existing state of the art and also the state of the practice of freight performance measurement. Most performance measures at the state level have aimed at evaluating highway or transit infrastructure performance with an empha...

  12. A tunable ratiometric pH sensor based on carbon nanodots for the quantitative measurement of the intracellular pH of whole cells.

    PubMed

    Shi, Wen; Li, Xiaohua; Ma, Huimin

    2012-06-25

    The whole picture: Carbon nanodots labeled with two fluorescent dyes have been developed as a tunable ratiometric pH sensor to measure intracellular pH. The nanosensor shows good biocompatibility and cellular dispersibility. Quantitative determinations on intact HeLa cells and pH fluctuations associated with oxidative stress were performed. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Factors Influencing Academic Performance in Quantitative Courses among Undergraduate Business Students of a Public Higher Education Institution

    ERIC Educational Resources Information Center

    Yousef, Darwish Abdulrahamn

    2017-01-01

    Purpose: This paper aims to investigate the impacts of teaching style, English language and communication and assessment methods on the academic performance of undergraduate business students in introductory quantitative courses such as Statistics for Business 1 and 2, Quantitative Methods for Business, Operations and Production Management and…

  14. The performance measurement manifesto.

    PubMed

    Eccles, R G

    1991-01-01

    The leading indicators of business performance cannot be found in financial data alone. Quality, customer satisfaction, innovation, market share--metrics like these often reflect a company's economic condition and growth prospects better than its reported earnings do. Depending on an accounting department to reveal a company's future will leave it hopelessly mired in the past. More and more managers are changing their company's performance measurement systems to track nonfinancial measures and reinforce new competitive strategies. Five activities are essential: developing an information architecture; putting the technology in place to support this architecture; aligning bonuses and other incentives with the new system; drawing on outside resources; and designing an internal process to ensure the other four activities occur. New technologies and more sophisticated databases have made the change to nonfinancial performance measurement systems possible and economically feasible. Industry and trade associations, consulting firms, and public accounting firms that already have well-developed methods for assessing market share and other performance metrics can add to the revolution's momentum--as well as profit from the business opportunities it presents. Every company will have its own key measures and distinctive process for implementing the change. But making it happen will always require careful preparation, perseverance, and the conviction of the CEO that it must be carried through. When one leading company can demonstrate the long-term advantage of its superior performance on quality or innovation or any other nonfinancial measure, it will change the rules for all its rivals forever.

  15. A systematic literature search to identify performance measure outcomes used in clinical studies of racehorses.

    PubMed

    Wylie, C E; Newton, J R

    2018-05-01

    Racing performance is often used as a measurable outcome variable in research studies investigating clinical diagnoses or interventions. However, the use of many different performance measures largely precludes conduct of meaningful comparative studies and, to date, those being used have not been collated. To systematically review the veterinary scientific literature for the use of racing performance as a measurable outcome variable in clinical studies of racehorses, collate and identify those most popular, and identify their advantages and disadvantages. Systematic literature search. The search criteria "((racing AND performance) AND (horses OR equidae))" were adapted for both MEDLINE and CAB Abstracts databases. Data were collected in standardised recording forms for binary, categorical and quantitative measures, and the use of performance indices. In total, 217 studies that described racing performance were identified, contributing 117 different performance measures. No one performance measure was used in all studies, despite 90.3% using more than one variable. Data regarding race starts and earnings were used most commonly, with 88.0% and 54.4% of studies including at least one measure of starts and earnings, respectively. Seventeen variables were used 10 times or more, with the top five comprising: 'return to racing', 'number of starts', 'days to first start', 'earnings per period of time' and 'earnings per start'. The search strategies may not have identified all relevant papers, introducing bias to the review. Performance indices have been developed to improve assessment of interventions; however, they are not widely adopted in the scientific literature. Use of the two most commonly identified measures, whether the horse returned to racing and number of starts over a defined period of time, would best facilitate future systematic reviews and meta-analyses in advance of the development of a gold-standard measure of race performance outcome. © 2017 EVJ Ltd.

  16. Diagnostic colonoscopy: performance measurement study.

    PubMed

    Kuznets, Naomi

    2002-07-01

    This is the fifth of a series of best practices studies undertaken by the Performance Measurement Initiative (PMI), the centerpiece of the Institute for Quality Improvement (IQI), a not-for-profit quality improvement subsidiary of the Accreditation Association for Ambulatory Health Care (AAAHC) (Performance Measurement Initiative, 1999a, 1999b, 2000a, 2000b). The IQI was created to offer clinical performance measurement and improvement opportunities to ambulatory health care organizations and others interested in quality patient care. The purpose of the study was to provide opportunities to initiate clinical performance measurement on key processes and outcomes for this procedure and use this information for clinical quality improvement. This article provides performance measurement information on how organizations that have demonstrated and validated differences in clinical practice can have similar outcomes, but at a dramatically lower cost. The intent of the article is to provide organizations with alternatives in practice to provide a better value to their patients.

  17. Quantitative facial asymmetry: using three-dimensional photogrammetry to measure baseline facial surface symmetry.

    PubMed

    Taylor, Helena O; Morrison, Clinton S; Linden, Olivia; Phillips, Benjamin; Chang, Johnny; Byrne, Margaret E; Sullivan, Stephen R; Forrest, Christopher R

    2014-01-01

    Although symmetry is hailed as a fundamental goal of aesthetic and reconstructive surgery, our tools for measuring this outcome have been limited and subjective. With the advent of three-dimensional photogrammetry, surface geometry can be captured, manipulated, and measured quantitatively. Until now, few normative data existed with regard to facial surface symmetry. Here, we present a method for reproducibly calculating overall facial symmetry and present normative data on 100 subjects. We enrolled 100 volunteers who underwent three-dimensional photogrammetry of their faces in repose. We collected demographic data on age, sex, and race and subjectively scored facial symmetry. We calculated the root mean square deviation (RMSD) between the native and reflected faces, reflecting about a plane of maximum symmetry. We analyzed the interobserver reliability of the subjective assessment of facial asymmetry and the quantitative measurements and compared the subjective and objective values. We also classified areas of greatest asymmetry as localized to the upper, middle, or lower facial thirds. This cluster of normative data was compared with a group of patients with subtle but increasing amounts of facial asymmetry. We imaged 100 subjects by three-dimensional photogrammetry. There was a poor interobserver correlation between subjective assessments of asymmetry (r = 0.56). There was a high interobserver reliability for quantitative measurements of facial symmetry RMSD calculations (r = 0.91-0.95). The mean RMSD for this normative population was found to be 0.80 ± 0.24 mm. Areas of greatest asymmetry were distributed as follows: 10% upper facial third, 49% central facial third, and 41% lower facial third. Precise measurement permitted discrimination of subtle facial asymmetry within this normative group and distinguished norms from patients with subtle facial asymmetry, with placement of RMSDs along an asymmetry ruler. Facial surface symmetry, which is poorly assessed
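
    A hedged sketch of an RMSD-style symmetry score in the spirit of the method above: the facial point cloud is mirrored about an assumed symmetry plane and nearest-neighbor distances between the reflected and original surfaces are summarized as a root mean square deviation. Finding the plane of maximum symmetry, as done in the study, would require an additional optimization step that is omitted here; the input file name is hypothetical.

```python
# Hedged sketch of a surface symmetry RMSD: mirror the facial point cloud about an
# assumed midplane (x = 0 after alignment), match reflected points to their nearest
# original points, and report the root mean square of those distances (in mm).
import numpy as np
from scipy.spatial import cKDTree

def symmetry_rmsd(points_mm: np.ndarray) -> float:
    """points_mm: (N, 3) facial surface points, pre-aligned so x = 0 is the midplane."""
    reflected = points_mm * np.array([-1.0, 1.0, 1.0])   # mirror across x = 0
    tree = cKDTree(points_mm)
    dists, _ = tree.query(reflected)                      # nearest-neighbor distances
    return float(np.sqrt(np.mean(dists ** 2)))            # RMSD in mm

# rms = symmetry_rmsd(np.loadtxt("face_scan_points.txt"))  # hypothetical input file
```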

  18. Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage.

    PubMed

    Obuchowski, Nancy A; Bullen, Jennifer

    2017-01-01

    Introduction Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e. precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision. Conclusion Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in
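
    A minimal sketch, under the no-bias assumption discussed above, of a 95% confidence interval for a new patient's quantitative imaging biomarker value given a within-subject standard deviation from a test-retest precision study; this is a textbook-style construction, not the paper's exact procedure, and the example numbers are invented.

```python
# Hedged sketch: a 95% confidence interval for a new patient's true biomarker value,
# assuming no bias and a known within-subject SD from a test-retest precision study.
# The measurement and SD values are invented for illustration.
def qib_confidence_interval(measurement: float, within_subject_sd: float,
                            z: float = 1.96) -> tuple[float, float]:
    half_width = z * within_subject_sd
    return measurement - half_width, measurement + half_width

# e.g. a tumor volume measured at 12.4 mL with a within-subject SD of 0.9 mL
print(qib_confidence_interval(12.4, 0.9))
```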

  19. Steps to achieve quantitative measurements of microRNA using two step droplet digital PCR.

    PubMed

    Stein, Erica V; Duewer, David L; Farkas, Natalia; Romsos, Erica L; Wang, Lili; Cole, Kenneth D

    2017-01-01

    Droplet digital PCR (ddPCR) is being advocated as a reference method to measure rare genomic targets. It has consistently been proven to be more sensitive and direct at discerning copy numbers of DNA than other quantitative methods. However, one of the largest obstacles to measuring microRNA (miRNA) using ddPCR is that reverse transcription efficiency depends upon the target, meaning small RNA nucleotide composition directly affects primer specificity in a manner that prevents traditional quantitation optimization strategies. Additionally, the use of reagents that are optimized for miRNA measurements using quantitative real-time PCR (qRT-PCR) appears to cause either false positive or false negative detection of certain targets when used with traditional ddPCR quantification methods. False readings are often related to using inadequate enzymes, primers and probes. Given that two-step miRNA quantification using ddPCR relies solely on reverse transcription and uses proprietary reagents previously optimized only for qRT-PCR, these barriers are substantial. Therefore, here we outline essential controls, optimization techniques, and an efficacy model to improve the quality of ddPCR miRNA measurements. We have applied two-step principles used for miRNA qRT-PCR measurements and leveraged the use of synthetic miRNA targets to evaluate ddPCR following cDNA synthesis with four different commercial kits. We have identified inefficiencies and limitations as well as proposed ways to circumvent identified obstacles. Lastly, we show that we can apply these criteria to a model system to confidently quantify miRNA copy number. Our measurement technique is a novel way to quantify specific miRNA copy number in a single sample, without using standard curves for individual experiments. Our methodology can be used for validation and control measurements, as well as a diagnostic technique that allows scientists, technicians, clinicians, and regulators to base miRNA measures on a single
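
    The abstract does not give its quantification formulas, but ddPCR copy numbers are conventionally derived from a Poisson model of droplet occupancy; the hedged sketch below shows that generic calculation, with an assumed per-droplet volume.

```python
# Hedged illustration of the Poisson counting model conventionally used to turn droplet
# counts into copy numbers in ddPCR. The per-droplet volume is an instrument-dependent
# assumption, and the counts are invented for illustration.
import math

DROPLET_VOLUME_UL = 0.00085  # ~0.85 nL per droplet (assumed)

def copies_per_microliter(positive_droplets: int, total_droplets: int) -> float:
    fraction_negative = 1.0 - positive_droplets / total_droplets
    lam = -math.log(fraction_negative)     # mean copies per droplet (Poisson estimate)
    return lam / DROPLET_VOLUME_UL         # copies per uL of reaction

print(round(copies_per_microliter(positive_droplets=3200, total_droplets=15000), 1))
```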

  20. Steps to achieve quantitative measurements of microRNA using two step droplet digital PCR

    PubMed Central

    Duewer, David L.; Farkas, Natalia; Romsos, Erica L.; Wang, Lili; Cole, Kenneth D.

    2017-01-01

    Droplet digital PCR (ddPCR) is being advocated as a reference method to measure rare genomic targets. It has consistently been proven to be more sensitive and direct at discerning copy numbers of DNA than other quantitative methods. However, one of the largest obstacles to measuring microRNA (miRNA) using ddPCR is that reverse transcription efficiency depends upon the target, meaning small RNA nucleotide composition directly affects primer specificity in a manner that prevents traditional quantitation optimization strategies. Additionally, the use of reagents that are optimized for miRNA measurements using quantitative real-time PCR (qRT-PCR) appears to cause either false positive or false negative detection of certain targets when used with traditional ddPCR quantification methods. False readings are often related to using inadequate enzymes, primers and probes. Given that two-step miRNA quantification using ddPCR relies solely on reverse transcription and uses proprietary reagents previously optimized only for qRT-PCR, these barriers are substantial. Therefore, here we outline essential controls, optimization techniques, and an efficacy model to improve the quality of ddPCR miRNA measurements. We have applied two-step principles used for miRNA qRT-PCR measurements and leveraged the use of synthetic miRNA targets to evaluate ddPCR following cDNA synthesis with four different commercial kits. We have identified inefficiencies and limitations as well as proposed ways to circumvent identified obstacles. Lastly, we show that we can apply these criteria to a model system to confidently quantify miRNA copy number. Our measurement technique is a novel way to quantify specific miRNA copy number in a single sample, without using standard curves for individual experiments. Our methodology can be used for validation and control measurements, as well as a diagnostic technique that allows scientists, technicians, clinicians, and regulators to base miRNA measures on a single

  1. Susceptibility Testing by Polymerase Chain Reaction DNA Quantitation: A Method to Measure Drug Resistance of Human Immunodeficiency Virus Type 1 Isolates

    NASA Astrophysics Data System (ADS)

    Eron, Joseph J.; Gorczyca, Paul; Kaplan, Joan C.; D'Aquila, Richard T.

    1992-04-01

    Polymerase chain reaction (PCR) DNA quantitation (PDQ) susceptibility testing rapidly and directly measures nucleoside sensitivity of human immunodeficiency virus type 1 (HIV-1) isolates. PCR is used to quantitate the amount of HIV-1 DNA synthesized after in vitro infection of peripheral blood mononuclear cells. The relative amounts of HIV-1 DNA in cell lysates from cultures maintained at different drug concentrations reflect drug inhibition of virus replication. The results of PDQ susceptibility testing of 2- or 3-day cultures are supported by assays measuring HIV-1 p24 antigen production in supernatants of 7- or 10-day cultures. DNA sequence analyses to identify mutations in the reverse transcriptase gene that cause resistance to 3'-azido-3'-deoxythymidine also support the PDQ results. With the PDQ method, both infectivity titration and susceptibility testing can be performed on supernatants from primary cultures of peripheral blood mononuclear cells. PDQ susceptibility testing should facilitate epidemiologic studies of the clinical significance of drug-resistant HIV-1 isolates.
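
    As one simple way to summarize the kind of inhibition data the PDQ assay yields, the sketch below estimates a 50% inhibitory concentration by log-linear interpolation of relative HIV-1 DNA signal across drug concentrations; the concentrations and signals are invented, and this is not the assay's prescribed analysis.

```python
# Hedged sketch: estimate an IC50 by log-linear interpolation of relative HIV-1 DNA
# signal measured at several drug concentrations. Data values are invented.
import numpy as np

conc_um = np.array([0.01, 0.1, 1.0, 10.0])       # drug concentration (uM)
rel_dna = np.array([0.95, 0.70, 0.30, 0.05])     # HIV-1 DNA relative to no-drug control

def ic50(concentrations: np.ndarray, relative_signal: np.ndarray) -> float:
    """Interpolate the concentration giving 50% of the no-drug DNA signal."""
    logc = np.log10(concentrations)
    # np.interp needs ascending x, so reverse the monotonically decreasing signal
    return 10 ** np.interp(0.5, relative_signal[::-1], logc[::-1])

print(round(ic50(conc_um, rel_dna), 3))
```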

  2. Exploring the state of health and safety management system performance measurement in mining organizations

    PubMed Central

    Haas, Emily Joy; Yorio, Patrick

    2016-01-01

    Complex arguments continue to be articulated regarding the theoretical foundation of health and safety management system (HSMS) performance measurement. The culmination of these efforts has begun to enhance a collective understanding. Despite this enhanced theoretical understanding, however, there are still continuing debates and little consensus. The goal of the current research effort was to empirically explore common approaches to HSMS performance measurement in mining organizations. The purpose was to determine whether value and insight could be added to ongoing discussions of the best ways to engage in health and safety performance measurement. Nine site-level health and safety management professionals were provided with 133 practices corresponding to 20 HSMS elements, each fitting into the plan, do, check, act phases common to most HSMS. Participants were asked to supply detailed information as to how they (1) assess the performance of each practice in their organization, or (2) would assess each practice if it were an identified strategic imperative. Qualitative content analysis indicated that the approximately 1200 responses provided could be described and categorized into interventions, organizational performance, and worker performance. A discussion of how these categories relate to existing indicator frameworks is provided. The analysis also revealed divergence in two important measurement issues: (1) quantitative vs qualitative measurement and reporting; and (2) the primary use of objective or subjective metrics. In light of these findings, we ultimately recommend a balanced measurement and reporting approach within the three metric categories and conclude with suggestions for future research. PMID:26823642

  3. 20 CFR 638.302 - Performance measurement.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Performance measurement. 638.302 Section 638... § 638.302 Performance measurement. The Job Corps Director shall establish a national performance measurement system for centers and other program components which shall include annual performance standards...

  4. 20 CFR 638.302 - Performance measurement.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 3 2012-04-01 2012-04-01 false Performance measurement. 638.302 Section 638... § 638.302 Performance measurement. The Job Corps Director shall establish a national performance measurement system for centers and other program components which shall include annual performance standards...

  5. Quantitative dispersion microscopy

    PubMed Central

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Dasari, Ramachandra R.; Feld, Michael

    2010-01-01

    Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live cells. The measured dispersion of living HeLa cells is found to be around 1.088, which agrees well with that measured directly for protein solutions using total internal reflection. This technique, together with the dry mass and morphology measurements provided by quantitative phase microscopy, could prove to be a useful tool for distinguishing different types of biomaterials and studying spatial inhomogeneities of biological samples. PMID:21113234
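
    Assuming the standard quantitative-phase relation phi(lambda) = (2*pi/lambda) * delta_n(lambda) * h, the dispersion (refractive index increment ratio) follows from phases measured at the two wavelengths without knowing the cell thickness h, since h cancels in the ratio. The sketch below illustrates this with invented phase values.

```python
# Hedged sketch of a dispersion (refractive-index-increment) ratio from dual-wavelength
# phase measurements, assuming phi(lambda) = (2*pi/lambda) * delta_n(lambda) * h so that
# the unknown thickness h cancels. Phase values are invented for illustration.
def dispersion_ratio(phi_1: float, lambda_1_nm: float,
                     phi_2: float, lambda_2_nm: float) -> float:
    """delta_n(lambda_1) / delta_n(lambda_2); thickness h cancels in the ratio."""
    return (phi_1 * lambda_1_nm) / (phi_2 * lambda_2_nm)

print(round(dispersion_ratio(phi_1=2.20, lambda_1_nm=310, phi_2=1.61, lambda_2_nm=400), 3))
```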

  6. Quantitative measurement of solvation shells using frequency modulated atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Uchihashi, T.; Higgins, M.; Nakayama, Y.; Sader, J. E.; Jarvis, S. P.

    2005-03-01

    The nanoscale specificity of interaction measurements and additional imaging capability of the atomic force microscope make it an ideal technique for measuring solvation shells in a variety of liquids next to a range of materials. Unfortunately, the widespread use of atomic force microscopy for the measurement of solvation shells has been limited by uncertainties over the dimensions, composition and durability of the tip during the measurements, and problems associated with quantitative force calibration of the most sensitive dynamic measurement techniques. We address both these issues by the combined use of carbon nanotube high aspect ratio probes and quantifying the highly sensitive frequency modulation (FM) detection technique using a recently developed analytical method. Due to the excellent reproducibility of the measurement technique, additional information regarding solvation shell size as a function of proximity to the surface has been obtained for two very different liquids. Further, it has been possible to identify differences between chemical and geometrical effects in the chosen systems.

  7. Feasibility of Quantitative Ultrasound Measurement of the Heel Bone in People with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Mergler, S.; Lobker, B.; Evenhuis, H. M.; Penning, C.

    2010-01-01

    Low bone mineral density (BMD) and fractures are common in people with intellectual disabilities (ID). Reduced mobility in case of motor impairment and the use of anti-epileptic drugs contribute to the development of low BMD. Quantitative ultrasound (QUS) measurement of the heel bone is a non-invasive and radiation-free method for measuring bone…

  8. Quantitative measurement and analysis for detection and treatment planning of developmental dysplasia of the hip

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Lu, Hongbing; Chen, Hanyong; Zhao, Li; Shi, Zhengxing; Liang, Zhengrong

    2009-02-01

    Developmental dysplasia of the hip is a congenital hip joint malformation affecting the proximal femur and acetabulum, which may be subluxatable, dislocatable, or dislocated. Conventionally, physicians have made diagnoses and planned treatments based only on findings from two-dimensional (2D) images and manually calculated clinical parameters. However, the anatomical complexity of the disease and the limitations of current standard procedures make accurate diagnosis quite difficult. In this study, we developed a system that provides quantitative measurement of 3D clinical indexes based on computed tomography (CT) images. To extract bone structure from surrounding tissues more accurately, the system first segments the bone using a knowledge-based fuzzy clustering method, which is formulated by modifying the objective function of the standard fuzzy c-means algorithm with an additive adaptation penalty. The second part of the system automatically calculates the clinical indexes, which are extended from 2D to 3D for accurate description of the spatial relationship between the femurs and the acetabulum. To evaluate the system performance, an experimental study based on 22 patients with unilaterally or bilaterally affected hips was performed. The results for the 3D acetabulum index (AI) automatically provided by the system were validated by comparison with 2D results measured manually by surgeons. The correlation between the two results was found to be 0.622 (p < 0.01).
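
    The system's segmentation step modifies the fuzzy c-means objective with a knowledge-based adaptive penalty; that penalty is not specified here, so the sketch below shows only a plain fuzzy c-means intensity clustering loop as a stand-in, with the CT volume variable being hypothetical.

```python
# Hedged sketch of plain fuzzy c-means clustering of voxel intensities, standing in for
# the penalty-modified, knowledge-based FCM used by the system above (the adaptive
# penalty term is omitted).
import numpy as np

def fuzzy_cmeans_1d(intensities: np.ndarray, n_clusters: int = 3, m: float = 2.0,
                    n_iter: int = 50, eps: float = 1e-9):
    """Cluster intensities; returns (cluster centers, memberships of shape (C, N))."""
    x = intensities.astype(float).ravel()
    rng = np.random.default_rng(0)
    u = rng.dirichlet(np.ones(n_clusters), size=x.size).T     # random initial memberships
    for _ in range(n_iter):
        um = u ** m
        centers = um @ x / um.sum(axis=1)                      # fuzzy-weighted centers
        dist = np.abs(x[None, :] - centers[:, None]) + eps     # voxel-to-center distances
        inv = dist ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=0)                              # standard FCM membership update
    return centers, u

# centers, memberships = fuzzy_cmeans_1d(ct_volume.ravel())  # ct_volume: hypothetical CT array
```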

  9. Quantitative skills as a graduate learning outcome of university science degree programmes: student performance explored through the planned-enacted-experienced curriculum model

    NASA Astrophysics Data System (ADS)

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2016-07-01

    Application of mathematical and statistical thinking and reasoning, typically referred to as quantitative skills, is essential for university bioscience students. First, this study developed an assessment task intended to gauge graduating students' quantitative skills. The Quantitative Skills Assessment of Science Students (QSASS) was the result, which examined 10 mathematical and statistical sub-topics. Second, the study established an evidential baseline of students' quantitative skills performance and confidence levels by piloting the QSASS with 187 final-year biosciences students at a research-intensive university. The study is framed within the planned-enacted-experienced curriculum model and contributes to science reform efforts focused on enhancing the quantitative skills of university graduates, particularly in the biosciences. The results found, on average, weak performance and low confidence on the QSASS, suggesting divergence between academics' intentions and students' experiences of learning quantitative skills. Implications for curriculum design and future studies are discussed.

  10. Accurate quantitation standards of glutathione via traceable sulfur measurement by inductively coupled plasma optical emission spectrometry and ion chromatography

    PubMed Central

    Rastogi, L.; Dash, K.; Arunachalam, J.

    2013-01-01

    The quantitative analysis of glutathione (GSH) is important in different fields such as medicine, biology, and biotechnology. Accurate quantitative measurements of this analyte have been hampered by the lack of well characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometric sulfur content in purified GSH offers an approach to its quantitation, and calibration against an appropriately characterized certified reference material (CRM) for sulfur provides a methodology for certifying GSH quantity that is traceable to the SI (International System of Units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. For ion chromatography (IC) measurements, the sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide. The measurement of sulfur by ICP-OES and IC (as sulfate) using the "high performance" methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U), expressed at the 95% confidence interval, varied from 0.1% to 0.3% for the ICP-OES analyses and between 0.2% and 1.2% for IC. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH than for routine measurements. PMID:29403814
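
    Because glutathione (C10H17N3O6S) contains exactly one sulfur atom per molecule, a traceable sulfur result converts directly into a GSH amount. The sketch below shows this stoichiometric conversion; the sulfur concentration used is illustrative, not a value from the paper.

```python
# Stoichiometric conversion of a measured sulfur concentration to glutathione
# (GSH, C10H17N3O6S), which contains exactly one sulfur atom per molecule.
M_S = 32.06      # g/mol, sulfur
M_GSH = 307.32   # g/mol, glutathione

def gsh_from_sulfur(sulfur_mg_per_L):
    """Return the GSH concentration (mg/L) implied by a traceable sulfur result."""
    mol_per_L = sulfur_mg_per_L / 1000.0 / M_S   # mol/L of S = mol/L of GSH
    return mol_per_L * M_GSH * 1000.0            # back to mg/L of GSH

print(f"{gsh_from_sulfur(10.43):.1f} mg/L GSH")  # illustrative input, ~100 mg/L
```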

  11. Performance measurement for information systems: Industry perspectives

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Yoes, Cissy; Hamilton, Kay

    1992-01-01

    Performance measurement has become a focal topic for information systems (IS) organizations. Historically, IS performance measures have dealt with the efficiency of the data processing function. Today, the function of most IS organizations goes beyond simple data processing. To understand how IS organizations have developed meaningful performance measures that reflect their objectives and activities, industry perspectives on IS performance measurement were studied. The objectives of the study were to understand the state of the practice in IS performance measurement; to gather approaches and examples of actual performance measures used in industry; and to report patterns, trends, and lessons learned about performance measurement to NASA/JSC. Examples of how some of the most forward-looking companies are shaping their IS processes through measurement are provided. Thoughts on the presence of a life cycle in the development of performance measures and a suggested taxonomy for performance measurements are included in the appendices.

  12. Quantitation of polymethoxylated flavones in orange juice by high-performance liquid chromatography.

    PubMed

    Rouseff, R L; Ting, S V

    1979-08-01

    A quantitative high-performance liquid chromatographic (HPLC) procedure for the determination of the five major polymethoxylated flavones (PMFs) in orange juice has been developed. It employs a unique ternary solvent system with coupled UV-fluorescence detection. The dual detectors were used to determine the presence of interfering substances and served as a cross-check on quantitation. Stop-flow UV and fluorescence scanning was used to identify peaks and determine the presence of impurities. Although all five citrus PMFs fluoresce, some HPLC fluorescence peaks were too small to be of much practical use. All five citrus PMFs could be quantitated satisfactorily with the fixed-wavelength UV (313 nm) detector. The HPLC procedure has been used to evaluate each step in the sample preparation. The optimum extracting solvent was selected, and one time-consuming step was eliminated, as it was found to be unnecessary. HPLC values for nobiletin and sinensetin are in good agreement with the thin-layer chromatographic (TLC) values in the literature. HPLC values for the other three flavones were considerably lower than those reported in the literature. The HPLC procedure is considerably faster than the TLC procedure, with equal or superior precision and accuracy.

  13. Accurate Construction of Photoactivated Localization Microscopy (PALM) Images for Quantitative Measurements

    PubMed Central

    Coltharp, Carla; Kessler, Rene P.; Xiao, Jie

    2012-01-01

    Localization-based superresolution microscopy techniques such as Photoactivated Localization Microscopy (PALM) and Stochastic Optical Reconstruction Microscopy (STORM) have allowed investigations of cellular structures with unprecedented optical resolutions. One major obstacle to interpreting superresolution images, however, is the overcounting of molecule numbers caused by fluorophore photoblinking. Using both experimental and simulated images, we determined the effects of photoblinking on the accurate reconstruction of superresolution images and on quantitative measurements of structural dimension and molecule density made from those images. We found that structural dimension and relative density measurements can be made reliably from images that contain photoblinking-related overcounting, but accurate absolute density measurements, and consequently faithful representations of molecule counts and positions in cellular structures, require the application of a clustering algorithm to group localizations that originate from the same molecule. We analyzed how applying a simple algorithm with different clustering thresholds (tThresh and dThresh) affects the accuracy of reconstructed images, and developed an easy method to select optimal thresholds. We also identified an empirical criterion to evaluate whether an imaging condition is appropriate for accurate superresolution image reconstruction with the clustering algorithm. Both the threshold selection method and imaging condition criterion are easy to implement within existing PALM clustering algorithms and experimental conditions. The main advantage of our method is that it generates a superresolution image and molecule position list that faithfully represents molecule counts and positions within a cellular structure, rather than only summarizing structural properties into ensemble parameters. This feature makes it particularly useful for cellular structures of heterogeneous densities and irregular geometries, and
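
    A minimal sketch of the threshold-based grouping idea described above (a simplified stand-in for illustration, not the authors' algorithm; the dThresh/tThresh values are arbitrary): localizations are merged into one putative molecule when they fall within a spatial threshold of an existing cluster's centroid and within a temporal threshold of its last appearance.

```python
import numpy as np

def group_localizations(locs, d_thresh=50.0, t_thresh=3):
    """
    Greedy grouping of PALM localizations into putative single molecules.
    locs: array of (x_nm, y_nm, frame), sorted by frame.
    A localization joins an open cluster if it lies within d_thresh (nm) of the
    cluster centroid and within t_thresh frames of the cluster's last appearance.
    Returns a list of clusters (lists of localization indices).
    """
    clusters = []  # each cluster: centroid, last frame seen, member indices
    for i, (x, y, f) in enumerate(locs):
        placed = False
        for c in clusters:
            cx, cy = c["centroid"]
            if f - c["last_frame"] <= t_thresh and np.hypot(x - cx, y - cy) <= d_thresh:
                c["indices"].append(i)
                n = len(c["indices"])
                c["centroid"] = ((cx * (n - 1) + x) / n, (cy * (n - 1) + y) / n)
                c["last_frame"] = f
                placed = True
                break
        if not placed:
            clusters.append({"centroid": (x, y), "last_frame": f, "indices": [i]})
    return [c["indices"] for c in clusters]

# Toy example: two blinking events of the same molecule plus one distant molecule.
locs = np.array([[100.0, 100.0, 1], [105.0, 98.0, 2], [500.0, 500.0, 3],
                 [102.0, 101.0, 4]])
print(group_localizations(locs))  # -> [[0, 1, 3], [2]]
```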

  14. Influence of region-of-interest designs on quantitative measurement of multimodal imaging of MR non-enhancing gliomas.

    PubMed

    Takano, Koji; Kinoshita, Manabu; Arita, Hideyuki; Okita, Yoshiko; Chiba, Yasuyoshi; Kagawa, Naoki; Watanabe, Yoshiyuki; Shimosegawa, Eku; Hatazawa, Jun; Hashimoto, Naoya; Fujimoto, Yasunori; Kishima, Haruhiko

    2018-05-01

    A number of studies have revealed the usefulness of multimodal imaging in gliomas. Although results are heavily affected by the method used to design the region of interest (ROI), the most discriminating method for setting the ROI remains unclear. The aim of the present study was to determine the most suitable ROI design for 18F-fluorodeoxyglucose (FDG) and 11C-methionine (MET) positron emission tomography (PET), the apparent diffusion coefficient (ADC), and fractional anisotropy (FA) obtained by diffusion tensor imaging (DTI), from the viewpoint of grading non-enhancing gliomas. A total of 31 consecutive patients with newly diagnosed, histologically confirmed magnetic resonance (MR) non-enhancing gliomas who underwent FDG-PET, MET-PET and DTI were retrospectively investigated. Quantitative measurements were performed using four different ROIs: hotspot/tumor center and whole tumor, each constructed in either two dimensions (2D) or three dimensions (3D). Histopathological grading of the tumor was considered the empirical truth, and the quantitative measurements obtained from each ROI were correlated with tumor grade. The most discriminating ROI for non-enhancing glioma grading differed among imaging modalities: the 2D hotspot/center ROI was most discriminating for FDG-PET (P=0.087), the ADC map (P=0.0083), and the FA map (P=0.25), whereas the 3D whole-tumor ROI was best for MET-PET (P=0.0050). In the majority of scenarios, 2D ROIs performed better than 3D ROIs. Results from image analysis using FDG-PET, MET-PET, ADC and FA may be affected by ROI design, and the most discriminating ROI for non-enhancing glioma grading differs according to the imaging modality.

  15. Performance measures for a dialysis setting.

    PubMed

    Gu, Xiuzhu; Itoh, Kenji

    2018-03-01

    This study from Japan extracted performance measures for dialysis unit management and investigated their characteristics from professional views. Two surveys were conducted using self-administered questionnaires, in which dialysis managers/staff were asked to rate the usefulness of 44 performance indicators. A total of 255 managers and 2,097 staff responded. Eight performance measures were elicited from dialysis manager and staff responses: safety, operational efficiency, quality of working life, financial effectiveness, employee development, mortality, patient/employee satisfaction and patient-centred health care. These performance measures were largely compatible with those extracted in overall healthcare settings in a previous study. Internal reliability, content and construct validity of the performance measures for the dialysis setting were ensured to some extent. As a general trend, both dialysis managers and staff perceived the performance measures as highly useful, especially for safety, mortality, operational efficiency and patient/employee satisfaction, but showed relatively low concern for patient-centred health care and employee development. However, dialysis managers' usefulness perceptions were significantly higher than those of staff. The study yielded important guidelines for designing a holistic hospital/clinic management system: performance measures must be balanced between outcomes and performance shaping factors (PSF); a common set of performance measures could be applied to all healthcare settings, although the indicators for each measure should be composed according to the application field and setting; and sound causal relationships between PSF and outcome measures/indicators should be explored for further improvement. © 2017 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  16. Transit performance measures in California.

    DOT National Transportation Integrated Search

    2016-04-01

    This research is the result of a California Department of Transportation (Caltrans) request to assess the most commonly : available transit performance measures in California. Caltrans wanted to understand performance measures and data used by : Metr...

  17. Comprehensive Comparison of Self-Administered Questionnaires for Measuring Quantitative Autistic Traits in Adults

    ERIC Educational Resources Information Center

    Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.

    2014-01-01

    We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…

  18. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to undergraduate and graduate students in U.S. colleges' and universities' systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) during their academic training, and that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. universities' systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  19. Quantitative Measurement of Vocal Fold Vibration in Male Radio Performers and Healthy Controls Using High-Speed Videoendoscopy

    PubMed Central

    Warhurst, Samantha; McCabe, Patricia; Heard, Rob; Yiu, Edwin; Wang, Gaowu; Madill, Catherine

    2014-01-01

    Purpose Acoustic and perceptual studies show a number of differences between the voices of radio performers and controls. Despite this, the vocal fold kinematics underlying these differences are largely unknown. Using high-speed videoendoscopy, this study sought to determine whether the vocal fold vibration features of radio performers differed from those of non-performing controls. Method Using high-speed videoendoscopy, recordings of a mid-phonatory /i/ in 16 male radio performers (aged 25–52 years) and 16 age-matched controls (aged 25–52 years) were collected. Videos were extracted and analysed semi-automatically using the High-Speed Video Program, obtaining measures of fundamental frequency (f0), open quotient and speed quotient. Post-hoc analyses of sound pressure level (SPL) were also performed (n = 19). Pearson's correlations were calculated between SPL and both speed and open quotients. Results Male radio performers had a significantly higher speed quotient than their matched controls (t = 3.308, p = 0.005). No significant differences were found for f0 or open quotient. No significant correlation was found between either open or speed quotient and SPL. Discussion A higher speed quotient in male radio performers suggests that their vocal fold vibration was characterised by a higher ratio of glottal opening to closing times than in controls. This result may explain findings of better voice quality, higher equivalent sound level and greater spectral tilt seen in previous research. Open quotient was not significantly different between groups, indicating that the durations of complete vocal fold closure were not different between the radio performers and controls. Further validation of these results is required to determine the aetiology of the higher speed quotient result and its implications for voice training and clinical management in performers. PMID:24971625
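
    For reference, the two quotients reported above are simple ratios of glottal cycle phase durations extracted from the high-speed video. The sketch below uses hypothetical timings, not study data.

```python
# Glottal cycle quotients as commonly defined from high-speed video glottal
# area waveforms (illustrative timings only).
def glottal_quotients(t_opening_ms, t_closing_ms, t_closed_ms):
    period = t_opening_ms + t_closing_ms + t_closed_ms
    open_quotient = (t_opening_ms + t_closing_ms) / period   # open phase / full cycle
    speed_quotient = t_opening_ms / t_closing_ms             # opening time / closing time
    return open_quotient, speed_quotient

oq, sq = glottal_quotients(t_opening_ms=3.2, t_closing_ms=2.1, t_closed_ms=2.7)
print(f"open quotient = {oq:.2f}, speed quotient = {sq:.2f}")
```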

  20. Quantitative computed tomography assessment of transfusional iron overload.

    PubMed

    Wood, John C; Mo, Ashley; Gera, Aakansha; Koh, Montre; Coates, Thomas; Gilsanz, Vicente

    2011-06-01

    Quantitative computed tomography (QCT) has been proposed for iron quantification for more than 30 years; however, there has been little clinical validation. We compared liver attenuation by QCT with magnetic resonance imaging (MRI)-derived estimates of liver iron concentration (LIC) in 37 patients with transfusional siderosis. MRI and QCT measurements were performed as clinically indicated monitoring of LIC and vertebral bone density, respectively, over a 6-year period. The mean time difference between QCT and MRI studies was 14 days, with 25 studies performed on the same day. For liver attenuation outside the normal range, attenuation values rose linearly with LIC (r² = 0.94). However, intersubject variability in intrinsic liver attenuation prevented quantitation of LIC <8 mg/g dry weight of liver and was the dominant source of measurement uncertainty. Calculated QCT and MRI accuracies were equivalent for LIC values approaching 22 mg/g dry weight, with QCT having superior performance at higher LICs. Although not suitable for monitoring patients with good iron control, QCT may nonetheless represent a viable technique for liver iron quantitation in patients with moderate to severe iron overload in regions where MRI resources are limited, because of its low cost, availability, and high throughput. © 2011 Blackwell Publishing Ltd.

  1. Enhancement of a virtual reality wheelchair simulator to include qualitative and quantitative performance metrics.

    PubMed

    Harrison, C S; Grant, P M; Conway, B A

    2010-01-01

    The increasing importance of inclusive design and in particular accessibility guidelines established in the U.K. 1996 Disability Discrimination Act (DDA) has been a prime motivation for the work on wheelchair access, a subset of the DDA guidelines, described in this article. The development of these guidelines mirrors the long-standing provisions developed in the U.S. In order to raise awareness of these guidelines and in particular to give architects, building designers, and users a physical sensation of how a planned development could be experienced, a wheelchair virtual reality system was developed. This compares with conventional methods of measuring against drawings and comparing dimensions against building regulations, established in the U.K. under British standards. Features of this approach include the marriage of an electromechanical force-feedback system with high-quality immersive graphics as well as the potential ability to generate a physiological rating of buildings that do not yet exist. The provision of this sense of "feel" augments immersion within the virtual reality environment and also provides the basis from which both qualitative and quantitative measures of a building's access performance can be gained.

  2. Simple and cost-effective liquid chromatography-mass spectrometry method to measure dabrafenib quantitatively and six metabolites semi-quantitatively in human plasma.

    PubMed

    Vikingsson, Svante; Dahlberg, Jan-Olof; Hansson, Johan; Höiom, Veronica; Gréen, Henrik

    2017-06-01

    Dabrafenib is an inhibitor of BRAF V600E used for treating metastatic melanoma, but a majority of patients experience adverse effects. Methods to measure the levels of dabrafenib and its major metabolites during treatment are needed to allow development of individualized dosing strategies to reduce the burden of such adverse events. In this study, an LC-MS/MS method capable of measuring dabrafenib quantitatively and six metabolites semi-quantitatively is presented. The method is fully validated with regard to dabrafenib in human plasma in the range 5-5000 ng/mL. The analytes were separated on a C18 column after protein precipitation and detected in positive electrospray ionization mode using a Xevo TQ triple quadrupole mass spectrometer. As no commercial reference standards are available, the calibration curve of dabrafenib was used for semi-quantification of the dabrafenib metabolites. Compared to earlier methods, the presented method represents a simpler and more cost-effective approach suitable for clinical studies. Graphical abstract: Combined multiple reaction monitoring transitions of dabrafenib and metabolites in a typical case sample.

  3. EVALUATION OF QUANTITATIVE REAL TIME PCR FOR THE MEASUREMENT OF HELICOBATER PYLORI AT LOW CONCENTRATIONS IN DRINKING WATER

    EPA Science Inventory

    Aims: To determine the performance of a rapid, real time polymerase chain reaction (PCR) method for the detection and quantitative analysis Helicobacter pylori at low concentrations in drinking water.

    Methods and Results: A rapid DNA extraction and quantitative PCR (QPCR)...

  4. Nuclear Enterprise Performance Measurement

    DTIC Science & Technology

    2011-03-01

    This paper discusses performance measurement for the United States Air Force nuclear enterprise and its sustainment systems. Keywords: performance measurement, process measurement, strategy, multicriteria decision-making, aggregation.

  5. Bone measurements of infants with hyperbilirubinemia by quantitative ultrasound: the influence of phototherapy.

    PubMed

    Arıkan, Fatma İnci; Kara, Semra; Bilgin, Hüseyin; Özkan, Fatma; Bilge, Yıldız Dallar

    2017-07-01

    The purpose of the current study was to investigate the possible effects of phototherapy on the bone status of term infants, evaluated by measurement of tibial bone speed of sound (SOS). The phototherapy group (n = 30) consisted of infants who had undergone phototherapy for at least 24 h, and the control group (n = 30) comprised infants who had not received phototherapy. Blood samples were obtained from all infants for serum calcium, phosphorus, magnesium, alkaline phosphatase, parathyroid hormone and vitamin D concentrations. Left tibial quantitative ultrasound (QUS) measurements were performed using a commercial device. There was no statistically significant difference between phototherapy-exposed and non-exposed infants in terms of Ca, P, ALP, PTH and vitamin D levels. Comparison of bone SOS between the phototherapy-exposed and control groups revealed no statistically significant difference. Also, no significant difference in the SOS Z-score was observed between those with or without exposure. The data from our study indicate that phototherapy treatment has no impact on bone status in hyperbilirubinemic infants. Although there is no statistically significant evidence of an excess risk of bone damage following phototherapy, studies with larger sample sizes and longer duration of follow-up are needed to gain a better understanding of its effects.

  6. Performance measurement in healthcare: part II--state of the science findings by stage of the performance measurement process.

    PubMed

    Adair, Carol E; Simpson, Elizabeth; Casebeer, Ann L; Birdsell, Judith M; Hayden, Katharine A; Lewis, Steven

    2006-07-01

    This paper summarizes findings of a comprehensive, systematic review of the peer-reviewed and grey literature on performance measurement according to each stage of the performance measurement process--conceptualization, selection and development, data collection, and reporting and use. It also outlines implications for practice. Six hundred sixty-four articles about organizational performance measurement from the health and business literature were reviewed after systematic searches of the literature, multi-rater relevancy ratings, citation checks and expert author nominations. Key themes were extracted and summarized from the most highly rated papers for each performance measurement stage. Despite a virtually universal consensus on the potential benefits of performance measurement, little evidence currently exists to guide practice in healthcare. Issues in conceptualizing systems include strategic alignment and scope. There are debates on the criteria for selecting measures and on the types and quality of measures. Implementation of data collection and analysis systems is complex and costly, and challenges persist in reporting results, preventing unintended effects and putting findings for improvement into action. There is a need for further development and refinement of performance measures and measurement systems, with a particular focus on strategies to ensure that performance measurement leads to healthcare improvement.

  7. 26 CFR 801.2 - Measuring organizational performance.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    26 CFR 801.2 — Measuring organizational performance. Part of the Internal Revenue Service's balanced system for measuring organizational and employee performance, this section addresses the performance measures that comprise the...

  8. Epithelium percentage estimation facilitates epithelial quantitative protein measurement in tissue specimens.

    PubMed

    Chen, Jing; Toghi Eshghi, Shadi; Bova, George Steven; Li, Qing Kay; Li, Xingde; Zhang, Hui

    2013-12-01

    The rapid advancement of high-throughput tools for quantitative measurement of proteins has demonstrated the potential for the identification of proteins associated with cancer. However, quantitative results on cancer tissue specimens are usually confounded by tissue heterogeneity; for example, regions with cancer usually have significantly higher epithelium content yet lower stromal content. It is therefore necessary to develop a tool to facilitate the interpretation of protein measurements in tissue specimens. Epithelial cell adhesion molecule (EpCAM) and cathepsin L (CTSL) are two epithelial proteins whose expression in normal and tumorous prostate tissues was confirmed by measuring staining intensity with immunohistochemical (IHC) staining. The expression of these proteins was measured by ELISA in protein extracts from OCT-embedded frozen prostate tissues. To eliminate the influence of tissue heterogeneity on epithelial protein quantification measured by ELISA, a color-based segmentation method was developed in-house for estimating epithelium content using H&E histology slides from the same prostate tissues, and the estimated epithelium percentage was used to normalize the ELISA results. The epithelium content of the same slides was also estimated by a pathologist and used to normalize the ELISA results, and the computer-based results were compared with the pathologist's reading. We found that both EpCAM and CTSL levels, as measured by the ELISA assays themselves, were greatly affected by the epithelium content of the tissue specimens. Without adjusting for epithelium percentage, both EpCAM and CTSL levels appeared significantly higher in tumor tissues than in normal tissues, with a p value less than 0.001. However, after normalization by the epithelium percentage, the ELISA measurements of both EpCAM and CTSL were in agreement with the IHC staining results, showing a significant increase only in EpCAM, with no difference in CTSL expression, in cancer tissues. These results
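
    A minimal sketch of the normalization step, assuming the epithelium percentage estimated from the H&E slide is used to scale the bulk ELISA concentration; the numbers are illustrative only.

```python
# Normalize a bulk ELISA concentration by the estimated epithelium fraction so
# that specimens with different epithelium content become comparable.
def normalize_by_epithelium(elisa_conc, epithelium_percent):
    return elisa_conc / (epithelium_percent / 100.0)

# Illustrative values (not study data), e.g. ng of protein per mg of total extract.
tumor = normalize_by_epithelium(elisa_conc=80.0, epithelium_percent=70.0)
normal = normalize_by_epithelium(elisa_conc=30.0, epithelium_percent=25.0)
print(f"tumor: {tumor:.1f} per unit epithelium, normal: {normal:.1f} per unit epithelium")
```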

  9. High-performance liquid chromatographic separation of human haemoglobins. Simultaneous quantitation of foetal and glycated haemoglobins.

    PubMed

    Bisse, E; Wieland, H

    1988-12-29

    A high-performance liquid chromatographic system, which uses a weak cation exchanger (PolyCATA) together with Bis-Tris buffer (pH 6.47-7.0) and sodium acetate gradients, is described. Samples from adults and newborns were analysed, and the separation of many minor and major, normal and abnormal haemoglobin (Hb) variants was greatly improved. The method allows the separation of minor foetal haemoglobin (HbF) variants and the simultaneous quantitation of HbF and glycated HbA. HbF values correlated well with those obtained by the alkali denaturation method (r = 0.997). The glycated haemoglobin (HbAIc) levels measured in patients with high HbF concentrations correlated with the total glycated haemoglobin determined by bioaffinity chromatography (r = 0.973). The procedure is useful for diagnostic applications and affords an effective and sensitive way of examining blood samples for haemoglobin abnormalities.

  10. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    PubMed

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    Solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant; then, by irradiating different portions of the microcrystal distribution, an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by measuring the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219), present in the collagen-α-5(IV) chain precursor and differentially expressed in urine from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis, including in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.

  11. Serial semi-quantitative measurement of fecal calprotectin in patients with ulcerative colitis in remission.

    PubMed

    Garcia-Planella, Esther; Mañosa, Míriam; Chaparro, María; Beltrán, Belén; Barreiro-de-Acosta, Manuel; Gordillo, Jordi; Ricart, Elena; Bermejo, Fernando; García-Sánchez, Valle; Piqueras, Marta; Llaó, Jordina; Gisbert, Javier P; Cabré, Eduard; Domènech, Eugeni

    2018-02-01

    Fecal calprotectin (FC) correlates with clinical and endoscopic activity in ulcerative colitis (UC), and it is a good predictor of relapse. However, its use in clinical practice is constrained by the need for the patient to deliver stool samples, and for their handling and processing in the laboratory. The availability of handheld devices might spread the use of FC in clinical practice. The aim was to evaluate the usefulness of a rapid semi-quantitative test of FC in predicting relapse in patients with UC in remission. This was a prospective, multicenter study that included UC patients in clinical remission for ≥6 months on maintenance treatment with mesalamine. Patients were evaluated clinically, and semi-quantitative FC was measured using a monoclonal immunochromatography rapid test at baseline and every three months until relapse or 12 months of follow-up. One hundred and ninety-one patients had at least one determination of FC. At the end of follow-up, 33 patients (17%) experienced clinical relapse. Endoscopic activity at baseline (p = .043) and having had at least one FC > 60 μg/g during the study period (p = .03) were associated with a higher risk of relapse during follow-up. We obtained a total of 636 semi-quantitative FC determinations matched with a three-month follow-up clinical assessment. Having undetectable FC was inversely associated with early relapse (within three months), with a negative predictive value of 98.6% and a sensitivity of 93.9%. Serial, rapid semi-quantitative measurement of FC may be a useful, easy and cheap monitoring tool for patients with UC in remission.
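
    The reported negative predictive value and sensitivity are standard 2x2-table quantities. The counts in the sketch below are hypothetical and chosen only for illustration; they are not the study's data.

```python
# Negative predictive value and sensitivity from a 2x2 table of test result
# (detectable vs. undetectable FC) against outcome (relapse within three months).
def npv_and_sensitivity(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # detectable FC among visits followed by relapse
    npv = tn / (tn + fn)           # no relapse among visits with undetectable FC
    return npv, sensitivity

# Hypothetical counts for illustration only.
npv, sens = npv_and_sensitivity(tp=31, fp=250, fn=2, tn=140)
print(f"NPV = {npv:.1%}, sensitivity = {sens:.1%}")
```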

  12. [Imaging and quantitative measurement of brain extracellular space using MRI Gd-DTPA tracer method].

    PubMed

    He, Qing-yuan; Han, Hong-bin; Xu, Fang-jing-wei; Yan, Jun-hao; Zeng, Jin-jin; Li, Xiao-gang; Fu, Yu; Peng, Yun; Chen, He; Hou, Chao; Xu, Xiao-juan

    2010-04-18

    To observe the diffusion of Gd-DTPA in the brain extracellular space (ECS) by magnetic resonance imaging (MRI) and to investigate the feasibility of measuring the ECS in vivo with an MRI tracer method, 2 microL of Gd-DTPA was introduced into the ECS at the caudate nucleus, according to a stereotaxic atlas, in 8 Sprague Dawley (SD) rats (male, 280-320 g). MRI scans were performed at 1 h, 3 h, 6 h, 9 h and 12 h after administration. The MRI appearance of Gd-DTPA diffusion and distribution was observed and compared, and the MRI signal enhancement was measured at each time point. Neuroethological assessment was performed after the 12 h MRI scan. The injection was accurately placed at the center of the caudate nucleus in 6 rats and at the capsula externa in the other 2 rats. Gd-DTPA diffused isotropically after it was introduced into the caudate nucleus and spread into the lateral cortex at 3 h. The MRI signal enhancement was distributed mainly in the middle cerebral artery territory. A significant difference was found between the signal enhancement ratio at 1 h and that at 3 h at the original injection point in the caudate nucleus (t = 95.63, P < 0.01), and the signal enhancement attenuated following the power function y = 1.7886x^(-0.1776) (R² = 0.94). In the 2 rats with the injection point at the capsula externa, Gd-DTPA diffused anisotropically along the white matter fiber tracts during 1 h to 3 h and spread into the lateral cortex at 6 h. The diffusion and clearance of Gd-DTPA in the brain ECS can thus be monitored and measured quantitatively in vivo by the MRI tracer method.
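
    The reported attenuation follows a power function, which can be recovered with a straight-line fit in log-log space. The sketch below uses synthetic data; the time points and noise level are assumptions, not the study's measurements.

```python
import numpy as np

# Fit a power function y = a * x**b (the reported form: y = 1.7886 * x**-0.1776)
# to signal-enhancement ratios sampled at the scan time points (synthetic data).
t = np.array([1.0, 3.0, 6.0, 9.0, 12.0])                  # hours
a_true, b_true = 1.7886, -0.1776
y = a_true * t**b_true * (1 + 0.01 * np.random.randn(5))  # synthetic "measurements"

# log y = b * log t + log a, so a linear fit in log-log space recovers a and b.
b_fit, log_a_fit = np.polyfit(np.log(t), np.log(y), 1)
a_fit = np.exp(log_a_fit)
print(f"fit: y = {a_fit:.4f} * x^({b_fit:.4f})")
```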

  13. Effect of ethnicity on performance in a final objective structured clinical examination: qualitative and quantitative study

    PubMed Central

    Wass, Val; Roberts, Celia; Hoogenboom, Ron; Jones, Roger; Van der Vleuten, Cees

    2003-01-01

    Objective To assess the effect of ethnicity on student performance in stations assessing communication skills within an objective structured clinical examination. Design Quantitative and qualitative study. Setting A final UK clinical examination consisting of a two day objective structured clinical examination with 22 stations. Participants 82 students from ethnic minorities and 97 white students. Main outcome measures Mean scores for stations (quantitative) and observations made using discourse analysis on selected communication stations (qualitative). Results Mean performance of students from ethnic minorities was significantly lower than that of white students for stations assessing communication skills on days 1 (67.0% (SD 6.8%) and 72.3% (7.6%); P=0.001) and 2 (65.2% (6.6%) and 69.5% (6.3%); P=0.003). No examples of overt discrimination were found in 309 video recordings. Transcriptions showed subtle differences in communication styles in some students from ethnic minorities who performed poorly. Examiners' assumptions about what is good communication may have contributed to differences in grading. Conclusions There was no evidence of explicit discrimination between students from ethnic minorities and white students in the objective structured clinical examination. A small group of male students from ethnic minorities used particularly poorly rated communicative styles, and some subtle problems in assessing communication skills may have introduced bias. Tests need to reflect issues of diversity to ensure that students from ethnic minorities are not disadvantaged. What is already known on this topic: UK medical schools are concerned that students from ethnic minorities may perform less well than white students in examinations; it is important to understand whether our examination system disadvantages them. What this study adds: Mean performance of students from ethnic minorities was significantly lower than that of white students in a final year objective structured

  14. Spaceport Performance Measures

    NASA Technical Reports Server (NTRS)

    Finger, G. Wayne

    2010-01-01

    Spaceports have traditionally been characterized by performance measures associated with their site characteristics. Measures such as "Latitude" (proximity to the equator), "Azimuth" (range of available launch azimuths) and "Weather" (days of favorable weather) are commonly used to characterize a particular spaceport. However, other spaceport performance measures may now be of greater value. These measures can provide insight into areas of operational difference between competing spaceports and identify areas for improving spaceport performance. This paper suggests Figures of Merit (FOMs) for spaceport "Capacity" (number of potential launch opportunities per year and/or potential mass to low earth orbit (LEO) per year); "Throughput" (actual mass to orbit per year compared to capacity); "Productivity" (labor effort hours per unit mass to orbit); "Energy Efficiency" (joules expended at the spaceport per unit mass to orbit); and "Carbon Footprint" (tons of CO2 per unit mass to orbit). Additional FOMs are investigated for areas of special interest to commercial launch operators, such as "Assignment Schedule" (days required for a binding assignment of a launch site from the spaceport); "Approval Schedule" (days to complete a range safety assessment leading to approval or disapproval of a launch vehicle); "Affordability" (cost for a spaceport to assess a new launch vehicle); "Launch Affordability" (fixed range costs per launch); "Reconfigure Time" (hours to reconfigure the range from one vehicle's launch-ready configuration to another vehicle's configuration); and "Turnaround Time" (minimum range hours required between launches of an identical type of launch vehicle). Available or notional data are analyzed for the KSC/CCAFS area and other spaceports. Observations regarding progress over the past few decades are made, and areas where improvement is needed or indicated are suggested.

  15. The importance of quantitative measurement methods for uveitis: laser flare photometry endorsed in Europe while neglected in Japan where the technology measuring quantitatively intraocular inflammation was developed.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur

    2017-06-01

    Laser flare photometry (LFP) is an objective and quantitative method to measure intraocular inflammation. The LFP technology was developed in Japan and has been commercially available since 1990. The aim of this work was to review the application of LFP in uveitis practice in Europe compared to Japan where the technology was born. We reviewed PubMed articles published on LFP and uveitis. Although LFP has been largely integrated in routine uveitis practice in Europe, it has been comparatively neglected in Japan and still has not received FDA approval in the USA. As LFP is the only method that provides a precise measure of intraocular inflammation, it should be used as a gold standard in uveitis centres worldwide.

  16. An Improved Method for Measuring Quantitative Resistance to the Wheat Pathogen Zymoseptoria tritici Using High-Throughput Automated Image Analysis.

    PubMed

    Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A

    2016-07-01

    Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.
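
    A minimal sketch of the lesion-area part of such an analysis, written here in Python rather than as the authors' ImageJ macro; the grayscale thresholds and the synthetic image are assumptions for illustration.

```python
import numpy as np

# Percent leaf area covered by lesions from a grayscale scan, assuming the leaf
# has been scanned against a bright background and lesions are darker tissue.
def percent_lesion_area(gray, leaf_thresh=0.9, lesion_thresh=0.45):
    leaf = gray < leaf_thresh     # pixels belonging to the leaf (background ~1.0)
    lesion = gray < lesion_thresh  # darker, necrotic lesion pixels
    return 100.0 * lesion.sum() / leaf.sum()

# Synthetic 100x100 "scan": bright background, mid-gray healthy leaf, dark lesion patch.
img = np.ones((100, 100))
img[20:80, 10:90] = 0.7            # healthy leaf tissue
img[40:60, 30:60] = 0.3            # lesion
print(f"{percent_lesion_area(img):.1f}% of leaf area lesioned")
```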

  17. Eight essentials of performance measurement.

    PubMed

    Moullin, Max

    2004-01-01

    A well-designed performance measurement system is vital for ensuring that organisations deliver cost-effective, high-quality services that meet the needs of service users. Without feedback on all important aspects and a system for ensuring that the organisation acts on that information, managers are struggling in the dark to improve services. However, performance measurement is not easy, particularly in health and public services where a wide range of stakeholders is involved. This article discusses what the author considers to be the eight essentials of performance measurement. Though described in the context of health and social care, they are important for organisations in all sectors.

  18. Quantitative surface temperature measurement using two-color thermographic phosphors and video equipment

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M. (Inventor)

    1989-01-01

    A thermal imaging system provides quantitative temperature information and is particularly useful in hypersonic wind tunnel applications. An object to be measured is prepared by coating with a two-color, ultraviolet-activated, thermographic phosphor. The colors emitted by the phosphor are detected by a conventional color video camera. A phosphor emitting blue and green light with a ratio that varies depending on temperature is used so that the intensity of light in the blue and green wavelengths detected by the blue and green tubes in the video camera can be compared. Signals representing the intensity of blue and green light at points on the surface of a model in a hypersonic wind tunnel are used to calculate a ratio of blue to green light intensity which provides quantitative temperature information for the surface of the model.
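
    A minimal sketch of ratio-based two-color thermometry as described above. The calibration curve relating the blue/green ratio to temperature is hypothetical; in practice it would be measured for the specific phosphor.

```python
import numpy as np

# Hypothetical calibration: blue/green intensity ratio vs. surface temperature.
calib_ratio = np.array([0.45, 0.60, 0.80, 1.05, 1.35])
calib_temp_K = np.array([300.0, 320.0, 340.0, 360.0, 380.0])

def temperature_from_ratio(blue, green):
    """Map per-pixel blue/green intensities to temperature via the calibration curve."""
    ratio = blue / green
    return np.interp(ratio, calib_ratio, calib_temp_K)

# Blue and green channel intensities at three surface points (arbitrary units).
blue = np.array([120.0, 150.0, 210.0])
green = np.array([200.0, 190.0, 200.0])
print(temperature_from_ratio(blue, green))  # temperatures in K
```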

  19. Human performance measuring device

    NASA Technical Reports Server (NTRS)

    Michael, J.; Scow, J.

    1970-01-01

    Complex coordinator, consisting of operator control console, recorder, subject display panel, and limb controls, measures human performance by testing perceptual and motor skills. Device measures psychophysiological functions in drug and environmental studies, and is applicable to early detection of psychophysiological body changes.

  20. Quantitative assessment based on kinematic measures of functional impairments during upper extremity movements: A review.

    PubMed

    de los Reyes-Guzmán, Ana; Dimbwadyo-Terrer, Iris; Trincado-Alonso, Fernando; Monasterio-Huelin, Félix; Torricelli, Diego; Gil-Agudo, Angel

    2014-08-01

    Quantitative measures of human movement quality are important for discriminating healthy and pathological conditions and for expressing the outcomes and clinically important changes in subjects' functional state. However, the most frequently used instruments for upper extremity functional assessment are clinical scales which, although previously standardized and validated, have a high subjective component that depends on the observer who scores the test. They are not sufficient to assess the motor strategies used during movements, and their use in combination with other, more objective measures is necessary. The objective of the present review is to provide an overview of the objective metrics found in the literature for quantifying upper extremity performance during functional tasks, regardless of the equipment or system used for registering kinematic data. A search of the Medline, Google Scholar and IEEE Xplore databases was performed using a combination of keywords. Full scientific papers that fulfilled the inclusion criteria were included in the review. A set of kinematic metrics was found in the literature relating to joint displacements, analysis of hand trajectories and velocity profiles. These metrics were classified into different categories according to the movement characteristic being measured. These kinematic metrics provide the starting point for proposed objective metrics for the functional assessment of the upper extremity in people with movement disorders resulting from neurological injuries. Potential areas of future and further research are presented in the Discussion section. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Perk Station – Percutaneous Surgery Training and Performance Measurement Platform

    PubMed Central

    Vikal, Siddharth; U-Thainual, Paweena; Carrino, John A.; Iordachita, Iulian; Fischer, Gregory S.; Fichtinger, Gabor

    2009-01-01

    Motivation Image-guided percutaneous (through the skin) needle-based surgery has become part of routine clinical practice in performing procedures such as biopsies, injections and therapeutic implants. A novice physician typically performs needle interventions under the supervision of a senior physician; a slow and inherently subjective training process that lacks objective, quantitative assessment of the surgical skill and performance. Shortening the learning curve and increasing procedural consistency are important factors in assuring high-quality medical care. Methods This paper describes a laboratory validation system, called Perk Station, for standardized training and performance measurement under different assistance techniques for needle-based surgical guidance systems. The initial goal of the Perk Station is to assess and compare different techniques: 2D image overlay, biplane laser guide, laser protractor and conventional freehand. The main focus of this manuscript is the planning and guidance software system developed on the 3D Slicer platform, a free, open source software package designed for visualization and analysis of medical image data. Results The prototype Perk Station has been successfully developed, the associated needle insertion phantoms were built, and the graphical user interface was fully implemented. The system was inaugurated in undergraduate teaching and a wide array of outreach activities. Initial results, experiences, ongoing activities and future plans are reported. PMID:19539446

  2. Quantitative Measurement of Local Infrared Absorption and Dielectric Function with Tip-Enhanced Near-Field Microscopy.

    PubMed

    Govyadinov, Alexander A; Amenabar, Iban; Huth, Florian; Carney, P Scott; Hillenbrand, Rainer

    2013-05-02

    Scattering-type scanning near-field optical microscopy (s-SNOM) and Fourier transform infrared nanospectroscopy (nano-FTIR) are emerging tools for nanoscale chemical material identification. Here, we push s-SNOM and nano-FTIR one important step further by enabling them to quantitatively measure local dielectric constants and infrared absorption. Our technique is based on an analytical model, which allows for a simple inversion of the near-field scattering problem. It yields the dielectric permittivity and absorption of samples with 2 orders of magnitude improved spatial resolution compared to far-field measurements and is applicable to a large class of samples including polymers and biological matter. We verify the capabilities by determining the local dielectric permittivity of a PMMA film from nano-FTIR measurements, which is in excellent agreement with far-field ellipsometric data. We further obtain local infrared absorption spectra with unprecedented accuracy in peak position and shape, which is the key to quantitative chemometrics on the nanometer scale.

  3. Measuring the Nonuniform Evaporation Dynamics of Sprayed Sessile Microdroplets with Quantitative Phase Imaging.

    PubMed

    Edwards, Chris; Arbabi, Amir; Bhaduri, Basanta; Wang, Xiaozhen; Ganti, Raman; Yunker, Peter J; Yodh, Arjun G; Popescu, Gabriel; Goddard, Lynford L

    2015-10-13

    We demonstrate real-time quantitative phase imaging as a new optical approach for measuring the evaporation dynamics of sessile microdroplets. Quantitative phase images of various droplets were captured during evaporation. The images enabled us to generate time-resolved three-dimensional topographic profiles of droplet shape with nanometer accuracy and, without any assumptions about droplet geometry, to directly measure important physical parameters that characterize surface wetting processes. Specifically, the time-dependent variation of the droplet height, volume, contact radius, contact angle distribution along the droplet's perimeter, and mass flux density for two different surface preparations are reported. The studies clearly demonstrate three phases of evaporation reported previously: pinned, depinned, and drying modes; the studies also reveal instances of partial pinning. Finally, the apparatus is employed to investigate the cooperative evaporation of the sprayed droplets. We observe and explain the neighbor-induced reduction in evaporation rate, that is, as compared to predictions for isolated droplets. In the future, the new experimental methods should stimulate the exploration of colloidal particle dynamics on the gas-liquid-solid interface.

  4. Diagnostic accuracy of stress perfusion CMR in comparison with quantitative coronary angiography: fully quantitative, semiquantitative, and qualitative assessment.

    PubMed

    Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E

    2014-01-01

    This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). Fully quantitative stress perfusion CMR has high diagnostic accuracy for
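
    A minimal sketch of the reported diagnostic rule, which flags ischemia when endocardial flow falls below 50% of the mean epicardial flow; the per-segment flow values are illustrative, not patient data.

```python
import numpy as np

# Illustrative per-segment myocardial blood flow values (ml/g/min).
endo_flow = np.array([3.1, 2.9, 1.2, 0.9, 3.3, 2.8])  # endocardial flow per segment
epi_flow = np.array([3.0, 3.2, 2.9, 3.1, 3.4, 3.0])   # epicardial flow per segment

# Reported optimum threshold: endocardial flow below 50% of mean epicardial flow.
threshold = 0.5 * epi_flow.mean()
ischemic = endo_flow < threshold
print(f"threshold = {threshold:.2f} ml/g/min, ischemic segments: {np.where(ischemic)[0]}")
```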

  5. Quantitative evaluation of statistical errors in small-angle X-ray scattering measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sedlak, Steffen M.; Bruetzel, Linda K.; Lipfert, Jan

    A new model is proposed for the measurement errors incurred in typical small-angle X-ray scattering (SAXS) experiments, which takes into account the setup geometry and physics of the measurement process. The model accurately captures the experimentally determined errors from a large range of synchrotron and in-house anode-based measurements. Its most general formulation gives for the variance of the buffer-subtracted SAXS intensity σ²(q) = [I(q) + const.]/(kq), where I(q) is the scattering intensity as a function of the momentum transfer q; k and const. are fitting parameters that are characteristic of the experimental setup. The model gives a concrete procedure for calculating realistic measurement errors for simulated SAXS profiles. In addition, the results provide guidelines for optimizing SAXS measurements, which are in line with established procedures for SAXS experiments, and enable a quantitative evaluation of measurement errors.
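
    A minimal sketch of applying the stated error model to a simulated profile; the toy intensity curve and the values of k and const. are arbitrary assumptions.

```python
import numpy as np

# Error model: Var[I_sub(q)] = (I(q) + const.) / (k * q), used here to attach
# synthetic, realistic error bars to a simulated SAXS profile.
q = np.linspace(0.01, 0.5, 200)             # momentum transfer, 1/Angstrom
I = 100.0 * np.exp(-(q * 25.0) ** 2 / 3.0)  # toy Guinier-like intensity curve
k, c = 5.0e4, 2.0                           # setup-dependent fitting parameters (arbitrary)

sigma = np.sqrt((I + c) / (k * q))          # model standard deviation per q-bin
I_noisy = I + np.random.normal(0.0, sigma)  # one noisy realization of the profile
print(sigma[:5])
```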

  6. Electrons, Photons, and Force: Quantitative Single-Molecule Measurements from Physics to Biology

    PubMed Central

    2011-01-01

    Single-molecule measurement techniques have illuminated unprecedented details of chemical behavior, including observations of the motion of a single molecule on a surface, and even the vibration of a single bond within a molecule. Such measurements are critical to our understanding of entities ranging from single atoms to the most complex protein assemblies. We provide an overview of the strikingly diverse classes of measurements that can be used to quantify single-molecule properties, including those of single macromolecules and single molecular assemblies, and discuss the quantitative insights they provide. Examples are drawn from across the single-molecule literature, ranging from ultrahigh vacuum scanning tunneling microscopy studies of adsorbate diffusion on surfaces to fluorescence studies of protein conformational changes in solution. PMID:21338175

  7. Airborne radar and radiometer experiment for quantitative remote measurements of rain

    NASA Technical Reports Server (NTRS)

    Kozu, Toshiaki; Meneghini, Robert; Boncyk, Wayne; Wilheit, Thomas T.; Nakamura, Kenji

    1989-01-01

    An aircraft experiment has been conducted with a dual-frequency (10 GHz and 35 GHz) radar/radiometer system and an 18-GHz radiometer to test various rain-rate retrieval algorithms from space. In the experiment, which took place in the fall of 1988 at the NASA Wallops Flight Facility, VA, both stratiform and convective storms were observed. A ground-based radar and rain gauges were also used to obtain truth data. An external radar calibration is made with rain gauge data, thereby enabling quantitative reflectivity measurements. Comparisons between path attenuations derived from the surface return and from the radar reflectivity profile are made to test the feasibility of a technique to estimate the raindrop size distribution from simultaneous radar and path-attenuation measurements.

  8. Quantitative analysis of three chiral pesticide enantiomers by high-performance column liquid chromatography.

    PubMed

    Wang, Peng; Liu, Donghui; Gu, Xu; Jiang, Shuren; Zhou, Zhiqiang

    2008-01-01

    Methods for the enantiomeric quantitative determination of 3 chiral pesticides, paclobutrazol, myclobutanil, and uniconazole, and their residues in soil and water are reported. An effective chiral high-performance liquid chromatographic (HPLC)-UV method using an amylose tris(3,5-dimethylphenylcarbamate) (AD) column was developed for resolving the enantiomers and quantitative determination. The enantiomers were identified by a circular dichroism detector. Validation involved complete resolution of each of the 2 enantiomers, plus determination of linearity, precision, and limit of detection (LOD). The pesticide enantiomers were isolated by solvent extraction from soil and by C18 solid-phase extraction from water. The 2 enantiomers of each of the 3 pesticides could be completely separated on the AD column using an n-hexane/isopropanol mobile phase. The linearity and precision results indicated that the method was reliable for the quantitative analysis of the enantiomers. LODs were 0.025, 0.05, and 0.05 mg/kg for each enantiomer of paclobutrazol, myclobutanil, and uniconazole, respectively. Recovery and precision data showed that the pretreatment procedures were satisfactory for enantiomer extraction and cleanup. This method can be used for optical purity determination of technical material and analysis of environmental residues.

  9. 2 CFR 200.301 - Performance measurement.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

§ 200.301 Performance measurement. The Federal awarding agency must require the recipient to use OMB-approved governmentwide standard information collections when providing financial and performance...

  10. Implementing online quantitative support modules in an intermediate-level course

    NASA Astrophysics Data System (ADS)

    Daly, J.

    2011-12-01

    While instructors typically anticipate that students in introductory geology courses enter a class with a wide range of quantitative ability, we often overlook the fact that this may also be true in upper-level courses. Some students are drawn to the subject and experience success in early courses with an emphasis on descriptive geology, then experience frustration and disappointment in mid- and upper-level courses that are more quantitative. To bolster student confidence in quantitative skills and enhance their performance in an upper-level course, I implemented several modules from The Math You Need (TMYN) online resource with a 200-level geomorphology class. Student facility with basic quantitative skills (rearranging equations, manipulating units, and graphing) was assessed with an online pre- and post-test. During the semester, modules were assigned to complement existing course activities (for example, the module on manipulating units was assigned prior to a lab on measurement of channel area and water velocity, then calculation of discharge). The implementation was designed to be a concise review of relevant skills for students with higher confidence in their quantitative abilities, and to provide a self-paced opportunity for students with less quantitative facility to build skills. This course already includes a strong emphasis on quantitative data collection, analysis, and presentation; in the past, student performance in the course has been strongly influenced by their individual quantitative ability. I anticipate that giving students the opportunity to improve mastery of fundamental quantitative skills will improve their performance on higher-stakes assignments and exams, and will enhance their sense of accomplishment in the course.

  11. High throughput and quantitative approaches for measuring circadian rhythms in cyanobacteria using bioluminescence

    PubMed Central

    Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.

    2016-01-01

    The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451

  12. Formation resistivity measurements from within a cased well used to quantitatively determine the amount of oil and gas present

    DOEpatents

    Vail, III, William B.

    1997-01-01

    Methods to quantitatively determine the separate amounts of oil and gas in a geological formation adjacent to a cased well using measurements of formation resistivity are disclosed. The steps include obtaining resistivity measurements from within a cased well of a given formation, obtaining the porosity, obtaining the resistivity of formation water present, computing the combined amounts of oil and gas present using Archie's Equations, determining the relative amounts of oil and gas present from measurements within a cased well, and then quantitatively determining the separate amounts of oil and gas present in the formation.
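    The patent's quantification step rests on Archie's relation between formation resistivity and water saturation; a hedged sketch of that standard relation is given below for orientation. The symbols and empirical exponents are textbook petrophysical notation, not values disclosed in the patent.

```latex
% Archie's equation (standard form); a, m, n are empirical constants
% R_t: measured formation resistivity, R_w: formation-water resistivity
% \phi: porosity, S_w: water saturation
S_w = \left( \frac{a\,R_w}{\phi^{m} R_t} \right)^{1/n},
\qquad S_{\mathrm{oil+gas}} = 1 - S_w
```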

  13. Reineke’s stand density index: a quantitative and non-unitless measure of stand density

    Treesearch

    Curtis L. VanderSchaaf

    2013-01-01

When used as a measure of relative density, Reineke's stand density index (SDI) can be made unitless by relating the current SDI to a standard density; but when used as a quantitative measure of stand density, SDI is not unitless. Reineke's SDI relates the current stand density to an equivalent number of trees per unit area in a stand with a quadratic mean diameter (Dq)...
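    As a concrete illustration of the quantity under discussion, here is a minimal sketch of Reineke's SDI in its common English-unit form (reference quadratic mean diameter of 10 inches, slope exponent 1.605); the function name and example numbers are illustrative and not taken from the paper.

```python
def reineke_sdi(trees_per_acre: float, dq_inches: float) -> float:
    """Reineke's stand density index: the equivalent number of trees per acre
    in a stand with a quadratic mean diameter (Dq) of 10 inches."""
    return trees_per_acre * (dq_inches / 10.0) ** 1.605

# Example: 400 trees/acre at Dq = 8 in gives an SDI of roughly 280 trees/acre.
print(round(reineke_sdi(400.0, 8.0)))
```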

  14. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; Zheng, Lu; Jiang, Zhanzhi; Ganesan, Vishal; Wang, Yayu; Lai, Keji

    2018-04-01

    We report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  15. Evaluation of quantitative PCR measurement of bacterial colonization of epithelial cells.

    PubMed

    Schmidt, Marcin T; Olejnik-Schmidt, Agnieszka K; Myszka, Kamila; Borkowska, Monika; Grajek, Włodzimierz

    2010-01-01

Microbial colonization is an important step in establishing pathogenic or probiotic relations with host cells and in biofilm formation on industrial or medical devices. The aim of this work was to verify the applicability of quantitative PCR (real-time PCR) for measuring bacterial colonization of epithelial cells. Salmonella enterica and the Caco-2 intestinal epithelial cell line were used as a model. To verify the sensitivity of the assay, competition between the pathogen and a probiotic microorganism was tested. The qPCR method was compared with plate counting and a radiolabel approach, which are well-established techniques in this area of research. The three methods returned similar results. The radiolabel method had the best quantification accuracy, followed by qPCR. The plate count results showed a coefficient of variation twice that of qPCR. Quantitative PCR proved to be a reliable method for enumerating microbes in colonization assays. It has several advantages that make it very useful for analyzing mixed populations, where several different species or even strains can be monitored at the same time.
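    For readers unfamiliar with qPCR-based enumeration, the sketch below shows the usual standard-curve route from cycle-threshold (Ct) values to cell counts; the dilution-series numbers and function names are illustrative assumptions, not data or code from the study.

```python
import numpy as np

def fit_standard_curve(log10_counts, ct_values):
    """Least-squares fit of Ct = slope * log10(count) + intercept."""
    slope, intercept = np.polyfit(log10_counts, ct_values, 1)
    return slope, intercept

def cells_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate the cell count behind a sample Ct."""
    return 10.0 ** ((ct - intercept) / slope)

# Illustrative 10-fold dilution series (1e3 to 1e7 cells), ~3.3 cycles per log step
log_counts = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
cts = np.array([30.1, 26.8, 23.5, 20.2, 16.9])
slope, intercept = fit_standard_curve(log_counts, cts)
print(f"estimated cells in sample: {cells_from_ct(25.0, slope, intercept):.2e}")
```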

  16. Timed function tests, motor function measure, and quantitative thigh muscle MRI in ambulant children with Duchenne muscular dystrophy: A cross-sectional analysis.

    PubMed

    Schmidt, Simone; Hafner, Patricia; Klein, Andrea; Rubino-Nacht, Daniela; Gocheva, Vanya; Schroeder, Jonas; Naduvilekoot Devasia, Arjith; Zuesli, Stephanie; Bernert, Guenther; Laugel, Vincent; Bloetzer, Clemens; Steinlin, Maja; Capone, Andrea; Gloor, Monika; Tobler, Patrick; Haas, Tanja; Bieri, Oliver; Zumbrunn, Thomas; Fischer, Dirk; Bonati, Ulrike

    2018-01-01

    The development of new therapeutic agents for the treatment of Duchenne muscular dystrophy has put a focus on defining outcome measures most sensitive to capture treatment effects. This cross-sectional analysis investigates the relation between validated clinical assessments such as the 6-minute walk test, motor function measure and quantitative muscle MRI of thigh muscles in ambulant Duchenne muscular dystrophy patients, aged 6.5 to 10.8 years (mean 8.2, SD 1.1). Quantitative muscle MRI included the mean fat fraction using a 2-point Dixon technique, and transverse relaxation time (T2) measurements. All clinical assessments were highly significantly inter-correlated with p < 0.001. The strongest correlation with the motor function measure and its D1-subscore was shown by the 6-minute walk test. Clinical assessments showed no correlation with age. Importantly, quantitative muscle MRI values significantly correlated with all clinical assessments with the extensors showing the strongest correlation. In contrast to the clinical assessments, quantitative muscle MRI values were highly significantly correlated with age. In conclusion, the motor function measure and timed function tests measure disease severity in a highly comparable fashion and all tests correlated with quantitative muscle MRI values quantifying fatty muscle degeneration. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Retrieving the Quantitative Chemical Information at Nanoscale from Scanning Electron Microscope Energy Dispersive X-ray Measurements by Machine Learning

    NASA Astrophysics Data System (ADS)

    Jany, B. R.; Janas, A.; Krok, F.

    2017-11-01

The quantitative composition of metal alloy nanowires on the InSb(001) semiconductor surface and of gold nanostructures on a germanium surface is determined by a blind source separation (BSS) machine learning (ML) method using non-negative matrix factorization (NMF) applied to energy dispersive X-ray spectroscopy (EDX) spectrum image maps measured in a scanning electron microscope (SEM). The BSS method blindly decomposes the collected EDX spectrum image into three source components, which correspond directly to the X-ray signals coming from the supported metal nanostructures, the bulk semiconductor, and the carbon background. The recovered quantitative composition is validated by detailed Monte Carlo simulations and is confirmed by separate cross-sectional TEM EDX measurements of the nanostructures. This shows that SEM EDX measurements together with machine learning blind source separation processing can be used successfully to determine the quantitative chemical composition of nanostructures.
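    A minimal sketch of the kind of decomposition described, using scikit-learn's NMF on a spectrum image flattened to a (pixels x energy-channels) matrix; the array sizes, synthetic data, and three-component choice are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np
from sklearn.decomposition import NMF

# Assume the EDX spectrum image has been reshaped to (n_pixels, n_channels)
# with non-negative X-ray counts in each energy channel.
rng = np.random.default_rng(0)
spectrum_image = rng.poisson(5.0, size=(64 * 64, 1024)).astype(float)

# Blindly decompose into 3 non-negative sources
# (e.g. metal nanostructures, bulk substrate, carbon background).
model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
abundance_maps = model.fit_transform(spectrum_image)   # (n_pixels, 3)
source_spectra = model.components_                     # (3, n_channels)

# Each abundance map can be folded back into the 64 x 64 scan for inspection.
print(abundance_maps.reshape(64, 64, 3).shape, source_spectra.shape)
```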

  18. Quantitative measurements of intercellular adhesion between a macrophage and cancer cells using a cup-attached AFM chip.

    PubMed

    Kim, Hyonchol; Yamagishi, Ayana; Imaizumi, Miku; Onomura, Yui; Nagasaki, Akira; Miyagi, Yohei; Okada, Tomoko; Nakamura, Chikashi

    2017-07-01

Intercellular adhesion between a macrophage and cancer cells was quantitatively measured using atomic force microscopy (AFM). Cup-shaped metal hemispheres were fabricated using polystyrene particles as a template, and a cup was attached to the apex of the AFM cantilever. The cup-attached AFM chip (cup-chip) approached a murine macrophage cell (J774.2), the cell was captured on the inner concave of the cup, and it was picked up by withdrawing the cup-chip from the substrate. The cell-attached chip was advanced towards a murine breast cancer cell (FP10SC2), and intercellular adhesion between the two cells was quantitatively measured. To compare cell adhesion strength, the work required to separate two adhered cells (separation work) was used as a parameter. Separation work was almost 2-fold larger between a J774.2 cell and an FP10SC2 cell than between a J774.2 cell and three other cancer cell lines (4T1E, MAT-LyLu, and U-2OS), between two FP10SC2 cells, or between two J774.2 cells. Because FP10SC2 was established from 4T1E as a highly metastatic cell line, this indicates that separation work increased as the malignancy of the cancer cells became higher. One possible explanation for the strong adhesion of macrophages to cancer cells observed in this study is that the measurement condition mimicked the microenvironment of tumor-associated macrophages (TAMs) in vivo, as J774.2 cells strongly expressed CD204, a marker of TAMs. The results of the present study, obtained by measuring cell adhesion strength quantitatively, indicate that the fabricated cup-chip is a useful tool for measuring intercellular adhesion easily and quantitatively. Copyright © 2017 Elsevier B.V. All rights reserved.
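    For context on the "separation work" parameter, a minimal sketch of how such a quantity is commonly computed from the retract branch of an AFM force-distance curve (the area enclosed by the adhesive, i.e. negative, force); the synthetic curve and numbers are illustrative, not the authors' data or processing code.

```python
import numpy as np

def separation_work(distance_m: np.ndarray, force_n: np.ndarray) -> float:
    """Estimate separation work as the area enclosed by the adhesive
    (negative) part of a retract force-distance curve."""
    adhesion = np.minimum(force_n, 0.0)                   # attractive force only
    segments = (adhesion[1:] + adhesion[:-1]) / 2.0 * np.diff(distance_m)
    return -float(np.sum(segments))                       # joules, positive

# Illustrative retract curve: 1 um of travel with ~1 nN peak adhesion
d = np.linspace(0.0, 1.0e-6, 500)
f = -1.0e-9 * np.exp(-((d - 0.3e-6) / 0.1e-6) ** 2)
print(f"separation work ~ {separation_work(d, f):.2e} J")
```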

  19. Measuring changes in transmission of neglected tropical diseases, malaria, and enteric pathogens from quantitative antibody levels.

    PubMed

    Arnold, Benjamin F; van der Laan, Mark J; Hubbard, Alan E; Steel, Cathy; Kubofcik, Joseph; Hamlin, Katy L; Moss, Delynn M; Nutman, Thomas B; Priest, Jeffrey W; Lammie, Patrick J

    2017-05-01

    Serological antibody levels are a sensitive marker of pathogen exposure, and advances in multiplex assays have created enormous potential for large-scale, integrated infectious disease surveillance. Most methods to analyze antibody measurements reduce quantitative antibody levels to seropositive and seronegative groups, but this can be difficult for many pathogens and may provide lower resolution information than quantitative levels. Analysis methods have predominantly maintained a single disease focus, yet integrated surveillance platforms would benefit from methodologies that work across diverse pathogens included in multiplex assays. We developed an approach to measure changes in transmission from quantitative antibody levels that can be applied to diverse pathogens of global importance. We compared age-dependent immunoglobulin G curves in repeated cross-sectional surveys between populations with differences in transmission for multiple pathogens, including: lymphatic filariasis (Wuchereria bancrofti) measured before and after mass drug administration on Mauke, Cook Islands, malaria (Plasmodium falciparum) before and after a combined insecticide and mass drug administration intervention in the Garki project, Nigeria, and enteric protozoans (Cryptosporidium parvum, Giardia intestinalis, Entamoeba histolytica), bacteria (enterotoxigenic Escherichia coli, Salmonella spp.), and viruses (norovirus groups I and II) in children living in Haiti and the USA. Age-dependent antibody curves fit with ensemble machine learning followed a characteristic shape across pathogens that aligned with predictions from basic mechanisms of humoral immunity. Differences in pathogen transmission led to shifts in fitted antibody curves that were remarkably consistent across pathogens, assays, and populations. Mean antibody levels correlated strongly with traditional measures of transmission intensity, such as the entomological inoculation rate for P. falciparum (Spearman's rho = 0.75). In

  20. Quantitative performance of a polarization diffraction grating polarimeter encoded onto two liquid-crystal-on-silicon displays

    NASA Astrophysics Data System (ADS)

    Cofré, Aarón; Vargas, Asticio; Torres-Ruiz, Fabián A.; Campos, Juan; Lizana, Angel; del Mar Sánchez-López, María; Moreno, Ignacio

    2017-11-01

We present a quantitative analysis of the performance of a complete snapshot polarimeter based on a polarization diffraction grating (PDGr). The PDGr is generated in a common-path polarization interferometer with a Z optical architecture that uses two liquid-crystal-on-silicon (LCoS) displays to imprint two different phase-only diffraction gratings onto two orthogonal linear states of polarization. As a result, we obtain a programmable PDGr capable of acting as a simultaneous polarization state generator (PSG), yielding diffraction orders with different states of polarization. The same system is also shown to operate as a polarization state analyzer (PSA), and is therefore useful for the realization of a snapshot polarimeter. We analyze its performance using quantitative metrics such as the condition number, and verify its reliability for the detection of states of polarization.

  1. A quantitative ELISA procedure for the measurement of membrane-bound platelet-associated IgG (PAIgG).

    PubMed

    Lynch, D M; Lynch, J M; Howe, S E

    1985-03-01

    A quantitative ELISA assay for the measurement of in vivo bound platelet-associated IgG (PAIgG) using intact patient platelets is presented. The assay requires quantitation and standardization of the number of platelets bound to microtiter plate wells and an absorbance curve using quantitated IgG standards. Platelet-bound IgG was measured using an F(ab')2 peroxidase labeled anti-human IgG and o-phenylenediamine dihydrochloride (OPD) as the substrate. Using this assay, PAIgG for normal individuals was 2.8 +/- 1.6 fg/platelet (mean +/- 1 SD; n = 30). Increased levels were found in 28 of 30 patients with clinical autoimmune thrombocytopenia (ATP) with a range of 7.0-80 fg/platelet. Normal PAIgG levels were found in 26 of 30 patients with nonimmune thrombocytopenia. In the sample population studied, the PAIgG assay showed a sensitivity of 93%, specificity of 90%, a positive predictive value of 0.90, and a negative predictive value of 0.93. The procedure is highly reproducible (CV = 6.8%) and useful in evaluating patients with suspected immune mediated thrombocytopenia.
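    As a small worked example of the reported accuracy figures, the sketch below computes standard 2x2 diagnostic metrics from the counts that can be read off the abstract (28/30 ATP patients elevated, 26/30 nonimmune thrombocytopenia patients normal); small differences from the published percentages may reflect rounding in the abstract.

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard measures of diagnostic accuracy from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

# Counts implied by the abstract: 28 of 30 ATP patients positive,
# 26 of 30 nonimmune thrombocytopenia patients negative.
print(diagnostic_metrics(tp=28, fp=4, fn=2, tn=26))
```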

  2. Quantitative and simultaneous non-invasive measurement of skin hydration and sebum levels

    PubMed Central

    Ezerskaia, Anna; Pereira, S. F.; Urbach, H. Paul; Verhagen, Rieko; Varghese, Babu

    2016-01-01

We report a method for quantitative and simultaneous non-contact in-vivo hydration and sebum measurements of the skin using an infrared optical spectroscopic set-up. The method utilizes differential detection with three wavelengths, 1720, 1750, and 1770 nm, corresponding to the lipid vibrational bands that lie "in between" the prominent water absorption bands. We used an emulsifier containing hydro- and lipophilic components to mix water and sebum in various volume fractions, which was applied to the skin to mimic different oily-dry skin conditions. We also measured the skin sebum and hydration values on the forehead under natural conditions and their variations in response to external stimuli. Good agreement was found between our experimental results and reference values measured using conventional biophysical methods such as the Corneometer and Sebumeter. PMID:27375946

  3. Measuring the performance of livability programs.

    DOT National Transportation Integrated Search

    2013-07-01

    This report analyzes the performance measurement processes adopted by five large livability programs throughout the United States. It compares and contrasts these programs by examining existing research in performance measurement methods. The ...

  4. The implications of low-speed fixed-wing aerofoil measurements on the analysis and performance of flapping bird wings.

    PubMed

    Spedding, G R; Hedenström, A H; McArthur, J; Rosén, M

    2008-01-01

Bird flight occurs over a range of Reynolds numbers (Re; 10^4 ≤ Re ≤ 10^5, where Re is a measure of the relative importance of inertia and viscosity) that includes regimes where standard aerofoil performance is difficult to predict, compute or measure, with large performance jumps in response to small changes in geometry or environmental conditions. A comparison of measurements of fixed wing performance as a function of Re, combined with quantitative flow visualisation techniques, shows that, surprisingly, wakes of flapping bird wings at moderate flight speeds admit to certain simplifications where their basic properties can be understood through quasi-steady analysis. Indeed, a commonly cited measure of the relative flapping frequency, or wake unsteadiness, the Strouhal number, is seen to be approximately constant in accordance with a simple requirement for maintaining a moderate local angle of attack on the wing. Together, the measurements imply a fine control of boundary layer separation on the wings, with implications for control strategies and wing shape selection by natural and artificial fliers.

  5. Quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis--data from the Osteoarthritis Initiative.

    PubMed

    Emmanuel, K; Quinn, E; Niu, J; Guermazi, A; Roemer, F; Wirth, W; Eckstein, F; Felson, D

    2016-02-01

To test the hypothesis that quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis (KOA), prior to the advent of radiographic disease. 206 knees with incident radiographic KOA (Kellgren Lawrence Grade (KLG) 0 or 1 at baseline, developing KLG 2 or greater with a definite osteophyte and joint space narrowing (JSN) grade ≥1 by year 4) were matched to 232 control knees not developing incident KOA. Manual segmentation of the central five slices of the medial and lateral meniscus was performed on coronal 3T DESS MRI, and quantitative meniscus position was determined. Cases and controls were compared using conditional logistic regression adjusting for age, sex, BMI, race, and clinical site. Sensitivity analyses of early (year [Y] 1/2) and late (Y3/4) incidence were performed. Mean medial extrusion distance was significantly greater for incident than for non-incident knees (1.56 ± 1.12 mm (mean ± SD) vs 1.29 ± 0.99 mm; +21%, P < 0.01), as was the percent extrusion area of the medial meniscus (25.8 ± 15.8% vs 22.0 ± 13.5%; +17%, P < 0.05). This finding was consistent for knees restricted to medial incidence. No significant differences were observed for the lateral meniscus in incident medial KOA, or for tibial plateau coverage between incident and non-incident knees. When the analysis was restricted to medial incident KOA at Y1/2, differences were attenuated but reached significance for extrusion distance, whereas no significant differences were observed for incident KOA in Y3/4. Greater medial meniscus extrusion predicts incident radiographic KOA. Early-onset KOA showed greater differences in meniscus position between incident and non-incident knees than late-onset KOA. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  6. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
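    To make the simplest of the calibration strategies mentioned concrete (external calibration against pure standards), here is a hedged sketch of a linear calibration and back-calculation; internal-standard correction and matrix matching are deliberately omitted, and every number is invented.

```python
import numpy as np

# Invented calibration standards: concentration (ug/L) vs. measured intensity (counts/s)
conc = np.array([0.0, 1.0, 5.0, 10.0, 50.0])
intensity = np.array([120.0, 980.0, 4950.0, 9800.0, 49500.0])

slope, intercept = np.polyfit(conc, intensity, 1)

def back_calculate(sample_intensity: float) -> float:
    """Convert a sample's measured intensity to concentration via the calibration line."""
    return (sample_intensity - intercept) / slope

print(f"sample concentration ~ {back_calculate(22000.0):.1f} ug/L")
```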

  7. Gold Nanoparticle Labeling Based ICP-MS Detection/Measurement of Bacteria, and Their Quantitative Photothermal Destruction

    PubMed Central

    Lin, Yunfeng

    2015-01-01

Bacteria such as Salmonella and E. coli present a great challenge in public health care in today's society. Protection of public safety against bacterial contamination and rapid diagnosis of infection require simple and fast assays for the detection and elimination of bacterial pathogens. Using Salmonella DT104 as an example bacterial strain for our investigation, we report a rapid and sensitive assay for the qualitative and quantitative detection of bacteria by using antibody affinity binding, popcorn-shaped gold nanoparticle (GNPOPs) labeling, surface-enhanced Raman spectroscopy (SERS), and inductively coupled plasma mass spectrometry (ICP-MS) detection. For qualitative analysis, our assay can detect Salmonella within 10 min by Raman spectroscopy; for quantitative analysis, our assay has the ability to measure as few as 100 Salmonella DT104 in a 1 mL sample (100 CFU/mL) within 40 min. Based on the quantitative detection, we investigated the quantitative destruction of Salmonella DT104 and the assay's photothermal efficiency, in order to reduce the amount of GNPOPs in the assay and ultimately eliminate any potential side effects/toxicity to the surrounding cells in vivo. Results suggest that our assay may serve as a promising candidate for qualitative and quantitative detection and elimination of a variety of bacterial pathogens. PMID:26417447

  8. How Students Process Equations in Solving Quantitative Synthesis Problems? Role of Mathematical Complexity in Students' Mathematical Performance

    ERIC Educational Resources Information Center

    Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan

    2017-01-01

    We examine students' mathematical performance on quantitative "synthesis problems" with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking,…

  9. The quantitative evaluation of intracranial pressure by optic nerve sheath diameter/eye diameter CT measurement.

    PubMed

    Bekerman, Inessa; Sigal, Tal; Kimiagar, Itzhak; Ben Ely, Anna; Vaiman, Michael

    2016-12-01

The changes of the optic nerve sheath diameter (ONSD) have been used to assess changes of the intracranial pressure for 20 years. The aim of this research was to further quantify the technique of measuring the ONSD for this purpose. Computed tomographic (CT) data of 1766 adult patients with intracranial hypotension (n=134) or intracranial hypertension (n=1632) were retrospectively analyzed. The eyeball transverse diameter (ETD) and ONSD were obtained bilaterally, and the ONSD/ETD ratio was calculated. The ratio was used to calculate the normal ONSD for patients and to estimate the intracranial pressure of the patients before and after the onset of the pathology. Correlation analysis was performed with invasively measured intracranial pressure, the presence or absence of papilledema, sex, and age. In hypotension cases, the ONSD by CT was 3.4±0.7 mm (P=.03 against the normative 4.4±0.8 mm). In cases with hypertension, the diameter was 6.9±1.3 mm (P=.02, with a cutoff value >5.5 mm). The ONSD/ETD ratio was 0.29±0.04, against 0.19±0.02 in healthy adults (P=.01). The ONSD and the ONSD/ETD ratio can indicate low intracranial pressure, but quantification is impossible at intracranial pressures less than 13 mm Hg. In elevated intracranial pressure, the ONSD and the ratio provide readings that correspond to readings in millimeters of mercury. The ONSD method, reinforced with additional calculations, may help to indicate a raised intracranial pressure, evaluate its severity quantitatively, and establish quantitative goals for treatment of intracranial hypertension, but the limitations of the method are to be taken into account. Copyright © 2016 Elsevier Inc. All rights reserved.
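    A minimal arithmetic sketch of the ratio the study relies on; the eyeball diameter used here is an assumed typical value, and the thresholds quoted from the abstract are treated as illustrative rather than as a validated decision rule.

```python
def onsd_etd_ratio(onsd_mm: float, etd_mm: float) -> float:
    """Optic nerve sheath diameter normalized by the eyeball transverse diameter."""
    return onsd_mm / etd_mm

# Abstract values: ratio ~0.19 in healthy adults vs ~0.29 with hypertension.
# An ETD of 23.5 mm is an assumed typical adult value, not a figure from the paper.
print(f"ONSD/ETD = {onsd_etd_ratio(onsd_mm=6.9, etd_mm=23.5):.2f}")
```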

  10. Assessment of Renal Hemodynamics and Oxygenation by Simultaneous Magnetic Resonance Imaging (MRI) and Quantitative Invasive Physiological Measurements.

    PubMed

    Cantow, Kathleen; Arakelyan, Karen; Seeliger, Erdmann; Niendorf, Thoralf; Pohlmann, Andreas

    2016-01-01

In vivo assessment of renal perfusion and oxygenation under (patho)physiological conditions by means of noninvasive diagnostic imaging is conceptually appealing. Blood oxygen level-dependent (BOLD) magnetic resonance imaging (MRI) and quantitative parametric mapping of the magnetic resonance (MR) relaxation times T2* and T2 are thought to provide surrogates of renal tissue oxygenation. The validity and efficacy of this technique for quantitative characterization of local tissue oxygenation and its changes under different functional conditions have not yet been systematically examined and remain to be established. For this purpose, the development of integrative multimodality approaches is essential. Here we describe an integrated hybrid approach (MR-PHYSIOL) that combines established quantitative physiological measurements with T2* (T2) mapping and MR-based kidney size measurements. Standardized, reversible, (patho)physiologically relevant interventions, such as brief periods of aortic occlusion, hypoxia, and hyperoxia, are used for detailing the relation between the MR-PHYSIOL parameters, in particular between renal T2* and tissue oxygenation.

  11. The Reliability and Validity of Discrete and Continuous Measures of Psychopathology: A Quantitative Review

    ERIC Educational Resources Information Center

    Markon, Kristian E.; Chmielewski, Michael; Miller, Christopher J.

    2011-01-01

    In 2 meta-analyses involving 58 studies and 59,575 participants, we quantitatively summarized the relative reliability and validity of continuous (i.e., dimensional) and discrete (i.e., categorical) measures of psychopathology. Overall, results suggest an expected 15% increase in reliability and 37% increase in validity through adoption of a…

  12. Measurement uncertainty analysis techniques applied to PV performance measurements

    NASA Astrophysics Data System (ADS)

    Wells, C.

    1992-10-01

The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
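    To make the idea concrete, a minimal sketch of one common step in such an analysis: combining independent elemental standard uncertainties in quadrature (root-sum-square) and reporting an expanded uncertainty. The error-budget entries are invented placeholders, not figures from the presentation.

```python
import math

def combined_uncertainty(components: list[float]) -> float:
    """Root-sum-square combination of independent standard uncertainties."""
    return math.sqrt(sum(u ** 2 for u in components))

# Invented error budget for a PV module power measurement (all in percent)
budget = [0.5,   # irradiance sensor calibration
          0.3,   # temperature measurement and correction
          0.2,   # data-acquisition voltage and current accuracy
          0.4]   # spectral and angle-of-incidence effects

u_c = combined_uncertainty(budget)
k = 2.0  # coverage factor for roughly 95 % confidence
print(f"combined u = {u_c:.2f} %, expanded U (k=2) = {k * u_c:.2f} %")
```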

  13. Where should Momma go? Current nursing home performance measurement strategies and a less ambitious approach.

    PubMed

    Phillips, Charles D; Hawes, Catherine; Lieberman, Trudy; Koren, Mary Jane

    2007-06-25

Nursing home performance measurement systems are practically ubiquitous. The vast majority of these systems aspire to rank order all nursing homes based on quantitative measures of quality. However, the ability of such systems to identify homes differing in quality is hampered by the multidimensional nature of nursing homes and their residents. As a result, the authors doubt the ability of many nursing home performance systems to truly help consumers differentiate among homes providing different levels of quality. We also argue that, for consumers, performance measurement models are better at identifying problem facilities than potentially good homes. In response to these concerns we present a proposal for a less ambitious approach to nursing home performance measurement than previously used. We believe consumers can make better informed choices using a simpler system designed to pinpoint poor-quality nursing homes, rather than one designed to rank hundreds of facilities based on differences in quality-of-care indicators that are of questionable importance. The suggested performance model is based on five principles used in the development of the Consumers Union 2006 Nursing Home Quality Monitor. We can best serve policy-makers and consumers by eschewing nursing home reporting systems that present information about all the facilities in a city, a state, or the nation on a website or in a report. We argue for greater modesty in our efforts and a focus on identifying only the potentially poorest or best homes. In the end, however, it is important to remember that information from any performance measurement website or report is no substitute for multiple visits to a home at different times of the day to personally assess quality.

  14. Formation resistivity measurements from within a cased well used to quantitatively determine the amount of oil and gas present

    DOEpatents

    Vail, W.B. III

    1997-05-27

Methods to quantitatively determine the separate amounts of oil and gas in a geological formation adjacent to a cased well using measurements of formation resistivity are disclosed. The steps include obtaining resistivity measurements from within a cased well of a given formation, obtaining the porosity, obtaining the resistivity of formation water present, computing the combined amounts of oil and gas present using Archie's Equations, determining the relative amounts of oil and gas present from measurements within a cased well, and then quantitatively determining the separate amounts of oil and gas present in the formation. 7 figs.

  15. Measurement of lung expansion with computed tomography and comparison with quantitative histology.

    PubMed

    Coxson, H O; Mayo, J R; Behzad, H; Moore, B J; Verburgt, L M; Staples, C A; Paré, P D; Hogg, J C

    1995-11-01

The total and regional lung volumes were estimated from computed tomography (CT), and the pleural pressure gradient was determined by using the milliliters of gas per gram of tissue estimated from the X-ray attenuation values and the pressure-volume curve of the lung. The data show that CT accurately estimated the volume of the resected lobe but overestimated its weight by 24 ± 19%. The volume of gas per gram of tissue was less in the gravity-dependent regions due to a pleural pressure gradient of 0.24 ± 0.08 cmH2O/cm of descent in the thorax. The proportion of tissue to air obtained with CT was similar to that obtained by quantitative histology. We conclude that the CT scan can be used to estimate total and regional lung volumes and that measurements of the proportions of tissue and air within the thorax by CT can be used in conjunction with quantitative histology to evaluate lung structure.
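    A minimal sketch of the standard quantitative-CT conversion that underlies a "milliliters of gas per gram of tissue" estimate: each voxel's CT number is treated as a linear mix of air (-1000 HU) and tissue (~0 HU). The linear-mixing assumption and the tissue density of about 1.065 g/mL are textbook values, not parameters reported by the authors.

```python
def gas_per_gram_tissue(hu: float, tissue_density_g_per_ml: float = 1.065) -> float:
    """mL of gas per gram of tissue in a voxel, from its CT number (HU).

    Assumes the voxel is a linear mix of air (-1000 HU) and tissue (~0 HU),
    the usual quantitative-CT approximation."""
    if not -1000.0 < hu < 0.0:
        raise ValueError("expected a lung-like CT number between -1000 and 0 HU")
    gas_fraction = -hu / 1000.0
    tissue_fraction = 1.0 - gas_fraction
    return gas_fraction / (tissue_fraction * tissue_density_g_per_ml)

# Example: a voxel of -850 HU holds roughly 5.3 mL of gas per gram of tissue.
print(round(gas_per_gram_tissue(-850.0), 1))
```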

  16. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di

Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  17. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE PAGES

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; ...

    2018-04-01

Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  18. Quantitative MRI and strength measurements in the assessment of muscle quality in Duchenne muscular dystrophy.

    PubMed

    Wokke, B H; van den Bergen, J C; Versluis, M J; Niks, E H; Milles, J; Webb, A G; van Zwet, E W; Aartsma-Rus, A; Verschuuren, J J; Kan, H E

    2014-05-01

    The purpose of this study was to assess leg muscle quality and give a detailed description of leg muscle involvement in a series of Duchenne muscular dystrophy patients using quantitative MRI and strength measurements. Fatty infiltration, as well as total and contractile (not fatty infiltrated) cross sectional areas of various leg muscles were determined in 16 Duchenne patients and 11 controls (aged 8-15). To determine specific muscle strength, four leg muscle groups (quadriceps femoris, hamstrings, anterior tibialis and triceps surae) were measured and related to the amount of contractile tissue. In patients, the quadriceps femoris showed decreased total and contractile cross sectional area, attributable to muscle atrophy. The total, but not the contractile, cross sectional area of the triceps surae was increased in patients, corresponding to hypertrophy. Specific strength decreased in all four muscle groups of Duchenne patients, indicating reduced muscle quality. This suggests that muscle hypertrophy and fatty infiltration are two distinct pathological processes, differing between muscle groups. Additionally, the quality of remaining muscle fibers is severely reduced in the legs of Duchenne patients. The combination of quantitative MRI and quantitative muscle testing could be a valuable outcome parameter in longitudinal studies and in the follow-up of therapeutic effects. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Quantitative Guidance for Stove Usage and Performance to Achieve Health and Environmental Targets

    PubMed Central

    Chiang, Ranyee A.

    2015-01-01

Background Displacing the use of polluting and inefficient cookstoves in developing countries is necessary to achieve the potential health and environmental benefits sought through clean cooking solutions. Yet little quantitative context has been provided on how much displacement of traditional technologies is needed to achieve targets for household air pollutant concentrations or fuel savings. Objectives This paper provides instructive guidance on the usage of cooking technologies required to achieve health and environmental improvements. Methods We evaluated different scenarios of displacement of traditional stoves with use of higher performing technologies. The air quality and fuel consumption impacts were estimated for these scenarios using a single-zone box model of indoor air quality and ratios of thermal efficiency. Results Stove performance and usage should be considered together, as lower performing stoves can result in similar or greater benefits than a higher performing stove if the lower performing stove has considerably higher displacement of the baseline stove. Based on the indoor air quality model, there are multiple performance–usage scenarios for achieving modest indoor air quality improvements. To meet World Health Organization guidance levels, however, three-stone fire and basic charcoal stove usage must be nearly eliminated to achieve the particulate matter target (< 1–3 hr/week), and substantially limited to meet the carbon monoxide guideline (< 7–9 hr/week). Conclusions Moderate health gains may be achieved with various performance–usage scenarios. The greatest benefits are estimated to be achieved by near-complete displacement of traditional stoves with clean technologies, emphasizing the need to shift in the long term to near exclusive use of clean fuels and stoves. The performance–usage scenarios are also provided as a tool to guide technology selection and prioritize behavior change opportunities to maximize impact.
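    A minimal sketch of the steady-state single-zone box model referred to, which ties a stove's emission rate to the indoor concentration through room volume and air-exchange rate; the kitchen dimensions, air-exchange rate, and emission rate below are illustrative assumptions, not the paper's parameter values.

```python
def steady_state_concentration(emission_mg_per_h: float,
                               room_volume_m3: float,
                               air_exchange_per_h: float) -> float:
    """Well-mixed single-zone box model at steady state: C = E / (V * AER),
    returned in mg/m^3."""
    return emission_mg_per_h / (room_volume_m3 * air_exchange_per_h)

# Illustrative kitchen: 30 m^3, 15 air changes/h, stove emitting 150 mg PM2.5 per hour
c = steady_state_concentration(150.0, 30.0, 15.0)
print(f"steady-state PM2.5 ~ {c * 1000:.0f} ug/m^3 while the stove is in use")
```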

  20. 26 CFR 801.3 - Measuring employee performance.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

§ 801.3 Measuring employee performance. (a) In general. All employees of the IRS will be...

  1. Quantitative Method of Measuring Metastatic Activity

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    1999-01-01

The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates, and certain receptors can be used in the method for detection of potentially invasive tumors.

  2. A Critique of Health System Performance Measurement.

    PubMed

    Lynch, Thomas

    2015-01-01

    Health system performance measurement is a ubiquitous phenomenon. Many authors have identified multiple methodological and substantive problems with performance measurement practices. Despite the validity of these criticisms and their cross-national character, the practice of health system performance measurement persists. Theodore Marmor suggests that performance measurement invokes an "incantatory response" wrapped within "linguistic muddle." In this article, I expand upon Marmor's insights using Pierre Bourdieu's theoretical framework to suggest that, far from an aberration, the "linguistic muddle" identified by Marmor is an indicator of a broad struggle about the representation and classification of public health services as a public good. I present a case study of performance measurement from Alberta, Canada, examining how this representational struggle occurs and what the stakes are. © The Author(s) 2015.

  3. Quantitative measurement of cerebral blood flow in a juvenile porcine model by depth-resolved near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Elliott, Jonathan T.; Diop, Mamadou; Tichauer, Kenneth M.; Lee, Ting-Yim; Lawrence, Keith St.

    2010-05-01

Nearly half a million children and young adults are affected by traumatic brain injury each year in the United States. Although adequate cerebral blood flow (CBF) is essential to recovery, complications that disrupt blood flow to the brain and exacerbate neurological injury often go undetected because no adequate bedside measure of CBF exists. In this study we validate a depth-resolved, near-infrared spectroscopy (NIRS) technique that provides quantitative CBF measurement despite significant signal contamination from skull and scalp tissue. The respiration rates of eight anesthetized pigs (weight: 16.2 ± 0.5 kg, age: 1 to 2 months old) are modulated to achieve a range of CBF levels. Concomitant CBF measurements are performed with NIRS and CT perfusion. A significant correlation between CBF measurements from the two techniques is demonstrated (r^2 = 0.714, slope = 0.92, p < 0.001), and the bias between the two techniques is -2.83 mL·min^-1·(100 g)^-1 (CI 0.95: -19.63 to 13.9 mL·min^-1·(100 g)^-1). This study demonstrates that accurate measurements of CBF can be achieved with depth-resolved NIRS despite significant signal contamination from scalp and skull. The ability to measure CBF at the bedside provides a means of detecting, and thereby preventing, secondary ischemia during neurointensive care.

  4. Differences between genders in colorectal morphology on CT colonography using a quantitative approach: a pilot study.

    PubMed

    Weber, Charles N; Poff, Jason A; Lev-Toaff, Anna S; Levine, Marc S; Zafar, Hanna M

To explore quantitative differences between genders in morphologic colonic metrics and determine metric reproducibility. Quantitative colonic metrics from 20 male and 20 female CTC datasets were evaluated twice by two readers; all exams were performed after incomplete optical colonoscopy. Intra-/inter-reader reliability was measured with the intraclass correlation coefficient (ICC) and concordance correlation coefficient (CCC). Women had overall decreased colonic volume, increased tortuosity and compactness, and lower sigmoid apex height on CTC compared to men (p<0.0001, all). Quantitative measurements of colonic metrics were highly reproducible (ICC=0.9989 and 0.9970; CCC=0.9945). Quantitative morphologic differences between genders can be reproducibly measured. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Computer systems performance measurement techniques.

    DOT National Transportation Integrated Search

    1971-06-01

    Computer system performance measurement techniques, tools, and approaches are presented as a foundation for future recommendations regarding the instrumentation of the ARTS ATC data processing subsystem for purposes of measurement and evaluation.

  6. Validation of a Spectral Method for Quantitative Measurement of Color in Protein Drug Solutions.

    PubMed

    Yin, Jian; Swartz, Trevor E; Zhang, Jian; Patapoff, Thomas W; Chen, Bartolo; Marhoul, Joseph; Shih, Norman; Kabakoff, Bruce; Rahimi, Kimia

    2016-01-01

    A quantitative spectral method has been developed to precisely measure the color of protein solutions. In this method, a spectrophotometer is utilized for capturing the visible absorption spectrum of a protein solution, which can then be converted to color values (L*a*b*) that represent human perception of color in a quantitative three-dimensional space. These quantitative values (L*a*b*) allow for calculating the best match of a sample's color to a European Pharmacopoeia reference color solution. In order to qualify this instrument and assay for use in clinical quality control, a technical assessment was conducted to evaluate the assay suitability and precision. Setting acceptance criteria for this study required development and implementation of a unique statistical method for assessing precision in 3-dimensional space. Different instruments, cuvettes, protein solutions, and analysts were compared in this study. The instrument accuracy, repeatability, and assay precision were determined. The instrument and assay are found suitable for use in assessing color of drug substances and drug products and is comparable to the current European Pharmacopoeia visual assessment method. In the biotechnology industry, a visual assessment is the most commonly used method for color characterization, batch release, and stability testing of liquid protein drug solutions. Using this method, an analyst visually determines the color of the sample by choosing the closest match to a standard color series. This visual method can be subjective because it requires an analyst to make a judgment of the best match of color of the sample to the standard color series, and it does not capture data on hue and chroma that would allow for improved product characterization and the ability to detect subtle differences between samples. To overcome these challenges, we developed a quantitative spectral method for color determination that greatly reduces the variability in measuring color and allows
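    A hedged sketch of the kind of pipeline the abstract describes: an absorbance spectrum is converted to transmittance, integrated against CIE color-matching functions and an illuminant to obtain tristimulus values (XYZ), and then converted to L*a*b*. The color-matching-function, illuminant, and white-point arrays are assumed to come from standard tables sampled on the same wavelength grid; nothing here reproduces the authors' validated method.

```python
import numpy as np

def spectrum_to_lab(absorbance, cmf, illuminant, white_xyz):
    """absorbance: (N,) per-wavelength absorbance; cmf: (N, 3) CIE x,y,z bar;
    illuminant: (N,) relative spectral power; white_xyz: (3,) illuminant white point,
    computed the same way for a perfectly transmitting sample."""
    transmittance = 10.0 ** (-np.asarray(absorbance))
    k = 100.0 / np.sum(illuminant * cmf[:, 1])           # normalize white Y to 100
    xyz = k * np.sum((transmittance * illuminant)[:, None] * cmf, axis=0)

    def f(t):
        eps = (6.0 / 29.0) ** 3
        return np.where(t > eps, np.cbrt(t), t / (3.0 * (6.0 / 29.0) ** 2) + 4.0 / 29.0)

    fx, fy, fz = f(xyz / white_xyz)
    return np.array([116.0 * fy - 16.0,                  # L*
                     500.0 * (fx - fy),                  # a*
                     200.0 * (fy - fz)])                 # b*
```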

  7. Co-Teaching in Middle School Classrooms: Quantitative Comparative Study of Special Education Student Assessment Performance

    ERIC Educational Resources Information Center

    Reese, De'borah Reese

    2017-01-01

    The purpose of this quantitative comparative study was to determine the existence or nonexistence of performance pass rate differences of special education middle school students on standardized assessments between pre and post co-teaching eras disaggregated by subject area and school. Co-teaching has altered classroom environments in many ways.…

  8. DGT Passive Sampling for Quantitative in Situ Measurements of Compounds from Household and Personal Care Products in Waters.

    PubMed

    Chen, Wei; Li, Yanying; Chen, Chang-Er; Sweetman, Andrew J; Zhang, Hao; Jones, Kevin C

    2017-11-21

Widespread use of organic chemicals in household and personal care products (HPCPs) and their discharge into aquatic systems means reliable, robust techniques to monitor environmental concentrations are needed. The passive sampling approach of diffusive gradients in thin-films (DGT) is developed here and demonstrated to provide in situ quantitative and time-weighted average (TWA) measurement of these chemicals in waters. The novel technique is developed for HPCPs, including preservatives, antioxidants and disinfectants, by evaluating the performance of different binding agents. Ultrasonic extraction of binding gels in acetonitrile gave good and consistent recoveries for all test chemicals. Uptake by DGT with HLB (hydrophilic-lipophilic-balanced) as the binding agent was relatively independent of pH (3.5-9.5), ionic strength (0.001-0.1 M) and dissolved organic matter (0-20 mg L^-1), making it suitable for applications across a wide range of environments. Deployment time and diffusion layer thickness dependence experiments confirmed that DGT-accumulated chemical masses are consistent with theoretical predictions. The technique was further tested and applied in the influent and effluent of a wastewater treatment plant. Results were compared with conventional grab-sampling and 24-h-composited samples from autosamplers. DGT provided TWA concentrations over deployments of up to 18 days, with minimal effects from biofouling or the diffusive boundary layer. The field application demonstrated advantages of the DGT technique: it gives in situ analyte preconcentration in a simple matrix, with more quantitative measurement of the HPCP analytes.
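    For orientation, a minimal sketch of the standard DGT relation used to turn the mass accumulated on the binding gel into a time-weighted average concentration, C = M * Δg / (D * A * t); the deployment numbers are illustrative assumptions, not results from this study.

```python
def dgt_concentration(mass_ng: float,
                      delta_g_cm: float,
                      diff_coeff_cm2_per_s: float,
                      area_cm2: float,
                      time_s: float) -> float:
    """Time-weighted average concentration from a DGT sampler:
    C = M * dg / (D * A * t), returned here in ng/mL (= ng/cm^3)."""
    return (mass_ng * delta_g_cm) / (diff_coeff_cm2_per_s * area_cm2 * time_s)

# Illustrative deployment: 50 ng accumulated over 14 days, 0.094 cm diffusion
# layer, D = 5e-6 cm^2/s, 3.14 cm^2 exposure window.
c = dgt_concentration(50.0, 0.094, 5e-6, 3.14, 14 * 24 * 3600.0)
print(f"TWA concentration ~ {c * 1000:.1f} ng/L")
```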

  9. MEASUREMENT: ACCOUNTING FOR RELIABILITY IN PERFORMANCE ESTIMATES.

    PubMed

    Waterman, Brian; Sutter, Robert; Burroughs, Thomas; Dunagan, W Claiborne

    2014-01-01

    When evaluating physician performance measures, physician leaders are faced with the quandary of determining whether departures from expected physician performance measurements represent a true signal or random error. This uncertainty impedes the physician leader's ability and confidence to take appropriate performance improvement actions based on physician performance measurements. Incorporating reliability adjustment into physician performance measurement is a valuable way of reducing the impact of random error in the measurements, such as those caused by small sample sizes. Consequently, the physician executive has more confidence that the results represent true performance and is positioned to make better physician performance improvement decisions. Applying reliability adjustment to physician-level performance data is relatively new. As others have noted previously, it's important to keep in mind that reliability adjustment adds significant complexity to the production, interpretation and utilization of results. Furthermore, the methods explored in this case study only scratch the surface of the range of available Bayesian methods that can be used for reliability adjustment; further study is needed to test and compare these methods in practice and to examine important extensions for handling specialty-specific concerns (e.g., average case volumes, which have been shown to be important in cardiac surgery outcomes). Moreover, it's important to note that the provider group average as a basis for shrinkage is one of several possible choices that could be employed in practice and deserves further exploration in future research. With these caveats, our results demonstrate that incorporating reliability adjustment into physician performance measurements is feasible and can notably reduce the incidence of "real" signals relative to what one would expect to see using more traditional approaches. A physician leader who is interested in catalyzing performance improvement
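    A minimal sketch of the shrinkage idea behind reliability adjustment: an observed physician rate is pulled toward the group mean in proportion to its estimated reliability (signal variance over signal-plus-noise variance). The variance components and rates below are invented for illustration and are not taken from the case study.

```python
def reliability_adjusted(observed_rate: float,
                         group_mean: float,
                         between_physician_var: float,
                         within_physician_var: float,
                         n_cases: int) -> float:
    """Shrink an observed rate toward the group mean by its reliability,
    where reliability = signal variance / (signal variance + noise variance)."""
    noise_var = within_physician_var / n_cases
    reliability = between_physician_var / (between_physician_var + noise_var)
    return reliability * observed_rate + (1.0 - reliability) * group_mean

# A physician with few cases is shrunk strongly toward the group mean:
print(reliability_adjusted(0.20, 0.10, between_physician_var=0.001,
                           within_physician_var=0.09, n_cases=20))
```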

  10. Measuring changes in transmission of neglected tropical diseases, malaria, and enteric pathogens from quantitative antibody levels

    PubMed Central

    van der Laan, Mark J.; Hubbard, Alan E.; Steel, Cathy; Kubofcik, Joseph; Hamlin, Katy L.; Moss, Delynn M.; Nutman, Thomas B.; Priest, Jeffrey W.; Lammie, Patrick J.

    2017-01-01

    Background Serological antibody levels are a sensitive marker of pathogen exposure, and advances in multiplex assays have created enormous potential for large-scale, integrated infectious disease surveillance. Most methods to analyze antibody measurements reduce quantitative antibody levels to seropositive and seronegative groups, but this can be difficult for many pathogens and may provide lower resolution information than quantitative levels. Analysis methods have predominantly maintained a single disease focus, yet integrated surveillance platforms would benefit from methodologies that work across diverse pathogens included in multiplex assays. Methods/Principal findings We developed an approach to measure changes in transmission from quantitative antibody levels that can be applied to diverse pathogens of global importance. We compared age-dependent immunoglobulin G curves in repeated cross-sectional surveys between populations with differences in transmission for multiple pathogens, including: lymphatic filariasis (Wuchereria bancrofti) measured before and after mass drug administration on Mauke, Cook Islands, malaria (Plasmodium falciparum) before and after a combined insecticide and mass drug administration intervention in the Garki project, Nigeria, and enteric protozoans (Cryptosporidium parvum, Giardia intestinalis, Entamoeba histolytica), bacteria (enterotoxigenic Escherichia coli, Salmonella spp.), and viruses (norovirus groups I and II) in children living in Haiti and the USA. Age-dependent antibody curves fit with ensemble machine learning followed a characteristic shape across pathogens that aligned with predictions from basic mechanisms of humoral immunity. Differences in pathogen transmission led to shifts in fitted antibody curves that were remarkably consistent across pathogens, assays, and populations. Mean antibody levels correlated strongly with traditional measures of transmission intensity, such as the entomological inoculation rate for P

  11. Quantitative aspects of inductively coupled plasma mass spectrometry

    PubMed Central

    Wagner, Barbara

    2016-01-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971

  12. Research Frontiers in Public Sector Performance Measurement

    NASA Astrophysics Data System (ADS)

    Zhonghua, Cai; Ye, Wang

    In "New Public Management" era, performance measurement has been widely used in managerial practices of public sectors. From the content and features of performance measurement, this paper aims to explore inspirations on Chinese public sector performance measurement, which based on a review of prior literatures including influencial factors, methods and indicators of public sector performance evaluation. In the end, arguments are presented in this paper pointed out the direction of future researches in this field.

  13. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, Robert V.

    1993-01-01

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infra-red sensing devices.

  14. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, R.V.

    1993-03-16

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  15. Improving competitiveness through performance-measurement systems.

    PubMed

    Stewart, L J; Lockamy, A

    2001-12-01

    Parallels exist between the competitive pressures felt by U.S. manufacturers over the past 30 years and those experienced by healthcare providers today. Increasing market deregulation, changing government policies, and growing consumerism have altered the healthcare arena. Responding to similar pressures, manufacturers adopted a strategic orientation driven by customer needs and expectations that led them to achieve high performance levels and surpass their competition. The adoption of integrated performance-measurement systems was instrumental in these firms' success. An integrated performance-measurement model for healthcare organizations can help to blend the organization's strategy with the demands of the contemporary healthcare environment. Performance-measurement systems encourage healthcare organizations to focus on their mission and vision by aligning their strategic objectives and resource-allocation decisions with customer requirements.

  16. Quantitative measurement of piezoelectric coefficient of thin film using a scanning evanescent microwave microscope.

    PubMed

    Zhao, Zhenli; Luo, Zhenlin; Liu, Chihui; Wu, Wenbin; Gao, Chen; Lu, Yalin

    2008-06-01

    This article describes a new approach to quantitatively measure the piezoelectric coefficients of thin films at the microscopic level using a scanning evanescent microwave microscope. This technique can resolve a 10 pm deformation caused by the piezoelectric effect and has the advantages of high scanning speed, large scanning area, submicron spatial resolution, and simultaneous access to many other related properties. Results from the test measurements on the longitudinal piezoelectric coefficient of a PZT thin film agree well with those from other techniques reported in the literature.

  17. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    PubMed

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for quantitative measurement of medical images. Using Visual C++, we developed computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown high stability, reliability and ease of use.

  18. Measuring Filament Orientation: A New Quantitative, Local Approach

    NASA Astrophysics Data System (ADS)

    Green, C.-E.; Dawson, J. R.; Cunningham, M. R.; Jones, P. A.; Novak, G.; Fissel, L. M.

    2017-09-01

    The relative orientation between filamentary structures in molecular clouds and the ambient magnetic field provides insight into filament formation and stability. To calculate the relative orientation, a measurement of filament orientation is first required. We propose a new method to calculate the orientation of the one-pixel-wide filament skeleton that is output by filament identification algorithms such as filfinder. We derive the local filament orientation from the direction of the intensity gradient in the skeleton image using the Sobel filter and a few simple post-processing steps. We call this the “Sobel-gradient method.” The resulting filament orientation map can be compared quantitatively on a local scale with the magnetic field orientation map to then find the relative orientation of the filament with respect to the magnetic field at each point along the filament. It can also be used for constructing radial profiles for filament width fitting. The proposed method facilitates automation in analyses of filament skeletons, which is imperative in this era of “big data.”
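
    The description above can be made concrete with a short sketch; note that this uses a structure-tensor average of the Sobel gradients (an assumption standing in for the authors' "simple post-processing steps"), which keeps the orientation well defined on the one-pixel-wide skeleton:

      import numpy as np
      from scipy import ndimage

      def filament_orientation(skeleton, sigma=1.0):
          """Local filament orientation in degrees (0-180) at skeleton pixels.

          Sobel gradients of the lightly smoothed skeleton image are averaged into
          a structure tensor; its dominant direction lies across the filament, so
          adding 90 degrees gives the along-filament orientation.
          """
          img = ndimage.gaussian_filter(skeleton.astype(float), sigma)
          gx = ndimage.sobel(img, axis=1)
          gy = ndimage.sobel(img, axis=0)
          jxx = ndimage.gaussian_filter(gx * gx, sigma)
          jxy = ndimage.gaussian_filter(gx * gy, sigma)
          jyy = ndimage.gaussian_filter(gy * gy, sigma)
          across = 0.5 * np.degrees(np.arctan2(2.0 * jxy, jxx - jyy))   # dominant gradient direction
          along = np.mod(across + 90.0, 180.0)                          # filament axis
          return np.where(skeleton > 0, along, np.nan)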

  19. Standardized Approach to Quantitatively Measure Residual Limb Skin Health in Individuals with Lower Limb Amputation.

    PubMed

    Rink, Cameron L; Wernke, Matthew M; Powell, Heather M; Tornero, Mark; Gnyawali, Surya C; Schroeder, Ryan M; Kim, Jayne Y; Denune, Jeffrey A; Albury, Alexander W; Gordillo, Gayle M; Colvin, James M; Sen, Chandan K

    2017-07-01

    Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser Doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health as a critical component of prosthesis rehabilitation for individuals with lower limb amputation is understudied in part due to a lack of clinical measures. Here, we present a standardized approach to measure residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function.

  20. Quantitative magnetic resonance imaging phantoms: A review and the need for a system phantom.

    PubMed

    Keenan, Kathryn E; Ainslie, Maureen; Barker, Alex J; Boss, Michael A; Cecil, Kim M; Charles, Cecil; Chenevert, Thomas L; Clarke, Larry; Evelhoch, Jeffrey L; Finn, Paul; Gembris, Daniel; Gunter, Jeffrey L; Hill, Derek L G; Jack, Clifford R; Jackson, Edward F; Liu, Guoying; Russek, Stephen E; Sharma, Samir D; Steckner, Michael; Stupic, Karl F; Trzasko, Joshua D; Yuan, Chun; Zheng, Jie

    2018-01-01

    The MRI community is using quantitative mapping techniques to complement qualitative imaging. For quantitative imaging to reach its full potential, it is necessary to analyze measurements across systems and longitudinally. Clinical use of quantitative imaging can be facilitated through adoption and use of a standard system phantom, a calibration/standard reference object, to assess the performance of an MRI machine. The International Society of Magnetic Resonance in Medicine AdHoc Committee on Standards for Quantitative Magnetic Resonance was established in February 2007 to facilitate the expansion of MRI as a mainstream modality for multi-institutional measurements, including, among other things, multicenter trials. The goal of the Standards for Quantitative Magnetic Resonance committee was to provide a framework to ensure that quantitative measures derived from MR data are comparable over time, between subjects, between sites, and between vendors. This paper, written by members of the Standards for Quantitative Magnetic Resonance committee, reviews standardization attempts and then details the need, requirements, and implementation plan for a standard system phantom for quantitative MRI. In addition, application-specific phantoms and implementation of quantitative MRI are reviewed. Magn Reson Med 79:48-61, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  1. Air traffic control specialist performance measurement database.

    DOT National Transportation Integrated Search

    1999-06-01

    The Air Traffic Control Specialist (ATCS) Performance Measurement Database is a compilation of performance measures and : measurement techniques that researchers have used. It may be applicable to other human factor research related to air traffic co...

  2. Mid-infrared laser absorption tomography for quantitative 2D thermochemistry measurements in premixed jet flames

    NASA Astrophysics Data System (ADS)

    Wei, Chuyu; Pineda, Daniel I.; Paxton, Laurel; Egolfopoulos, Fokion N.; Spearrin, R. Mitchell

    2018-06-01

    A tomographic laser absorption spectroscopy technique, utilizing mid-infrared light sources, is presented as a quantitative method to spatially resolve species and temperature profiles in small-diameter reacting flows relevant to combustion systems. Here, tunable quantum and interband cascade lasers are used to spectrally resolve select rovibrational transitions near 4.98 and 4.19 μm to measure CO and CO2, respectively, as well as their vibrational temperatures, in piloted premixed jet flames. Signal processing methods are detailed for the reconstruction of axial and radial profiles of thermochemical structure in a canonical ethylene-air jet flame. The method is further demonstrated to quantitatively distinguish between different turbulent flow conditions.

  3. Methods for Quantitative Creatinine Determination.

    PubMed

    Moore, John F; Sharer, J Daniel

    2017-04-06

    Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix will summarize the basic Jaffe method, as well as a modified, automated version. Also described is a high performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. Copyright © 2017 John Wiley & Sons, Inc.
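
    A worked sketch of the stable-isotope-dilution calculation behind the LC-MS/MS approach (all numbers and the response factor are illustrative assumptions):

      # A known amount of isotope-labeled creatinine is spiked into the sample;
      # the analyte concentration follows from the measured peak-area ratio.
      area_creatinine  = 8.4e5    # peak area, endogenous creatinine
      area_labeled_is  = 4.2e5    # peak area, labeled internal standard
      is_conc_mg_dl    = 1.0      # concentration of the spiked labeled standard
      response_factor  = 1.02     # ratio-calibration slope from calibrators (assumed)

      creatinine_mg_dl = (area_creatinine / area_labeled_is) / response_factor * is_conc_mg_dl
      print(creatinine_mg_dl)     # about 1.96 mg/dL for these illustrative values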

  4. Portable smartphone based quantitative phase microscope

    NASA Astrophysics Data System (ADS)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a smartphone-based quantitative phase microscope using the transport of intensity equation method. The system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source; a 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were then measured to demonstrate its performance. Owing to its accuracy, high contrast, low cost and portability, the portable smartphone-based quantitative phase microscope is a promising tool for future adoption in remote healthcare and medical diagnosis.
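
    A minimal sketch of the transport-of-intensity phase recovery step mentioned above, under the common uniform-intensity, FFT-based approximation (the regularization constant and all variable names are assumptions, not the app's actual code):

      import numpy as np

      def tie_phase(i_under, i_over, i_focus, dz, wavelength, pixel_size, eps=1e-3):
          """Recover phase from an under/over-focus image pair via the TIE.

          Solves del^2(phi) = -(k / I0) * dI/dz with an FFT Poisson solver; all
          lengths must share the same units, and eps regularizes the DC term.
          """
          k = 2.0 * np.pi / wavelength
          didz = (i_over - i_under) / (2.0 * dz)                  # axial intensity derivative
          ny, nx = didz.shape
          kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=pixel_size)
          ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=pixel_size)
          kxx, kyy = np.meshgrid(kx, ky)
          denom = kxx**2 + kyy**2 + eps
          phase_ft = k * np.fft.fft2(didz) / (i_focus.mean() * denom)
          return np.real(np.fft.ifft2(phase_ft))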

  5. Numerical analysis of quantitative measurement of hydroxyl radical concentration using laser-induced fluorescence in flame

    NASA Astrophysics Data System (ADS)

    Shuang, Chen; Tie, Su; Yao-Bang, Zheng; Li, Chen; Ting-Xu, Liu; Ren-Bing, Li; Fu-Rong, Yang

    2016-06-01

    The aim of the present work is to quantitatively measure the hydroxyl radical concentration by using LIF (laser-induced fluorescence) in flame. The detailed physical models of spectral absorption lineshape broadening, collisional transition and quenching at elevated pressure are built. The fine energy level structure of the OH molecule is illustrated to understand the process with laser-induced fluorescence emission and others in the case without radiation, which include collisional quenching, rotational energy transfer (RET), and vibrational energy transfer (VET). Based on these, some numerical results are achieved by simulations in order to evaluate the fluorescence yield at elevated pressure. These results are useful for understanding the real physical processes in OH-LIF technique and finding a way to calibrate the signal for quantitative measurement of OH concentration in a practical combustor. Project supported by the National Natural Science Foundation of China (Grant No. 11272338) and the Fund from the Science and Technology on Scramjet Key Laboratory, China (Grant No. STSKFKT2013004).

  6. Measuring the influence of professional nursing practice on global hospital performance in an organizational context.

    PubMed

    Fasoli, Dijon R

    2008-01-01

    The purpose of this study was to measure the influence of professional nursing practice (PNP) on global hospital performance (GHP). Evidence links PNP to positive outcomes for patients and nurses; however, little is known about the influence of PNP on GHP measures used for patient decision-making and hospital management resource allocation decisions. A quantitative study using multiple regression analysis to predict a composite measure of GHP was conducted. Two survey instruments measuring perspectives of the PNP environment were completed by 1815 (31.3%) Registered Nurses (RN) and 28 (100%) Senior Nurse Executives (SNE) at 28 northeastern US hospitals. Secondary data provided organizational attributes. The degree of PNP was consistently reported by RNs and SNEs. When regressed with organizational factors, PNP was not a significant predictor of GHP. Better GHP was associated with lower lengths of stay, lower profitability, less admission growth, and non-health system affiliation. Further research is needed to define a nursing-sensitive GHP measure.

  7. A comparison of manual and quantitative elbow strength testing.

    PubMed

    Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R

    2012-10-01

    The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.

  8. Velocity Measurement in Carotid Artery: Quantitative Comparison of Time-Resolved 3D Phase-Contrast MRI and Image-based Computational Fluid Dynamics

    PubMed Central

    Sarrami-Foroushani, Ali; Nasr Esfahany, Mohsen; Nasiraei Moghaddam, Abbas; Saligheh Rad, Hamidreza; Firouznia, Kavous; Shakiba, Madjid; Ghanaati, Hossein; Wilkinson, Iain David; Frangi, Alejandro Federico

    2015-01-01

    Background: Understanding hemodynamic environment in vessels is important for realizing the mechanisms leading to vascular pathologies. Objectives: Three-dimensional velocity vector field in carotid bifurcation is visualized using TR 3D phase-contrast magnetic resonance imaging (TR 3D PC MRI) and computational fluid dynamics (CFD). This study aimed to present a qualitative and quantitative comparison of the velocity vector field obtained by each technique. Subjects and Methods: MR imaging was performed on a 30-year old male normal subject. TR 3D PC MRI was performed on a 3 T scanner to measure velocity in carotid bifurcation. 3D anatomical model for CFD was created using images obtained from time-of-flight MR angiography. Velocity vector field in carotid bifurcation was predicted using CFD and PC MRI techniques. A statistical analysis was performed to assess the agreement between the two methods. Results: Although the main flow patterns were the same for the both techniques, CFD showed a greater resolution in mapping the secondary and circulating flows. Overall root mean square (RMS) errors for all the corresponding data points in PC MRI and CFD were 14.27% in peak systole and 12.91% in end diastole relative to maximum velocity measured at each cardiac phase. Bland-Altman plots showed a very good agreement between the two techniques. However, this study was not aimed to validate any of methods, instead, the consistency was assessed to accentuate the similarities and differences between Time-resolved PC MRI and CFD. Conclusion: Both techniques provided quantitatively consistent results of in vivo velocity vector fields in right internal carotid artery (RCA). PC MRI represented a good estimation of main flow patterns inside the vasculature, which seems to be acceptable for clinical use. However, limitations of each technique should be considered while interpreting results. PMID:26793288

  9. Advanced Engineering Technology for Measuring Performance.

    PubMed

    Rutherford, Drew N; D'Angelo, Anne-Lise D; Law, Katherine E; Pugh, Carla M

    2015-08-01

    The demand for competency-based assessments in surgical training is growing. Use of advanced engineering technology for clinical skills assessment allows for objective measures of hands-on performance. Clinical performance can be assessed in several ways via quantification of an assessee's hand movements (motion tracking), direction of visual attention (eye tracking), levels of stress (physiologic marker measurements), and location and pressure of palpation (force measurements). Innovations in video recording technology and qualitative analysis tools allow for a combination of observer- and technology-based assessments. Overall the goal is to create better assessments of surgical performance with robust validity evidence. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. [Supply services at health facilities: measuring performance].

    PubMed

    Dacosta Claro, I

    2001-01-01

    Performance measurement, in its different meanings--either the balanced scorecard or output measurement--has become an essential tool in today's world-class organizations to improve service quality and reduce costs. This paper presents a performance measurement system for the hospital supply chain. The system is organized in different levels and groups of indicators in order to give a hierarchical, coherent and integrated view of the processes. Thus, supply services performance is measured according to (1) financial aspects, (2) customer satisfaction aspects and (3) internal aspects of the processes performed. Since the informational needs of managers vary within the administrative structure, the performance measurement system is defined at three hierarchical levels: first, the whole supply chain, with the interrelations among its activities; second, the three main processes of the chain--physical management of products, purchasing and negotiation processes, and the local storage units; and finally, the performance measurement of each activity involved. The system and its indicators were evaluated with the participation of 17 health services in Quebec (Canada); given the similarities in operations, they could equally be implemented in Spanish hospitals.

  11. Quantitative and qualitative measure of intralaboratory two-dimensional protein gel reproducibility and the effects of sample preparation, sample load, and image analysis.

    PubMed

    Choe, Leila H; Lee, Kelvin H

    2003-10-01

    We investigate one approach to assess the quantitative variability in two-dimensional gel electrophoresis (2-DE) separations based on gel-to-gel variability, sample preparation variability, sample load differences, and the effect of automation on image analysis. We observe that 95% of spots present in three out of four replicate gels exhibit less than a 0.52 coefficient of variation (CV) in fluorescent stain intensity (% volume) for a single sample run on multiple gels. When four parallel sample preparations are performed, this value increases to 0.57. We do not observe any significant change in quantitative value for an increase or decrease in sample load of 30% when using appropriate image analysis variables. Increasing use of automation, while necessary in modern 2-DE experiments, does change the observed level of quantitative and qualitative variability among replicate gels. The number of spots that change qualitatively for a single sample run in parallel varies from a CV = 0.03 for fully manual analysis to CV = 0.20 for a fully automated analysis. We present a systematic method by which a single laboratory can measure gel-to-gel variability using only three gel runs.
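
    A minimal sketch of how the per-spot variability figures quoted above can be tallied from matched spot volumes (array names are illustrative):

      import numpy as np

      def spot_cv_summary(volumes):
          """volumes: array of shape (n_gels, n_spots) of %volume for matched spots.

          Returns the coefficient of variation below which 95% of spots fall.
          """
          volumes = np.asarray(volumes, dtype=float)
          cv = volumes.std(axis=0, ddof=1) / volumes.mean(axis=0)   # per-spot CV across replicate gels
          return np.percentile(cv, 95)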

  12. NanoDrop Microvolume Quantitation of Nucleic Acids

    PubMed Central

    Desjardins, Philippe; Conklin, Deborah

    2010-01-01

    Biomolecular assays are continually being developed that use progressively smaller amounts of material, often precluding the use of conventional cuvette-based instruments for nucleic acid quantitation in favor of those that can perform microvolume quantitation. The NanoDrop microvolume sample retention system (Thermo Scientific NanoDrop Products) functions by combining fiber optic technology and natural surface tension properties to capture and retain minute amounts of sample independent of traditional containment apparatus such as cuvettes or capillaries. Furthermore, the system employs shorter path lengths, which result in a broad range of nucleic acid concentration measurements, essentially eliminating the need to perform dilutions. Reducing the volume of sample required for spectroscopic analysis also facilitates the inclusion of additional quality control steps throughout many molecular workflows, increasing efficiency and ultimately leading to greater confidence in downstream results. The need for high-sensitivity fluorescent analysis of limited mass has also emerged with recent experimental advances. Using the same microvolume sample retention technology, fluorescent measurements may be performed with 2 μL of material, allowing fluorescent assay volume requirements to be significantly reduced. Such microreactions of 10 μL or less are now possible using a dedicated microvolume fluorospectrometer. Two microvolume nucleic acid quantitation protocols will be demonstrated that use integrated sample retention systems as practical alternatives to traditional cuvette-based protocols. First, a direct A260 absorbance method using a microvolume spectrophotometer is described. This is followed by a demonstration of a fluorescence-based method that enables reduced-volume fluorescence reactions with a microvolume fluorospectrometer. These novel techniques enable the assessment of nucleic acid concentrations ranging from 1 pg/μL to 15,000 ng/μL with minimal consumption of
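
    A worked Beer-Lambert example of the direct A260 method described above (the path length and reading are illustrative; 50 ng/µL per A260 unit is the usual double-stranded DNA conversion factor):

      def nucleic_acid_conc(a260, path_mm, factor=50.0):
          """Concentration in ng/µL from a raw absorbance reading at 260 nm.

          factor: ~50 for dsDNA, ~40 for RNA. Short-path readings are first
          rescaled to the conventional 10 mm path, which is what makes very
          high concentrations measurable without dilution.
          """
          return a260 * (10.0 / path_mm) * factor

      # A reading of 0.25 over a 0.05 mm path corresponds to 0.25 * 200 * 50 = 2500 ng/µL dsDNA.
      print(nucleic_acid_conc(0.25, path_mm=0.05))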

  13. Quantitative flow and velocity measurements of pulsatile blood flow with 4D-DSA

    NASA Astrophysics Data System (ADS)

    Shaughnessy, Gabe; Hoffman, Carson; Schafer, Sebastian; Mistretta, Charles A.; Strother, Charles M.

    2017-03-01

    Time resolved 3D angiographic data from 4D DSA provides a unique environment to explore physical properties of blood flow. Utilizing the pulsatility of the contrast waveform, the Fourier components can be used to track the waveform motion through vessels. Areas of strong pulsatility are determined through the FFT power spectrum. Using this method, we find an accuracy from 4D-DSA flow measurements within 7.6% and 6.8% RMSE of ICA PCVIPR and phantom flow probe validation measurements, respectively. The availability of velocity and flow information with fast acquisition could provide a more quantitative approach to treatment planning and evaluation in interventional radiology.
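
    A minimal sketch of the pulsatility-detection step described above: the fraction of each voxel's spectral power that falls in an assumed cardiac band flags strongly pulsatile regions (band limits and names are assumptions):

      import numpy as np

      def pulsatility_map(timeseries, frame_rate, f_lo=0.8, f_hi=3.0):
          """timeseries: array (n_frames, ny, nx) of contrast intensity per voxel."""
          spectrum = np.fft.rfft(timeseries - timeseries.mean(axis=0), axis=0)
          power = np.abs(spectrum) ** 2
          freqs = np.fft.rfftfreq(timeseries.shape[0], d=1.0 / frame_rate)
          band = (freqs >= f_lo) & (freqs <= f_hi)                # assumed cardiac band, Hz
          return power[band].sum(axis=0) / (power.sum(axis=0) + 1e-12)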

  14. SU-E-I-51: Quantitative Assessment of X-Ray Imaging Detector Performance in a Clinical Setting - a Simple Approach Using a Commercial Instrument

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjoeberg, J; Bujila, R; Omar, A

    2015-06-15

    Purpose: To measure and compare the performance of X-ray imaging detectors in a clinical setting using a dedicated instrument for the quantitative determination of detector performance. Methods: The DQEPro (DQE Instruments Inc., London, Ontario Canada) was used to determine the MTF, NPS and DQE using an IEC compliant methodology for three different imaging modalities: conventional radiography (CsI-based detector), general-purpose radioscopy (CsI-based detector), and mammography (a-Se based detector). The radiation qualities (IEC) RQA-5 and RQA-M-2 were used for the CsI-based and a-Se-based detectors, respectively. The DQEPro alleviates some of the difficulties associated with DQE measurements by automatically positioning test devices over the detector, guiding the user through the image acquisition process and providing software for calculations. Results: A comparison of the NPS showed that the image noise of the a-Se detector was less correlated than that of the CsI detectors. A consistently higher performance was observed for the a-Se detector at all spatial frequencies (MTF: 0.97@0.25 cy/mm, DQE: 0.72@0.25 cy/mm) and the DQE drops off slower than for the CsI detectors. The CsI detector used for conventional radiography displayed a higher performance at low spatial frequencies compared to the CsI detector used for radioscopy (DQE: 0.65 vs 0.60@0.25 cy/mm). However, at spatial frequencies above 1.3 cy/mm, the radioscopy detector displayed better performance than the conventional radiography detector (DQE: 0.35 vs 0.24@2.00 cy/mm). Conclusion: The difference in the MTF, NPS and DQE that was observed for the two different CsI detectors and the a-Se detector reflects the imaging tasks that the different detector types are intended for. The DQEPro has made the determination and calculation of quantitative metrics of X-ray imaging detector performance substantially more convenient and accessible to undertake in a clinical setting.
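
    For reference, the quantities named above relate through the conventional IEC 62220-1 definition (stated here from general knowledge rather than from this abstract):

      \mathrm{DQE}(f) = \frac{\mathrm{MTF}^2(f)}{q \cdot \mathrm{NNPS}(f)}, \qquad \mathrm{NNPS}(f) = \frac{\mathrm{NPS}(f)}{S^2},

    where S is the large-area mean signal and q is the incident photon fluence (photons per unit area) for the stated radiation quality, so the DQE falls wherever the squared MTF drops faster than the normalized noise power spectrum.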

  15. Statistical issues in the comparison of quantitative imaging biomarker algorithms using pulmonary nodule volume as an example.

    PubMed

    Obuchowski, Nancy A; Barnhart, Huiman X; Buckler, Andrew J; Pennello, Gene; Wang, Xiao-Feng; Kalpathy-Cramer, Jayashree; Kim, Hyun J Grace; Reeves, Anthony P

    2015-02-01

    Quantitative imaging biomarkers are being used increasingly in medicine to diagnose and monitor patients' disease. The computer algorithms that measure quantitative imaging biomarkers have different technical performance characteristics. In this paper we illustrate the appropriate statistical methods for assessing and comparing the bias, precision, and agreement of computer algorithms. We use data from three studies of pulmonary nodules. The first study is a small phantom study used to illustrate metrics for assessing repeatability. The second study is a large phantom study allowing assessment of four algorithms' bias and reproducibility for measuring tumor volume and the change in tumor volume. The third study is a small clinical study of patients whose tumors were measured on two occasions. This study allows a direct assessment of six algorithms' performance for measuring tumor change. With these three examples we compare and contrast study designs and performance metrics, and we illustrate the advantages and limitations of various common statistical methods for quantitative imaging biomarker studies. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
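
    Two of the technical-performance metrics such studies report can be sketched as follows (the 2.77 factor is 1.96 times the square root of 2 for a 95% repeatability coefficient; array names are illustrative):

      import numpy as np

      def bias_and_repeatability(measured, truth, replicate_a, replicate_b):
          """measured/truth: paired phantom values; replicate_a/b: repeat measurements."""
          bias = np.mean(measured - truth)                                 # systematic error
          pct_bias = 100.0 * bias / np.mean(truth)
          wsd = np.sqrt(np.mean((replicate_a - replicate_b) ** 2) / 2.0)   # within-subject SD from pairs
          rc = 2.77 * wsd                                                  # 95% repeatability coefficient
          return bias, pct_bias, rc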

  16. Quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis – data from the Osteoarthritis Initiative

    PubMed Central

    Emmanuel, K.; Quinn, E.; Niu, J.; Guermazi, A.; Roemer, F.; Wirth, W.; Eckstein, F.; Felson, D.

    2017-01-01

    SUMMARY Objective To test the hypothesis that quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis (KOA), prior to the advent of radiographic disease. Methods 206 knees with incident radiographic KOA (Kellgren Lawrence Grade (KLG) 0 or 1 at baseline, developing KLG 2 or greater with a definite osteophyte and joint space narrowing (JSN) grade ≥1 by year 4) were matched to 232 control knees not developing incident KOA. Manual segmentation of the central five slices of the medial and lateral meniscus was performed on coronal 3T DESS MRI and quantitative meniscus position was determined. Cases and controls were compared using conditional logistic regression adjusting for age, sex, BMI, race and clinical site. Sensitivity analyses of early (year [Y] 1/2) and late (Y3/4) incidence was performed. Results Mean medial extrusion distance was significantly greater for incident compared to non-incident knees (1.56 mean ± 1.12 mm SD vs 1.29 ± 0.99 mm; +21%, P < 0.01), so was the percent extrusion area of the medial meniscus (25.8 ± 15.8% vs 22.0 ± 13.5%; +17%, P < 0.05). This finding was consistent for knees restricted to medial incidence. No significant differences were observed for the lateral meniscus in incident medial KOA, or for the tibial plateau coverage between incident and non-incident knees. Restricting the analysis to medial incident KOA at Y1/2 differences were attenuated, but reached significance for extrusion distance, whereas no significant differences were observed at incident KOA in Y3/4. Conclusion Greater medial meniscus extrusion predicts incident radiographic KOA. Early onset KOA showed greater differences for meniscus position between incident and non-incident knees than late onset KOA. PMID:26318658

  17. A quantitative evaluation of spurious results in the infrared spectroscopic measurement of CO2 isotope ratios

    NASA Astrophysics Data System (ADS)

    Mansfield, C. D.; Rutt, H. N.

    2002-02-01

    The possible generation of spurious results, arising from the application of infrared spectroscopic techniques to the measurement of carbon isotope ratios in breath, due to coincident absorption bands has been re-examined. An earlier investigation, which approached the problem qualitatively, fulfilled its aspirations in providing an unambiguous assurance that 13C16O2/12C16O2 ratios can be confidently measured for isotopic breath tests using instruments based on infrared absorption. Although this conclusion still stands, subsequent quantitative investigation has revealed an important exception that necessitates a strict adherence to sample collection protocol. The results show that concentrations and decay rates of the coincident breath trace compounds acetonitrile and carbon monoxide, found in the breath sample of a heavy smoker, can produce spurious results. Hence, findings from this investigation justify the concern that breath trace compounds present a risk to the accurate measurement of carbon isotope ratios in breath when using broadband, non-dispersive, ground state absorption infrared spectroscopy. It provides recommendations on the length of smoking abstention required to avoid generation of spurious results and also reaffirms, through quantitative argument, the validity of using infrared absorption spectroscopy to measure CO2 isotope ratios in breath.
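
    For context, the carbon isotope ratio that such breath measurements report is conventionally expressed in delta notation (a standard convention, not specific to this study):

      \delta^{13}\mathrm{C} = \left(\frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1\right) \times 1000\ \text{per mil}, \qquad R = \frac{{}^{13}\mathrm{C}^{16}\mathrm{O}_2}{{}^{12}\mathrm{C}^{16}\mathrm{O}_2},

    with the Vienna Pee Dee Belemnite (VPDB) reference as the standard. Isotopic breath tests track the change in delta over baseline after a 13C-labeled substrate is administered, which is why even small spurious shifts from coincident absorbers such as acetonitrile or carbon monoxide matter.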

  18. Nanoscale Structure of Type I Collagen Fibrils: Quantitative Measurement of D-spacing

    PubMed Central

    Erickson, Blake; Fang, Ming; Wallace, Joseph M.; Orr, Bradford G.; Les, Clifford M.; Holl, Mark M. Banaszak

    2012-01-01

    This paper details a quantitative method to measure the D-periodic spacing of Type I collagen fibrils using Atomic Force Microscopy coupled with analysis using a 2D Fast Fourier Transform approach. Instrument calibration, data sampling and data analysis are all discussed and comparisons of the data to the complementary methods of electron microscopy and X-ray scattering are made. Examples of the application of this new approach to the analysis of Type I collagen morphology in disease models of estrogen depletion and Osteogenesis Imperfecta are provided. We demonstrate that it is the D-spacing distribution, not the D-spacing mean, that showed statistically significant differences in estrogen depletion associated with early stage Osteoporosis and Osteogenesis Imperfecta. The ability to quantitatively characterize nanoscale morphological features of Type I collagen fibrils will provide important structural information regarding Type I collagen in many research areas, including tissue aging and disease, tissue engineering, and gene knock out studies. Furthermore, we also envision potential clinical applications including evaluation of tissue collagen integrity under the impact of diseases or drug treatments. PMID:23027700
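
    A simplified, one-dimensional sketch of the D-spacing extraction (the paper's analysis uses a full 2D FFT of the AFM image; the profile-based shortcut below and its parameter names are assumptions):

      import numpy as np

      def d_spacing_from_profile(height_profile, pixel_size_nm):
          """Dominant repeat distance (nm) of an along-fibril AFM height profile."""
          profile = height_profile - np.mean(height_profile)
          spectrum = np.abs(np.fft.rfft(profile))
          freqs = np.fft.rfftfreq(len(profile), d=pixel_size_nm)
          peak = np.argmax(spectrum[1:]) + 1                       # skip the DC bin
          return 1.0 / freqs[peak]                                 # Type I collagen is expected near ~67 nm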

  19. The balanced scorecard--measures that drive performance.

    PubMed

    Kaplan, R S; Norton, D P

    1992-01-01

    Frustrated by the inadequacies of traditional performance measurement systems, some managers have abandoned financial measures like return on equity and earnings per share. "Make operational improvements and the numbers will follow," the argument goes. But managers do not want to choose between financial and operational measures. Executives want a balanced presentation of measures that allow them to view the company from several perspectives simultaneously. During a year-long research project with 12 companies at the leading edge of performance measurement, the authors developed a "balanced scorecard," a new performance measurement system that gives top managers a fast but comprehensive view of the business. The balanced scorecard includes financial measures that tell the results of actions already taken. And it complements those financial measures with three sets of operational measures having to do with customer satisfaction, internal processes, and the organization's ability to learn and improve--the activities that drive future financial performance. Managers can create a balanced scorecard by translating their company's strategy and mission statements into specific goals and measures. To create the part of the scorecard that focuses on the customer perspective, for example, executives at Electronic Circuits Inc. established general goals for customer performance: get standard products to market sooner, improve customers' time-to-market, become customers' supplier of choice through partnerships, and develop innovative products tailored to customer needs. Managers translated these elements of strategy into four specific goals and identified a measure for each.

  20. An Ethnostatistical Analysis of Performance Measurement

    ERIC Educational Resources Information Center

    Winiecki, Donald J.

    2008-01-01

    Within the fields of human performance technology (HPT), human resources management (HRM), and management in general, performance measurement is not only foundational but considered necessary at all phases in the process of HPT. In HPT in particular, there is substantial advice literature on what measurement is, why it is essential, and (at a…

  1. Initial description of a quantitative, cross-species (chimpanzee-human) social responsiveness measure

    PubMed Central

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve; Constantino, John; Povinelli, Daniel; Pruett, John R.

    2011-01-01

    Objective Comparative studies of social responsiveness, an ability that is impaired in autistic spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species (human-chimpanzee) social responsiveness measure. Method We translated the Social Responsiveness Scale (SRS), an instrument that quantifies human social responsiveness, into an analogous instrument for chimpanzees. We then retranslated this "Chimp SRS" into a human "Cross-Species SRS" (XSRS). We evaluated three groups of chimpanzees (n=29) with the Chimp SRS and typical and autistic spectrum disorder (ASD) human children (n=20) with the XSRS. Results The Chimp SRS demonstrated strong inter-rater reliability at the three sites (ranges for individual ICCs: .534–.866 and mean ICCs: .851–.970). As has been observed in humans, exploratory principal components analysis of Chimp SRS scores supports a single factor underlying chimpanzee social responsiveness. Human subjects' XSRS scores were fully concordant with their SRS scores (r=.976, p=.001) and distinguished appropriately between typical and ASD subjects. One chimpanzee known for inappropriate social behavior displayed a significantly higher score than all other chimpanzees at its site, demonstrating the scale's ability to detect impaired social responsiveness in chimpanzees. Conclusion Our initial cross-species social responsiveness scale proved reliable and discriminated differences in social responsiveness across (in a relative sense) and within (in a more objectively quantifiable manner) humans and chimpanzees. PMID:21515200

  2. Resected Brain Tissue, Seizure Onset Zone and Quantitative EEG Measures: Towards Prediction of Post-Surgical Seizure Control

    PubMed Central

    Andrzejak, Ralph G.; Hauf, Martinus; Pollo, Claudio; Müller, Markus; Weisstanner, Christian; Wiest, Roland; Schindler, Kaspar

    2015-01-01

    Background Epilepsy surgery is a potentially curative treatment option for pharmacoresistant patients. If non-invasive methods alone do not suffice to delineate the epileptogenic brain areas, the surgical candidates undergo long-term monitoring with intracranial EEG. Visual EEG analysis is then used to identify the seizure onset zone for targeted resection as a standard procedure. Methods Despite its great potential to assess the epileptogenicity of brain tissue, quantitative EEG analysis has not yet found its way into routine clinical practice. To demonstrate that quantitative EEG may yield clinically highly relevant information, we retrospectively investigated how post-operative seizure control is associated with four selected EEG measures evaluated in the resected brain tissue and the seizure onset zone. Importantly, the exact spatial location of the intracranial electrodes was determined by coregistration of pre-operative MRI and post-implantation CT, and coregistration with post-resection MRI was used to delineate the extent of tissue resection. Using data-driven thresholding, quantitative EEG results were separated into normally contributing and salient channels. Results In patients with favorable post-surgical seizure control a significantly larger fraction of salient channels in three of the four quantitative EEG measures was resected than in patients with unfavorable outcome in terms of seizure control (median over the whole peri-ictal recordings). The same statistics revealed no association with post-operative seizure control when EEG channels contributing to the seizure onset zone were studied. Conclusions We conclude that quantitative EEG measures provide clinically relevant and objective markers of target tissue, which may be used to optimize epilepsy surgery. The finding that differentiation between favorable and unfavorable outcome was better for the fraction of salient values in the resected brain tissue than in the seizure onset zone is consistent

  3. Evaluating Attitudes, Skill, and Performance in a Learning-Enhanced Quantitative Methods Course: A Structural Modeling Approach.

    ERIC Educational Resources Information Center

    Harlow, Lisa L.; Burkholder, Gary J.; Morrow, Jennifer A.

    2002-01-01

    Used a structural modeling approach to evaluate relations among attitudes, initial skills, and performance in a Quantitative Methods course that involved students in active learning. Results largely confirmed hypotheses offering support for educational reform efforts that propose actively involving students in the learning process, especially in…

  4. 47 CFR 73.1590 - Equipment performance measurements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Equipment performance measurements. 73.1590... RADIO BROADCAST SERVICES Rules Applicable to All Broadcast Stations § 73.1590 Equipment performance... equipment performance measurements for each main transmitter as follows: (1) Upon initial installation of a...

  5. Formation resistivity measurements from within a cased well used to quantitatively determine the amount of oil and gas present

    DOEpatents

    Vail, III, William Banning

    2000-01-01

    Methods to quantitatively determine the separate amounts of oil and gas in a geological formation adjacent to a cased well using measurements of formation resistivity. The steps include obtaining resistivity measurements from within a cased well of a given formation, obtaining the porosity, obtaining the resistivity of formation water present, computing the combined amounts of oil and gas present using Archie's Equations, determining the relative amounts of oil and gas present from measurements within a cased well, and then quantitatively determining the separate amounts of oil and gas present in the formation. Resistivity measurements are obtained from within the cased well by conducting A.C. current from within the cased well to a remote electrode at a frequency that is within the frequency range of 0.1 Hz to 20 Hz.
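
    As a rough illustration of the computation described (the standard Archie relation with assumed default exponents, not values specified by the patent):

      S_w = \left(\frac{a\,R_w}{\phi^{m}\,R_t}\right)^{1/n}, \qquad S_{\mathrm{hydrocarbon}} = 1 - S_w.

    With a = 1, m = n = 2, porosity \phi = 0.25, formation water resistivity R_w = 0.05 ohm-m, and a through-casing formation resistivity R_t = 20 ohm-m, S_w = (0.05 / (0.0625 \times 20))^{1/2} = 0.2, so oil and gas together occupy roughly 80% of the pore volume; the split between oil and gas then follows from the additional relative-amount measurements made from within the cased well.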

  6. MathPatch - Raising Retention and Performance in an Intro-geoscience Class by Raising Students' Quantitative Skills

    NASA Astrophysics Data System (ADS)

    Baer, E. M.; Whittington, C.; Burn, H.

    2008-12-01

    The geological sciences are fundamentally quantitative. However, the diversity of students' mathematical preparation and skills makes the successful use of quantitative concepts difficult in introductory level classes. At Highline Community College, we have implemented a one-credit co-requisite course to give students supplemental instruction for quantitative skills used in the course. The course, formally titled "Quantitative Geology," nicknamed "MathPatch," runs parallel to our introductory Physical Geology course. MathPatch teaches the quantitative skills required for the geology class right before they are needed. Thus, students learn only the skills they need and are given opportunities to apply them immediately. Topics include complex-graph reading, unit conversions, large numbers, scientific notation, scale and measurement, estimation, powers of 10, and other fundamental mathematical concepts used in basic geological concepts. Use of this course over the past 8 years has successfully accomplished the goals of increasing students' quantitative skills, success and retention. Students master the quantitative skills to a greater extent than before the course was implemented, and less time is spent covering basic quantitative skills in the classroom. Because the course supports the use of quantitative skills, the large number of faculty that teach Geology 101 are more comfortable in using quantitative analysis, and indeed see it as an expectation of the course at Highline. Also significant, retention in the geology course has increased substantially, from 75% to 85%. Although successful, challenges persist with requiring MathPatch as a supplementary course. One, we have seen enrollments decrease in Geology 101, which may be the result of adding this co-requisite. Students resist mandatory enrollment in the course, although they are not good at evaluating their own need for the course. The logistics utilizing MathPatch in an evening class with fewer and longer

  7. OPPORTUNISTIC ASPERGILLUS PATHOGENS MEASURED IN HOME AND HOSPITAL TAP WATER BY MOLD SPECIFIC QUANTITATIVE PCR (MSQPCR)

    EPA Science Inventory

    Opportunistic fungal pathogens are a concern because of the increasing number of immunocompromised patients. The goal of this research was to test a simple extraction method and rapid quantitative PCR (QPCR) measurement of the occurrence of potential pathogens, Aspergillus fumiga...

  8. Evaluation and performance of desorption electrospray ionization using a triple quadrupole mass spectrometer for quantitation of pharmaceuticals in plasma.

    PubMed

    Kennedy, Joseph H; Wiseman, Justin M

    2010-02-01

    The present work describes the methodology and investigates the performance of desorption electrospray ionization (DESI) combined with a triple quadrupole mass spectrometer for the quantitation of small drug molecules in human plasma. Amoxepine, atenolol, carbamazepine, clozapine, prazosin, propranolol and verapamil were selected as target analytes while terfenadine was selected as the internal standard common to each of the analytes. Protein precipitation of human plasma using acetonitrile was utilized for all samples. Limits of detection were determined for all analytes in plasma and shown to be in the range 0.2-40 ng/mL. Quantitative analysis of amoxepine, prazosin and verapamil was performed over the range 20-7400 ng/mL and shown to be linear in all cases with R(2) > 0.99. In most cases, the precision (relative standard deviation) and accuracy (relative error) of each method were each less than or equal to 20%. The performance of the combined techniques made it possible to analyze each sample in 15 s, illustrating DESI tandem mass spectrometry (MS/MS) as a powerful tool for the quantitation of analytes in deproteinized human plasma. Copyright 2010 John Wiley & Sons, Ltd.

  9. Advances in Surface Plasmon Resonance Imaging allowing for quantitative measurement of laterally heterogeneous samples

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2012-02-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to materials on metallic surfaces for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases -- uniform absorption/desorption of small biomolecules and films, in which a continuous ``slab'' model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allow for quantitative kinetic measurements of laterally heterogeneous systems. The degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  10. Does hospital financial performance measure up?

    PubMed

    Cleverley, W O; Harvey, R K

    1992-05-01

    Comparisons are continuously being made between the financial performance, products and services, of the healthcare industry and those of non-healthcare industries. Several useful measures of financial performance--profitability, liquidity, financial risk, asset management and replacement, and debt capacity, are used by the authors to compare the financial performance of the hospital industry with that of the industrial, transportation and utility sectors. Hospitals exhibit weaknesses in several areas. Goals are suggested for each measure to bring hospitals closer to competitive levels.

  11. Quantitative Measures of Immersion in Cloud and the Biogeography of Cloud Forests

    NASA Technical Reports Server (NTRS)

    Lawton, R. O.; Nair, U. S.; Ray, D.; Regmi, A.; Pounds, J. A.; Welch, R. M.

    2010-01-01

    Sites described as tropical montane cloud forests differ greatly, in part because observers tend to differ in their opinion as to what constitutes frequent and prolonged immersion in cloud. This definitional difficulty interferes with hydrologic analyses, assessments of environmental impacts on ecosystems, and biogeographical analyses of cloud forest communities and species. Quantitative measurements of cloud immersion can be obtained on site, but the observations are necessarily spatially limited, although well-placed observers can examine 10–50 km of a mountain range under rainless conditions. Regional analyses, however, require observations at a broader scale. This chapter discusses remote sensing and modeling approaches that can provide quantitative measures of the spatiotemporal patterns of cloud cover and cloud immersion in tropical mountain ranges. These approaches integrate remote sensing tools of various spatial resolutions and frequencies of observation, digital elevation models, regional atmospheric models, and ground-based observations to provide measures of cloud cover, cloud base height, and the intersection of cloud and terrain. This combined approach was applied to the Monteverde region of northern Costa Rica to illustrate how the proportion of time the forest is immersed in cloud may vary spatially and temporally. The observed spatial variation was largely due to patterns of airflow over the mountains. The temporal variation reflected the diurnal rise and fall of the orographic cloud base, which was influenced in turn by synoptic weather conditions, the seasonal movement of the Intertropical Convergence Zone and the north-easterly trade winds. Knowledge of the proportion of the time that sites are immersed in clouds should facilitate ecological comparisons and biogeographical analyses, as well as land use planning and hydrologic assessments in areas where intensive on-site work is not feasible.

  12. Quantitative Primary Tumor Indocyanine Green Measurements Predict Osteosarcoma Metastatic Lung Burden in a Mouse Model.

    PubMed

    Fourman, Mitchell S; Mahjoub, Adel; Mandell, Jon B; Yu, Shibing; Tebbets, Jessica C; Crasto, Jared A; Alexander, Peter E; Weiss, Kurt R

    2018-03-01

    Current preclinical osteosarcoma (OS) models largely focus on quantifying primary tumor burden. However, most fatalities from OS are caused by metastatic disease. The quantification of metastatic OS currently relies on CT, which is limited by motion artifact, requires intravenous contrast, and can be technically demanding in the preclinical setting. We describe the ability for indocyanine green (ICG) fluorescence angiography to quantify primary and metastatic OS in a previously validated orthotopic, immunocompetent mouse model. (1) Can near-infrared ICG fluorescence be used to attach a comparable, quantitative value to the primary OS tumor in our experimental mouse model? (2) Will primary tumor fluorescence differ in mice that go on to develop metastatic lung disease? (3) Does primary tumor fluorescence correlate with tumor volume measured with CT? Six groups of 4- to 6-week-old immunocompetent Balb/c mice (n = 6 per group) received paraphyseal injections into their left hindlimb proximal tibia consisting of variable numbers of K7M2 mouse OS cells. A hindlimb transfemoral amputation was performed 4 weeks after injection with euthanasia and lung extraction performed 10 weeks after injection. Histologic examination of lung and primary tumor specimens confirmed ICG localization only within the tumor bed. Mice with visible or palpable tumor growth had greater hindlimb fluorescence (3.5 ± 2.3 arbitrary perfusion units [APU], defined as the fluorescence pixel return normalized by the detector) compared with those with a negative examination (0.71 ± 0.38 APU, -2.7 ± 0.5 mean difference, 95% confidence interval -3.7 to -1.8, p < 0.001). A strong linear trend (r = 0.81, p < 0.01) was observed between primary tumor and lung fluorescence, suggesting that quantitative ICG tumor fluorescence is directly related to eventual metastatic burden. We did not find a correlation (r = 0.04, p = 0.45) between normalized primary tumor fluorescence and CT volumetric measurements. We

  13. Quantitative fluorescence angiography for neurosurgical interventions.

    PubMed

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography, an established method to visualize blood flow in brain vessels, enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after activation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurement, phantom experiment, and computer simulation under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.

  14. Quantitative trait loci for maternal performance for offspring survival in mice.

    PubMed Central

    Peripato, Andréa C; De Brito, Reinaldo A; Vaughn, Ty T; Pletscher, L Susan; Matioli, Sergio R; Cheverud, James M

    2002-01-01

    Maternal performance refers to the effect that the environment provided by mothers has on their offspring's phenotypes, such as offspring survival and growth. Variations in maternal behavior and physiology are responsible for variations in maternal performance, which in turn affects offspring survival. In our study we found females that failed to nurture their offspring and showed abnormal maternal behaviors. The genetic architecture of maternal performance for offspring survival was investigated in 241 females of an F(2) intercross of the SM/J and LG/J inbred mouse strains. Using interval-mapping methods we found two quantitative trait loci (QTL) affecting maternal performance at D2Mit17 + 6 cM and D7Mit21 + 2 cM on chromosomes 2 and 7, respectively. In a two-way genome-wide epistasis scan we found 15 epistatic interactions involving 23 QTL distributed across all chromosomes except 12, 16, and 17. These loci form several small sets of interacting QTL, suggesting a complex set of mechanisms operating to determine maternal performance for offspring survival. Taken all together and correcting for the large number of significant factors, QTL and their interactions explain almost 35% of the phenotypic variation for maternal performance for offspring survival in this cross. This study allowed the identification of many possible candidate genes, as well as the relative size of gene effects and patterns of gene action affecting maternal performance in mice. Detailed behavior observation of mothers from later generations suggests that offspring survival in the first week is related to maternal success in building nests, grooming their pups, providing milk, and/or manifesting aggressive behavior against intruders. PMID:12454078

  15. Prospective evaluation of risk of vertebral fractures using quantitative ultrasound measurements and bone mineral density in a population-based sample of postmenopausal women: results of the Basel Osteoporosis Study.

    PubMed

    Hollaender, R; Hartl, F; Krieg, M-A; Tyndall, A; Geuckel, C; Buitrago-Tellez, C; Manghani, M; Kraenzlin, M; Theiler, R; Hans, D

    2009-03-01

    Prospective studies have shown that quantitative ultrasound (QUS) techniques predict the risk of fracture of the proximal femur with similar standardised risk ratios to dual-energy x-ray absorptiometry (DXA). Few studies have investigated these devices for the prediction of vertebral fractures. The Basel Osteoporosis Study (BOS) is a population-based prospective study to assess the performance of QUS devices and DXA in predicting incident vertebral fractures. 432 women aged 60-80 years were followed up for 3 years. Incident vertebral fractures were assessed radiologically. Bone measurements using DXA (spine and hip) and QUS measurements (calcaneus and proximal phalanges) were performed. Measurements were assessed for their value in predicting incident vertebral fractures using logistic regression. QUS measurements at the calcaneus and DXA measurements discriminated between women with and without incident vertebral fracture (20% height reduction). The relative risks (RRs) for vertebral fracture, adjusted for age, were 2.3 for the Stiffness Index (SI) and 2.8 for the Quantitative Ultrasound Index (QUI) at the calcaneus and 2.0 for bone mineral density at the lumbar spine. The predictive value (AUC (95% CI)) of QUS measurements at the calcaneus remained highly significant (0.70 for SI, 0.72 for the QUI, and 0.67 for DXA at the lumbar spine) even after adjustment for other confounding variables. QUS of the calcaneus and bone mineral density measurements were shown to be significant predictors of incident vertebral fracture. The RRs for QUS measurements at the calcaneus are of similar magnitude as for DXA measurements.
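    As a rough illustration of the discrimination statistics reported above, the short Python sketch below computes a rank-based AUC (equivalent to the Mann-Whitney statistic) for a calcaneal stiffness-index measurement separating women with and without incident vertebral fracture. The values are invented, and lower stiffness is treated as higher risk.

        # Minimal sketch: rank-based AUC for how well a stiffness-index measurement
        # separates fracture cases from controls. Values are made up; lower stiffness
        # is treated as higher risk, so the risk score is the negated index.
        import numpy as np

        def auc(scores_pos, scores_neg):
            """Probability that a random fracture case scores higher than a random control."""
            pos = np.asarray(scores_pos)[:, None]
            neg = np.asarray(scores_neg)[None, :]
            return (np.sum(pos > neg) + 0.5 * np.sum(pos == neg)) / (pos.size * neg.size)

        stiffness_fracture    = np.array([62.0, 71.0, 58.0, 80.0, 65.0])          # incident fracture
        stiffness_no_fracture = np.array([88.0, 95.0, 79.0, 102.0, 84.0, 91.0])   # no fracture

        print(f"AUC = {auc(-stiffness_fracture, -stiffness_no_fracture):.2f}")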

  16. Advancing the Fork detector for quantitative spent nuclear fuel verification

    DOE PAGES

    Vaccaro, S.; Gauld, I. C.; Hu, J.; ...

    2018-01-31

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms.
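    A minimal sketch of the go/no-go idea described above is given below in Python: measured Fork neutron count rates are compared with predictions derived from operator declarations, and assemblies whose relative deviation exceeds a threshold are flagged. The count rates and the 10% threshold are hypothetical, not values from the field tests.

        # Illustrative sketch (not the EURATOM/IAEA implementation): compare measured
        # Fork neutron count rates with predicted rates derived from operator
        # declarations, and flag assemblies whose relative deviation exceeds a
        # hypothetical go/no-go threshold.
        measured  = {"A01": 1520.0, "A02": 980.0, "A03": 2210.0}    # counts per second
        predicted = {"A01": 1480.0, "A02": 1290.0, "A03": 2150.0}   # e.g. from a burnup-code module

        THRESHOLD = 0.10  # 10% relative deviation, chosen only for illustration

        for assembly, meas in measured.items():
            pred = predicted[assembly]
            rel_dev = (meas - pred) / pred
            status = "INVESTIGATE" if abs(rel_dev) > THRESHOLD else "consistent"
            print(f"{assembly}: measured={meas:.0f}, predicted={pred:.0f}, "
                  f"deviation={rel_dev:+.1%} -> {status}")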

  17. Advancing the Fork detector for quantitative spent nuclear fuel verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaccaro, S.; Gauld, I. C.; Hu, J.

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms.

  18. Advancing the Fork detector for quantitative spent nuclear fuel verification

    NASA Astrophysics Data System (ADS)

    Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.

    2018-04-01

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms

  19. Developing safety performance functions incorporating reliability-based risk measures.

    PubMed

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
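    The following Python sketch illustrates one way a negative binomial safety performance function with a reliability-based covariate could be fitted, assuming the statsmodels package is available. The simulated collision counts, the dispersion parameter, and the functional form are illustrative only and are not the models developed in the study.

        # Hedged sketch of a negative binomial SPF with a reliability-based covariate
        # (probability of non-compliance, Pnc). Data are simulated for illustration.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200
        aadt = rng.uniform(2000, 12000, n)          # traffic exposure
        pnc  = rng.uniform(0.0, 0.4, n)             # probability of non-compliance
        mu   = np.exp(-6.0 + 0.8 * np.log(aadt) + 2.0 * pnc)
        collisions = rng.poisson(mu)                # simple stand-in for collision counts

        X = sm.add_constant(np.column_stack([np.log(aadt), pnc]))
        spf = sm.GLM(collisions, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
        print(spf.params)   # intercept, ln(AADT) coefficient, Pnc coefficient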

  20. A device for rapid and quantitative measurement of cardiac myocyte contractility

    NASA Astrophysics Data System (ADS)

    Gaitas, Angelo; Malhotra, Ricky; Li, Tao; Herron, Todd; Jalife, José

    2015-03-01

    Cardiac contractility is the hallmark of cardiac function and is a predictor of healthy or diseased cardiac muscle. Despite advancements over the last two decades, the techniques and tools available to cardiovascular scientists are limited in their utility to accurately and reliably measure the amplitude and frequency of cardiomyocyte contractions. Isometric force measurements in the past have entailed cumbersome attachment of isolated and permeabilized cardiomyocytes to a force transducer followed by measurements of sarcomere lengths under conditions of submaximal and maximal Ca2+ activation. These techniques have the inherent disadvantages of being labor intensive and costly. We have engineered a micro-machined cantilever sensor with an embedded deflection-sensing element that, in preliminary experiments, has been shown to reliably measure cardiac cell contractions in real time. Here, we describe this new bioengineering tool with applicability in the cardiovascular research field to effectively and reliably measure cardiac cell contractility in a quantitative manner. We measured contractility in both primary neonatal rat heart cardiomyocyte monolayers, which demonstrated a beat frequency of 3 Hz, as well as human embryonic stem cell-derived cardiomyocytes with a contractile frequency of about 1 Hz. We also employed the β-adrenergic agonist isoproterenol (100 nmol l^-1) and observed that our cantilever demonstrated high sensitivity in detecting subtle changes in both chronotropic and inotropic responses of monolayers. This report describes the utility of our micro-device in both basic cardiovascular research as well as in small molecule drug discovery to monitor cardiac cell contractions.
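    As an illustration of how beat frequency and amplitude might be extracted from a cantilever deflection trace, the Python sketch below applies peak detection (scipy assumed available) to a synthetic 3 Hz signal resembling the neonatal rat monolayer example; the thresholds and signal shape are assumptions.

        # Illustrative sketch: estimate contraction frequency and amplitude from a
        # deflection trace using peak detection on a synthetic 3 Hz signal.
        import numpy as np
        from scipy.signal import find_peaks

        fs = 500.0                                                # sampling rate, Hz
        t = np.arange(0, 10, 1 / fs)
        beat = np.maximum(np.sin(2 * np.pi * 3.0 * t), 0) ** 4    # sharp 3 Hz "contractions"
        signal = beat + 0.02 * np.random.default_rng(1).normal(size=t.size)

        peaks, props = find_peaks(signal, height=0.5, distance=fs * 0.2)
        freq = (len(peaks) - 1) / (t[peaks[-1]] - t[peaks[0]])    # beats per second
        print(f"beat frequency ~ {freq:.2f} Hz, mean amplitude ~ {props['peak_heights'].mean():.2f}")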

  1. Development and psychometric evaluation of a quantitative measure of "fat talk".

    PubMed

    MacDonald Clarke, Paige; Murnen, Sarah K; Smolak, Linda

    2010-01-01

    Based on her anthropological research, Nichter (2000) concluded that it is normative for many American girls to engage in body self-disparagement in the form of "fat talk." The purpose of the present two studies was to develop a quantitative measure of fat talk. A series of 17 scenarios were created in which "Naomi" is talking with a female friend(s) and there is an expression of fat talk. College women respondents rated the frequency with which they would behave in a similar way as the women in each scenario. A nine-item one-factor scale was determined through principal components analysis and its scores yielded evidence of internal consistency reliability, test-retest reliability over a five-week time period, construct validity, discriminant validity, and incremental validity in that it predicted unique variance in body shame and eating disorder symptoms above and beyond other measures of self-objectification. Copyright 2009 Elsevier Ltd. All rights reserved.
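    For readers unfamiliar with the internal-consistency evidence mentioned above, the short Python sketch below computes Cronbach's alpha for a simulated nine-item scale; the simulated responses are not the study's data.

        # Minimal sketch of an internal-consistency check (Cronbach's alpha) for a
        # nine-item scale; respondent data are simulated.
        import numpy as np

        rng = np.random.default_rng(42)
        latent = rng.normal(size=(120, 1))                        # respondents' latent trait
        items = latent + rng.normal(scale=0.8, size=(120, 9))     # 9 correlated item scores

        def cronbach_alpha(item_scores):
            k = item_scores.shape[1]
            item_var = item_scores.var(axis=0, ddof=1).sum()
            total_var = item_scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")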

  2. Identification of common coexpression modules based on quantitative network comparison.

    PubMed

    Jo, Yousang; Kim, Sanghyeon; Lee, Doheon

    2018-06-13

    Finding common molecular interactions across different samples is essential to understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes. Therefore, identification of common coexpression networks or modules may reveal the molecular mechanism of complex disease or the relationship between biological processes. However, there has been no quantitative comparison method for coexpression networks, and methods developed previously for other network types cannot be applied to them. Therefore, we aimed to propose quantitative comparison methods for coexpression networks and to use the new method to find common biological mechanisms between Huntington's disease and brain aging. We proposed two similarity measures for quantitative comparison of coexpression networks. Then, we performed experiments using known coexpression networks. We showed the validity of the two measures and evaluated threshold values for similar coexpression network pairs from the experiments. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and found similar Huntington's disease-aging coexpression module pairs. We identified similar Huntington's disease-aging coexpression module pairs and found that these modules are related to brain development, cell death, and immune response. This suggests that up-regulated cell signalling related to cell death and immune/inflammation response may be the common molecular mechanisms in the pathophysiology of HD and normal brain aging in the frontal cortex.

  3. Performance Measures, Benchmarking and Value.

    ERIC Educational Resources Information Center

    McGregor, Felicity

    This paper discusses performance measurement in university libraries, based on examples from the University of Wollongong (UoW) in Australia. The introduction highlights the integration of information literacy into the curriculum and the outcomes of a 1998 UoW student satisfaction survey. The first section considers performance indicators in…

  4. Reliability and group differences in quantitative cervicothoracic measures among individuals with and without chronic neck pain

    PubMed Central

    2012-01-01

    Background Clinicians frequently rely on subjective categorization of impairments in mobility, strength, and endurance for clinical decision-making; however, these assessments are often unreliable and lack sensitivity to change. The objective of this study was to determine the inter-rater reliability, minimum detectable change (MDC), and group differences in quantitative cervicothoracic measures for individuals with and without chronic neck pain (NP). Methods Nineteen individuals with NP and 20 healthy controls participated in this case control study. Two physical therapists performed a 30-minute examination on separate days. A handheld dynamometer, gravity inclinometer, ruler, and stopwatch were used to quantify cervical range of motion (ROM), cervical muscle strength and endurance, and scapulothoracic muscle length and strength, respectively. Results Intraclass correlation coefficients for inter-rater reliability were significantly greater than zero for most impairment measures, with point estimates ranging from 0.45 to 0.93. The NP group exhibited reduced cervical ROM (P ≤ 0.012) and muscle strength (P ≤ 0.038) in most movement directions, reduced cervical extensor endurance (P = 0.029), and reduced rhomboid and middle trapezius muscle strength (P ≤ 0.049). Conclusions Results demonstrate the feasibility of obtaining objective cervicothoracic impairment measures with acceptable inter-rater agreement across time. The clinical utility of these measures is supported by evidence of impaired mobility, strength, and endurance among patients with NP, with corresponding MDC values that can help establish benchmarks for clinically significant change. PMID:23114092
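    The minimum detectable change values mentioned above follow from a standard formula, SEM = SD * sqrt(1 - ICC) and MDC95 = 1.96 * sqrt(2) * SEM; the Python sketch below applies it to illustrative numbers, not to the study's measurements.

        # Hedged example of turning an inter-rater ICC into a minimum detectable change.
        # The ICC and SD values are illustrative only.
        import math

        icc = 0.85            # inter-rater reliability of a measure (e.g., cervical ROM)
        sd_baseline = 8.0     # between-subject SD of the measure, in degrees

        sem = sd_baseline * math.sqrt(1 - icc)
        mdc95 = 1.96 * math.sqrt(2) * sem
        print(f"SEM = {sem:.1f} deg, MDC95 = {mdc95:.1f} deg")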

  5. Selecting quantitative water management measures at the river basin scale in a global change context

    NASA Astrophysics Data System (ADS)

    Girard, Corentin; Rinaudo, Jean-Daniel; Caballero, Yvan; Pulido-Velazquez, Manuel

    2013-04-01

    One of the main challenges in the implementation of the Water Framework Directive (WFD) in the European Union is the definition of a programme of measures to reach the good status of the European water bodies. In areas where water scarcity is an issue, this includes the selection of water conservation and capacity expansion measures to ensure minimum environmental in-stream flow requirements. At the same time, the WFD calls for the use of economic analysis to identify the most cost-effective combination of measures at the river basin scale to achieve its objective. In this respect, hydro-economic river basin models, by integrating economic, environmental and hydrological aspects at the river basin scale in a consistent framework, represent a promising approach. This article presents a least-cost river basin optimization model (LCRBOM) that selects the combination of quantitative water management measures to meet environmental flows for future scenarios of agricultural and urban demand, taking into account the impact of climate change. The model has been implemented in a case study on a Mediterranean basin in the south of France, the Orb River basin. The basin has been identified as needing quantitative water management measures in order to reach the good status of its water bodies. The LCRBOM has been developed using GAMS, applying Mixed Integer Linear Programming. It is run to select the set of measures that minimizes the total annualized cost of the applied measures, while meeting the demands and minimum in-stream flow constraints. For the economic analysis, the programme of measures is composed of water conservation measures on agricultural and urban water demands. It compares them with measures mobilizing new water resources from groundwater, inter-basin transfers and improvements in reservoir operating rules. The total annual cost of each measure is calculated for each demand unit considering operation, maintenance and
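    To illustrate the least-cost selection idea behind the LCRBOM, the toy Python sketch below enumerates combinations of hypothetical measures and keeps the cheapest set whose summed savings cover a flow deficit; the real model is a basin-scale mixed integer linear programme, and all names, costs and savings here are invented.

        # Toy illustration of least-cost selection of water-management measures:
        # enumerate combinations and keep the cheapest set whose summed savings cover
        # the deficit needed to meet an environmental flow target.
        from itertools import combinations

        measures = {                      # name: (annualized cost, water saved in hm3/yr)
            "drip_irrigation": (1.2, 4.0),
            "urban_leak_repair": (0.8, 2.5),
            "reservoir_reoperation": (0.5, 1.5),
            "interbasin_transfer": (3.0, 6.0),
        }
        deficit = 6.5                     # hm3/yr still needed to meet in-stream flows

        best = None
        for r in range(1, len(measures) + 1):
            for combo in combinations(measures, r):
                cost = sum(measures[m][0] for m in combo)
                saved = sum(measures[m][1] for m in combo)
                if saved >= deficit and (best is None or cost < best[0]):
                    best = (cost, combo)

        print(f"cheapest programme: {best[1]} at {best[0]:.1f} M euro/yr")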

  6. Comparative Validation of Five Quantitative Rapid Test Kits for the Analysis of Salt Iodine Content: Laboratory Performance, User- and Field-Friendliness

    PubMed Central

    Rohner, Fabian; Kangambèga, Marcelline O.; Khan, Noor; Kargougou, Robert; Garnier, Denis; Sanou, Ibrahima; Ouaro, Bertine D.; Petry, Nicolai; Wirth, James P.; Jooste, Pieter

    2015-01-01

    Background Iodine deficiency has important health and development consequences and the introduction of iodized salt as national programs has been a great public health success in the past decades. To render national salt iodization programs sustainable and ensure adequate iodization levels, simple methods to quantitatively assess whether salt is adequately iodized are required. Several methods claim to be simple and reliable, and are available on the market or are in development. Objective This work has validated the currently available quantitative rapid test kits (quantRTK) in a comparative manner for both their laboratory performance and ease of use in field settings. Methods Laboratory performance parameters (linearity, detection and quantification limit, intra- and inter-assay imprecision) were conducted on 5 quantRTK. We assessed inter-operator imprecision using salt of different quality along with the comparison of 59 salt samples from across the globe; measurements were made both in a laboratory and a field setting by technicians and non-technicians. Results from the quantRTK were compared against iodometric titration for validity. An ‘ease-of-use’ rating system was developed to identify the most suitable quantRTK for a given task. Results Most of the devices showed acceptable laboratory performance, but for some of the devices, use by non-technicians revealed poorer performance when working in a routine manner. Of the quantRTK tested, the iCheck® and I-Reader® showed most consistent performance and ease of use, and a newly developed paper-based method (saltPAD) holds promise if further developed. Conclusions User- and field-friendly devices are now available and the most appropriate quantRTK can be selected depending on the number of samples and the budget available. PMID:26401655

  7. Kinetics of Poliovirus Shedding following Oral Vaccination as Measured by Quantitative Reverse Transcription-PCR versus Culture

    PubMed Central

    Begum, Sharmin; Uddin, Md Jashim; Platts-Mills, James A.; Liu, Jie; Kirkpatrick, Beth D.; Chowdhury, Anwarul H.; Jamil, Khondoker M.; Haque, Rashidul; Petri, William A.; Houpt, Eric R.

    2014-01-01

    Amid polio eradication efforts, detection of oral polio vaccine (OPV) virus in stool samples can provide information about rates of mucosal immunity and allow estimation of the poliovirus reservoir. We developed a multiplex one-step quantitative reverse transcription-PCR (qRT-PCR) assay for detection of OPV Sabin strains 1, 2, and 3 directly in stool samples with an external control to normalize samples for viral quantity and compared its performance with that of viral culture. We applied the assay to samples from infants in Dhaka, Bangladesh, after the administration of trivalent OPV (tOPV) at weeks 14 and 52 of life (on days 0 [pre-OPV], +4, +11, +18, and +25 relative to vaccination). When 1,350 stool samples were tested, the sensitivity and specificity of the quantitative PCR (qPCR) assay were 89 and 91% compared with culture. A quantitative relationship between culture+/qPCR+ and culture−/qPCR+ stool samples was observed. The kinetics of shedding revealed by qPCR and culture were similar. qPCR quantitative cutoffs based on the day +11 or +18 stool samples could be used to identify the culture-positive shedders, as well as the long-duration or high-frequency shedders. Interestingly, qPCR revealed that a small minority (7%) of infants contributed the vast majority (93 to 100%) of the total estimated viral excretion across all subtypes at each time point. This qPCR assay for OPV can simply and quantitatively detect all three Sabin strains directly in stool samples to approximate shedding both qualitatively and quantitatively. PMID:25378579

  8. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent in successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.

  9. Facility-level outcome performance measures for nursing homes.

    PubMed

    Porell, F; Caro, F G

    1998-12-01

    Risk-adjusted nursing home performance scores were developed for four health outcomes and five quality indicators from resident-level longitudinal case-mix reimbursement data for Medicaid residents of more than 500 nursing homes in Massachusetts. Facility performance was measured by comparing actual resident outcomes with expected outcomes derived from quarterly predictions of resident-level econometric models over a 3-year period (1991-1994). Performance measures were tightly distributed among facilities in the state. The intercorrelations among the nine outcome performance measures were relatively low and not uniformly positive. Performance measures were not highly associated with various structural facility attributes. For most outcomes, longitudinal analyses revealed only modest correlations between a facility's performance score from one time period to the next. Relatively few facilities exhibited consistent superior or inferior performance over time. The findings have implications toward the practical use of facility outcome performance measures for quality assurance and reimbursement purposes in the near future.

  10. Performance measures for rural transportation systems : guidebook.

    DOT National Transportation Integrated Search

    2006-06-01

    This Performance Measures for Rural Transportation Systems Guidebook provides a : standardized and supportable performance measurement process that can be applied to : transportation systems in rural areas. The guidance included in this guidebook was...

  11. Quantitative MRI of kidneys in renal disease.

    PubMed

    Kline, Timothy L; Edwards, Marie E; Garg, Ishan; Irazabal, Maria V; Korfiatis, Panagiotis; Harris, Peter C; King, Bernard F; Torres, Vicente E; Venkatesh, Sudhakar K; Erickson, Bradley J

    2018-03-01

    To evaluate the reproducibility and utility of quantitative magnetic resonance imaging (MRI) sequences for the assessment of kidneys in young adults with normal renal function (eGFR ranged from 90 to 130 mL/min/1.73 m^2) and patients with early renal disease (autosomal dominant polycystic kidney disease). This prospective case-control study was performed on ten normal young adults (18-30 years old) and ten age- and sex-matched patients with early renal parenchymal disease (autosomal dominant polycystic kidney disease). All subjects underwent a comprehensive kidney MRI protocol, including qualitative imaging: T1w, T2w, FIESTA, and quantitative imaging: 2D cine phase contrast of the renal arteries, and parenchymal diffusion weighted imaging (DWI), magnetization transfer imaging (MTI), blood oxygen level dependent (BOLD) imaging, and magnetic resonance elastography (MRE). The normal controls were imaged on two separate occasions ≥24 h apart (range 24-210 h) to assess reproducibility of the measurements. Quantitative MR imaging sequences were found to be reproducible. The mean ± SD absolute percent difference between quantitative parameters measured ≥24 h apart was: MTI-derived ratio = 4.5 ± 3.6%, DWI-derived apparent diffusion coefficient (ADC) = 6.5 ± 3.4%, BOLD-derived R2* = 7.4 ± 5.9%, and MRE-derived tissue stiffness = 7.6 ± 3.3%. Compared with controls, the ADPKD patients' non-cystic renal parenchyma (NCRP) had statistically significant differences with regard to quantitative parenchymal measures: lower MTI percent ratios (16.3 ± 4.4 vs. 23.8 ± 1.2, p < 0.05), higher ADCs (2.46 ± 0.20 vs. 2.18 ± 0.10 × 10^-3 mm^2/s, p < 0.05), lower R2*s (14.9 ± 1.7 vs. 18.1 ± 1.6 s^-1, p < 0.05), and lower tissue stiffness (3.2 ± 0.3 vs. 3.8 ± 0.5 kPa, p < 0.05). Excellent reproducibility of the quantitative measurements was obtained in all cases. Significantly different quantitative MR parenchymal
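    The reproducibility statistic quoted above (mean ± SD absolute percent difference between repeat scans) is simple to compute; the Python sketch below shows one common convention, normalizing each paired difference by the pair mean, using made-up ADC values.

        # Simple sketch of a test-retest statistic: the absolute percent difference
        # between two scans acquired >= 24 h apart, averaged over subjects.
        import numpy as np

        scan1 = np.array([2.15, 2.22, 2.08, 2.30, 2.19])   # ADC, 1e-3 mm^2/s (illustrative)
        scan2 = np.array([2.25, 2.10, 2.16, 2.41, 2.12])

        abs_pct_diff = 100 * np.abs(scan1 - scan2) / ((scan1 + scan2) / 2)
        print(f"absolute percent difference: {abs_pct_diff.mean():.1f} +/- {abs_pct_diff.std(ddof=1):.1f} %")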

  12. Benchmarking and Performance Measurement.

    ERIC Educational Resources Information Center

    Town, J. Stephen

    This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…

  13. Delving into sensible measures to enhance the environmental performance of biohydrogen: A quantitative approach based on process simulation, life cycle assessment and data envelopment analysis.

    PubMed

    Martín-Gamboa, Mario; Iribarren, Diego; Susmozas, Ana; Dufour, Javier

    2016-08-01

    A novel approach is developed to evaluate quantitatively the influence of operational inefficiency in biomass production on the life-cycle performance of hydrogen from biomass gasification. Vine-growers and process simulation are used as key sources of inventory data. The life cycle assessment of biohydrogen according to current agricultural practices for biomass production is performed, as well as that of target biohydrogen according to agricultural practices optimised through data envelopment analysis. Only 20% of the vineyards assessed operate efficiently, and the benchmarked reduction percentages of operational inputs range from 45% to 73% in the average vineyard. The fulfilment of operational benchmarks avoiding irregular agricultural practices is concluded to improve significantly the environmental profile of biohydrogen (e.g., impact reductions above 40% for eco-toxicity and global warming). Finally, it is shown that this type of bioenergy system can be an excellent replacement for conventional hydrogen in terms of global warming and non-renewable energy demand. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Multisite concordance of apparent diffusion coefficient measurements across the NCI Quantitative Imaging Network.

    PubMed

    Newitt, David C; Malyarenko, Dariya; Chenevert, Thomas L; Quarles, C Chad; Bell, Laura; Fedorov, Andriy; Fennessy, Fiona; Jacobs, Michael A; Solaiyappan, Meiyappan; Hectors, Stefanie; Taouli, Bachir; Muzi, Mark; Kinahan, Paul E; Schmainda, Kathleen M; Prah, Melissa A; Taber, Erin N; Kroenke, Christopher; Huang, Wei; Arlinghaus, Lori R; Yankeelov, Thomas E; Cao, Yue; Aryal, Madhava; Yen, Yi-Fen; Kalpathy-Cramer, Jayashree; Shukla-Dave, Amita; Fung, Maggie; Liang, Jiachao; Boss, Michael; Hylton, Nola

    2018-01-01

    Diffusion weighted MRI has become ubiquitous in many areas of medicine, including cancer diagnosis and treatment response monitoring. Reproducibility of diffusion metrics is essential for their acceptance as quantitative biomarkers in these areas. We examined the variability in the apparent diffusion coefficient (ADC) obtained from both postprocessing software implementations utilized by the NCI Quantitative Imaging Network and online scan time-generated ADC maps. Phantom and in vivo breast studies were evaluated for two- and four-b-value diffusion metrics. Concordance of the majority of implementations was excellent for the phantom ADC measures and for one of the in vivo metrics, with small relative biases, but with higher deviations in ADC at the lowest phantom ADC values. Concordance for the other in vivo metric was good, with typical biases of up to about 3% but higher for online maps. Multiple b-value ADC implementations were separated into two groups determined by the fitting algorithm. Intergroup mean ADC differences ranged from negligible for phantom data to 2.8% for in vivo data. Some higher deviations were found for individual implementations and online parametric maps. Despite generally good concordance, implementation biases in ADC measures are sometimes significant and may be large enough to be of concern in multisite studies.
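    As a reminder of the computation whose implementations are being compared, the Python sketch below computes a mono-exponential ADC from two b-values, ADC = ln(S0/Sb)/(b - b0), using synthetic signal values.

        # Minimal sketch of a two-b-value ADC computation; signals are synthetic.
        import numpy as np

        b0, b1 = 0.0, 800.0                       # s/mm^2
        s0 = np.array([1200.0, 950.0, 1100.0])    # signal at b = 0
        s1 = np.array([420.0, 310.0, 365.0])      # signal at b = 800

        adc = np.log(s0 / s1) / (b1 - b0)         # mm^2/s
        print(adc)                                # ~1e-3 mm^2/s, typical of breast tissue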

  15. Measurement of bone marrow lesions by MR imaging in knee osteoarthritis using quantitative segmentation methods--a reliability and sensitivity to change analysis.

    PubMed

    Nielsen, Flemming K; Egund, Niels; Peters, David; Jurik, Anne Grethe

    2014-12-20

    Longitudinal assessment of bone marrow lesions (BMLs) in knee osteoarthritis (KOA) by MRI is usually performed using semi-quantitative grading methods. Quantitative segmentation methods may be more sensitive to detect change over time. The purpose of this study was to evaluate and compare the validity and sensitivity to detect changes of two quantitative MR segmentation methods for measuring BMLs in KOA, one computer assisted (CAS) and one manual (MS) method. Twenty-two patients with KOA confined to the medial femoro-tibial compartment obtained MRI at baseline and follow-up (median 334 days in between). STIR, T1 and fat saturated T1 post-contrast sequences were obtained using a 1.5 T system. The 44 sagittal STIR sequences were assessed independently by two readers for quantification of BML. The signal intensities (SIs) of the normal bone marrow in the lateral femoral condyles and tibial plateaus were used as threshold values. The volume of bone marrow with SIs exceeding the threshold values (BML) was measured in the medial femoral condyle and tibial plateau and related to the total volume of the condyles/plateaus. The 95% limits of agreement at baseline were used to determine the sensitivity to change. The mean threshold values of CAS and MS were almost identical, but the absolute and relative BML volumes differed, being 1319 mm^3/10% and 1828 mm^3/15% in the femur and 941 mm^3/7% and 2097 mm^3/18% in the tibia using CAS and MS, respectively. The BML volumes obtained by CAS and MS were significantly correlated but the tissue changes measured were different. The volume of voxels exceeding the threshold values was measured by CAS, whereas MS included intervening voxels with normal SI. The 95% limits of agreement were narrower by CAS than by MS; a significant change of relative BML by CAS was outside the limits of -2.0% to 4.7%, whereas the limits by MS were -6.9% to 8.2%. The BML changed significantly in 13 knees using CAS and in 10 knees by MS. CAS was a reliable method for
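    A hedged sketch of the threshold-based segmentation described above is given below in Python: voxels whose STIR signal exceeds a threshold derived from normal marrow are counted and expressed relative to the compartment volume. The arrays, the reference region, and the mean + 2 SD threshold rule are assumptions for illustration.

        # Sketch of threshold-based BML quantification on a tiny synthetic volume.
        import numpy as np

        rng = np.random.default_rng(3)
        stir = rng.normal(loc=100, scale=10, size=(20, 20, 20))       # synthetic STIR signal
        stir[5:10, 5:10, 5:10] += 60                                   # a bright "lesion"
        medial_mask = np.ones_like(stir, dtype=bool)                   # bone region of interest

        normal_marrow = stir[15:, 15:, 15:]                            # reference (normal) marrow
        threshold = normal_marrow.mean() + 2 * normal_marrow.std()     # one possible threshold rule

        bml_voxels = np.logical_and(medial_mask, stir > threshold)
        relative_bml = 100 * bml_voxels.sum() / medial_mask.sum()
        print(f"BML volume = {bml_voxels.sum()} voxels ({relative_bml:.1f}% of the compartment)")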

  16. Health Plan Performance Measurement within Medicare Subvention.

    DTIC Science & Technology

    1998-06-01

    the causes of poor performance (Siren & Laffel, 1996). Although outcomes measures such as nosocomial infection rates, admission rates for select...defined. Traditional outcomes measures include infection rates, morbidity, and mortality. The problem with these traditional measures is... Maternal/Child Care Indicators; Nursing Staffing Indicators; Outcome Indicators; Technical Outcomes; Plan Performance; Stability of Health Plan

  17. Quantitative measurement of adhesion of ink on plastic films with a Nano Indenter and a Scanning Probe Microscope

    NASA Astrophysics Data System (ADS)

    Shen, Weidian

    2005-03-01

    Plastic film packaging is widely used these days, especially in the convenience food industry, due to its flexibility, boilability, and microwavability. Almost every package is printed with ink. The adhesion of ink on plastic films merits increasing attention to ensure quality packaging. However, inks and plastic films are polymeric materials with complicated molecular structures. The thickness of the jelly-like ink is only 500 nm or less, and the thickness of the soft and flexible film is no more than 50 μm, which makes the quantitative measurement of their adhesion very challenging. Up to now, no scientific quantitative measurement method for the adhesion of ink on plastic films has been documented. We have tried a technique in which a Nano-Indenter and a Scanning Probe Microscope were used to quantitatively evaluate the adhesion strength of ink deposited on plastic films, as well as to examine the configurations of adhesion failure. It was helpful in better understanding the adhesion mechanism, thus giving direction as to how to improve the adhesion.

  18. A novel method for morphological pleomorphism and heterogeneity quantitative measurement: Named cell feature level co-occurrence matrix.

    PubMed

    Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro

    2016-01-01

    Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological analysis approach to pathology diagnosis, which can connect these molecular data and clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts still remain largely non-quantitative and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using the cell-level co-occurrence matrix. Our method is based on the widely used Gray-level co-occurrence matrix (GLCM), where relations between neighboring pixel intensity levels are captured into a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In the pathological tissue image, through image processing techniques, each nucleus can be measured, and each nucleus has its own measurable features such as nucleus size, roundness, contour length, and intra-nucleus texture data (GLCM is one of the methods). By analogy with GLCM, each nucleus in the tissue image corresponds to one pixel. In this approach, the most important point is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions. In each image, pleomorphism and heterogeneity are then determined quantitatively. For our method, one pixel corresponds to one nucleus feature, and we therefore named our method Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features. CFLCM is shown to be a useful quantitative method for pleomorphism and heterogeneity on histopathological image
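    The following Python sketch illustrates the co-occurrence idea in a simplified form: one nucleus-level feature is discretized, co-occurrences are accumulated over nearest-neighbour nuclei, and a Haralick-style contrast value is computed. The feature, the single-nearest-neighbour rule, and the bin count are assumptions and do not reproduce the paper's three neighborhood definitions.

        # Simplified cell-feature-level co-occurrence matrix on synthetic nuclei.
        import numpy as np

        rng = np.random.default_rng(7)
        centers = rng.uniform(0, 100, size=(50, 2))           # nucleus centroids
        areas = rng.normal(60, 15, size=50)                   # one nucleus-level feature

        levels = 8
        bins = np.quantile(areas, np.linspace(0, 1, levels + 1)[1:-1])
        codes = np.digitize(areas, bins)                      # discretized feature levels (0..7)

        cflcm = np.zeros((levels, levels))
        for i in range(len(centers)):
            d = np.linalg.norm(centers - centers[i], axis=1)
            d[i] = np.inf
            j = int(np.argmin(d))                             # nearest-neighbour nucleus
            cflcm[codes[i], codes[j]] += 1

        p = cflcm / cflcm.sum()                               # normalized co-occurrence matrix
        ii, jj = np.meshgrid(range(levels), range(levels), indexing="ij")
        contrast = float(np.sum(p * (ii - jj) ** 2))          # Haralick-style "contrast"
        print(f"CFLCM contrast (heterogeneity proxy) = {contrast:.2f}")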

  19. Quantitative autistic trait measurements index background genetic risk for ASD in Hispanic families.

    PubMed

    Page, Joshua; Constantino, John Nicholas; Zambrana, Katherine; Martin, Eden; Tunc, Ilker; Zhang, Yi; Abbacchi, Anna; Messinger, Daniel

    2016-01-01

    Recent studies have indicated that quantitative autistic traits (QATs) of parents reflect inherited liabilities that may index background genetic risk for clinical autism spectrum disorder (ASD) in their offspring. Moreover, preferential mating for QATs has been observed as a potential factor in concentrating autistic liabilities in some families across generations. Heretofore, intergenerational studies of QATs have focused almost exclusively on Caucasian populations; the present study explored these phenomena in a well-characterized Hispanic population. The present study examined QAT scores in siblings and parents of 83 Hispanic probands meeting research diagnostic criteria for ASD, and 64 non-ASD controls, using the Social Responsiveness Scale-2 (SRS-2). Ancestry of the probands was characterized by genotype, using information from 541,929 single nucleotide polymorphic markers. In families of Hispanic children with an ASD diagnosis, the pattern of quantitative trait correlations observed between ASD-affected children and their first-degree relatives (ICCs on the order of 0.20), between unaffected first-degree relatives in ASD-affected families (sibling/mother ICC = 0.36; sibling/father ICC = 0.53), and between spouses (mother/father ICC = 0.48) were in keeping with the influence of transmitted background genetic risk and strong preferential mating for variation in quantitative autistic trait burden. Results from analysis of ancestry-informative genetic markers among probands in this sample were consistent with that from other Hispanic populations. Quantitative autistic traits represent measurable indices of inherited liability to ASD in Hispanic families. The accumulation of autistic traits occurs within generations, between spouses, and across generations, among Hispanic families affected by ASD. The occurrence of preferential mating for QATs, the magnitude of which may vary across cultures, constitutes a mechanism by which background genetic liability

  20. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    ERIC Educational Resources Information Center

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  1. Quantitative measurements of localized density variations in cylindrical tablets using X-ray microtomography.

    PubMed

    Busignies, Virginie; Leclerc, Bernard; Porion, Patrice; Evesque, Pierre; Couarraze, Guy; Tchoreloff, Pierre

    2006-08-01

    Direct compaction is a complex process that results in a density distribution inside the tablets which is often heterogeneous. Therefore, the density variations may affect the compact properties. A quantitative analysis of this phenomenon is still lacking. Recently, X-ray microtomography has been successfully used in pharmaceutical development to study qualitatively the impact of tablet shape and break-line on the density of pharmaceutical tablets. In this study, we evaluate the density profile in microcrystalline cellulose (Vivapur 12) compacts obtained at different mean porosities (ranging from 7.7% to 33.5%) using the X-ray tomography technique. First, the validity of the Beer-Lambert law is studied. Then, density calibration is performed, and density maps of cylindrical tablets are obtained and visualized using a colour-scale calibration plot, a process which is explained. As expected, considerable heterogeneity in density is observed and quantified. The higher densities in the peripheral region were particularly investigated and appraised in regard to the lower densities observed in the middle of the tablet. The results also underline that, in the case of pharmaceutical tablets, it is important to differentiate between the mechanical properties representative of the total tablet volume and the mechanical properties that only characterize the tablet surface, such as Brinell hardness measurements.

  2. Failure to Integrate Quantitative Measurement Methods of Ocular Inflammation Hampers Clinical Practice and Trials on New Therapies for Posterior Uveitis.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc

    2017-05-01

    Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but also in parallel precise and quantitative measurement methods developed allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint; whereas a whole array of precise, quantitative, and objective modalities are available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.

  3. Scalable Performance Measurement and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd

    2009-01-01

    Concurrency levels in large-scale, distributed-memory supercomputers are rising exponentially. Modern machines may contain 100,000 or more microprocessor cores, and the largest of these, IBM's Blue Gene/L, contains over 200,000 cores. Future systems are expected to support millions of concurrent tasks. In this dissertation, we focus on efficient techniques for measuring and analyzing the performance of applications running on very large parallel machines. Tuning the performance of large-scale applications can be a subtle and time-consuming task because application developers must measure and interpret data from many independent processes. While the volume of the raw data scales linearly with the number of tasks in the running system, the number of tasks is growing exponentially, and data for even small systems quickly becomes unmanageable. Transporting performance data from so many processes over a network can perturb application performance and make measurements inaccurate, and storing such data would require a prohibitive amount of space. Moreover, even if it were stored, analyzing the data would be extremely time-consuming. In this dissertation, we present novel methods for reducing performance data volume. The first draws on multi-scale wavelet techniques from signal processing to compress systemwide, time-varying load-balance data. The second uses statistical sampling to select a small subset of running processes to generate low-volume traces. A third approach combines sampling and wavelet compression to stratify performance data adaptively at run-time and to reduce further the cost of sampled tracing. We have integrated these approaches into Libra, a toolset for scalable load-balance analysis. We present Libra and show how it can be used to analyze data from large scientific applications scalably.

  4. Environmental Health Practice: Statistically Based Performance Measurement

    PubMed Central

    Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.

    2007-01-01

    Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
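    The statistical comparison described above can be sketched as follows in Python: a Fisher exact test on baseline versus post-intervention compliance counts in each category, with a Bonferroni-adjusted significance level; the counts are invented and scipy is assumed available.

        # Hedged sketch: Fisher's exact test per performance category with a
        # Bonferroni correction for the four comparisons. Counts are illustrative.
        from scipy.stats import fisher_exact

        categories = {   # category: ((baseline compliant, non-compliant), (post compliant, non-compliant))
            "occupational health": ((40, 42), (62, 20)),
            "air pollution":       ((35, 47), (58, 24)),
            "hazardous waste":     ((30, 52), (55, 27)),
            "wastewater":          ((45, 37), (60, 22)),
        }
        alpha = 0.05 / len(categories)            # Bonferroni-adjusted significance level

        for name, (pre, post) in categories.items():
            _, p = fisher_exact([list(pre), list(post)])
            print(f"{name:20s} p = {p:.4f} {'significant' if p < alpha else ''}")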

  5. Quantitative contrast-enhanced mammography for contrast medium kinetics studies

    NASA Astrophysics Data System (ADS)

    Arvanitis, C. D.; Speller, R.

    2009-10-01

    Quantitative contrast-enhanced mammography, based on a dual-energy approach, aims to extract quantitative and temporal information of the tumour enhancement after administration of iodinated vascular contrast media. Simulations using analytical expressions and optimization of critical parameters essential for the development of quantitative contrast-enhanced mammography are presented. The procedure has been experimentally evaluated using a tissue-equivalent phantom and an amorphous silicon active matrix flat panel imager. The x-ray beams were produced by a tungsten target tube and spectrally shaped using readily available materials. Measurement of iodine projected thickness in mg cm^-2 has been performed. The effect of beam hardening does not introduce nonlinearities in the measurement of iodine projected thickness for values of thicknesses found in clinical investigations. However, scattered radiation introduces significant deviations from slope equal to unity when compared with the actual iodine projected thickness. Scatter correction before the analysis of the dual-energy images provides accurate iodine projected thickness measurements. At 10% of the exposure used in clinical mammography, signal-to-noise ratios in excess of 5 were achieved for iodine projected thicknesses less than 3 mg cm^-2 within a 4 cm thick phantom. For the extraction of temporal information, a limited number of low-dose images were used with the phantom incorporating a flow of iodinated contrast medium. The results suggest that spatial and temporal information of iodinated contrast media can be used to indirectly measure the tumour microvessel density and determine its uptake and washout from breast tumours. The proposed method can significantly improve tumour detection in dense breasts. Its application to perform in situ x-ray biopsy and assessment of the oncolytic effect of anticancer agents is foreseeable.
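    To illustrate the dual-energy principle underlying the iodine measurement, the Python sketch below solves a two-material decomposition: the log attenuations at the low and high energies are written as linear combinations of iodine and tissue projected thicknesses and solved as a 2x2 system. All coefficients and intensities are assumed values, not the paper's calibration.

        # Hedged sketch of a dual-energy decomposition for iodine projected thickness.
        import numpy as np

        # effective mass attenuation coefficients [iodine, tissue] at the two energies (cm^2/g, assumed)
        M = np.array([[28.0, 0.25],     # low-energy beam
                      [ 6.0, 0.20]])    # high-energy beam

        i0 = np.array([1.0e6, 1.0e6])    # unattenuated intensities (low, high)
        i  = np.array([3.48e5, 4.44e5])  # measured intensities behind phantom + iodine

        log_att = np.log(i0 / i)                      # [ln(I0/I)_low, ln(I0/I)_high]
        iodine_t, tissue_t = np.linalg.solve(M, log_att)
        print(f"iodine projected thickness ~ {1000 * iodine_t:.1f} mg/cm^2")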

  6. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  7. [Measurement and performance analysis of functional neural network].

    PubMed

    Li, Shan; Liu, Xinyu; Chen, Yan; Wan, Hong

    2018-04-01

    The measurement of networks is one of the important problems in using complex network theory to resolve the information-processing mechanisms of neuronal populations. For the quantitative measurement of functional neural networks, the relationships between four measurement indexes (the clustering coefficient, the global efficiency, the characteristic path length and the transitivity) and the network topology were analyzed. A spike-based functional neural network was then established, and the simulation results showed that the measured network could represent the original neural connections among neurons. Building on this work, the encoding of the pigeon's motion behaviors by the functional neural network in the nidopallium caudolaterale (NCL) was studied. We found that the NCL functional neural network effectively encoded the motion behaviors of the pigeon, and that there were significant differences in the four indexes among the left-turning, forward and right-turning behaviors. Overall, the method for establishing spike-based functional neural networks is feasible, and it is an effective tool for parsing the brain's information-processing mechanisms.
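    A minimal sketch of the four measurement indexes named above, computed here with NetworkX (which the study does not necessarily use) on a toy undirected graph standing in for a measured functional network.

    ```python
    # Compute clustering coefficient, global efficiency, characteristic path length and
    # transitivity on a placeholder graph (nodes = neurons, edges = functional links).
    import networkx as nx

    G = nx.erdos_renyi_graph(n=30, p=0.15, seed=1)   # stand-in for a measured network

    metrics = {
        "clustering coefficient":     nx.average_clustering(G),
        "global efficiency":          nx.global_efficiency(G),
        "characteristic path length": nx.average_shortest_path_length(G)
                                      if nx.is_connected(G) else float("nan"),
        "transitivity":               nx.transitivity(G),
    }
    for name, value in metrics.items():
        print(f"{name}: {value:.3f}")
    ```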

  8. Louisiana Airport System Plan : measures of performance.

    DOT National Transportation Integrated Search

    1992-07-01

    This report applies measures of performance to gauge the extent to which the airport system achieves its goals and objectives by the implementation of the Airport System Plan. Measures of Performance are expressed as percentages of the given demograp...

  9. Reconsidering the measurement of ancillary service performance.

    PubMed

    Griffin, D T; Rauscher, J A

    1987-08-01

    Prospective payment reimbursement systems have forced hospitals to review their costs more carefully. The result of the increased emphasis on costs is that many hospitals use costs, rather than margin, to judge the performance of ancillary services. However, arbitrary selection of performance measures for ancillary services can result in managerial decisions contrary to hospital objectives. Managerial accounting systems provide models which assist in the development of performance measures for ancillary services. Selection of appropriate performance measures provides managers with the incentive to pursue goals congruent with those of the hospital overall. This article reviews the design and implementation of managerial accounting systems, and considers the impact of prospective payment systems and proposed changes in capital reimbursement on this process.

  10. Performance Theories for Sentence Coding: Some Quantitative Models

    ERIC Educational Resources Information Center

    Aaronson, Doris; And Others

    1977-01-01

    This study deals with the patterns of word-by-word reading times over a sentence when the subject must code the linguistic information sufficiently for immediate verbatim recall. A class of quantitative models is considered that would account for reading times at phrase breaks. (Author/RM)

  11. Quantitative measurement of intact alpha-synuclein proteoforms from post-mortem control and Parkinson's disease brain tissue by intact protein mass spectrometry.

    PubMed

    Kellie, John F; Higgs, Richard E; Ryder, John W; Major, Anthony; Beach, Thomas G; Adler, Charles H; Merchant, Kalpana; Knierman, Michael D

    2014-07-23

    A robust top-down proteomics method is presented for profiling alpha-synuclein species from autopsied human frontal cortex brain tissue from Parkinson's cases and controls. The method was used to test the hypothesis that pathology-associated brain tissue will have a different profile of post-translationally modified alpha-synuclein than the control samples. Validation of the sample processing steps, mass spectrometry-based measurements, and data processing steps was performed. The intact protein quantitation method features extraction and integration of m/z data from each charge state of a detected alpha-synuclein species and fitting of the data to a simple linear model which accounts for concentration and charge state variability. The quantitation method was validated with serial dilutions of intact protein standards. Using the method on the human brain samples, several previously unreported modifications in alpha-synuclein were identified. Low levels of phosphorylated alpha-synuclein were detected in brain tissue fractions enriched for Lewy body pathology, and the difference between PD cases and controls was marginally significant (p = 0.03).
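    As a rough illustration of the kind of linear model described above (a per-charge-state response factor relating integrated intensity to concentration), the sketch below fits an invented dilution series by least squares; it is not the published model or data.

    ```python
    # Hedged sketch: fit integrated charge-state intensities to a linear model with a
    # per-charge-state response factor, intensity[sample, z] ~ response[z] * concentration.
    import numpy as np

    concentrations = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # serial dilution (arbitrary units)
    # rows = dilution points, columns = charge states; invented intensities
    intensities = np.array([
        [ 12,  30,  25,  15,   6],
        [ 25,  61,  49,  31,  12],
        [ 49, 119, 101,  59,  25],
        [101, 242, 199, 121,  48],
        [198, 481, 402, 238, 101],
    ], dtype=float)

    # least-squares slope per charge state (regression forced through the origin)
    response = (concentrations @ intensities) / (concentrations @ concentrations)

    # quantify an unknown sample: average the per-charge-state estimates
    unknown = np.array([60, 148, 122, 74, 30], dtype=float)
    estimates = unknown / response
    print("per-charge-state estimates:", np.round(estimates, 2))
    print("consensus concentration:", round(float(estimates.mean()), 2))
    ```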

  12. From mission to measures: performance measure development for a Teen Pregnancy Prevention Program.

    PubMed

    Farb, Amy Feldman; Burrus, Barri; Wallace, Ina F; Wilson, Ellen K; Peele, John E

    2014-03-01

    The Office of Adolescent Health (OAH) sought to create a comprehensive set of performance measures to capture the performance of the Teen Pregnancy Prevention (TPP) program. This performance measurement system needed to provide measures that could be used internally (by both OAH and the TPP grantees) for management and program improvement as well as externally to communicate the program's progress to other interested stakeholders and Congress. This article describes the selected measures and outlines the considerations behind the TPP measurement development process. Issues faced, challenges encountered, and lessons learned have broad applicability for other federal agencies and, specifically, for TPP programs interested in assessing their own performance and progress. Published by Elsevier Inc.

  13. 26 CFR 801.2 - Measuring organizational performance.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 20 2010-04-01 2010-04-01 false Measuring organizational performance. 801.2 Section 801.2 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INTERNAL REVENUE PRACTICE BALANCED SYSTEM FOR MEASURING ORGANIZATIONAL AND EMPLOYEE PERFORMANCE WITHIN THE INTERNAL...

  14. Quantitative Susceptibility Mapping after Sports-Related Concussion.

    PubMed

    Koch, K M; Meier, T B; Karr, R; Nencka, A S; Muftuler, L T; McCrea, M

    2018-06-07

    Quantitative susceptibility mapping using MR imaging can assess changes in brain tissue structure and composition. This report presents preliminary results demonstrating changes in tissue magnetic susceptibility after sports-related concussion. Longitudinal quantitative susceptibility mapping metrics were produced from imaging data acquired from cohorts of concussed and control football athletes. One hundred thirty-six quantitative susceptibility mapping datasets were analyzed across 3 separate visits (24 hours after injury, 8 days postinjury, and 6 months postinjury). Longitudinal quantitative susceptibility mapping group analyses were performed on stability-thresholded brain tissue compartments and selected subregions. Clinical concussion metrics were also measured longitudinally in both cohorts and compared with the measured quantitative susceptibility mapping. Statistically significant increases in white matter susceptibility were identified in the concussed athlete group during the acute (24 hour) and subacute (day 8) period. These effects were most prominent at the 8-day visit but recovered and showed no significant difference from controls at the 6-month visit. The subcortical gray matter showed no statistically significant group differences. Observed susceptibility changes after concussion appeared to outlast self-reported clinical recovery metrics at a group level. At an individual subject level, susceptibility increases within the white matter showed statistically significant correlations with return-to-play durations. The results of this preliminary investigation suggest that sports-related concussion can induce physiologic changes to brain tissue that can be detected using MR imaging-based magnetic susceptibility estimates. In group analyses, the observed tissue changes appear to persist beyond those detected on clinical outcome assessments and were associated with return-to-play duration after sports-related concussion. © 2018 by American Journal of

  15. EXTRACTION AND QUANTITATIVE ANALYSIS OF ELEMENTAL SULFUR FROM SULFIDE MINERAL SURFACES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY. (R826189)

    EPA Science Inventory

    A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...

  16. Quantitative Motor Performance and Sleep Benefit in Parkinson Disease.

    PubMed

    van Gilst, Merel M; van Mierlo, Petra; Bloem, Bastiaan R; Overeem, Sebastiaan

    2015-10-01

    Many people with Parkinson disease experience "sleep benefit": temporarily improved mobility upon awakening. Here we used quantitative motor tasks to assess the influence of sleep on motor functioning in Parkinson disease. Eighteen Parkinson patients with and 20 without subjective sleep benefit and 20 healthy controls participated. Before and directly after a regular night sleep and an afternoon nap, subjects performed the timed pegboard dexterity task and quantified finger tapping task. Subjective ratings of motor functioning and mood/vigilance were included. Sleep was monitored using polysomnography. On both tasks, patients were overall slower than healthy controls (night: F(2,55) = 16.938, P < 0.001; nap: F(2,55) = 15.331, P < 0.001). On the pegboard task, there was a small overall effect of night sleep (F(1,55) = 9.695, P = 0.003); both patients and controls were on average slightly slower in the morning. However, in both tasks there was no sleep × group interaction for either nighttime sleep or the afternoon nap. There was a modest correlation between the score on the pegboard task and self-rated motor symptoms among patients (rho = 0.233, P = 0.004). No correlations between task performance and mood/vigilance or sleep time/efficiency were found. A positive effect of sleep on motor function is commonly reported by Parkinson patients. Here we show that the subjective experience of sleep benefit is not paralleled by an actual improvement in motor functioning. Sleep benefit therefore appears to be a subjective phenomenon and not a Parkinson-specific reduction in symptoms. © 2015 Associated Professional Sleep Societies, LLC.

  17. Basic quantitative polymerase chain reaction using real-time fluorescence measurements.

    PubMed

    Ares, Manuel

    2014-10-01

    This protocol uses quantitative polymerase chain reaction (qPCR) to measure the number of DNA molecules containing a specific contiguous sequence in a sample of interest (e.g., genomic DNA or cDNA generated by reverse transcription). The sample is subjected to fluorescence-based PCR amplification and, theoretically, during each cycle, two new duplex DNA molecules are produced for each duplex DNA molecule present in the sample. The progress of the reaction during PCR is evaluated by measuring the fluorescence of dsDNA-dye complexes in real time. In the early cycles, DNA duplication is not detected because inadequate amounts of DNA are made. At a certain threshold cycle, DNA-dye complexes double each cycle for 8-10 cycles, until the DNA concentration becomes so high and the primer concentration so low that the reassociation of the product strands blocks efficient synthesis of new DNA and the reaction plateaus. There are two types of measurements: (1) the relative change of the target sequence compared to a reference sequence and (2) the determination of molecule number in the starting sample. The first requires a reference sequence, and the second requires a sample of the target sequence with known numbers of the molecules of sequence to generate a standard curve. By identifying the threshold cycle at which a sample first begins to accumulate DNA-dye complexes exponentially, an estimation of the numbers of starting molecules in the sample can be extrapolated. © 2014 Cold Spring Harbor Laboratory Press.
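    The two quantification modes described in the protocol can be illustrated with a small sketch: extrapolating starting copy number from a standard curve of threshold cycles, and computing a relative fold change under the assumption of doubling each cycle (the delta-delta-Ct convention). The Ct values, slope and intercept below are invented.

    ```python
    # (1) absolute copy number from a hypothetical standard curve of Ct values,
    # (2) relative quantification against a reference sequence (2^-ddCt).

    def copies_from_standard_curve(ct, slope, intercept):
        """Extrapolate starting copies from Ct using log10(copies) = (Ct - intercept) / slope."""
        return 10 ** ((ct - intercept) / slope)

    # Hypothetical standard curve: slope ~ -3.32 corresponds to ~100% PCR efficiency
    print(copies_from_standard_curve(ct=25.0, slope=-3.32, intercept=38.0))

    def relative_quantity(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
        """Fold change of target vs. reference sequence, assuming doubling each cycle."""
        ddct = (ct_target_sample - ct_ref_sample) - (ct_target_control - ct_ref_control)
        return 2 ** (-ddct)

    print(relative_quantity(24.0, 20.0, 26.5, 20.2))   # ~4.9-fold higher in the sample
    ```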

  18. Strategic performance management: development of a performance measurement system at the Mayo Clinic.

    PubMed

    Curtright, J W; Stolp-Smith, S C; Edell, E S

    2000-01-01

    Managing and measuring performance become exceedingly complex as healthcare institutions evolve into integrated health systems comprised of hospitals, outpatient clinics and surgery centers, nursing homes, and home health services. Leaders of integrated health systems need to develop a methodology and system that align organizational strategies with performance measurement and management. To meet this end, multiple healthcare organizations embrace the performance-indicators reporting system known as a "balanced scorecard" or a "dashboard report." This discrete set of macrolevel indicators gives senior management a fast but comprehensive glimpse of the organization's performance in meeting its quality, operational, and financial goals. The leadership of outpatient operations for Mayo Clinic in Rochester, Minnesota built on this concept by creating a performance management and measurement system that monitors and reports how well the organization achieves its performance goals. Internal stakeholders identified metrics to measure performance in each key category. Through these metrics, the organization links Mayo Clinic's vision, primary value, core principles, and day-to-day operations by monitoring key performance indicators on a weekly, monthly, or quarterly basis.

  19. Quantitative Laughter Detection, Measurement, and Classification-A Critical Survey.

    PubMed

    Cosentino, Sarah; Sessa, Salvatore; Takanishi, Atsuo

    2016-01-01

    The study of human nonverbal social behaviors has taken a more quantitative and computational approach in recent years due to the development of smart interfaces and virtual agents or robots able to interact socially. One of the most interesting nonverbal social behaviors, producing a characteristic vocal signal, is laughing. Laughter is produced in several different situations: in response to external physical, cognitive, or emotional stimuli; to negotiate social interactions; and also, pathologically, as a consequence of neural damage. For this reason, laughter has attracted researchers from many disciplines. A consequence of this multidisciplinarity is the absence of a holistic vision of this complex behavior: the methods of analysis and classification of laughter, as well as the terminology used, are heterogeneous; the findings sometimes contradictory and poorly documented. This survey aims at collecting and presenting objective measurement methods and results from a variety of different studies in different fields, to contribute to build a unified model and taxonomy of laughter. This could be successfully used for advances in several fields, from artificial intelligence and human-robot interaction to medicine and psychiatry.

  20. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity

    PubMed Central

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A.; Bradford, William D.; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S.; Li, Rong

    2015-01-01

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein−based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. PMID:25823586

  1. Quantitative comparison between PGNAA measurements and MCNP calculations in view of the characterization of radioactive wastes in Germany and France

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mauerhofer, E.; Havenith, A.; Kettler, J.

    The Forschungszentrum Juelich GmbH (FZJ), together with the Aachen University Rheinisch-Westfaelische Technische Hochschule (RWTH) and the French Alternative Energies and Atomic Energy Commission (CEA Cadarache) are involved in a cooperation aiming at characterizing toxic and reactive elements in radioactive waste packages by means of Prompt Gamma Neutron Activation Analysis (PGNAA). The French and German waste management agencies have indeed defined acceptability limits concerning these elements in view of their projected geological repositories. A first measurement campaign was performed in the new Prompt Gamma Neutron Activation Analysis (PGNAA) facility called MEDINA, at FZJ, to assess the capture gamma-ray signatures of some elements of interest in large samples up to waste drums with a volume of 200 liter. MEDINA is the acronym for Multi Element Detection based on Instrumental Neutron Activation. This paper presents MCNP calculations of the MEDINA facility and quantitative comparison between measurement and simulation. Passive gamma-ray spectra acquired with a high purity germanium detector and calibration sources are used to qualify the numerical model of the crystal. Active PGNAA spectra of a sodium chloride sample measured with MEDINA then allow for qualifying the global numerical model of the measurement cell. Chlorine indeed constitutes a usual reference with reliable capture gamma-ray production data. The goal is to characterize the entire simulation protocol (geometrical model, nuclear data, and postprocessing tools) which will be used for current measurement interpretation, extrapolation of the performances to other types of waste packages or other applications, as well as for the study of future PGNAA facilities.

  2. Quantitative comparison between PGNAA measurements and MCNP calculations in view of the characterization of radioactive wastes in Germany and France

    NASA Astrophysics Data System (ADS)

    Mauerhofer, E.; Havenith, A.; Carasco, C.; Payan, E.; Kettler, J.; Ma, J. L.; Perot, B.

    2013-04-01

    The Forschungszentrum Jülich GmbH (FZJ), together with the Aachen University Rheinisch-Westfaelische Technische Hochschule (RWTH) and the French Alternative Energies and Atomic Energy Commission (CEA Cadarache) are involved in a cooperation aiming at characterizing toxic and reactive elements in radioactive waste packages by means of Prompt Gamma Neutron Activation Analysis (PGNAA) [1]. The French and German waste management agencies have indeed defined acceptability limits concerning these elements in view of their projected geological repositories. A first measurement campaign was performed in the new Prompt Gamma Neutron Activation Analysis (PGNAA) facility called MEDINA, at FZJ, to assess the capture gamma-ray signatures of some elements of interest in large samples up to waste drums with a volume of 200 liter. MEDINA is the acronym for Multi Element Detection based on Instrumental Neutron Activation. This paper presents MCNP calculations of the MEDINA facility and quantitative comparison between measurement and simulation. Passive gamma-ray spectra acquired with a high purity germanium detector and calibration sources are used to qualify the numerical model of the crystal. Active PGNAA spectra of a sodium chloride sample measured with MEDINA then allow for qualifying the global numerical model of the measurement cell. Chlorine indeed constitutes a usual reference with reliable capture gamma-ray production data. The goal is to characterize the entire simulation protocol (geometrical model, nuclear data, and postprocessing tools) which will be used for current measurement interpretation, extrapolation of the performances to other types of waste packages or other applications, as well as for the study of future PGNAA facilities.

  3. Quantitative evaluation improves specificity of myocardial perfusion SPECT in the assessment of functionally significant intermediate coronary artery stenoses: a comparative study with fractional flow reserve measurements.

    PubMed

    Sahiner, Ilgin; Akdemir, Umit O; Kocaman, Sinan A; Sahinarslan, Asife; Timurkaynak, Timur; Unlu, Mustafa

    2013-02-01

    Myocardial perfusion SPECT (MPS) is a noninvasive method commonly used for assessment of the hemodynamic significance of intermediate coronary stenoses. Fractional flow reserve (FFR) measurement is a well-validated invasive method used for the evaluation of intermediate stenoses. We aimed to determine the association between MPS and FFR findings in intermediate degree stenoses and evaluate the added value of quantification in MPS. Fifty-eight patients who underwent intracoronary pressure measurement in the catheterization laboratory to assess the physiological significance of intermediate (40-70%) left anterior descending (LAD) artery lesions, and who also underwent stress myocardial perfusion SPECT either for the assessment of an intermediate stenosis or for suspected coronary artery disease were analyzed retrospectively in the study. Quantitative analysis was performed using the 4DMSPECT program, with visual assessment performed by two experienced nuclear medicine physicians blinded to the angiographic findings. Summed stress scores (SSS) and summed difference scores (SDS) in the LAD artery territory according to the 20 segment model were calculated. A summed stress score of ≥ 3 and an SDS of ≥ 2 were assumed as pathologic, indicating significance of the lesion; a cutoff value of 0.75 was used to define abnormal FFR. Both visual and quantitative assessment results were compared with FFR using Chi-square (χ²) test. The mean time interval between two studies was 13 ± 11 days. FFR was normal in 45 and abnormal in 13 patients. Considering the FFR results as the gold standard method for assessing the significance of the lesion, the sensitivity and specificity of quantitative analysis determining the abnormal flow reserve were 85 and 84%, respectively, while visual analysis had a sensitivity of 77% and a specificity of 51%. There was a good agreement between the observers (κ = 0.856). Summed stress and difference scores demonstrated moderate inverse

  4. A literature review of quantitative indicators to measure the quality of labor and delivery care.

    PubMed

    Tripathi, Vandana

    2016-02-01

    Strengthening measurement of the quality of labor and delivery (L&D) care in low-resource countries requires an understanding of existing approaches. To identify quantitative indicators of L&D care quality and assess gaps in indicators. PubMed, CINAHL Plus, and Embase databases were searched for research published in English between January 1, 1990, and October 31, 2013, using structured terms. Studies describing indicators for L&D care quality assessment were included. Those whose abstracts contained inclusion criteria underwent full-text review. Study characteristics, including indicator selection and data sources, were extracted via a standard spreadsheet. The structured search identified 1224 studies. After abstract and full-text review, 477 were included in the analysis. Most studies selected indicators by using literature review, clinical guidelines, or expert panels. Few indicators were empirically validated; most studies relied on medical record review to measure indicators. Many quantitative indicators have been used to measure L&D care quality, but few have been validated beyond expert opinion. There has been limited use of clinical observation in quality assessment of care processes. The findings suggest the need for validated, efficient consensus indicators of the quality of L&D care processes, particularly in low-resource countries. Copyright © 2015 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Performance measurement: A tool for program control

    NASA Technical Reports Server (NTRS)

    Abell, Nancy

    1994-01-01

    Performance measurement is a management tool for planning, monitoring, and controlling the key aspects of program and project management--cost, schedule, and technical requirements. It is a means (concept and approach) to a desired end (effective program planning and control). To reach the desired end, however, performance measurement must be applied and used appropriately, with full knowledge and recognition of its power and of its limitations--what it can and cannot do for the project manager. What is the potential of this management tool? What does performance measurement do that a traditional plan vs. actual technique cannot do? Performance measurement provides an improvement over the customary comparison of how much money was spent (actual cost) vs. how much was planned to be spent based on a schedule of activities (work planned). This commonly used plan vs. actual comparison does not allow one to know from the numerical data whether the actual cost incurred was for the work that was intended to be done.
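    A hedged sketch of the idea contrasted here, in the spirit of standard earned-value arithmetic (the abstract does not name a specific formula): crediting the budgeted value of work actually performed rather than comparing spending alone against the plan. All figures are invented.

    ```python
    # Compare plan-vs-actual with an earned-value style view of the same period.
    planned_value = 500_000   # budgeted cost of work scheduled to date
    earned_value  = 420_000   # budgeted cost of work actually performed to date
    actual_cost   = 480_000   # money actually spent to date

    schedule_variance = earned_value - planned_value    # negative => behind schedule
    cost_variance     = earned_value - actual_cost      # negative => over cost

    print(f"Plan vs. actual alone: spent {actual_cost} of {planned_value} planned")
    print(f"Schedule variance: {schedule_variance}   Cost variance: {cost_variance}")
    print(f"CPI = {earned_value / actual_cost:.2f}   SPI = {earned_value / planned_value:.2f}")
    ```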

  6. Initial description of a quantitative, cross-species (chimpanzee-human) social responsiveness measure.

    PubMed

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E; Constantino, John N; Povinelli, Daniel J; Pruett, John R

    2011-05-01

    Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species (human-chimpanzee) social responsiveness measure. We translated the Social Responsiveness Scale (SRS), an instrument that quantifies human social responsiveness, into an analogous instrument for chimpanzees. We then retranslated this "Chimpanzee SRS" into a human "Cross-Species SRS" (XSRS). We evaluated three groups of chimpanzees (n = 29) with the Chimpanzee SRS and typical human children and children with autism spectrum disorder (ASD; n = 20) with the XSRS. The Chimpanzee SRS demonstrated strong interrater reliability at the three sites (ranges for individual ICCs: 0.534 to 0.866; mean ICCs: 0.851 to 0.970). As has been observed in human beings, exploratory principal components analysis of Chimpanzee SRS scores supports a single factor underlying chimpanzee social responsiveness. Human subjects' XSRS scores were fully concordant with their SRS scores (r = 0.976, p = .001) and distinguished appropriately between typical and ASD subjects. One chimpanzee known for inappropriate social behavior displayed a significantly higher score than all other chimpanzees at its site, demonstrating the scale's ability to detect impaired social responsiveness in chimpanzees. Our initial cross-species social responsiveness scale proved reliable and discriminated differences in social responsiveness across (in a relative sense) and within (in a more objectively quantifiable manner) human beings and chimpanzees. Copyright © 2011 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  7. Business process performance measurement: a structured literature review of indicators, measures and metrics.

    PubMed

    Van Looy, Amy; Shafagatova, Aygun

    2016-01-01

    Measuring the performance of business processes has become a central issue in both academia and business, since organizations are challenged to achieve effective and efficient results. Applying performance measurement models to this purpose ensures alignment with a business strategy, which implies that the choice of performance indicators is organization-dependent. Nonetheless, such measurement models generally suffer from a lack of guidance regarding the performance indicators that exist and how they can be concretized in practice. To fill this gap, we conducted a structured literature review to find patterns or trends in the research on business process performance measurement. The study also documents an extended list of 140 process-related performance indicators in a systematic manner by further categorizing them into 11 performance perspectives in order to gain a holistic view. Managers and scholars can consult the provided list to choose the indicators that are of interest to them, considering each perspective. The structured literature review concludes with avenues for further research.

  8. Performance Measures for Student Success, 2016

    ERIC Educational Resources Information Center

    North Carolina Community College System, 2016

    2016-01-01

    The Performance Measures for Student Success Report is the North Carolina Community College System's major accountability document. This annual performance report is based on data compiled from the previous year and serves to inform colleges and the public on the performance of our 58 community colleges. In 1993, the State Board of Community…

  9. Performance Measures for Student Success, 2013

    ERIC Educational Resources Information Center

    North Carolina Community College System, 2013

    2013-01-01

    The Performance Measures for Student Success Report is the North Carolina Community College System's major accountability document. This annual performance report is based on data compiled from the previous year and serves to inform colleges and the public on the performance of North Carolina's 58 community colleges. In 2010, President Scott Ralls…

  10. Performance Measures for Student Success, 2014

    ERIC Educational Resources Information Center

    North Carolina Community College System, 2014

    2014-01-01

    The Performance Measures for Student Success Report is the North Carolina Community College System's major accountability document. This annual performance report is based on data compiled from the previous year and serves to inform colleges and the public on the performance of North Carolina's 58 community colleges. In 2010, President Scott Ralls…

  11. High Resolution Qualitative and Quantitative MR Evaluation of the Glenoid Labrum

    PubMed Central

    Iwasaki, Kenyu; Tafur, Monica; Chang, Eric Y.; Statum, Sheronda; Biswas, Reni; Tran, Betty; Bae, Won C.; Du, Jiang; Bydder, Graeme M.; Chung, Christine B.

    2015-01-01

    Objective To implement qualitative and quantitative MR sequences for the evaluation of labral pathology. Methods Six glenoid labra were dissected and the anterior and posterior portions were divided into normal, mildly degenerated, or severely degenerated groups using gross and MR findings. Qualitative evaluation was performed using T1-weighted, proton density-weighted (PD), spoiled gradient echo (SPGR) and ultra-short echo time (UTE) sequences. Quantitative evaluation included T2 and T1rho measurements as well as T1, T2*, and T1rho measurements acquired with UTE techniques. Results SPGR and UTE sequences best demonstrated labral fiber structure. Degenerated labra had a tendency towards decreased T1 values, increased T2/T2* values and increased T1 rho values. T2* values obtained with the UTE sequence allowed for delineation between normal, mildly degenerated and severely degenerated groups (p<0.001). Conclusion Quantitative T2* measurements acquired with the UTE technique are useful for distinguishing between normal, mildly degenerated and severely degenerated labra. PMID:26359581
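    As a sketch of how a quantitative T2* value might be derived from multi-echo UTE signal intensities, assuming a mono-exponential decay S(TE) = S0·exp(−TE/T2*); the echo times and intensities below are invented and this is not the authors' processing pipeline.

    ```python
    # Log-linear least-squares fit of a mono-exponential decay to multi-echo signals.
    import numpy as np

    te_ms  = np.array([0.05, 2.0, 4.0, 8.0, 12.0])     # echo times in milliseconds
    signal = np.array([980., 760., 590., 360., 220.])   # ROI mean signal at each TE (invented)

    # ln(S) = ln(S0) - TE / T2*
    slope, log_s0 = np.polyfit(te_ms, np.log(signal), 1)
    t2_star = -1.0 / slope
    print(f"estimated T2* = {t2_star:.1f} ms, S0 = {np.exp(log_s0):.0f}")
    ```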

  12. Synthesis of work-zone performance measures.

    DOT National Transportation Integrated Search

    2013-09-01

    The main objective of this synthesis was to identify and summarize how agencies collect, analyze, and report different work-zone : traffic-performance measures, which include exposure, mobility, and safety measures. The researchers also examined comm...

  13. Quantitative Species Measurements in Microgravity Combustion Flames

    NASA Technical Reports Server (NTRS)

    Silver, Joel A.; Wood, William R.; Chen, Shin-Juh; Dahm, Werner J. A.; Piltch, Nancy D.

    2001-01-01

    Flame-vortex interactions are canonical configurations that can be used to study the underlying processes occurring in complicated turbulent reacting flows. The elegant simplicity of the flame-vortex interaction permits the study of these complex interactions under relatively controllable experimental configurations, in contrast to direct measurements in turbulent flames. The ability to measure and model the fundamental phenomena that occur in a turbulent flame, but with time and spatial scales which are amenable to our diagnostics, permits significant improvements in the understanding of turbulent combustion under both normal and reduced gravity conditions. In this paper, we report absolute mole fraction measurements of methane in a reacting vortex ring. These microgravity experiments are performed in the 2.2-sec drop tower at NASA Glenn Research Center. In collaboration with Drs. Chen and Dahm at the University of Michigan, measured methane absorbances are incorporated into a new model from which the temperature and concentrations of all major gases in the flame can be determined at all positions and times in the development of the vortex ring. This is the first demonstration of the ITAC (Iterative Temperature with Assumed Chemistry) approach, and the results of these computations and analyses are presented in a companion paper by Dahm and Chen at this Workshop. We believe that the ITAC approach will become a powerful tool in understanding a wide variety of combustion flames under both equilibrium and non-equilibrium conditions.

  14. Research notes : transportation planning performance measures.

    DOT National Transportation Integrated Search

    2006-06-01

    Performance measurement can be defined as an assessment of progress toward goals. In transportation planning a good measure is clearly defined, is acceptable to stakeholders, allows for economical data collection and analysis, shows how well th...

  15. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    NASA Astrophysics Data System (ADS)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered because of the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks with those perceived by pilots. This method is an ethically appealing alternative to the collision-based analysis for fast, reliable and effective safety assessment, thus possessing great potential for managing collision risks in port waters.

  16. Development of congestion performance measures using ITS information.

    DOT National Transportation Integrated Search

    2003-01-01

    The objectives of this study were to define a performance measure(s) that could be used to show congestion levels on critical corridors throughout Virginia and to develop a method to select and calculate performance measures to quantify congestion in...

  17. Laser flare photometry: a noninvasive, objective, and quantitative method to measure intraocular inflammation.

    PubMed

    Tugal-Tutkun, Ilknur; Herbort, Carl P

    2010-10-01

    Aqueous flare and cells are the two inflammatory parameters of anterior chamber inflammation resulting from disruption of the blood-ocular barriers. When examined with the slit lamp, measurement of intraocular inflammation remains subjective with considerable intra- and interobserver variations. Laser flare cell photometry is an objective quantitative method that enables accurate measurement of these parameters with very high reproducibility. Laser flare photometry allows detection of subclinical alterations in the blood-ocular barriers, identifying subtle pathological changes that could not have been recorded otherwise. With the use of this method, it has been possible to compare the effect of different surgical techniques, surgical adjuncts, and anti-inflammatory medications on intraocular inflammation. Clinical studies of uveitis patients have shown that flare measurements by laser flare photometry allowed precise monitoring of well-defined uveitic entities and prediction of disease relapse. Relationships of laser flare photometry values with complications of uveitis and visual loss further indicate that flare measurement by laser flare photometry should be included in the routine follow-up of patients with uveitis.

  18. Alternative performance measures for evaluating congestion.

    DOT National Transportation Integrated Search

    2004-04-01

    This report summarizes the results of the work performed under the project Alternative Performance Measures for Evaluating : Congestion. The study first outlines existing approaches to looking at congestion. It then builds on the previous work in the...

  19. Quantitative measurement of stream respiration using the resazurin-resorufin system

    NASA Astrophysics Data System (ADS)

    Gonzalez Pinzon, R. A.; Acker, S.; Haggerty, R.; Myrold, D.

    2011-12-01

    After three decades of active research in hydrology and stream ecology, the relationship between stream solute transport, metabolism and nutrient dynamics is still unresolved. These knowledge gaps obscure the function of stream ecosystems and how they interact with other landscape processes. To date, measuring rates of stream metabolism is accomplished with techniques that have vast uncertainties and are not spatially representative. These limitations mask the role of metabolism in nutrient processing. Clearly, more robust techniques are needed to develop mechanistic relationships that will ultimately improve our fundamental understanding of in-stream processes and how streams interact with other ecosystems. We investigated the "metabolic window of detection" of the Resazurin (Raz)-Resorufin (Rru) system (Haggerty et al., 2008, 2009). Although previous results have shown that the transformation of Raz to Rru is strongly correlated with respiration, a quantitative relationship between them is needed. We investigated this relationship using batch experiments with pure cultures (aerobic and anaerobic) and flow-through columns with incubated sediments from four different streams. The results suggest that the Raz-Rru system is a suitable approach that will enable hydrologists and stream ecologists to measure in situ and in vivo respiration at different scales, thus opening a reliable alternative to investigate how solute transport and stream metabolism control nutrient processing.

  20. Evaluation of emergency department performance - a systematic review on recommended performance and quality-in-care measures.

    PubMed

    Sørup, Christian Michel; Jacobsen, Peter; Forberg, Jakob Lundager

    2013-08-09

    Evaluation of emergency department (ED) performance remains a difficult task due to the lack of consensus on performance measures that reflect high quality, efficiency, and sustainability. To describe, map, and critically evaluate which performance measures the published literature regards as being most relevant in assessing overall ED performance. Following the PRISMA guidelines, a systematic literature review of review articles reporting accentuated ED performance measures was conducted in the databases of PubMed, Cochrane Library, and Web of Science. Study eligibility criteria include: 1) the main purpose was to discuss, analyse, or promote performance measures best reflecting ED performance, 2) the article was a review article, and 3) the article reported macro-level performance measures, thus reflecting an overall departmental performance level. A number of articles address this study's objective (n = 14 of 46 unique hits). Time intervals and patient-related measures were dominant in the identified performance measures in review articles from the US, UK, Sweden, and Canada. Length of stay (LOS), time from patient arrival to initial clinical assessment, and time from patient arrival to admission were highlighted by the majority of articles. Concurrently, "patients left without being seen" (LWBS), unplanned re-attendance within a maximum of 72 hours, mortality/morbidity, and number of unintended incidents were the most highlighted performance measures that related directly to the patient. Performance measures related to employees were only stated in two of the 14 included articles. A total of 55 ED performance measures were identified. ED time intervals were the most recommended performance measures followed by patient centeredness and safety performance measures. ED employee-related performance measures were rarely mentioned in the investigated literature. The study's results allow for advancement towards improved performance measurement and

  1. PERFORMANCE MEASUREMENT IN TRIBAL HOME VISITING: CHALLENGES AND OPPORTUNITIES.

    PubMed

    Morales, Julie R; Ferron, Cathy; Whitmore, Corrie; Reifel, Nancy; Geary, Erin; Anderson, Cyndi; Mcdaniel, Judy

    2018-05-01

    Over the last several decades, performance measurement has become an increasingly prevalent requirement among human services agencies for demonstrating program progress and achieving outcomes. In the Tribal Maternal, Infant, and Early Childhood Home Visiting Program (Tribal MIECHV), performance measurement was one of the central components of the Administration for Children and Families' cooperative agreements to tribes, urban Indian organizations, and tribal organizations. Since the inception of the Tribal MIECHV Program in 2010, the benchmark requirement was intended to be a mechanism to systematically monitor program progress and performance toward improving the quality of home-visiting programs that serve vulnerable American Indian or Alaska Native families. In this article, we examine performance measurement in the context of Tribal MIECHV, providing an overview of performance measurement, the Tribal MIECHV requirement, and how grantees experienced the requirement; we describe the existing literature on performance measurement challenges and benefits, and the specific challenges and advantages experienced by tribal grantees; and provide recommendations for performance measurement in tribal home-visiting contexts based on grantees' own experiences. This article contributes to the literature by examining performance measurement challenges and opportunities in the context of tribal communities, and provides recommendations that may inform future policy on performance measurement design and implementation in tribal communities. © 2018 Michigan Association for Infant Mental Health.

  2. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    PubMed

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers
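    One of the performance indicators cited above, the h-index, is straightforward to compute: the largest h such that at least h publications have at least h citations each. A short sketch with hypothetical citation counts:

    ```python
    # h-index of a set of publications given their citation counts.
    def h_index(citation_counts):
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for i, c in enumerate(counts, start=1):
            if c >= i:
                h = i
            else:
                break
        return h

    print(h_index([25, 8, 5, 3, 3, 2, 1, 0]))   # -> 3
    ```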

  3. Sooting turbulent jet flame: characterization and quantitative soot measurements

    NASA Astrophysics Data System (ADS)

    Köhler, M.; Geigle, K. P.; Meier, W.; Crosland, B. M.; Thomson, K. A.; Smallwood, G. J.

    2011-08-01

    Computational fluid dynamics (CFD) modelers require high-quality experimental data sets for validation of their numerical tools. Preferred features for numerical simulations of a sooting, turbulent test case flame are simplicity (no pilot flame), well-defined boundary conditions, and sufficient soot production. This paper proposes a non-premixed C2H4/air turbulent jet flame to fill this role and presents an extensive database for soot model validation. The sooting turbulent jet flame has a total visible flame length of approximately 400 mm and a fuel-jet Reynolds number of 10,000. The flame has a measured lift-off height of 26 mm which acts as a sensitive marker for CFD model validation, while this novel compiled experimental database of soot properties, temperature and velocity maps are useful for the validation of kinetic soot models and numerical flame simulations. Due to the relatively simple burner design which produces a flame with sufficient soot concentration while meeting modelers' needs with respect to boundary conditions and flame specifications as well as the present lack of a sooting "standard flame", this flame is suggested as a new reference turbulent sooting flame. The flame characterization presented here involved a variety of optical diagnostics including quantitative 2D laser-induced incandescence (2D-LII), shifted-vibrational coherent anti-Stokes Raman spectroscopy (SV-CARS), and particle image velocimetry (PIV). Producing an accurate and comprehensive characterization of a transient sooting flame was challenging and required optimization of these diagnostics. In this respect, we present the first simultaneous, instantaneous PIV, and LII measurements in a heavily sooting flame environment. Simultaneous soot and flow field measurements can provide new insights into the interaction between a turbulent vortex and flame chemistry, especially since soot structures in turbulent flames are known to be small and often treated in a statistical manner.

  4. English Value-Added Measures: Examining the Limitations of School Performance Measurement

    ERIC Educational Resources Information Center

    Perry, Thomas

    2016-01-01

    Value-added "Progress" measures are to be introduced for all English schools in 2016 as "headline" measures of school performance. This move comes despite research highlighting high levels of instability in value-added measures and concerns about the omission of contextual variables in the planned measure. This article studies…

  5. Transportation planning performance measures : appendices.

    DOT National Transportation Integrated Search

    2005-10-01

    The article is the appendices for Transportation Planning Performance Measures. : Oregon transportation plans, including the statewide Oregon Transportation Plan, and current regional transportation plans for the Portland, Salem, Eugene, and Medford ...

  6. Trucking in Georgia : freight performance measures.

    DOT National Transportation Integrated Search

    2011-11-16

    This report provides a review of the recent literature on the development of truck freight performance measures, : and specifically measures that can assist the Georgia Department of Transportation in assessing, and in tracking : from year to year, h...

  7. Performance Evaluation of the Bioneer AccuPower® HIV-1 Quantitative RT-PCR kit: Comparison with the Roche COBAS® AmpliPrep/COBAS TaqMan® HIV-1 Test Ver.2.0 for Quantification of HIV-1 Viral Load in Indonesia.

    PubMed

    Kosasih, Agus Susanto; Sugiarto, Christine; Hayuanta, Hubertus Hosti; Juhaendi, Runingsih; Setiawan, Lyana

    2017-08-08

    Measurement of viral load in human immunodeficiency virus type 1 (HIV-1) infected patients is essential for the establishment of a therapeutic strategy. Several assays based on qPCR are available for the measurement of viral load; they differ in sample volume, technology applied, target gene, sensitivity and dynamic range. The Bioneer AccuPower® HIV-1 Quantitative RT-PCR is a novel commercial kit that has not been evaluated for its performance. This study aimed to evaluate the performance of the Bioneer AccuPower® HIV-1 Quantitative RT-PCR kit. In total, 288 EDTA plasma samples from the Dharmais Cancer Hospital were analyzed with the Bioneer AccuPower® HIV-1 Quantitative RT-PCR kit and the Roche COBAS® AmpliPrep/COBAS® TaqMan® HIV-1 version 2.0 (CAP/CTM v2.0). The performance of the Bioneer assay was then evaluated against the Roche CAP/CTM v2.0. Overall, there was good agreement between the two assays. The Bioneer assay showed significant linear correlation with CAP/CTM v2.0 (R² = 0.963, p<0.001) for all samples (N=118) which were quantified by both assays, with high agreement (94.9%, 112/118) according to the Bland-Altman model. The mean difference between the quantitative values measured by Bioneer assay and CAP/CTM v2.0 was 0.11 Log10 IU/mL (SD=0.26). Based on these results, the Bioneer assay can be used to quantify HIV-1 RNA in clinical laboratories.
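    The Bland-Altman agreement check mentioned above can be sketched as follows; the paired log10 values are invented and the 1.96·SD limits of agreement are the conventional choice, not necessarily the study's exact procedure.

    ```python
    # Bland-Altman style agreement between two viral-load assays on paired log10 IU/mL values.
    import numpy as np

    assay_a = np.array([2.1, 3.4, 4.0, 4.8, 5.5, 3.9, 2.8])   # e.g. Bioneer, log10 IU/mL (invented)
    assay_b = np.array([2.0, 3.6, 3.9, 4.9, 5.2, 4.1, 2.7])   # e.g. CAP/CTM v2.0, log10 IU/mL

    diff = assay_a - assay_b
    mean_diff = diff.mean()
    sd_diff = diff.std(ddof=1)
    loa_low, loa_high = mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

    within = np.mean((diff >= loa_low) & (diff <= loa_high)) * 100
    print(f"mean difference = {mean_diff:.2f} log10 IU/mL")
    print(f"95% limits of agreement: {loa_low:.2f} to {loa_high:.2f}")
    print(f"{within:.0f}% of pairs fall within the limits of agreement")
    ```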

  8. Association between quantitative measures obtained using fluorescence-based methods and activity status of occlusal caries lesions in primary molars.

    PubMed

    Novaes, Tatiane Fernandes; Reyes, Alessandra; Matos, Ronilza; Antunes-Pontes, Laura Regina; Marques, Renata Pereira de Samuel; Braga, Mariana Minatel; Diniz, Michele Baffi; Mendes, Fausto Medeiros

    2017-05-01

    Fluorescence-based methods (FBM) can add objectiveness to diagnosis strategy for caries. Few studies, however, have focused on the evaluation of caries activity. To evaluate the association between quantitative measures obtained with FBM, clinical parameters acquired from the patients, caries detection, and assessment of activity status in occlusal surfaces of primary molars. Six hundred and six teeth from 113 children (4-14 years) were evaluated. The presence of a biofilm, caries experience, and the number of active lesions were recorded. The teeth were assessed using FBM: DIAGNOdent pen (Lfpen) and Quantitative light-induced fluorescence (QLF). As reference standard, all teeth were evaluated using the ICDAS (International Caries Detection and Assessment System) associated with clinical activity assessments. Multilevel regressions compared the FBM values and evaluated the association between the FBM measures and clinical variables related to the caries activity. The measures from the FBM were higher in cavitated lesions. Only, ∆F values distinguished active and inactive lesions. The LFpen measures were higher in active lesions, at the cavitated threshold (56.95 ± 29.60). Following regression analyses, only the presence of visible biofilm on occlusal surfaces (adjusted prevalence ratio = 1.43) and ∆R values of the teeth (adjusted prevalence ratio = 1.02) were associated with caries activity. Some quantitative measures from FBM parameters are associated with caries activity evaluation, which is similar to the clinical evaluation of the presence of visible biofilm. © 2016 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Performance measurement in surgery through the National Quality Forum.

    PubMed

    Hyder, Joseph A; Roy, Nathalie; Wakeam, Elliot; Hernandez, Roland; Kim, Simon P; Bader, Angela M; Cima, Robert R; Nguyen, Louis L

    2014-11-01

    Performance measurement has become central to surgical practice. We systematically reviewed all endorsed performance measures from the National Quality Forum, the national clearing house for performance measures in health care, to identify measures relevant to surgical practice and describe measure stewardship, measure types, and identify gaps in measurement. Performance measures current to June 2014 were categorized by denominator statement as either assessing surgical practice in specific or as part of a mixed medical and surgical population. Measures were further classified by surgical specialty, Donabedian measure type, patients, disease and events targeted, reporting eligibility, and measure stewards. Of 637 measures, 123 measures assessed surgical performance in specific and 123 assessed surgical performance in aggregate. Physician societies (51 of 123, 41.5%) were more common than government agencies (32 of 123, 26.0%) among measure stewards for surgical measures, in particular, the Society for Thoracic Surgery (n = 32). Outcomes measures rather than process measures were common among surgical measures (62 of 123, 50.4%) compared with aggregate medical/surgical measures (46 of 123, 37.4%). Among outcomes measures, death alone was the most commonly specified outcome (24 of 62, 38.7%). Only 1 surgical measure addressed patient-centered care and only 1 measure addressed hospital readmission. We found 7 current surgical measures eligible for value-based purchasing. Surgical society stewards and outcomes measure types, particularly for cardiac surgery, were well represented in the National Quality Forum. Measures addressing patient-centered outcomes and the value of surgical decision-making were not well represented and may be suitable targets for measure innovation. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  10. Beyond Math Skills: Measuring Quantitative Reasoning in Context

    ERIC Educational Resources Information Center

    Grawe, Nathan D.

    2011-01-01

    It might be argued that quantitative and qualitative analyses are merely two alternative reflections of an overarching critical thinking. For instance, just as instructors of numeracy warn their charges to consider the construction of variables, teachers of qualitative approaches caution students to define terms. Similarly, an advocate of…

  11. Capsular Outcomes After Pediatric Cataract Surgery Without Intraocular Lens Implantation: Qualitative Classification and Quantitative Measurement.

    PubMed

    Tan, Xuhua; Lin, Haotian; Lin, Zhuoling; Chen, Jingjing; Tang, Xiangchen; Luo, Lixia; Chen, Weirong; Liu, Yizhi

    2016-03-01

    The objective of this study was to investigate capsular outcomes 12 months after pediatric cataract surgery without intraocular lens implantation via qualitative classification and quantitative measurement.This study is a cross-sectional study that was approved by the institutional review board of Zhongshan Ophthalmic Center of Sun Yat-sen University in Guangzhou, China.Digital coaxial retro-illumination photographs of 329 aphakic pediatric eyes were obtained 12 months after pediatric cataract surgery without intraocular lens implantation. Capsule digital coaxial retro-illumination photographs were divided as follows: anterior capsule opening area (ACOA), posterior capsule opening area (PCOA), and posterior capsule opening opacity (PCOO). Capsular outcomes were qualitatively classified into 3 types based on the PCOO: Type I-capsule with mild opacification but no invasion into the capsule opening; Type II-capsule with moderate opacification accompanied by contraction of the ACOA and invasion to the occluding part of the PCOA; and Type III-capsule with severe opacification accompanied by total occlusion of the PCOA. Software was developed to quantitatively measure the ACOA, PCOA, and PCOO using standardized DCRPs. The relationships between the accurate intraoperative anterior and posterior capsulorhexis sizes and the qualitative capsular types were statistically analyzed.The DCRPs of 315 aphakic eyes (95.8%) of 191 children were included. Capsular outcomes were classified into 3 types: Type I-120 eyes (38.1%); Type II-157 eyes (49.8%); Type III-38 eyes (12.1%). The scores of the capsular outcomes were negatively correlated with intraoperative anterior capsulorhexis size (R = -0.572, P < 0.001), but no significant correlation with intraoperative posterior capsulorhexis size (R = -0.16, P = 0.122) was observed. The ACOA significantly decreased from Type I to Type II to Type III, the PCOA increased in size from Type I to Type II, and the PCOO increased

  12. Quantitative computed tomography measurements of emphysema for diagnosing asthma-chronic obstructive pulmonary disease overlap syndrome

    PubMed Central

    Xie, Mengshuang; Wang, Wei; Dou, Shuang; Cui, Liwei; Xiao, Wei

    2016-01-01

    Background The diagnostic criteria of asthma–COPD overlap syndrome (ACOS) are controversial. Emphysema is characteristic of COPD and usually does not exist in typical asthma patients. Emphysema in patients with asthma suggests the coexistence of COPD. Quantitative computed tomography (CT) allows repeated evaluation of emphysema noninvasively. We investigated the value of quantitative CT measurements of emphysema in the diagnosis of ACOS. Methods This study included 404 participants: 151 asthma patients, 125 COPD patients, and 128 normal control subjects. All the participants underwent pulmonary function tests and a high-resolution CT scan. Emphysema measurements were taken with the Airway Inspector software. The asthma patients were divided into high and low emphysema index (EI) groups based on the percentage of low attenuation areas less than −950 Hounsfield units. The characteristics of asthma patients with high EI were compared with those having low EI or COPD. Results The normal value of the percentage of low attenuation areas less than −950 Hounsfield units in Chinese adults aged >40 years was 2.79%±2.37%. COPD patients showed more severe emphysema and a more upper-zone-predominant distribution of emphysema than asthma patients or controls. Thirty-two (21.2%) of the 151 asthma patients had high EI. Compared with asthma patients with low EI, those with high EI were significantly older, more likely to be male, had more pack-years of smoking, had more upper-zone-predominant distribution of emphysema, and had greater airflow limitation. There were no significant differences in sex ratios, pack-years of smoking, airflow limitation, or emphysema distribution between asthma patients with high EI and COPD patients. A greater number of acute exacerbations were seen in asthma patients with high EI compared with those with low EI or COPD. Conclusion Asthma patients with high EI fulfill the features of ACOS, as described in the Global Initiative for Asthma and Global
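
    The emphysema index used here is simply the fraction of lung voxels whose attenuation falls below −950 HU. As a minimal illustration (not the Airway Inspector implementation), the sketch below computes %LAA-950 from a Hounsfield-unit volume and a lung mask; the function name, the synthetic volume and the crude mask are all hypothetical.

      import numpy as np

      def emphysema_index(hu_volume, lung_mask, threshold=-950):
          """%LAA-950: percentage of lung voxels with attenuation below `threshold` HU."""
          lung = hu_volume[lung_mask]
          return 100.0 * np.count_nonzero(lung < threshold) / lung.size

      # toy example: a synthetic HU volume and a crude density-based lung mask
      vol = np.random.default_rng(0).normal(-870, 60, size=(40, 64, 64))
      mask = vol < -500
      print(round(emphysema_index(vol, mask), 2))

    An asthma patient would fall into the high-EI group when this value exceeds the chosen normal cutoff derived from the control distribution.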

  13. Work zone performance measures pilot test.

    DOT National Transportation Integrated Search

    2011-04-01

    Currently, a well-defined and validated set of metrics to use in monitoring work zone performance does not exist. This pilot test was conducted to assist state DOTs in identifying what work zone performance measures can and should be targeted, what...

  14. Quantitative background parenchymal uptake on molecular breast imaging and breast cancer risk: a case-control study.

    PubMed

    Hruska, Carrie B; Geske, Jennifer R; Swanson, Tiffinee N; Mammel, Alyssa N; Lake, David S; Manduca, Armando; Conners, Amy Lynn; Whaley, Dana H; Scott, Christopher G; Carter, Rickey E; Rhodes, Deborah J; O'Connor, Michael K; Vachon, Celine M

    2018-06-05

    Background parenchymal uptake (BPU), which refers to the level of Tc-99m sestamibi uptake within normal fibroglandular tissue on molecular breast imaging (MBI), has been identified as a breast cancer risk factor, independent of mammographic density. Prior analyses have used subjective categories to describe BPU. We evaluate a new quantitative method for assessing BPU by testing its reproducibility, comparing quantitative results with previously established subjective BPU categories, and determining the association of quantitative BPU with breast cancer risk. Two nonradiologist operators independently performed region-of-interest analysis on MBI images viewed in conjunction with corresponding digital mammograms. Quantitative BPU was defined as a unitless ratio of the average pixel intensity (counts/pixel) within the fibroglandular tissue versus the average pixel intensity in fat. Operator agreement and the correlation of quantitative BPU measures with subjective BPU categories assessed by expert radiologists were determined. Percent density on mammograms was estimated using Cumulus. The association of quantitative BPU with breast cancer (per one unit BPU) was examined within an established case-control study of 62 incident breast cancer cases and 177 matched controls. Quantitative BPU ranged from 0.4 to 3.2 across all subjects and was on average higher in cases compared to controls (1.4 versus 1.2, p < 0.007 for both operators). Quantitative BPU was strongly correlated with subjective BPU categories (Spearman's r = 0.59 to 0.69, p < 0.0001, for each paired combination of two operators and two radiologists). Interoperator and intraoperator agreement in the quantitative BPU measure, assessed by intraclass correlation, was 0.92 and 0.98, respectively. Quantitative BPU measures showed either no correlation or weak negative correlation with mammographic percent density. In a model adjusted for body mass index and percent density, higher quantitative BPU was
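
    Because the quantitative BPU is defined as a ratio of mean counts per pixel in two regions of interest, it reduces to a few lines of array arithmetic. A hedged sketch follows, with hypothetical function and mask names; the labor-intensive step, delineating the regions against the corresponding mammogram, is not reproduced here.

      import numpy as np

      def quantitative_bpu(counts_image, fibroglandular_mask, fat_mask):
          """Unitless BPU: mean counts/pixel in fibroglandular tissue divided by mean counts/pixel in fat."""
          return counts_image[fibroglandular_mask].mean() / counts_image[fat_mask].mean()

      # toy example with synthetic count data and rectangular regions of interest
      mbi = np.random.default_rng(1).poisson(40, size=(256, 256)).astype(float)
      fg = np.zeros_like(mbi, dtype=bool); fg[60:120, 60:120] = True
      fat = np.zeros_like(mbi, dtype=bool); fat[180:220, 180:220] = True
      print(round(quantitative_bpu(mbi, fg, fat), 2))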

  15. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    PubMed

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
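
    The per-segment analysis described above is an ordinary logistic regression in which exponentiated coefficients give odds ratios per unit of each quantitative plaque measure. A sketch with simulated data is shown below (statsmodels, hypothetical variable names and effect sizes); the ROMICAT II data themselves are not reproduced.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 888                                        # number of plaque-bearing segments
      low_hu_vol = rng.gamma(2.0, 5.0, n)            # low CT attenuation plaque volume, mm^3
      remodeling = rng.normal(1.05, 0.15, n)         # remodeling index
      burden = rng.uniform(0.2, 0.8, n)              # plaque burden
      logit = -6.0 + 0.10 * low_hu_vol + 2.5 * (remodeling - 1.0) + 4.0 * burden
      high_risk = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      X = sm.add_constant(np.column_stack([low_hu_vol, remodeling, burden]))
      fit = sm.Logit(high_risk, X).fit(disp=False)
      or_per_unit = np.exp(fit.params[1:])           # odds ratios per 1 mm^3, per 1.0, per 1.0
      or_per_tenth = np.exp(0.1 * fit.params[2:])    # per 0.1 remodeling index / plaque burden
      print(or_per_unit, or_per_tenth)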

  16. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  17. Quantitative colorectal cancer perfusion measurement using dynamic contrast-enhanced multidetector-row computed tomography: effect of acquisition time and implications for protocols.

    PubMed

    Goh, Vicky; Halligan, Steve; Hugill, Jo-Ann; Gartner, Louise; Bartram, Clive I

    2005-01-01

    To determine the effect of acquisition time on quantitative colorectal cancer perfusion measurement. Dynamic contrast-enhanced computed tomography (CT) was performed prospectively in 10 patients with histologically proven colorectal cancer using 4-detector row CT (Lightspeed Plus; GE Healthcare Technologies, Waukesha, WI). Tumor blood flow, blood volume, mean transit time, and permeability were assessed for 3 acquisition times (45, 65, and 130 seconds). Mean values for all 4 perfusion parameters for each acquisition time were compared using the paired t test. Significant differences in permeability values were noted between acquisitions of 45 seconds and 65 and 130 seconds, respectively (P=0.02, P=0.007). There was no significant difference for values of blood volume, blood flow, and mean transit time between any of the acquisition times. Scan acquisitions of 45 seconds are too short for reliable permeability measurement in the abdomen. Longer acquisition times are required.
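
    Comparing perfusion parameters derived from different acquisition windows in the same ten patients is a paired comparison, so each parameter can be tested with a paired t test. A minimal sketch with invented permeability values follows; the units and numbers are illustrative only, not the study data.

      from scipy import stats

      # hypothetical permeability values for the same 10 tumours, 45 s vs 130 s acquisitions
      perm_45s = [21.3, 18.7, 25.1, 30.2, 17.9, 22.4, 27.8, 19.5, 24.0, 26.3]
      perm_130s = [14.2, 13.1, 17.8, 20.5, 12.6, 15.9, 19.7, 13.8, 16.4, 18.2]
      t_stat, p_value = stats.ttest_rel(perm_45s, perm_130s)
      print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")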

  18. The development of NEdSERV: quantitative instrumentation to measure service quality in nurse education.

    PubMed

    Roberts, P

    1999-07-01

    The political climate of health care provision and education for health care in the latter years of the 20th century is evolving from the uncertainty of newly created markets to a more clearly focused culture of collaboration, dissemination of good practice, with an increased emphasis on quality provision and its measurement. The need for provider units to prove and improve efficiency and effectiveness through evidence-based quality strategies in order to stay firmly in the market place has never been more necessary. The measurement of customer expectations and perceptions of delivered service quality is widely utilized as a basis for customer retention and business growth in both commercial and non-profit organizations. This paper describes the methodological development of NEdSERV--quantitative instrumentation designed to measure and respond to ongoing stakeholder expectations and perceptions of delivered service quality within nurse education.

  19. Quantitative measurements of in-cylinder gas composition in a controlled auto-ignition combustion engine

    NASA Astrophysics Data System (ADS)

    Zhao, H.; Zhang, S.

    2008-01-01

    One of the most effective means to achieve controlled auto-ignition (CAI) combustion in a gasoline engine is by the residual gas trapping method. The amount of residual gas and mixture composition have significant effects on the subsequent combustion process and engine emissions. In order to obtain quantitative measurements of in-cylinder residual gas concentration and air/fuel ratio, a spontaneous Raman scattering (SRS) system has been developed recently. The optimized optical SRS setups are presented and discussed. The temperature effect on the SRS measurement is considered and a method has been developed to correct for the overestimated values due to the temperature effect. Simultaneous measurements of O2, H2O, CO2 and fuel were obtained throughout the intake, compression, combustion and expansion strokes. It shows that the SRS can provide valuable data on this process in a CAI combustion engine.

  20. Performance measurement for case management: principles and objectives for developing standard measures.

    PubMed

    Howe, Rufus

    2005-01-01

    Developing standardized performance measurements for case management (CM) has become the holy grail of the field. The Council for Case Management Accountability (CCMA), a leadership committee of the Case Management Society of America (CMSA), has been grappling with the concept since early 2003. This article lays out a theoretical framework for performance measurement and then outlines the progress on a specific initiative begun by CCMA to correlate CM practices with improved outcome measures.

  1. In vivo MRS and MRSI: Performance analysis, measurement considerations and evaluation of metabolite concentration images

    NASA Astrophysics Data System (ADS)

    Vikhoff-Baaz, Barbro

    2000-10-01

    The doctoral thesis concerns the development, evaluation and performance of quality assessment methods for volume-selection methods in 31P and 1H MR spectroscopy (MRS). It also contains different aspects of the measurement procedure for 1H MR spectroscopic imaging (MRSI) with application to the human brain, image reconstruction of the MRSI images and evaluation methods for lateralization of temporal lobe epilepsy (TLE). Two complementary two-compartment phantoms and evaluation methods for quality assessment of 31P MRS in small-bore MR systems were presented. The first phantom consisted of an inner cube inside a sphere phantom where measurements with and without volume selection were compared for various VOI sizes. The multi-centre study showed that the evaluated parameters provide useful information on the performance of volume-selective MRS at the MR system. The second phantom consisted of two compartments divided by a very thin wall and was found useful for measurements of the appearance and position of the VOI profile in specific gradient directions. The second part concerned 1H MRS and MRSI of whole-body MR systems. Different factors that may degrade or complicate the MRSI measurement procedure were evaluated, e.g. volume selection performance, contamination, susceptibility and motion. Two interpolation methods for reconstruction of MRSI images were compared. Measurements and computer simulations showed that Fourier interpolation correctly visualizes the information inherent in the data set, while the results were dependent on the position of the object relative to the original matrix using cubic spline interpolation. Application of spatial filtering may improve the image representation of the data. Finally, 1H MRSI was performed on healthy volunteers and patients with temporal lobe epilepsy (TLE). Metabolite concentration images were used for lateralization of TLE, where the signal intensities in the two hemispheres were compared. Visual analysis of the

  2. Chief Complaint-Based Performance Measures: A New Focus For Acute Care Quality Measurement

    PubMed Central

    Griffey, Richard T; Pines, Jesse M.; Farley, Heather L.; Phelan, Michael P; Beach, Christopher; Schuur, Jeremiah D; Venkatesh, Arjun K.

    2014-01-01

    Performance measures are increasingly important to guide meaningful quality improvement efforts and value-based reimbursement. Populations included in most current hospital performance measures are defined by recorded diagnoses using International Classification of Diseases, Ninth Revision (ICD-9) codes in administrative claims data. While the diagnosis-centric approach allows the assessment of disease-specific quality, it fails to measure one of the primary functions of emergency department (ED) care, which involves diagnosing, risk-stratifying, and treating patients’ potentially life-threatening conditions based on symptoms (i.e. chief complaints). In this paper we propose chief complaint-based quality measures as a means to enhance the evaluation of quality and value in emergency care. We discuss the potential benefits of chief complaint-based measures, describe opportunities to mitigate challenges, propose an example measure set, and present several recommendations to advance this paradigm in ED-based performance measurement. PMID:25443989

  3. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    PubMed

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
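
    For approach (ii), one common implementation treats the laboratory's deviations from the participant consensus means across many proficiency tests as an empirical error distribution and takes their root-mean-square as the standard uncertainty. The sketch below follows that reading; the function name, coverage factor and toy values are hypothetical, and the article's exact formula is not reproduced here.

      import numpy as np

      def uncertainty_from_pt(lab_results, consensus_means, k=2):
          """Standard and expanded uncertainty from proficiency-test deviations.
          The RMS deviation combines the laboratory's bias and its scatter."""
          d = np.asarray(lab_results, float) - np.asarray(consensus_means, float)
          u = np.sqrt(np.mean(d ** 2))
          return u, k * u

      # hypothetical blood-alcohol proficiency-test history (g/100 mL)
      lab = [0.081, 0.152, 0.099, 0.249, 0.121]
      consensus = [0.080, 0.150, 0.100, 0.245, 0.120]
      print(uncertainty_from_pt(lab, consensus))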

  4. Development and implementation of an automated quantitative film digitizer quality control program

    NASA Astrophysics Data System (ADS)

    Fetterly, Kenneth A.; Avula, Ramesh T. V.; Hangiandreou, Nicholas J.

    1999-05-01

    A semi-automated, quantitative film digitizer quality control program that is based on the computer analysis of the image data from a single digitized test film was developed. This program includes measurements of the geometric accuracy, optical density performance, signal to noise ratio, and presampled modulation transfer function. The variability of the measurements was less than plus or minus 5%. Measurements were made on a group of two clinical and two laboratory laser film digitizers during a trial period of approximately four months. Quality control limits were established based on clinical necessity, vendor specifications and digitizer performance. During the trial period, one of the digitizers failed the performance requirements and was corrected by calibration.

  5. SEPARATION AND QUANTITATION OF NITROBENZENES AND THEIR REDUCTION PRODUCTS NITROANILINES AND PHENYLENEDIAMINES BY REVERSED-PHASE HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY

    EPA Science Inventory

    A reversed-phase high-performance liquid chromatographic method for the separation and quantitation of a mixture consisting of nitrobenzene, dinitrobenzene isomers, 1,3,5-trinitrobenzene and their reduction products: aniline, nitroanilines and phenylenediamines has been developed...

  6. Measuring surgical performance: A risky game?

    PubMed

    Kiernan, F; Rahman, F

    2015-08-01

    Interest in performance measurement has been driven by increased demand for better indicators of hospital quality of care. This is due in part to policy makers wishing to benchmark standards of care and implement quality improvements, and also by an increased demand for transparency and accountability. We describe the role of performance measurement, which is not only about quality improvement, but also serves as a guide in allocating resources within health systems, and between health, education, and social welfare systems. As hospital based healthcare is responsible for the most cost within the healthcare system, and treats the most severely ill of patients, it is no surprise that performance measurement has focused attention on hospital based care, and in particular on surgery, as an important means of improving quality and accountability. We are particularly concerned about the choice of mortality as an outcome measure in surgery, as this choice assumes that all mortality in surgery is preventable. In reality, as a low quality indicator of care it risks both gaming, and cream-skimming, unless accurate risk adjustment exists. Further concerns relate to the public reporting of this outcome measure. As mortality rates are an imperfect measure of quality, the reputation of individual surgeons will be threatened by the public release of this data. Significant effort should be made to communicate the results to the public in an appropriate manner. Copyright © 2015 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  7. Psychometric viability of measures of functional performance commonly used for people with dementia: a systematic review of measurement properties.

    PubMed

    Fox, Benjamin; Henwood, Timothy; Keogh, Justin; Neville, Christine

    2016-08-01

    Confidence in findings can only be drawn from measurement tools that have sound psychometric properties for the population with which they are used. Within a dementia specific population, measures of physical function have been poorly justified in exercise intervention studies, with justification of measures based on validity or reliability studies from dissimilar clinical populations, such as people with bronchitis or healthy older adults without dementia. To review the reliability and validity of quantitative measures of pre-identified physical function, as commonly used within exercise intervention literature for adults with dementia. Participants were adults, aged 65 years and older, with a confirmed medical diagnosis of dementia. Desired studies were observational and cross-sectional and assessed measures from a pre-identified list of measures of physical function. Studies that assessed the psychometric constructs of reliability and validity were targeted. COSMIN taxonomy was used to define reliability and validity. These included, but were not limited to, Intra-Class Correlations, Kappa, Cronbach's Alpha, Chi Squared, Standard Error of Measurement, Minimal Detectable Change and Limits of Agreement. Published material was sourced from the following four databases: MEDLINE, EMBASE, CINAHL and ISI Web of Science. Grey literature was searched for using ALOIS, Google Scholar and ProQuest. The COSMIN checklist was used to assess the methodological quality of included studies. Assessment was completed by two reviewers independently. Reliability and validity data were extracted from included studies using standardized Joanna Briggs Institute data collection forms. Extraction was completed by two reviewers. A narrative synthesis of measurement properties of the tools used to measure physical function was performed. Quantitative meta-analysis was conducted for Intra-Class Correlation Coefficients only. With respect to relative reliability, studies reporting assessed
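
    Of the statistics listed, the intra-class correlation is the one carrying the review's quantitative meta-analysis, so a worked definition may help. The sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single rater) from a subjects-by-raters matrix; the function name and toy ratings are hypothetical, and this is only one of several ICC forms the included studies may have reported.

      import numpy as np

      def icc_2_1(x):
          """ICC(2,1) for an (n subjects x k raters) array of scores."""
          x = np.asarray(x, dtype=float)
          n, k = x.shape
          grand = x.mean()
          ss_total = np.sum((x - grand) ** 2)
          ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)   # between subjects
          ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)   # between raters
          ms_r = ss_rows / (n - 1)
          ms_c = ss_cols / (k - 1)
          ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
          return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

      # toy example: five participants timed on a walk test by two raters
      ratings = np.array([[11.3, 11.5], [9.8, 10.1], [14.2, 14.0], [12.7, 13.0], [10.5, 10.4]])
      print(round(icc_2_1(ratings), 3))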

  8. A rapid and quantitative assay for measuring antibody-mediated neutralization of West Nile virus infection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierson, Theodore C.; Sanchez, Melissa D.; Puffer, Bridget A.

    2006-03-01

    West Nile virus (WNV) is a neurotropic flavivirus within the Japanese encephalitis antigenic complex that is responsible for causing West Nile encephalitis in humans. The surface of WNV virions is covered by a highly ordered icosahedral array of envelope proteins that is responsible for mediating attachment and fusion with target cells. These envelope proteins are also primary targets for the generation of neutralizing antibodies in vivo. In this study, we describe a novel approach for measuring antibody-mediated neutralization of WNV infection using virus-like particles that measure infection as a function of reporter gene expression. These reporter virus particles (RVPs) are produced by complementation of a sub-genomic replicon with WNV structural proteins provided in trans using conventional DNA expression vectors. The precision and accuracy of this approach stem from an ability to measure the outcome of the interaction between antibody and viral antigens under conditions that satisfy the assumptions of the law of mass action as applied to virus neutralization. In addition to its quantitative strengths, this approach allows the production of WNV RVPs bearing the prM-E proteins of different WNV strains and mutants, offering considerable flexibility for the study of the humoral immune response to WNV in vitro. WNV RVPs are capable of only a single round of infection, can be used under BSL-2 conditions, and offer a rapid and quantitative approach for detecting virus entry and its inhibition by neutralizing antibody.
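
    Reporter-based neutralization data of this kind are typically reduced to a dose-response curve, from which an EC50 (and, if desired, a Hill slope) is estimated. A hedged sketch follows; the antibody concentrations, infection fractions and parameter names are invented, and the original study's exact fitting model is not reproduced.

      import numpy as np
      from scipy.optimize import curve_fit

      def fraction_infected(conc, ec50, hill):
          """Fraction of reporter signal remaining at antibody concentration `conc`."""
          return 1.0 / (1.0 + (conc / ec50) ** hill)

      # hypothetical antibody dilution series (ug/mL) and normalized reporter readout
      conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
      infection = np.array([0.98, 0.95, 0.80, 0.55, 0.30, 0.12, 0.05])
      (ec50, hill), _ = curve_fit(fraction_infected, conc, infection, p0=[0.3, 1.0])
      print(f"EC50 ~ {ec50:.2f} ug/mL, Hill slope ~ {hill:.2f}")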

  9. Formative and Summative Evaluation: Related Issues in Performance Measurement.

    ERIC Educational Resources Information Center

    Wholey, Joseph S.

    1996-01-01

    Performance measurement can serve both formative and summative evaluation functions. Formative evaluation is typically more useful for government purposes whereas performance measurement is more useful than one-shot evaluations of either formative or summative nature. Evaluators should study performance measurement through case studies and…

  10. Benchmarking: your performance measurement and improvement tool.

    PubMed

    Senn, G F

    2000-01-01

    Many respected professional healthcare organizations and societies today are seeking to establish data-driven performance measurement strategies such as benchmarking. Clinicians are, however, resistant to "benchmarking" that is based on financial data alone, concerned that it may be adverse to the patients' best interests. Benchmarking of clinical procedures that uses physician's codes such as Current Procedural Terminology (CPTs) has greater credibility with practitioners. Better Performers, organizations that can perform procedures successfully at lower cost and in less time, become the "benchmark" against which other organizations can measure themselves. The Better Performers' strategies can be adopted by other facilities to save time or money while maintaining quality patient care.

  11. Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals. Research Brief

    ERIC Educational Resources Information Center

    National Center on Performance Incentives, 2008

    2008-01-01

    In "Value-Added and Other Methods for Measuring School Performance: An Analysis of Performance Measurement Strategies in Teacher Incentive Fund Proposals"--a paper presented at the February 2008 National Center on Performance Incentives research to policy conference--Robert Meyer and Michael Christian examine select performance-pay plans…

  12. Left ventricular performance in various heart diseases with or without heart failure: an appraisal by quantitative one-plane cineangiocardiography.

    PubMed

    Lien, W P; Lee, Y S; Chang, F Z; Chen, J J; Shieh, W B

    1978-01-01

    Quantitative one-plane cineangiocardiography in the right anterior oblique position for evaluation of LV performance was carried out in 62 patients with various heart diseases and in 13 subjects with normal LV. Parameters for evaluating both pump and muscle performance were derived from volume and pressure measurements. Of 31 patients with either systolic hypertension or LV myocardial diseases (coronary artery disease or idiopathic cardiomyopathy), 14 had clinical evidence of LV failure before the study. It was found that mean VCF and EF were the most sensitive indicators of impaired LV performance among the various parameters. There was a close correlation between mean VCF and EF, yet discordant changes of both parameters were noted in some patients. Furthermore, wall motion abnormalities were not infrequently observed in patients with coronary artery disease or primary cardiomyopathy. Therefore, assessment of at least three ejection properties (EF, mean VCF and wall motion abnormalities) is considered essential for a full understanding of the derangement of LV function in heart disease. This is especially true of patients with coronary artery disease. LV behavior in relation to different pathological stresses or lesions, such as chronic pressure or volume load, myocardial disease and mitral stenosis, was also studied and the possible cause of impaired LV myocardial function in mitral stenosis was discussed.
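
    The two indices highlighted above have simple definitions: ejection fraction is the fractional volume change over a beat, and mean VCF is the fractional shortening of the minor-axis dimension divided by ejection time. A small sketch with hypothetical function names and illustrative values is given below.

      def ejection_fraction(edv_ml, esv_ml):
          """EF = (end-diastolic volume - end-systolic volume) / end-diastolic volume."""
          return (edv_ml - esv_ml) / edv_ml

      def mean_vcf(edd_cm, esd_cm, ejection_time_s):
          """Mean velocity of circumferential fibre shortening, in circumferences/second."""
          return (edd_cm - esd_cm) / (edd_cm * ejection_time_s)

      print(ejection_fraction(150.0, 60.0))   # 0.60
      print(mean_vcf(5.0, 3.5, 0.28))         # ~1.07 circ/s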

  13. Measuring human performance on NASA's microgravity aircraft

    NASA Technical Reports Server (NTRS)

    Morris, Randy B.; Whitmore, Mihriban

    1993-01-01

    Measuring human performance in a microgravity environment will aid in identifying the design requirements, human capabilities, safety, and productivity of future astronauts. The preliminary understanding of the microgravity effects on human performance can be achieved through evaluations conducted onboard NASA's KC-135 aircraft. These evaluations can be performed in relation to hardware performance, human-hardware interface, and hardware integration. Measuring human performance in the KC-135 simulated environment will contribute to the efforts of optimizing the human-machine interfaces for future and existing space vehicles. However, there are limitations, such as limited number of qualified subjects, unexpected hardware problems, and miscellaneous plane movements which must be taken into consideration. Examples for these evaluations, the results, and their implications are discussed in the paper.

  14. Measuring the Unmeasurable: Upholding Rigor in Quantitative Studies of Personal and Social Development in Outdoor Adventure Education

    ERIC Educational Resources Information Center

    Scrutton, Roger; Beames, Simon

    2015-01-01

    Outdoor adventure education (OAE) has a long history of being credited with the personal and social development (PSD) of its participants. PSD is notoriously difficult to measure quantitatively, yet stakeholders demand statistical evidence that given approaches to eliciting PSD are effective in their methods. Rightly or wrongly, many stakeholders…

  15. Quantitative and Qualitative Differences in Morphological Traits Revealed between Diploid Fragaria Species

    PubMed Central

    SARGENT, DANIEL J.; GEIBEL, M.; HAWKINS, J. A.; WILKINSON, M. J.; BATTEY, N. H.; SIMPSON, D. W.

    2004-01-01

    • Background and Aims The aims of this investigation were to highlight the qualitative and quantitative diversity apparent between nine diploid Fragaria species and produce interspecific populations segregating for a large number of morphological characters suitable for quantitative trait loci analysis. • Methods A qualitative comparison of eight described diploid Fragaria species was performed and measurements were taken of 23 morphological traits from 19 accessions including eight described species and one previously undescribed species. A principal components analysis was performed on 14 mathematically unrelated traits from these accessions, which partitioned the species accessions into distinct morphological groups. Interspecific crosses were performed with accessions of species that displayed significant quantitative divergence and, from these, populations that should segregate for a range of quantitative traits were raised. • Key Results Significant differences between species were observed for all 23 morphological traits quantified and three distinct groups of species accessions were observed after the principal components analysis. Interspecific crosses were performed between these groups, and F2 and backcross populations were raised that should segregate for a range of morphological characters. In addition, the study highlighted a number of distinctive morphological characters in many of the species studied. • Conclusions Diploid Fragaria species are morphologically diverse, yet remain highly interfertile, making the group an ideal model for the study of the genetic basis of phenotypic differences between species through map-based investigation using quantitative trait loci. The segregating interspecific populations raised will be ideal for such investigations and could also provide insights into the nature and extent of genome evolution within this group. PMID:15469944
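
    The grouping step described above is a standard principal components analysis of standardized trait measurements. A minimal sketch is shown below, with a random placeholder matrix standing in for the 19 accessions by 14 traits; the real measurements are not reproduced.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # placeholder for the 19 accessions x 14 mathematically unrelated traits
      traits = np.random.default_rng(0).normal(size=(19, 14))
      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(traits))
      # accessions that cluster together in `scores` correspond to the morphological groups
      print(scores.shape)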

  16. Principles, performance, and applications of spectral reconstitution (SR) in quantitative analysis of oils by Fourier transform infrared spectroscopy (FT-IR).

    PubMed

    García-González, Diego L; Sedman, Jacqueline; van de Voort, Frederik R

    2013-04-01

    Spectral reconstitution (SR) is a dilution technique developed to facilitate the rapid, automated, and quantitative analysis of viscous oil samples by Fourier transform infrared spectroscopy (FT-IR). This technique involves determining the dilution factor through measurement of an absorption band of a suitable spectral marker added to the diluent, and then spectrally removing the diluent from the sample and multiplying the resulting spectrum to compensate for the effect of dilution on the band intensities. The facsimile spectrum of the neat oil thus obtained can then be qualitatively or quantitatively analyzed for the parameter(s) of interest. The quantitative performance of the SR technique was examined with two transition-metal carbonyl complexes as spectral markers, chromium hexacarbonyl and methylcyclopentadienyl manganese tricarbonyl. The estimation of the volume fraction (VF) of the diluent in a model system, consisting of canola oil diluted to various extents with odorless mineral spirits, served as the basis for assessment of these markers. The relationship between the VF estimates and the true volume fraction (VF(t)) was found to be strongly dependent on the dilution ratio and also depended, to a lesser extent, on the spectral resolution. These dependences are attributable to the effect of changes in matrix polarity on the bandwidth of the ν(CO) marker bands. Excellent VF(t) estimates were obtained by making a polarity correction devised with a variance-spectrum-delineated correction equation. In the absence of such a correction, SR was shown to introduce only a minor and constant bias, provided that polarity differences among all the diluted samples analyzed were minimal. This bias can be built into the calibration of a quantitative FT-IR analytical method by subjecting appropriate calibration standards to the same SR procedure as the samples to be analyzed. The primary purpose of the SR technique is to simplify preparation of diluted samples such that
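
    The arithmetic behind spectral reconstitution is a scaled subtraction: estimate the diluent volume fraction from the marker band, subtract that fraction of the diluent spectrum, and rescale the residual to compensate for dilution. The sketch below is a simplified reading of that procedure with hypothetical names; the polarity corrections and band-shape effects discussed in the abstract are ignored.

      import numpy as np

      def reconstitute(mix_spectrum, diluent_spectrum, a_marker_mix, a_marker_diluent):
          """Approximate the neat-oil spectrum from a diluted-oil FT-IR spectrum."""
          vf = a_marker_mix / a_marker_diluent          # diluent volume fraction from marker band
          neat = np.asarray(mix_spectrum) - vf * np.asarray(diluent_spectrum)
          return neat / (1.0 - vf)                      # compensate band intensities for dilution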

  17. Cognitive performance modeling based on general systems performance theory.

    PubMed

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  18. Telerobotic system performance measurement - Motivation and methods

    NASA Technical Reports Server (NTRS)

    Kondraske, George V.; Khoury, George J.

    1992-01-01

    A systems performance-based strategy for modeling and conducting experiments relevant to the design and performance characterization of telerobotic systems is described. A developmental testbed consisting of a distributed telerobotics network and initial efforts to implement the strategy described is presented. Consideration is given to the general systems performance theory (GSPT) to tackle human performance problems as a basis for: measurement of overall telerobotic system (TRS) performance; task decomposition; development of a generic TRS model; and the characterization of performance of subsystems comprising the generic model. GSPT employs a resource construct to model performance and resource economic principles to govern the interface of systems to tasks. It provides a comprehensive modeling/measurement strategy applicable to complex systems including both human and artificial components. Application is presented within the framework of a distributed telerobotics network as a testbed. Insight into the design of test protocols which elicit application-independent data is described.

  19. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  20. The book availability study as an objective measure of performance in a health sciences library.

    PubMed Central

    Kolner, S J; Welch, E C

    1985-01-01

    In its search for an objective overall diagnostic evaluation, the University of Illinois Library of the Health Sciences' Program Evaluation Committee selected a book availability measure; it is easy to administer and repeat, results are reproducible, and comparable data exist for other academic and health sciences libraries. The study followed the standard methodology in the literature with minor modifications. Patrons searching for particular books were asked to record item(s) needed and the outcome of the search. Library staff members then determined the reasons for failures in obtaining desired items. The results of the study are five performance scores. The first four represent the percentage probability of a library's operating with ideal effectiveness; the last provides an overall performance score. The scores of the Library of the Health Sciences demonstrated no unusual availability problems. The study was easy to implement and provided meaningful, quantitative, and objective data. PMID:3995202

  1. Air Force Materiel Command: A Survey of Performance Measures

    DTIC Science & Technology

    2004-03-12

    Air Force Materiel Command: A Survey of Performance Measures. Thesis, AFIT/GLM/ENS/04-10, Marcia Leonard, Capt, USAF. Approved 12 March 2004.

  2. Measuring the Internal Structure and Physical Conditions in Star and Planet Forming Clouds Cores: Towards a Quantitative Description of Cloud Evolution

    NASA Technical Reports Server (NTRS)

    Lada, Charles J.

    2004-01-01

    This grant funds a research program to use infrared extinction measurements to probe the detailed structure of dark molecular cloud cores and investigate the physical conditions which give rise to star and planet formation. The goals of this program are to acquire, reduce and analyze deep infrared and molecular-line observations of a carefully selected sample of nearby dark clouds in order to determine the detailed initial conditions for star formation from quantitative measurements of the internal structure of starless cloud cores and to quantitatively investigate the evolution of such structure through the star and planet formation process.

  3. The five traps of performance measurement.

    PubMed

    Likierman, Andrew

    2009-10-01

    Evaluating a company's performance often entails wading through a thicket of numbers produced by a few simple metrics, writes the author, and senior executives leave measurement to those whose specialty is spreadsheets. To take ownership of performance assessment, those executives should find qualitative, forward-looking measures that will help them avoid five common traps: Measuring against yourself. Find data from outside the company, and reward relative, rather than absolute, performance. Enterprise Rent-A-Car uses a service quality index to measure customers' repeat purchase intentions. Looking backward. Use measures that lead rather than lag the profits in your business. Humana, a health insurer, found that the sickest 10% of its patients account for 80% of its costs; now it offers customers incentives for early screening. Putting your faith in numbers. The soft drinks company Britvic evaluates its executive coaching program not by trying to assign it an ROI number but by tracking participants' careers for a year. Gaming your metrics. The law firm Clifford Chance replaced its single, easy-to-game metric of billable hours with seven criteria on which to base bonuses. Sticking to your numbers too long. Be precise about what you want to assess and explicit about what metrics are assessing it. Such clarity would have helped investors interpret the AAA ratings involved in the financial meltdown. Really good assessment will combine finance managers' relative independence with line managers' expertise.

  4. Single-cell quantitative HER2 measurement identifies heterogeneity and distinct subgroups within traditionally defined HER2-positive patients.

    PubMed

    Onsum, Matthew D; Geretti, Elena; Paragas, Violette; Kudla, Arthur J; Moulis, Sharon P; Luus, Lia; Wickham, Thomas J; McDonagh, Charlotte F; MacBeath, Gavin; Hendriks, Bart S

    2013-11-01

    Human epidermal growth factor receptor 2 (HER2) is an important biomarker for breast and gastric cancer prognosis and patient treatment decisions. HER2 positivity, as defined by IHC or fluorescent in situ hybridization testing, remains an imprecise predictor of patient response to HER2-targeted therapies. Challenges to correct HER2 assessment and patient stratification include intratumoral heterogeneity, lack of quantitative and/or objective assays, and differences between measuring HER2 amplification at the protein versus gene level. We developed a novel immunofluorescence method for quantitation of HER2 protein expression at the single-cell level on FFPE patient samples. Our assay uses automated image analysis to identify and classify tumor versus non-tumor cells, as well as quantitate the HER2 staining for each tumor cell. The HER2 staining level is converted to HER2 protein expression using a standard cell pellet array stained in parallel with the tissue sample. This approach allows assessment of HER2 expression and heterogeneity within a tissue section at the single-cell level. By using this assay, we identified distinct subgroups of HER2 heterogeneity within traditional definitions of HER2 positivity in both breast and gastric cancers. Quantitative assessment of intratumoral HER2 heterogeneity may offer an opportunity to improve the identification of patients likely to respond to HER2-targeted therapies. The broad applicability of the assay was demonstrated by measuring HER2 expression profiles on multiple tumor types, and on normal and diseased heart tissues. Copyright © 2013 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
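
    The conversion from per-cell staining intensity to HER2 protein expression relies on a standard curve built from cell pellets with known expression levels stained in parallel. A hedged sketch of that interpolation step is shown below; the pellet intensities and expression values are invented, and the image-analysis pipeline that produces per-cell intensities is not reproduced.

      import numpy as np

      # hypothetical standard curve from a cell-pellet array stained in parallel
      pellet_intensity = np.array([120.0, 400.0, 1500.0, 6200.0])   # mean fluorescence per cell
      pellet_her2 = np.array([2e4, 1e5, 6e5, 2e6])                  # receptors per cell

      def her2_expression(cell_intensities):
          """Map per-tumor-cell staining intensity to HER2 expression via the pellet standard curve."""
          return np.interp(cell_intensities, pellet_intensity, pellet_her2)

      print(her2_expression([300.0, 2000.0]))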

  5. Quantitative filter technique measurements of spectral light absorption by aquatic particles using a portable integrating cavity absorption meter (QFT-ICAM).

    PubMed

    Röttgers, Rüdiger; Doxaran, David; Dupouy, Cecile

    2016-01-25

    The accurate determination of light absorption coefficients of particles in water, especially in very oligotrophic oceanic areas, is still a challenging task. Concentrating aquatic particles on a glass fiber filter and using the Quantitative Filter Technique (QFT) is a common practice. Its routine application is limited by the necessary use of high performance spectrophotometers, distinct problems induced by the strong scattering of the filters and artifacts induced by freezing and storing samples. Measurements of the sample inside a large integrating sphere reduce scattering effects and direct field measurements avoid artifacts due to sample preservation. A small, portable, Integrating Cavity Absorption Meter setup (QFT-ICAM) is presented, that allows rapid measurements of a sample filter. The measurement technique takes into account artifacts due to chlorophyll-a fluorescence. The QFT-ICAM is shown to be highly comparable to similar measurements in laboratory spectrophotometers, in terms of accuracy, precision, and path length amplification effects. No spectral artifacts were observed when compared to measurement of samples in suspension, whereas freezing and storing of sample filters induced small losses of water-soluble pigments (probably phycoerythrins). Remaining problems in determining the particulate absorption coefficient with the QFT-ICAM are strong sample-to-sample variations of the path length amplification, as well as fluorescence by pigments that is emitted in a different spectral region than that of chlorophyll-a.
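
    For context, the quantity ultimately reported by a QFT-style measurement is the particulate absorption coefficient, conventionally obtained from the filter optical density, the filtered volume, the filter clearance area and a path-length amplification factor. A hedged sketch of that conversion is given below; the function name and numbers are hypothetical, and the amplification factor beta is instrument- and sample-dependent, which is exactly the residual problem the abstract mentions.

      import numpy as np

      def particulate_absorption(od_filter, volume_filtered_m3, clearance_area_m2, beta=4.0):
          """a_p (1/m) = 2.303 * OD_filter / (beta * X), with X = V / A the equivalent path length."""
          x = volume_filtered_m3 / clearance_area_m2
          return 2.303 * np.asarray(od_filter, float) / (beta * x)

      # toy numbers: 2 L filtered onto a 3.5 cm^2 clearance area, filter OD of 0.05
      print(particulate_absorption(0.05, 2e-3, 3.5e-4))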

  6. Quantitative phase microscopy for cellular dynamics based on transport of intensity equation.

    PubMed

    Li, Ying; Di, Jianglei; Ma, Chaojie; Zhang, Jiwei; Zhong, Jinzhan; Wang, Kaiqiang; Xi, Teli; Zhao, Jianlin

    2018-01-08

    We demonstrate a simple method for quantitative phase imaging of tiny transparent objects such as living cells based on the transport of intensity equation. The experiments are performed using an inverted bright field microscope upgraded with a flipping imaging module, which makes it possible to simultaneously create two laterally separated images with unequal defocus distances. This add-on module does not include any lenses or gratings and is cost-effective and easy to align. The validity of this method is confirmed by the measurement of a microlens array and human osteoblastic cells in culture, indicating its potential for dynamically measuring living cells and other transparent specimens in a quantitative, non-invasive and label-free manner.
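
    Under the usual simplifying assumption of a nearly uniform in-focus intensity, the transport-of-intensity equation reduces to a Poisson equation for the phase, which can be inverted with FFTs from two defocused images. The sketch below is one such minimal solver with hypothetical function and parameter names; it is not the authors' reconstruction code and omits their flipping-module specifics and boundary handling.

      import numpy as np

      def tie_phase(i_minus, i_plus, dz, wavelength, pixel_size, eps=1e-9):
          """Recover phase from two images defocused by -dz and +dz via the TIE,
          assuming approximately uniform in-focus intensity."""
          k = 2.0 * np.pi / wavelength
          didz = (i_plus - i_minus) / (2.0 * dz)              # axial intensity derivative
          i0 = 0.5 * (i_plus + i_minus).mean()
          ny, nx = didz.shape
          kx, ky = np.meshgrid(2 * np.pi * np.fft.fftfreq(nx, pixel_size),
                               2 * np.pi * np.fft.fftfreq(ny, pixel_size))
          k2 = kx ** 2 + ky ** 2
          rhs = -k * didz / i0                                # I0 * laplacian(phi) = -k dI/dz
          phi_hat = np.fft.fft2(rhs) / (-(k2 + eps))          # inverse Laplacian in Fourier space
          phi_hat[0, 0] = 0.0                                 # phase is defined up to a constant
          return np.real(np.fft.ifft2(phi_hat))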

  7. Identification and Quantitation of Asparagine and Citrulline Using High-Performance Liquid Chromatography (HPLC)

    PubMed Central

    Bai, Cheng; Reilly, Charles C.; Wood, Bruce W.

    2007-01-01

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine and citrulline ratios [= 3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (μMol ml−1/μMol ml−1)], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects relative concentrations of Urea Cycle intermediates, asparagine and citrulline, present in sap. Consequently, we concluded that the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides. PMID:19662174

  8. Identification and quantitation of asparagine and citrulline using high-performance liquid chromatography (HPLC).

    PubMed

    Bai, Cheng; Reilly, Charles C; Wood, Bruce W

    2007-03-28

    High-performance liquid chromatography (HPLC) analysis was used for identification of two problematic ureides, asparagine and citrulline. We report here a technique that takes advantage of the predictable delay in retention time of the co-asparagine/citrulline peak to enable both qualitative and quantitative analysis of asparagine and citrulline using the Platinum EPS reverse-phase C18 column (Alltech Associates). Asparagine alone is eluted earlier than citrulline alone, but when both of them are present in biological samples they may co-elute. HPLC retention times for asparagine and citrulline were influenced by other ureides in the mixture. We found that at various asparagine and citrulline ratios [= 3:1, 1:1, and 1:3; corresponding to 75:25, 50:50, and 25:75 (microMol ml(-1)/microMol ml(-1))], the resulting peak exhibited different retention times. Adjustment of ureide ratios as internal standards enables peak identification and quantification. Both chemicals were quantified in xylem sap samples of pecan [Carya illinoinensis (Wangenh.) K. Koch] trees. Analysis revealed that tree nickel nutrition status affects relative concentrations of Urea Cycle intermediates, asparagine and citrulline, present in sap. Consequently, we concluded that the HPLC methods presented enable qualitative and quantitative analysis of these metabolically important ureides.

  9. Effects of a veterinary student leadership program on measures of stress and academic performance.

    PubMed

    Moore, Dale A; Truscott, Marla L; St Clair, Lisa; Klingborg, Donald J

    2007-01-01

    Assuming leadership roles in veterinary student governance or club activities could be considered an added stressor for students because of the impact on time available for personal and academic activities. The study reported here evaluated the effects of participation in a leadership program and leadership activity across two classes of veterinary students on measures of stress, using the Derogatis Stress Profile (DSP), and on veterinary school academic performance, measured as annual grade-point average (GPA) over a three-year period. Program participants and their classmates completed the DSP three times across the first three years of veterinary school. On average, participating students reported self-declared stress levels that were higher and measured DSP stress levels that were lower than those of the general population. Students were more likely to assume elected or appointed leadership roles while in their first three years of the veterinary degree program if they participated in the optional leadership program and demonstrated lower stress in several dimensions. Some increased stress, as measured in some of the DSP stress dimensions, had a small but statistically significant influence on professional school GPA. The study determined that the most important predictors of students' cumulative GPA across the three-year period were the GPA from the last 45 credits of pre-veterinary coursework and their quantitative GRE scores. The results of the study indicate that neither participation in the leadership program nor taking on leadership roles within veterinary school appeared to influence veterinary school academic performance or to increase stress.

  10. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  11. Quantitative measurements of magnetic vortices using position resolved diffraction in Lorentz STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaluzec, N. J.

    2002-03-05

    A number of electron column techniques have been developed over the last forty years to permit visualization of magnetic fields in specimens. These include: Fresnel imaging, Differential Phase Contrast, Electron Holography and Lorentz STEM. In this work we have extended the LSTEM methodology using Position Resolved Diffraction (PRD) to quantitatively measure the in-plane electromagnetic fields of thin film materials. The experimental work reported herein has been carried out using the ANL AAEM HB603Z 300 kV FEG instrument. In this instrument, the electron optical column was operated in a zero field mode at the specimen, where the objective lens is turned off and the probe forming lens functions were reallocated to the C1, C2, and C3 lenses. Post specimen lenses (P1, P2, P3, P4) were used to magnify the transmitted electrons to a YAG screen, which was then optically transferred to a Hamamatsu ORCA ER CCD array. This CCD was interfaced to an EmiSpec Data Acquisition System and the data were subsequently transferred to an external computer system for detailed quantitative analysis. In Position Resolved Diffraction mode, we digitally step a focused electron probe across the region of interest of the specimen while at the same time recording the complete diffraction pattern at each point in the scan.
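
    The quantity recovered from the shift of the diffraction pattern in such measurements is the integrated in-plane induction B·t, related to the Lorentz deflection angle by B·t = θ·h/(e·λ). A back-of-the-envelope sketch with invented numbers follows; in practice the deflection angle would come from the measured pattern shift divided by the camera length.

      H = 6.626e-34           # Planck constant, J*s
      E = 1.602e-19           # elementary charge, C
      WAVELENGTH = 1.97e-12   # electron wavelength at 300 kV, m

      def integrated_induction(deflection_rad):
          """Integrated in-plane induction B*t (T*m) from the Lorentz deflection angle."""
          return deflection_rad * H / (E * WAVELENGTH)

      # hypothetical 20 microradian deflection measured from the pattern shift
      print(integrated_induction(20e-6))   # ~4.2e-8 T*m, e.g. B ~ 1 T over t ~ 42 nm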

  12. Evaluation of emergency department performance – a systematic review on recommended performance and quality-in-care measures

    PubMed Central

    2013-01-01

    Background Evaluation of emergency department (ED) performance remains a difficult task due to the lack of consensus on performance measures that reflects high quality, efficiency, and sustainability. Aim To describe, map, and critically evaluate which performance measures that the published literature regard as being most relevant in assessing overall ED performance. Methods Following the PRISMA guidelines, a systematic literature review of review articles reporting accentuated ED performance measures was conducted in the databases of PubMed, Cochrane Library, and Web of Science. Study eligibility criteria includes: 1) the main purpose was to discuss, analyse, or promote performance measures best reflecting ED performance, 2) the article was a review article, and 3) the article reported macro-level performance measures, thus reflecting an overall departmental performance level. Results A number of articles addresses this study’s objective (n = 14 of 46 unique hits). Time intervals and patient-related measures were dominant in the identified performance measures in review articles from US, UK, Sweden and Canada. Length of stay (LOS), time between patient arrival to initial clinical assessment, and time between patient arrivals to admission were highlighted by the majority of articles. Concurrently, “patients left without being seen” (LWBS), unplanned re-attendance within a maximum of 72 hours, mortality/morbidity, and number of unintended incidents were the most highlighted performance measures that related directly to the patient. Performance measures related to employees were only stated in two of the 14 included articles. Conclusions A total of 55 ED performance measures were identified. ED time intervals were the most recommended performance measures followed by patient centeredness and safety performance measures. ED employee related performance measures were rarely mentioned in the investigated literature. The study’s results allow for advancement

  13. ASUPT Automated Objective Performance Measurement System.

    ERIC Educational Resources Information Center

    Waag, Wayne L.; And Others

    To realize its full research potential, a need exists for the development of an automated objective pilot performance evaluation system for use in the Advanced Simulation in Undergraduate Pilot Training (ASUPT) facility. The present report documents the approach taken for the development of performance measures and also presents data collected…

  14. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity.

    PubMed

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A; Bradford, William D; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S; Li, Rong

    2015-03-30

Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppresses CIN, suggesting that the basal error rate of the wild-type genome is not minimized but rather may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. Copyright © 2015 Zhu et al.

  15. Usefulness of quantitative susceptibility mapping for the diagnosis of Parkinson disease.

    PubMed

    Murakami, Y; Kakeda, S; Watanabe, K; Ueda, I; Ogasawara, A; Moriya, J; Ide, S; Futatsuya, K; Sato, T; Okada, K; Uozumi, T; Tsuji, S; Liu, T; Wang, Y; Korogi, Y

    2015-06-01

    Quantitative susceptibility mapping allows overcoming several nonlocal restrictions of susceptibility-weighted and phase imaging and enables quantification of magnetic susceptibility. We compared the diagnostic accuracy of quantitative susceptibility mapping and R2* (1/T2*) mapping to discriminate between patients with Parkinson disease and controls. For 21 patients with Parkinson disease and 21 age- and sex-matched controls, 2 radiologists measured the quantitative susceptibility mapping values and R2* values in 6 brain structures (the thalamus, putamen, caudate nucleus, pallidum, substantia nigra, and red nucleus). The quantitative susceptibility mapping values and R2* values of the substantia nigra were significantly higher in patients with Parkinson disease (P < .01); measurements in other brain regions did not differ significantly between patients and controls. For the discrimination of patients with Parkinson disease from controls, receiver operating characteristic analysis suggested that the optimal cutoff values for the substantia nigra, based on the Youden Index, were >0.210 for quantitative susceptibility mapping and >28.8 for R2*. The sensitivity, specificity, and accuracy of quantitative susceptibility mapping were 90% (19 of 21), 86% (18 of 21), and 88% (37 of 42), respectively; for R2* mapping, they were 81% (17 of 21), 52% (11 of 21), and 67% (28 of 42). Pair-wise comparisons showed that the areas under the receiver operating characteristic curves were significantly larger for quantitative susceptibility mapping than for R2* mapping (0.91 versus 0.69, P < .05). Quantitative susceptibility mapping showed higher diagnostic performance than R2* mapping for the discrimination between patients with Parkinson disease and controls. © 2015 by American Journal of Neuroradiology.
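
    The record above selects its diagnostic cutoffs with the Youden index. The sketch below shows the standard calculation, scanning candidate thresholds and keeping the one that maximizes sensitivity + specificity − 1; the input arrays are placeholders, not the study's measurements.

```python
import numpy as np

# Select an optimal diagnostic cutoff with the Youden index
# (J = sensitivity + specificity - 1). `values` holds one measurement per
# subject (e.g. a susceptibility value) and `is_patient` flags true cases.

def youden_cutoff(values, is_patient):
    values = np.asarray(values, dtype=float)
    is_patient = np.asarray(is_patient, dtype=bool)
    best_j, best_cut = -1.0, None
    for cut in np.unique(values):
        pred = values > cut                              # "positive" if above cutoff
        sens = (pred & is_patient).sum() / is_patient.sum()
        spec = (~pred & ~is_patient).sum() / (~is_patient).sum()
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Illustrative call with made-up values for 6 subjects.
cut, j = youden_cutoff([0.12, 0.18, 0.22, 0.25, 0.30, 0.16],
                       [False, False, True, True, True, False])
print(cut, j)
```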

  16. QUANTITATIVE PLUTONIUM MICRODISTRIBUTION IN BONE TISSUE OF VERTEBRA FROM A MAYAK WORKER

    PubMed Central

    Lyovkina, Yekaterina V.; Miller, Scott C.; Romanov, Sergey A.; Krahenbuhl, Melinda P.; Belosokhov, Maxim V.

    2010-01-01

The purpose was to obtain quantitative data on plutonium microdistribution in different structural elements of human bone tissue for local dose assessment and dosimetric model validation. A sample of the thoracic vertebra was obtained from a former Mayak worker with a rather high plutonium burden. Additional information was obtained on occupational and exposure history, medical history, and measured plutonium content in organs. Plutonium was detected in bone sections from its fission tracks in polycarbonate film using neutron-induced autoradiography. Quantitative analysis of randomly selected microscopic fields on one of the autoradiographs was performed. Data included fission fragment tracks in different bone tissue and surface areas. Quantitative information on plutonium microdistribution in human bone tissue was obtained for the first time. From these data, the ratios of plutonium decays in bone volume to decays on bone surface in the cortical and trabecular fractions were determined to be 2.0 and 0.4, respectively. The measured ratio of decays in bone volume to decays on bone surface does not coincide with the models recommended by the International Commission on Radiological Protection for the cortical bone fraction. Biokinetic model parameters of extrapulmonary compartments might need to be adjusted after expansion of the data set on quantitative plutonium microdistribution to other bone types in humans, as well as to other cases with different exposure patterns and types of plutonium. PMID:20838087

  17. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  18. Quantitative MR imaging in fracture dating--Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

For exact age determination of bone fractures in a forensic context (e.g. in cases of child abuse), improved knowledge of the time course of the healing process and the use of non-invasive modern imaging technology are of high importance. To date, fracture dating is based on radiographic methods, determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined in this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11, ♂: 6; median age 34 ± 15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or in the collar bone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of the relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of the quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area using defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7, ♂: 5) showed an initial peak in T1 values in the fractured area (T1 = 1895 ± 607 ms), which decreased over time to a value of 1094 ± 182 ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2 = 115 ± 80 ms) and decreased to 73 ± 33 ms within 21 days after the fracture event. After that time point, no

  19. Evolution of Quantitative Measures in NMR: Quantum Mechanical qHNMR Advances Chemical Standardization of a Red Clover (Trifolium pratense) Extract

    PubMed Central

    2017-01-01

    Chemical standardization, along with morphological and DNA analysis ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513

  20. Quantitative cultures of bronchoscopically obtained specimens should be performed for optimal management of ventilator-associated pneumonia.

    PubMed

    Baselski, Vickie; Klutts, J Stacey

    2013-03-01

Ventilator-associated pneumonia (VAP) is a leading cause of health care-associated infection. It has a high rate of attributed mortality, and this mortality is increased in patients who do not receive appropriate empirical antimicrobial therapy. As a result of the overuse of broad-spectrum antimicrobials such as the carbapenems, strains of Acinetobacter, Enterobacteriaceae, and Pseudomonas aeruginosa susceptible only to polymyxins and tigecycline have emerged as important causes of VAP. Accurate diagnosis of VAP, so that appropriate discontinuation or de-escalation of antimicrobial therapy can be initiated to reduce this antimicrobial pressure, is therefore essential. Practice guidelines for the diagnosis of VAP advocate the use of bronchoalveolar lavage (BAL) fluid obtained either bronchoscopically or by the use of a catheter passed through the endotracheal tube. The CDC recommends that quantitative cultures be performed on these specimens, using ≥10^4 CFU/ml to designate a positive culture (http://www.cdc.gov/nhsn/TOC_PSCManual.html, accessed 30 October 2012). However, there is no consensus in the clinical microbiology community as to whether these specimens should be cultured quantitatively, using the aforementioned bacterial cell count to designate infection, or by a semiquantitative approach. We have asked Vickie Baselski, University of Tennessee Health Science Center, who was the lead author on one of the seminal papers on quantitative BAL fluid culture, to explain why she believes that quantitative BAL fluid cultures are the optimal strategy for VAP diagnosis. We have Stacey Klutts, University of Iowa, to advocate the semiquantitative approach.

  1. Quantitative measurement of pass-by noise radiated by vehicles running at high speeds

    NASA Astrophysics Data System (ADS)

    Yang, Diange; Wang, Ziteng; Li, Bing; Luo, Yugong; Lian, Xiaomin

    2011-03-01

It has been a challenge in the past to accurately locate and quantify the pass-by noise radiated by running vehicles. In the current work, a system based on a microphone array is developed for this purpose. An acoustic-holography method for moving sound sources is designed to handle the Doppler effect effectively in the time domain. The effective sound pressure distribution is reconstructed on the surface of a running vehicle. The method achieves high calculation efficiency and is able to quantitatively measure the sound pressure at the sound source and identify the location of the main sound source. The method is also validated by simulation experiments and by measurement tests with known moving speakers. Finally, the engine noise, tire noise, exhaust noise and wind noise of the vehicle running at different speeds are successfully identified by this method.

  2. An evidential reasoning extension to quantitative model-based failure diagnosis

    NASA Technical Reports Server (NTRS)

    Gertler, Janos J.; Anderson, Kenneth C.

    1992-01-01

    The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
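
    Dempster's rule of combination, mentioned above, has a standard generic form. The sketch below combines two basic probability assignments defined over subsets of a frame of discernment and renormalizes away the conflicting mass; the fault labels and mass values are invented for illustration and are not taken from the paper.

```python
from itertools import product

# Generic Dempster's rule of combination for two basic probability assignments
# (BPAs). Each BPA maps a frozenset of hypotheses (a subset of the frame of
# discernment) to a mass in [0, 1]; masses sum to 1. Mass assigned to empty
# intersections (conflict) is discarded and the rest is renormalised.

def combine(m1, m2):
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Illustrative masses from two diagnostic models over hypothetical faults F1-F3.
m_model1 = {frozenset({"F1"}): 0.6, frozenset({"F1", "F2"}): 0.3,
            frozenset({"F1", "F2", "F3"}): 0.1}
m_model2 = {frozenset({"F2"}): 0.5, frozenset({"F1", "F2"}): 0.4,
            frozenset({"F1", "F2", "F3"}): 0.1}
print(combine(m_model1, m_model2))
```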

  3. Unmixing of fluorescence spectra to resolve quantitative time-series measurements of gene expression in plate readers.

    PubMed

    Lichten, Catherine A; White, Rachel; Clark, Ivan B N; Swain, Peter S

    2014-02-03

    To connect gene expression with cellular physiology, we need to follow levels of proteins over time. Experiments typically use variants of Green Fluorescent Protein (GFP), and time-series measurements require specialist expertise if single cells are to be followed. Fluorescence plate readers, however, a standard in many laboratories, can in principle provide similar data, albeit at a mean, population level. Nevertheless, extracting the average fluorescence per cell is challenging because autofluorescence can be substantial. Here we propose a general method for correcting plate reader measurements of fluorescent proteins that uses spectral unmixing and determines both the fluorescence per cell and the errors on that fluorescence. Combined with strain collections, such as the GFP fusion collection for budding yeast, our methodology allows quantitative measurements of protein levels of up to hundreds of genes and therefore provides complementary data to high throughput studies of transcription. We illustrate the method by following the induction of the GAL genes in Saccharomyces cerevisiae for over 20 hours in different sugars and argue that the order of appearance of the Leloir enzymes may be to reduce build-up of the toxic intermediate galactose-1-phosphate. Further, we quantify protein levels of over 40 genes, again over 20 hours, after cells experience a change in carbon source (from glycerol to glucose). Our methodology is sensitive, scalable, and should be applicable to other organisms. By allowing quantitative measurements on a per cell basis over tens of hours and over hundreds of genes, it should increase our understanding of the dynamic changes that drive cellular behaviour.
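
    To make the unmixing idea concrete, the sketch below expresses a measured plate-reader spectrum as a non-negative combination of reference spectra (fluorescent protein plus autofluorescence) using non-negative least squares; the reference spectra, wavelengths and cell count are placeholders, and the authors' actual procedure and error model are not reproduced here.

```python
import numpy as np
from scipy.optimize import nnls

# Spectral unmixing sketch: a measured emission spectrum is modelled as a
# non-negative combination of reference spectra (e.g. GFP and cellular
# autofluorescence); the GFP coefficient divided by the cell count then gives
# a fluorescence-per-cell estimate.

def unmix(measured, references):
    """references: (n_wavelengths, n_components); measured: (n_wavelengths,)."""
    coeffs, residual = nnls(references, measured)
    return coeffs, residual

# Example with two hypothetical reference spectra sampled at 5 wavelengths.
gfp = np.array([0.2, 0.8, 1.0, 0.6, 0.3])
autofluor = np.array([1.0, 0.9, 0.7, 0.5, 0.4])
refs = np.column_stack([gfp, autofluor])
measured = 3.0 * gfp + 5.0 * autofluor        # synthetic mixed signal
coeffs, _ = unmix(measured, refs)
gfp_per_cell = coeffs[0] / 1.0e6              # divide by a (hypothetical) cell count
print(coeffs, gfp_per_cell)
```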

  4. Unmixing of fluorescence spectra to resolve quantitative time-series measurements of gene expression in plate readers

    PubMed Central

    2014-01-01

    Background To connect gene expression with cellular physiology, we need to follow levels of proteins over time. Experiments typically use variants of Green Fluorescent Protein (GFP), and time-series measurements require specialist expertise if single cells are to be followed. Fluorescence plate readers, however, a standard in many laboratories, can in principle provide similar data, albeit at a mean, population level. Nevertheless, extracting the average fluorescence per cell is challenging because autofluorescence can be substantial. Results Here we propose a general method for correcting plate reader measurements of fluorescent proteins that uses spectral unmixing and determines both the fluorescence per cell and the errors on that fluorescence. Combined with strain collections, such as the GFP fusion collection for budding yeast, our methodology allows quantitative measurements of protein levels of up to hundreds of genes and therefore provides complementary data to high throughput studies of transcription. We illustrate the method by following the induction of the GAL genes in Saccharomyces cerevisiae for over 20 hours in different sugars and argue that the order of appearance of the Leloir enzymes may be to reduce build-up of the toxic intermediate galactose-1-phosphate. Further, we quantify protein levels of over 40 genes, again over 20 hours, after cells experience a change in carbon source (from glycerol to glucose). Conclusions Our methodology is sensitive, scalable, and should be applicable to other organisms. By allowing quantitative measurements on a per cell basis over tens of hours and over hundreds of genes, it should increase our understanding of the dynamic changes that drive cellular behaviour. PMID:24495318

  5. Dissociation of performance and subjective measures of workload

    NASA Technical Reports Server (NTRS)

    Yeh, Yei-Yu; Wickens, Christopher D.

    1988-01-01

    A theory is presented to identify sources that produce dissociations between performance and subjective measures of workload. The theory states that performance is determined by (1) amount of resources invested, (2) resource efficiency, and (3) degree of competition for common resources in a multidimensional space described in the multiple-resources model. Subjective perception of workload, multidimensional in nature, increases with greater amounts of resource investment and with greater demands on working memory. Performance and subjective workload measures dissociate when greater resources are invested to improve performance of a resource-limited task; when demands on working memory are increased by time-sharing between concurrent tasks or between display elements; and when performance is sensitive to resource competition and subjective measures are more sensitive to total investment. These dissociation findings and their implications are discussed and directions for future research are suggested.

  6. Hydrograph matching method for measuring model performance

    NASA Astrophysics Data System (ADS)

    Ewen, John

    2011-09-01

Despite all the progress made over the years on developing automatic methods for analysing hydrographs and measuring the performance of rainfall-runoff models, automatic methods cannot yet match the power and flexibility of the human eye and brain. Very simple approaches are therefore being developed that mimic the way hydrologists inspect and interpret hydrographs, including the way that patterns are recognised, links are made by eye, and hydrological responses and errors are studied and remembered. In this paper, a dynamic programming algorithm originally designed for use in data mining is customised for use with hydrographs. It generates sets of "rays" that are analogous to the visual links made by the hydrologist's eye when linking features or times in one hydrograph to the corresponding features or times in another hydrograph. One outcome from this work is a new family of performance measures called "visual" performance measures. These can measure differences in amplitude and timing, including the timing errors between simulated and observed hydrographs in model calibration. To demonstrate this, two visual performance measures, one based on the Nash-Sutcliffe Efficiency and the other on the mean absolute error, are used in a total of 34 split-sample calibration-validation tests for two rainfall-runoff models applied to the Hodder catchment, northwest England. The customised algorithm, called the Hydrograph Matching Algorithm, is very simple to apply; it is given in a few lines of pseudocode.
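
    For reference, the two underlying performance measures named above have standard definitions; the sketch below computes the plain Nash-Sutcliffe Efficiency and mean absolute error between observed and simulated flows. The "visual" variants in the paper first align hydrograph features in time via the matching algorithm, which is not reproduced here.

```python
import numpy as np

# Standard Nash-Sutcliffe Efficiency (1 = perfect fit, 0 = no better than the
# observed mean) and mean absolute error between observed and simulated
# hydrograph ordinates.

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def mean_absolute_error(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.mean(np.abs(obs - sim))

# Illustrative flows (m^3/s), not data from the Hodder catchment study.
observed  = [1.2, 3.5, 8.0, 6.1, 4.0, 2.5]
simulated = [1.0, 3.0, 7.2, 6.8, 4.4, 2.2]
print(nash_sutcliffe(observed, simulated), mean_absolute_error(observed, simulated))
```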

  7. Exploratory data project : freight resiliency performance measures.

    DOT National Transportation Integrated Search

    2010-03-01

Exploratory Data Project: Freight Resiliency Performance Measures (2009-10). FHWA's Office of Freight Management and Operations, through a partnership with the American Transportation Research Institute (ATRI), established a Freight Performance M...

  8. Quantitative measurement of mitochondrial membrane potential in cultured cells: calcium-induced de- and hyperpolarization of neuronal mitochondria

    PubMed Central

    Gerencser, Akos A; Chinopoulos, Christos; Birket, Matthew J; Jastroch, Martin; Vitelli, Cathy; Nicholls, David G; Brand, Martin D

    2012-01-01

    Mitochondrial membrane potential (ΔΨM) is a central intermediate in oxidative energy metabolism. Although ΔΨM is routinely measured qualitatively or semi-quantitatively using fluorescent probes, its quantitative assay in intact cells has been limited mostly to slow, bulk-scale radioisotope distribution methods. Here we derive and verify a biophysical model of fluorescent potentiometric probe compartmentation and dynamics using a bis-oxonol-type indicator of plasma membrane potential (ΔΨP) and the ΔΨM probe tetramethylrhodamine methyl ester (TMRM) using fluorescence imaging and voltage clamp. Using this model we introduce a purely fluorescence-based quantitative assay to measure absolute values of ΔΨM in millivolts as they vary in time in individual cells in monolayer culture. The ΔΨP-dependent distribution of the probes is modelled by Eyring rate theory. Solutions of the model are used to deconvolute ΔΨP and ΔΨM in time from the probe fluorescence intensities, taking into account their slow, ΔΨP-dependent redistribution and Nernstian behaviour. The calibration accounts for matrix:cell volume ratio, high- and low-affinity binding, activity coefficients, background fluorescence and optical dilution, allowing comparisons of potentials in cells or cell types differing in these properties. In cultured rat cortical neurons, ΔΨM is −139 mV at rest, and is regulated between −108 mV and −158 mV by concerted increases in ATP demand and Ca2+-dependent metabolic activation. Sensitivity analysis showed that the standard error of the mean in the absolute calibrated values of resting ΔΨM including all biological and systematic measurement errors introduced by the calibration parameters is less than 11 mV. Between samples treated in different ways, the typical equivalent error is ∼5 mV. PMID:22495585
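
    The assay rests on Nernstian probe distribution. As a simplified illustration, the sketch below converts a matrix-to-cytosol concentration ratio of a monovalent cationic probe into a potential in millivolts; it deliberately ignores the binding, activity-coefficient, volume-ratio and background corrections that the full calibration described above accounts for.

```python
import numpy as np

# Nernstian relation used (in simplified form) by potentiometric probes such
# as TMRM: a monovalent cation accumulates across a membrane according to
#   ratio = [in]/[out] = exp(-z * F * dPsi / (R * T)),
# so dPsi (in mV) can be recovered from a measured concentration ratio.

R = 8.314      # gas constant, J mol^-1 K^-1
F = 96485.0    # Faraday constant, C mol^-1

def nernst_potential_mV(conc_ratio_in_over_out, z=1, temp_K=310.15):
    return -(R * temp_K / (z * F)) * np.log(conc_ratio_in_over_out) * 1000.0

# A 100-fold accumulation corresponds to roughly -123 mV at 37 degrees C.
print(nernst_potential_mV(100.0))
```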

  9. Measuring performance in virtual reality phacoemulsification surgery

    NASA Astrophysics Data System (ADS)

    Söderberg, Per; Laurell, Carl-Gustaf; Simawi, Wamidh; Skarman, Eva; Nordh, Leif; Nordqvist, Per

    2008-02-01

We have developed a virtual reality (VR) simulator for phacoemulsification surgery. The current work aimed at developing a relative performance index that characterizes the performance of an individual trainee. We recorded measurements of 28 response variables during three iterated surgical sessions in 9 experienced cataract surgeons, separately for the sculpting phase and the evacuation phase of phacoemulsification surgery, and compared their outcome to that of a reference group of naive trainees. We defined an individual overall performance index, an individual class-specific performance index and an individual variable-specific performance index. We found that on average the experienced surgeons performed at a lower level than the reference group of naive trainees, but that this was particularly attributable to a few surgeons. When their overall performance index was further analyzed as class-specific and variable-specific performance indices, it was found that the low-level performance was attributable to behavior that is acceptable for an experienced surgeon but not for a naive trainee. It was concluded that relative performance indices should use a reference group that corresponds to the measured individual, since the definition of optimal surgery may vary among trainee groups depending on their level of experience.

  10. Measuring individual work performance: identifying and selecting indicators.

    PubMed

    Koopmans, Linda; Bernaards, Claire M; Hildebrandt, Vincent H; de Vet, Henrica C W; van der Beek, Allard J

    2014-01-01

    Theoretically, individual work performance (IWP) can be divided into four dimensions: task performance, contextual performance, adaptive performance, and counterproductive work behavior. However, there is no consensus on the indicators used to measure these dimensions. This study was designed to (1) identify indicators for each dimension, (2) select the most relevant indicators, and (3) determine the relative weight of each dimension in ratings of work performance. IWP indicators were identified from multiple research disciplines, via literature, existing questionnaires, and expert interviews. Subsequently, experts selected the most relevant indicators per dimension and scored the relative weight of each dimension in ratings of IWP. In total, 128 unique indicators were identified. Twenty-three of these indicators were selected by experts as most relevant for measuring IWP. Task performance determined 36% of the work performance rating, while the other three dimensions respectively determined 22%, 20% and 21% of the rating. Notable consensus was found on relevant indicators of IWP, reducing the number from 128 to 23 relevant indicators. This provides an important step towards the development of a standardized, generic and short measurement instrument for assessing IWP.
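
    As a simple illustration of how the reported weights could be applied, the sketch below combines four dimension scores into one rating using the 36/22/20/21% weighting; the common 0-10 scale and the reverse-scoring of counterproductive work behaviour are assumptions made here, not specifications from the study.

```python
# Combine individual work performance (IWP) dimension scores into an overall
# rating using the relative weights reported above (task 36%, contextual 22%,
# adaptive 20%, counterproductive work behaviour 21%). Scores are assumed to be
# on a common 0-10 scale, with counterproductive behaviour already
# reverse-scored; both assumptions are illustrative only.

WEIGHTS = {
    "task": 0.36,
    "contextual": 0.22,
    "adaptive": 0.20,
    "counterproductive": 0.21,   # reverse-scored: higher = less counterproductive
}

def overall_iwp(scores):
    return sum(WEIGHTS[dim] * scores[dim] for dim in WEIGHTS)

print(overall_iwp({"task": 8, "contextual": 7, "adaptive": 6, "counterproductive": 9}))
```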

  11. Forensic comparison and matching of fingerprints: using quantitative image measures for estimating error rates through understanding and predicting difficulty.

    PubMed

    Kellman, Philip J; Mnookin, Jennifer L; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E

    2014-01-01

    Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. The results indicate the plausibility of using objective image metrics to predict expert performance and
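
    A minimal, schematic version of the regression analysis described above is sketched below: a linear model predicting a difficulty score from a few quantitative image metrics, evaluated with cross-validation. The feature names, the synthetic data and the scoring choice are all placeholders rather than the study's actual variables or models.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Predict a per-pair difficulty score from quantitative image metrics with a
# multiple linear regression, assessed by cross-validated R^2. Features and
# data are synthetic placeholders.

rng = np.random.default_rng(0)
n_pairs = 200
X = np.column_stack([
    rng.uniform(0, 1, n_pairs),   # mean intensity (hypothetical metric)
    rng.uniform(0, 1, n_pairs),   # contrast metric (hypothetical)
    rng.uniform(0, 1, n_pairs),   # normalised fingerprint area (hypothetical)
    rng.uniform(0, 1, n_pairs),   # ridge clarity score (hypothetical)
])
difficulty = 0.5 * X[:, 1] - 0.8 * X[:, 2] + 0.3 * X[:, 3] + rng.normal(0, 0.1, n_pairs)

model = LinearRegression()
scores = cross_val_score(model, X, difficulty, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```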

  12. Forensic Comparison and Matching of Fingerprints: Using Quantitative Image Measures for Estimating Error Rates through Understanding and Predicting Difficulty

    PubMed Central

    Kellman, Philip J.; Mnookin, Jennifer L.; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E.

    2014-01-01

    Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. The results indicate the plausibility of using objective image metrics to predict expert performance and

  13. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph

    PubMed Central

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation, obtained from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation, supported by quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of the physiological impact of meditation on subjects. PMID:26909045
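
    One of the two techniques named above, the visibility graph, has a compact standard construction. The sketch below builds a natural visibility graph from a short heart-rate series using the usual line-of-sight criterion; the series is illustrative and the authors' full multifractal and network analysis is not reproduced.

```python
import numpy as np

# Natural visibility graph: two samples (i, y_i) and (j, y_j) are linked if
# every intermediate sample lies strictly below the straight line joining them.
# Degree statistics of the resulting graph are one kind of quantitative
# parameter used in such analyses. Simple O(n^3) illustration.

def visibility_edges(y):
    y = np.asarray(y, float)
    n = len(y)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            ks = np.arange(i + 1, j)
            line = y[j] + (y[i] - y[j]) * (j - ks) / (j - i)
            if np.all(y[ks] < line):
                edges.append((i, j))
    return edges

heart_rate = [72, 75, 71, 78, 74, 73, 80, 76]   # illustrative beats/minute
print(visibility_edges(heart_rate))
```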

  14. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph.

    PubMed

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation, obtained from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation, supported by quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of the physiological impact of meditation on subjects.

  15. Genome-Wide Association Studies of Quantitatively Measured Skin, Hair, and Eye Pigmentation in Four European Populations

    PubMed Central

    Candille, Sophie I.; Absher, Devin M.; Beleza, Sandra; Bauchet, Marc; McEvoy, Brian; Garrison, Nanibaa’ A.; Li, Jun Z.; Myers, Richard M.; Barsh, Gregory S.; Tang, Hua; Shriver, Mark D.

    2012-01-01

Pigmentation of the skin, hair, and eyes varies both within and between human populations. Identifying the genes and alleles underlying this variation has been the goal of many candidate gene and several genome-wide association studies (GWAS). Most GWAS for pigmentary traits to date have been based on subjective phenotypes using categorical scales. But skin, hair, and eye pigmentation vary continuously. Here, we seek to characterize quantitative variation in these traits objectively and accurately and to determine their genetic basis. Objective and quantitative measures of skin, hair, and eye color were made using reflectance or digital spectroscopy in Europeans from Ireland, Poland, Italy, and Portugal. A GWAS was conducted for the three quantitative pigmentation phenotypes in 176 women across 313,763 SNP loci, and replication of the most significant associations was attempted in a sample of 294 European men and women from the same countries. We find that the pigmentation phenotypes are highly stratified along axes of European genetic differentiation. The country of sampling explains approximately 35% of the variation in skin pigmentation, 31% of the variation in hair pigmentation, and 40% of the variation in eye pigmentation. All three quantitative phenotypes are correlated with each other. In our two-stage association study, we reproduce the association of rs1667394 at the OCA2/HERC2 locus with eye color, but we do not identify new genetic determinants of skin and hair pigmentation, supporting the lack of major genes affecting skin and hair color variation within Europe and suggesting that not only careful phenotyping but also larger cohorts are required to understand the genetic architecture of these complex quantitative traits. Interestingly, we also see that in each of these four populations, men are more lightly pigmented in the unexposed skin of the inner arm than women, a fact that is underappreciated and may vary across the world. PMID:23118974

  16. Quantitative analysis of phylloquinone (vitamin K1) in soy bean oils by high-performance liquid chromatography.

    PubMed

    Zonta, F; Stancher, B

    1985-07-19

A high-performance liquid chromatographic method for determining phylloquinone (vitamin K1) in soy bean oils is described. Resolution of vitamin K1 from interfering peaks of the matrix was obtained after enzymatic digestion, extraction and liquid-solid chromatography on alumina. Isocratic reversed-phase chromatography with UV detection was used in the final stage. The quantitation was carried out by the standard addition method, and the recovery of the whole procedure was 88.2%.
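
    The standard addition quantitation mentioned above follows a simple recipe: spike the sample with known amounts of analyte, regress detector response on the added amount, and read the original content from the x-intercept. The sketch below shows this with illustrative numbers, including an optional correction for the 88.2% recovery reported.

```python
import numpy as np

# Standard addition quantitation: regress response on the amount of standard
# added and take the magnitude of the x-intercept (= intercept/slope) as the
# analyte content of the unspiked sample. Numbers below are illustrative only.

added = np.array([0.0, 0.5, 1.0, 1.5, 2.0])          # ug of vitamin K1 added
response = np.array([0.42, 0.63, 0.85, 1.04, 1.26])  # peak area, arbitrary units

slope, intercept = np.polyfit(added, response, 1)
amount_in_sample = intercept / slope                  # ug in the analysed aliquot
amount_corrected = amount_in_sample / 0.882           # correct for 88.2% recovery
print(round(amount_in_sample, 2), round(amount_corrected, 2))
```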

  17. Quantitative analysis of the Ge concentration in a SiGe quantum well: comparison of low-energy RBS and SIMS measurements.

    PubMed

    Krecar, D; Rosner, M; Draxler, M; Bauer, P; Hutter, H

    2006-01-01

The germanium concentration and the position and thickness of the quantum well in molecular beam epitaxy (MBE)-grown SiGe were quantitatively analyzed via low-energy Rutherford backscattering (RBS) and secondary ion mass spectrometry (SIMS). In these samples, the concentrations of Si and Ge were assumed to be constant, except for the quantum well, where the germanium concentration was lower. The thickness of the analyzed quantum well was about 12 nm and it was situated at a depth of about 60 nm below the surface. A dip showed up in the RBS spectra due to the lower germanium concentration in the quantum well, and this dip was evaluated. Good depth resolution was required in order to obtain quantitative results; this was achieved by choosing a primary energy of 500 keV and a tilt angle of 51 degrees with respect to the surface normal. Quantitative information was deduced from the raw data by comparing it with SIMNRA-simulated spectra. The SIMS measurements were performed with oxygen primary ions. Given the response function of the SIMS instrument (the SIMS depth profile of the germanium delta (δ) layer), and using the forward convolution (point-to-point convolution) model, it is possible to determine the germanium concentration and the thickness of the analyzed quantum well from the raw SIMS data. The aim of this work was to compare the results obtained via RBS and SIMS and to show their potential for use in the semiconductor and microelectronics industry. The detection of trace elements (here the doping element antimony) that could not be evaluated with RBS in low-energy mode is also demonstrated using SIMS instead.
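
    The forward (point-to-point) convolution idea can be illustrated compactly: an assumed true concentration profile is convolved with the instrument response measured on a delta layer and compared with the measured depth profile. The depth grid, response width and concentrations in the sketch below are invented for illustration.

```python
import numpy as np

# Forward-convolution sketch: a "true" rectangular Ge profile (lower
# concentration inside the quantum well) is smeared by a normalised instrument
# response, as would be obtained from a delta-layer measurement; the true
# profile parameters are varied until the result matches the measured SIMS
# depth profile.

depth = np.arange(0, 120, 0.5)                                     # nm
true_profile = np.where((depth > 54) & (depth < 66), 0.15, 0.25)   # Ge fraction

# Hypothetical Gaussian instrument response (normalised to unit area).
kernel_x = np.arange(-10, 10.5, 0.5)
response = np.exp(-0.5 * (kernel_x / 2.0) ** 2)
response /= response.sum()

simulated_sims = np.convolve(true_profile, response, mode="same")
```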

  18. Non-invasive tissue temperature measurements based on quantitative diffuse optical spectroscopy (DOS) of water.

    PubMed

    Chung, S H; Cerussi, A E; Merritt, S I; Ruth, J; Tromberg, B J

    2010-07-07

We describe the development of a non-invasive method for quantitative tissue temperature measurements using broadband diffuse optical spectroscopy (DOS). Our approach is based on well-characterized opposing shifts in near-infrared (NIR) water absorption spectra that appear with temperature and macromolecular binding state. Unlike conventional reflectance methods, DOS is used to generate scattering-corrected tissue water absorption spectra. This allows us to separate the macromolecular bound-water contribution from the thermally induced spectral shift using the temperature isosbestic point at 996 nm. The method was validated in intralipid tissue phantoms by correlating DOS with thermistor measurements (R = 0.96), with a difference of 1.1 ± 0.91 °C over a range of 28-48 °C. Once validated, thermal and hemodynamic (i.e. oxy- and deoxy-hemoglobin concentration) changes were measured simultaneously and continuously in human subjects (forearm) during mild cold stress. DOS-measured arm temperatures were consistent with previously reported invasive deep tissue temperature studies. These results suggest that DOS can be used for non-invasive, co-registered measurements of absolute temperature and hemoglobin parameters in thick tissues, a potentially important approach for optimizing thermal diagnostics and therapeutics.
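
    The validation reported above amounts to comparing paired temperature readings. The sketch below computes the same two summary statistics, Pearson correlation and mean absolute difference, for a set of paired values; the numbers are placeholders, not the phantom data.

```python
import numpy as np

# Compare optically derived temperatures against reference thermistor readings
# using Pearson correlation and the mean absolute difference, the two summary
# statistics quoted in the record. Paired values below are illustrative.

thermistor = np.array([28.0, 32.5, 36.0, 40.0, 44.0, 48.0])   # degC
dos_temp   = np.array([29.1, 31.8, 37.2, 40.9, 43.2, 49.0])   # degC

r = np.corrcoef(thermistor, dos_temp)[0, 1]
diff = np.abs(dos_temp - thermistor)
print(f"R = {r:.2f}, difference = {diff.mean():.1f} +/- {diff.std(ddof=1):.2f} degC")
```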

  19. 76 FR 70653 - Performance Measurement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... Regulations as follows: PART 3055--SERVICE PERFORMANCE AND CUSTOMER SATISFACTION REPORTING 0 1. The authority... entered into the National Customer Management System (NCMS). NCMS manages SFS inventory, general ledger, order history, and customer accounts. A measurement ends when the order is logically closed out in the...

  20. Quantitative mass spectrometry: an overview

    NASA Astrophysics Data System (ADS)

    Urban, Pawel L.

    2016-10-01

Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on their mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.