Sample records for objective quantitative evaluation

  1. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e., we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of a gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  2. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.

  3. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior.

    PubMed

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-01-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
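
    The DPD step described above, subtracting each antecedent frame from the subsequent one and reading the result in picograms, can be sketched as below. This is a minimal illustration, not the authors' implementation: the wavelength, pixel area, and the commonly used specific refraction increment of about 0.18 μm³/pg used to convert phase to dry mass are illustrative assumptions.

      import numpy as np

      def dynamic_phase_differences(phase_stack, wavelength_um=0.65,
                                    pixel_area_um2=0.25, alpha_um3_per_pg=0.18):
          # phase_stack: array of shape (frames, rows, cols), phase in radians.
          # Subtract each antecedent frame from the subsequent one (the DPD maps),
          # then convert the net phase change in each interval to dry mass change
          # in picograms via the common relation sigma = phi * lambda / (2*pi*alpha).
          dpd_maps = np.diff(phase_stack, axis=0)
          pg_per_rad_pixel = wavelength_um * pixel_area_um2 / (2 * np.pi * alpha_um3_per_pg)
          mass_change_pg = dpd_maps.sum(axis=(1, 2)) * pg_per_rad_pixel
          return dpd_maps, mass_change_pg

    The per-interval DPD maps correspond to the two-dimensional color-coded projections mentioned in the abstract, while mass_change_pg gives the time dependence of the changes quantified in picograms.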

  4. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior

    NASA Astrophysics Data System (ADS)

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-11-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.

  5. Objective, Quantitative, Data-Driven Assessment of Chemical Probes.

    PubMed

    Antolin, Albert A; Tym, Joseph E; Komianou, Angeliki; Collins, Ian; Workman, Paul; Al-Lazikani, Bissan

    2018-02-15

    Chemical probes are essential tools for understanding biological systems and for target validation, yet selecting probes for biomedical research is rarely based on objective assessment of all potential compounds. Here, we describe the Probe Miner: Chemical Probes Objective Assessment resource, capitalizing on the plethora of public medicinal chemistry data to empower quantitative, objective, data-driven evaluation of chemical probes. We assess >1.8 million compounds for their suitability as chemical tools against 2,220 human targets and dissect the biases and limitations encountered. Probe Miner represents a valuable resource to aid the identification of potential chemical probes, particularly when used alongside expert curation. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Matrix evaluation of science objectives

    NASA Technical Reports Server (NTRS)

    Wessen, Randii R.

    1994-01-01

    The most fundamental objective of all robotic planetary spacecraft is to return science data. To accomplish this, a spacecraft is fabricated and built, software is planned and coded, and a ground system is designed and implemented. However, the quantitative analysis required to determine how the collection of science data drives ground system capabilities has received very little attention. This paper defines a process by which science objectives can be quantitatively evaluated. By applying it to the Cassini Mission to Saturn, this paper further illustrates the power of this technique. The results show which science objectives drive specific ground system capabilities. In addition, this process can assist system engineers and scientists in the selection of the science payload during pre-project mission planning; ground system designers during ground system development and implementation; and operations personnel during mission operations.

  7. Objective and quantitative equilibriometric evaluation of individual locomotor behaviour in schizophrenia: Translational and clinical implications.

    PubMed

    Haralanov, Svetlozar; Haralanova, Evelina; Milushev, Emil; Shkodrova, Diana; Claussen, Claus-Frenz

    2018-04-17

    Psychiatry is the only medical specialty that lacks clinically applicable biomarkers for objective evaluation of the existing pathology at a single-patient level. On the basis of an original translational equilibriometric method for evaluation of movement patterns, we have introduced in the everyday clinical practice of psychiatry an easy-to-perform computerized objective quantification of the individual locomotor behaviour during execution of the Unterberger stepping test. For the last 20 years, we have gradually collected a large database of more than 1000 schizophrenic patients, their relatives, and matched psychiatric, neurological, and healthy controls via cross-sectional and longitudinal investigations. Comparative analyses revealed transdiagnostic locomotor similarities among schizophrenic patients, high-risk schizotaxic individuals, and neurological patients with multiple sclerosis and cerebellar ataxia, thus suggesting common underlying brain mechanisms. In parallel, intradiagnostic dissimilarities were revealed, which allow subclinical locomotor subgroups to be separated out within the diagnostic categories. Prototypical qualitative (dysmetric and ataxic) locomotor abnormalities in schizophrenic patients were differentiated from 2 atypical quantitative ones, manifested as either hypolocomotion or hyperlocomotion. Theoretical analyses suggested that these 3 subtypes of locomotor abnormalities could be conceived as objectively measurable biomarkers of 3 schizophrenic subgroups with dissimilar brain mechanisms, which require different treatment strategies. Analogies with the prominent role of locomotor measures in some well-known animal models of mental disorders advocate for promising objective translational research in the so-far over-subjective field of psychiatry. Distinctions among prototypical, atypical, and diagnostic biomarkers, as well as between neuromotor and psychomotor locomotor abnormalities, are discussed. Conclusions are drawn about the

  8. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.

  9. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted content analysis regarding evaluation methods of qualitative healthcare research. We extracted descriptions on four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it might not be useful to consider evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  10. Object-Oriented Algorithm For Evaluation Of Fault Trees

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1992-01-01

    Algorithm for direct evaluation of fault trees incorporates techniques of object-oriented programming. Reduces number of calls needed to solve trees with repeated events. Provides significantly improved software environment for such computations as quantitative analyses of safety and reliability of complicated systems of equipment (e.g., spacecraft or factories).

  11. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
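
    For reference, the figure of merit named in this record can be written out explicitly. This is a minimal sketch assuming the standard regression-without-truth linear model; the notation is ours, not quoted from the paper:

      \[
        \hat{a}_{p,m} = a_m\, a_p + b_m + \epsilon_{p,m}, \qquad
        \epsilon_{p,m} \sim \mathcal{N}(0, \sigma_m^2), \qquad
        \mathrm{NSR}_m = \frac{\sigma_m}{a_m},
      \]

    where a_p is the unknown true value (e.g. the activity concentration) for patient p, and a_m, b_m, and sigma_m are the slope, bias, and noise standard deviation estimated for imaging method m without access to a_p. A smaller NSR_m indicates better precision, so methods are ranked in order of increasing NSR_m.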

  12. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  13. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    NASA Astrophysics Data System (ADS)

    Li, Shunhe; Rao, Jianhua; Gui, Lin; Zhang, Weimin; Liu, Degang

    2017-11-01

    The result of remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured in the EOL stage of the machine tool lifecycle. The objectivity and accuracy of the evaluation are the key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tools’ remanufacturing, and the results are modified by the comprehensive adjustment method, which makes the evaluation results accord with the standard of human conventional thinking. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 type CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.

  14. QUANTITATIVE EVALUATION OF THE HYPOTHESIS THAT BL LACERTAE OBJECTS ARE QSO REMNANTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borra, E. F.

    2014-11-20

    We evaluate with numerical simulations the hypothesis that BL Lacertae objects (BLLs) are the remnants of quasi-stellar objects. This hypothesis is based on their highly peculiar redshift evolution. They have a comoving space density that increases with decreasing redshift, contrary to all other active galactic nuclei. We assume that relativistic jets are below detection in young radio-quiet quasars and increase in strength with cosmic time so that they eventually are detected as BLLs. Our numerical simulations fit very well the observed redshift distributions of BLLs. There are strong indications that only the high-synchrotron-peaked BLLs could be QSO remnants.

  15. Quantitative phase-contrast digital holographic microscopy for cell dynamic evaluation

    NASA Astrophysics Data System (ADS)

    Yu, Lingfeng; Mohanty, Samarendra; Berns, Michael W.; Chen, Zhongping

    2009-02-01

    The laser microbeam uses lasers to alter and/or to ablate intracellular organelles and cellular and tissue samples, and, today, has become an important tool for cell biologists to study the molecular mechanism of complex biological systems by removing individual cells or sub-cellular organelles. However, absolute quantitation of the localized alteration/damage to transparent phase objects, such as the cell membrane or chromosomes, was not possible using conventional phase-contrast or differential interference contrast microscopy. We report the development of phase-contrast digital holographic microscopy for quantitative evaluation of cell dynamic changes in real time during laser microsurgery. Quantitative phase images are recorded during the process of laser microsurgery and thus, the dynamic change in phase can be continuously evaluated. Out-of-focus organelles are re-focused by numerical reconstruction algorithms.

  16. Nuclear medicine and quantitative imaging research (instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1990-09-01

    This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418 entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P. I. and M. Cooper, Co-P.I. during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.

  17. Objective evaluation of cutaneous thermal sensitivity

    NASA Technical Reports Server (NTRS)

    Vanbeaumont, W.

    1972-01-01

    The possibility of obtaining reliable and objective quantitative responses was investigated under conditions where only temperature changes in localized cutaneous areas evoked measurable changes in remote sudomotor activity. Both male and female subjects were studied to evaluate sex differences in thermal sensitivity. The results discussed include: sweat rate responses to contralateral cooling, comparison of sweat rate responses between men and women to contralateral cooling, influence of the menstrual cycle on the sweat rate responses to contralateral cooling, comparison of threshold of sweating responses between men and women, and correlation of latency to threshold for whole body sweating. It is concluded that the quantitative aspects of the reflex response are affected by both the density and activation of receptors as well as the rate of heat loss; men responded 8-10% more frequently than women to thermode cooling, the magnitude of responses being greater for men; and women responded 7-9% more frequently to thermode cooling on day 1 of menstruation, as compared to day 15.

  18. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
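
    The abstract names the 'hierarchical weighted average' but does not define it; the sketch below shows one plausible reading, in which criteria form a tree, each alternative supplies leaf scores, and every node's score is the weighted average of its children. All names, weights, and scores are illustrative.

      def hierarchical_weighted_average(node, scores):
          # node: either a leaf criterion name (str) or a dict
          #       {"weights": [...], "children": [...]} describing a sub-tree.
          # scores: mapping from leaf criterion name to this alternative's score.
          if isinstance(node, str):                     # leaf criterion
              return scores[node]
          total = sum(node["weights"])
          return sum(w * hierarchical_weighted_average(child, scores)
                     for w, child in zip(node["weights"], node["children"])) / total

      # Illustrative criteria tree for comparing design alternatives
      criteria = {"weights": [0.6, 0.4],
                  "children": [{"weights": [0.7, 0.3],
                                "children": ["performance", "reliability"]},
                               "cost"]}
      alternative_a = {"performance": 8, "reliability": 6, "cost": 7}
      print(hierarchical_weighted_average(criteria, alternative_a))  # overall score 7.24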

  19. Objective evaluation of chemotherapy-induced peripheral neuropathy using quantitative pain measurement system (Pain Vision®), a pilot study.

    PubMed

    Sato, Junya; Mori, Megumi; Nihei, Satoru; Takeuchi, Satoshi; Kashiwaba, Masahiro; Kudo, Kenzo

    2017-01-01

    In an evaluation of chemotherapy-induced peripheral neuropathy (CIPN), objectivity may be poor because the evaluation is determined by the patient's subjective assessment. In such cases, management of neuropathy may be delayed and CIPN symptoms may become severe. In this pilot study, we attempted an objective evaluation of CIPN using a quantitative pain measurement system (Pain Vision®). The subjects were patients with gynecologic cancer who underwent chemotherapy using taxane and platinum drugs. The grade of the peripheral sensory nerve disorder was based on the Common Terminology Criteria for Adverse Events (CTC-AE) ver. 4.0 and was evaluated before the initiation of therapy and up to six chemotherapy cycles. A symptom scale assessed by the patients using a peripheral neuropathy questionnaire (PNQ) was also evaluated. Simultaneously during these evaluations, a graded electric current was applied from the probe to a fingertip, and both the lowest perceptible current and the lowest current perceived as pain were measured by Pain Vision®. From these values, the pain degree was calculated from the following formula: (pain perception current value - lowest perceptible current value) ÷ lowest perceptible current value × 100. We compared the pain degrees by Pain Vision® during CIPN development with the value obtained before chemotherapy initiation. Forty-one patients were enrolled. In the evaluation by a medical professional, 28 (64.3%) patients developed CIPN during 2.5 ± 1.1 chemotherapy cycles (mean ± standard deviation). The pain degree by Pain Vision® at grade 1 and 2 CIPN development according to the evaluation (CTC-AE) was significantly decreased compared to that before chemotherapy initiation (126.0 ± 114.5 vs. 69.8 ± 46.8, p = 0.001, and 126.0 ± 114.5 vs. 32.8 ± 32.6, p = 0.004). Changes in the pain degree by Pain Vision® were also found during scale B and C, D CIPN development in the patient evaluation (PNQ) (115.9 ± 112.4 vs. 70
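
    The pain-degree formula quoted in this abstract translates directly into a one-line computation; the function name and the example currents below are illustrative, not taken from the study.

      def pain_degree(pain_perception_current, lowest_perceptible_current):
          # (pain perception current - lowest perceptible current)
          #   / lowest perceptible current * 100, as stated in the abstract.
          return ((pain_perception_current - lowest_perceptible_current)
                  / lowest_perceptible_current) * 100.0

      # Example: perception threshold 20 uA, pain threshold 45 uA -> pain degree 125
      print(pain_degree(45.0, 20.0))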

  20. A new method to evaluate image quality of CBCT images quantitatively without observers

    PubMed Central

    Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori

    2017-01-01

    Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes with different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. In detecting holes quantitatively, the threshold grey value (ΔG), which differentiated holes from the background, was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratio (CNR) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343
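
    The two measurements used in this record, the hole count extracted with the threshold grey value ΔG and the contrast-to-noise ratio, can be sketched as below. ΔG is taken as a given input because the abstract does not spell out the JND rule used to compute it, and the function names are illustrative.

      import numpy as np

      def cnr(hole_pixels, background_pixels):
          # Contrast-to-noise ratio of one hole region against its local background.
          return abs(hole_pixels.mean() - background_pixels.mean()) / background_pixels.std()

      def extracted_hole_count(hole_mean_grey_values, background_mean_grey, delta_g):
          # Count the holes whose mean grey value differs from the background
          # by more than the JND-derived threshold delta_g.
          diffs = np.abs(np.asarray(hole_mean_grey_values) - background_mean_grey)
          return int(np.sum(diffs > delta_g))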

  1. Characterization and Application of a Grazing Angle Objective for Quantitative Infrared Reflection Microspectroscopy

    NASA Technical Reports Server (NTRS)

    Pepper, Stephen V.

    1995-01-01

    A grazing angle objective on an infrared microspectrometer is studied for quantitative spectroscopy by considering the angular dependence of the incident intensity within the objective's angular aperture. The assumption that there is no angular dependence is tested by comparing the experimental reflectance of Si and KBr surfaces with the reflectance calculated by integrating the Fresnel reflection coefficient over the angular aperture under this assumption. Good agreement was found, indicating that the specular reflectance of surfaces can be straightforwardly and quantitatively integrated over the angular aperture without considering non-uniform incident intensity. This quantitative approach is applied to the thickness determination of dip-coated Krytox on gold. The infrared optical constants of both materials are known, allowing the integration to be carried out. The thickness obtained is in fair agreement with the value determined by ellipsometry in the visible. Therefore, this paper illustrates a method for more quantitative use of a grazing angle objective for infrared reflectance microspectroscopy.
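
    A sketch of the comparison described above: the unpolarized Fresnel reflectance of a flat surface is averaged over the objective's angular aperture under the paper's assumption of angle-independent incident intensity. The refractive index, aperture limits, and the sin(theta) solid-angle weighting are illustrative choices, not values taken from the paper.

      import numpy as np

      def fresnel_reflectance_unpolarized(theta_i, n1, n2):
          # Unpolarized power reflectance of a flat n1/n2 interface (n2 may be complex).
          cos_i = np.cos(theta_i)
          cos_t = np.sqrt(1 - (n1 / n2) ** 2 * np.sin(theta_i) ** 2 + 0j)
          r_s = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)
          r_p = (n2 * cos_i - n1 * cos_t) / (n2 * cos_i + n1 * cos_t)
          return 0.5 * (abs(r_s) ** 2 + abs(r_p) ** 2)

      def aperture_averaged_reflectance(n2, theta_min_deg=65.0, theta_max_deg=85.0, n1=1.0):
          # Average the Fresnel reflectance over a grazing-angle aperture, assuming
          # uniform incident intensity and weighting by the solid-angle element sin(theta).
          theta = np.radians(np.linspace(theta_min_deg, theta_max_deg, 2001))
          r = fresnel_reflectance_unpolarized(theta, n1, n2)
          w = np.sin(theta)
          return np.trapz(r * w, theta) / np.trapz(w, theta)

      print(aperture_averaged_reflectance(n2=3.42))  # e.g. a silicon-like index in the mid-infrared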

  2. FE-ANN based modeling of 3D Simple Reinforced Concrete Girders for Objective Structural Health Evaluation : Tech Transfer Summary

    DOT National Transportation Integrated Search

    2017-06-01

    The objective of this study was to develop an objective, quantitative method for evaluating damage to bridge girders by using artificial neural networks (ANNs). This evaluation method, which is a supplement to visual inspection, requires only the res...

  3. High Resolution Qualitative and Quantitative MR Evaluation of the Glenoid Labrum

    PubMed Central

    Iwasaki, Kenyu; Tafur, Monica; Chang, Eric Y.; Statum, Sheronda; Biswas, Reni; Tran, Betty; Bae, Won C.; Du, Jiang; Bydder, Graeme M.; Chung, Christine B.

    2015-01-01

    Objective To implement qualitative and quantitative MR sequences for the evaluation of labral pathology. Methods Six glenoid labra were dissected and the anterior and posterior portions were divided into normal, mildly degenerated, or severely degenerated groups using gross and MR findings. Qualitative evaluation was performed using T1-weighted, proton density-weighted (PD), spoiled gradient echo (SPGR) and ultra-short echo time (UTE) sequences. Quantitative evaluation included T2 and T1rho measurements as well as T1, T2*, and T1rho measurements acquired with UTE techniques. Results SPGR and UTE sequences best demonstrated labral fiber structure. Degenerated labra had a tendency towards decreased T1 values, increased T2/T2* values and increased T1 rho values. T2* values obtained with the UTE sequence allowed for delineation between normal, mildly degenerated and severely degenerated groups (p<0.001). Conclusion Quantitative T2* measurements acquired with the UTE technique are useful for distinguishing between normal, mildly degenerated and severely degenerated labra. PMID:26359581

  4. Holographic quantitative imaging of sample hidden by turbid medium or occluding objects

    NASA Astrophysics Data System (ADS)

    Bianco, V.; Miccio, L.; Merola, F.; Memmolo, P.; Gennari, O.; Paturzo, Melania; Netti, P. A.; Ferraro, P.

    2015-03-01

    Digital Holography (DH) numerical procedures have been developed to allow imaging through turbid media. A fluid is considered turbid when dispersed particles provoke strong light scattering, thus destroying the image formation by any standard optical system. Here we show that sharp amplitude imaging and phase-contrast mapping of objects hidden behind a turbid medium and/or occluding objects are possible in harsh noise conditions and with a large field of view by Multi-Look DH microscopy. In particular, it will be shown that both amplitude imaging and phase-contrast mapping of cells hidden behind a flow of Red Blood Cells can be obtained. This allows, in a noninvasive way, the quantitative evaluation of living processes in Lab on Chip platforms where conventional microscopy techniques fail. The combination of this technique with endoscopic imaging can pave the way for holographic blood vessel inspection, e.g. to look for settled cholesterol plaques as well as blood clots for rapid diagnostics of blood diseases.

  5. A GIS-BASED METHOD FOR MULTI-OBJECTIVE EVALUATION OF PARK VEGETATION. (R824766)

    EPA Science Inventory

    Abstract

    In this paper we describe a method for evaluating the concordance between a set of mapped landscape attributes and a set of quantitatively expressed management priorities. The method has proved to be useful in planning urban green areas, allowing objectively d...

  6. [Study on objectively evaluating skin aging according to areas of skin texture].

    PubMed

    Shan, Gaixin; Gan, Ping; He, Ling; Sun, Lu; Li, Qiannan; Jiang, Zheng; He, Xiangqian

    2015-02-01

    Skin aging principles play important roles in skin disease diagnosis, the evaluation of skin cosmetic effect, forensic identification and age identification in sports competition, etc. This paper proposes a new method to evaluate skin aging objectively and quantitatively by skin texture area. Firstly, the enlarged skin image was acquired. Then, the skin texture image was segmented by using the iterative threshold method, and the skin ridge image was extracted according to the watershed algorithm. Finally, the skin ridge areas of the skin texture were extracted. The experimental data showed that the average areas of skin ridges, of both men and women, had a good correlation with age (the correlation coefficient r for males was 0.938, and the correlation coefficient r for females was 0.922), and the regression curve of skin texture area against age showed that the skin texture area increased with age. Therefore, it is effective to evaluate skin aging objectively by the new method presented in this paper.
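
    The processing chain in this abstract (threshold segmentation of the magnified texture image, watershed extraction of the ridges, then area measurement) can be sketched as below. The library calls and parameters are illustrative, with Otsu thresholding standing in for the iterative threshold method; this is not claimed to reproduce the authors' exact algorithm.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage import feature, filters, measure, segmentation

      def mean_skin_ridge_area(gray_image, pixel_area_mm2):
          # Threshold the magnified skin image (Otsu as a stand-in for the
          # iterative threshold), split the foreground into ridge regions with
          # a marker-based watershed, and return the mean ridge area in mm^2.
          ridges = gray_image > filters.threshold_otsu(gray_image)
          distance = ndi.distance_transform_edt(ridges)
          peaks = feature.peak_local_max(distance, min_distance=5, labels=ridges)
          markers = np.zeros(distance.shape, dtype=int)
          markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
          labels = segmentation.watershed(-distance, markers, mask=ridges)
          areas = [region.area for region in measure.regionprops(labels)]
          return float(np.mean(areas)) * pixel_area_mm2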

  7. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  8. Comparison of DNA fragmentation and color thresholding for objective quantitation of apoptotic cells

    NASA Technical Reports Server (NTRS)

    Plymale, D. R.; Ng Tang, D. S.; Fermin, C. D.; Lewis, D. E.; Martin, D. S.; Garry, R. F.

    1995-01-01

    Apoptosis is a process of cell death characterized by distinctive morphological changes and fragmentation of cellular DNA. Using video imaging and color thresholding techniques, we objectively quantitated the number of cultured CD4+ T-lymphoblastoid cells (HUT78 cells, RH9 subclone) displaying morphological signs of apoptosis before and after exposure to gamma-irradiation. The numbers of apoptotic cells measured by objective video imaging techniques were compared to numbers of apoptotic cells measured in the same samples by sensitive apoptotic assays that quantitate DNA fragmentation. DNA fragmentation assays gave consistently higher values compared with the video imaging assays that measured morphological changes associated with apoptosis. These results suggest that substantial DNA fragmentation can precede or occur in the absence of the morphological changes which are associated with apoptosis in gamma-irradiated RH9 cells.

  9. Objective Evaluation of Visual Fatigue Using Binocular Fusion Maintenance.

    PubMed

    Hirota, Masakazu; Morimoto, Takeshi; Kanda, Hiroyuki; Endo, Takao; Miyoshi, Tomomitsu; Miyagawa, Suguru; Hirohara, Yoko; Yamaguchi, Tatsuo; Saika, Makoto; Fujikado, Takashi

    2018-03-01

    In this study, we investigated whether an individual's visual fatigue can be evaluated objectively and quantitatively from their ability to maintain binocular fusion. Binocular fusion maintenance (BFM) was measured using a custom-made binocular open-view Shack-Hartmann wavefront aberrometer equipped with liquid crystal shutters, wherein eye movements and wavefront aberrations were measured simultaneously. Transmittance in the liquid crystal shutter in front of the subject's nondominant eye was reduced linearly, and BFM was determined from the transmittance at the point when binocular fusion was broken and vergence eye movement was induced. In total, 40 healthy subjects underwent the BFM test and completed a questionnaire regarding subjective symptoms before and after a visual task lasting 30 minutes. BFM was significantly reduced after the visual task (P < 0.001) and was negatively correlated with the total subjective eye symptom score (adjusted R2 = 0.752, P < 0.001). Furthermore, the diagnostic accuracy for visual fatigue was significantly higher in BFM than in the conventional test results (aggregated fusional vergence range, near point of convergence, and the high-frequency component of accommodative microfluctuations; P = 0.007). These results suggest that BFM can be used as an indicator for evaluating visual fatigue. BFM can be used to objectively evaluate the visual fatigue caused by new visual devices, such as head-mounted displays.

  10. Objective Evaluation of Visual Fatigue Using Binocular Fusion Maintenance

    PubMed Central

    Hirota, Masakazu; Morimoto, Takeshi; Kanda, Hiroyuki; Endo, Takao; Miyoshi, Tomomitsu; Miyagawa, Suguru; Hirohara, Yoko; Yamaguchi, Tatsuo; Saika, Makoto

    2018-01-01

    Purpose In this study, we investigated whether an individual's visual fatigue can be evaluated objectively and quantitatively from their ability to maintain binocular fusion. Methods Binocular fusion maintenance (BFM) was measured using a custom-made binocular open-view Shack–Hartmann wavefront aberrometer equipped with liquid crystal shutters, wherein eye movements and wavefront aberrations were measured simultaneously. Transmittance in the liquid crystal shutter in front of the subject's nondominant eye was reduced linearly, and BFM was determined from the transmittance at the point when binocular fusion was broken and vergence eye movement was induced. In total, 40 healthy subjects underwent the BFM test and completed a questionnaire regarding subjective symptoms before and after a visual task lasting 30 minutes. Results BFM was significantly reduced after the visual task (P < 0.001) and was negatively correlated with the total subjective eye symptom score (adjusted R2 = 0.752, P < 0.001). Furthermore, the diagnostic accuracy for visual fatigue was significantly higher in BFM than in the conventional test results (aggregated fusional vergence range, near point of convergence, and the high-frequency component of accommodative microfluctuations; P = 0.007). Conclusions These results suggest that BFM can be used as an indicator for evaluating visual fatigue. Translational Relevance BFM can be used to objectively evaluate the visual fatigue caused by new visual devices, such as head-mounted displays. PMID:29600117

  11. Quantitative evaluation of his-tag purification and immunoprecipitation of tristetraprolin and its mutant proteins from transfected human cells

    USDA-ARS?s Scientific Manuscript database

    Histidine (His)-tag is widely used for affinity purification of recombinant proteins, but the yield and purity of expressed proteins are quite different. Little information is available about quantitative evaluation of this procedure. The objective of the current study was to evaluate the His-tag pr...

  12. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, Robert V.

    1993-01-01

    The present invention is a quantitative method for measuring the total heat flux, and of deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infra-red sensing devices.

  13. Quantitative method for measuring heat flux emitted from a cryogenic object

    DOEpatents

    Duncan, R.V.

    1993-03-16

    The present invention is a quantitative method for measuring the total heat flux, and of deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  14. A novel, objective, quantitative method of evaluation of the back pain component using comparative computerized multi-parametric tactile mapping before/after spinal cord stimulation and database analysis: the "Neuro-Pain't" software.

    PubMed

    Rigoard, P; Nivole, K; Blouin, P; Monlezun, O; Roulaud, M; Lorgeoux, B; Bataille, B; Guetarni, F

    2015-03-01

    One of the major challenges of neurostimulation is actually to address the back pain component in patients suffering from refractory chronic back and leg pain. Facing a tremendous expansion of neurostimulation techniques and available devices, implanters and patients can still remain confused as they need to select the right tool for the right indication. To be able to evaluate and compare objectively patient outcomes, depending on therapeutical strategies, it appears essential to develop a rational and quantitative approach to pain assessment for those who undergo neurostimulation implantation. We developed a touch screen interface, in Poitiers University Hospital and N(3)Lab, called the "Neuro-Pain'T", to detect, record and quantify the painful area surface and intensity changes in an implanted patient within time. The second aim of this software is to analyse the link between a paraesthesia coverage generated by a type of neurostimulation and a potential analgesic effect, measured by pain surface reduction, pain intensity reduction within the painful surface and local change in pain characteristics distribution. The third aim of Neuro-Pain'T is to correlate these clinical parameters to global patient data and functional outcome analysis, via a network database (Neuro-Database), to be able to provide a concise but objective approach of the neurostimulation efficacy, summarized by an index called "RFG Index". This software has been used in more than 190 patients since 2012, leading us to define three clinical parameters grouped as a clinical component of the RFG Index, which might be helpful to assess neurostimulation efficacy and compare implanted devices. The Neuro-Pain'T is an original software designed to objectively and quantitatively characterize reduction of a painful area in a given individual, in terms of intensity, surface and pain typology, in response to a treatment strategy or implantation of an analgesic device. Because pain is a physical sensation

  15. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easy applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easy distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
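
    The two-step point counting described above can be summarized with the usual Cavalieri-type estimator for volume and simple point fractions for tissue composition. This is a compact sketch of standard stereological practice; the numbers and variable names are illustrative and are not taken from the paper.

      def cavalieri_volume(points_per_section, area_per_point_mm2, section_spacing_mm):
          # Step 1: defect volume from systematically sampled parallel sections.
          return sum(points_per_section) * area_per_point_mm2 * section_spacing_mm

      def tissue_fractions(category_counts):
          # Step 2: fraction of each tissue category from the category point counts.
          total = sum(category_counts.values())
          return {name: count / total for name, count in category_counts.items()}

      # Illustrative numbers only
      print(cavalieri_volume([120, 140, 130, 110], 0.004, 0.5))          # volume in mm^3
      print(tissue_fractions({"hyaline cartilage": 210, "fibrocartilage": 180,
                              "fibrous tissue": 90, "bone": 20}))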

  16. Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders

    PubMed Central

    Song, Yu; Kwak, Shin; Yoshida, Sohei; Yamamoto, Yoshiharu

    2014-01-01

    Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as an actigraph enable long-term recording of a patient's movement during activities and they can be used for quantitative assessment of symptoms due to various diseases. We reviewed some applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders such as motor and nonmotor disorders like Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) for vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods to assess more neurological or psychopathic disorders using actigraphy records. PMID:25214709

  17. Quantitative light-induced fluorescence technology for quantitative evaluation of tooth wear

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Kyeom; Lee, Hyung-Suk; Park, Seok-Woo; Lee, Eun-Song; de Josselin de Jong, Elbert; Jung, Hoi-In; Kim, Baek-Il

    2017-12-01

    Various technologies used to objectively determine enamel thickness or dentin exposure have been suggested. However, most methods have clinical limitations. This study was conducted to confirm the potential of quantitative light-induced fluorescence (QLF) using autofluorescence intensity of occlusal surfaces of worn teeth according to enamel grinding depth in vitro. Sixteen permanent premolars were used. Each tooth was gradationally ground down at the occlusal surface in the apical direction. QLF-digital and swept-source optical coherence tomography images were acquired at each grinding depth (in steps of 100 μm). All QLF images were converted to 8-bit grayscale images to calculate the fluorescence intensity. The maximum brightness (MB) values of the same sound regions in the grayscale images were calculated before grinding and at each stage of the grinding process. Finally, 13 samples were evaluated. MB increased over the grinding depth range with a strong correlation (r=0.994, P<0.001). In conclusion, the fluorescence intensity of the teeth and the grinding depth were strongly correlated in the QLF images. Therefore, QLF technology may be a useful noninvasive tool used to monitor the progression of tooth wear and to conveniently estimate enamel thickness.

  18. Evaluation of image quality in terahertz pulsed imaging using test objects.

    PubMed

    Fitzgerald, A J; Berry, E; Miles, R E; Zinovev, N N; Smith, M A; Chamberlain, J M

    2002-11-07

    As with other imaging modalities, the performance of terahertz (THz) imaging systems is limited by factors of spatial resolution, contrast and noise. The purpose of this paper is to introduce test objects and image analysis methods to evaluate and compare THz image quality in a quantitative and objective way, so that alternative terahertz imaging system configurations and acquisition techniques can be compared, and the range of image parameters can be assessed. Two test objects were designed and manufactured, one to determine the modulation transfer functions (MTF) and the other to derive image signal to noise ratio (SNR) at a range of contrasts. As expected the higher THz frequencies had larger MTFs, and better spatial resolution as determined by the spatial frequency at which the MTF dropped below the 20% threshold. Image SNR was compared for time domain and frequency domain image parameters and time delay based images consistently demonstrated higher SNR than intensity based parameters such as relative transmittance because the latter are more strongly affected by the sources of noise in the THz system such as laser fluctuations and detector shot noise.

  19. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges.

    PubMed

    Ernst, Marielle; Kriston, Levente; Romero, Javier M; Frölich, Andreas M; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time.

  20. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  1. Quantitative evaluations of ankle spasticity and stiffness in neurological disorders using manual spasticity evaluator.

    PubMed

    Peng, Qiyu; Park, Hyung-Soon; Shah, Parag; Wilson, Nicole; Ren, Yupeng; Wu, Yi-Ning; Liu, Jie; Gaebler-Spira, Deborah J; Zhang, Li-Qun

    2011-01-01

    Spasticity and contracture are major sources of disability in people with neurological impairments that have been evaluated using various instruments: the Modified Ashworth Scale, tendon reflex scale, pendulum test, mechanical perturbations, and passive joint range of motion (ROM). These measures generally are either convenient to use in clinics but not quantitative or they are quantitative but difficult to use conveniently in clinics. We have developed a manual spasticity evaluator (MSE) to evaluate spasticity/contracture quantitatively and conveniently, with ankle ROM and stiffness measured at a controlled low velocity and joint resistance and Tardieu catch angle measured at several higher velocities. We found that the Tardieu catch angle was linearly related to the velocity, indicating that increased resistance at higher velocities was felt at further stiffer positions and, thus, that the velocity dependence of spasticity may also be position-dependent. This finding indicates the need to control velocity in spasticity evaluation, which is achieved with the MSE. Quantitative measurements of spasticity, stiffness, and ROM can lead to more accurate characterizations of pathological conditions and outcome evaluations of interventions, potentially contributing to better healthcare services for patients with neurological disorders such as cerebral palsy, spinal cord injury, traumatic brain injury, and stroke.
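
    The reported linear relationship between stretch velocity and Tardieu catch angle can be checked with an ordinary least-squares fit; the short Python sketch below does this for made-up measurements (the velocities and angles are illustrative, not data from the study).

    ```python
    import numpy as np

    # Hypothetical measurements: ankle stretch velocity (deg/s) and Tardieu catch angle (deg)
    velocity = np.array([30.0, 60.0, 90.0, 120.0, 150.0])
    catch_angle = np.array([12.0, 17.5, 22.0, 28.5, 33.0])

    # Ordinary least-squares line: catch_angle ≈ slope * velocity + intercept
    slope, intercept = np.polyfit(velocity, catch_angle, deg=1)

    # Coefficient of determination as a simple check of linearity
    predicted = slope * velocity + intercept
    ss_res = np.sum((catch_angle - predicted) ** 2)
    ss_tot = np.sum((catch_angle - np.mean(catch_angle)) ** 2)
    r_squared = 1.0 - ss_res / ss_tot

    print(f"slope = {slope:.3f} deg per deg/s, intercept = {intercept:.1f} deg, R^2 = {r_squared:.3f}")
    ```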

  2. A new approach for the quantitative evaluation of drawings in children with learning disabilities.

    PubMed

    Galli, Manuela; Vimercati, Sara Laura; Stella, Giacomo; Caiazzo, Giorgia; Norveti, Federica; Onnis, Francesca; Rigoldi, Chiara; Albertini, Giorgio

    2011-01-01

    A new method for a quantitative and objective description of drawing and for the quantification of drawing ability in children with learning disabilities (LD) is hereby presented. Twenty-four normally developing children (N) (age 10.6 ± 0.5) and 18 children with learning disabilities (LD) (age 10.3 ± 2.4) took part in the study. The drawing tasks were chosen among those already used in daily clinical experience (Denver Developmental Screening Test). Some parameters were defined in order to quantitatively describe the features of the children's drawings, introducing new objective measurements besides the standard subjective clinical evaluation. The experimental set-up proved to be valid for clinical application with LD children. The parameters highlighted the presence of differences in the drawing features of N and LD children. This paper suggests the applicability of this protocol to other fields of motor and cognitive evaluation, as well as the possibility of studying upper limb position and muscle activation during drawing. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. An anthropomorphic phantom for quantitative evaluation of breast MRI.

    PubMed

    Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo

    2011-02-01

    In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. This phantom provides a platform for the optimization and standardization of breast MRI protocols.

  4. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of what an integrated curriculum for reasonable and cost-effective assessment of the key competences of an interventional neuroradiologist could look like. In addition to the traditional assessment of theoretical knowledge, practical skills are measured by the use of endovascular simulators, yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time. PMID:26848840

  5. Quantitative evaluation methods of skin condition based on texture feature parameters.

    PubMed

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of the skin condition after using skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and the hairy pixels on the skin are then accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray value of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, namely the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that the skin condition evaluated with this method is consistent both with biochemical skin evaluation indicators and with human visual assessment. The method overcomes the shortcomings of biochemical evaluation, namely skin damage and long waiting times, as well as the subjectivity and fuzziness of visual evaluation, and achieves non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of the skin condition, and can also quantitatively evaluate subtle improvements in skin condition after using skin care products or receiving beauty treatments.
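
    A minimal Python sketch of the texture step is shown below: it computes the GLCM at 0°, 45°, 90° and 135°, derives the angular second moment, contrast, correlation and entropy, and averages each over the four directions. It assumes a pre-processed 8-bit grayscale image and a recent scikit-image (graycomatrix/graycoprops); the single-pixel offset and the function name are illustrative choices, not specified by the paper.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def skin_texture_features(gray_image, distance=1):
        """Compute GLCM-based texture parameters at 0°, 45°, 90° and 135° and their means.

        `gray_image` is a 2-D uint8 array (already median-filtered and hair-corrected).
        """
        angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
        glcm = graycomatrix(gray_image, distances=[distance], angles=angles,
                            levels=256, symmetric=True, normed=True)

        features = {}
        for prop in ("ASM", "contrast", "correlation"):   # ASM = angular second moment
            values = graycoprops(glcm, prop)[0]           # one value per angle
            features[prop] = values.mean()

        # Entropy is not provided by graycoprops, so compute it from the normalized GLCM.
        p = glcm[:, :, 0, :]                              # shape: (levels, levels, n_angles)
        entropy = -np.sum(p * np.log2(p + 1e-12), axis=(0, 1))
        features["entropy"] = entropy.mean()
        return features

    # toy usage on a random 8-bit patch
    rng = np.random.default_rng(0)
    patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    print(skin_texture_features(patch))
    ```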

  6. Quantitative endoscopy: initial accuracy measurements.

    PubMed

    Truitt, T O; Adelman, R A; Kelly, D H; Willging, J P

    2000-02-01

    The geometric optics of an endoscope can be used to determine the absolute size of an object in an endoscopic field without knowing the actual distance from the object. This study explores the accuracy of a technique that estimates absolute object size from endoscopic images. Quantitative endoscopy involves calibrating a rigid endoscope to produce size estimates from 2 images taken with a known traveled distance between the images. The heights of 12 samples, ranging in size from 0.78 to 11.80 mm, were estimated with this calibrated endoscope. Backup distances of 5 mm and 10 mm were used for comparison. The mean percent error for all estimated measurements when compared with the actual object sizes was 1.12%. The mean errors for 5-mm and 10-mm backup distances were 0.76% and 1.65%, respectively. The mean errors for objects <2 mm and ≥2 mm were 0.94% and 1.18%, respectively. Quantitative endoscopy estimates endoscopic image size to within 5% of the actual object size. This method remains promising for quantitatively evaluating object size from endoscopic images. It does not require knowledge of the absolute distance of the endoscope from the object; rather, it requires only the distance traveled by the endoscope between images.
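
    The paper's calibration procedure is not given here, but the underlying geometry can be sketched with a simple pinhole-camera model: an object of height H at distance z images to h = f·H/z pixels, so two images separated by a known backup distance d give H = d·h1·h2 / (f·(h1 − h2)). The Python sketch below implements that relation; the focal length in pixels and the example numbers are assumptions.

    ```python
    def object_height_from_two_views(h1_px, h2_px, backup_mm, focal_px):
        """Estimate absolute object height (mm) from two endoscopic images.

        Pinhole-camera sketch: an object of height H at distance z projects to
        h = focal_px * H / z pixels.  Backing the endoscope up by `backup_mm`
        between the first (closer, larger) and second image gives
            backup_mm = focal_px * H * (1/h2_px - 1/h1_px),
        which is solved for H below.
        """
        if h1_px <= h2_px:
            raise ValueError("first image must be taken closer to the object (h1 > h2)")
        return backup_mm * h1_px * h2_px / (focal_px * (h1_px - h2_px))

    # Hypothetical numbers: 5 mm backup distance, calibrated focal length of 600 px
    print(object_height_from_two_views(h1_px=220.0, h2_px=180.0, backup_mm=5.0, focal_px=600.0))
    ```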

  7. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

    Report ECBC-TR-1426 (Rastogi, Vipin, et al.): Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface. From the experimental design: each quantitative method was performed three times on three consecutive days; for the CD runs, three…

  8. Rubrics for Evaluating Open Education Resource (OER) Objects

    ERIC Educational Resources Information Center

    Achieve, Inc., 2011

    2011-01-01

    The rubrics presented in this report represent an evaluation system for objects found within Open Education Resources. An object could include images, applets, lessons, units, assessments and more. For the purpose of this evaluation, any component that can exist as a stand-alone qualifies as an object. The rubrics in this packet can be applied…

  9. Establishment of a Quantitative Medical Technology Evaluation System and Indicators within Medical Institutions.

    PubMed

    Wu, Suo-Wei; Chen, Tong; Pan, Qi; Wei, Liang-Yu; Wang, Qin; Li, Chao; Song, Jing-Chen; Luo, Ji

    2018-06-05

    The development and application of medical technologies reflect the medical quality and clinical capacity of a hospital. They are also an effective approach to upgrading medical services and core competitiveness among medical institutions. This study aimed to build a quantitative medical technology evaluation system through a questionnaire survey within medical institutions in order to assess medical technologies more objectively and accurately, promote the management of medical quality and technologies, and ensure the medical safety of various operations among hospitals. A two-level quantitative medical technology evaluation system was built through a two-round questionnaire survey of chosen experts. The Delphi method was applied in identifying the structure of the evaluation system and its indicators. The judgments of the experts on the indicators were adopted in building the matrices so that the weight coefficient and maximum eigenvalue (λmax), consistency index (CI), and random consistency ratio (CR) could be obtained and collected. The results were verified through consistency tests, and the index weight coefficient of each indicator was calculated through the analytic hierarchy process. Twenty-six experts from different medical fields were involved in the questionnaire survey, 25 of whom successfully responded to both rounds. Altogether, 4 primary indicators (safety, effectiveness, innovativeness, and benefits), as well as 13 secondary indicators, were included in the evaluation system. The matrices were built to obtain the λmax, CI, and CR of each expert in the survey; the index weight coefficients of the primary indicators were 0.33, 0.28, 0.27, and 0.12, respectively, and the index weight coefficients of the secondary indicators were calculated accordingly. As the two-round questionnaire survey of experts and the statistical analysis were performed and the credibility of the results was verified through the consistency evaluation test, the
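
    For readers unfamiliar with the analytic hierarchy process step, the sketch below derives weight coefficients, λmax, CI and CR from a single expert's pairwise judgment matrix using the principal-eigenvector method and Saaty's random index table. The 4 × 4 judgment matrix is hypothetical and is not intended to reproduce the paper's weights.

    ```python
    import numpy as np

    # Saaty's random consistency index (RI) for matrix orders 1..10
    RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
                    6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

    def ahp_weights(judgment_matrix):
        """Return (weights, lambda_max, CI, CR) for a pairwise judgment matrix."""
        a = np.asarray(judgment_matrix, dtype=float)
        n = a.shape[0]
        eigenvalues, eigenvectors = np.linalg.eig(a)
        k = np.argmax(eigenvalues.real)
        lambda_max = eigenvalues[k].real
        weights = np.abs(eigenvectors[:, k].real)
        weights /= weights.sum()
        ci = (lambda_max - n) / (n - 1)
        cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] > 0 else 0.0
        return weights, lambda_max, ci, cr

    # Hypothetical 4x4 judgments for safety, effectiveness, innovativeness, benefits
    m = [[1, 2, 2, 3],
         [1/2, 1, 1, 3],
         [1/2, 1, 1, 3],
         [1/3, 1/3, 1/3, 1]]
    w, lmax, ci, cr = ahp_weights(m)
    print(np.round(w, 2), round(lmax, 3), round(ci, 3), round(cr, 3))  # CR < 0.1 => acceptable consistency
    ```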

  10. Quantitative evaluation of toothbrush and arm-joint motion during tooth brushing.

    PubMed

    Inada, Emi; Saitoh, Issei; Yu, Yong; Tomiyama, Daisuke; Murakami, Daisuke; Takemoto, Yoshihiko; Morizono, Ken; Iwasaki, Tomonori; Iwase, Yoko; Yamasaki, Youichi

    2015-07-01

    It is very difficult for dental professionals to objectively assess the tooth brushing skill of patients, because no established index exists for assessing patients' brushing motion. The purpose of this study was to quantitatively evaluate toothbrush and arm-joint motion during tooth brushing. Tooth brushing motion, performed by dental hygienists for 15 s, was captured using a motion-capture system that continuously calculates the three-dimensional coordinates of an object's motion relative to the floor. The dental hygienists performed the tooth brushing on the buccal and palatal sides of their right and left upper molars. The frequencies and power spectra of toothbrush motion and joint angles of the shoulder, elbow, and wrist were calculated and analyzed statistically. The frequency of toothbrush motion was higher on the left side (both buccal and palatal areas) than on the right side. There were no significant differences among joint angle frequencies within each brushing area. The inter- and intra-individual variations of the power spectrum of the elbow flexion angle when brushing were smaller than for any of the other angles. This study quantitatively confirmed that dental hygienists have individual distinctive rhythms during tooth brushing. All arm joints moved synchronously during brushing, and tooth brushing motion was controlled by coordinated movement of the joints. The elbow generated an individual's frequency through a stabilizing movement. The shoulder and wrist control the hand motion, and the elbow generates the cyclic rhythm during tooth brushing.
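
    A minimal sketch of the frequency analysis is given below: it removes the mean, computes the one-sided power spectrum of a joint-angle (or toothbrush-position) signal and returns the dominant frequency. The sampling rate and the synthetic 4.2 Hz brushing signal are illustrative assumptions.

    ```python
    import numpy as np

    def dominant_frequency(signal, sample_rate_hz):
        """Return the dominant frequency (Hz) of a joint-angle or toothbrush-position signal.

        Computes the one-sided power spectrum and picks the peak, ignoring the DC component.
        """
        signal = np.asarray(signal, dtype=float)
        signal = signal - signal.mean()                   # remove offset so DC does not dominate
        spectrum = np.abs(np.fft.rfft(signal)) ** 2       # power spectrum
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / sample_rate_hz)
        peak = np.argmax(spectrum[1:]) + 1                # skip the 0 Hz bin
        return freqs[peak]

    # Hypothetical example: 15 s of a 4.2 Hz brushing motion sampled at 100 Hz
    t = np.arange(0, 15, 0.01)
    angle = 10 * np.sin(2 * np.pi * 4.2 * t) + np.random.normal(0, 0.5, t.size)
    print(dominant_frequency(angle, sample_rate_hz=100))  # close to 4.2
    ```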

  11. Item Selection Techniques and Evaluation of Instructional Objectives.

    ERIC Educational Resources Information Center

    Cox, Richard C.

    The validity of an educational achievement test depends upon the correspondence between specified educational objectives and the extent to which these objectives are measured by the evaluation instrument. This study is designed to evaluate the effect of statistical item selection on the structure of the final evaluation instrument as compared with…

  12. A quantitative approach to evaluating caring in nursing simulation.

    PubMed

    Eggenberger, Terry L; Keller, Kathryn B; Chase, Susan K; Payne, Linda

    2012-01-01

    This study was designed to test a quantitative method of measuring caring in the simulated environment. Since competency in caring is central to nursing practice, ways of including caring concepts in designing scenarios and in evaluation of performance need to be developed. Coates' Caring Efficacy scales were adapted for simulation and named the Caring Efficacy Scale-Simulation Student Version (CES-SSV) and Caring Efficacy Scale-Simulation Faculty Version (CES-SFV). A correlational study was designed to compare student self-ratings with faculty ratings on caring efficacy during an adult acute simulation experience with traditional and accelerated baccalaureate students in a nursing program grounded in caring theory. Student self-ratings were significantly correlated with objective ratings (r = 0.345, 0.356). Both the CES-SSV and the CES-SFV were found to have excellent internal consistency and significantly correlated interrater reliability. They were useful in measuring caring in the simulated learning environment.

  13. Introduction of a method for quantitative evaluation of spontaneous motor activity development with age in infants.

    PubMed

    Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter

    2012-04-01

    Coordination between perception and action is required to interact with the environment successfully. This is already practiced by very young infants, who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by the infants for this purpose change with age. Therefore, very early progress in action control made by infants can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced which allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The introduced methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy significantly influence the development of motor activity. Since the introduced methodology is objective and quantitative, it is suitable for monitoring how newborns train their cognitive processes, which will enable them to cope with their environment through motor interaction.

  14. Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1989-09-01

    This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.

  15. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software where the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.

  16. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, onto which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates for a given subject and the CTRL group mean, along salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
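
    The following Python sketch mimics the general approach (eigenspace projection followed by a linear discriminant yielding one scalar score per subject and a cross-validated accuracy), using scikit-learn's PCA and LDA on synthetic data. It is not the authors' DEF computation; the feature dimensions, component count and random data are assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    # Hypothetical data: rows are subjects, columns are appearance features;
    # labels are 0 for CTRL and 1 for probable AD.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (75, 200)),
                   rng.normal(0.4, 1.0, (75, 200))])
    y = np.array([0] * 75 + [1] * 75)

    # Project onto a reference eigenspace (PCA), then separate the groups with a
    # linear discriminant; the signed distance to the separating hyperplane plays
    # the role of a single disease-burden score.
    model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
    accuracy = cross_val_score(model, X, y, cv=10).mean()

    model.fit(X, y)
    scores = model.decision_function(X)   # one scalar per subject, analogous to a DEF-like factor
    print(f"cross-validated accuracy: {accuracy:.2f}")
    ```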

  17. Subjective vs objective evaluations of smile esthetics.

    PubMed

    Schabel, Brian J; Franchi, Lorenzo; Baccetti, Tiziano; McNamara, James A

    2009-04-01

    The aim of this study was to analyze the relationships between subjective evaluations of posttreatment smiles captured with clinical photography and rated by a panel of orthodontists and parents of orthodontic patients, and objective evaluations of the same smiles from the Smile Mesh program (TDG Computing, Philadelphia, Pa). The clinical photographs of 48 orthodontically treated patients were rated by a panel of 25 experienced orthodontists and 20 parents of patients. Independent samples t tests were used to test whether objective measurements were significantly different between subjects with "attractive" and "unattractive" smiles, and those with the "most attractive" and "least attractive" smiles. Additionally, logistic regression was performed to evaluate whether the measurements could predict whether a smile captured with clinical photography would be attractive or unattractive. The comparison between groups showed no significant differences for any measurement. Subjects with the "most unattractive" smiles had a significantly greater distance between the incisal edge of the maxillary central incisors and the lower lip during smiling, and a significantly smaller smile index than did those with the "most attractive" smiles. As shown by the coefficients of logistic regression, smile attractiveness could not be predicted by any objectively gathered measurement. No objective measure of the smile could predict attractive or unattractive smiles as judged subjectively.

  18. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder.

    PubMed

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-18

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7-11 years (27 males, six females) and twenty-five adult participants aged 21-29 years (19 males, six females) participated in our experiments. Our results suggested that the pronation and supination function in children with ADHD has a tendency to lag behind that of typically developing children by several years. From these results, our system shows potential for objectively evaluating the neurodevelopmental delay of children with ADHD.

  19. Writing objectives and evaluating learning in the affective domain.

    PubMed

    Maier-Lorentz, M M

    1999-01-01

    Staff educators recognize the importance of affective competency for effective nursing practice. Inservice programs must include affective learning with objectives stated in measurable terms. Staff educators often express frustration in developing affective objectives and evaluating the learning outcome because attitudes and feelings are usually inferred from observations. This article presents affective learning objectives for a gerontological nursing inservice program and a rating scale that measures attitudes to evaluate the learning outcome.

  20. An Evaluation of the Effects of Experimenter Control of Objects on Individuals' Engagement in Object Stereotypy

    ERIC Educational Resources Information Center

    Stangeland, Lindsay A.; Smith, Dean P.; Rapp, John T.

    2012-01-01

    In two experiments, the authors evaluated the extent to which (a) individuals preferred engaging in object stereotypy versus observing an experimenter while the experimenter engaged in object stereotypy and (b) an experimenter's engagement in object stereotypy decreased the participants' engagement in object stereotypy. Results of Experiment 1…

  1. Contributions to Objective Measurement and Evaluation of Trainee Competency.

    ERIC Educational Resources Information Center

    Moonan, William J.

    The purpose of this paper is to lay a basis for and discuss the components of a system, called COMET, designed to objectively measure and evaluate the competency of trainees in military training enterprises. COMET is an acronym for "Computerized Objective Measurement and Evaluation of Trainees." These goals will be accomplished by: (a)…

  2. Physiological and subjective evaluation of a human-robot object hand-over task.

    PubMed

    Dehais, Frédéric; Sisbot, Emrah Akin; Alami, Rachid; Causse, Mickaël

    2011-11-01

    In the context of task sharing between a robot companion and its human partners, the notions of safe and compliant hardware are not enough. It is necessary to guarantee ergonomic robot motions. Therefore, we have developed Human Aware Manipulation Planner (Sisbot et al., 2010), a motion planner specifically designed for human-robot object transfer by explicitly taking into account the legibility, the safety and the physical comfort of robot motions. The main objective of this research was to define precise subjective metrics to assess our planner when a human interacts with a robot in an object hand-over task. A second objective was to obtain quantitative data to evaluate the effect of this interaction. Given the short duration, the "relative ease" of the object hand-over task and its qualitative component, classical behavioral measures based on accuracy or reaction time were unsuitable for comparing our gestures. In this perspective, we selected three measurements based on the galvanic skin conductance response, the deltoid muscle activity and the ocular activity. To test our assumptions and validate our planner, an experimental set-up involving Jido, a mobile manipulator robot, and a seated human was proposed. For the purpose of the experiment, we defined three motions that combine different levels of legibility, safety and physical comfort values. After each robot gesture the participants were asked to rate them on a three-dimensional subjective scale. The subjective data were in favor of our reference motion. Moreover, the three motions elicited different physiological and ocular responses that could be used to partially discriminate between them. Copyright © 2011 Elsevier Ltd and the Ergonomics Society. All rights reserved.

  3. Facial asymmetry quantitative evaluation in oculoauriculovertebral spectrum.

    PubMed

    Manara, Renzo; Schifano, Giovanni; Brotto, Davide; Mardari, Rodica; Ghiselli, Sara; Gerunda, Antonio; Ghirotto, Cristina; Fusetti, Stefano; Piacentile, Katherine; Scienza, Renato; Ermani, Mario; Martini, Alessandro

    2016-03-01

    Facial asymmetries in oculoauriculovertebral spectrum (OAVS) patients might require surgical corrections that are mostly based on a qualitative approach and the surgeon's experience. The present study aimed to develop a quantitative 3D CT imaging-based procedure suitable for maxillo-facial surgery planning in OAVS patients. Thirteen OAVS patients (mean age 3.5 ± 4.0 years; range 0.2-14.2, 6 females) and 13 controls (mean age 7.1 ± 5.3 years; range 0.6-15.7, 5 females) who underwent head CT examination were retrospectively enrolled. Eight bilateral anatomical facial landmarks were defined on 3D CT images (porion, orbitale, most anterior point of frontozygomatic suture, most superior point of temporozygomatic suture, most posterior-lateral point of the maxilla, gonion, condylion, mental foramen), and distance from orthogonal planes (in millimeters) was used to evaluate the asymmetry on each axis and to calculate a global asymmetry index of each anatomical landmark. Mean asymmetry values and relative confidence intervals were obtained from the control group. OAVS patients showed 2.5 ± 1.8 landmarks above the confidence interval when considering the global asymmetry values; 12 patients (92%) showed at least one pathologically asymmetric landmark. Considering each axis, the mean number of pathologically asymmetric landmarks increased to 5.5 ± 2.6 (p = 0.002) and all patients presented at least one significant landmark asymmetry. Modern CT-based 3D reconstructions allow accurate assessment of facial bone asymmetries in patients affected by OAVS. The evaluation as a global score and in different orthogonal axes provides precise quantitative data suitable for maxillo-facial surgical planning. CT-based 3D reconstruction might allow a quantitative approach for planning and following up maxillo-facial surgery in OAVS patients.
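
    The exact asymmetry index is not spelled out in this record; a plausible minimal sketch is shown below, where the per-axis asymmetry of a bilateral landmark pair is the absolute difference of the right and left distances to the corresponding orthogonal plane, and the global index is the Euclidean norm over the three axes. The formula and the example coordinates are assumptions for illustration.

    ```python
    import numpy as np

    def landmark_asymmetry(right_xyz, left_xyz):
        """Per-axis and global asymmetry for one bilateral landmark pair.

        `right_xyz` and `left_xyz` are distances (mm) of the right and left landmark
        from the three orthogonal reference planes, e.g. (sagittal, coronal, axial).
        """
        right = np.asarray(right_xyz, dtype=float)
        left = np.asarray(left_xyz, dtype=float)
        per_axis = np.abs(right - left)            # asymmetry on each axis, in mm
        global_index = np.linalg.norm(per_axis)    # single summary value for the landmark
        return per_axis, global_index

    # Hypothetical gonion coordinates (mm from the sagittal, coronal and axial planes)
    per_axis, global_index = landmark_asymmetry((48.1, 62.0, 30.5), (45.0, 61.2, 31.9))
    print(per_axis, round(global_index, 2))
    ```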

  4. Fuzzy object models for newborn brain MR image segmentation

    NASA Astrophysics Data System (ADS)

    Kobashi, Syoji; Udupa, Jayaram K.

    2013-03-01

    Newborn brain MR image segmentation is a challenging problem because of the variety of sizes, shapes and MR signals, although it is fundamental to quantitative radiology of the newborn brain. Because of the large difference between the adult brain and the newborn brain, it is difficult to directly apply conventional methods to the newborn brain. Inspired by the original fuzzy object model introduced by Udupa et al. at SPIE Medical Imaging 2011, called the fuzzy shape object model (FSOM) here, this paper introduces the fuzzy intensity object model (FIOM) and proposes a new image segmentation method which combines the FSOM and FIOM into fuzzy connected (FC) image segmentation. The fuzzy object models are built from training datasets in which the cerebral parenchyma is delineated by experts. After registering the FSOM with the image under evaluation, the proposed method roughly recognizes the cerebral parenchyma region based on prior knowledge of location, shape, and MR signal given by the registered FSOM and FIOM. Then, FC image segmentation delineates the cerebral parenchyma using the fuzzy object models. The proposed method has been evaluated on 9 newborn brain MR images using the leave-one-out strategy. The revised age was between -1 and 2 months. Quantitative evaluation using the false positive volume fraction (FPVF) and false negative volume fraction (FNVF) has been conducted; an FPVF of 0.75% and an FNVF of 3.75% were achieved. More data collection and testing are underway.
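
    FPVF and FNVF can be computed from binary voxel masks as in the short sketch below, which expresses falsely included and falsely excluded voxels as fractions of the expert-delineated reference volume; note that the exact denominator convention used by the authors is not stated in this record.

    ```python
    import numpy as np

    def volume_fractions(segmentation, reference):
        """False positive and false negative volume fractions for binary masks.

        Both arrays are boolean voxel masks of the cerebral parenchyma.  Fractions are
        expressed relative to the reference (expert-delineated) volume; other
        denominator conventions exist in the literature.
        """
        seg = np.asarray(segmentation, dtype=bool)
        ref = np.asarray(reference, dtype=bool)
        ref_volume = ref.sum()
        fpvf = np.logical_and(seg, ~ref).sum() / ref_volume
        fnvf = np.logical_and(~seg, ref).sum() / ref_volume
        return fpvf, fnvf

    # Toy 3-D example
    seg = np.zeros((10, 10, 10), dtype=bool); seg[2:8, 2:8, 2:8] = True
    ref = np.zeros((10, 10, 10), dtype=bool); ref[3:8, 2:8, 2:8] = True
    print(volume_fractions(seg, ref))   # small FPVF, zero FNVF
    ```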

  5. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    PubMed

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by the combined use of multi-component simultaneous quantitative analysis and bioassay. The rhubarb dispensing granules were used as the model drug for this demonstrative study. The ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneous quantitative determination of the 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; the purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model; the blood activating biopotency of different batches of rhubarb dispensing granules was determined based on an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and purgative biopotency and blood activating biopotency. The results of the multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood activating biopotency among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.

  6. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  7. Quantitative Evaluation of Performance during Robot-assisted Treatment.

    PubMed

    Peri, E; Biffi, E; Maghini, C; Servodio Iammarrone, F; Gagliardi, C; Germiniasi, C; Pedrocchi, A; Turconi, A C; Reni, G

    2016-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The great potential of robots in extracting quantitative and meaningful data is not always exploited in clinical practice. The aim of the present work is to describe a simple parameter to assess the performance of subjects during upper limb robotic training exploiting data automatically recorded by the robot, with no additional effort for patients and clinicians. Fourteen children affected by cerebral palsy (CP) performed a training with Armeo®Spring. Each session was evaluated with P, a simple parameter that depends on the overall performance recorded, and median and interquartile values were computed to perform a group analysis. Median (interquartile) values of P significantly increased from 0.27 (0.21) at T0 to 0.55 (0.27) at T1. This improvement was functionally validated by a significant increase of the Melbourne Assessment of Unilateral Upper Limb Function. The parameter described here was able to show variations in performance over time and enabled a quantitative evaluation of motion abilities in a way that is reliable with respect to a well-known clinical scale.

  8. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  9. Multi-objective optimization for evaluation of simulation fidelity for precipitation, cloudiness and insolation in regional climate models

    NASA Astrophysics Data System (ADS)

    Lee, H.

    2016-12-01

    Precipitation is one of the most important climate variables that are taken into account in studying regional climate. Nevertheless, how precipitation will respond to a changing climate and even its mean state in the current climate are not well represented in regional climate models (RCMs). Hence, comprehensive and mathematically rigorous methodologies to evaluate precipitation and related variables in multiple RCMs are required. The main objective of the current study is to evaluate the joint variability of climate variables related to model performance in simulating precipitation and condense multiple evaluation metrics into a single summary score. We use multi-objective optimization, a mathematical process that provides a set of optimal tradeoff solutions based on a range of evaluation metrics, to characterize the joint representation of precipitation, cloudiness and insolation in RCMs participating in the North American Regional Climate Change Assessment Program (NARCCAP) and Coordinated Regional Climate Downscaling Experiment-North America (CORDEX-NA). We also leverage ground observations, NASA satellite data and the Regional Climate Model Evaluation System (RCMES). Overall, the quantitative comparison of joint probability density functions between the three variables indicates that performance of each model differs markedly between sub-regions and also shows strong seasonal dependence. Because of the large variability across the models, it is important to evaluate models systematically and make future projections using only models showing relatively good performance. Our results indicate that the optimized multi-model ensemble always shows better performance than the arithmetic ensemble mean and may guide reliable future projections.
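
    The core of the multi-objective step, selecting the non-dominated (Pareto-optimal) models over several evaluation metrics, can be sketched as below. The metric matrix is hypothetical; it simply assumes one row of error scores per model, with smaller values better for every column.

    ```python
    import numpy as np

    def pareto_front(metrics):
        """Return the indices of non-dominated models.

        `metrics` has one row per model and one column per evaluation metric, with
        all metrics oriented so that smaller is better (e.g. errors in the simulated
        precipitation, cloudiness and insolation climatologies).
        """
        m = np.asarray(metrics, dtype=float)
        n = m.shape[0]
        keep = []
        for i in range(n):
            dominated = any(
                np.all(m[j] <= m[i]) and np.any(m[j] < m[i])
                for j in range(n) if j != i
            )
            if not dominated:
                keep.append(i)
        return keep

    # Hypothetical RCM errors: columns = precipitation, cloudiness, insolation
    errors = np.array([[0.8, 0.6, 0.7],
                       [0.5, 0.9, 0.6],
                       [0.9, 0.9, 0.9],    # dominated by the first row
                       [0.6, 0.5, 0.8]])
    print(pareto_front(errors))            # indices of the non-dominated models
    ```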

  10. Breast Retraction Assessment: an objective evaluation of cosmetic results of patients treated conservatively for breast cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pezner, R.D.; Patterson, M.P.; Hill, L.R.

    Breast Retraction Assessment (BRA) is an objective evaluation of the amount of cosmetic retraction of the treated breast in comparison to the untreated breast in patients who receive conservative treatment for breast cancer. A clear acrylic sheet supported vertically and marked as a grid at 1 cm intervals is employed to perform the measurements. Average BRA value in 29 control patients without breast cancer was 1.2 cm. Average BRA value in 27 patients treated conservatively for clinical Stage I or II unilateral breast cancer was 3.7 cm. BRA values in breast cancer patients ranged from 0.0 to 8.5 cm. Patients who received a local radiation boost to the primary tumor bed site had statistically significantly less retraction than those who did not receive a boost. Patients who had an extensive primary tumor resection had statistically significantly more retraction than those who underwent a more limited resection. In comparison to qualitative forms of cosmetic analysis, BRA is an objective test that can quantitatively evaluate factors which may be related to cosmetic retraction in patients treated conservatively for breast cancer.

  11. A Systematic Quantitative-Qualitative Model: How To Evaluate Professional Services

    ERIC Educational Resources Information Center

    Yoda, Koji

    1973-01-01

    The proposed evaluation model provides for the assignment of relative weights to each criterion, and establishes a weighting system for calculating a quantitative-qualitative raw score for each service activity of a faculty member being reviewed. (Author)

  12. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  13. Examining Readers' Evaluations of Objectivity and Bias in News Discourse

    ERIC Educational Resources Information Center

    Cramer, Peter; Eisenhart, Christopher

    2014-01-01

    Readers' objectivity and bias evaluations of news texts were investigated in order to better understand the process by which readers make these kinds of judgments and the evidence on which they base them. Readers were primed to evaluate news texts for objectivity and bias, and their selections and metacommentary were analyzed. Readers detected…

  14. Quantitative evaluation of in vivo vital-dye fluorescence endoscopic imaging for the detection of Barrett's-associated neoplasia.

    PubMed

    Thekkek, Nadhi; Lee, Michelle H; Polydorides, Alexandros D; Rosen, Daniel G; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2015-05-01

    Current imaging tools are associated with inconsistent sensitivity and specificity for detection of Barrett's-associated neoplasia. Optical imaging has shown promise in improving the classification of neoplasia in vivo. The goal of this pilot study was to evaluate whether in vivo vital dye fluorescence imaging (VFI) has the potential to improve the accuracy of early detection of Barrett's-associated neoplasia. In vivo endoscopic VFI images were collected from 65 sites in 14 patients with confirmed Barrett's esophagus (BE), dysplasia, or esophageal adenocarcinoma using a modular video endoscope and a high-resolution microendoscope (HRME). Qualitative image features were compared to histology; VFI and HRME images show changes in glandular structure associated with neoplastic progression. Quantitative image features in VFI images were identified for objective image classification of metaplasia and neoplasia, and a diagnostic algorithm was developed using leave-one-out cross validation. Three image features extracted from VFI images were used to classify tissue as neoplastic or not with a sensitivity of 87.8% and a specificity of 77.6% (AUC = 0.878). A multimodal approach incorporating VFI and HRME imaging can delineate epithelial changes present in Barrett's-associated neoplasia. Quantitative analysis of VFI images may provide a means for objective interpretation of BE during surveillance.
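
    A minimal sketch of the quantitative classification step is shown below: leave-one-out cross-validated predictions from three image features, followed by sensitivity, specificity and AUC. The classifier (logistic regression), the synthetic features and the 0.5 decision threshold are assumptions; the paper's actual diagnostic algorithm is not specified in this record.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.metrics import roc_auc_score, confusion_matrix

    # Hypothetical data: three quantitative VFI image features per imaged site,
    # with label 1 = neoplastic and 0 = non-neoplastic.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.0, 1.0, (40, 3)), rng.normal(1.0, 1.0, (25, 3))])
    y = np.array([0] * 40 + [1] * 25)

    clf = LogisticRegression()   # stand-in classifier; the paper's algorithm is not given here
    probs = cross_val_predict(clf, X, y, cv=LeaveOneOut(), method="predict_proba")[:, 1]

    auc = roc_auc_score(y, probs)
    preds = (probs >= 0.5).astype(int)
    tn, fp, fn, tp = confusion_matrix(y, preds).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(f"AUC={auc:.3f}  sensitivity={sensitivity:.3f}  specificity={specificity:.3f}")
    ```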

  15. 3D Imaging for Museum Artefacts: a Portable Test Object for Heritage and Museum Documentation of Small Objects

    NASA Astrophysics Data System (ADS)

    Hess, M.; Robson, S.

    2012-07-01

    3D colour image data generated for the recording of small museum objects and archaeological finds are highly variable in quality and fitness for purpose. Whilst current technology is capable of extremely high quality outputs, there are currently no common standards or applicable guidelines in either the museum or engineering domain suited to scientific evaluation, understanding and tendering for 3D colour digital data. This paper firstly explains the rationale towards and requirements for 3D digital documentation in museums. Secondly it describes the design process, development and use of a new portable test object suited to sensor evaluation and the provision of user acceptance metrics. The test object is specifically designed for museums and heritage institutions and includes known surface and geometric properties which support quantitative and comparative imaging on different systems. The development of a supporting protocol will allow object reference data to be included in the data processing workflow with specific reference to conservation and curation.

  16. A Framework for Quantitative Evaluation of Care Coordination Effectiveness

    ERIC Educational Resources Information Center

    Liu, Wei

    2017-01-01

    The U.S. healthcare system lacks incentives and quantitative evaluation tools to assess coordination in a patient's care transition process. This is needed because poor care coordination has been identified by many studies as one of the major root causes for the U.S. health system's inefficiency, for poor outcomes, and for high cost. Despite…

  17. [Bio-objects and biological methods of space radiation effects evaluation].

    PubMed

    Kaminskaia, E V; Nevzgodina, L V; Platova, N G

    2009-01-01

    The unique conditions of space experiments place stringent requirements on bio-objects and on the biological methods used to evaluate radiation effects. The paper discusses the suitability of a number of bio-objects, varying in evolutionary stage and metabolism, for space research aimed at establishing common patterns of radiation damage caused by heavy ions (HI) and the character of HI-cell interaction. Physical detectors in space experiments of the BIOBLOCK series make it possible to identify bio-objects hit by space HI and to establish the correlation between HI track topography and biological effect. The paper provides a comprehensive description of the bio-objects chosen for two BIOBLOCK experiments (a population of the hydrophyte Wolffia arrhiza (fam. duckweed) and Lactuca sativa seeds) and the method of evaluating the effects of single heavy ions of space radiation. Direct effects of heavy ions on cells can be determined by the criteria of chromosomal aberrations and delayed morphologic abnormalities. The evaluation results are compared with data on human blood lymphocytes. Consideration is given to the procedures for the treatment and investigation of the test objects.

  18. Objective and quantitative definitions of modified food textures based on sensory and rheological methodology

    PubMed Central

    Wendin, Karin; Ekman, Susanne; Bülow, Margareta; Ekberg, Olle; Johansson, Daniel; Rothenberg, Elisabet; Stading, Mats

    2010-01-01

    Introduction Patients who suffer from chewing and swallowing disorders, i.e. dysphagia, may have difficulties ingesting normal food and liquids. In these patients a texture-modified diet may enable the patient to maintain adequate nutrition. However, there is no generally accepted definition of ‘texture’ that includes measurements describing different food textures. Objective Objectively define and quantify categories of texture-modified food by conducting rheological measurements and sensory analyses. A further objective was to facilitate the communication and recommendations of appropriate food textures for patients with dysphagia. Design About 15 food samples varying in texture qualities were characterized by descriptive sensory and rheological measurements. Results Soups were perceived as homogenous; thickened soups were perceived as being easier to swallow, more melting and creamy compared with soups without thickener. Viscosity differed between the two types of soups. Texture descriptors for pâtés were characterized by high chewing resistance, firmness, and having larger particles compared with timbales and jellied products. Jellied products were perceived as wobbly, creamy, and easier to swallow. Concerning the rheological measurements, all solid products were more elastic than viscous (G′>G″), belonging to different G′ intervals: jellied products (low G′) and timbales together with pâtés (higher G′). Conclusion By combining sensory and rheological measurements, a system of objective, quantitative, and well-defined food textures was developed that characterizes the different texture categories. PMID:20592965

  19. Quantitative evaluation of manufacturability and performance for ILT produced mask shapes using a single-objective function

    NASA Astrophysics Data System (ADS)

    Choi, Heon; Wang, Wei-long; Kallingal, Chidam

    2015-03-01

    The continuous scaling of semiconductor devices is quickly outpacing the resolution improvements of lithographic exposure tools and processes. This one-sided progression has pushed optical lithography to its limits, resulting in the use of well-known techniques such as Sub-Resolution Assist Features (SRAFs), Source-Mask Optimization (SMO), and double patterning, to name a few. These techniques, belonging to a larger category of Resolution Enhancement Techniques (RET), have extended the resolution capabilities of optical lithography at the cost of increasing mask complexity, and therefore cost. One such technique, called the Inverse Lithography Technique (ILT), has attracted much attention for its ability to produce the best possible theoretical mask design. ILT treats the mask design process as an inverse problem, where the known transformation from mask to wafer is carried out backwards using a rigorous mathematical approach. One practical problem in the application of ILT is the resulting contour-like mask shapes that must be "Manhattanized" (composed of straight edges and 90-deg corners) in order to produce a manufacturable mask. This conversion process inherently degrades the mask quality, as it is a departure from the "optimal mask" represented by the continuously curved shapes produced by ILT. However, simpler masks composed of longer straight edges reduce the mask cost, as they lower the shot count and save mask writing time during mask fabrication, resulting in a conflict between manufacturability and performance for ILT-produced masks [1,2]. In this study, various commonly used metrics will be combined into an objective function to produce a single number that quantitatively measures a particular ILT solution's ability to balance mask manufacturability and RET performance. Several metrics that relate to mask manufacturing costs (i.e. mask vertex count, ILT computation runtime) are appropriately weighted against metrics that represent RET capability (i.e. process
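
    One way to realize such a single-objective function is a weighted sum of normalized metrics, as sketched below; the metric names (vertex count, write time, edge placement error, process-variation band), the baselines and the weights are purely illustrative assumptions, not values from the study.

    ```python
    def ilt_objective(metrics, weights, baselines):
        """Single scalar score balancing mask manufacturability and RET performance.

        `metrics`   : measured values for one ILT solution
        `baselines` : reference values used to normalize each metric
        `weights`   : relative importance of each metric; a lower score means a better solution
        All three are dicts keyed by metric name; the names here are illustrative only.
        """
        return sum(weights[name] * metrics[name] / baselines[name] for name in weights)

    weights   = {"vertex_count": 0.25, "write_time_h": 0.15, "epe_nm": 0.35, "pv_band_nm": 0.25}
    baselines = {"vertex_count": 5.0e6, "write_time_h": 12.0, "epe_nm": 2.0, "pv_band_nm": 4.0}

    # Two hypothetical ILT solutions: a simplified Manhattan mask vs. a curvilinear one
    manhattan   = {"vertex_count": 3.2e6, "write_time_h": 9.0,  "epe_nm": 2.4, "pv_band_nm": 4.6}
    curvilinear = {"vertex_count": 7.5e6, "write_time_h": 16.0, "epe_nm": 1.8, "pv_band_nm": 3.7}

    print(ilt_objective(manhattan, weights, baselines), ilt_objective(curvilinear, weights, baselines))
    ```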

  20. Effect of ethnicity on performance in a final objective structured clinical examination: qualitative and quantitative study

    PubMed Central

    Wass, Val; Roberts, Celia; Hoogenboom, Ron; Jones, Roger; Van der Vleuten, Cees

    2003-01-01

    Objective To assess the effect of ethnicity on student performance in stations assessing communication skills within an objective structured clinical examination. Design Quantitative and qualitative study. Setting A final UK clinical examination consisting of a two day objective structured clinical examination with 22 stations. Participants 82 students from ethnic minorities and 97 white students. Main outcome measures Mean scores for stations (quantitative) and observations made using discourse analysis on selected communication stations (qualitative). Results Mean performance of students from ethnic minorities was significantly lower than that of white students for stations assessing communication skills on days 1 (67.0% (SD 6.8%) and 72.3% (7.6%); P=0.001) and 2 (65.2% (6.6%) and 69.5% (6.3%); P=0.003). No examples of overt discrimination were found in 309 video recordings. Transcriptions showed subtle differences in communication styles in some students from ethnic minorities who performed poorly. Examiners' assumptions about what is good communication may have contributed to differences in grading. Conclusions There was no evidence of explicit discrimination between students from ethnic minorities and white students in the objective structured clinical examination. A small group of male students from ethnic minorities used particularly poorly rated communicative styles, and some subtle problems in assessing communication skills may have introduced bias. Tests need to reflect issues of diversity to ensure that students from ethnic minorities are not disadvantaged. What is already known on this topic: UK medical schools are concerned that students from ethnic minorities may perform less well than white students in examinations. It is important to understand whether our examination system disadvantages them. What this study adds: Mean performance of students from ethnic minorities was significantly lower than that of white students in a final year objective structured clinical examination.

  1. Standardization in the Handling and Evaluation of Objective Examinations.

    ERIC Educational Resources Information Center

    Sass, M. Burke

    1978-01-01

    In response to requests for standardization on testing and grading, a pilot program for the administration and evaluation of objective examinations was instituted. Outlined are objectives, initial test item collection, procedural flow for examinations, faculty responsibilities, support staff responsibilities, and project coordinator services. (LBH)

  2. Quantitative evaluation of skeletal muscle defects in second harmonic generation images.

    PubMed

    Liu, Wenhua; Raben, Nina; Ralston, Evelyn

    2013-02-01

    Skeletal muscle pathologies cause irregularities in the normally periodic organization of the myofibrils. Objective grading of muscle morphology is necessary to assess muscle health, compare biopsies, and evaluate treatments and the evolution of disease. To facilitate such quantitation, we have developed a fast, sensitive, automatic imaging analysis software. It detects major and minor morphological changes by combining texture features and Fourier transform (FT) techniques. We apply this tool to second harmonic generation (SHG) images of muscle fibers which visualize the repeating myosin bands. Texture features are then calculated by using a Haralick gray-level cooccurrence matrix in MATLAB. Two scores are retrieved from the texture correlation plot by using FT and curve-fitting methods. The sensitivity of the technique was tested on SHG images of human adult and infant muscle biopsies and of mouse muscle samples. The scores are strongly correlated to muscle fiber condition. We named the software MARS (muscle assessment and rating scores). It is executed automatically and is highly sensitive even to subtle defects. We propose MARS as a powerful and unbiased tool to assess muscle health.
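
    The following is a hedged Python sketch of the general approach described (a gray-level co-occurrence matrix correlation curve scored with a Fourier transform), not the authors' MARS code; the synthetic test image, offsets, and scoring choice are assumptions:

        # Hedged sketch of a MARS-style banding score: compute GLCM correlation as a function
        # of pixel offset along the fiber axis, then score the regularity of the banding from
        # the Fourier spectrum of that curve. Requires scikit-image >= 0.19 and numpy.
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        def banding_score(image_8bit, max_offset=64):
            """Return the dominant non-DC Fourier amplitude of the GLCM-correlation curve."""
            corr = []
            for d in range(1, max_offset + 1):
                glcm = graycomatrix(image_8bit, distances=[d], angles=[0],
                                    levels=256, symmetric=True, normed=True)
                corr.append(graycoprops(glcm, "correlation")[0, 0])
            spectrum = np.abs(np.fft.rfft(np.asarray(corr) - np.mean(corr)))
            return float(spectrum[1:].max())  # a strong peak suggests regular banding

        # Synthetic periodic "fiber" image standing in for an SHG image of myosin bands.
        row = (127 * (1 + np.sin(np.linspace(0, 40 * np.pi, 256)))).astype(np.uint8)
        fiber = np.tile(row, (64, 1))
        print(f"banding score: {banding_score(fiber):.2f}")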

  3. Quantitative evaluation of skeletal muscle defects in second harmonic generation images

    NASA Astrophysics Data System (ADS)

    Liu, Wenhua; Raben, Nina; Ralston, Evelyn

    2013-02-01

    Skeletal muscle pathologies cause irregularities in the normally periodic organization of the myofibrils. Objective grading of muscle morphology is necessary to assess muscle health, compare biopsies, and evaluate treatments and the evolution of disease. To facilitate such quantitation, we have developed a fast, sensitive, automatic imaging analysis software. It detects major and minor morphological changes by combining texture features and Fourier transform (FT) techniques. We apply this tool to second harmonic generation (SHG) images of muscle fibers which visualize the repeating myosin bands. Texture features are then calculated by using a Haralick gray-level cooccurrence matrix in MATLAB. Two scores are retrieved from the texture correlation plot by using FT and curve-fitting methods. The sensitivity of the technique was tested on SHG images of human adult and infant muscle biopsies and of mouse muscle samples. The scores are strongly correlated to muscle fiber condition. We named the software MARS (muscle assessment and rating scores). It is executed automatically and is highly sensitive even to subtle defects. We propose MARS as a powerful and unbiased tool to assess muscle health.

  4. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue.

    PubMed

    Foldager, Casper Bindzus; Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-04-01

    To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin-eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm³ (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage.
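
    A minimal Python sketch of the two-step point-counting arithmetic described above, using a generic Cavalieri-style estimator; the section spacing, grid constant, and point counts are made-up illustration values, not data from the study:

        # Step 1: defect volume from points hitting the defect on each sampled section.
        # Step 2: tissue composition as the fraction of points assigned to each category.
        SECTION_SPACING_MM = 0.3      # distance between sampled sections (assumed)
        AREA_PER_POINT_MM2 = 0.05     # grid area associated with each point (assumed)

        points_on_defect_per_section = [28, 31, 35, 33, 30, 27, 22]
        category_points = {
            "hyaline cartilage": 64, "fibrocartilage": 41, "fibrous tissue": 52,
            "bone": 18, "scaffold": 9, "other": 6,
        }

        defect_volume_mm3 = (SECTION_SPACING_MM * AREA_PER_POINT_MM2
                             * sum(points_on_defect_per_section))
        total = sum(category_points.values())
        print(f"estimated defect volume: {defect_volume_mm3:.2f} mm^3")
        for tissue, points in category_points.items():
            print(f"  {tissue}: {100 * points / total:.1f}%")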

  5. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.

  6. [Evaluation of YAG-laser vitreolysis effectiveness based on quantitative characterization of vitreous floaters].

    PubMed

    Shaimova, V A; Shaimov, T B; Shaimov, R B; Galin, A Yu; Goloshchapova, Zh A; Ryzhkov, P K; Fomin, A V

    2018-01-01

    To develop methods for evaluating effectiveness of YAG-laser vitreolysis of vitreous floaters. The study included 144 patients (173 eyes) who had undergone YAG-laser vitreolysis and were under observation from 01.09.16 to 31.01.18. The patients were 34 to 86 years old (mean age 62.7±10.2 years); 28 (19.4%) patients were male and 116 (80.6%) were female. All patients underwent standard and additional examination: ultrasonography (Accutome B-scan plus, U.S.A.), optical biometry (Lenstar 900, Haag-Streit, Switzerland), spectral optical coherence tomography using RTVue XR Avanti scanner (Optovue, U.S.A.) in modes Enhanced HD Line, 3D Retina, 3D Widefield MCT, Cross Line, Angio Retina, and scanning laser ophthalmoscopy (SLO) using Navilas 577s system. Laser vitreolysis was performed using the Ultra Q Reflex laser (Ellex, Australia). This paper presents methods of objective quantitative and qualitative assessment of artifactual shadows of vitreous floaters with the spectral optical coherence tomography scanner RTVue XR Avanti employing an algorithm of automatic detection of non-perfusion zones in modes Angio Retina, HD Angio Retina, as well as foveal avascular zone (FAZ) measurement with Angio Analytics® software. SLO performed with Navilas 577s was used as a method of visualizing floaters and artifactual shadows in retinal surface layers prior to surgical treatment and after YAG-laser vitreolysis. The suggested methods of quantitative and qualitative assessment of artifactual shadows of the floaters in retinal layers are promising and may prove to be highly relevant for clinical monitoring of patients, optimization of treatment indications and evaluating effectiveness of YAG-laser vitreolysis. Further research of laser vitreolysis effectiveness in patients with vitreous floaters is necessary.

  7. Student evaluations of teaching: teaching quantitative courses can be hazardous to one's career.

    PubMed

    Uttl, Bob; Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards.

  8. Attitudes as Object-Evaluation Associations of Varying Strength

    PubMed Central

    Fazio, Russell H.

    2009-01-01

    Historical developments regarding the attitude concept are reviewed, and set the stage for consideration of a theoretical perspective that views attitude, not as a hypothetical construct, but as evaluative knowledge. A model of attitudes as object-evaluation associations of varying strength is summarized, along with research supporting the model’s contention that at least some attitudes are represented in memory and activated automatically upon the individual’s encountering the attitude object. The implications of the theoretical perspective for a number of recent discussions related to the attitude concept are elaborated. Among these issues are the notion of attitudes as “constructions,” the presumed malleability of automatically-activated attitudes, correspondence between implicit and explicit measures of attitude, and postulated dual or multiple attitudes. PMID:19424447

  9. Quantitative evaluation of the voice range profile in patients with voice disorder.

    PubMed

    Ikeda, Y; Masuda, T; Manako, H; Yamashita, H; Yamamoto, T; Komiyama, S

    1999-01-01

    In 1953, Calvet first displayed the fundamental frequency (pitch) and sound pressure level (intensity) of a voice on a two-dimensional plane and created a voice range profile. This profile has been used to evaluate clinically various vocal disorders, although such evaluations to date have been subjective without quantitative assessment. In the present study, a quantitative system was developed to evaluate the voice range profile utilizing a personal computer. The area of the voice range profile was defined as the voice volume. This volume was analyzed in 137 males and 175 females who were treated for various dysphonias at Kyushu University between 1984 and 1990. Ten normal subjects served as controls. The voice volume in cases with voice disorders significantly decreased irrespective of the disease and sex. Furthermore, cases having better improvement after treatment showed a tendency for the voice volume to increase. These findings illustrated the voice volume as a useful clinical test for evaluating voice control in cases with vocal disorders.

  10. Quantitative evaluation of dermatological antiseptics.

    PubMed

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. © 2015 British Association of Dermatologists.

  11. Evaluation of a rapid quantitative determination method of PSA concentration with gold immunochromatographic strips.

    PubMed

    Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia

    2015-11-03

    Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100, 96.6, 89.5, 100, and 97.4 %, respectively. Reproducibility of the test was 99.2 %, and interobserver variation was 8 %, with a false-positive rate of 3.4 %. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so that tests using this system can be easily performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.
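
    A minimal Python sketch of how the reported diagnostic performance figures are defined; the confusion-matrix counts below are chosen so the percentages match those quoted in the abstract, but the study's actual counts are not given here:

        # Standard definitions of the metrics quoted in the abstract.
        def diagnostic_metrics(tp, fp, tn, fn):
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "positive predictive value": tp / (tp + fp),
                "negative predictive value": tn / (tn + fn),
                "accuracy": (tp + tn) / (tp + fp + tn + fn),
            }

        # Hypothetical counts for a 305-subject screening cohort.
        for name, value in diagnostic_metrics(tp=68, fp=8, tn=229, fn=0).items():
            print(f"{name}: {100 * value:.1f}%")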

  12. Quantitative evaluation of in vivo vital-dye fluorescence endoscopic imaging for the detection of Barrett’s-associated neoplasia

    PubMed Central

    Thekkek, Nadhi; Lee, Michelle H.; Polydorides, Alexandros D.; Rosen, Daniel G.; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2015-01-01

    Abstract. Current imaging tools are associated with inconsistent sensitivity and specificity for detection of Barrett’s-associated neoplasia. Optical imaging has shown promise in improving the classification of neoplasia in vivo. The goal of this pilot study was to evaluate whether in vivo vital dye fluorescence imaging (VFI) has the potential to improve the accuracy of early-detection of Barrett’s-associated neoplasia. In vivo endoscopic VFI images were collected from 65 sites in 14 patients with confirmed Barrett’s esophagus (BE), dysplasia, or esophageal adenocarcinoma using a modular video endoscope and a high-resolution microendoscope (HRME). Qualitative image features were compared to histology; VFI and HRME images show changes in glandular structure associated with neoplastic progression. Quantitative image features in VFI images were identified for objective image classification of metaplasia and neoplasia, and a diagnostic algorithm was developed using leave-one-out cross validation. Three image features extracted from VFI images were used to classify tissue as neoplastic or not with a sensitivity of 87.8% and a specificity of 77.6% (AUC=0.878). A multimodal approach incorporating VFI and HRME imaging can delineate epithelial changes present in Barrett’s-associated neoplasia. Quantitative analysis of VFI images may provide a means for objective interpretation of BE during surveillance. PMID:25950645
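
    A hedged Python sketch of the evaluation scheme the abstract describes (three image features, a classifier assessed with leave-one-out cross-validation, reporting sensitivity, specificity, and AUC); the classifier choice and the synthetic feature values are assumptions, not the authors' pipeline:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import LeaveOneOut

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0.0, 1.0, (30, 3)),    # three features, non-neoplastic sites
                       rng.normal(1.5, 1.0, (35, 3))])   # three features, neoplastic sites
        y = np.array([0] * 30 + [1] * 35)

        scores = np.zeros(len(y))
        for train, test in LeaveOneOut().split(X):       # hold out one site at a time
            model = LogisticRegression().fit(X[train], y[train])
            scores[test] = model.predict_proba(X[test])[:, 1]

        pred = scores >= 0.5
        sensitivity = pred[y == 1].mean()
        specificity = (~pred[y == 0]).mean()
        print(f"sensitivity {100 * sensitivity:.1f}%, specificity {100 * specificity:.1f}%, "
              f"AUC {roc_auc_score(y, scores):.3f}")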

  13. Direct quantitative evaluation of disease symptoms on living plant leaves growing under natural light.

    PubMed

    Matsunaga, Tomoko M; Ogawa, Daisuke; Taguchi-Shiobara, Fumio; Ishimoto, Masao; Matsunaga, Sachihiro; Habu, Yoshiki

    2017-06-01

    Leaf color is an important indicator when evaluating plant growth and responses to biotic/abiotic stress. Acquisition of images by digital cameras allows analysis and long-term storage of the acquired images. However, under field conditions, where light intensity can fluctuate and other factors (shade, reflection, and background, etc.) vary, stable and reproducible measurement and quantification of leaf color are hard to achieve. Digital scanners provide fixed conditions for obtaining image data, allowing stable and reliable comparison among samples, but require detached plant materials to capture images, and the destructive processes involved often induce deformation of plant materials (curled leaves and faded colors, etc.). In this study, by using a lightweight digital scanner connected to a mobile computer, we obtained digital image data from intact plant leaves grown in natural-light greenhouses without detaching the targets. We took images of soybean leaves infected by Xanthomonas campestris pv. glycines, and distinctively quantified two disease symptoms (brown lesions and yellow halos) using freely available image processing software. The image data were amenable to quantitative and statistical analyses, allowing precise and objective evaluation of disease resistance.
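
    A hedged Python sketch of the kind of color-based symptom quantification described (the authors used freely available image-processing software); the HSV thresholds and the toy image are illustrative assumptions, not the study's calibrated values:

        import numpy as np
        from skimage.color import rgb2hsv

        def symptom_fractions(rgb_image):
            """Return the leaf-area fractions classified as brown lesion and yellow halo."""
            hsv = rgb2hsv(rgb_image)
            h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
            leaf = s > 0.15                                          # exclude pale background
            brown = leaf & (h < 0.11) & (v < 0.55)                   # dark red-brown lesions
            yellow = leaf & (h >= 0.11) & (h < 0.19) & (v >= 0.55)   # bright yellow halos
            area = leaf.sum()
            return brown.sum() / area, yellow.sum() / area

        # Toy image: a green "leaf" with one brown patch and one yellow patch.
        img = np.zeros((100, 100, 3))
        img[...] = (0.2, 0.6, 0.2)
        img[20:40, 20:40] = (0.35, 0.18, 0.05)   # lesion-like color
        img[20:40, 50:70] = (0.85, 0.75, 0.10)   # halo-like color
        lesion_frac, halo_frac = symptom_fractions(img)
        print(f"lesion {100 * lesion_frac:.1f}% of leaf, halo {100 * halo_frac:.1f}% of leaf")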

  14. Objective and quantitative definitions of modified food textures based on sensory and rheological methodology.

    PubMed

    Wendin, Karin; Ekman, Susanne; Bülow, Margareta; Ekberg, Olle; Johansson, Daniel; Rothenberg, Elisabet; Stading, Mats

    2010-06-28

    Patients who suffer from chewing and swallowing disorders, i.e. dysphagia, may have difficulties ingesting normal food and liquids. In these patients a texture modified diet may enable that the patient maintain adequate nutrition. However, there is no generally accepted definition of 'texture' that includes measurements describing different food textures. Objectively define and quantify categories of texture-modified food by conducting rheological measurements and sensory analyses. A further objective was to facilitate the communication and recommendations of appropriate food textures for patients with dysphagia. About 15 food samples varying in texture qualities were characterized by descriptive sensory and rheological measurements. Soups were perceived as homogenous; thickened soups were perceived as being easier to swallow, more melting and creamy compared with soups without thickener. Viscosity differed between the two types of soups. Texture descriptors for pâtés were characterized by high chewing resistance, firmness, and having larger particles compared with timbales and jellied products. Jellied products were perceived as wobbly, creamy, and easier to swallow. Concerning the rheological measurements, all solid products were more elastic than viscous (G'>G''), belonging to different G' intervals: jellied products (low G') and timbales together with pâtés (higher G'). By combining sensory and rheological measurements, a system of objective, quantitative, and well-defined food textures was developed that characterizes the different texture categories.

  15. Learning Objects and Virtual Learning Environments Technical Evaluation Criteria

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Dagiene, Valentina

    2009-01-01

    The main scientific problems investigated in this article deal with technical evaluation of quality attributes of the main components of e-Learning systems (referred here as DLEs--Digital Libraries of Educational Resources and Services), i.e., Learning Objects (LOs) and Virtual Learning Environments (VLEs). The main research object of the work is…

  16. Computerized quantitative evaluation of mammographic accreditation phantom images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

    Purpose: The objective was to develop and investigate an automated scoring scheme of the American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of region of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fiber, mass, and speck were 90%, 80%, and 98%, respectively. Contingency table analysis revealed significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may provide a stable assessment of the visibility of test objects in mammographic accreditation phantom images with respect to whether a phantom image meets the ACR's criteria, although there is room for improvement in the approach for fiber and mass objects.
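
    A hedged Python sketch of Mahalanobis-distance classification of detected objects into "fiber" versus "mass" classes; the shape features and reference values are illustrative assumptions, not those of the published scoring scheme:

        import numpy as np
        from scipy.spatial.distance import mahalanobis

        rng = np.random.default_rng(1)
        # Features [elongation, area_mm2] of previously scored reference objects (synthetic).
        fiber_ref = rng.normal([6.0, 0.4], [1.0, 0.1], (40, 2))
        mass_ref = rng.normal([1.5, 2.0], [0.3, 0.5], (40, 2))

        # Inverse covariance of the combined reference samples, used as the distance metric.
        VI = np.linalg.inv(np.cov(np.vstack([fiber_ref, mass_ref]).T))

        def classify(features):
            d_fiber = mahalanobis(features, fiber_ref.mean(axis=0), VI)
            d_mass = mahalanobis(features, mass_ref.mean(axis=0), VI)
            return "fiber" if d_fiber < d_mass else "mass"

        print(classify(np.array([5.2, 0.5])))  # elongated, small object -> fiber
        print(classify(np.array([1.8, 1.7])))  # round, larger object -> mass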

  17. Evaluation of changes in periodontal bacteria in healthy dogs over 6 months using quantitative real-time PCR.

    PubMed

    Maruyama, N; Mori, A; Shono, S; Oda, H; Sako, T

    2018-03-01

    Porphyromonas gulae, Tannerella forsythia and Campylobacter rectus are considered dominant periodontal pathogens in dogs. Recently, quantitative real-time PCR (qRT-PCR) methods have been used for absolute quantitative determination of oral bacterial counts. The purpose of the present study was to establish a standardized qRT-PCR procedure to quantify bacterial counts of the three target periodontal bacteria (P. gulae, T. forsythia and C. rectus). Copy numbers of the three target periodontal bacteria were evaluated in 26 healthy dogs. Then, changes in bacterial counts of the three target periodontal bacteria were evaluated for 24 weeks in 7 healthy dogs after periodontal scaling. Analytical evaluation of each self-designed primer indicated acceptable analytical imprecision. All 26 healthy dogs were found to be positive for P. gulae, T. forsythia and C. rectus. Median total bacterial counts (copies/ng) of each target gene were 385.612 for P. gulae, 25.109 for T. forsythia and 5.771 for C. rectus. Significant differences were observed between the copy numbers of the three target periodontal bacteria. Periodontal scaling reduced median copy numbers of the three target periodontal bacteria in 7 healthy dogs. However, after periodontal scaling, copy numbers of all three periodontal bacteria significantly increased over time (p<0.05, Kruskal-Wallis test) (24 weeks). In conclusion, our results demonstrated that qRT-PCR can accurately measure periodontal bacteria in dogs. Furthermore, the present study has revealed that the qRT-PCR method can be considered a new objective evaluation system for canine periodontal disease. Copyright © by the Polish Academy of Sciences.

  18. Quantitative framework for prospective motion correction evaluation.

    PubMed

    Pannetier, Nicolas A; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert

    2016-02-01

    Establishing a framework to evaluate performances of prospective motion correction (PMC) MRI considering motion variability between MRI scans. A framework was developed to obtain quantitative comparisons between different motion correction setups, considering that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was considered by replaying in a phantom experiment the recorded motion trajectories from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge fixations) were used to test the framework. Two metrics were investigated to quantify the improvement of the image quality with PMC. Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance in comparing marker fixations. As an example, two marker fixations, a mouth guard and a nose bridge, were evaluated in terms of their effectiveness for PMC. A mouth guard achieved better PMC performance. Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. © 2015 Wiley Periodicals, Inc.

  19. The Positive Alternative Credit Experience (PACE) Program a Quantitative Comparative Study

    ERIC Educational Resources Information Center

    Warren, Rebecca Anne

    2011-01-01

    The purpose of this quantitative comparative study was to evaluate the Positive Alternative Credit Experience (PACE) Program using an objectives-oriented approach to a formative program evaluation. The PACE Program was a semester-long high school alternative education program designed to serve students at-risk for academic failure or dropping out…

  20. Quantitative evaluation research of glare from automotive headlamps

    NASA Astrophysics Data System (ADS)

    Wang, Tiecheng; Qian, Rui; Cao, Ye; Gao, Mingqiu

    2018-01-01

    This study concerns the quantitative evaluation of glare from automotive headlamps. In current regulations, only one point on the test screen is used to judge whether a driver can tolerate the light from the headlamps of an oncoming vehicle. To evaluate the practical effect of glare, we adopt a glare zone that incorporates the probability distribution of the oncoming driver's eye position. Within this zone, the glare level of a headlamp is represented by a weighted luminous flux. To determine the most comfortable illuminance value for human eyes at 50 m, we used test point B50L as the observation position and collected 1,000 subjective evaluation responses from 20 test subjects of different ages over two months. Based on the assessment results, we calculated 0.60 lx as the recommended value for a standardized testing procedure at 25 m. We then derived 0.38 lm as the optimum value, and 0.25/1.20 lm as limiting values depending on the regulations. We tested 40 sample vehicles of different levels to verify the sectional nonlinear quantitative evaluation method we designed, and analyzed the typical test results.

  1. 'Stories' or 'snapshots'? A study directed at comparing qualitative and quantitative approaches to curriculum evaluation.

    PubMed

    Pateman, B; Jinks, A M

    1999-01-01

    The focus of this paper is a study designed to explore the validity of quantitative approaches of student evaluation in a pre-registration degree programme. As managers of the students' education we were concerned that the quantitative method, which used lecturer criteria, may not fully represent students' views. The approach taken is that of a process-type strategy for curriculum evaluation as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences through use of semi-structured interviews. The results are then compared to the current quantitative measurement tools designed to obtain 'snapshots' of the educational effectiveness of the curriculum. The quantitative measurement tools use Likert scale measurements of teacher-devised criterion statements. The results of the study give a rich source of qualitative data which can be used to inform future curriculum development. However, complete validation of the current quantitative instruments used was not achieved in this study. Student and teacher agendas in respect of important issues pertaining to the course programme were found to differ. Limitations of the study are given. There is discussion of the options open to the management team with regard to future development of curriculum evaluation systems.

  2. Objective speech quality evaluation of real-time speech coders

    NASA Astrophysics Data System (ADS)

    Viswanathan, V. R.; Russell, W. H.; Huggins, A. W. F.

    1984-02-01

    This report describes the work performed in two areas: subjective testing of a real-time 16 kbit/s adaptive predictive coder (APC) and objective speech quality evaluation of real-time coders. The speech intelligibility of the APC coder was tested using the Diagnostic Rhyme Test (DRT), and the speech quality was tested using the Diagnostic Acceptability Measure (DAM) test, under eight operating conditions involving channel error, acoustic background noise, and tandem link with two other coders. The test results showed that the DRT and DAM scores of the APC coder equalled or exceeded the corresponding test scores of the 32 kbit/s CVSD coder. In the area of objective speech quality evaluation, the report describes the development, testing, and validation of a procedure for automatically computing several objective speech quality measures, given only the tape-recordings of the input speech and the corresponding output speech of a real-time speech coder.

  3. Validity evidence for the Simulated Colonoscopy Objective Performance Evaluation scoring system.

    PubMed

    Trinca, Kristen D; Cox, Tiffany C; Pearl, Jonathan P; Ritter, E Matthew

    2014-02-01

    Low-cost, objective systems to assess and train endoscopy skills are needed. The aim of this study was to evaluate the ability of Simulated Colonoscopy Objective Performance Evaluation to assess the skills required to perform endoscopy. Thirty-eight subjects were included in this study, all of whom performed 4 tasks. The scoring system measured performance by calculating precision and efficiency. Data analysis assessed the relationship between colonoscopy experience and performance on each task and the overall score. Endoscopic trainees' Simulated Colonoscopy Objective Performance Evaluation scores correlated significantly with total colonoscopy experience (r = .61, P = .003) and experience in the past 12 months (r = .63, P = .002). Significant differences were seen among practicing endoscopists, nonendoscopic surgeons, and trainees (P < .0001). When the 4 tasks were analyzed, each showed significant correlation with colonoscopy experience (scope manipulation, r = .44, P = .044; tool targeting, r = .45, P = .04; loop management, r = .47, P = .032; mucosal inspection, r = .65, P = .001) and significant differences in performance between the endoscopist groups, except for mucosal inspection (scope manipulation, P < .0001; tool targeting, P = .002; loop management, P = .0008; mucosal inspection, P = .27). Simulated Colonoscopy Objective Performance Evaluation objectively assesses the technical skills required to perform endoscopy and shows promise as a platform for proficiency-based skills training. Published by Elsevier Inc.

  4. Physical therapy in Huntington's disease--toward objective assessments?

    PubMed

    Bohlen, S; Ekwall, C; Hellström, K; Vesterlin, H; Björnefur, M; Wiklund, L; Reilmann, R

    2013-02-01

    Physical therapy is recommended for the treatment of Huntington's disease, but reliable studies investigating its efficacy are almost non-existent. This may in part be due to the lack of suitable outcome measures. Therefore, we investigated the applicability of novel quantitative and objective assessments of motor dysfunction in the evaluation of physical therapy interventions aimed at improving gait and posture. Twelve patients with Huntington disease received a predefined twice-weekly intervention focusing on posture and gait over 6 weeks. The GAITRite mat and a force plate were used for objective and quantitative assessments. The Unified Huntington's Disease Rating Scale Total Motor Score, the timed Up & Go test, and the Berg Balance Scale were used as clinical outcome measures. Significant improvements were seen in GAITRite measures after therapy. Improvements were also seen in the Up & Go test and Berg Balance Scale, whereas force plate measures and Total Motor Scores did not change. The results suggest that physical therapy has a positive effect on gait in Huntington's disease. The study shows that objective and quantitative measures of gait and posture may serve as endpoints in trials assessing the efficacy of physical therapy. They should be explored further in larger trials applying a randomized controlled setting. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.

  5. Breach Risk Magnitude: A Quantitative Measure of Database Security.

    PubMed

    Yasnoff, William A

    2016-01-01

    A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
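
    A direct Python transcription of the stated definition; the example user profiles are hypothetical:

        # BRM = max over system users of log10(accessible records) / (authentication steps).
        from math import log10

        def breach_risk_magnitude(users):
            """users: iterable of (accessible_records, authentication_steps) pairs."""
            return max(log10(records) / steps for records, steps in users)

        users = [
            (1_000_000, 1),   # administrator: full table behind a single login
            (1_000_000, 2),   # analyst: full table behind login plus a second factor
            (500, 1),         # clerk: small subset behind a single login
        ]
        print(f"BRM = {breach_risk_magnitude(users):.2f}")   # 6.00, driven by the administrator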

  6. A quantitative evaluation of the high elbow technique in front crawl.

    PubMed

    Suito, Hiroshi; Nunome, Hiroyuki; Ikegami, Yasuo

    2017-07-01

    Many coaches often instruct swimmers to keep the elbow in a high position (high elbow position) during the early phase of the underwater stroke motion (pull phase) in front crawl; however, the high elbow position has never been quantitatively evaluated. The aims of this study were (1) to quantitatively evaluate the "high elbow" position, (2) to clarify the relationship between the high elbow position and required upper limb configuration and (3) to examine the efficacy of high elbow position on the resultant swimming velocity. Sixteen highly skilled and 6 novice male swimmers performed 25 m front crawl with maximal effort and their 3-dimensional arm stroke motion was captured at 60 Hz. An attempt was made to develop a new index to evaluate the high elbow position (I_he: high elbow index) using 3-dimensional coordinates of the shoulder, elbow and wrist joints. I_he of skilled swimmers moderately correlated with the average shoulder internal rotation angle (r = -0.652, P < 0.01) and swimming velocity (r = -0.683, P < 0.01) during the pull phase. These results indicate that I_he is a useful index for evaluating high elbow arm stroke technique during the pull phase in front crawl.
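
    The record does not give the exact definition of I_he, so the Python function below is a hypothetical stand-in for the idea of scoring elbow height from the 3-D shoulder, elbow, and wrist coordinates; it is not the authors' index:

        import numpy as np

        def high_elbow_index(shoulder, elbow, wrist):
            """Hypothetical index: the elbow's vertical drop below the shoulder divided by the
            wrist's drop (z axis pointing up); a smaller value corresponds to a higher elbow."""
            shoulder, elbow, wrist = map(np.asarray, (shoulder, elbow, wrist))
            return (shoulder[2] - elbow[2]) / (shoulder[2] - wrist[2])

        # Mid-pull postures (coordinates in metres): high elbow vs. dropped elbow.
        print(high_elbow_index((0.0, 0.0, 0.0), (0.05, 0.25, -0.15), (0.05, 0.30, -0.55)))  # ~0.27
        print(high_elbow_index((0.0, 0.0, 0.0), (0.05, 0.25, -0.35), (0.05, 0.30, -0.55)))  # ~0.64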

  7. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) the Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation and approaches to use in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  8. Student evaluations of teaching: teaching quantitative courses can be hazardous to one’s career

    PubMed Central

    Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors’ teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards. PMID:28503380

  9. Quantitative evaluation of morphological changes in activated platelets in vitro using digital holographic microscopy.

    PubMed

    Kitamura, Yutaka; Isobe, Kazushige; Kawabata, Hideo; Tsujino, Tetsuhiro; Watanabe, Taisuke; Nakamura, Masayuki; Toyoda, Toshihisa; Okudera, Hajime; Okuda, Kazuhiro; Nakata, Koh; Kawase, Tomoyuki

    2018-06-18

    Platelet activation and aggregation have been conventionally evaluated using an aggregometer. However, this method is suitable for short-term but not long-term quantitative evaluation of platelet aggregation, morphological changes, and/or adhesion to specific materials. The recently developed digital holographic microscopy (DHM) has enabled the quantitative evaluation of cell size and morphology without labeling or destruction. Thus, we aim to validate its applicability in quantitatively evaluating changes in cell morphology, especially in the aggregation and spreading of activated platelets, thus modifying typical image analysis procedures to suit aggregated platelets. Freshly prepared platelet-rich plasma was washed with phosphate-buffered saline and treated with 0.1% CaCl2. Platelets were then fixed and subjected to DHM, scanning electron microscopy (SEM), atomic force microscopy, optical microscopy, and flow cytometry (FCM). Tightly aggregated platelets were identified as single cells. Data obtained from time-course experiments were plotted two-dimensionally according to the average optical thickness versus attachment area and divided into four regions. The majority of the control platelets, which supposedly contained small and round platelets, were distributed in the lower left region. As activation time increased, however, this population dispersed toward the upper right region. The distribution shift demonstrated by DHM was essentially consistent with data obtained from SEM and FCM. Therefore, DHM was validated as a promising device for testing platelet function given that it allows for the quantitative evaluation of activation-dependent morphological changes in platelets. DHM technology will be applicable to the quality assurance of platelet concentrates, as well as diagnosis and drug discovery related to platelet functions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Evaluation of colonoscopy technical skill levels by use of an objective kinematic-based system.

    PubMed

    Obstein, Keith L; Patil, Vaibhav D; Jayender, Jagadeesan; San José Estépar, Raúl; Spofford, Inbar S; Lengyel, Balazs I; Vosburgh, Kirby G; Thompson, Christopher C

    2011-02-01

    Colonoscopy requires training and experience to ensure accuracy and safety. Currently, no objective, validated process exists to determine when an endoscopist has attained technical competence. Kinematics data describing movements of laparoscopic instruments have been used in surgical skill assessment to define expert surgical technique. We have developed a novel system to record kinematics data during colonoscopy and quantitatively assess colonoscopist performance. To use kinematic analysis of colonoscopy to quantitatively assess endoscopic technical performance. Prospective cohort study. Tertiary-care academic medical center. This study involved physicians who perform colonoscopy. Application of a kinematics data collection system to colonoscopy evaluation. Kinematics data, validated task load assessment instrument, and technical difficulty visual analog scale. All 13 participants completed the colonoscopy to the terminal ileum on the standard colon model. Attending physicians reached the terminal ileum quicker than fellows (median time, 150.19 seconds vs 299.86 seconds; p<.01) with reduced path lengths for all 4 sensors, decreased flex (1.75 m vs 3.14 m; P=.03), smaller tip angulation, reduced absolute roll, and lower curvature of the endoscope. With performance of attending physicians serving as the expert reference standard, the mean kinematic score increased by 19.89 for each decrease in postgraduate year (P<.01). Overall, fellows experienced greater mental, physical, and temporal demand than did attending physicians. Small cohort size. Kinematic data and score calculation appear useful in the evaluation of colonoscopy technical skill levels. The kinematic score appears to consistently vary by year of training. Because this assessment is nonsubjective, it may be an improvement over current methods for determination of competence. Ongoing studies are establishing benchmarks and characteristic profiles of skill groups based on kinematics data. Copyright © 2011

  11. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    PubMed

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images are significant for the treatment monitoring of brain disorders. Therefore, a computer aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performances of the CAA-CRM approach in treatment monitoring are evaluated by the computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarities with their ground truths when the lesion size is larger than system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that CAA-CRM approach has a 93.4% accuracy of recovered region's localization. Moreover, the quantitative indexes of recovered regions derived from CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive the quantitative indexes from the longitudinal SPECT brain images for treatment monitoring.
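
    A hedged Python sketch of the core idea of a change-rate map: a voxel-wise relative change between co-registered baseline and follow-up images. Registration, count normalization, and the CAA method's other steps are not reproduced, and the arrays are synthetic:

        import numpy as np

        def change_rate_map(baseline, follow_up, eps=1e-6):
            """Percent change per voxel; assumes the volumes are already registered and normalized."""
            return 100.0 * (follow_up - baseline) / (baseline + eps)

        baseline = np.full((4, 4), 100.0)
        follow_up = baseline.copy()
        follow_up[1:3, 1:3] = 130.0          # a region whose rCBF recovered by 30%
        crm = change_rate_map(baseline, follow_up)
        recovered = crm >= 20.0              # flag voxels exceeding an assumed 20% change rate
        print(crm)
        print(f"recovered-region size: {recovered.sum()} voxels")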

  12. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    PubMed

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  13. Picasso Paintings, Moon Rocks, and Hand-Written Beatles Lyrics: Adults' Evaluations of Authentic Objects.

    PubMed

    Frazier, Brandy N; Gelman, Susan A; Wilson, Alice; Hood, Bruce

    2009-01-01

    Authentic objects are those that have an historical link to a person, event, time, or place of some significance (e.g., original Picasso painting; gown worn by Princess Diana; your favorite baby blanket). The current study examines everyday beliefs about authentic objects, with three primary goals: to determine the scope of adults' evaluation of authentic objects, to examine such evaluation in two distinct cultural settings, and to determine whether a person's attachment history (i.e., whether or not they owned an attachment object as a child) predicts evaluation of authentic objects. We found that college students in the U.K. (N = 125) and U.S. (N = 119) consistently evaluate a broad range of authentic items as more valuable than matched control (inauthentic) objects, more desirable to keep, and more desirable to touch, though only non-personal authentic items were judged to be more appropriate for display in a museum. These patterns were remarkably similar across the two cultural contexts. Additionally, those who had an attachment object as a child evaluated objects more favorably, and in particular judged authentic objects to be more valuable. Altogether, these results demonstrate broad endorsement of "positive contagion" among college-educated adults.

  14. Picasso Paintings, Moon Rocks, and Hand-Written Beatles Lyrics: Adults’ Evaluations of Authentic Objects

    PubMed Central

    Frazier, Brandy N.; Gelman, Susan A.; Wilson, Alice; Hood, Bruce

    2010-01-01

    Authentic objects are those that have an historical link to a person, event, time, or place of some significance (e.g., original Picasso painting; gown worn by Princess Diana; your favorite baby blanket). The current study examines everyday beliefs about authentic objects, with three primary goals: to determine the scope of adults’ evaluation of authentic objects, to examine such evaluation in two distinct cultural settings, and to determine whether a person’s attachment history (i.e., whether or not they owned an attachment object as a child) predicts evaluation of authentic objects. We found that college students in the U.K. (N = 125) and U.S. (N = 119) consistently evaluate a broad range of authentic items as more valuable than matched control (inauthentic) objects, more desirable to keep, and more desirable to touch, though only non-personal authentic items were judged to be more appropriate for display in a museum. These patterns were remarkably similar across the two cultural contexts. Additionally, those who had an attachment object as a child evaluated objects more favorably, and in particular judged authentic objects to be more valuable. Altogether, these results demonstrate broad endorsement of "positive contagion" among college-educated adults. PMID:20631919

  15. Stylization levels of industrial design objects

    NASA Astrophysics Data System (ADS)

    Kukhta, M. S.; Sokolov, A. P.; Krauinsh, D. P.; Bouchard, C.

    2017-01-01

    The relevance of research into the problem of form making in design stems from the need for a new understanding of visual culture and new approaches to design engineering that integrate artistic and engineering problems. The aim of this research is to study the levels of stylization of design objects and their dependence on specific project objectives and existing technologies. On the basis of quantitative evaluation, three levels of stylization are distinguished: figurative image, stylized image, and abstract image. Theoretical conclusions are complemented by the practical task of creating an openwork metal lantern. Variants of both a traditional mains supply for the lantern and an autonomous supply system based on solar energy were proposed. The role of the semantic factor, which affects the depth of perception of a design object's semantic space, is also discussed in this paper.

  16. Quantitative assessment of participant knowledge and evaluation of participant satisfaction in the CARES training program.

    PubMed

    Goodman, Melody S; Si, Xuemei; Stafford, Jewel D; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2012-01-01

    The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. A baseline assessment was administered before the first training session and a follow-up assessment and evaluation was administered after the final training session. At each training session a pretest was administered before the session and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyze results from quantitative questions on the assessments, pre- and post-tests, and evaluations. CARES fellows' knowledge increased at follow-up (75% of questions were answered correctly on average) compared with the baseline assessment (38% of questions were answered correctly on average); post-test scores were higher than pre-test scores in 9 out of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. The CARES fellows training program was successful in achieving participant satisfaction and increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community-academic research partnerships.

  17. A software package to improve image quality and isolation of objects of interest for quantitative stereology studies of rat hepatocarcinogenesis.

    PubMed

    Xu, Yihua; Pitot, Henry C

    2006-03-01

    In the studies of quantitative stereology of rat hepatocarcinogenesis, we have used image analysis technology (automatic particle analysis) to obtain data such as liver tissue area, size and location of altered hepatic focal lesions (AHF), and nuclei counts. These data are then used for three-dimensional estimation of AHF occurrence and nuclear labeling index analysis. These are important parameters for quantitative studies of carcinogenesis, for screening and classifying carcinogens, and for risk estimation. To take such measurements, structures or cells of interest should be separated from the other components based on the difference of color and density. Common background problems seen on the captured sample image such as uneven light illumination or color shading can cause severe problems in the measurement. Two application programs (BK_Correction and Pixel_Separator) have been developed to solve these problems. With BK_Correction, common background problems such as incorrect color temperature setting, color shading, and uneven light illumination background, can be corrected. With Pixel_Separator different types of objects can be separated from each other in relation to their color, such as seen with different colors in immunohistochemically stained slides. The resultant images of such objects separated from other components are then ready for particle analysis. Objects that have the same darkness but different colors can be accurately differentiated in a grayscale image analysis system after application of these programs.
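
    A hedged Python sketch of the kind of background correction that BK_Correction addresses (uneven illumination and color shading); this is a generic flat-field-style correction applied per channel, not the authors' implementation:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def correct_shading(channel, sigma=50):
            """Divide an image channel by a heavily blurred copy of itself to remove slowly
            varying illumination, then rescale to the original mean brightness."""
            background = gaussian_filter(channel.astype(float), sigma=sigma)
            corrected = channel / np.maximum(background, 1e-6)
            return corrected * channel.mean()

        # Synthetic example: a flat field with an illumination gradient across the image.
        gradient = np.linspace(0.6, 1.4, 512)
        uneven = np.outer(np.ones(512), gradient) * 128.0
        flat = correct_shading(uneven)
        print(f"before: std = {uneven.std():.1f}, after: std = {flat.std():.1f}")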

  18. Clusters of Insomnia Disorder: An Exploratory Cluster Analysis of Objective Sleep Parameters Reveals Differences in Neurocognitive Functioning, Quantitative EEG, and Heart Rate Variability

    PubMed Central

    Miller, Christopher B.; Bartlett, Delwyn J.; Mullins, Anna E.; Dodds, Kirsty L.; Gordon, Christopher J.; Kyle, Simon D.; Kim, Jong Won; D'Rozario, Angela L.; Lee, Rico S.C.; Comas, Maria; Marshall, Nathaniel S.; Yee, Brendon J.; Espie, Colin A.; Grunstein, Ronald R.

    2016-01-01

    Study Objectives: To empirically derive and evaluate potential clusters of Insomnia Disorder through cluster analysis from polysomnography (PSG). We hypothesized that clusters would differ on neurocognitive performance, sleep-onset measures of quantitative (q)-EEG and heart rate variability (HRV). Methods: Research volunteers with Insomnia Disorder (DSM-5) completed a neurocognitive assessment and overnight PSG measures of total sleep time (TST), wake time after sleep onset (WASO), and sleep onset latency (SOL) were used to determine clusters. Results: From 96 volunteers with Insomnia Disorder, cluster analysis derived at least two clusters from objective sleep parameters: Insomnia with normal objective sleep duration (I-NSD: n = 53) and Insomnia with short sleep duration (I-SSD: n = 43). At sleep onset, differences in HRV between I-NSD and I-SSD clusters suggest attenuated parasympathetic activity in I-SSD (P < 0.05). Preliminary work suggested three clusters by retaining the I-NSD and splitting the I-SSD cluster into two: I-SSD A (n = 29): defined by high WASO and I-SSD B (n = 14): a second I-SSD cluster with high SOL and medium WASO. The I-SSD B cluster performed worse than I-SSD A and I-NSD for sustained attention (P ≤ 0.05). In an exploratory analysis, q-EEG revealed reduced spectral power also in I-SSD B before (Delta, Alpha, Beta-1) and after sleep-onset (Beta-2) compared to I-SSD A and I-NSD (P ≤ 0.05). Conclusions: Two insomnia clusters derived from cluster analysis differ in sleep onset HRV. Preliminary data suggest evidence for three clusters in insomnia with differences for sustained attention and sleep-onset q-EEG. Clinical Trial Registration: Insomnia 100 sleep study: Australia New Zealand Clinical Trials Registry (ANZCTR) identification number 12612000049875. URL: https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=347742. Citation: Miller CB, Bartlett DJ, Mullins AE, Dodds KL, Gordon CJ, Kyle SD, Kim JW, D'Rozario AL, Lee RS, Comas
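    The record does not state which clustering algorithm was used to derive the clusters, so the sketch below simply illustrates the general workflow on the three PSG variables named (TST, WASO, SOL): standardize, cluster (k-means is assumed here purely for illustration), and compare a neurocognitive score across clusters. All data and variable names are synthetic.

```python
# Minimal sketch (not the authors' analysis): derive clusters from standardized
# PSG parameters (TST, WASO, SOL) with k-means, then compare an attention score.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from scipy import stats

def derive_clusters(psg, k=2, seed=0):
    """psg: array of shape (n_subjects, 3) with columns TST, WASO, SOL in minutes."""
    z = StandardScaler().fit_transform(psg)
    return KMeans(n_clusters=k, n_init=20, random_state=seed).fit_predict(z)

# Synthetic example with 96 "volunteers".
rng = np.random.default_rng(1)
psg = np.column_stack([rng.normal(360, 60, 96),   # TST
                       rng.normal(60, 30, 96),    # WASO
                       rng.normal(30, 20, 96)])   # SOL
attention = rng.normal(0, 1, 96)
labels = derive_clusters(psg, k=2)
t, p = stats.ttest_ind(attention[labels == 0], attention[labels == 1])
print(f"cluster sizes: {np.bincount(labels)}, t = {t:.2f}, p = {p:.3f}")
```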

  19. Quantitative Skills as a Graduate Learning Outcome: Exploring Students' Evaluative Expertise

    ERIC Educational Resources Information Center

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2017-01-01

    In the biosciences, quantitative skills are an essential graduate learning outcome. Efforts to evidence student attainment at the whole of degree programme level are rare and making sense of such data is complex. We draw on assessment theories from Sadler (evaluative expertise) and Boud (sustainable assessment) to interpret final-year bioscience…

  20. Raman spectral imaging for quantitative contaminant evaluation in skim milk powder

    USDA-ARS?s Scientific Manuscript database

    This study uses a point-scan Raman spectral imaging system for quantitative detection of melamine in milk powder. A sample depth of 2 mm and corresponding laser intensity of 200 mW were selected after evaluating the penetration of a 785 nm laser through milk powder. Horizontal and vertical spatial r...

  1. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has an impact on everyday life, from the cars we drive and the planes we fly to the buildings in which we work and live; it reaches literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost-effective tomorrow. The specific technologies discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements give rise to enabling measurement technologies, which in turn change the boundaries of design and practice.

  2. Quantitative Imaging Biomarkers of NAFLD

    PubMed Central

    Kinner, Sonja; Reeder, Scott B.

    2016-01-01

    Conventional imaging modalities, including ultrasonography (US), computed tomography (CT), and magnetic resonance (MR), play an important role in the diagnosis and management of patients with nonalcoholic fatty liver disease (NAFLD) by allowing noninvasive diagnosis of hepatic steatosis. However, conventional imaging modalities are limited as biomarkers of NAFLD for various reasons. Multi-parametric quantitative MRI techniques overcome many of the shortcomings of conventional imaging and allow comprehensive and objective evaluation of NAFLD. MRI can provide unconfounded biomarkers of hepatic fat, iron, and fibrosis in a single examination—a virtual biopsy has become a clinical reality. In this article, we will review the utility and limitations of conventional US, CT, and MR imaging for the diagnosis of NAFLD. Recent advances in imaging biomarkers of NAFLD are also discussed, with an emphasis on multi-parametric quantitative MRI. PMID:26848588

  3. Multiple-objective evaluation of wastewater treatment plant control alternatives.

    PubMed

    Flores-Alsina, Xavier; Gallego, Alejandro; Feijoo, Gumersindo; Rodriguez-Roda, Ignasi

    2010-05-01

    Besides the evaluation of environmental issues, the correct assessment of wastewater treatment plants (WWTP) should take into account several objectives, such as economic (e.g. operation costs), technical (e.g. risk of suffering microbiology-related TSS separation problems) or legal (e.g. compliance with the effluent standards in terms of the different pollution loads). For this reason, the main objective of this paper is to show the benefits of complementing the environmental assessment carried out by life cycle assessment with economic, technical and legal criteria. Using a preliminary version of the BSM2 as a case study, different combinations of controllers are implemented, simulated and evaluated. In the following step, the resulting multi-criteria matrix is mined using multivariate statistical techniques. The results showed that the presence of an external carbon source addition, the type of aeration system and the TSS controller are the key elements creating the differences amongst the alternatives. It was also possible to characterize the different control strategies according to a set of aggregated criteria. Additionally, the existing synergies amongst the different objectives and their consequent trade-offs were identified. Finally, it was found that, of the initial extensive list of evaluation criteria, only a small set of five is really discriminant and useful for differentiating among the generated alternatives. Copyright 2010 Elsevier Ltd. All rights reserved.

  4. Evaluating a Dutch cardiology primary care plus intervention on the Triple Aim outcomes: study design of a practice-based quantitative and qualitative research.

    PubMed

    Quanjel, Tessa C C; Spreeuwenberg, Marieke D; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk

    2017-09-06

    In an attempt to deal with the pressures on the health-care system and to guarantee sustainability, changes are needed. This study focuses on a cardiology primary care plus intervention. Primary care plus (PC+) is a new health-care delivery model focused on substitution of specialist care in the hospital setting with specialist care in the primary care setting. The intervention consists of a cardiology PC+ centre in which cardiologists, supported by other health-care professionals, provide consultations in a primary care setting. The PC+ centre aims to improve the health of the population and quality of care as experienced by patients, and reduce the number of referrals to hospital-based outpatient specialist care in order to reduce health-care costs. These aims reflect the Triple Aim principle. Hence, the objectives of the study are to evaluate the cardiology PC+ centre in terms of the Triple Aim outcomes and to evaluate the process of the introduction of PC+. The study is a practice-based, quantitative study with a longitudinal observational design, and an additional qualitative study to supplement, interpret and improve the quantitative study. The study population of the quantitative part will consist of adult patients (≥18 years) with non-acute and low-complexity cardiology-related health complaints, who will be referred to the cardiology PC+ centre (intervention group) or hospital-based outpatient cardiology care (control group). All eligible patients will be asked to complete questionnaires at three different time points consisting of questions about their demographics, health status and experience of care. Additionally, quantitative data will be collected about health-care utilization and related health-care costs at the PC+ centre and the hospital. The qualitative part, consisting of semi-structured interviews, focus groups, and observations, is designed to evaluate the process as well as to amplify, clarify and explain quantitative results. This study

  5. Quantitative nondestructive evaluation of ceramic matrix composite by the resonance method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watanabe, T.; Aizawa, T.; Kihara, J.

    The resonance method was developed to make quantitative nondestructive evaluations of mechanical properties without any cumbersome procedure. Since the present method is insensitive to specimen geometry, both monolithic and ceramic matrix composite materials in process can be evaluated in a nondestructive manner. Al₂O₃, Si₃N₄, SiC/Si₃N₄, and various C/C composite materials are employed to demonstrate the validity and effectiveness of the present method.

  6. Quantitative acoustic emission monitoring of fatigue cracks in fracture critical steel bridges.

    DOT National Transportation Integrated Search

    2014-01-01

    The objective of this research is to evaluate the feasibility of employing quantitative acoustic emission (AE) techniques for monitoring of fatigue crack initiation and propagation in steel bridge members. Three A36 compact tension steel specimens w...

  7. Spectro-refractometry of individual microscopic objects using swept-source quantitative phase imaging.

    PubMed

    Jung, Jae-Hwang; Jang, Jaeduck; Park, Yongkeun

    2013-11-05

    We present a novel spectroscopic quantitative phase imaging technique with a wavelength swept source, referred to as swept-source diffraction phase microscopy (ssDPM), for quantifying the optical dispersion of individual microscopic samples. Employing the swept source and the principle of common-path interferometry, ssDPM performs multispectral full-field quantitative phase imaging and spectroscopic microrefractometry of transparent microscopic samples in the visible spectrum, with a wavelength range of 450-750 nm and a spectral resolution of less than 8 nm. With unprecedented precision and sensitivity, we demonstrate the quantitative spectroscopic microrefractometry of individual polystyrene beads, a 30% bovine serum albumin solution, and healthy human red blood cells.
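    The dispersion measurement rests on the standard diffraction-phase-microscopy relation between the measured phase delay and the sample refractive index; a minimal sketch, assuming the sample thickness and the medium dispersion are known, is shown below.

```python
# Sketch of the standard quantitative-phase relation used in diffraction phase
# microscopy: phase delay phi(lambda) = 2*pi*(n_sample - n_medium)*h / lambda,
# inverted here to recover the sample refractive index at each wavelength.
# Thickness h and the medium index are assumed known (e.g., a bead of known diameter).
import numpy as np

def refractive_index(phi, wavelength, thickness, n_medium):
    """phi in radians; wavelength and thickness in the same length unit."""
    return n_medium + phi * wavelength / (2.0 * np.pi * thickness)

# Example: a 10-um polystyrene bead in water at 550 nm with a ~29.7-rad phase delay.
print(refractive_index(phi=29.7, wavelength=0.55e-6, thickness=10e-6, n_medium=1.333))
```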

  8. The Effect of Instructional Objectives and General Objectives on Student Self-Evaluation of Psychomotor Performance in Power Mechanics.

    ERIC Educational Resources Information Center

    Janeczko, Robert John

    The major purpose of this study was to ascertain the relative effects of student exposure to instructional objectives upon student self-evaluation of psychomotor activities in a college-level power mechanics course. A randomized posttest-only control group design was used with two different approaches to the statement of the objectives. Four…

  9. Qualitative and quantitative evaluation of avian demineralized bone matrix in heterotopic beds.

    PubMed

    Reza Sanaei, M; Abu, Jalila; Nazari, Mojgan; A B, Mohd Zuki; Allaudin, Zeenathul N

    2013-11-01

    To evaluate the osteogenic potential of avian demineralized bone matrix (DBM) in the context of implant geometry. Experimental. Rock pigeons (n = 24). Tubular and chipped forms of DBM were prepared by acid demineralization of long bones from healthy allogeneic donors and implanted bilaterally into the pectoral region of 24 pigeons. After euthanasia at 1, 4, 6, 8, 10, and 12 weeks, explants were evaluated histologically and compared by means of quantitative (bone area) and semi-quantitative (scores) measures. All explants had new bone at retrieval, with the exception of tubular implants at the end of week 1. The most reactive part in both implants was the interior region between the periosteal and endosteal surfaces, followed by the area at the implant-muscle interface. Quantitative measurements demonstrated a significantly (P = .012) greater percentage of new bone formation induced by tubular implants (80.28 ± 8.94) compared with chip implants (57.64 ± 3.12). There was minimal inflammation. Avian DBM initiates heterotopic bone formation in allogeneic recipients with low grades of immunogenicity. Implant geometry affects this phenomenon, as osteoconduction appeared to augment the magnitude of the effects in larger tubular implants. © Copyright 2013 by The American College of Veterinary Surgeons.

  10. An Objective Measure of Interconnection Usage for High Levels of Wind Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yasuda, Yoh; Gomez-Lazaro, Emilio; Holttinen, Hannele

    2014-11-13

    This paper analyzes selected interconnectors in Europe using several evaluation factors: capacity factor, congested time, and congestion ratio. In a quantitative and objective evaluation, the authors propose to use publicly available data on maximum net transmission capacity (NTC) levels during a single year to study congestion rates, recognizing that the capacity factor depends upon the chosen capacity of the selected interconnector. This value is referred to as 'the annual maximum transmission capacity (AMTC)', which gives a transparent and objective evaluation of interconnector usage based on the published grid data. While the method is general, its initial application is motivated by the transfer of renewable energy.
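    As a rough, hypothetical illustration of the evaluation factors named above (capacity factor, congested time, congestion ratio), with the AMTC taken as the annual maximum of the published NTC series, the following sketch computes them from synthetic hourly flow data; the congestion margin is an assumption for illustration, not a value from the paper.

```python
# Illustrative sketch (not the authors' implementation) of interconnector usage
# factors computed from hourly flow and NTC series covering one year.
import numpy as np

def usage_factors(flow_mw, ntc_mw, congestion_margin=0.99):
    """flow_mw, ntc_mw: hourly arrays of equal length (one year)."""
    amtc = ntc_mw.max()                           # annual maximum transmission capacity
    capacity_factor = np.abs(flow_mw).mean() / amtc
    congested = np.abs(flow_mw) >= congestion_margin * ntc_mw
    congested_time = int(congested.sum())         # hours at (or near) the NTC limit
    congestion_ratio = congested_time / len(flow_mw)
    return amtc, capacity_factor, congested_time, congestion_ratio

hours = 8760
rng = np.random.default_rng(0)
ntc = np.full(hours, 1000.0)
ntc[2000:3000] = 800.0                            # NTC reduced, e.g., for maintenance
flow = np.clip(rng.normal(550, 300, hours), -ntc, ntc)
print(usage_factors(flow, ntc))
```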

  11. Developing Objective Criteria for Evaluating Student Athletic Trainers.

    ERIC Educational Resources Information Center

    Treadway, Linda

    In devising a form for the evaluation of students preparing to become athletic trainers, it is helpful to have a checklist in which objectives and behavioral responses are organized into categories, such as prevention of injury, first aid, emergency care, treatment, rehabilitation, and taping and wrapping. It is also important to have records and…

  12. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  13. The Nuclear Renaissance — Implications on Quantitative Nondestructive Evaluations

    NASA Astrophysics Data System (ADS)

    Matzie, Regis A.

    2007-03-01

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, and Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that had actually turned away from nuclear energy are reconsidering the advisability of that decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that affect future requirements for quantitative nondestructive evaluations will be discussed. Features such as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  14. Quantile equivalence to evaluate compliance with habitat management objectives

    USGS Publications Warehouse

    Cade, Brian S.; Johnson, Pamela R.

    2011-01-01

    Equivalence estimated with linear quantile regression was used to evaluate compliance with habitat management objectives at Arapaho National Wildlife Refuge based on monitoring data collected in upland (5,781 ha; n = 511 transects) and riparian and meadow (2,856 ha; n = 389 transects) habitats from 2005 to 2008. Quantiles were used because the management objectives specified proportions of the habitat area that needed to comply with vegetation criteria. The linear model was used to obtain estimates that were averaged across 4 y. The equivalence testing framework allowed us to interpret confidence intervals for estimated proportions with respect to intervals of vegetative criteria (equivalence regions) in either a liberal, benefit-of-doubt or a conservative, fail-safe approach associated with minimizing alternative risks. Simple Boolean conditional arguments were used to combine the quantile equivalence results for individual vegetation components into a joint statement for the multivariable management objectives. For example, management objective 2A required at least 809 ha of upland habitat with a shrub composition ≥0.70 sagebrush (Artemisia spp.), 20–30% canopy cover of sagebrush ≥25 cm in height, ≥20% canopy cover of grasses, and ≥10% canopy cover of forbs on average over 4 y. Shrub composition and canopy cover of grass each were readily met on >3,000 ha under either conservative or liberal interpretations of sampling variability. However, there were only 809–1,214 ha (conservative to liberal) with ≥10% forb canopy cover and 405–1,098 ha with 20–30% canopy cover of sagebrush ≥25 cm in height. Only 91–180 ha of uplands simultaneously met criteria for all four components, primarily because canopy cover of sagebrush and forbs was inversely related when considered at the spatial scale (30 m) of a sample transect. We demonstrate how the quantile equivalence analyses also can help refine the numerical specification of habitat objectives and explore
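    The study itself used linear quantile regression averaged over four years; the sketch below is a deliberately simplified stand-in that conveys only the equivalence logic: estimate the proportion of transects meeting a criterion, bootstrap a confidence interval, and check it against the management target under conservative (fail-safe) and liberal (benefit-of-doubt) readings. The example reuses the 809-of-5,781 ha target quoted above; everything else is synthetic.

```python
# Rough sketch of the equivalence-testing idea (not the refuge's actual analysis).
import numpy as np

def proportion_ci(meets_criterion, n_boot=5000, alpha=0.10, seed=0):
    """Bootstrap mean and (1 - alpha) percentile CI for a 0/1 criterion vector."""
    rng = np.random.default_rng(seed)
    x = np.asarray(meets_criterion, dtype=float)
    boots = np.array([rng.choice(x, size=x.size, replace=True).mean() for _ in range(n_boot)])
    return x.mean(), np.quantile(boots, alpha / 2), np.quantile(boots, 1 - alpha / 2)

def equivalence(ci_lo, ci_hi, target_lo, target_hi=1.0):
    conservative = (ci_lo >= target_lo) and (ci_hi <= target_hi)   # fail-safe: CI inside region
    liberal = (ci_hi >= target_lo) and (ci_lo <= target_hi)        # benefit-of-doubt: CI overlaps
    return conservative, liberal

# e.g., criterion: >=10% forb canopy cover; target: at least 809 of 5,781 ha of upland.
forb_ok = np.random.default_rng(1).random(511) < 0.18
mean, lo, hi = proportion_ci(forb_ok)
print(equivalence(lo, hi, target_lo=809 / 5781))
```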

  15. Quantitative 3D Ultrashort Time-to-Echo (UTE) MRI and Micro-CT (μCT) Evaluation of the Temporomandibular Joint (TMJ) Condylar Morphology

    PubMed Central

    Geiger, Daniel; Bae, Won C.; Statum, Sheronda; Du, Jiang; Chung, Christine B.

    2014-01-01

    Objective: Temporomandibular dysfunction involves osteoarthritis of the TMJ, including degeneration and morphologic changes of the mandibular condyle. The purpose of this study was to determine the accuracy of novel 3D-UTE MRI versus micro-CT (μCT) for quantitative evaluation of mandibular condyle morphology. Material & Methods: Nine TMJ condyle specimens were harvested from cadavers (2M, 3F; age 85 ± 10 yrs., mean ± SD). 3D-UTE MRI (TR = 50 ms, TE = 0.05 ms, 104 μm isotropic voxel) was performed using a 3-T MR scanner, and μCT (18 μm isotropic voxel) was performed. The MR datasets were spatially registered with the μCT dataset. Two observers segmented bony contours of the condyles. Fibrocartilage was segmented on the MR dataset. Using a custom program, bone and fibrocartilage surface coordinates, Gaussian curvature, volume of segmented regions, and fibrocartilage thickness were determined for quantitative evaluation of joint morphology. Agreement between techniques (MRI vs. μCT) and observers (MRI vs. MRI) for Gaussian curvature, mean curvature, and segmented volume of the bone was determined using intraclass correlation coefficient (ICC) analyses. Results: Between MRI and μCT, the average deviation of surface coordinates was 0.19 ± 0.15 mm, slightly higher than the spatial resolution of MRI. The average deviation of the Gaussian curvature and volume of segmented regions, from MRI to μCT, was 5.7 ± 6.5% and 6.6 ± 6.2%, respectively. ICC coefficients (MRI vs. μCT) for Gaussian curvature, mean curvature, and segmented volumes were 0.892, 0.893, and 0.972, respectively. Between observers (MRI vs. MRI), the ICC coefficients were 0.998, 0.999, and 0.997, respectively. Fibrocartilage thickness was 0.55 ± 0.11 mm, as previously described in the literature for grossly normal TMJ samples. Conclusion: 3D-UTE MR quantitative evaluation of TMJ condyle morphology ex vivo, including surface, curvature, and segmented volume, shows high correlation against μCT and between observers. In addition, UTE MRI allows
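    The agreement statistic reported above is the intraclass correlation coefficient; as a generic illustration (not the authors' analysis script), the sketch below computes a two-way random-effects, absolute-agreement, single-measure ICC(2,1) from the ANOVA mean squares for paired measurements such as MRI- and μCT-derived curvature.

```python
# Generic ICC(2,1) sketch for method-agreement data; illustrative values only.
import numpy as np

def icc_2_1(data):
    """Two-way random-effects, absolute-agreement, single-measure ICC.
    data: (n_subjects, k_methods) array."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ms_r = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between subjects
    ms_c = n * ((data.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between methods
    ss_e = ((data - data.mean(axis=1, keepdims=True)
             - data.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = ss_e / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Example: curvature of 9 condyles measured by two methods that agree closely.
rng = np.random.default_rng(0)
truth = rng.normal(1.0, 0.3, 9)
pairs = np.column_stack([truth + rng.normal(0, 0.03, 9), truth + rng.normal(0, 0.03, 9)])
print(f"ICC(2,1) = {icc_2_1(pairs):.3f}")
```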

  16. Testing thermal comfort of trekking boots: an objective and subjective evaluation.

    PubMed

    Arezes, P M; Neves, M M; Teixeira, S F; Leão, C P; Cunha, J L

    2013-07-01

    The study of the thermal comfort of the feet when using a specific type of shoe is of paramount importance, in particular if the main goal of the study is to attend to the needs of users. The main aim of this study was to propose a test battery for thermal comfort analysis and to apply it to the analysis of trekking boots. Methodologically, the project involves both objective and subjective evaluations. An objective evaluation of the thermal properties of the fabrics used in the boots was developed and applied. In addition, the thermal comfort provided when using the boots was also assessed both subjectively and objectively. The evaluation of the thermal comfort during use, which was simulated in a laboratory environment, included the measurement of the temperature and moisture of the feet. The subjective assessment was performed using a questionnaire. From the results obtained, it was possible to define an optimal combination of fabrics to apply to trekking boots by considering the provided thermal insulation, air permeability and wicking. The results also revealed that the subjective perception of thermal comfort appears to be more related to the increase in temperature of the feet than to the moisture retention inside the boot. Although the evaluation of knits used in the boots indicated that a particular combination of fibres was optimal for use in the inner layer, the subjective and objective evaluation of thermal comfort revealed that the evaluation provided by users did not necessarily match the technical assessment data. No correlation was observed between the general comfort and specific thermal comfort assessments. Finally, the identification of thermal discomfort by specific foot areas would be useful in the process of designing and developing boots. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  17. An experimental comparison of online object-tracking algorithms

    NASA Astrophysics Data System (ADS)

    Wang, Qing; Chen, Feng; Xu, Wenli; Yang, Ming-Hsuan

    2011-09-01

    This paper reviews and evaluates several state-of-the-art online object tracking algorithms. Notwithstanding decades of efforts, object tracking remains a challenging problem due to factors such as illumination, pose, scale, deformation, motion blur, noise, and occlusion. To account for appearance change, most recent tracking algorithms focus on robust object representations and effective state prediction. In this paper, we analyze the components of each tracking method and identify their key roles in dealing with specific challenges, thereby shedding light on how to choose and design algorithms for different situations. We compare state-of-the-art online tracking methods including the IVT [1], VRT [2], FragT [3], BoostT [4], SemiT [5], BeSemiT [6], L1T [7], MILT [8], VTD [9] and TLD [10] algorithms on numerous challenging sequences, and evaluate them with different performance metrics. The qualitative and quantitative comparative results demonstrate the strength and weakness of these algorithms.
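    The record does not spell out which performance metrics were used, so the sketch below shows two that are standard in this kind of tracker comparison: center location error and bounding-box overlap (intersection over union). The (x, y, w, h) box format and the example values are illustrative.

```python
# Two common tracking metrics; illustrative, not the paper's evaluation code.
import numpy as np

def center_error(box_a, box_b):
    """Euclidean distance between the centers of two (x, y, w, h) boxes."""
    ca = np.array([box_a[0] + box_a[2] / 2, box_a[1] + box_a[3] / 2])
    cb = np.array([box_b[0] + box_b[2] / 2, box_b[1] + box_b[3] / 2])
    return float(np.linalg.norm(ca - cb))

def overlap_iou(box_a, box_b):
    """Intersection over union of two (x, y, w, h) boxes."""
    ax2, ay2 = box_a[0] + box_a[2], box_a[1] + box_a[3]
    bx2, by2 = box_b[0] + box_b[2], box_b[1] + box_b[3]
    iw = max(0.0, min(ax2, bx2) - max(box_a[0], box_b[0]))
    ih = max(0.0, min(ay2, by2) - max(box_a[1], box_b[1]))
    inter = iw * ih
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

print(center_error((10, 10, 50, 80), (14, 12, 50, 80)),
      overlap_iou((10, 10, 50, 80), (14, 12, 50, 80)))
```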

  18. Laser flare photometry: a noninvasive, objective, and quantitative method to measure intraocular inflammation.

    PubMed

    Tugal-Tutkun, Ilknur; Herbort, Carl P

    2010-10-01

    Aqueous flare and cells are the two inflammatory parameters of anterior chamber inflammation resulting from disruption of the blood-ocular barriers. When examined with the slit lamp, measurement of intraocular inflammation remains subjective with considerable intra- and interobserver variations. Laser flare cell photometry is an objective quantitative method that enables accurate measurement of these parameters with very high reproducibility. Laser flare photometry allows detection of subclinical alterations in the blood-ocular barriers, identifying subtle pathological changes that could not have been recorded otherwise. With the use of this method, it has been possible to compare the effect of different surgical techniques, surgical adjuncts, and anti-inflammatory medications on intraocular inflammation. Clinical studies of uveitis patients have shown that flare measurements by laser flare photometry allowed precise monitoring of well-defined uveitic entities and prediction of disease relapse. Relationships of laser flare photometry values with complications of uveitis and visual loss further indicate that flare measurement by laser flare photometry should be included in the routine follow-up of patients with uveitis.

  19. Quantitative lung perfusion evaluation using Fourier decomposition perfusion MRI.

    PubMed

    Kjørstad, Åsmund; Corteville, Dominique M R; Fischer, Andre; Henzler, Thomas; Schmid-Bindert, Gerald; Zöllner, Frank G; Schad, Lothar R

    2014-08-01

    To quantitatively evaluate lung perfusion using Fourier decomposition perfusion MRI. The Fourier decomposition (FD) method is a noninvasive method for assessing ventilation- and perfusion-related information in the lungs, where the perfusion maps in particular have shown promise for clinical use. However, the perfusion maps are nonquantitative and dimensionless, making follow-ups and direct comparisons between patients difficult. We present an approach to obtain physically meaningful and quantifiable perfusion maps using the FD method. The standard FD perfusion images are quantified by comparing the partially blood-filled pixels in the lung parenchyma with the fully blood-filled pixels in the aorta. The percentage of blood in a pixel is then combined with the temporal information, yielding quantitative blood flow values. The values of 10 healthy volunteers are compared with SEEPAGE measurements, which have shown high consistency with dynamic contrast-enhanced MRI. All pulmonary blood flow (PBF) values are within the expected range. The two methods are in good agreement (mean difference = 0.2 mL/min/100 mL, mean absolute difference = 11 mL/min/100 mL, mean PBF-FD = 150 mL/min/100 mL, mean PBF-SEEPAGE = 151 mL/min/100 mL). The Bland-Altman plot shows a good spread of values, indicating no systematic bias between the methods. Quantitative lung perfusion can be obtained using the Fourier Decomposition method combined with a small amount of postprocessing. Copyright © 2013 Wiley Periodicals, Inc.
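    A minimal sketch of the agreement statistics quoted above (mean difference, mean absolute difference, and Bland-Altman limits of agreement) for paired PBF values from the two methods; the numbers below are made up for illustration and are not the study data.

```python
# Bland-Altman style agreement summary for paired measurements (units mL/min/100 mL).
import numpy as np

def bland_altman(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()                      # mean difference (systematic bias)
    mad = np.abs(diff).mean()               # mean absolute difference
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement
    return bias, mad, loa

pbf_fd = np.array([150, 140, 160, 145, 155, 148, 152, 158, 147, 150], float)
pbf_seepage = np.array([151, 138, 172, 140, 150, 160, 149, 162, 140, 152], float)
print(bland_altman(pbf_fd, pbf_seepage))
```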

  20. Objective measurements to evaluate glottal space segmentation from laryngeal images.

    PubMed

    Gutiérrez-Arriola, J M; Osma-Ruiz, V; Sáenz-Lechón, N; Godino-Llorente, J I; Fraile, R; Arias-Londoño, J D

    2012-01-01

    Objective evaluation of the results of medical image segmentation is a known problem. Applied to the task of automatically detecting the glottal area from laryngeal images, this paper proposes a new objective measurement to evaluate the quality of a segmentation algorithm by comparing its output with the results given by a human expert. The new figure of merit is called the Area Index, and its effectiveness is compared with one of the most widely used figures of merit found in the literature: the Pratt Index. Results over 110 laryngeal images showed high correlations between both indexes, demonstrating that the proposed measure is comparable to the Pratt Index and is a good indicator of segmentation quality.
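    The Area Index itself is not defined in this record, but the reference metric it is compared against, Pratt's figure of merit, has a standard form; the sketch below implements it for two binary edge maps using a distance transform (alpha = 1/9 is the conventional scaling constant).

```python
# Pratt's figure of merit for comparing a detected boundary against a reference one.
import numpy as np
from scipy.ndimage import distance_transform_edt

def pratt_fom(detected_edges, reference_edges, alpha=1.0 / 9.0):
    """detected_edges, reference_edges: boolean 2-D edge maps of equal shape."""
    d_to_ref = distance_transform_edt(~reference_edges)  # distance to nearest reference edge pixel
    d = d_to_ref[detected_edges]
    n_det, n_ref = detected_edges.sum(), reference_edges.sum()
    return np.sum(1.0 / (1.0 + alpha * d ** 2)) / max(n_det, n_ref)

# Tiny example: a detected contour shifted by one pixel from the reference.
ref = np.zeros((50, 50), bool); ref[25, 10:40] = True
det = np.zeros((50, 50), bool); det[26, 10:40] = True
print(f"Pratt FOM = {pratt_fom(det, ref):.3f}")
```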

  1. Subjective and objective voice evaluation in Sjögren's syndrome.

    PubMed

    Saltürk, Ziya; Özdemir, Erdi; Kumral, Tolgar Lütfi; Karabacakoğlu, Zeynep; Kumral, Esra; Yildiz, Hatice Elvin; Mersinlioğlu, Gökhan; Atar, Yavuz; Berkiten, Güler; Yildirim, Güven; Uyar, Yavuz

    2017-04-01

    Objective: The aim of this study is to assess the subjective and objective aspects of voice in Sjögren's syndrome. Methods: The study enrolled 10 women with Sjögren's syndrome and 12 healthy women. Maximum phonation time, fundamental frequency, jitter, shimmer, and noise-to-harmonics ratio were determined during acoustic voice analysis. The Stroboscopy Evaluation Rating Form was used for the laryngostroboscopic evaluation. A subjective evaluation was performed using the Turkish version of Voice Handicap Index-10. Results: The mean age of the Sjögren's syndrome and control groups was 46 ± 13.89 and 41.27 ± 6.99 years, respectively, and did not differ (P = 0.131). In the laryngostroboscopic evaluation, the smoothness and straightness of vocal folds, regularity, and glottal closure differed significantly. In the acoustic and aerodynamic analyses, none of the parameters differed statistically, while the Sjögren's syndrome group had significantly higher Voice Handicap Index-10 scores than the controls. Conclusion: Sjögren's syndrome affects the voice and voice quality.

  2. Fluorescent proteins for quantitative microscopy: important properties and practical evaluation.

    PubMed

    Shaner, Nathan Christopher

    2014-01-01

    More than 20 years after their discovery, fluorescent proteins (FPs) continue to be the subject of massive engineering efforts yielding continued improvements. Among these efforts are many aspects that should be of great interest to quantitative imaging users. With new variants frequently introduced into the research community, "tried and true" FPs that have been relied on for many years may now be due for upgrades to more modern variants. However, the dizzying array of FPs now available can make the initial act of narrowing down the potential choices an intimidating prospect. This chapter describes the FP properties that most strongly impact their performance in quantitative imaging experiments, along with their physical origins as they are currently understood. A workflow for evaluating a given FP in the researcher's chosen experimental system (e.g., a specific cell line) is described. © 2014 Elsevier Inc. All rights reserved.

  3. Miniature objective lens for array digital pathology: design improvement based on clinical evaluation

    NASA Astrophysics Data System (ADS)

    McCall, Brian; Pierce, Mark; Graviss, Edward A.; Richards-Kortum, Rebecca R.; Tkaczyk, Tomasz S.

    2016-03-01

    A miniature objective designed for digital detection of Mycobacterium tuberculosis (MTB) was evaluated for diagnostic accuracy. The objective was designed for array microscopy, but fabricated and evaluated at this stage of development as a single objective. The counts and diagnoses of patient samples were directly compared for digital detection and standard microscopy. The results were found to be correlated and highly concordant. The evaluation of this lens by direct comparison to standard fluorescence sputum smear microscopy presented unique challenges and led to some new insights in the role played by the system parameters of the microscope. The design parameters and how they were developed are reviewed in light of these results. New system parameters are proposed with the goal of easing the challenges of evaluating the miniature objective and maintaining the optical performance that produced the agreeable results presented without over-optimizing. A new design is presented that meets and exceeds these criteria.

  4. Method and Application for Dynamic Comprehensive Evaluation with Subjective and Objective Information

    PubMed Central

    Liu, Dinglin; Zhao, Xianglian

    2013-01-01

    In an effort to deal with more complicated evaluation situations, scientists have focused their efforts on dynamic comprehensive evaluation research. How to make full use of subjective and objective information has become a noteworthy issue. In this paper, a dynamic comprehensive evaluation method using subjective and objective information is proposed. We use a combination weighting method to determine the index weights: the analytic hierarchy process (AHP) method is applied to handle the subjective information, and the criteria importance through intercriteria correlation (CRITIC) method is used to handle the objective information. For the time-weight determination, we consider both time distance and information size to embody the principle of esteeming the present over the past. A linear weighted average model is then constructed to make the evaluation process more practicable. Finally, an example is presented to illustrate the effectiveness of this method. Overall, the results suggest that the proposed method is reasonable and effective. PMID:24386176
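    As a rough sketch of the scheme described above, and not the authors' implementation, the code below derives objective weights with the CRITIC method, mixes them with given AHP (subjective) weights, and aggregates scores over time with weights that favor recent periods. The mixing ratio and decay factor are assumptions for illustration.

```python
# Hypothetical combination-weighting sketch: CRITIC (objective) + given AHP (subjective)
# weights, aggregated over time with weights that favor more recent periods.
import numpy as np

def critic_weights(X):
    """X: (alternatives x criteria) matrix of benefit-type criteria on comparable scales."""
    Z = (X - X.min(axis=0)) / (np.ptp(X, axis=0) + 1e-12)    # normalize each criterion to [0, 1]
    sigma = Z.std(axis=0, ddof=1)                            # contrast intensity
    R = np.corrcoef(Z, rowvar=False)
    info = sigma * (1.0 - R).sum(axis=0)                     # information carried by each criterion
    return info / info.sum()

def combined_scores(X_by_time, ahp_w, mix=0.5, time_decay=0.7):
    """X_by_time: list of (alternatives x criteria) matrices, oldest first."""
    T = len(X_by_time)
    tw = np.array([time_decay ** (T - 1 - t) for t in range(T)])
    tw /= tw.sum()                                           # most recent period weighted highest
    scores = np.zeros(X_by_time[0].shape[0])
    for t, X in enumerate(X_by_time):
        w = mix * np.asarray(ahp_w) + (1 - mix) * critic_weights(X)
        scores += tw[t] * (X @ w)
    return scores

X1 = np.array([[0.7, 0.5, 0.9], [0.6, 0.8, 0.4], [0.9, 0.4, 0.6]])
X2 = X1 + 0.05
print(combined_scores([X1, X2], ahp_w=[0.5, 0.3, 0.2]))
```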

  5. A New Approach for the Quantitative Evaluation of Drawings in Children with Learning Disabilities

    ERIC Educational Resources Information Center

    Galli, Manuela; Vimercati, Sara Laura; Stella, Giacomo; Caiazzo, Giorgia; Norveti, Federica; Onnis, Francesca; Rigoldi, Chiara; Albertini, Giorgio

    2011-01-01

    A new method for a quantitative and objective description of drawing and for the quantification of drawing ability in children with learning disabilities (LD) is hereby presented. Twenty-four normally developing children (N) (age 10.6 ± 0.5) and 18 children with learning disabilities (LD) (age 10.3 ± 2.4) took part in…

  6. Beyond CCT: The spectral index system as a tool for the objective, quantitative characterization of lamps

    NASA Astrophysics Data System (ADS)

    Galadí-Enríquez, D.

    2018-02-01

    Correlated color temperature (CCT) is a semi-quantitative system that roughly describes the spectra of lamps. This parameter gives the temperature (measured in kelvins) of the black body that would show the hue most similar to that of the light emitted by the lamp. Modern lamps for indoor and outdoor lighting display many spectral energy distributions, most of them extremely different from those of black bodies, which makes CCT far from a perfect descriptor from the physical point of view. The spectral index system presented in this work provides an accurate, objective, quantitative procedure to characterize the spectral properties of lamps with just a few numbers. The system is an adaptation to lighting technology of the classical procedures of multi-band astronomical photometry with wide- and intermediate-band filters. We describe the basic concepts and apply the system to a representative set of lamps of many kinds. The results lead to interesting, sometimes surprising conclusions. The spectral index system is extremely easy to implement from the spectral data that are routinely measured at laboratories. Thus, including this kind of computation in the standard protocols for the certification of lamps will be straightforward and will enrich the technical description of lighting devices.
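    A minimal sketch of the photometry-style idea, under stated assumptions: integrate a lamp's spectral power distribution through band response curves and form a magnitude-like color index from the ratio of in-band fluxes. The rectangular "blue" and "red" bands below are placeholders, not the bands actually defined in the paper.

```python
# Illustrative photometry-style spectral index from a lamp SPD; bands are hypothetical.
import numpy as np

def band_flux(wl_nm, spd, lo, hi):
    """Integrate the SPD over a rectangular band [lo, hi] nm."""
    in_band = (wl_nm >= lo) & (wl_nm <= hi)
    return np.trapz(spd[in_band], wl_nm[in_band])

def spectral_index(wl_nm, spd, band_a=(420, 500), band_b=(600, 680)):
    fa = band_flux(wl_nm, spd, *band_a)
    fb = band_flux(wl_nm, spd, *band_b)
    return -2.5 * np.log10(fa / fb)          # astronomical-style color index

# Example: a crude warm-white-like SPD rising toward the red end.
wl = np.arange(380, 781, 1.0)
spd = 0.2 + 0.8 * (wl - 380) / 400
print(f"blue-red index = {spectral_index(wl, spd):.2f}")
```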

  7. Validation of virtual learning object to support the teaching of nursing care systematization.

    PubMed

    Salvador, Pétala Tuani Candido de Oliveira; Mariz, Camila Maria Dos Santos; Vítor, Allyne Fortes; Ferreira Júnior, Marcos Antônio; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2018-01-01

    To describe the content validation process of a Virtual Learning Object to support the teaching of nursing care systematization to nursing professionals. This is a methodological study with a quantitative approach, developed according to Pasquali's psychometric framework and conducted from March to July 2016 using a two-stage Delphi procedure. In the Delphi 1 stage, eight judges evaluated the Virtual Object; in the Delphi 2 stage, seven judges evaluated it. The seven screens of the Virtual Object were analyzed for the suitability of their contents. The Virtual Learning Object to support the teaching of nursing care systematization was considered valid in its content, with a Total Content Validity Coefficient of 0.96. It is expected that the Virtual Object can support the teaching of nursing care systematization in light of appropriate and effective pedagogical approaches.

  8. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy because of differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions still needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast, mouse, and human, and to quantitative methods ranging from label-free to 18O/16O-labeled and SILAC analysis, and it enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. A data set for evaluating the performance of multi-class multi-object video tracking

    NASA Astrophysics Data System (ADS)

    Chakraborty, Avishek; Stamatescu, Victor; Wong, Sebastien C.; Wigley, Grant; Kearney, David

    2017-05-01

    One of the challenges in evaluating multi-object video detection, tracking and classification systems is having publicly available data sets with which to compare different systems. However, the measures of performance for tracking and classification are different. Data sets that are suitable for evaluating tracking systems may not be appropriate for classification. Tracking video data sets typically only have ground truth track IDs, while classification video data sets only have ground truth class-label IDs. The former identifies the same object over multiple frames, while the latter identifies the type of object in individual frames. This paper describes an advancement of the ground truth meta-data for the DARPA Neovision2 Tower data set to allow both the evaluation of tracking and classification. The ground truth data sets presented in this paper contain unique object IDs across 5 different classes of object (Car, Bus, Truck, Person, Cyclist) for 24 videos of 871 image frames each. In addition to the object IDs and class labels, the ground truth data also contains the original bounding box coordinates together with new bounding boxes in instances where un-annotated objects were present. The unique IDs are maintained during occlusions between multiple objects or when objects re-enter the field of view. This will provide: a solid foundation for evaluating the performance of multi-object tracking of different types of objects, a straightforward comparison of tracking system performance using the standard Multi Object Tracking (MOT) framework, and classification performance using the Neovision2 metrics. These data have been hosted publicly.

  10. A quantitative analysis to objectively appraise drought indicators and model drought impacts

    NASA Astrophysics Data System (ADS)

    Bachmair, S.; Svensson, C.; Hannaford, J.; Barker, L. J.; Stahl, K.

    2016-07-01

    coverage. The predictions also provided insights into the EDII, in particular highlighting drought events where missing impact reports may reflect a lack of recording rather than true absence of impacts. Overall, the presented quantitative framework proved to be a useful tool for evaluating drought indicators, and to model impact occurrence. In summary, this study demonstrates the information gain for drought monitoring and early warning through impact data collection and analysis. It highlights the important role that quantitative analysis with impact data can have in providing "ground truth" for drought indicators, alongside more traditional stakeholder-led approaches.

  11. Determining quantitative immunophenotypes and evaluating their implications

    NASA Astrophysics Data System (ADS)

    Redelman, Douglas; Hudig, Dorothy; Berner, Dave; Castell, Linda M.; Roberts, Don; Ensign, Wayne

    2002-05-01

    Quantitative immunophenotypes varied widely among > 100 healthy young males but were maintained at characteristic levels within individuals. The initial results (SPIE Proceedings 4260:226) that examined cell numbers and the quantitative expression of adhesion and lineage-specific molecules, e.g., CD2 and CD14, have now been confirmed and extended to include the quantitative expression of inducible molecules such as HLA-DR and perforin (Pf). Some properties, such as the ratio of T helper (Th) to T cytotoxic/suppressor (Tc/s) cells, are known to be genetically determined. Other properties, e.g., the T:B cell ratio, the amount of CD19 per B cell, etc., behaved similarly and may also be inherited traits. Since some patterns observed in these healthy individuals resembled those found in pathological situations, we tested whether the patterns could be associated with the occurrence of disease. The current study shows that there were associations between quantitative immunophenotypes and the subsequent incidence and severity of disease. For example, individuals with characteristically low levels of HLA-DR or B cells or reduced numbers of Pf+ Tc/s cells had more frequent and/or more severe upper respiratory infections. Quantitative immunophenotypes will be more widely measured if the necessary standards are available and if appropriate procedures are made more accessible.

  12. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  13. Evaluating an Objective Structured Clinical Examination (OSCE) Adapted for Social Work

    ERIC Educational Resources Information Center

    Bogo, Marion; Regehr, Cheryl; Katz, Ellen; Logie, Carmen; Tufford, Lea; Litvack, Andrea

    2012-01-01

    Objectives: To evaluate an objective structured clinical examination (OSCE) adapted for social work in a lab course and examine the degree to which it predicts competence in the practicum. Method: 125 Masters students participated in a one-scenario OSCE and wrote responses to standardized reflection questions. OSCE performance and reflections were…

  14. [Clinical evaluation of a novel HBsAg quantitative assay].

    PubMed

    Takagi, Kazumi; Tanaka, Yasuhito; Naganuma, Hatsue; Hiramatsu, Kumiko; Iida, Takayasu; Takasaka, Yoshimitsu; Mizokami, Masashi

    2007-07-01

    The clinical implications of the hepatitis B surface antigen (HBsAg) concentration in HBV-infected individuals remain unclear. The aim of this study was to evaluate a novel fully automated Chemiluminescence Enzyme Immunoassay (Sysmex HBsAg quantitative assay) by comparative measurements of reference serum samples versus two independent commercial assays (Lumipulse f or Architect HBsAg QT). Furthermore, its clinical usefulness was assessed for monitoring serum HBsAg levels during antiviral therapy. A dilution test using 5 reference serum samples showed a linear correlation curve in the range from 0.03 to 2,360 IU/ml. HBsAg was measured in a total of 400 serum samples, and 99.8% had consistent results between Sysmex and Lumipulse f. Additionally, a positive linear correlation was observed between Sysmex and Architect. To compare the Architect and Sysmex assays, both methods were applied to quantify HBsAg in serum samples with different HBV genotypes/subgenotypes, as well as in serum containing HBV vaccine escape mutants (126S, 145R). Correlation between the methods was observed in the results for escape mutants and the genotypes (A, B, C) common in Japan. During lamivudine therapy, an increase in HBsAg and HBV DNA concentrations preceded the alanine aminotransferase (ALT) elevation associated with the emergence of drug-resistant HBV variants (breakthrough hepatitis). In conclusion, the reliability of the Sysmex HBsAg quantitative assay was confirmed for all HBV genetic variants common in Japan. Monitoring of serum HBsAg concentrations, in addition to HBV DNA quantification, is helpful in evaluating the response to lamivudine treatment and in diagnosing breakthrough hepatitis.

  15. Designing automation for human use: empirical studies and quantitative models.

    PubMed

    Parasuraman, R

    2000-07-01

    An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.

  16. An Evaluation of Database Solutions to Spatial Object Association

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, V S; Kurc, T; Saltz, J

    2008-06-24

    Object association is a common problem encountered in many applications. Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two datasets based on their positions in a common spatial coordinate system--one of the datasets may correspond to a catalog of objects observed over time in a multi-dimensional domain; the other dataset may consist of objects observed in a snapshot of the domain at a time point. The use of database management systems to solve the object association problem provides portability across different platforms and also greater flexibility. Increasing dataset sizes in today's applications, however, have made object association a data/compute-intensive problem that requires targeted optimizations for efficient execution. In this work, we investigate how database-based crossmatch algorithms can be deployed on different database system architectures and evaluate the deployments to understand the impact of architectural choices on crossmatch performance and associated trade-offs. We investigate the execution of two crossmatch algorithms on (1) a parallel database system with active disk style processing capabilities, (2) a high-throughput network database (MySQL Cluster), and (3) shared-nothing databases with replication. We have conducted our study in the context of a large-scale astronomy application with real use-case scenarios.
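    The paper's focus is on database deployments, but the crossmatch operation itself can be illustrated independently of any database engine; the sketch below matches snapshot objects against a catalog by nearest neighbour within a tolerance radius using a k-d tree, treating coordinates as planar for simplicity.

```python
# Conceptual crossmatch illustration (not the database implementations studied above).
import numpy as np
from scipy.spatial import cKDTree

def crossmatch(catalog_xy, snapshot_xy, radius):
    """Return (snapshot_index, catalog_index) pairs within `radius` of each other."""
    tree = cKDTree(catalog_xy)
    dist, idx = tree.query(snapshot_xy, k=1, distance_upper_bound=radius)
    matched = np.isfinite(dist)              # unmatched queries come back with infinite distance
    return np.column_stack([np.nonzero(matched)[0], idx[matched]])

rng = np.random.default_rng(0)
catalog = rng.uniform(0, 100, size=(1000, 2))
snapshot = catalog[rng.choice(1000, 50, replace=False)] + rng.normal(0, 0.01, (50, 2))
print(len(crossmatch(catalog, snapshot, radius=0.05)), "matches of 50 snapshot objects")
```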

  17. Digital learning objects in nursing consultation: technology assessment by undergraduate students.

    PubMed

    Silveira, DeniseTolfo; Catalan, Vanessa Menezes; Neutzling, Agnes Ludwig; Martinato, Luísa Helena Machado

    2010-01-01

    This study followed the teaching-learning process about the nursing consultation, based on digital learning objects developed through the active Problem Based Learning method. The goals were to evaluate the digital learning objects about nursing consultation, develop cognitive skills on the subject using problem based learning and identify the students' opinions on the use of technology. This is an exploratory and descriptive study with a quantitative approach. The sample consisted of 71 students in the sixth period of the nursing program at the Federal University of Rio Grande do Sul. The data was collected through a questionnaire to evaluate the learning objects. The results showed positive agreement (58%) on the content, usability and didactics of the proposed computer-mediated activity regarding the nursing consultation. The application of materials to the students is considered positive.

  18. Evaluating the inverse reasoning account of object discovery.

    PubMed

    Carroll, Christopher D; Kemp, Charles

    2015-06-01

    People routinely make inferences about unobserved objects. A hotel guest with welts on his arms, for example, will often worry about bed bugs. The discovery of unobserved objects almost always involves a backward inference from some observed effects (e.g., welts) to unobserved causes (e.g., bed bugs). The inverse reasoning account, which is typically formalized as Bayesian inference, posits that the strength of a backward inference is closely connected to the strength of the corresponding forward inference from the unobserved causes to the observed effects. We evaluated the inverse reasoning account of object discovery in three experiments where participants were asked to discover the unobserved "attractors" and "repellers" that controlled a "particle" moving within an arena. Experiments 1 and 2 showed that participants often failed to provide the best explanations for various particle motions, even when the best explanations were simple and when participants enthusiastically endorsed these explanations when presented with them. This failure demonstrates that object discovery is critically dependent on the processes that support hypothesis generation-processes that the inverse reasoning account does not explain. Experiment 3 demonstrated that people sometimes generate explanations that are invalid even according to their own forward inferences, suggesting that the psychological processes that support forward and backward inference are less intertwined than the inverse reasoning account suggests. The experimental findings support an alternative account of object discovery in which people rely on heuristics to generate possible explanations. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology

    PubMed Central

    Zhang, Wen; Cao, Jieer

    2017-01-01

    How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios, described by impact velocity, impact angle and contact position, based on a finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in a well-validated multi-rigid body (MB) model to compute the value of a weighted injury criterion to quantitatively assess drivers' overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. It is revealed that the WIC (Weighted Injury Criteria) value variation law under high impact velocities is quite distinct from that under low impact velocities. In addition, the coupling effect can be elucidated by the fact that the difference in WIC value among the three impact velocities under smaller impact angles tends to be distinctly higher than that under larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC under different collision positions and impact angles. Results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles. PMID:29240789

  20. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology.

    PubMed

    Zhang, Wen; Cao, Jieer; Xu, Jun

    2017-01-01

    How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios, described by impact velocity, impact angle and contact position, based on a finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in a well-validated multi-rigid body (MB) model to compute the value of a weighted injury criterion to quantitatively assess drivers' overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. It is revealed that the WIC (Weighted Injury Criteria) value variation law under high impact velocities is quite distinct from that under low impact velocities. In addition, the coupling effect can be elucidated by the fact that the difference in WIC value among the three impact velocities under smaller impact angles tends to be distinctly higher than that under larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC under different collision positions and impact angles. Results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles.
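    The exact components and weights of the WIC used in these papers are not given in the records, so the sketch below shows only the generic shape of a weighted injury criterion: normalize individual dummy injury measures by reference limits and combine them with weights. All weights are placeholders; the limits in the comments are commonly cited regulatory reference values, included purely for illustration.

```python
# Generic weighted-injury-criterion sketch; NOT the WIC definition used by the authors.
REFERENCE_LIMITS = {"HIC15": 700.0,                 # head injury criterion limit
                    "chest_deflection_mm": 63.0,    # chest deflection limit
                    "femur_force_kN": 10.0}         # femur axial force limit
WEIGHTS = {"HIC15": 0.4, "chest_deflection_mm": 0.35, "femur_force_kN": 0.25}  # hypothetical

def weighted_injury(measures):
    """measures: dict of raw injury values from the multi-rigid-body simulation."""
    return sum(WEIGHTS[k] * measures[k] / REFERENCE_LIMITS[k] for k in WEIGHTS)

# Example: one simulated crash scenario (values illustrative only).
example = {"HIC15": 450.0, "chest_deflection_mm": 30.0, "femur_force_kN": 4.2}
print(f"WIC-like score = {weighted_injury(example):.2f}")
```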

  1. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated into related research in the future. © 2014 Wiley Periodicals, Inc.

  2. [Reconsidering evaluation criteria regarding health care research: toward an integrative framework of quantitative and qualitative criteria].

    PubMed

    Miyata, Hiroaki; Kai, Ichiro

    2006-05-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confused, and the clutter of terms and arguments has left the concepts obscure and unrecognizable. It is therefore very important to reconsider evaluation criteria regarding rigor in social science. As Lincoln & Guba have already compared quantitative paradigms (validity, reliability, neutrality, generalizability) with qualitative paradigms (credibility, dependability, confirmability, transferability), we discuss the use of evaluation criteria from a pragmatic perspective. Validity/Credibility concerns the observational framework, Reliability/Dependability refers to the range of stability in observations, Neutrality/Confirmability reflects influences between observers and subjects, and Generalizability/Transferability marks an epistemological difference in how findings are applied. Qualitative studies, however, do not always choose the qualitative paradigms. If we can assume stability to some extent, it is better to use the quantitative paradigm (reliability). Moreover, as a quantitative study cannot always guarantee a perfect observational framework with stability in all phases of observation, it is useful to use qualitative paradigms to enhance the rigor of the study.

  3. Objective evaluation of the visual acuity in human eyes

    NASA Astrophysics Data System (ADS)

    Rosales, M. A.; López-Olazagasti, E.; Ramírez-Zavaleta, G.; Varillas, G.; Tepichín, E.

    2009-08-01

    Traditionally, the quality of human vision is evaluated with a subjective test in which the examiner asks the patient to read a series of characters of different sizes located at a certain distance from the patient. Typically, we need to ensure a subtended visual angle of 5 minutes of arc, which corresponds to an object 8.8 mm high located at 6 meters (normal or 20/20 visual acuity). These characters constitute what is known as the Snellen chart, universally used to evaluate the spatial resolution of the human eye. The identification of characters is carried out by the eye-brain system, giving an evaluation of subjective visual performance. In this work we consider the eye as an isolated image-forming system and show that it is possible to separate the function of the eye from that of the brain in this process. By knowing the impulse response of the eye's optical system, we can obtain, in advance, the image of the entire Snellen chart. From this information, we obtain the objective performance of the eye as the optical system under test. This type of result might help to detect anomalous conditions of human vision, like the so-called "cerebral myopia".
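
    The 5-arcminute figure quoted above can be checked with elementary small-angle geometry (a worked check, not taken from the paper):

        \theta = \arctan\left(\frac{h}{d}\right) = \arctan\left(\frac{8.8\ \mathrm{mm}}{6000\ \mathrm{mm}}\right) \approx 1.47 \times 10^{-3}\ \mathrm{rad} \approx 5.0\ \mathrm{arcmin}

    so a 20/20 optotype spans about 5 arcmin overall, with each stroke subtending roughly 1 arcmin.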

  4. A new, objective, quantitative scale for measuring local skin responses following topical actinic keratosis therapy with ingenol mebutate.

    PubMed

    Rosen, Robert; Marmur, Ellen; Anderson, Lawrence; Welburn, Peter; Katsamas, Janelle

    2014-12-01

    Local skin responses (LSRs) are the most common adverse effects of topical actinic keratosis (AK) therapy. There is currently no method available that allows objective characterization of LSRs. Here, the authors describe a new scale developed to quantitatively and objectively assess the six most common LSRs resulting from topical AK therapy with ingenol mebutate. The LSR grading scale was developed using a 0-4 numerical rating, with clinical descriptors and representative photographic images for each rating. Good inter-observer grading concordance was demonstrated in peer review during development of the tool. Data on the use of the scale are described from four phase III double-blind studies of ingenol mebutate (n = 1,005). LSRs peaked on days 4 (face/scalp) or 8 (trunk/extremities), with mean maximum composite LSR scores of 9.1 and 6.8, respectively, and a rapid return toward baseline by day 15 in most cases. Mean composite LSR score at day 57 was generally lower than at baseline. The LSR grading scale is an objective tool allowing practicing dermatologists to characterize and compare LSRs to existing and, potentially, future AK therapies.
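
    As a rough sketch of how a composite score on a 0-4 scale of this kind can be tallied, the snippet below assumes the composite is the sum of six component grades (range 0-24); the component names follow common descriptions of the published LSR scale and the example grades are invented.

        # Six local skin responses, each graded 0-4; composite assumed to be their sum (0-24).
        LSR_COMPONENTS = ("erythema", "flaking_scaling", "crusting",
                          "swelling", "vesiculation_pustulation", "erosion_ulceration")

        def composite_lsr(grades):
            """Sum of the six 0-4 component grades."""
            assert set(grades) == set(LSR_COMPONENTS)
            assert all(0 <= g <= 4 for g in grades.values())
            return sum(grades.values())

        day4_face_scalp = {"erythema": 3, "flaking_scaling": 2, "crusting": 1,
                           "swelling": 2, "vesiculation_pustulation": 1, "erosion_ulceration": 0}
        print(composite_lsr(day4_face_scalp))  # -> 9 for this invented example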

  5. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of fibrosis in chronic hepatitis C has become an important issue for preventing the occurrence of cirrhosis and initiating appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness has been studied for detecting and evaluating tumors. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases such as chronic hepatitis, the patterns of elasticity images are related to the fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of the fibrosis stage.

  6. Creating Objects and Object Categories for Studying Perception and Perceptual Learning

    PubMed Central

    Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay

    2012-01-01

    In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties. Many innovative and useful methods currently exist for creating novel objects and object categories (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection. Objects and object categories created

  7. Creating objects and object categories for studying perception and perceptual learning.

    PubMed

    Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay

    2012-11-02

    In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties. Many innovative and useful methods currently exist for creating novel objects and object categories (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection. Objects and object categories created by these simulations can

  8. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  9. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    PubMed Central

    SAVAS, Selcuk; KAVRIK, Fevzi; KUCUKYILMAZ, Ebru

    2016-01-01

    Objective The aim of this study was to evaluate the efficacy of a CPP-ACP-containing fluoride varnish for remineralizing white spot lesions (WSLs) using four different quantitative methods. Material and Methods Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in an acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF pen). The data were statistically analyzed (α=0.05). Results While the LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05), the difference between the 1- and 4-week measurements was not significant (p>0.05). With regard to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). After the 1- and 4-week treatment periods, the calcium (Ca) and phosphate (P) concentrations and the Ca/P ratio were higher than those of the demineralized surfaces (p<0.05). Conclusion The CPP-ACP-containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use. PMID:27383699

  10. The effect of image sharpness on quantitative eye movement data and on image quality evaluation while viewing natural images

    NASA Astrophysics Data System (ADS)

    Vuori, Tero; Olkkonen, Maria

    2006-01-01

    The aim of the study is to test both customer image quality rating (subjective image quality) and physical measurement of user behavior (eye movement tracking) to find customer satisfaction differences between imaging technologies. A methodological aim is to find out whether eye movements could be used quantitatively in image quality preference studies. In general, we want to map objective or physically measurable image quality to subjective evaluations and eye movement data. We conducted a series of image quality tests in which the test subjects evaluated image quality while we recorded their eye movements. Results show that eye movement parameters change consistently according to the instructions given to the user and according to physical image quality; e.g., saccade duration increased with increasing blur. The results indicate that eye movement tracking could be used to differentiate the image quality evaluation strategies that users adopt. They also show that eye movements would help in mapping between technological and subjective image quality. Furthermore, these results give some empirical emphasis to top-down processes in image quality perception and evaluation by showing differences between perceptual processes in situations where the cognitive task varies.

  11. Quantitative methods for evaluating the efficacy of thalamic deep brain stimulation in patients with essential tremor.

    PubMed

    Wastensson, Gunilla; Holmberg, Björn; Johnels, Bo; Barregard, Lars

    2013-01-01

    Deep brain stimulation (DBS) of the thalamus is a safe and efficient method for treatment of disabling tremor in patients with essential tremor (ET). However, successful tremor suppression after surgery requires careful selection of stimulus parameters. Our aim was to examine the possible use of certain quantitative methods for evaluating the efficacy of thalamic DBS in ET patients in clinical practice, and to compare these methods with traditional clinical tests. We examined 22 patients using the Essential Tremor Rating Scale (ETRS) and quantitative assessment of tremor with the stimulator both activated and deactivated. We used an accelerometer (CATSYS tremor pen) for quantitative measurement of postural tremor and a eurythmokinesimeter (EKM) to evaluate kinetic tremor in a rapid pointing task. The efficacy of DBS on tremor suppression was prominent irrespective of the method used. The agreement between clinical rating of postural tremor and tremor intensity as measured by the CATSYS tremor pen was relatively high (rs = 0.74). The agreement between kinetic tremor as assessed by the ETRS and the main outcome variable from the EKM test was low (rs = 0.34). The lack of agreement indicates that the EKM test is not comparable with the clinical test. Quantitative methods, such as the CATSYS tremor pen, could be a useful complement to clinical tremor assessment in evaluating the efficacy of DBS in clinical practice. Future studies should evaluate the precision of these methods and the long-term impact on tremor suppression, activities of daily living (ADL) function and quality of life.
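
    The agreement values quoted above (rs) are Spearman rank correlations; a minimal sketch of how such agreement between a clinical rating and an accelerometer measure can be computed, with invented paired data:

        from scipy.stats import spearmanr

        # Hypothetical paired observations: clinical postural-tremor rating (0-4) and
        # accelerometer tremor intensity, one pair per patient.
        clinical_rating  = [0, 1, 1, 2, 2, 3, 3, 4, 4, 2]
        tremor_intensity = [0.1, 0.3, 0.2, 0.6, 0.5, 1.1, 0.9, 1.8, 2.0, 0.7]

        rs, p_value = spearmanr(clinical_rating, tremor_intensity)
        print(f"Spearman rs = {rs:.2f} (p = {p_value:.3f})")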

  12. Quantitative analysis of comparative genomic hybridization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manoir, S. du; Bentz, M.; Joos, S.

    1995-01-01

    Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.
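
    A schematic of the quantitation steps described above (per-chromosome fluorescence ratio profile, averaging across metaphase spreads, and a coefficient of variation as a quality criterion); the array shapes, simulated data and cutoff levels are assumptions for illustration only.

        import numpy as np

        # Hypothetical test/control ratio profiles sampled at 100 positions along one
        # chromosome, from 8 metaphase spreads.
        rng = np.random.default_rng(0)
        profiles = rng.normal(loc=1.0, scale=0.05, size=(8, 100))
        profiles[:, 60:] *= 1.5                          # simulate a gained segment

        mean_profile = profiles.mean(axis=0)             # averaged ratio profile
        balanced = profiles[:, :60]
        cv_balanced = balanced.std() / balanced.mean()   # CV for the balanced state

        gain = mean_profile > 1.25                       # illustrative cutoff levels
        loss = mean_profile < 0.75
        print(f"CV(balanced) = {cv_balanced:.3f}; gained bins: {gain.sum()}, lost bins: {loss.sum()}")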

  13. Quantitative evaluation of orbital hybridization in carbon nanotubes under radial deformation using π-orbital axis vector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohnishi, Masato, E-mail: masato.ohnishi@rift.mech.tohoku.ac.jp; Suzuki, Ken; Miura, Hideo, E-mail: hmiura@rift.mech.tohoku.ac.jp

    2015-04-15

    When a radial strain is applied to a carbon nanotube (CNT), the increase in local curvature induces orbital hybridization. The effect of the curvature-induced orbital hybridization on the electronic properties of CNTs, however, has not been evaluated quantitatively. In this study, the strength of orbital hybridization in CNTs under homogeneous radial strain was evaluated quantitatively. Our analyses revealed the detailed procedure of the change in electronic structure of CNTs. In addition, the dihedral angle, the angle between π-orbital axis vectors of adjacent atoms, was found to effectively predict the strength of local orbital hybridization in deformed CNTs.

  14. Quantitative Information Differences Between Object-Person Presentation Methods

    ERIC Educational Resources Information Center

    Boyd, J. Edwin; Perry, Raymond P.

    1972-01-01

    Subjects used significantly more adjectives on an adjective checklist (ACL) when giving their impressions of an object-person based on written and audiovisual presentations than on audio presentations. (SD)

  15. Methodology for Evaluating Quality and Reusability of Learning Objects

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Bireniene, Virginija; Serikoviene, Silvija

    2011-01-01

    The aim of the paper is to present the scientific model and several methods for the expert evaluation of the quality of learning objects (LOs), paying special attention to the LOs' reusability level. The activities of the eQNet Quality Network for a European Learning Resource Exchange (LRE) aimed to improve the reusability of LOs in European Schoolnet's LRE…

  16. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.
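
    One of the criteria listed above, the detection rate, is typically computed by matching localizations to ground-truth emitter positions within a tolerance radius; the sketch below uses a simple greedy nearest-neighbour matching with an invented tolerance, a simplification of the matching used in published benchmarks.

        import numpy as np
        from scipy.spatial import cKDTree

        def detection_rate(ground_truth, detections, tolerance=50.0):
            """Fraction of ground-truth emitters matched one-to-one by a detection
            within `tolerance` (same units as the coordinates, e.g. nm)."""
            tree = cKDTree(detections)
            used, matched = set(), 0
            for gt in ground_truth:
                dist, idx = tree.query(gt)
                if dist <= tolerance and idx not in used:
                    used.add(idx)
                    matched += 1
            return matched / len(ground_truth)

        truth = np.array([[100.0, 100.0], [300.0, 250.0], [700.0, 820.0]])
        found = np.array([[110.0, 95.0], [705.0, 830.0], [900.0, 900.0]])
        print(detection_rate(truth, found))  # -> 0.667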

  17. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  18. Objectivity and reliability in qualitative analysis: realist, contextualist and radical constructionist epistemologies.

    PubMed

    Madill, A; Jordan, A; Shirley, C

    2000-02-01

    The effect of the individual analyst on research findings can create a credibility problem for qualitative approaches from the perspective of evaluative criteria utilized in quantitative psychology. This paper explicates the ways in which objectivity and reliability are understood in qualitative analysis conducted from within three distinct epistemological frameworks: realism, contextual constructionism, and radical constructionism. It is argued that quality criteria utilized in quantitative psychology are appropriate to the evaluation of qualitative analysis only to the extent that it is conducted within a naive or scientific realist framework. The discussion is illustrated with reference to the comparison of two independent grounded theory analyses of identical material. An implication of this illustration is to identify the potential to develop a radical constructionist strand of grounded theory.

  19. Analytical insight into "breathing" crack-induced acoustic nonlinearity with an application to quantitative evaluation of contact cracks.

    PubMed

    Wang, Kai; Liu, Menglong; Su, Zhongqing; Yuan, Shenfang; Fan, Zheng

    2018-08-01

    To characterize fatigue cracks, in the undersized stage in particular, preferably in a quantitative and precise manner, a two-dimensional (2D) analytical model is developed for interpreting the modulation mechanism of a "breathing" crack on guided ultrasonic waves (GUWs). In conjunction with a modal decomposition method and a variational principle-based algorithm, the model is capable of analytically depicting the propagating and evanescent waves induced owing to the interaction of probing GUWs with a "breathing" crack, and further extracting linear and nonlinear wave features (e.g., reflection, transmission, mode conversion and contact acoustic nonlinearity (CAN)). With the model, a quantitative correlation between CAN embodied in acquired GUWs and crack parameters (e.g., location and severity) is obtained, whereby a set of damage indices is proposed via which the severity of the crack can be evaluated quantitatively. The evaluation, in principle, does not entail a benchmarking process against baseline signals. As validation, the results obtained from the analytical model are compared with those from finite element simulation, showing good consistency. This has demonstrated accuracy of the developed analytical model in interpreting contact crack-induced CAN, and spotlighted its application to quantitative evaluation of fatigue damage. Copyright © 2018 Elsevier B.V. All rights reserved.
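
    The model above derives contact acoustic nonlinearity (CAN) analytically via modal decomposition; as a loose, generic illustration of how CAN is often quantified from an acquired waveform (a relative second-harmonic index, not the damage indices proposed in the paper), one can compare the second-harmonic and fundamental amplitudes of the received signal.

        import numpy as np

        def relative_can_index(signal, fs, f0):
            """Generic relative nonlinearity index: second-harmonic amplitude divided by
            the squared fundamental amplitude of the received signal spectrum."""
            spectrum = np.abs(np.fft.rfft(signal))
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            a1 = spectrum[np.argmin(np.abs(freqs - f0))]
            a2 = spectrum[np.argmin(np.abs(freqs - 2.0 * f0))]
            return a2 / a1**2

        # Synthetic received guided wave: 200 kHz fundamental plus a weak second harmonic.
        fs, f0 = 10e6, 200e3
        t = np.arange(0, 1e-3, 1.0 / fs)
        sig = np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 2 * f0 * t)
        print(f"relative CAN index = {relative_can_index(sig, fs, f0):.3e}")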

  20. Integrating quantitative and qualitative evaluation methods to compare two teacher inservice training programs

    NASA Astrophysics Data System (ADS)

    Lawrenz, Frances; McCreath, Heather

    Qualitative and quantitative evaluation procedures were used to compare two physical-science teacher inservice training programs. The two programs followed the master teacher training model espoused by NSF but used different types of master teachers and types of activities. The two evaluation procedures produced different results and together they provided a much clearer picture of the strengths and weaknesses of the two programs. Using only one approach or the other would have substantially altered the conclusions.

  1. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current research topic worldwide. In this study, biofilm formation by P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated between 2011 and 2012 from urine samples of patients of the clinics of the Dr. Antoni Jurasz University Hospital No. 1 in Bydgoszcz were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains did not show this ability. P. mirabilis rods were shown to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  2. Program Fair Evaluation--Summative Appraisal of Instructional Sequences with Dissimilar Objectives.

    ERIC Educational Resources Information Center

    Popham, W. James

    A comparative evaluation involving two instructional programs is given, although the approach can easily serve to compare more than two programs. The steps involved in conducting a program fair evaluation of two instructional programs are: (1) Identify objectives (a) common to both programs, (b) unique to one program, and (c) unique to the other…

  3. Quantitative Evaluation Method of Each Generation Margin for Power System Planning

    NASA Astrophysics Data System (ADS)

    Su, Su; Tanaka, Kazuyuki

    As power system deregulation advances, competition among power companies intensifies, and they seek more efficient system planning using existing facilities. An efficient system planning method is therefore needed. This paper proposes a quantitative evaluation method for the (N-1) generation margin considering overload and voltage stability restrictions. For the generation margin related to overload, a fast solution method that avoids recalculating the (N-1) Y-matrix is proposed. For voltage stability, an efficient method to search for the stability limit is proposed. The IEEE30 model system, which is composed of 6 generators and 14 load nodes, is employed to validate the proposed method. According to the results, the proposed method can reduce the computational cost of the generation margin related to overload under the (N-1) condition and can specify the value quantitatively.

  4. Cost analysis of objective resident cataract surgery assessments.

    PubMed

    Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M

    2015-05-01

    To compare 8 ophthalmology resident surgical training tools and determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs of running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of the tools, the cost of time spent in evaluations, and supply and maintenance costs. For wet laboratory simulation, the Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation of operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and the Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analysis of ophthalmology resident surgical training tools is needed so residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  5. Objective Structured Professional Assessments for Trainee Educational Psychologists: An Evaluation

    ERIC Educational Resources Information Center

    Dunsmuir, Sandra; Atkinson, Cathy; Lang, Jane; Warhurst, Amy; Wright, Sarah

    2017-01-01

    Objective Structured Professional Assessments (OSPAs) were developed and evaluated at three universities in the United Kingdom, to supplement supervisor assessments of trainee educational psychologists' placement practice. Participating second year students on three educational psychology doctoral programmes (n = 31) and tutors (n = 12) were…

  6. [Evaluation on methodological problems in reports concerning quantitative analysis of syndrome differentiation of diabetes mellitus].

    PubMed

    Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu

    2006-01-01

    To evaluate the quality of reports published in China in the past 10 years on quantitative analysis of syndrome differentiation for diabetes mellitus (DM), in order to explore the methodological problems in these reports and find possible solutions. The main medical literature databases in China were searched. Thirty-one articles were included and evaluated against the principles of clinical epidemiology. There were many mistakes and deficiencies in these articles, concerning clinical trial design, diagnostic criteria for DM, standards for syndrome differentiation of DM, case inclusion and exclusion criteria, sample size and its estimation, data comparability and statistical methods. It is necessary and important to improve the quality of reports on quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.

  7. Direct evaluation of fault trees using object-oriented programming techniques

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1989-01-01

    Object-oriented programming techniques are used in an algorithm for the direct evaluation of fault trees. The algorithm combines a simple bottom-up procedure for trees without repeated events with a top-down recursive procedure for trees with repeated events. The object-oriented approach results in a dynamic modularization of the tree at each step in the reduction process. The algorithm reduces the number of recursive calls required to solve trees with repeated events and calculates intermediate results as well as the solution of the top event. The intermediate results can be reused if part of the tree is modified. An example is presented in which the results of the algorithm implemented with conventional techniques are compared to those of the object-oriented approach.
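
    A minimal object-oriented sketch of direct bottom-up fault tree evaluation for trees without repeated events (the top-down recursive handling of repeated events described above is omitted); class names and probabilities are illustrative.

        class BasicEvent:
            def __init__(self, name, probability):
                self.name, self.probability = name, probability

            def evaluate(self):
                return self.probability

        class AndGate:
            def __init__(self, name, children):
                self.name, self.children = name, children

            def evaluate(self):
                p = 1.0
                for child in self.children:       # all inputs must occur
                    p *= child.evaluate()
                return p

        class OrGate:
            def __init__(self, name, children):
                self.name, self.children = name, children

            def evaluate(self):
                p_none = 1.0
                for child in self.children:       # at least one input occurs
                    p_none *= 1.0 - child.evaluate()
                return 1.0 - p_none

        # Example tree: TOP = (A AND B) OR C, with independent, non-repeated basic events.
        a, b, c = BasicEvent("A", 0.01), BasicEvent("B", 0.02), BasicEvent("C", 0.005)
        top = OrGate("TOP", [AndGate("G1", [a, b]), c])
        print(f"P(top event) = {top.evaluate():.6f}")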

  8. Method and Apparatus for Evaluating Multilayer Objects for Imperfections

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Inventor); Abedin, Nurul (Inventor); Sun, Kuen J. (Inventor)

    1999-01-01

    A multilayer object having multiple layers arranged in a stacking direction is evaluated for imperfections such as voids, delaminations and microcracks. First, an acoustic wave is transmitted into the object in the stacking direction via an appropriate transducer/waveguide combination. The wave propagates through the multilayer object and is received by another transducer/waveguide combination preferably located on the same surface as the transmitting combination. The received acoustic wave is correlated with the presence or absence of imperfections by, e.g., generating pulse echo signals indicative of the received acoustic wave, wherein the successive signals form distinct groups over time. The respective peak amplitudes of each group are sampled and curve fit to an exponential curve, wherein a substantial fit of approximately 80-90% indicates an absence of imperfections and a significant deviation indicates the presence of imperfections. Alternatively, the time interval between distinct groups can be measured, wherein equal intervals indicate the absence of imperfections and unequal intervals indicate the presence of imperfections.

  9. Method and apparatus for evaluating multilayer objects for imperfections

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Inventor); Abedin, Nurul (Inventor); Sun, Kuen J. (Inventor)

    1997-01-01

    A multilayer object having multiple layers arranged in a stacking direction is evaluated for imperfections such as voids, delaminations and microcracks. First, an acoustic wave is transmitted into the object in the stacking direction via an appropriate transducer/waveguide combination. The wave propagates through the multilayer object and is received by another transducer/waveguide combination preferably located on the same surface as the transmitting combination. The received acoustic wave is correlated with the presence or absence of imperfections by, e.g., generating pulse echo signals indicative of the received acoustic wave, wherein the successive signals form distinct groups over time. The respective peak amplitudes of each group are sampled and curve fit to an exponential curve, wherein a substantial fit of approximately 80-90% indicates an absence of imperfections and a significant deviation indicates the presence of imperfections. Alternatively, the time interval between distinct groups can be measured, wherein equal intervals indicate the absence of imperfections and unequal intervals indicate the presence of imperfections.
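
    A sketch of the amplitude-decay check described in these two records: fit the successive echo-group peak amplitudes to an exponential and use the goodness of fit as the pass/fail indication. The peak values and the 85% threshold below are illustrative, with the records citing roughly 80-90% as a substantial fit.

        import numpy as np
        from scipy.optimize import curve_fit

        def exp_decay(n, a0, alpha):
            return a0 * np.exp(-alpha * n)

        # Hypothetical peak amplitudes of successive pulse-echo groups.
        echo_index = np.arange(6)
        peaks = np.array([1.00, 0.63, 0.41, 0.26, 0.17, 0.10])

        params, _ = curve_fit(exp_decay, echo_index, peaks, p0=(1.0, 0.5))
        residuals = peaks - exp_decay(echo_index, *params)
        r_squared = 1.0 - np.sum(residuals**2) / np.sum((peaks - peaks.mean())**2)

        verdict = "no imperfection indicated" if r_squared >= 0.85 else "possible imperfection"
        print(f"R^2 = {r_squared:.3f} -> {verdict}")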

  10. Defining an Analytic Framework to Evaluate Quantitative MRI Markers of Traumatic Axonal Injury: Preliminary Results in a Mouse Closed Head Injury Model

    PubMed Central

    Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.

    2017-01-01

    Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in a mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. Additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains almost with the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures to detect brain injury. PMID:28966972

  11. Simulation-based evaluation of the resolution and quantitative accuracy of temperature-modulated fluorescence tomography.

    PubMed

    Lin, Yuting; Nouizi, Farouk; Kwong, Tiffany C; Gulsen, Gultekin

    2015-09-01

    Conventional fluorescence tomography (FT) can recover the distribution of fluorescent agents within a highly scattering medium. However, poor spatial resolution remains its foremost limitation. Previously, we introduced a new fluorescence imaging technique termed "temperature-modulated fluorescence tomography" (TM-FT), which provides high-resolution images of fluorophore distribution. TM-FT is a multimodality technique that combines fluorescence imaging with focused ultrasound to locate thermo-sensitive fluorescence probes using a priori spatial information to drastically improve the resolution of conventional FT. In this paper, we present an extensive simulation study to evaluate the performance of the TM-FT technique on complex phantoms with multiple fluorescent targets of various sizes located at different depths. In addition, the performance of the TM-FT is tested in the presence of background fluorescence. The results obtained using our new method are systematically compared with those obtained with the conventional FT. Overall, TM-FT provides higher resolution and superior quantitative accuracy, making it an ideal candidate for in vivo preclinical and clinical imaging. For example, a 4 mm diameter inclusion positioned in the middle of a synthetic slab geometry phantom (D:40  mm×W:100  mm) is recovered as an elongated object in the conventional FT (x=4.5  mm; y=10.4  mm), while TM-FT recovers it successfully in both directions (x=3.8  mm; y=4.6  mm). As a result, the quantitative accuracy of the TM-FT is superior because it recovers the concentration of the agent with a 22% error, which is in contrast with the 83% error of the conventional FT.

  12. Simulation-based evaluation of the resolution and quantitative accuracy of temperature-modulated fluorescence tomography

    PubMed Central

    Lin, Yuting; Nouizi, Farouk; Kwong, Tiffany C.; Gulsen, Gultekin

    2016-01-01

    Conventional fluorescence tomography (FT) can recover the distribution of fluorescent agents within a highly scattering medium. However, poor spatial resolution remains its foremost limitation. Previously, we introduced a new fluorescence imaging technique termed “temperature-modulated fluorescence tomography” (TM-FT), which provides high-resolution images of fluorophore distribution. TM-FT is a multimodality technique that combines fluorescence imaging with focused ultrasound to locate thermo-sensitive fluorescence probes using a priori spatial information to drastically improve the resolution of conventional FT. In this paper, we present an extensive simulation study to evaluate the performance of the TM-FT technique on complex phantoms with multiple fluorescent targets of various sizes located at different depths. In addition, the performance of the TM-FT is tested in the presence of background fluorescence. The results obtained using our new method are systematically compared with those obtained with the conventional FT. Overall, TM-FT provides higher resolution and superior quantitative accuracy, making it an ideal candidate for in vivo preclinical and clinical imaging. For example, a 4 mm diameter inclusion positioned in the middle of a synthetic slab geometry phantom (D:40 mm × W :100 mm) is recovered as an elongated object in the conventional FT (x = 4.5 mm; y = 10.4 mm), while TM-FT recovers it successfully in both directions (x = 3.8 mm; y = 4.6 mm). As a result, the quantitative accuracy of the TM-FT is superior because it recovers the concentration of the agent with a 22% error, which is in contrast with the 83% error of the conventional FT. PMID:26368884

  13. Fuzzy Logic Approaches to Multi-Objective Decision-Making in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1994-01-01

    Fuzzy logic allows for the quantitative representation of multi-objective decision-making problems which have vague or fuzzy objectives and parameters. As such, fuzzy logic approaches are well-suited to situations where alternatives must be assessed by using criteria that are subjective and of unequal importance. This paper presents an overview of fuzzy logic and provides sample applications from the aerospace industry. Applications include an evaluation of vendor proposals, an analysis of future space vehicle options, and the selection of a future space propulsion system. On the basis of the results provided in this study, fuzzy logic provides a unique perspective on the decision-making process, allowing the evaluator to assess the degree to which each option meets the evaluation criteria. Future decision-making should take full advantage of fuzzy logic methods to complement existing approaches in the selection of alternatives.
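
    A toy illustration of the kind of weighted scoring the record above describes: each alternative receives a membership degree (0-1) for each criterion, and criteria of unequal importance are aggregated by a weighted sum. The criteria, weights and scores are invented, and full fuzzy-logic formulations are richer (membership functions, fuzzy rules, defuzzification).

        # Membership degrees (0-1) expressing how well each option satisfies each
        # criterion; all names and numbers are invented for illustration.
        options = {
            "option_A": {"cost": 0.8, "reliability": 0.6, "performance": 0.4},
            "option_B": {"cost": 0.5, "reliability": 0.9, "performance": 0.7},
        }
        weights = {"cost": 0.5, "reliability": 0.3, "performance": 0.2}  # unequal importance

        def weighted_score(memberships, weights):
            """Weighted aggregation of the criterion membership degrees."""
            return sum(weights[c] * memberships[c] for c in weights)

        for name, m in options.items():
            print(name, round(weighted_score(m, weights), 3))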

  14. Evaluations of UltraiQ software for objective ultrasound image quality assessment using images from a commercial scanner.

    PubMed

    Long, Zaiyang; Tradup, Donald J; Stekel, Scott F; Gorny, Krzysztof R; Hangiandreou, Nicholas J

    2018-03-01

    We evaluated a commercially available software package that uses B-mode images to semi-automatically measure quantitative metrics of ultrasound image quality, such as contrast response, depth of penetration (DOP), and spatial resolution (lateral, axial, and elevational). Since measurement of elevational resolution is not part of the software package, we achieved it by acquiring phantom images with the transducers tilted at 45 degrees relative to the phantom. Each measurement was assessed in terms of measurement stability, sensitivity, repeatability, and semi-automated measurement success rate. All assessments were performed on a GE Logiq E9 ultrasound system with linear (9L or 11L), curved (C1-5), and sector (S1-5) transducers, using a CIRS model 040GSE phantom. In stability tests, the measurements of contrast, DOP, and spatial resolution remained within a ±10% variation threshold in 90%, 100%, and 69% of cases, respectively. In sensitivity tests, the contrast, DOP, and spatial resolution measurements followed the expected behavior in 100%, 100%, and 72% of cases, respectively. In repeatability testing, the intra- and inter-individual coefficients of variation were equal to or less than 3.2%, 1.3%, and 4.4% for contrast, DOP, and spatial resolution (lateral and axial), respectively. The coefficients of variation for the elevational resolution test were all within 9.5%. Overall, in our assessment, the evaluated package performed well for objective and quantitative assessment of the above-mentioned image quality metrics under well-controlled acquisition conditions. We find it useful for various clinical ultrasound applications, including performance comparison between scanners from different vendors. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  15. Comparative analysis of quantitative efficiency evaluation methods for transportation networks.

    PubMed

    He, Yuxin; Qin, Jin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency can offer guidance for the optimal control of urban traffic. Based on an introduction and mathematical analysis of three quantitative evaluation methods for transportation network efficiency, this paper compares the information they measure, including network structure, traffic demand, travel choice behavior and other factors that affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different transportation network examples shows that the Q-H method reflects well the influence of network structure, traffic demand and user route choice behavior on network efficiency. In addition, the network efficiency measured by this method and Braess's paradox can be explained in terms of each other, which indicates a better evaluation of the real operating condition of the transportation network. From the analysis of the network efficiency calculated by the Q-H method, it can also be concluded that a specific appropriate demand exists for a given transportation network. Meanwhile, under fixed demand, both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure that yields the largest value of the network efficiency can be identified.

  16. Quantitative evaluation of the requirements for the promotion as associate professor at German medical faculties.

    PubMed

    Sorg, Heiko; Knobloch, Karsten

    2012-01-01

    This is the first quantitative evaluation of the requirements for promotion to associate professor (AP) at German medical faculties. We analyzed the AP regulations of German medical faculties according to a validated scoring system adapted to this study. The overall score for the AP requirements at 35 German medical faculties was 13.5±0.6 of 20 possible points (95% confidence interval 12.2-14.7). More than 88% of the AP regulations demand sufficient performance in teaching and research with adequate scientific publication. Furthermore, 83% of the faculties expect an expert review of the candidate's performance. Conference presentations given as an assistant professor, as well as a reduction of the minimum time spent as an assistant professor, play only minor roles. The requirements for assistant professors to be nominated as associate professor at German medical faculties are high, with only a small range. In detail, however, large heterogeneity still exists, which hinders equal opportunities and career possibilities. These data might be used for an ongoing objective discussion.

  17. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis in a systematic manner. It provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and to easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparseness of recorded clinical observations, the high dimensionality of movement and the high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ wi φi(xi), and estimating each attribute normalization function φi(·) by integrating distributions of idealized movement and deviated movement. The weights wi are derived from a therapist's pairwise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement in stroke survivors with excellent results: the evaluation results are highly correlated with the therapist's observations.
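
    A compact sketch of the composite evaluation y = Σ wi φi(xi) described above; the kinematic attributes, their normalization functions and the weights below are placeholders (in the paper the φi are estimated from distributions of idealized and deviated movement, and the wi come from a modified RankSVM on therapist pairwise comparisons).

        import math

        # Placeholder normalization functions phi_i mapping a raw attribute to [0, 1].
        def phi_smoothness(jerk):        return math.exp(-jerk / 50.0)
        def phi_trajectory_error(err):   return max(0.0, 1.0 - err / 0.20)     # error in metres
        def phi_speed(ratio_to_target):  return max(0.0, 1.0 - abs(1.0 - ratio_to_target))

        weights = {"smoothness": 0.4, "trajectory_error": 0.4, "speed": 0.2}   # assumed, not RankSVM-derived

        def movement_score(jerk, traj_err, speed_ratio):
            """Composite score y = sum_i w_i * phi_i(x_i); higher means better movement quality."""
            return (weights["smoothness"]         * phi_smoothness(jerk)
                    + weights["trajectory_error"] * phi_trajectory_error(traj_err)
                    + weights["speed"]            * phi_speed(speed_ratio))

        print(round(movement_score(jerk=30.0, traj_err=0.05, speed_ratio=0.9), 3))  # -> 0.7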

  18. Quantitative oxygen concentration imaging in toluene atmospheres using Dual Imaging with Modeling Evaluation

    NASA Astrophysics Data System (ADS)

    Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim

    2013-01-01

    Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.

  19. Quantitative oxygen concentration imaging in toluene atmospheres using Dual Imaging with Modeling Evaluation

    NASA Astrophysics Data System (ADS)

    Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim

    2012-12-01

    Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.
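
    The Stern-Volmer relation mentioned in both records above, τ0/τ = 1 + K_SV·[O2], lets the oxygen concentration be read directly from a measured fluorescence lifetime; the numbers below are illustrative only and are not the updated slope value reported in the papers.

        def oxygen_concentration(tau, tau0, k_sv):
            """Invert the Stern-Volmer relation tau0/tau = 1 + K_SV * [O2]."""
            return (tau0 / tau - 1.0) / k_sv

        # Illustrative values: unquenched lifetime tau0, measured lifetime tau, and a
        # Stern-Volmer constant k_sv expressed per unit oxygen mole fraction.
        tau0, tau, k_sv = 68e-9, 10e-9, 240.0
        print(f"[O2] ~ {oxygen_concentration(tau, tau0, k_sv):.4f} (mole fraction)")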

  20. POPA: A Personality and Object Profiling Assistant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreicer, J.S.

    POPA: A Personality and Object Profiling Assistant is a system that utilizes an extension and variation of a process developed for decision analysis as a tool to quantify intuitive feelings and subjective judgments. The technique is based on a manipulation of the Analytical Hierarchy Process. The POPA system models an individual in terms of his character type, life orientation, and incentive (motivational) factors. Then an object (i.e., an individual, project, situation, or policy) is modeled with respect to its three most important factors. The individual and object models are combined to indicate the influence each of the three object factors has on the individual. We have investigated this problem: 1) to develop a technique that models personality types in a quantitative and organized manner, 2) to develop a tool capable of evaluating the probable success of obtaining funding for proposed programs at Los Alamos National Laboratory, 3) to determine the feasibility of quantifying feelings and intuition, and 4) to better understand subjective knowledge acquisition (especially intuition). 49 refs., 10 figs., 5 tabs.
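
    The Analytic Hierarchy Process manipulation mentioned above rests on deriving priority weights from a reciprocal pairwise-comparison matrix, usually via its principal eigenvector; the sketch below is the generic textbook procedure with invented comparison values, not the POPA system's specific variation.

        import numpy as np

        # Pairwise comparisons on Saaty's 1-9 scale for three hypothetical factors:
        # entry [i, j] says how much more important factor i is than factor j.
        A = np.array([[1.0,   3.0,   5.0],
                      [1/3.0, 1.0,   2.0],
                      [1/5.0, 1/2.0, 1.0]])

        eigenvalues, eigenvectors = np.linalg.eig(A)
        k = np.argmax(eigenvalues.real)
        weights = np.abs(eigenvectors[:, k].real)
        weights /= weights.sum()                     # normalized priority weights

        consistency_index = (eigenvalues.real[k] - len(A)) / (len(A) - 1)
        print("weights:", np.round(weights, 3), " CI:", round(consistency_index, 4))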

  1. Saving Educational Dollars through Quality Objectives.

    ERIC Educational Resources Information Center

    Alvir, Howard P.

    This document is a collection of working papers written to meet the specific needs of teachers who are starting to think about and write performance objectives. It emphasizes qualitative objectives as opposed to quantitative classroom goals. The author describes quality objectives as marked by their clarity, accessibility, accountability, and…

  2. Polymer on Top: Current Limits and Future Perspectives of Quantitatively Evaluating Surface Grafting.

    PubMed

    Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher

    2018-03-07

    Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided to estimating maximum chain coverage and, importantly, to examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid reporting grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for determining grafting density, in particular single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
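
    For the dry-thickness route named above, the grafting density is commonly obtained from the relation below (stated here as the usual textbook expression, not quoted from the paper), with dry film thickness h, bulk polymer density ρ, Avogadro's number N_A and number-average molar mass M_n:

        \sigma = \frac{h\,\rho\,N_{\mathrm{A}}}{M_{n}}

    For example, assuming h = 10 nm, ρ = 1.1 g cm⁻³ and M_n = 5×10⁴ g mol⁻¹ gives σ ≈ 0.13 chains nm⁻².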

  3. Performance Evaluation and Quantitative Accuracy of Multipinhole NanoSPECT/CT Scanner for Theranostic Lu-177 Imaging

    NASA Astrophysics Data System (ADS)

    Gupta, Arun; Kim, Kyeong Yun; Hwang, Donghwi; Lee, Min Sun; Lee, Dong Soo; Lee, Jae Sung

    2018-06-01

    SPECT plays an important role in peptide-receptor-targeted radionuclide therapy using theranostic radionuclides such as Lu-177 for the treatment of various cancers. However, SPECT studies must be quantitatively accurate, because reliable assessment of tumor uptake and tumor-to-normal-tissue ratios can only be performed using quantitatively accurate images. Hence, it is important to evaluate the performance parameters and quantitative accuracy of preclinical SPECT systems for therapeutic radioisotopes before conducting pre- and post-therapy SPECT imaging or dosimetry studies. In this study, we evaluated the system performance and quantitative accuracy of the NanoSPECT/CT scanner for Lu-177 imaging using point source and uniform phantom studies. We measured the recovery coefficient, uniformity, spatial resolution, system sensitivity and calibration factor for the mouse whole-body standard aperture. We also performed the experiments with Tc-99m to compare the results with those of Lu-177. We found a recovery coefficient of more than 70% for Lu-177 at the optimum noise level when nine iterations were used. The spatial resolution of Lu-177, with and without a uniform background added, was comparable to that of Tc-99m in the axial, radial and tangential directions. The system sensitivity measured for Lu-177 was almost three times less than that of Tc-99m.

  4. Quantitative Acoustic Model for Adhesion Evaluation of Pmma/silicon Film Structures

    NASA Astrophysics Data System (ADS)

    Ju, H. S.; Tittmann, B. R.

    2010-02-01

    A poly(methyl methacrylate) (PMMA) film on a silicon substrate is a main structure for photolithography in semiconductor manufacturing processes. This paper presents the potential of scanning acoustic microscopy (SAM) for nondestructive evaluation of the PMMA/Si film structure, whose adhesion failure is commonly encountered during the fabrication and post-fabrication processes. A physical model employing a partial discontinuity in displacement is developed for a rigorously quantitative evaluation of the interfacial weakness. The model is implemented in the matrix method for surface acoustic wave (SAW) propagation in anisotropic media. Our results show that the predicted variations in SAW velocity and reflectance are sensitive to the adhesion condition. Experimental results from the v(z) technique and SAW velocity reconstruction verify the prediction.

  5. Objective Fidelity Evaluation in Multisensory Virtual Environments: Auditory Cue Fidelity in Flight Simulation

    PubMed Central

    Meyer, Georg F.; Wong, Li Ting; Timson, Emma; Perfect, Philip; White, Mark D.

    2012-01-01

    We argue that objective fidelity evaluation of virtual environments, such as flight simulation, should be human-performance-centred and task-specific rather than measure the match between simulation and physical reality. We show how principled experimental paradigms and behavioural models to quantify human performance in simulated environments, which have emerged from research in multisensory perception, provide a framework for the objective evaluation of the contribution of individual cues to human performance measures of fidelity. We present three examples in a flight simulation environment as a case study: Experiment 1: Detection and categorisation of auditory and kinematic motion cues; Experiment 2: Performance evaluation in a target-tracking task; Experiment 3: Transferrable learning of auditory motion cues. We show how the contribution of individual cues to human performance can be robustly evaluated for each task and that the contribution is highly task dependent. The same auditory cues that can be discriminated and are optimally integrated in Experiment 1 do not contribute to target-tracking performance in an in-flight refuelling simulation without training (Experiment 2). In Experiment 3, however, we demonstrate that the auditory cue leads to significant, transferrable performance improvements with training. We conclude that objective fidelity evaluation requires a task-specific analysis of the contribution of individual cues. PMID:22957068
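
    The abstract above notes that the auditory and kinematic cues in Experiment 1 are "optimally integrated". In the multisensory-perception literature this is usually modelled as reliability-weighted (maximum-likelihood) cue combination; the Python sketch below illustrates that general model with made-up numbers and is not code from the study.

        # Reliability-weighted (maximum-likelihood) cue combination: each cue's
        # estimate is weighted by the inverse of its variance, and the combined
        # estimate has lower variance than either cue alone.
        def integrate_cues(estimates, variances):
            weights = [1.0 / v for v in variances]
            total = sum(weights)
            combined_estimate = sum(w * e for w, e in zip(weights, estimates)) / total
            combined_variance = 1.0 / total
            return combined_estimate, combined_variance

        # Hypothetical auditory and kinematic estimates of the same motion parameter
        est, var = integrate_cues(estimates=[10.0, 12.0], variances=[4.0, 1.0])
        print(f"combined estimate = {est:.2f}, combined variance = {var:.2f}")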

  6. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  7. Training Objectives, Transfer, Validation and Evaluation: A Sri Lankan Study

    ERIC Educational Resources Information Center

    Wickramasinghe, Vathsala M.

    2006-01-01

    Using a stratified random sample, this paper examines the training practices of setting objectives, transfer, validation and evaluation in Sri Lanka. The paper further sets out to compare those practices across local, foreign and joint-venture companies based on the assumption that there may be significant differences across companies of different…

  8. The Quantitative Evaluation of the Clinical and Translational Science Awards (CTSA) Program Based on Science Mapping and Scientometric Analysis

    PubMed Central

    Zhang, Yin; Wang, Lei

    2013-01-01

    Abstract The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has significant referential meaning for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for clinical and translational science and attract additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. PMID:24330689

  9. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has significant referential meaning for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for clinical and translational science and attract additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. © 2013 Wiley Periodicals, Inc.

  10. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their ability to assign quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software package to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on that simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software packages were under
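
    The benchmark construction described above is simple enough to sketch in a few lines of Python. The snippet below is illustrative only (assumed details: uniform fragment lengths of roughly 150 bp and toy sequences in place of full bacterial genomes); it shows why raw read counts over-represent longer genomes while the ideal taxon-counting answer remains equal proportions.

        import random

        random.seed(0)

        def fragment(genome: str, min_len: int = 100, max_len: int = 200):
            """Break one genome copy into consecutive fragments of random length
            (uniform 100-200 bp, i.e. about 150 bp on average)."""
            reads, pos = [], 0
            while pos < len(genome):
                length = random.randint(min_len, max_len)
                reads.append(genome[pos:pos + length])
                pos += length
            return reads

        # Hypothetical genomes of different lengths (the real benchmark used full bacterial genomes)
        genomes = {
            "taxonA": "ACGT" * 500,     # 2.0 kb
            "taxonB": "ACGT" * 1000,    # 4.0 kb
            "taxonC": "ACGT" * 2000,    # 8.0 kb
        }
        copies = 10
        mixed = []
        for taxon, seq in genomes.items():
            for _ in range(copies):
                mixed.extend((taxon, read) for read in fragment(seq))

        # Longer genomes yield more reads, but the ideal quantitative answer is equal
        # proportions of the three taxa (same number of genome copies in the mix).
        counts = {t: sum(1 for taxon, _ in mixed if taxon == t) for t in genomes}
        print("reads per taxon:", counts)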

  11. Quantitative evaluation of optically induced disorientation.

    DOT National Transportation Integrated Search

    1970-01-01

    The purpose of this study was to establish quantitatively and systematically the association between the speed of movement of an optical environment and the extent of disorientation experienced by an individual viewing this environment. The degree of...

  12. Quantitative T2 Magnetic Resonance Imaging Compared to Morphological Grading of the Early Cervical Intervertebral Disc Degeneration: An Evaluation Approach in Asymptomatic Young Adults

    PubMed Central

    Han, Zhihua; Shao, Lixin; Xie, Yan; Wu, Jianhong; Zhang, Yan; Xin, Hongkui; Ren, Aijun; Guo, Yong; Wang, Deli; He, Qing; Ruan, Dike

    2014-01-01

    Objective The objective of this study was to evaluate the efficacy of quantitative T2 magnetic resonance imaging (MRI) for quantifying early cervical intervertebral disc (IVD) degeneration in asymptomatic young adults by correlating the T2 value with Pfirrmann grade, sex, and anatomic level. Methods Seventy asymptomatic young subjects (34 men and 36 women; mean age, 22.80±2.11 yr; range, 18–25 years) underwent 3.0-T MRI to obtain morphological data (one T1-fast spin echo (FSE) and three-plane T2-FSE, used to assign a Pfirrmann grade (I–V)) and for T2 mapping (multi-echo spin echo). T2 values in the nucleus pulposus (NP, n = 350) and anulus fibrosus (AF, n = 700) were obtained. Differences in T2 values between sexes and anatomic level were evaluated, and linear correlation analysis of T2 values versus degenerative grade was conducted. Findings Cervical IVDs of healthy young adults were commonly determined to be at Pfirrmann grades I and II. T2 values of NPs were significantly higher than those of AF at all anatomic levels (P<0.000). The NP, anterior AF and posterior AF values did not differ significantly between genders at the same anatomic level (P>0.05). T2 values decreased linearly with degenerative grade. Linear correlation analysis revealed a strong negative association between the Pfirrmann grade and the T2 values of the NP (P = 0.000) but not the T2 values of the AF (P = 0.854). However, non-degenerated discs (Pfirrmann grades I and II) showed a wide range of T2 relaxation time. T2 values according to disc degeneration level classification were as follows: grade I (>62.03 ms), grade II (54.60–62.03 ms), grade III (<54.60 ms). Conclusions T2 quantitation provides a more sensitive and robust approach for detecting and characterizing the early stage of cervical IVD degeneration and for establishing reliable quantitative reference values in healthy young adults. PMID:24498384
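
    For readers unfamiliar with T2 mapping, the quantitative values above come from fitting the multi-echo spin-echo signal decay. The sketch below assumes the standard mono-exponential model S(TE) = S0 * exp(-TE/T2) and uses synthetic data; it is not code from the study.

        import numpy as np

        def fit_t2(te_ms: np.ndarray, signal: np.ndarray) -> float:
            """Return T2 in ms from echo times (ms) and measured signal intensities,
            via a log-linear least-squares fit: ln S = ln S0 - TE/T2."""
            slope, _ = np.polyfit(te_ms, np.log(signal), 1)
            return -1.0 / slope

        # Synthetic voxel: true T2 = 60 ms (roughly the range reported for the nucleus pulposus)
        te = np.array([10., 20., 30., 40., 50., 60., 70., 80.])
        sig = 1000 * np.exp(-te / 60.0)
        print(f"fitted T2 = {fit_t2(te, sig):.1f} ms")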

  13. A TEM quantitative evaluation of strengthening in an Mg-RE alloy reinforced with SiC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabibbo, Marcello, E-mail: m.cabibbo@univpm.it; Spigarelli, Stefano

    2011-10-15

    Magnesium alloys containing rare earth elements are known to have high specific strength and good creep and corrosion resistance up to 523 K. The addition of SiC ceramic particles strengthens the metal matrix composite, resulting in better wear and creep resistance while maintaining good machinability. The role of the reinforcement particles in enhancing strength can be quantitatively evaluated using transmission electron microscopy (TEM). This paper presents a quantitative evaluation of the different strengthening contributions, determined through TEM inspections, in an SiC Mg-RE composite alloy containing yttrium, neodymium, gadolinium and dysprosium. Compression tests at temperatures ranging between 290 and 573 K were carried out. The microstructure strengthening mechanism was studied for all the compression conditions. Strengthening was compared to the mechanical results, and the way the different contributions were combined is also discussed and justified. Research Highlights: TEM evaluation of yield strengthening terms in an Mg-RE SiC alloy; the evaluation was extended to different compression temperature conditions; linear and quadratic summation were proposed and validated; Hall-Petch was found to be the most prominent strengthening contribution.
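
    The highlights mention combining the individual strengthening terms by a linear and a quadratic sum. The short sketch below illustrates the two summation rules with made-up contribution values; the actual contributions and magnitudes reported in the study are not reproduced here.

        import math

        # Hypothetical strengthening contributions in MPa (illustrative values only)
        contributions_mpa = {"Hall-Petch": 80.0, "Orowan": 35.0, "forest dislocations": 25.0}

        linear_sum = sum(contributions_mpa.values())                       # simple addition
        quadratic_sum = math.sqrt(sum(v ** 2 for v in contributions_mpa.values()))  # root-sum-square

        print(f"linear sum:    {linear_sum:.0f} MPa")
        print(f"quadratic sum: {quadratic_sum:.0f} MPa")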

  14. Subjective and objective scales to assess the development of children cerebral palsy.

    PubMed

    Pietrzak, S; Jóźwiak, M

    2001-01-01

    Many scoring systems have been constructed to assess the motor development of children with cerebral palsy and to evaluate the effectiveness of treatment. According to the purposes they fulfill, these instruments may be divided into three types: discriminative, evaluative and predictive. The design and measurement methodology are the criteria that determine whether a given scale is quantitative or qualitative in nature, and whether it should be considered objective or subjective. The article presents the "reaching, losing and regaining" scale (constructed by the authors to assess functional development and its changes over certain periods of time), the Munich Functional Development Diagnostics, and the Gross Motor Function Measure (GMFM). Special attention is given to the GMFM, its methods, evaluation of results, and application. A comparison of subjective and objective assessments of two children with cerebral palsy is included.

  15. Comparative analysis of quantitative efficiency evaluation methods for transportation networks

    PubMed Central

    He, Yuxin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency could offer guidance for the optimal control of urban traffic. Based on an introduction to, and the related mathematical analysis of, three quantitative evaluation methods for transportation network efficiency, this paper compares the information measured by them, including network structure, traffic demand, travel choice behavior and other factors that affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different transportation network examples shows that the Q-H method reflects well the influence of network structure, traffic demand and user route choice behavior on transportation network efficiency. In addition, the transportation network efficiency measured by this method and Braess's paradox can explain each other, which indicates a better evaluation of the real operating condition of a transportation network. The analysis of the network efficiency calculated by the Q-H method also shows that a specific appropriate demand level exists for a given transportation network. Meanwhile, under fixed demand, it is possible to identify both the critical network structure that guarantees the stability and basic operation of the network and a specific network structure that yields the largest value of transportation network efficiency. PMID:28399165

  16. Evaluating 'good governance': The development of a quantitative tool in the Greater Serengeti Ecosystem.

    PubMed

    Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea

    2016-10-01

    Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The research developed a quantitative method for developing effectiveness measures of PA governance, using a set of 65 statements related to governance principles developed from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective. Copyright © 2016 Elsevier Ltd. All rights reserved.
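
    The factor analysis above reports Cronbach's alpha as one of its reliability measures. The sketch below shows the standard alpha computation on simulated Likert-style item scores; the data and item grouping are invented for illustration and do not come from the Serengeti survey.

        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """Cronbach's alpha for a (respondents x items) score matrix."""
            k = items.shape[1]
            item_variances = items.var(axis=0, ddof=1)
            total_variance = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

        rng = np.random.default_rng(1)
        latent = rng.normal(size=(100, 1))                  # shared perception of one governance factor
        scores = latent + 0.5 * rng.normal(size=(100, 5))   # five Likert-style items loading on it
        print(f"alpha = {cronbach_alpha(scores):.2f}")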

  17. Report on objective ride quality evaluation

    NASA Technical Reports Server (NTRS)

    Wambold, J. C.; Park, W. H.

    1974-01-01

    The correlation of absorbed power, as an objective ride measure, with the subjective evaluation of the bus data was investigated. For some individual bus rides the correlations were poor, but when a sufficient number of rides was used to give a reasonable sample base, an excellent correlation was obtained. The following logarithmic function was derived: S = 1.7245 ln(39.6849 AP), where S is the subjective rating of the ride and AP is the absorbed power in watts. A six-degree-of-freedom method developed for aircraft data was completed. Preliminary correlation of absorbed power with ISO standards further supports the bus ride and absorbed power correlation, since the AP values obtained are of the same order of magnitude for both correlations. While it would then appear that one could simply use the ISO standards, there is no way to add the effect of three degrees of freedom. Absorbed power provides a method of adding the effects due to the three major directions plus pitch and roll.
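
    The derived relation can be applied directly; the snippet below simply evaluates S = 1.7245 ln(39.6849 AP) for a few illustrative absorbed-power levels (the example AP values are not from the report).

        import math

        def subjective_rating(absorbed_power_watts: float) -> float:
            """Subjective ride rating from absorbed power (W), per the derived relation."""
            return 1.7245 * math.log(39.6849 * absorbed_power_watts)

        for ap in (0.05, 0.1, 0.5, 1.0):   # example absorbed-power levels in watts
            print(f"AP = {ap:4.2f} W  ->  S = {subjective_rating(ap):.2f}")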

  18. In vivo quantitative evaluation of tooth color with hand-held colorimeter and custom template.

    PubMed

    Shimada, Kazuki; Kakehashi, Yoshiyuki; Matsumura, Hideo; Tanoue, Naomi

    2004-04-01

    This article presents a technique for quantitatively evaluating the color of teeth, as well as color change in restorations and tooth surfaces. Through use of a custom template made of a thermoplastic polymer and a dental colorimeter, tooth surface color can be recorded periodically at the same location intraorally.

  19. Metrology Standards for Quantitative Imaging Biomarkers

    PubMed Central

    Obuchowski, Nancy A.; Kessler, Larry G.; Raunig, David L.; Gatsonis, Constantine; Huang, Erich P.; Kondratovich, Marina; McShane, Lisa M.; Reeves, Anthony P.; Barboriak, Daniel P.; Guimaraes, Alexander R.; Wahl, Richard L.

    2015-01-01

    Although investigators in the imaging community have been active in developing and evaluating quantitative imaging biomarkers (QIBs), the development and implementation of QIBs have been hampered by the inconsistent or incorrect use of terminology or methods for technical performance and statistical concepts. Technical performance is an assessment of how a test performs in reference objects or subjects under controlled conditions. In this article, some of the relevant statistical concepts are reviewed, methods that can be used for evaluating and comparing QIBs are described, and some of the technical performance issues related to imaging biomarkers are discussed. More consistent and correct use of terminology and study design principles will improve clinical research, advance regulatory science, and foster better care for patients who undergo imaging studies. © RSNA, 2015 PMID:26267831

  20. Design and Implementation of Performance Metrics for Evaluation of Assessments Data

    ERIC Educational Resources Information Center

    Ahmed, Irfan; Bhatti, Arif

    2016-01-01

    Evocative evaluation of assessment data is essential to quantify the achievements at course and program levels. The objective of this paper is to design performance metrics and respective formulas to quantitatively evaluate the achievement of set objectives and expected outcomes at the course levels for program accreditation. Even though…

  1. Taking stock of four decades of quantitative research on stakeholder participation and evaluation use: a systematic map.

    PubMed

    Daigneault, Pierre-Marc

    2014-08-01

    Stakeholder participation and evaluation use have attracted a lot of attention from practitioners, theorists and researchers. A common hypothesis is that participation is positively associated with evaluation use. Whereas the number of empirical studies conducted on this topic is impressive, quantitative research has held a minority position within this scientific production. This study mobilizes systematic review methods to 'map' the empirical literature that has quantitatively studied participation and use. The goal is to take stock and assess the strength of evidence of this literature (but not to synthesize the findings) and, based on this assessment, to provide directions for future research. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Methods for quantitative and qualitative evaluation of vaginal microflora during menstruation.

    PubMed Central

    Onderdonk, A B; Zamarchi, G R; Walsh, J A; Mellor, R D; Muñoz, A; Kass, E H

    1986-01-01

    The quantitative and qualitative changes in the bacterial flora of the vagina during menstruation have received inadequate study. Similarly, the effect of vaginal tampons on the microbial flora as well as the relationship between the microbial flora of the vagina and that of the tampon has not been adequately evaluated. The purposes of the present study were (i) to develop quantitative methods for studying the vaginal flora and the flora of tampons obtained during menstruation and (ii) to determine whether there were differences between the microflora of the tampon and that of the vaginal vault. Tampon and swab samples were obtained at various times from eight young healthy volunteers for 8 to 10 menstrual cycles. Samples consisted of swabs from women wearing menstrual pads compared with swab and tampon samples taken at various times during the menstrual cycle. Samples were analyzed for total facultative and anaerobic bacterial counts, and the six dominant bacterial species in each culture were identified. Statistical evaluation of the results indicates that total bacterial counts decreased during menstruation and that swab and tampon samples yielded similar total counts per unit weight of sample. The numbers of bacteria in tampons tended to be lower than in swabs taken at the same time. Overall, during menstruation, the concentrations of lactobacilli declined, but otherwise there was little difference among the species found during menstruation compared with those found in intermenstrual samples. Cotton tampons had little discernible effect on the microbial flora. PMID:3954346

  3. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. When do objects become more attractive? The individual and interactive effects of choice and ownership on object evaluation.

    PubMed

    Huang, Yunhui; Wang, Lei; Shi, Junqi

    2009-06-01

    Four studies used the Implicit Association Test to explore the individual and interactive influence of perceived ownership and perceived choice on object evaluation. In Study 1, participants implicitly preferred their possessions over others' when all chosen by a third party (i.e., the ownership effect). In Study 2, participants implicitly preferred self-chosen objects over other-chosen objects when all given to the third party (i.e., the choice effect). In Study 3, the ownership effect disappeared when participants compared their self-chosen possessions with others' possessions that were chosen by the participants. In Study 4, the choice effect remained even when participants compared their self-chosen possessions with their possessions that were chosen by others. These results suggest that while the ownership effect could be attenuated by perceived choice, the choice effect is stable even under the influence of perceived ownership.

  5. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  6. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  7. Quantitative Estimation of Plasma Free Drug Fraction in Patients With Varying Degrees of Hepatic Impairment: A Methodological Evaluation.

    PubMed

    Li, Guo-Fu; Yu, Guo; Li, Yanfei; Zheng, Yi; Zheng, Qing-Shan; Derendorf, Hartmut

    2018-07-01

    Quantitative prediction of the unbound drug fraction (fu) is essential for scaling pharmacokinetics through physiologically based approaches. However, few attempts have been made to evaluate the projection of fu values under pathological conditions. The primary objective of this study was to predict fu values (n = 105) of 56 compounds, with or without information on the predominant binding protein, in patients with varying degrees of hepatic insufficiency by accounting for quantitative changes in molar concentrations of either the major binding protein or albumin plus alpha-1-acid glycoprotein associated with differing levels of hepatic dysfunction. For the purpose of scaling, data pertaining to albumin and alpha-1-acid glycoprotein levels in response to differing degrees of hepatic impairment were systematically collected from 919 adult donors. The results of the present study demonstrate for the first time the feasibility of physiologically based scaling of fu in hepatic dysfunction after verification with experimentally measured data for a wide variety of compounds from individuals with varying degrees of hepatic insufficiency. Furthermore, the high level of predictive accuracy indicates that the inter-relation between the severity of hepatic impairment and these plasma protein levels is physiologically accurate. The present study enhances confidence in predicting fu in hepatic insufficiency, particularly for albumin-bound drugs. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
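
    The scaling idea above can be illustrated with the commonly used protein-binding relationship, in which the patient's unbound fraction follows from the healthy-volunteer fu and the ratio of binding-protein concentrations. This is a hedged sketch of that general formula, not necessarily the exact equation used in the paper, and the example numbers are invented.

        def scale_fu(fu_healthy: float, protein_ratio: float) -> float:
            """
            fu_healthy    : unbound fraction measured in healthy plasma
            protein_ratio : [binding protein]_patient / [binding protein]_healthy
                            (e.g. albumin for albumin-bound drugs)
            """
            bound_over_free = (1.0 - fu_healthy) / fu_healthy
            return 1.0 / (1.0 + bound_over_free * protein_ratio)

        # Example: an albumin-bound drug (fu = 0.05) in severe hepatic impairment where
        # albumin falls to ~60% of the healthy level (illustrative value).
        print(f"predicted fu = {scale_fu(0.05, 0.60):.3f}")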

  8. Comprehensive evaluation of direct injection mass spectrometry for the quantitative profiling of volatiles in food samples

    PubMed Central

    2016-01-01

    Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978
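
    As an illustration of the kind of multivariate regression evaluated above, the sketch below calibrates a partial least squares (PLS) model on simulated mixture spectra and predicts component concentrations. PLS is a common choice for spectral fingerprints, but the specific procedures, noise levels and data of the paper are not reproduced here.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(42)
        n_mz, n_components, n_samples = 200, 3, 60

        pure_spectra = rng.random((n_components, n_mz))          # one spectrum per volatile
        concentrations = rng.random((n_samples, n_components))   # known mixture compositions
        spectra = concentrations @ pure_spectra + 0.01 * rng.normal(size=(n_samples, n_mz))

        model = PLSRegression(n_components=5)
        model.fit(spectra[:40], concentrations[:40])              # calibration samples
        pred = model.predict(spectra[40:])                        # validation samples
        rmse = np.sqrt(np.mean((pred - concentrations[40:]) ** 2))
        print(f"validation RMSE = {rmse:.3f}")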

  9. An Evaluative Review of Simulated Dynamic Smart 3d Objects

    NASA Astrophysics Data System (ADS)

    Romeijn, H.; Sheth, F.; Pettit, C. J.

    2012-07-01

    Three-dimensional (3D) modelling of plants can be an asset for creating agricultural based visualisation products. The continuum of 3D plants models ranges from static to dynamic objects, also known as smart 3D objects. There is an increasing requirement for smarter simulated 3D objects that are attributed mathematically and/or from biological inputs. A systematic approach to plant simulation offers significant advantages to applications in agricultural research, particularly in simulating plant behaviour and the influences of external environmental factors. This approach of 3D plant object visualisation is primarily evident from the visualisation of plants using photographed billboarded images, to more advanced procedural models that come closer to simulating realistic virtual plants. However, few programs model physical reactions of plants to external factors and even fewer are able to grow plants based on mathematical and/or biological parameters. In this paper, we undertake an evaluation of plant-based object simulation programs currently available, with a focus upon the components and techniques involved in producing these objects. Through an analytical review process we consider the strengths and weaknesses of several program packages, the features and use of these programs and the possible opportunities in deploying these for creating smart 3D plant-based objects to support agricultural research and natural resource management. In creating smart 3D objects the model needs to be informed by both plant physiology and phenology. Expert knowledge will frame the parameters and procedures that will attribute the object and allow the simulation of dynamic virtual plants. Ultimately, biologically smart 3D virtual plants that react to changes within an environment could be an effective medium to visually represent landscapes and communicate land management scenarios and practices to planners and decision-makers.

  10. Quantitative magnetic resonance (MR) neurography for evaluation of peripheral nerves and plexus injuries

    PubMed Central

    Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio

    2017-01-01

    Traumatic conditions of peripheral nerves and plexus have been classically evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that are able to evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of central nervous system, and now are being imported, with promising results for peripheral nerve and plexus evaluation. DWI and DTI allow performing a qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several derived parameters from DWI and DTI studies such as apparent diffusion coefficient (ADC) or fractional anisotropy (FA) among others, can be used as potential biomarkers of neural damage providing information about fiber organization, axonal flow or myelin integrity. A proper knowledge of physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed with a focus on traumatic conditions, including main nerve entrapment syndromes in both peripheral nerves and brachial or lumbar plexus. PMID:28932698
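
    One of the quantitative parameters mentioned above, the apparent diffusion coefficient (ADC), can be computed from a two-b-value acquisition with the standard mono-exponential relation S(b) = S0 * exp(-b * ADC). The signal values in the sketch below are illustrative, not measurements from the review.

        import math

        def adc(s_b0: float, s_b1: float, b0: float = 0.0, b1: float = 1000.0) -> float:
            """ADC in mm^2/s from signal intensities at two b-values (b in s/mm^2)."""
            return math.log(s_b0 / s_b1) / (b1 - b0)

        # Illustrative signal intensities for a peripheral nerve region of interest
        print(f"ADC = {adc(s_b0=1200.0, s_b1=350.0):.2e} mm^2/s")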

  11. Using Technology to Improve the Objectivity of Criminal Responsibility Evaluations.

    PubMed

    Vitacco, Michael J; Gottfried, Emily D; Batastini, Ashley B

    2018-03-01

    Criminal responsibility (or insanity) evaluations require forensic clinicians to reconstruct a defendant's decision-making abilities, behavioral control, and emotional state at the time of the criminal act. Forensic evaluators are ultimately tasked to evaluate whether an individual had the capacity to understand right from wrong, and in some jurisdictions, determine whether the defendant lacked substantial capacity to conform his behavior to the requirements of the law as a result of a threshold condition (e.g., mental illness). Insanity evaluations are inherently complex, because they require the clinician to determine someone's mental state at some point in the past (weeks, months, or even years). Recent research on insanity evaluations underscores significant problems with the reliability and validity of these evaluations. However, technological advances including social media (e.g., Facebook and Twitter), mandating that law enforcement videotape interrogations, and the use of body and dashboard cameras can aid clinicians in improving the precision and quality of insanity evaluations. This article discusses practical guidelines and ethics-related concerns regarding the use of technology to improve the objectivity of criminal responsibility evaluations. © 2018 American Academy of Psychiatry and the Law.

  12. The Objective and Subjective Evaluation of Multichannel Expansion in Wide Dynamic Range Compression Hearing Instruments

    ERIC Educational Resources Information Center

    Plyler, Patrick N.; Lowery, Kristy J.; Hamby, Hilary M.; Trine, Timothy D.

    2007-01-01

    Purpose: The effects of multichannel expansion on the objective and subjective evaluation of 20 listeners fitted binaurally with 4-channel, digital in-the-ear hearing instruments were investigated. Method: Objective evaluations were conducted in quiet using the Connected Speech Test (CST) and in noise using the Hearing in Noise Test (HINT) at 40,…

  13. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  14. A methodology to quantitatively evaluate the safety of a glazing robot.

    PubMed

    Lee, Seungyeol; Yu, Seungnam; Choi, Junho; Han, Changsoo

    2011-03-01

    A new construction method using robots is spreading widely among construction sites in order to overcome labour shortages and frequent construction accidents. Along with economical efficiency, safety is a very important factor for evaluating the use of construction robots in construction sites. However, the quantitative evaluation of safety is difficult compared with that of economical efficiency. In this study, we suggested a safety evaluation methodology by defining the 'worker' and 'work conditions' as two risk factors, defining the 'worker' factor as posture load and the 'work conditions' factor as the work environment and the risk exposure time. The posture load evaluation reflects the risk of musculoskeletal disorders which can be caused by work posture and the risk of accidents which can be caused by reduced concentration. We evaluated the risk factors that may cause various accidents such as falling, colliding, capsizing, and squeezing in work environments, and evaluated the operational risk by considering worker exposure time to risky work environments. With the results of the evaluations for each factor, we calculated the general operational risk and deduced the improvement ratio in operational safety by introducing a construction robot. To verify these results, we compared the safety of the existing human manual labour and the proposed robotic labour construction methods for manipulating large glass panels. Copyright © 2010 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  15. A scoping review about conference objectives and evaluative practices: how do we get more out of them?

    PubMed

    Neves, Justin; Lavis, John N; Ranson, M Kent

    2012-08-02

    Large multi-day conferences have often been criticized as ineffective ways to improve social outcomes and to influence policy or practice. Unfortunately, many conference evaluations have also been inadequate in determining the impact of a conference on its associated social sector, with little evidence gathered or analyzed to substantiate or refute these criticisms. The aim of this scoping review is to investigate and report stakeholders' objectives for planning or participating in large multi-day conferences and how these objectives are being evaluated. We conducted a scoping review supplemented by a small number of key informant interviews. Eight bibliographic databases were systematically searched to identify papers describing conference objectives and/or evaluations. We developed a conference evaluation framework based on theoretical models and empirical findings, which structured the descriptive synthesis of the data. We identified 3,073 potential papers for review, of which 44 were included in this study. Our evaluation framework connects five key elements in planning a conference and its evaluation (number in brackets refers to number of themes identified): conference objectives (8), purpose of evaluation (7), evaluation methods (5), indicators of success (9) and theories/models (8). Further analysis of indicators of success identified three categories of indicators with differing scopes (i.e. immediate, prospective or follow-up) as well as empirical links between the purpose of evaluations and these indicators. Conference objectives and evaluations were largely correlated with the type of conference (i.e. academic, political/governmental or business) but diverse overall. While much can be done to improve the quality and usefulness of conference evaluations, there are innovative assessments that are currently being utilized by some conferences and warrant further investigation. This review provides conference evaluators and organizers a simple resource to

  16. A scoping review about conference objectives and evaluative practices: how do we get more out of them?

    PubMed Central

    2012-01-01

    Large multi-day conferences have often been criticized as ineffective ways to improve social outcomes and to influence policy or practice. Unfortunately, many conference evaluations have also been inadequate in determining the impact of a conference on its associated social sector, with little evidence gathered or analyzed to substantiate or refute these criticisms. The aim of this scoping review is to investigate and report stakeholders’ objectives for planning or participating in large multi-day conferences and how these objectives are being evaluated. We conducted a scoping review supplemented by a small number of key informant interviews. Eight bibliographic databases were systematically searched to identify papers describing conference objectives and/or evaluations. We developed a conference evaluation framework based on theoretical models and empirical findings, which structured the descriptive synthesis of the data. We identified 3,073 potential papers for review, of which 44 were included in this study. Our evaluation framework connects five key elements in planning a conference and its evaluation (number in brackets refers to number of themes identified): conference objectives (8), purpose of evaluation (7), evaluation methods (5), indicators of success (9) and theories/models (8). Further analysis of indicators of success identified three categories of indicators with differing scopes (i.e. immediate, prospective or follow-up) as well as empirical links between the purpose of evaluations and these indicators. Conference objectives and evaluations were largely correlated with the type of conference (i.e. academic, political/governmental or business) but diverse overall. While much can be done to improve the quality and usefulness of conference evaluations, there are innovative assessments that are currently being utilized by some conferences and warrant further investigation. This review provides conference evaluators and organizers a simple resource to

  17. Correlating objective and subjective evaluation of texture appearance with applications to camera phone imaging

    NASA Astrophysics Data System (ADS)

    Phillips, Jonathan B.; Coppola, Stephen M.; Jin, Elaine W.; Chen, Ying; Clark, James H.; Mauer, Timothy A.

    2009-01-01

    Texture appearance is an important component of photographic image quality as well as object recognition. Noise cleaning algorithms are used to decrease sensor noise of digital images, but can hinder texture elements in the process. The Camera Phone Image Quality (CPIQ) initiative of the International Imaging Industry Association (I3A) is developing metrics to quantify texture appearance. Objective and subjective experimental results of the texture metric development are presented in this paper. Eight levels of noise cleaning were applied to ten photographic scenes that included texture elements such as faces, landscapes, architecture, and foliage. Four companies (Aptina Imaging, LLC, Hewlett-Packard, Eastman Kodak Company, and Vista Point Technologies) have performed psychophysical evaluations of overall image quality using one of two methods of evaluation. Both methods presented paired comparisons of images on thin film transistor liquid crystal displays (TFT-LCD), but the display pixel pitch and viewing distance differed. CPIQ has also been developing objective texture metrics and targets that were used to analyze the same eight levels of noise cleaning. The correlation of the subjective and objective test results indicates that texture perception can be modeled with an objective metric. The two methods of psychophysical evaluation exhibited high correlation despite the differences in methodology.

  18. Comparison of qualitative and quantitative evaluation of diffusion-weighted MRI and chemical-shift imaging in the differentiation of benign and malignant vertebral body fractures.

    PubMed

    Geith, Tobias; Schmidt, Gerwin; Biffar, Andreas; Dietrich, Olaf; Dürr, Hans Roland; Reiser, Maximilian; Baur-Melnyk, Andrea

    2012-11-01

    The objective of our study was to compare the diagnostic value of qualitative diffusion-weighted imaging (DWI), quantitative DWI, and chemical-shift imaging in a single prospective cohort of patients with acute osteoporotic and malignant vertebral fractures. The study group was composed of patients with 26 osteoporotic vertebral fractures (18 women, eight men; mean age, 69 years; age range, 31 years 6 months to 86 years 2 months) and 20 malignant vertebral fractures (nine women, 11 men; mean age, 63.4 years; age range, 24 years 8 months to 86 years 4 months). T1-weighted, STIR, and T2-weighted sequences were acquired at 1.5 T. A DW reverse fast imaging with steady-state free precession (PSIF) sequence at different delta values was evaluated qualitatively. A DW echo-planar imaging (EPI) sequence and a DW single-shot turbo spin-echo (TSE) sequence at different b values were evaluated qualitatively and quantitatively using the apparent diffusion coefficient. Opposed-phase sequences were used to assess signal intensity qualitatively. The signal loss between in- and opposed-phase images was determined quantitatively. Two-tailed Fisher exact test, Mann-Whitney test, and receiver operating characteristic analysis were performed. Sensitivities, specificities, and accuracies were determined. Qualitative DW-PSIF imaging (delta = 3 ms) showed the best performance for distinguishing between benign and malignant fractures (sensitivity, 100%; specificity, 88.5%; accuracy, 93.5%). Qualitative DW-EPI (b = 50 s/mm(2) [p = 1.00]; b = 250 s/mm(2) [p = 0.50]) and DW single-shot TSE imaging (b = 100 s/mm(2) [p = 1.00]; b = 250 s/mm(2) [p = 0.18]; b = 400 s/mm(2) [p = 0.18]; b = 600 s/mm(2) [p = 0.39]) did not indicate significant differences between benign and malignant fractures. DW-EPI using a b value of 500 s/mm(2) (p = 0.01) indicated significant differences between benign and malignant vertebral fractures. Quantitative DW-EPI (p = 0.09) and qualitative opposed-phase imaging (p = 0

  19. Evaluation of reference genes for quantitative RT-PCR in Lolium temulentum under abiotic stress

    USDA-ARS?s Scientific Manuscript database

    Lolium temulentum is a valuable model grass species for the study of stress in forage and turf grasses. Gene expression analysis by quantitative real time RT-PCR relies on the use of proper internal standards. The aim of this study was to identify and evaluate reference genes for use in real-time q...

  20. Quantitative Evaluation of a First Year Seminar Program: Relationships to Persistence and Academic Success

    ERIC Educational Resources Information Center

    Jenkins-Guarnieri, Michael A.; Horne, Melissa M.; Wallis, Aaron L.; Rings, Jeffrey A.; Vaughan, Angela L.

    2015-01-01

    In the present study, we conducted a quantitative evaluation of a novel First Year Seminar (FYS) program with a coordinated curriculum implemented at a public, four-year university to assess its potential role in undergraduate student persistence decisions and academic success. Participants were 2,188 first-year students, 342 of whom completed the…

  1. Accurate object tracking system by integrating texture and depth cues

    NASA Astrophysics Data System (ADS)

    Chen, Ju-Chin; Lin, Yu-Hang

    2016-03-01

    A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminant texture information that separates the object from the background data. Additionally, depth information, which is important for distinguishing the object from a complicated background, is integrated. We propose two depth-based models that can complement the texture information to cope with both appearance variations and background clutter. Moreover, to reduce the increased risk of drift associated with textureless depth templates, an update mechanism is proposed that selects more precise tracking results and avoids incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system provides the best success rate and more accurate tracking results than other well-known algorithms.

  2. A new quantitative evaluation method for age-related changes of individual pigmented spots in facial skin.

    PubMed

    Kikuchi, K; Masuda, Y; Yamashita, T; Sato, K; Katagiri, C; Hirao, T; Mizokami, Y; Yaguchi, H

    2016-08-01

    Facial skin pigmentation is one of the most prominent visible features of skin aging and often affects perception of health and beauty. To date, facial pigmentation has been evaluated using various image analysis methods developed for the cosmetic and esthetic fields. However, existing methods cannot provide precise information on pigmented spots, such as variations in size, color shade, and distribution pattern. The purpose of this study is the development of image evaluation methods to analyze individual pigmented spots and acquire detailed information on their age-related changes. To characterize the individual pigmented spots within a cheek image, we established a simple object-counting algorithm. First, we captured cheek images using an original imaging system equipped with an illumination unit and a high-resolution digital camera. The acquired images were converted into melanin concentration images using compensation formulae. Next, the melanin images were converted into binary images. The binary images were then subjected to noise reduction. Finally, we calculated parameters such as the melanin concentration, quantity, and size of individual pigmented spots using a connected-components labeling algorithm, which assigns a unique label to each separate group of connected pixels. The cheek image analysis was evaluated on 643 female Japanese subjects. We confirmed that the proposed method was sufficiently sensitive to measure the melanin concentration, and the numbers and sizes of individual pigmented spots through manual evaluation of the cheek images. The image analysis results for the 643 Japanese women indicated clear relationships between age and the changes in the pigmented spots. We developed a new quantitative evaluation method for individual pigmented spots in facial skin. This method facilitates the analysis of the characteristics of various pigmented facial spots and is directly applicable to the fields of dermatology, pharmacology, and esthetic
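
    The counting pipeline described above (threshold the melanin image, reduce noise, then label connected components and measure each spot) can be sketched with generic image-processing routines. The snippet below uses scipy on a synthetic image; the thresholds, compensation formulae and noise-reduction settings of the original system are assumptions, not the published values.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)
        melanin = rng.normal(loc=0.2, scale=0.02, size=(200, 200))   # background melanin level
        melanin[50:60, 40:52] += 0.15                                 # synthetic pigmented spot 1
        melanin[120:128, 150:157] += 0.15                             # synthetic pigmented spot 2

        binary = melanin > 0.3                        # threshold to a binary spot mask (assumed value)
        binary = ndimage.binary_opening(binary)       # simple noise reduction
        labels, n_spots = ndimage.label(binary)       # connected-components labeling

        idx = range(1, n_spots + 1)
        sizes = ndimage.sum(binary, labels, index=idx)           # pixels per spot
        mean_melanin = ndimage.mean(melanin, labels, index=idx)  # mean melanin per spot
        print(f"{n_spots} spots, sizes (px): {sizes}, mean melanin: {np.round(mean_melanin, 3)}")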

  3. Evaluation of a miniature microscope objective designed for fluorescence array microscopy detection of Mycobacterium tuberculosis.

    PubMed

    McCall, Brian; Olsen, Randall J; Nelles, Nicole J; Williams, Dawn L; Jackson, Kevin; Richards-Kortum, Rebecca; Graviss, Edward A; Tkaczyk, Tomasz S

    2014-03-01

    A prototype miniature objective, designed for a point-of-care diagnostic array microscope for the detection of Mycobacterium tuberculosis and previously fabricated and presented in a proof of concept, is evaluated for its effectiveness in detecting acid-fast bacteria. The objectives were to evaluate the ability of the microscope to resolve submicron features and details in images of acid-fast microorganisms stained with a fluorescent dye, and to evaluate the accuracy of clinical diagnoses made with digital images acquired with the objective. The lens prescription data for the microscope design are presented. A test platform is built by combining parts of a standard microscope, the prototype objective, and a digital single-lens reflex camera. Counts of acid-fast bacteria made with the prototype objective are compared to counts obtained with a standard microscope over matched fields of view. Two sets of 20 smears, positive and negative, are diagnosed by 2 pathologists as sputum smear positive or sputum smear negative, using both a standard clinical microscope and the prototype objective under evaluation. The results are compared to a reference diagnosis of the same sample. More bacteria are counted in matched fields of view in digital images taken with the prototype objective than with the standard clinical microscope. All diagnostic results are found to be highly concordant. An array microscope built with this miniature lens design will be able to detect M. tuberculosis with high sensitivity and specificity.

  4. An objective method for a video quality evaluation in a 3DTV service

    NASA Astrophysics Data System (ADS)

    Wilczewski, Grzegorz

    2015-09-01

    The following article describes a proposed objective method for 3DTV video quality evaluation, the Compressed Average Image Intensity (CAII) method. Identifying the nodes of the 3DTV service's content chain makes it possible to design a versatile, objective video quality metric based on an advanced approach to stereoscopic videostream analysis. Insights into the mechanisms of the designed metric, as well as an evaluation of its performance under simulated environmental conditions, are discussed herein. As a result, the CAII metric might be used effectively in a variety of service quality assessment applications.

  5. Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes.

    PubMed

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning

    2015-08-27

    This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. The visual saliency scheme utilizing both color and depth cues is proposed to arouse the interests of the machine system for detecting unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from MRF are further refined by merging the labeled objects, which are spatially connected and have high correlation between color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. The experiments of object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications.

  6. Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes

    PubMed Central

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning

    2015-01-01

    This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. A visual saliency scheme utilizing both color and depth cues is proposed to direct the attention of the machine system toward unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from the MRF are further refined by merging labeled objects that are spatially connected and have highly correlated color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. Experiments in object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications. PMID:26343656

  7. Quantitative Evaluation of Musical Scale Tunings

    ERIC Educational Resources Information Center

    Hall, Donald E.

    1974-01-01

    The acoustical and mathematical basis of the problem of tuning the twelve-tone chromatic scale is reviewed. A quantitative measurement showing how well any tuning succeeds in providing just intonation for any specific piece of music is explained and applied to musical examples using a simple computer program. (DT)
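    The specific measurement used in the paper is not reproduced here, but the underlying arithmetic of comparing a tuning against just intonation is standard. The hedged sketch below reports, in cents, how far 12-tone equal-tempered intervals deviate from a small set of just ratios; the choice of intervals is illustrative only.

    ```python
    import math

    # Just-intonation frequency ratios for selected intervals (semitone count -> ratio).
    JUST_RATIOS = {0: 1/1, 3: 6/5, 4: 5/4, 5: 4/3, 7: 3/2, 9: 5/3, 12: 2/1}

    def cents(ratio: float) -> float:
        """Convert a frequency ratio to cents (1200 cents per octave)."""
        return 1200.0 * math.log2(ratio)

    def tuning_error_cents(semitones: int) -> float:
        """Deviation of the equal-tempered interval from just intonation, in cents."""
        equal = 2 ** (semitones / 12.0)
        return cents(equal) - cents(JUST_RATIOS[semitones])

    for st in sorted(JUST_RATIOS):
        print(f"{st:2d} semitones: {tuning_error_cents(st):+6.2f} cents")
    ```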

  8. Evaluation of Quantitative Literacy Series: Exploring Data and Exploring Probability. Program Report 87-5.

    ERIC Educational Resources Information Center

    Day, Roger P.; And Others

    A quasi-experimental design with two experimental groups and one control group was used to evaluate the use of two books in the Quantitative Literacy Series, "Exploring Data" and "Exploring Probability." Group X teachers were those who had attended a workshop on the use of the materials and were using the materials during the…

  9. Qualitative and quantitative evaluation of some vocal function parameters following fitting of a prosthesis.

    PubMed

    Cavalot, A L; Palonta, F; Preti, G; Nazionale, G; Ricci, E; Vione, N; Albera, R; Cortesina, G

    2001-12-01

    The insertion of a prosthesis and restoration with pectoralis major myocutaneous flaps for patients subjected to total pharyngolaryngectomy is a technique that is now universally accepted; however, the literature on the subject is scarce. Our study considers 10 patients subjected to total pharyngolaryngectomy and restoration with pectoralis major myocutaneous flaps who were fitted with vocal function prostheses, and a control group of 50 subjects treated with a total laryngectomy without pectoralis major myocutaneous flaps who were fitted with vocal function prostheses. Specific qualitative and quantitative parameters were compared. The quantitative measurements of voice intensity level and of the harmonics-to-noise ratio did not differ significantly (p > 0.05) between the two study groups at either high- or low-volume speech. In contrast, statistically significant differences (p < 0.05) were found for the basic frequency of both the low- and the high-volume voice. For the qualitative analysis, seven parameters were established for evaluation by trained and untrained listeners; on the basis of these parameters, the control group had statistically better voices.

  10. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.

  11. Content-Related Issues Pertaining to Teaching Statistics: Making Decisions about Educational Objectives in Statistics Courses.

    ERIC Educational Resources Information Center

    Bliss, Leonard B.; Tashakkori, Abbas

    This paper discusses the objectives that would be appropriate for statistics classes for students who are not majoring in statistics, evaluation, or quantitative research design. These "non-majors" should be able to choose appropriate analytical methods for specific sets of data based on the research question and the nature of the data, and they…

  12. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed the isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
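    The exact definition of the isothermal doubling time (IDT) is not given in the record. Under the common assumption that amplification is exponential, so that time-to-positivity is linear in log10 of the input copy number, a doubling time can be recovered from the slope of a dilution-series standard curve, as in this hedged sketch; the dilution data are synthetic.

    ```python
    import numpy as np

    def doubling_time_from_standard_curve(log10_copies, time_to_positivity_min):
        """Estimate a doubling time (min) from a dilution series, assuming
        exponential amplification so that time-to-positivity is linear in
        log10(input copies) with slope = -d / log10(2)."""
        slope, _intercept = np.polyfit(log10_copies, time_to_positivity_min, 1)
        return -slope * np.log10(2)

    # Synthetic dilution series (illustrative numbers only).
    log10_copies = np.array([6, 5, 4, 3, 2])
    tt_minutes   = np.array([8.1, 11.4, 14.8, 18.3, 21.7])
    idt = doubling_time_from_standard_curve(log10_copies, tt_minutes)
    print(f"estimated doubling time: {idt:.2f} min")
    ```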

  13. Diffusion tensor imaging with quantitative evaluation and fiber tractography of lumbar nerve roots in sciatica.

    PubMed

    Shi, Yin; Zong, Min; Xu, Xiaoquan; Zou, Yuefen; Feng, Yang; Liu, Wei; Wang, Chuanbing; Wang, Dehang

    2015-04-01

    To quantitatively evaluate nerve roots by measuring fractional anisotropy (FA) values in healthy volunteers and sciatica patients, visualize nerve roots by tractography, and compare the diagnostic efficacy between conventional magnetic resonance imaging (MRI) and DTI. Seventy-five sciatica patients and thirty-six healthy volunteers underwent MR imaging using DTI. FA values for L5-S1 lumbar nerve roots were calculated at three levels from DTI images. Tractography was performed on L3-S1 nerve roots. ROC analysis was performed for FA values. The lumbar nerve roots were visualized and FA values were calculated in all subjects. FA values decreased in compressed nerve roots and declined from proximal to distal along the compressed nerve tracts. Mean FA values were more sensitive and specific than MR imaging for differentiating compressed nerve roots, especially in the far lateral zone at distal nerves. DTI can quantitatively evaluate compressed nerve roots, and DTT enables visualization of abnormal nerve tracts, providing vivid anatomic information and localization of probable nerve compression. DTI has great potential utility for evaluating lumbar nerve compression in sciatica. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
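    As an illustration of the kind of ROC analysis reported for FA values (not the study's data), the sketch below builds synthetic FA distributions for normal and compressed nerve roots, computes the AUC, and picks a cut-off by the Youden index; scikit-learn is assumed to be available.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(42)
    # Synthetic FA values: compressed roots tend to have lower FA (label 1).
    fa_normal = rng.normal(0.28, 0.03, 36)
    fa_compressed = rng.normal(0.21, 0.04, 75)
    fa = np.concatenate([fa_normal, fa_compressed])
    label = np.concatenate([np.zeros(36), np.ones(75)])

    # Negate FA so that a higher score means "more likely compressed".
    auc = roc_auc_score(label, -fa)
    fpr, tpr, thresholds = roc_curve(label, -fa)
    best = np.argmax(tpr - fpr)                      # Youden index cut-off
    print(f"AUC = {auc:.2f}, optimal FA cut-off = {-thresholds[best]:.3f}")
    ```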

  14. Failure to Integrate Quantitative Measurement Methods of Ocular Inflammation Hampers Clinical Practice and Trials on New Therapies for Posterior Uveitis.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc

    2017-05-01

    Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but also in parallel precise and quantitative measurement methods developed allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint; whereas a whole array of precise, quantitative, and objective modalities are available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.

  15. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    PubMed

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    To evaluate a special kind of ultrasound (US) shear wave elastography for the differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. The area under the receiver operating characteristic curve (AUROC) was used to evaluate the diagnostic performance of both qualitative and quantitative analyses for the differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P = 0.000), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The qualitative analysis obtained the best diagnostic performance (AUROC 0.871; 95% CI: 0.825, 0.909; P = 0.011 compared with SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time showed superior diagnostic performance in the differentiation of breast lesions in comparison with conventional quantitative analysis.

  16. Objective and subjective treatment evaluation of scars using optical coherence tomography, sonography, photography, and standardised questionnaires.

    PubMed

    Reinholz, Markus; Schwaiger, Hannah; Poetschke, Julian; Epple, Andreas; Ruzicka, Thomas; Von Braunmühl, Tanja; Gauglitz, Gerd G

    2016-12-01

    Currently, different types of treatments for pathological scars are available, however, to date, there is no established method of measurement to objectively assess therapeutic outcome. Treatment success is usually evaluated clinically by the physician and patient. Non-invasive imaging techniques, such as HD-OCT (high-definition optical coherence tomography), may represent a valuable diagnostic tool to objectively measure therapeutic outcome. To compare HD-OCT with ultrasound and subjective evaluation tools, such as questionnaires. In total, eight patients with pathological scars were treated in this pilot study with cryotherapy and intralesional steroid injections, and evaluated pre- and post-treatment using clinical examination, photography, sonography, and HD-OCT. The analysis of objective and subjective measuring methods was used to draw direct comparisons. HD-OCT revealed reduced epidermal and dermal thickness of the scar after four treatments with triamcinolone acetonide and cryotherapy. Based on sonography, a total reduction in scar height and reduction in scar depth was demonstrated. Both methods correlated well with the injected amount of triamcinolone acetonide. In addition, a positive correlation between well-established subjective and objective evaluation methods was found. We demonstrate that HD-OCT may be used as an objective diagnostic instrument to evaluate skin thickness under therapy for pathological scars, and serves as a valuable adjunctive device in combination with ultrasound and subjective evaluation tools. This provides additional information for the therapist concerning the quality and success of the applied treatment.

  17. Quantitative optical metrology with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.

    2004-08-01

    Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high-accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full-field-of-view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence the selection and applicability of an optical technique include the sensitivity, accuracy, and precision required for a particular application. In this paper, sensitivity, accuracy, and precision characteristics of quantitative optical metrology techniques, and specifically of optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.
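    The OEH processing itself is not shown in the record; purely as an illustration of summarizing accuracy and precision against a traceable gauge, the hedged sketch below reports the mean bias and repeatability of repeated measurements. The gauge value and readings are made up.

    ```python
    import statistics

    def accuracy_and_precision(measurements_um, reference_um):
        """Summarize repeated measurements against a traceable reference:
        accuracy as mean bias, precision as the sample standard deviation."""
        mean_value = statistics.fmean(measurements_um)
        return {
            "mean": mean_value,
            "bias": mean_value - reference_um,
            "precision": statistics.stdev(measurements_um),
        }

    # Illustrative repeated measurements of a 25.400 um gauge step (made-up values).
    gauge_reference_um = 25.400
    repeats_um = [25.392, 25.405, 25.398, 25.410, 25.401, 25.396]
    print(accuracy_and_precision(repeats_um, gauge_reference_um))
    ```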

  18. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... Evaluation and Research (CBER) and suggestions for further development. The public workshop will include... Evaluation and Research (HFM-210), Food and Drug Administration, 1401 Rockville Pike, suite 200N, Rockville... models to generate quantitative estimates of the benefits and risks of influenza vaccination. The public...

  19. Evaluation of Moving Object Detection Based on Various Input Noise Using Fixed Camera

    NASA Astrophysics Data System (ADS)

    Kiaee, N.; Hashemizadeh, E.; Zarrinpanjeh, N.

    2017-09-01

    Detecting and tracking objects in video has been a research area of interest in the fields of image processing and computer vision. This paper evaluates the performance of a novel method for object detection in video sequences, which helps establish the advantages of the method being used. The proposed framework compares the percentages of correct and incorrect detections of the algorithm. The method was evaluated with data collected in the field of urban transport, comprising cars and pedestrians recorded with a fixed camera. The results show that the accuracy of the algorithm decreases as image resolution is reduced.

  20. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  1. Quantitative evaluation of learning and memory trace in studies of mnemotropic effects of immunotropic drugs.

    PubMed

    Kiseleva, N M; Novoseletskaya, A V; Voevodina, Ye B; Kozlov, I G; Inozemtsev, A N

    2012-12-01

    Apart from restoration of disordered immunological parameters, tactivin and derinat exhibit a pronounced effect on the higher integrative functions of the brain. Experiments on Wistar rats have shown that these drugs accelerated conditioning of food and defense responses. New methods for quantitative evaluation of memory trace consolidation are proposed.

  2. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually while standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. VSA is often regarded as subjective, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant. For the reproducibility study, a group of 9 soil scientists and 7

  3. A combined pulmonary-radiology workshop for visual evaluation of COPD: study design, chest CT findings and concordance with quantitative evaluation.

    PubMed

    Barr, R Graham; Berkowitz, Eugene A; Bigazzi, Francesca; Bode, Frederick; Bon, Jessica; Bowler, Russell P; Chiles, Caroline; Crapo, James D; Criner, Gerard J; Curtis, Jeffrey L; Dass, Chandra; Dirksen, Asger; Dransfield, Mark T; Edula, Goutham; Erikkson, Leif; Friedlander, Adam; Galperin-Aizenberg, Maya; Gefter, Warren B; Gierada, David S; Grenier, Philippe A; Goldin, Jonathan; Han, MeiLan K; Hanania, Nicola A; Hansel, Nadia N; Jacobson, Francine L; Kauczor, Hans-Ulrich; Kinnula, Vuokko L; Lipson, David A; Lynch, David A; MacNee, William; Make, Barry J; Mamary, A James; Mann, Howard; Marchetti, Nathaniel; Mascalchi, Mario; McLennan, Geoffrey; Murphy, James R; Naidich, David; Nath, Hrudaya; Newell, John D; Pistolesi, Massimo; Regan, Elizabeth A; Reilly, John J; Sandhaus, Robert; Schroeder, Joyce D; Sciurba, Frank; Shaker, Saher; Sharafkhaneh, Amir; Silverman, Edwin K; Steiner, Robert M; Strange, Charlton; Sverzellati, Nicola; Tashjian, Joseph H; van Beek, Edwin J R; Washington, Lacey; Washko, George R; Westney, Gloria; Wood, Susan A; Woodruff, Prescott G

    2012-04-01

    The purposes of this study were: to describe chest CT findings in normal non-smoking controls and cigarette smokers with and without COPD; to compare the prevalence of CT abnormalities with severity of COPD; and to evaluate concordance between visual and quantitative chest CT (QCT) scoring. Volumetric inspiratory and expiratory CT scans of 294 subjects, including normal non-smokers, smokers without COPD, and smokers with GOLD Stage I-IV COPD, were scored at a multi-reader workshop using a standardized worksheet. There were 58 observers (33 pulmonologists, 25 radiologists); each scan was scored by 9-11 observers. Interobserver agreement was calculated using kappa statistic. Median score of visual observations was compared with QCT measurements. Interobserver agreement was moderate for the presence or absence of emphysema and for the presence of panlobular emphysema; fair for the presence of centrilobular, paraseptal, and bullous emphysema subtypes and for the presence of bronchial wall thickening; and poor for gas trapping, centrilobular nodularity, mosaic attenuation, and bronchial dilation. Agreement was similar for radiologists and pulmonologists. The prevalence on CT readings of most abnormalities (e.g. emphysema, bronchial wall thickening, mosaic attenuation, expiratory gas trapping) increased significantly with greater COPD severity, while the prevalence of centrilobular nodularity decreased. Concordances between visual scoring and quantitative scoring of emphysema, gas trapping and airway wall thickening were 75%, 87% and 65%, respectively. Despite substantial inter-observer variation, visual assessment of chest CT scans in cigarette smokers provides information regarding lung disease severity; visual scoring may be complementary to quantitative evaluation.
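    Interobserver agreement in the workshop was summarized with the kappa statistic. The study used 9-11 readers per scan, so the following two-reader Cohen's kappa example is only a simplified illustration of the computation, with made-up presence/absence calls.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Illustrative presence/absence calls for emphysema by two readers on 12 scans.
    reader_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1]
    reader_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 0, 0]
    kappa = cohen_kappa_score(reader_a, reader_b)
    print(f"Cohen's kappa = {kappa:.2f}")  # 0.41-0.60 is conventionally "moderate" agreement
    ```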

  4. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the
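    The comparison of proportions between article types relies on a chi-square test on a contingency table. The hedged sketch below shows the computation with SciPy on hypothetical counts that do not reproduce the study's table.

    ```python
    from scipy.stats import chi2_contingency

    # Hypothetical 2x2 table: rows = article type, columns = component present / absent.
    table = [[30, 70],    # mixed methods articles (illustrative counts only)
             [55, 45]]    # quantitative articles (illustrative counts only)
    chi2, p, dof, expected = chi2_contingency(table, correction=False)
    print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4g}")
    ```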

  5. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

    Atherosclerosis is a primary cause of critical ischemic diseases such as myocardial infarction and stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that could evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for the quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band-pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber with a 0.7 mm outer diameter, and an irradiation fiber consisting of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in the atherosclerotic plaque phantoms that could not be detected in a visible angioscopic image were clearly enhanced. In addition, quantitative evaluation of the atherosclerotic plaque phantoms based on the lipid volume fractions was performed up to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
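    The spectral angle mapper (SAM) step can be illustrated directly. The sketch below classifies three-band spectra (nominally the 1150/1200/1300 nm channels) by the smallest spectral angle to a set of reference spectra; the reference values, the angle threshold, and the class names are assumptions made for illustration.

    ```python
    import numpy as np

    def spectral_angle(pixel: np.ndarray, reference: np.ndarray) -> float:
        """Spectral angle (radians) between a measured spectrum and a reference."""
        cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
        return float(np.arccos(np.clip(cos, -1.0, 1.0)))

    def sam_classify(cube: np.ndarray, references: dict, max_angle: float = 0.10):
        """Label each pixel with the reference whose spectral angle is smallest,
        or 'unclassified' if no angle is below max_angle."""
        h, w, _ = cube.shape
        labels = np.full((h, w), "unclassified", dtype=object)
        for i in range(h):
            for j in range(w):
                name, angle = min(((k, spectral_angle(cube[i, j], v))
                                   for k, v in references.items()), key=lambda t: t[1])
                if angle <= max_angle:
                    labels[i, j] = name
        return labels

    # Made-up reflectance references at 1150, 1200, 1300 nm.
    refs = {"lipid": np.array([0.62, 0.35, 0.58]), "vessel_wall": np.array([0.55, 0.53, 0.54])}
    cube = np.random.default_rng(3).random((4, 4, 3))
    print(sam_classify(cube, refs))
    ```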

  6. Quantitative metrics for evaluating the phased roll-out of clinical information systems.

    PubMed

    Wong, David; Wu, Nicolas; Watkinson, Peter

    2017-09-01

    We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting order was evaluated empirically and produced acceptable orders. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
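    The paper's graphical model is described only at a high level. As a loose illustration of the optimisation target, the hedged sketch below greedily orders wards so that transfers flowing from already-migrated (new-system) wards into not-yet-migrated (old-system) wards stay small; the ward names and transfer counts are invented, and the greedy rule is an assumption rather than the published method.

    ```python
    import numpy as np

    def greedy_rollout_order(flow: np.ndarray, wards: list) -> list:
        """Greedy heuristic: at each step migrate the ward whose outgoing transfers
        into the not-yet-migrated (old-system) wards are smallest, keeping the
        total new-system -> old-system patient flow low."""
        remaining = list(range(len(wards)))
        order = []
        while remaining:
            cost = {i: sum(flow[i, j] for j in remaining if j != i) for i in remaining}
            nxt = min(cost, key=cost.get)
            order.append(wards[nxt])
            remaining.remove(nxt)
        return order

    # Hypothetical yearly transfer counts between four clinical areas.
    wards = ["ED", "AcuteMed", "Cardiology", "Discharge"]
    flow = np.array([[0, 120, 40, 300],
                     [5,   0, 25, 180],
                     [2,  10,  0,  90],
                     [0,   0,  0,   0]])
    print(greedy_rollout_order(flow, wards))
    ```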

  7. Quantitative image quality evaluation of MR images using perceptual difference models

    PubMed Central

    Miao, Jun; Huo, Donglai; Wilson, David L.

    2008-01-01

    The authors are using a perceptual difference model (Case-PDM) to quantitatively evaluate image quality of the thousands of test images which can be created when optimizing fast magnetic resonance (MR) imaging strategies and reconstruction techniques. In this validation study, they compared human evaluation of MR images from multiple organs and from multiple image reconstruction algorithms to Case-PDM and similar models. The authors found that Case-PDM compared very favorably to human observers in double-stimulus continuous-quality scale and functional measurement theory studies over a large range of image quality. The Case-PDM threshold for nonperceptible differences in a 2-alternative forced choice study varied with the type of image under study, but was ≈1.1 for diffuse image effects, providing a rule of thumb. Ordering the image quality evaluation models, the authors found, overall, Case-PDM ≈ IDM (Sarnoff Corporation) ≈ SSIM [Wang et al. IEEE Trans. Image Process. 13, 600–612 (2004)] > mean squared error ≈ NR [Wang et al. (2004) (unpublished)] > DCTune (NASA) > IQM (MITRE Corporation). The authors conclude that Case-PDM is very useful in MR image evaluation but that one should probably restrict studies to similar images and similar processing, normally not a limitation in image reconstruction studies. PMID:18649487

  8. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161

  9. Development and Implementation of a Learning Object Repository for French Teaching and Learning: Issues and Promises

    ERIC Educational Resources Information Center

    Caws, Catherine

    2008-01-01

    This paper discusses issues surrounding the development of a learning object repository (FLORE) for teaching and learning French at the postsecondary level. An evaluation based on qualitative and quantitative data was set up in order to better assess how second-language (L2) students in French perceived the integration of this new repository into…

  10. Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging☆

    PubMed Central

    Oishi, Kenichi; Faria, Andreia V.; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2013-01-01

    The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in the clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development of the pediatric population. To interpret the values from these MR modalities, a “growth percentile chart,” which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured anatomical structures. To avoid inter- and intra-reader variability in the anatomical boundary definition, and hence, to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, has been introduced

  11. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Cognitive schemes of plant anatomy concepts are formed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task in which students analyze quantitative data produced by quantitative measurements of plant anatomy, guided by the course material. The participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with the quantitative literacy test using the rubric from the Association of American Colleges and Universities, complex thinking in plant anatomy was assessed with a test constructed according to Marzano, and the questionnaire was also administered. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students is better than that of biology students.

  12. Dual respiratory and cardiac motion estimation in PET imaging: Methods design and quantitative evaluation.

    PubMed

    Feng, Tao; Wang, Jizhe; Tsui, Benjamin M W

    2018-04-01

    The goal of this study was to develop and evaluate four post-reconstruction respiratory and cardiac (R&C) motion vector field (MVF) estimation methods for cardiac 4D PET data. In Method 1, the dual R&C motions were estimated directly from the dual R&C gated images. In Method 2, respiratory motion (RM) and cardiac motion (CM) were separately estimated from the respiratory-gated-only and cardiac-gated-only images. The effects of RM on CM estimation were modeled in Method 3 by applying an image-based RM correction on the cardiac gated images before CM estimation; the effects of CM on RM estimation were neglected. Method 4 iteratively models the mutual effects of RM and CM during dual R&C motion estimation. Realistic simulation data were generated for quantitative evaluation of the four methods. Almost noise-free PET projection data were generated from the 4D XCAT phantom with realistic R&C MVFs using Monte Carlo simulation. Poisson noise was added to the scaled projection data to generate additional datasets at two different noise levels. All the projection data were reconstructed using a 4D image reconstruction method to obtain dual R&C gated images. The four dual R&C MVF estimation methods were applied to the dual R&C gated images and the accuracy of motion estimation was quantitatively evaluated using the root mean square error (RMSE) of the estimated MVFs. Results show that among the four estimation methods, Method 2 performed the worst for the noise-free case while Method 1 performed the worst for the noisy cases in terms of quantitative accuracy of the estimated MVF. Methods 4 and 3 showed comparable results and achieved RMSEs up to 35% lower than Method 1 for the noisy cases. In conclusion, we have developed and evaluated 4 different post-reconstruction R&C MVF estimation methods for use in 4D PET imaging. Comparison of the performance of the four methods on simulated data indicates separate R&C estimation with modeling of RM before CM estimation (Method 3) to be
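    The accuracy metric, RMSE of the estimated motion vector fields against the known truth, is straightforward to compute. A minimal sketch on synthetic displacement fields follows; the array layout (a trailing axis of length 3 for the displacement components) is an assumption.

    ```python
    import numpy as np

    def mvf_rmse(estimated: np.ndarray, truth: np.ndarray) -> float:
        """Root mean square of the 3D displacement error over all voxels.
        Arrays are shaped (nx, ny, nz, 3) with the last axis holding (dx, dy, dz)."""
        error = estimated - truth
        return float(np.sqrt(np.mean(np.sum(error ** 2, axis=-1))))

    rng = np.random.default_rng(7)
    truth = rng.normal(0.0, 2.0, (16, 16, 16, 3))            # mm displacements
    estimated = truth + rng.normal(0.0, 0.5, truth.shape)     # estimate with added noise
    print(f"MVF RMSE = {mvf_rmse(estimated, truth):.2f} mm")
    ```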

  13. Multi-objective decision-making under uncertainty: Fuzzy logic methods

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1994-01-01

    Selecting the best option among alternatives is often a difficult process. This process becomes even more difficult when the evaluation criteria are vague or qualitative, and when the objectives vary in importance and scope. Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
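    The NASA Lewis software itself is not available here. As a minimal sketch of the idea of scoring alternatives against vague criteria, the code below maps qualitative ratings to assumed fuzzy membership values and ranks options by a weighted aggregate; the options, ratings, weights, and membership scale are all illustrative assumptions.

    ```python
    # Map qualitative ratings to fuzzy membership values in [0, 1] (assumed scale).
    MEMBERSHIP = {"poor": 0.2, "fair": 0.5, "good": 0.7, "excellent": 0.95}

    def fuzzy_score(ratings: dict, weights: dict) -> float:
        """Weighted-average aggregation of fuzzy memberships across criteria."""
        total_weight = sum(weights.values())
        return sum(weights[c] * MEMBERSHIP[r] for c, r in ratings.items()) / total_weight

    # Hypothetical propulsion options rated against three vague criteria.
    weights = {"reliability": 0.5, "cost": 0.3, "performance": 0.2}
    options = {
        "pressure-fed storable": {"reliability": "excellent", "cost": "good", "performance": "fair"},
        "pump-fed cryogenic":    {"reliability": "fair", "cost": "poor", "performance": "excellent"},
    }
    ranked = sorted(options, key=lambda o: fuzzy_score(options[o], weights), reverse=True)
    print(ranked)
    ```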

  14. MR Morphology of Triangular Fibrocartilage Complex: Correlation with Quantitative MR and Biomechanical Properties

    PubMed Central

    Bae, Won C.; Ruangchaijatuporn, Thumanoon; Chang, Eric Y; Biswas, Reni; Du, Jiang; Statum, Sheronda

    2016-01-01

    Objective To evaluate pathology of the triangular fibrocartilage complex (TFCC) using high resolution morphologic magnetic resonance (MR) imaging, and compare with quantitative MR and biomechanical properties. Materials and Methods Five cadaveric wrists (22 to 70 yrs) were imaged at 3T using morphologic (proton density weighted spin echo, PD FS, and 3D spoiled gradient echo, 3D SPGR) and quantitative MR sequences to determine T2 and T1rho properties. In eight geographic regions, morphology of TFC disc and laminae were evaluated for pathology and quantitative MR values. Samples were disarticulated and biomechanical indentation testing was performed on the distal surface of the TFC disc. Results On morphologic PD SE images, TFC disc pathology included degeneration and tears, while that of the laminae included degeneration, degeneration with superimposed tear, mucinous transformation, and globular calcification. Punctate calcifications were highly visible on 3D SPGR images and found only in pathologic regions. Disc pathology occurred more frequently in proximal regions of the disc than distal regions. Quantitative MR values were lowest in normal samples, and generally higher in pathologic regions. Biomechanical testing demonstrated an inverse relationship, with indentation modulus being high in normal regions with low MR values. The laminae studied were mostly pathologic, and additional normal samples are needed to discern quantitative changes. Conclusion These results show technical feasibility of morphologic MR, quantitative MR, and biomechanical techniques to characterize pathology of the TFCC. Quantitative MRI may be a suitable surrogate marker of soft tissue mechanical properties, and a useful adjunct to conventional morphologic MR techniques. PMID:26691643

  15. Development and characterization of a dynamic lesion phantom for the quantitative evaluation of dynamic contrast-enhanced MRI.

    PubMed

    Freed, Melanie; de Zwart, Jacco A; Hariharan, Prasanna; Myers, Matthew R; Badano, Aldo

    2011-10-01

    To develop a dynamic lesion phantom that is capable of producing physiological kinetic curves representative of those seen in human dynamic contrast-enhanced MRI (DCE-MRI) data. The objective of this phantom is to provide a platform for the quantitative comparison of DCE-MRI protocols to aid in the standardization and optimization of breast DCE-MRI. The dynamic lesion consists of a hollow, plastic mold with inlet and outlet tubes to allow flow of a contrast agent solution through the lesion over time. Border shape of the lesion can be controlled using the lesion mold production method. The configuration of the inlet and outlet tubes was determined using fluid transfer simulations. The total fluid flow rate was determined using x-ray images of the lesion for four different flow rates (0.25, 0.5, 1.0, and 1.5 ml/s) to evaluate the resultant kinetic curve shape and homogeneity of the contrast agent distribution in the dynamic lesion. High spatial and temporal resolution x-ray measurements were used to estimate the true kinetic curve behavior in the dynamic lesion for benign and malignant example curves. DCE-MRI example data were acquired of the dynamic phantom using a clinical protocol. The optimal inlet and outlet tube configuration for the lesion molds was two inlet molds separated by 30° and a single outlet tube directly between the two inlet tubes. X-ray measurements indicated that 1.0 ml/s was an appropriate total fluid flow rate and provided truth for comparison with MRI data of kinetic curves representative of benign and malignant lesions. DCE-MRI data demonstrated the ability of the phantom to produce realistic kinetic curves. The authors have constructed a dynamic lesion phantom, demonstrated its ability to produce physiological kinetic curves, and provided estimations of its true kinetic curve behavior. This lesion phantom provides a tool for the quantitative evaluation of DCE-MRI protocols, which may lead to improved discrimination of breast cancer lesions.

  16. Development and characterization of a dynamic lesion phantom for the quantitative evaluation of dynamic contrast-enhanced MRI

    PubMed Central

    Freed, Melanie; de Zwart, Jacco A.; Hariharan, Prasanna; Myers, Matthew R.; Badano, Aldo

    2011-01-01

    Purpose: To develop a dynamic lesion phantom that is capable of producing physiological kinetic curves representative of those seen in human dynamic contrast-enhanced MRI (DCE-MRI) data. The objective of this phantom is to provide a platform for the quantitative comparison of DCE-MRI protocols to aid in the standardization and optimization of breast DCE-MRI. Methods: The dynamic lesion consists of a hollow, plastic mold with inlet and outlet tubes to allow flow of a contrast agent solution through the lesion over time. Border shape of the lesion can be controlled using the lesion mold production method. The configuration of the inlet and outlet tubes was determined using fluid transfer simulations. The total fluid flow rate was determined using x-ray images of the lesion for four different flow rates (0.25, 0.5, 1.0, and 1.5 ml∕s) to evaluate the resultant kinetic curve shape and homogeneity of the contrast agent distribution in the dynamic lesion. High spatial and temporal resolution x-ray measurements were used to estimate the true kinetic curve behavior in the dynamic lesion for benign and malignant example curves. DCE-MRI example data were acquired of the dynamic phantom using a clinical protocol. Results: The optimal inlet and outlet tube configuration for the lesion molds was two inlet molds separated by 30° and a single outlet tube directly between the two inlet tubes. X-ray measurements indicated that 1.0 ml∕s was an appropriate total fluid flow rate and provided truth for comparison with MRI data of kinetic curves representative of benign and malignant lesions. DCE-MRI data demonstrated the ability of the phantom to produce realistic kinetic curves. Conclusions: The authors have constructed a dynamic lesion phantom, demonstrated its ability to produce physiological kinetic curves, and provided estimations of its true kinetic curve behavior. This lesion phantom provides a tool for the quantitative evaluation of DCE-MRI protocols, which may lead to

  17. Evaluation of patients with painful total hip arthroplasty using combined single photon emission tomography and conventional computerized tomography (SPECT/CT) - a comparison of semi-quantitative versus 3D volumetric quantitative measurements.

    PubMed

    Barthassat, Emilienne; Afifi, Faik; Konala, Praveen; Rasch, Helmut; Hirschmann, Michael T

    2017-05-08

    It was the primary purpose of our study to evaluate the inter- and intra-observer reliability of a standardized SPECT/CT algorithm for evaluating patients with painful primary total hip arthroplasty (THA). The secondary purpose was a comparison of semi-quantitative and 3D volumetric quantification methods for the assessment of bone tracer uptake (BTU) in those patients. A novel SPECT/CT localization scheme consisting of 14 femoral and 4 acetabular regions on standardized axial and coronal slices was introduced and evaluated in terms of inter- and intra-observer reliability in 37 consecutive patients with hip pain after THA. BTU for each anatomical region was assessed semi-quantitatively using a color-coded Likert-type scale (0-10) and volumetrically quantified using validated software. Two observers interpreted the SPECT/CT findings in all patients twice, in random order, with a six-week interval between interpretations. Semi-quantitative and quantitative measurements were compared in terms of reliability. In addition, the values were correlated using Pearson's correlation. A factorial cluster analysis of BTU was performed to identify clinically relevant regions, which should be grouped and analysed together. The localization scheme showed high inter- and intra-observer reliabilities for all femoral and acetabular regions independent of the measurement method used (semi-quantitative versus 3D volumetric quantitative measurements). A high to moderate correlation between the two measurement methods was shown for the distal femur, the proximal femur and the acetabular cup. The factorial cluster analysis showed that the anatomical regions might be summarized into three distinct anatomical regions. These were the proximal femur, the distal femur and the acetabular cup region. The SPECT/CT algorithm for the assessment of patients with pain after THA is highly reliable, independent of the measurement method used. Three clinically relevant anatomical regions (proximal femoral

  18. Relative Navigation Light Detection and Ranging (LIDAR) Sensor Development Test Objective (DTO) Performance Verification

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) received a request from the NASA Associate Administrator (AA) for the Human Exploration and Operations Mission Directorate (HEOMD) to quantitatively evaluate the individual performance of three light detection and ranging (LIDAR) rendezvous sensors flown as an orbiter development test objective (DTO) on Space Transportation System (STS)-127, STS-133, STS-134, and STS-135. This document contains the outcome of the NESC assessment.

  19. Surface plasmon resonance microscopy: achieving a quantitative optical response

    PubMed Central

    Peterson, Alexander W.; Halter, Michael; Plant, Anne L.; Elliott, John T.

    2016-01-01

    Surface plasmon resonance (SPR) imaging allows real-time label-free imaging based on index of refraction, and changes in index of refraction at an interface. Optical parameter analysis is achieved by application of the Fresnel model to SPR data typically taken by an instrument in a prism based configuration. We carry out SPR imaging on a microscope by launching light into a sample, and collecting reflected light through a high numerical aperture microscope objective. The SPR microscope enables spatial resolution that approaches the diffraction limit, and has a dynamic range that allows detection of subnanometer to submicrometer changes in thickness of biological material at a surface. However, unambiguous quantitative interpretation of SPR changes using the microscope system could not be achieved using the Fresnel model because of polarization dependent attenuation and optical aberration that occurs in the high numerical aperture objective. To overcome this problem, we demonstrate a model to correct for polarization diattenuation and optical aberrations in the SPR data, and develop a procedure to calibrate reflectivity to index of refraction values. The calibration and correction strategy for quantitative analysis was validated by comparing the known indices of refraction of bulk materials with corrected SPR data interpreted with the Fresnel model. Subsequently, we applied our SPR microscopy method to evaluate the index of refraction for a series of polymer microspheres in aqueous media and validated the quality of the measurement with quantitative phase microscopy. PMID:27782542

  20. A quantitative evaluation of cell migration by the phagokinetic track motility assay.

    PubMed

    Nogalski, Maciej T; Chan, Gary C T; Stevenson, Emily V; Collins-McMillen, Donna K; Yurochko, Andrew D

    2012-12-04

    Cellular motility is an important biological process for both unicellular and multicellular organisms. It is essential for movement of unicellular organisms towards a source of nutrients or away from unsuitable conditions, as well as in multicellular organisms for tissue development, immune surveillance and wound healing, just to mention a few roles(1,2,3). Deregulation of this process can lead to serious neurological, cardiovascular and immunological diseases, as well as exacerbated tumor formation and spread(4,5). Molecularly, actin polymerization and receptor recycling have been shown to play important roles in creating cellular extensions (lamellipodia), that drive the forward movement of the cell(6,7,8). However, many biological questions about cell migration remain unanswered. The central role for cellular motility in human health and disease underlines the importance of understanding the specific mechanisms involved in this process and makes accurate methods for evaluating cell motility particularly important. Microscopes are usually used to visualize the movement of cells. However, cells move rather slowly, making the quantitative measurement of cell migration a resource-consuming process requiring expensive cameras and software to create quantitative time-lapsed movies of motile cells. Therefore, the ability to perform a quantitative measurement of cell migration that is cost-effective, non-laborious, and that utilizes common laboratory equipment is a great need for many researchers. The phagokinetic track motility assay utilizes the ability of a moving cell to clear gold particles from its path to create a measurable track on a colloidal gold-coated glass coverslip(9,10). With the use of freely available software, multiple tracks can be evaluated for each treatment to accomplish statistical requirements. The assay can be utilized to assess motility of many cell types, such as cancer cells(11,12), fibroblasts(9), neutrophils(13), skeletal muscle cells(14

  1. Objective and quantitative analysis of daytime sleepiness in physicians after night duties.

    PubMed

    Wilhelm, Barbara J; Widmann, Anja; Durst, Wilhelm; Heine, Christian; Otto, Gerhard

    2009-06-01

    Workplace studies often have the disadvantage of lacking objective data that are less prone to subject bias. The aim of this study was to contribute objective data to the discussion about safety aspects of night shifts in physicians. For this purpose we applied the Pupillographic Sleepiness Test (PST). The PST allows recording and analysis of pupillary sleepiness-related oscillations in darkness for 11 min in the sitting subject. The parameter of evaluation is the Pupillary Unrest Index (PUI; mm/min). For statistical analysis the natural logarithm of this parameter is used (lnPUI). Thirty-four physicians were examined by the PST and subjective scales during the first half of the day. Data taken during a day work period (D) were compared to those taken directly after night duty (N) by a Wilcoxon signed rank test. Night duty caused a mean sleep reduction of 3 h (difference N-D: median 3 h, minimum 0 h, maximum 7 h, p < 0.001). Time since the last sleep period was about equal in both conditions (difference N-D: median -0.25 h, min. -4 h, max. 20 h, p = 0.2). The lnPUI was larger after night duty (difference N-D: median 0.19, min. -0.71, max. 1.29, p = 0.03). The increase in physiologically measured sleepiness correlated significantly with changes in subjective measures (PUI/SSS, Spearman rho 0.41, p = 0.02; PUI/VAS, Spearman rho 0.38, p = 0.02). Despite a mean sleep duration of 4 h, considerable sleepiness was found in physicians after nights on duty, implying lower safety levels both for patients (if physicians remain on duty) and for physicians while commuting home.
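    The statistics named in the record (a paired Wilcoxon signed-rank test on lnPUI and Spearman correlations with subjective scales) can be illustrated with SciPy on synthetic data, as in this hedged sketch; the numbers do not reproduce the study's measurements.

    ```python
    import numpy as np
    from scipy.stats import wilcoxon, spearmanr

    rng = np.random.default_rng(11)
    n = 34
    ln_pui_day = rng.normal(0.4, 0.3, n)
    ln_pui_night = ln_pui_day + rng.normal(0.19, 0.35, n)   # shift mimics higher sleepiness
    sss_change = 0.8 * (ln_pui_night - ln_pui_day) + rng.normal(0, 0.3, n)

    stat, p_paired = wilcoxon(ln_pui_night, ln_pui_day)      # paired, two-sided
    rho, p_rho = spearmanr(ln_pui_night - ln_pui_day, sss_change)
    print(f"Wilcoxon p = {p_paired:.3f}; Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
    ```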

  2. Quantitative evaluation of malignant gliomas damage induced by photoactivation of IR700 dye

    NASA Astrophysics Data System (ADS)

    Sakuma, Morito; Kita, Sayaka; Higuchi, Hideo

    2016-01-01

    The processes involved in malignant glioma damage were quantitatively evaluated by microscopy. The near-infrared fluorescent dye IR700 conjugated to an anti-CD133 antibody (IR700-CD133) specifically targets malignant gliomas (U87MG) and stem cells (BT142) and is endocytosed into the cells. The gliomas are then photodamaged by the release of reactive oxygen species (ROS) and the heat induced by illumination of IR700 with a red laser, and the motility of the vesicles within these cells is altered as a result of cellular damage. To investigate these changes in motility, we developed a new method that measures fluctuations in the intensity of phase-contrast images obtained from small areas within cells. The intensity fluctuation in U87MG cells gradually decreased as cell damage progressed, whereas the fluctuation in BT142 cells increased. The endocytosed IR700 dye was co-localized in acidic organelles such as endosomes and lysosomes. The pH in U87MG cells, as monitored by a pH indicator, was decreased and then gradually increased by the illumination of IR700, while the pH in BT142 cells increased monotonically. In these experiments, the processes of cell damage were quantitatively evaluated according to the motility of vesicles and changes in pH.
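    The fluctuation measure is described only in outline. A minimal sketch of the general idea, assuming a phase-contrast time stack and summarizing the temporal standard deviation of the mean intensity in a small square region, is shown below; the region size and the use of a plain standard deviation are assumptions.

    ```python
    import numpy as np

    def roi_intensity_fluctuation(stack: np.ndarray, row: int, col: int, size: int = 8) -> float:
        """Temporal fluctuation (standard deviation over frames) of the mean
        intensity inside a small square region of a time stack shaped
        (frames, height, width)."""
        roi = stack[:, row:row + size, col:col + size]
        per_frame_mean = roi.reshape(stack.shape[0], -1).mean(axis=1)
        return float(per_frame_mean.std())

    rng = np.random.default_rng(5)
    frames = rng.normal(100, 2, (120, 128, 128))   # synthetic phase-contrast time-lapse stack
    print(f"fluctuation = {roi_intensity_fluctuation(frames, 40, 60):.3f}")
    ```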

  3. Evaluating Attitudes, Skill, and Performance in a Learning-Enhanced Quantitative Methods Course: A Structural Modeling Approach.

    ERIC Educational Resources Information Center

    Harlow, Lisa L.; Burkholder, Gary J.; Morrow, Jennifer A.

    2002-01-01

    Used a structural modeling approach to evaluate relations among attitudes, initial skills, and performance in a Quantitative Methods course that involved students in active learning. Results largely confirmed hypotheses offering support for educational reform efforts that propose actively involving students in the learning process, especially in…

  4. A Simple Configuration for Quantitative Phase Contrast Microscopy of Transmissible Samples

    NASA Astrophysics Data System (ADS)

    Sengupta, Chandan; Dasgupta, Koustav; Bhattacharya, K.

    Phase microscopy attempts to visualize and quantify the phase distribution of samples which are otherwise invisible under a microscope without the use of stains. The two principal approaches to phase microscopy are essentially those of Fourier plane modulation and interferometric techniques. Although the former, first proposed by Zernike, had been the harbinger of phase microscopy, it was the latter that allowed for quantitative evaluation of phase samples. However, interferometric techniques are fraught with problems such as a complicated setup involving mirrors and beam-splitters, the need for a matched objective in the reference arm, and the need for vibration isolation. The present work proposes a single-element cube beam-splitter (CBS) interferometer combined with a microscope objective (MO) for interference microscopy. Because of the monolithic nature of the interferometer, the system is almost insensitive to vibrations and relatively simple to align. It will be shown that phase-shifting properties may also be introduced by suitable use of polarizing devices. Initial results showing the quantitative three-dimensional phase profiles of simulated and actual biological specimens are presented.
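
    The cited setup recovers quantitative phase via phase shifting introduced with polarizing optics. The abstract does not state which phase-shifting algorithm is used, so the following is only the common textbook four-step case, given as a reference point rather than as the authors' method. With four interferograms $I_1,\dots,I_4$ recorded at phase shifts of $0$, $\pi/2$, $\pi$ and $3\pi/2$,

        \[ I_k(x,y) = a(x,y) + b(x,y)\cos\!\left(\phi(x,y) + (k-1)\,\tfrac{\pi}{2}\right), \qquad \phi(x,y) = \arctan\frac{I_4(x,y) - I_2(x,y)}{I_1(x,y) - I_3(x,y)}, \]

    where $a$ is the background intensity, $b$ the fringe modulation, and $\phi$ the wrapped sample phase, which is subsequently unwrapped to remove the $2\pi$ ambiguities and converted to optical path length.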

  5. Quantitative Evaluation of Atherosclerotic Plaque Using Ultrasound Tissue Characterization.

    NASA Astrophysics Data System (ADS)

    Yigiter, Ersin

    Evaluation of therapeutic methods directed toward interrupting and/or delaying atherogenesis is impeded by the lack of a reliable, non-invasive means for monitoring progression or regression of disease. The ability to characterize the predominant component of plaque may be very valuable in the study of this disease's natural history. The earlier the lesion, the more likely lipid is to be the predominant component. Progression of plaque is usually by way of overgrowth of fibrous tissues around the fatty pool. Calcification is usually a feature of the older or complicated lesion. To explore the feasibility of using ultrasound to characterize plaque, we have conducted measurements of the acoustical properties of various atherosclerotic lesions found in freshly excised samples of human abdominal aorta. Our objective has been to determine whether or not the acoustical properties of plaque correlate with the type and/or chemical composition of plaque and, if so, to define a measurement scheme which could be done in vivo and non-invasively. Our current database consists of individual tissue samples from some 200 different aortas. Since each aorta yields between 10 and 30 tissue samples for study, we have data on some 4,468 different lesions or samples. Measurements of the acoustical properties of plaque were found to correlate well with the chemical composition of plaque. In short, measurements of impedance and attenuation seem sufficient to classify plaque as to type and to composition. Based on the in vitro studies, the parameter of attenuation was selected as a means of classifying the plaque. For these measurements, an intravascular ultrasound scanner was modified according to our specifications. Signal processing algorithms were developed which would analyze the complex ultrasound waveforms and estimate tissue properties such as attenuation. Various methods were tried to estimate the attenuation from the pulse-echo backscattered signal. Best results were obtained by

  6. WE-FG-207B-12: Quantitative Evaluation of a Spectral CT Scanner in a Phantom Study: Results of Spectral Reconstructions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, X; Arbique, G; Guild, J

    Purpose: To evaluate the quantitative image quality of spectral reconstructions of phantom data from a spectral CT scanner. Methods: The spectral CT scanner (IQon Spectral CT, Philips Healthcare) is equipped with a dual-layer detector and generates conventional 80-140 kVp images and a variety of spectral reconstructions, e.g., virtual monochromatic (VM) images, virtual non-contrast (VNC) images, iodine maps, and effective atomic number (Z) images. A cylindrical solid water phantom (Gammex 472, 33 cm diameter and 5 cm thick) with iodine (2.0-20.0 mg I/ml) and calcium (50-600 mg/ml) rod inserts was scanned at 120 kVp and 27 mGy CTDIvol. Spectral reconstructions were evaluated by comparing image measurements with theoretical values calculated from nominal rod compositions provided by the phantom manufacturer. The theoretical VNC was calculated using water and iodine basis material decomposition, and the theoretical Z was calculated using two common methods, the chemical formula method (Z1) and the dual-energy ratio method (Z2). Results: Beam-hardening-like artifacts between high-attenuation calcium rods (≥300 mg/ml, >800 HU) influenced quantitative measurements, so the quantitative analysis was only performed on the iodine rods using the images from the scan with all the calcium rods removed. The CT numbers of the iodine rods in the VM images (50∼150 keV) were close to theoretical values, with an average difference of 2.4±6.9 HU. Compared with theoretical values, the average differences for iodine concentration, VNC CT number and effective Z of the iodine rods were −0.10±0.38 mg/ml, −0.1±8.2 HU, 0.25±0.06 (Z1) and −0.23±0.07 (Z2). Conclusion: The results indicate that the spectral CT scanner generates quantitatively accurate spectral reconstructions at clinically relevant iodine concentrations. Beam-hardening-like artifacts still exist when high-attenuation objects are present and their impact on patient images needs further investigation. YY is an employee of
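
    For orientation, the chemical formula method (Z1) mentioned above is usually the classical power-law mixture rule; the exact convention and exponent used by the scanner software are not given in the abstract, so the expression below, with the commonly quoted exponent $p \approx 2.94$, should be read as an assumption rather than the vendor's definition:

        \[ Z_\mathrm{eff} = \Big( \sum_i \alpha_i\, Z_i^{\,p} \Big)^{1/p}, \qquad \alpha_i = \frac{n_i Z_i}{\sum_j n_j Z_j}, \]

    where $n_i$ is the number of atoms of element $i$ in the compound and $\alpha_i$ its fractional electron contribution. The dual-energy ratio method (Z2) typically infers $Z_\mathrm{eff}$ from the ratio of the attenuation measured at the two effective energies, read off a calibration curve of known materials.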

  7. Using reusable learning objects (RLOs) in wound care education: Undergraduate student nurse's evaluation of their learning gain.

    PubMed

    Redmond, Catherine; Davies, Carmel; Cornally, Deirdre; Adam, Ewa; Daly, Orla; Fegan, Marianne; O'Toole, Margaret

    2018-01-01

    Both nationally and internationally concerns have been expressed over the adequacy of preparation of undergraduate nurses for the clinical skill of wound care. This project describes the educational evaluation of a series of Reusable Learning Objects (RLOs) as a blended learning approach to facilitate undergraduate nursing students learning of wound care for competence development. Constructivism Learning Theory and Cognitive Theory of Multimedia Learning informed the design of the RLOs, promoting active learner approaches. Clinically based case studies and visual data from two large university teaching hospitals provided the authentic learning materials required. Interactive exercises and formative feedback were incorporated into the educational resource. Evaluation of student perceived learning gains in terms of knowledge, ability and attitudes were measured using a quantitative pre and posttest Wound Care Competency Outcomes Questionnaire. The RLO CETL Questionnaire was used to identify perceived learning enablers. Statistical and deductive thematic analyses inform the findings. Students (n=192) reported that their ability to meet the competency outcomes for wound care had increased significantly after engaging with the RLOs. Students rated the RLOs highly across all categories of perceived usefulness, impact, access and integration. These findings provide evidence that the use of RLOs for both knowledge-based and performance-based learning is effective. RLOs when designed using clinically real case scenarios reflect the true complexities of wound care and offer innovative interventions in nursing curricula. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. WE-AB-207A-12: HLCC Based Quantitative Evaluation Method of Image Artifact in Dental CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Y; Wu, S; Qi, H

    Purpose: Image artifacts are usually evaluated qualitatively via visual observation of the reconstructed images, which is susceptible to subjective factors due to the lack of an objective evaluation criterion. In this work, we propose a Helgason-Ludwig consistency condition (HLCC) based evaluation method to quantify the severity level of different image artifacts in dental CBCT. Methods: Our evaluation method consists of four steps: 1) Acquire cone-beam CT (CBCT) projections; 2) Convert the 3D CBCT projection to a fan-beam projection by extracting its central-plane projection; 3) Convert the fan-beam projection to a parallel-beam projection utilizing a sinogram-based rebinning algorithm or a detail-based rebinning algorithm; 4) Obtain the HLCC profile by integrating the parallel-beam projection per view, and calculate the wave percentage and variance of the HLCC profile, which can be used to describe the severity level of image artifacts. Results: Several sets of dental CBCT projections containing only one type of artifact (i.e., geometry, scatter, beam hardening, lag or noise artifact) were simulated using gDRR, a GPU tool developed for efficient, accurate, and realistic simulation of CBCT projections. These simulated CBCT projections were used to test our proposed method. The HLCC profile wave percentage and variance induced by geometry distortion are about 3∼21 times and 16∼393 times as large as those of the artifact-free projection, respectively. The increase factors of wave percentage and variance are 6 and 133 times for beam hardening, 19 and 1184 times for scatter, and 4 and 16 times for lag artifacts, respectively. In contrast, for the noisy projection the wave percentage, variance and inconsistency level are almost the same as those of the noise-free one. Conclusion: We have proposed a quantitative evaluation method of image artifact based on HLCC theory. According to our simulation results, the severity of different artifact types is found to be in the following order: scatter
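
    The fourth step above, turning a rebinned parallel-beam sinogram into an HLCC profile and then into scalar artifact scores, can be sketched in a few lines: the zeroth-order Helgason-Ludwig condition states that the total integral of a parallel-beam projection is the same for every view. "Wave percentage" is not defined in the abstract, so the Python sketch below assumes it to be the peak-to-peak variation of the profile relative to its mean; this is an illustrative reading, not necessarily the authors' definition.

        import numpy as np

        def hlcc_scores(parallel_sinogram):
            """Artifact scores from the zeroth-order HLCC.

            parallel_sinogram : ndarray of shape (n_views, n_bins),
                rebinned parallel-beam projections.
            Returns (wave_percentage, variance) of the per-view total
            projection integral, which is constant for consistent data.
            """
            profile = parallel_sinogram.sum(axis=1)   # HLCC profile: one value per view
            wave_percentage = 100.0 * (profile.max() - profile.min()) / profile.mean()
            variance = profile.var()
            return wave_percentage, variance

        # Consistent (artifact-free) data give a flat profile; perturbed data do not.
        clean = np.ones((360, 512)) * 0.5
        distorted = clean * (1.0 + 0.05 * np.sin(np.linspace(0, 2*np.pi, 360)))[:, None]
        print(hlcc_scores(clean))      # (0.0, 0.0)
        print(hlcc_scores(distorted))  # clearly non-zero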

  9. Defining competency-based evaluation objectives in family medicine

    PubMed Central

    Allen, Tim; Brailovsky, Carlos; Rainsberry, Paul; Lawrence, Katherine; Crichton, Tom; Carpentier, Marie-Pierre; Visser, Shaun

    2011-01-01

    Abstract Objective To develop a definition of competence in family medicine sufficient to guide a review of Certification examinations by the Board of Examiners of the College of Family Physicians of Canada. Design Delphi analysis of responses to a 4-question postal survey. Setting Canadian family practice. Participants A total of 302 family physicians who have served as examiners for the College of Family Physicians of Canada’s Certification examination. Methods A survey comprising 4 short-answer questions was mailed to the 302 participating family physicians asking them to list elements that define competence in family medicine among newly certified family physicians beginning independent practice. Two expert groups used a modified Delphi consensus process to analyze responses and generate 2 basic components of this definition of competence: first, the problems that a newly practising family physician should be competent to handle; second, the qualities, behaviour, and skills that characterize competence at the start of independent practice. Main findings Response rate was 54%; total number of elements among all responses was 5077, for an average 31 per respondent. Of the elements, 2676 were topics or clinical situations to be dealt with; the other 2401 were skills, behaviour patterns, or qualities, without reference to a specific clinical problem. The expert groups identified 6 essential skills, the phases of the clinical encounter, and 99 priority topics as the descriptors used by the respondents. More than 20% of respondents cited 30 of the topics. Conclusion Family physicians define the domain of competence in family medicine in terms of 6 essential skills, the phases of the clinical encounter, and priority topics. This survey represents the first level of definition of evaluation objectives in family medicine. Definition of the interactions among these elements will permit these objectives to become detailed enough to effectively guide assessment. PMID

  10. [Comparison of two quantitative methods of endobronchial ultrasound real-time elastography for evaluating intrathoracic lymph nodes].

    PubMed

    Mao, X W; Yang, J Y; Zheng, X X; Wang, L; Zhu, L; Li, Y; Xiong, H K; Sun, J Y

    2017-06-12

    Objective: To compare the clinical value of two quantitative methods in analyzing endobronchial ultrasound real-time elastography (EBUS-RTE) images for evaluating intrathoracic lymph nodes. Methods: From January 2014 to April 2014, EBUS-RTE examination was performed in patients who received EBUS-TBNA examination in Shanghai Chest Hospital. Each intrathoracic lymph node had a selected EBUS-RTE image. The stiff area ratio and mean hue value of the region of interest (ROI) in each image were calculated respectively. The final diagnosis of each lymph node was based on the pathologic/microbiologic results of EBUS-TBNA, pathologic/microbiologic results of other examinations and clinical follow-up. The sensitivity, specificity, positive predictive value, negative predictive value and accuracy were evaluated for distinguishing malignant and benign lesions. Results: Fifty-six patients and 68 lymph nodes were enrolled in this study, of which 35 lymph nodes were malignant and 33 lymph nodes were benign. The stiff area ratio and mean hue value of benign and malignant lesions were 0.32±0.29, 0.62±0.20 and 109.99±28.13, 141.62±17.52, respectively, and statistically significant differences were found for both methods (t = -5.14, P < 0.01; t = -5.53, P < 0.01). The areas under the curve were 0.813 and 0.814 for the stiff area ratio and mean hue value, respectively. The optimal diagnostic cut-off value of the stiff area ratio was 0.48, and the sensitivity, specificity, positive predictive value, negative predictive value and accuracy were 82.86%, 81.82%, 82.86%, 81.82% and 82.35%, respectively. The optimal diagnostic cut-off value of the mean hue value was 126.28, and the sensitivity, specificity, positive predictive value, negative predictive value and accuracy were 85.71%, 75.76%, 78.95%, 83.33% and 80.88%, respectively. Conclusion: Both the stiff area ratio and mean hue value methods can be used for analyzing EBUS-RTE images quantitatively, having the value of differentiating benign and malignant intrathoracic
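
    The diagnostic statistics reported above (AUC, optimal cut-off, sensitivity, specificity, predictive values, accuracy) follow directly from the per-node measurements and the final diagnoses. A minimal Python sketch, assuming arrays of stiff-area-ratio values and binary malignancy labels; the data and the Youden-index choice of cut-off are illustrative, not taken from the study.

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        ratio = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.80, 0.30, 0.62])  # stiff area ratio per node
        malignant = np.array([0, 0, 0, 1, 1, 1, 0, 1])                      # final diagnosis (1 = malignant)

        auc = roc_auc_score(malignant, ratio)
        fpr, tpr, thresholds = roc_curve(malignant, ratio)
        cutoff = thresholds[np.argmax(tpr - fpr)]        # Youden-index optimal cut-off

        pred = ratio >= cutoff
        tp = np.sum(pred & (malignant == 1)); fn = np.sum(~pred & (malignant == 1))
        tn = np.sum(~pred & (malignant == 0)); fp = np.sum(pred & (malignant == 0))
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv, npv = tp / (tp + fp), tn / (tn + fn)
        accuracy = (tp + tn) / len(malignant)
        print(auc, cutoff, sensitivity, specificity, ppv, npv, accuracy)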

  11. Application of Organosilane Monolayer Template to Quantitative Evaluation of Cancer Cell Adhesive Ability

    NASA Astrophysics Data System (ADS)

    Tanii, Takashi; Sasaki, Kosuke; Ichisawa, Kota; Demura, Takanori; Beppu, Yuichi; Vu, Hoan Anh; Thanh Chi, Hoan; Yamamoto, Hideaki; Sato, Yuko

    2011-06-01

    The adhesive ability of two human pancreatic cancer cell lines was evaluated using organosilane monolayer templates (OMTs). Using the OMT, the spreading area of adhered cells can be limited, and this enables us to focus on the initial attachment process of adhesion. Moreover, it becomes possible to arrange the cells in an array and to quantitatively evaluate the number of attached cells. The adhesive ability of the cancer cells cultured on the OMT was controlled by adding (-)-epigallocatechin-3-gallate (EGCG), which blocks a receptor that mediates cell adhesion and is overexpressed in cancer cells. Measurement of the relative ability of the cancer cells to attach to the OMT revealed that the ability for attachment decreased with increasing EGCG concentration. The results agreed well with the western blot analysis, indicating that the OMT can potentially be employed to evaluate the adhesive ability of various cancer cells.

  12. Object-oriented Persistent Homology

    PubMed Central

    Wang, Bao; Wei, Guo-Wei

    2015-01-01

    Persistent homology provides a new approach for the topological simplification of big data via measuring the lifetime of intrinsic topological features in a filtration process and has found its success in scientific and engineering applications. However, such a success is essentially limited to qualitative data classification and analysis. Indeed, persistent homology has rarely been employed for quantitative modeling and prediction. Additionally, the present persistent homology is a passive tool, rather than a proactive technique, for classification and analysis. In this work, we outline a general protocol to construct object-oriented persistent homology methods. By means of the differential geometry theory of surfaces, we construct an objective functional, namely, a surface free energy defined on the data of interest. The minimization of the objective functional leads to a Laplace-Beltrami operator which generates a multiscale representation of the initial data and offers an objective-oriented filtration process. The resulting differential geometry based object-oriented persistent homology is able to preserve desirable geometric features in the evolutionary filtration and enhances the corresponding topological persistence. The cubical complex based homology algorithm is employed in the present work to be compatible with the Cartesian representation of the Laplace-Beltrami flow. The proposed Laplace-Beltrami flow based persistent homology method is extensively validated. The consistency between Laplace-Beltrami flow based filtration and Euclidean distance based filtration is confirmed on the Vietoris-Rips complex for a large number of numerical tests. The convergence and reliability of the present Laplace-Beltrami flow based cubical complex filtration approach are analyzed over various spatial and temporal mesh sizes. The Laplace-Beltrami flow based persistent homology approach is utilized to study the intrinsic topology of proteins and fullerene molecules. Based on a

  13. Objective evaluation of choroidal melanin contents with polarization-sensitive optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Miura, Masahiro; Makita, Shuichi; Yasuno, Yoshiaki; Ikuno, Yasushi; Uematsu, Sato; Iwasaki, Takuya; Goto, Hiroshi

    2018-02-01

    We non-invasively evaluated choroidal melanin contents in human eyes with PS-OCT. We calculated the percentage area of low DOPU in the choroidal interstitial stroma for Vogt-Koyanagi-Harada disease with sunset glow fundus, Vogt-Koyanagi-Harada disease without sunset glow fundus, a control group, and tessellated fundus with high myopia. The mean percentage area of low DOPU in the sunset group was significantly lower than in the other groups. PS-OCT provides an objective, in vivo evaluation of choroidal melanin loss in human eyes.

  14. Reading K-3. Instructional Objectives Exchange. A Project of the Center for the Study of Evaluation.

    ERIC Educational Resources Information Center

    California Univ., Los Angeles. Center for the Study of Evaluation.

    Three hundred and ninety-seven objectives and related evaluation items for reading in grades kindergarten through three are presented for the teacher and administrator in this collection developed by the Instructional Objectives Exchange (IOX). The objectives are organized into the categories of word recognition, comprehension, and study skills,…

  15. An Objective Evaluation of Mass Scaling Techniques Utilizing Computational Human Body Finite Element Models.

    PubMed

    Davis, Matthew L; Scott Gayzik, F

    2016-10-01

    Biofidelity response corridors developed from post-mortem human subjects are commonly used in the design and validation of anthropomorphic test devices and computational human body models (HBMs). Typically, corridors are derived from a diverse pool of biomechanical data and later normalized to a target body habitus. The objective of this study was to use morphed computational HBMs to compare the ability of various scaling techniques to scale response data from a reference to a target anthropometry. HBMs are ideally suited for this type of study since they uphold the assumptions of equal density and modulus that are implicit in scaling method development. In total, six scaling procedures were evaluated, four from the literature (equal-stress equal-velocity, ESEV, and three variations of impulse momentum) and two that are introduced in the paper (ESEV using a ratio of effective masses, ESEV-EffMass, and a kinetic energy approach). In total, 24 simulations were performed, representing both pendulum and full-body impacts for three representative HBMs. These simulations were quantitatively compared using the International Organization for Standardization (ISO) TS18571 standard. Based on these results, ESEV-EffMass achieved the highest overall similarity score (indicating that it is most proficient at scaling a reference response to a target). Additionally, ESEV was found to perform poorly for two degree-of-freedom (DOF) systems. However, the results also indicated that no single technique was clearly the most appropriate for all scenarios.
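
    As background for the scaling nomenclature above, the classical equal-stress equal-velocity (ESEV) procedure rescales a reference response with a single length ratio derived from the subject masses, under the equal-density and equal-modulus assumptions the study exploits; the paper's own variants (effective-mass and kinetic-energy based) are not reproduced here. With reference mass $m_r$ and target mass $m_t$,

        \[ \lambda = \left(\frac{m_t}{m_r}\right)^{1/3}, \qquad t_t = \lambda\, t_r, \qquad d_t = \lambda\, d_r, \qquad F_t = \lambda^{2} F_r, \qquad a_t = \lambda^{-1} a_r, \]

    so that velocities and stresses are preserved while time, displacement, force and acceleration are rescaled by powers of $\lambda$.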

  16. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this

  17. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  18. Objective evaluation of the knocking sound of a diesel engine considering the temporal and frequency masking effect simultaneously

    NASA Astrophysics Data System (ADS)

    Yun, Dong-Un; Lee, Sang-Kwon

    2017-06-01

    In this paper, we present a novel method for the objective evaluation of knocking noise emitted by diesel engines based on temporal and frequency masking theory. The knocking sound of a diesel engine is a vibro-acoustic sound correlated with the high-frequency resonances of the engine structure and a periodic impulsive sound with amplitude modulation. Its period is related to the engine speed, and the sound contains specific frequency bands related to the resonances of the engine structure. A knocking sound with the characteristics of a high-frequency impulsive wave can be masked by low-frequency sounds correlated with the harmonics of the firing frequency and by broadband noise. The degree of modulation of the knocking sound signal was used for such objective evaluations in previous studies, without considering the masking effect. However, the frequency masking effect must be considered for the objective evaluation of the knocking sound. In addition to the frequency masking effect, the temporal masking effect occurs because the period of the knocking sound changes according to the engine speed. Therefore, an evaluation method considering the temporal and frequency masking effects is required to analyze the knocking sound objectively. In this study, an objective evaluation method considering the masking effect was developed based on the masking theory of sound and signal processing techniques. The method was applied successfully for the objective evaluation of the knocking sound of a diesel engine.

  19. The role of objective personality inventories in suicide risk assessment: an evaluation and proposal.

    PubMed

    Johnson, W B; Lall, R; Bongar, B; Nordlund, M D

    1999-01-01

    Objective personality assessment instruments offer a comparatively underutilized source of clinical data in attempts to evaluate and predict risk for suicide. In contrast to focal suicide risk measures, global personality inventories may be useful in identification of long-standing styles that predispose persons to eventual suicidal behavior. This article reviews the empirical literature regarding the efficacy of established personality inventories in predicting suicidality. The authors offer several recommendations for future research with these measures and conclude that such objective personality instruments offer only marginal utility as sources of clinical information in comprehensive suicide risk evaluations. Personality inventories may offer greatest utility in long-term assessment of suicide risk.

  20. [Study on the quantitative evaluation on the degree of TCM basic syndromes often encountered in patients with primary liver cancer].

    PubMed

    Li, Dong-tao; Ling, Chang-quan; Zhu, De-zeng

    2007-07-01

    To establish a quantitative model for evaluating the degree of the TCM basic syndromes often encountered in patients with primary liver cancer (PLC). Medical literature concerning the clinical investigation and TCM syndromes of PLC was collected and analyzed using an expert symposium method, and 100-mm scaling was applied in combination with scoring of symptom degree to establish a quantitative criterion for classifying the degree of symptoms and signs in patients with PLC. Two models, i.e. the additive model and the additive-multiplicative model, were established by using the analytic hierarchy process (AHP) as the mathematical tool to estimate the weights assigned by specialists to the criteria for evaluating basic syndromes in the various layers. The two models were then verified in clinical practice and the outcomes were compared with the fuzzy evaluations made by specialists. Verification on 459 case-times of PLC showed that the coincidence rate between the outcomes derived from specialists and those from the additive model was 84.53%, and with those from the additive-multiplicative model was 62.75%; the difference between the two was statistically significant (P<0.01). It was concluded that the additive model is the principal model suitable for quantitative evaluation of the degree of TCM basic syndromes in patients with PLC.
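
    The two scoring models compared above can be written compactly; the symbols below are illustrative, since the paper's AHP weighting is layered and its exact additive-multiplicative form is not given in the abstract. With $x_i$ the graded degree of symptom or sign $i$ (from the 100-mm scaling) and $w_i$ its AHP-derived weight, the additive model aggregates linearly, while one common additive-multiplicative form multiplies weighted sums taken over indicator groups $G_k$:

        \[ S_\text{add} = \sum_{i=1}^{n} w_i x_i, \qquad S_\text{add-mult} = \prod_{k=1}^{K} \Big( \sum_{i \in G_k} w_i x_i \Big). \]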

  1. Framework for objective evaluation of privacy filters

    NASA Astrophysics Data System (ADS)

    Korshunov, Pavel; Melle, Andrea; Dugelay, Jean-Luc; Ebrahimi, Touradj

    2013-09-01

    Extensive adoption of video surveillance, affecting many aspects of our daily lives, alarms the public about the increasing invasion of personal privacy. To address these concerns, many tools have been proposed for the protection of personal privacy in image and video. However, little is understood regarding the effectiveness of such tools and especially their impact on the underlying surveillance tasks, leading to a tradeoff between the preservation of privacy offered by these tools and the intelligibility of activities under video surveillance. In this paper, we investigate this privacy-intelligibility tradeoff by proposing an objective framework for the evaluation of privacy filters. We apply the proposed framework to a use case where the privacy of people is protected by obscuring faces, assuming an automated video surveillance system. We used several popular privacy protection filters, such as blurring, pixelization, and masking, and applied them with varying strengths to people's faces from different public datasets of video surveillance footage. The accuracy of a face detection algorithm was used as a measure of intelligibility (a face should be detected to perform a surveillance task), and the accuracy of a face recognition algorithm as a measure of privacy (a specific person should not be identified). Under these conditions, after application of an ideal privacy protection tool, an obfuscated face would be visible as a face but would not be correctly identified by the recognition algorithm. The experiments demonstrate that, in general, an increase in the strength of the privacy filters under consideration leads to an increase in privacy (i.e., a reduction in recognition accuracy) and to a decrease in intelligibility (i.e., a reduction in detection accuracy). Masking was also shown to be the most favorable filter across all tested datasets.
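
    As one concrete example of the filters evaluated above, pixelization of a detected face region can be implemented by down- and up-sampling that region; the framework then compares a detector's hit rate (intelligibility) with a recognizer's identification rate (privacy) as the filter strength grows. The Python/OpenCV sketch below shows only the filtering step, with a hypothetical face box; the detector, recognizer and dataset handling are assumed to exist elsewhere.

        import cv2
        import numpy as np

        def pixelate_region(image, box, blocks=8):
            """Pixelate the rectangular region box = (x, y, w, h) of a BGR image.

            The strength of the privacy filter is controlled by `blocks`: fewer
            blocks means coarser pixelization, i.e. stronger privacy protection.
            """
            x, y, w, h = box
            face = image[y:y+h, x:x+w]
            small = cv2.resize(face, (blocks, blocks), interpolation=cv2.INTER_LINEAR)
            coarse = cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)
            out = image.copy()
            out[y:y+h, x:x+w] = coarse
            return out

        # Illustrative use on a synthetic frame with a hypothetical face box.
        frame = np.full((240, 320, 3), 128, dtype=np.uint8)
        protected = pixelate_region(frame, box=(100, 60, 80, 80), blocks=6)
        print(protected.shape)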

  2. Evaluation Processes Used to Assess the Effectiveness of Vocational-Technical Programs.

    ERIC Educational Resources Information Center

    Bruhns, Arthur E.

    Evaluation is quantitative or qualitative, the criteria determined by or given to the student. The criteria show how close he has come to the program's objectives and the ranking of individual performance. Vocational education programs susceptible to evaluation are listed and relevant evaluative techniques discussed. Graduate interviews concerning…

  3. Teaching Research and Practice Evaluation Skills to Graduate Social Work Students

    ERIC Educational Resources Information Center

    Wong, Stephen E.; Vakharia, Sheila P.

    2012-01-01

    Objective: The authors examined outcomes of a graduate course on evaluating social work practice that required students to use published research, quantitative measures, and single-system designs in a simulated practice evaluation project. Method: Practice evaluation projects from a typical class were analyzed for the number of research references…

  4. Quantitative evaluation of palatal bone thickness for the placement of orthodontic miniscrews in adults with different facial types

    PubMed Central

    Wang, Yunji; Qiu, Ye; Liu, Henglang; He, Jinlong; Fan, Xiaoping

    2017-01-01

    Objectives: To quantitatively evaluate palatal bone thickness in adults with different facial types using cone beam computed tomography (CBCT). Methods: The CBCT volumetric data of 123 adults (mean age, 26.8 years) collected between August 2014 and August 2016 was retrospectively studied. The subjects were divided into a low-angle group (39 subjects), a normal-angle group (48 subjects) and a high-angle group (36 subjects) based on facial types assigned by cephalometric radiography. The thickness of the palatal bone was assessed at designated points. A repeated-measure analysis of variance (rm-ANOVA) test was used to test the relationship between facial types and palatal bone thickness. Results: Compared to the low-angle group, the high-angle group had significantly thinner palatal bones (p<0.05), except for the anterior-midline, anterior-medial and middle-midline areas. Conclusion: The safest zone for the placement of microimplants is the anterior part of the paramedian palate. Clinicians should pay special attention to the probability of thinner bone plates and the risk of perforation in high-angle patients. PMID:28917071

  5. Evaluation of New Zealand's high-seas bottom trawl closures using predictive habitat models and quantitative risk assessment.

    PubMed

    Penney, Andrew J; Guinotte, John M

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) establishes three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine ecosystems (VMEs) likely to occur?; 2) What is the likelihood of fisheries interaction with these VMEs?; and 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost-benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. The distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost-benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for the protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from the closure of historically fished areas.

  6. Imaging Performance of Quantitative Transmission Ultrasound

    PubMed Central

    Lenox, Mark W.; Wiskin, James; Lewis, Matthew A.; Darrouzet, Stephen; Borup, David; Hsieh, Scott

    2015-01-01

    Quantitative Transmission Ultrasound (QTUS) is a tomographic transmission ultrasound modality that is capable of generating 3D speed-of-sound maps of objects in the field of view. It performs this measurement by propagating a plane wave through the medium from a transmitter on one side of a water tank to a high resolution receiver on the opposite side. This information is then used via inverse scattering to compute a speed map. In addition, the presence of reflection transducers allows the creation of a high resolution, spatially compounded reflection map that is natively coregistered to the speed map. A prototype QTUS system was evaluated for measurement and geometric accuracy as well as for the ability to correctly determine speed of sound. PMID:26604918

  7. Real Progress in Maryland: Student Learning Objectives and Teacher and Principal Evaluation

    ERIC Educational Resources Information Center

    Slotnik, William J.; Bugler, Daniel; Liang, Guodong

    2014-01-01

    The Maryland State Department of Education (MSDE) is making significant strides in guiding and supporting the implementation of Student Learning Objectives (SLOs) as well as a teacher and principal evaluation (TPE) system statewide. MSDE support focuses on helping districts prepare for full SLO implementation by providing technical assistance with…

  8. Evaluation of work zone speed limits : an objective and subjective analysis of work zones in Missouri.

    DOT National Transportation Integrated Search

    2011-02-01

    This study objectively and subjectively examined speed characteristics and driver compliance with the posted speed limit in Missouri work zones. The objective evaluation collected vehicle speeds from four work zones with different configurations ...

  9. One device, one equation: the simplest way to objectively evaluate psoriasis severity.

    PubMed

    Choi, Jae Woo; Kim, Bo Ri; Choi, Chong Won; Youn, Sang Woong

    2015-02-01

    The erythema, scale and thickness of psoriasis lesions could be converted to bioengineering parameters. An objective psoriasis severity assessment is advantageous in terms of accuracy and reproducibility over conventional severity assessment. We aimed to formulate an objective psoriasis severity index with a single bioengineering device that could substitute for the conventional subjective Psoriasis Severity Index. A linear regression analysis was performed to derive the formula, with the subjective Psoriasis Severity Index as the dependent variable and various bioengineering parameters determined from 157 psoriasis lesions as independent variables. The construct validity of the objective Psoriasis Severity Index was evaluated with an additional 30 psoriasis lesions through a Pearson correlation analysis. The formula is composed of hue and brightness, both of which can be obtained with a colorimeter alone. A very strong positive correlation was found between the objective and subjective psoriasis severity indexes. The objective Psoriasis Severity Index is a novel, practical and valid assessment method that can substitute for the conventional one. Combined with subjective area assessment, it could further replace the Psoriasis Area and Severity Index, which is currently the most popular. © 2014 Japanese Dermatological Association.
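
    The index described above is a linear regression of the clinician-scored severity on two colour parameters, hue and brightness. A minimal Python sketch follows, assuming per-lesion colorimeter readings and subjective scores; the data and the fitted coefficients are illustrative and are not those of the published formula.

        import numpy as np

        # Hypothetical per-lesion colorimeter readings and clinician scores.
        hue = np.array([30.0, 25.0, 20.0, 35.0, 28.0])
        brightness = np.array([55.0, 48.0, 40.0, 60.0, 50.0])
        subjective_score = np.array([4.0, 6.0, 9.0, 3.0, 5.0])

        # Ordinary least squares: score ~ b0 + b1*hue + b2*brightness.
        X = np.column_stack([np.ones_like(hue), hue, brightness])
        coef, *_ = np.linalg.lstsq(X, subjective_score, rcond=None)
        b0, b1, b2 = coef
        print(f"objective index = {b0:.2f} + {b1:.2f}*hue + {b2:.2f}*brightness")

        # Construct validity on new lesions would then be checked with, e.g.,
        # a Pearson correlation between this index and the subjective score.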

  10. Effectiveness evaluation of objective and subjective weighting methods for aquifer vulnerability assessment in urban context

    NASA Astrophysics Data System (ADS)

    Sahoo, Madhumita; Sahoo, Satiprasad; Dhar, Anirban; Pradhan, Biswajeet

    2016-10-01

    Groundwater vulnerability assessment has been an accepted practice to identify the zones with relatively increased potential for groundwater contamination. DRASTIC is the most popular secondary-information-based vulnerability assessment approach. The original DRASTIC approach considers the relative importance of features/sub-features based on subjective weighting/rating values. However, the variability of features at a smaller scale is not reflected in this subjective vulnerability assessment process. In contrast to the subjective approach, objective weighting-based methods provide flexibility in weight assignment depending on the variation of the local system. However, expert opinion is not directly considered in the objective weighting-based methods. Thus, the effectiveness of both subjective and objective weighting-based approaches needs to be evaluated. In the present study, three methods, the entropy information method (E-DRASTIC), the fuzzy pattern recognition method (F-DRASTIC) and single-parameter sensitivity analysis (SA-DRASTIC), were used to modify the weights of the original DRASTIC features to include local variability. Moreover, a grey incidence analysis was used to evaluate the relative performance of the subjective (DRASTIC and SA-DRASTIC) and objective (E-DRASTIC and F-DRASTIC) weighting-based methods. The performance of the developed methodology was tested in an urban area of Kanpur City, India. The relative performance of the subjective and objective methods varies with the choice of water quality parameters. This methodology can be applied with or without suitable modification. These evaluations establish the potential applicability of the methodology for general vulnerability assessment in an urban context.
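
    Of the three weighting schemes above, the entropy information method (E-DRASTIC) is the easiest to state compactly: features whose ratings vary more across the study area carry more information and receive larger weights. The Python sketch below follows that standard formulation and is not necessarily identical to the authors' implementation.

        import numpy as np

        def entropy_weights(ratings):
            """Objective feature weights by the entropy information method.

            ratings : ndarray of shape (n_cells, n_features), non-negative
                DRASTIC feature ratings for each grid cell.
            """
            n, m = ratings.shape
            p = ratings / ratings.sum(axis=0, keepdims=True)      # column-wise proportions
            with np.errstate(divide="ignore", invalid="ignore"):
                plogp = np.where(p > 0, p * np.log(p), 0.0)        # treat 0*log(0) as 0
            entropy = -plogp.sum(axis=0) / np.log(n)               # entropy of each feature
            degree = 1.0 - entropy                                 # degree of diversification
            return degree / degree.sum()                           # normalised weights

        ratings = np.array([[5, 3, 8], [7, 3, 2], [9, 3, 6], [4, 3, 9.0]])
        print(entropy_weights(ratings))   # the constant second feature gets ~zero weight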

  11. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

    Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber-bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best-performing pair of quantitative features was chosen based on its ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved an average sensitivity and specificity of 87% and 61%, respectively. The best-performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.

  12. Implementation and Evaluation of a Course Concept Based on Reusable Learning Objects

    ERIC Educational Resources Information Center

    Van Zele, Els; Vandaele, Pieter; Botteldooren, Dick; Lenaerts, Josephina

    2003-01-01

    This article describes the implementation and evaluation of a learning objects based computer aided system for an advanced engineering course at Ghent University, Belgium. A new syllabus concept was introduced: students had access to a Web-delivered component and received an identical printed component as two sources of information additional to…

  13. An electronic portfolio for quantitative assessment of surgical skills in undergraduate medical education.

    PubMed

    Sánchez Gómez, Serafín; Ostos, Elisa María Cabot; Solano, Juan Manuel Maza; Salado, Tomás Francisco Herrero

    2013-05-06

    We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summated to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Thirty-seven of 112 students (33%) used the e-Portfolio, of which 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. For complex skills, 62% of students achieved the maximum score for participation as an independent operator, and 98% achieved

  14. An Object-Based Approach to Evaluation of Climate Variability Projections and Predictions

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.

    2017-12-01

    Evaluations of the performance of earth system model predictions and projections are of critical importance to enhance usefulness of these products. Such evaluations need to address specific concerns depending on the system and decisions of interest; hence, evaluation tools must be tailored to inform about specific issues. Traditional approaches that summarize grid-based comparisons of analyses and models, or between current and future climate, often do not reveal important information about the models' performance (e.g., spatial or temporal displacements; the reason behind a poor score) and are unable to accommodate these specific information needs. For example, summary statistics such as the correlation coefficient or the mean-squared error provide minimal information to developers, users, and decision makers regarding what is "right" and "wrong" with a model. New spatial and temporal-spatial object-based tools from the field of weather forecast verification (where comparisons typically focus on much finer temporal and spatial scales) have been adapted to more completely answer some of the important earth system model evaluation questions. In particular, the Method for Object-based Diagnostic Evaluation (MODE) tool and its temporal (three-dimensional) extension (MODE-TD) have been adapted for these evaluations. More specifically, these tools can be used to address spatial and temporal displacements in projections of El Nino-related precipitation and/or temperature anomalies, ITCZ-associated precipitation areas, atmospheric rivers, seasonal sea-ice extent, and other features of interest. Examples of several applications of these tools in a climate context will be presented, using output of the CESM large ensemble. In general, these tools provide diagnostic information about model performance - accounting for spatial, temporal, and intensity differences - that cannot be achieved using traditional (scalar) model comparison approaches. Thus, they can provide more

  15. Development and Evaluation of Glycine max Germplasm Lines with Quantitative Resistance to Sclerotinia sclerotiorum

    PubMed Central

    McCaghey, Megan; Willbur, Jaime; Ranjan, Ashish; Grau, Craig R.; Chapman, Scott; Diers, Brian; Groves, Carol; Kabbage, Mehdi; Smith, Damon L.

    2017-01-01

    Sclerotinia sclerotiorum, the causal agent of Sclerotinia stem rot, is a devastating fungal pathogen of soybean that can cause significant yield losses to growers when environmental conditions are favorable for the disease. The development of resistant varieties has proven difficult. However, poor resistance in commercial cultivars can be improved through additional breeding efforts and understanding the genetic basis of resistance. The objective of this project was to develop soybean germplasm lines that have a high level of Sclerotinia stem rot resistance to be used directly as cultivars or in breeding programs as a source of improved Sclerotinia stem rot resistance. Sclerotinia stem rot-resistant soybean germplasm was developed by crossing two sources of resistance, W04-1002 and AxN-1-55, with lines exhibiting resistance to Heterodera glycines and Cadophora gregata in addition to favorable agronomic traits. Following greenhouse evaluations of 1,076 inbred lines derived from these crosses, 31 lines were evaluated for resistance in field tests during the 2014 field season. Subsequently, 11 Sclerotinia stem rot resistant breeding lines were moved forward for field evaluation in 2015, and seven elite breeding lines were selected and evaluated in the 2016 field season. To better understand resistance mechanisms, a marker analysis was conducted to identify quantitative trait loci linked to resistance. Thirteen markers associated with Sclerotinia stem rot resistance were identified on chromosomes 15, 16, 17, 18, and 19. Our markers confirm previously reported chromosomal regions associated with Sclerotinia stem rot resistance as well as a novel region of chromosome 16. The seven elite germplasm lines were also re-evaluated within a greenhouse setting using a cut petiole technique with multiple S. sclerotiorum isolates to test the durability of physiological resistance of the lines in a controlled environment. This work presents a novel and comprehensive classical

  16. Quantitative Laser Biospeckle Method for the Evaluation of the Activity of Trypanosoma cruzi Using VDRL Plates and Digital Analysis.

    PubMed

    Grassi, Hilda Cristina; García, Lisbette C; Lobo-Sulbarán, María Lorena; Velásquez, Ana; Andrades-Grassi, Francisco A; Cabrera, Humberto; Andrades-Grassi, Jesús E; Andrades, Efrén D J

    2016-12-01

    In this paper we report a quantitative laser Biospeckle method using VDRL plates to monitor the activity of Trypanosoma cruzi, together with the calibration conditions, including three image processing algorithms and three programs (ImageJ and two programs designed in this work). Benznidazole was used as a test drug. Variable volume (constant density) and variable density (constant volume) were used for the quantitative evaluation of parasite activity in calibrated wells of the VDRL plate. The desiccation process within the well was monitored as a function of volume and of the activity of the Biospeckle pattern of the parasites, as well as the quantitative effect of the surface parasite quantity (proportion of the object's plane). A statistical analysis was performed with ANOVA, Tukey post hoc tests and descriptive statistics using R and R Commander. Conditions of volume (100 μl) and parasite density (2-4x10^4 parasites/well, in exponential growth phase), assay time (up to 204 min), frame number (11 frames), algorithm and program (R Commander/SAGA) for image processing were selected to test the effect of variable concentrations of benznidazole (0.0195 to 20 μg/mL / 0.075 to 76.8 μM) at various times (1, 61, 128 and 204 min) on the activity of the Biospeckle pattern. The flat wells of the VDRL plate were found to be suitable for the quantitative calibration of the activity of Trypanosoma cruzi using the appropriate algorithm and program. Under these conditions, benznidazole produces at 1 min an instantaneous effect on the activity of the Biospeckle pattern of T. cruzi, which remains with a similar profile for up to 1 hour. A second effect, which is dependent on concentrations above 1.25 μg/mL and is statistically different from the effect at lower concentrations, causes a decrease in the activity of the Biospeckle pattern. This effect is better detected after 1 hour of drug action. This behavior may be explained by an instantaneous effect on a membrane protein of Trypanosoma cruzi that could
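
    A common way to quantify biospeckle activity of the kind used above is a temporal-difference approach: summing absolute frame-to-frame intensity changes across the image stack. The paper's own algorithms and programs are not reproduced here; the Python sketch below is a generic activity index under that assumption, with synthetic data standing in for real speckle frames.

        import numpy as np

        def biospeckle_activity(frames):
            """Mean absolute frame-to-frame intensity change of a speckle stack.

            frames : ndarray of shape (T, H, W) holding T consecutive speckle
                images of one VDRL well.  Higher values indicate more motile
                (more active) parasites; drug action shows up as a decrease.
            """
            diffs = np.abs(np.diff(frames.astype(float), axis=0))
            return diffs.mean()

        # Example: 11 frames (as in the calibration above) of synthetic speckle.
        rng = np.random.default_rng(7)
        active = rng.integers(0, 256, size=(11, 200, 200))
        frozen = np.repeat(active[:1], 11, axis=0)       # an immobile sample
        print(biospeckle_activity(active), biospeckle_activity(frozen))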

  17. Quantitative evaluation of protocorm growth and fungal colonization in Bletilla striata (Orchidaceae) reveals less-productive symbiosis with a non-native symbiotic fungus.

    PubMed

    Yamamoto, Tatsuki; Miura, Chihiro; Fuji, Masako; Nagata, Shotaro; Otani, Yuria; Yagame, Takahiro; Yamato, Masahide; Kaminaka, Hironori

    2017-02-21

    In nature, orchid plants depend completely on symbiotic fungi for their nutrition at the germination and subsequent seedling (protocorm) stages. However, only limited quantitative methods for evaluating the orchid-fungus interactions at the protocorm stage are currently available, which greatly constrains our understanding of the symbiosis. Here, we aimed to improve and integrate quantitative evaluations of the growth and fungal colonization in the protocorms of a terrestrial orchid, Bletilla striata, growing on a plate medium. We achieved both symbiotic and asymbiotic germination of the terrestrial orchid B. striata. The protocorms produced by the two germination methods grew almost synchronously for the first three weeks. At week four, however, the length was significantly lower in the symbiotic protocorms. Interestingly, the dry weight of the symbiotic protocorms did not significantly change during the growth period, which implies that there was only limited transfer of carbon compounds from the fungus to the protocorms in this relationship. Next, to evaluate the orchid-fungus interactions, we developed an ink-staining method to observe the hyphal coils in protocorms without preparing thin sections. Crushing the protocorm under the coverglass enables us to observe all hyphal coils in the protocorms with high resolution. For this observation, we established a criterion to categorize the stages of hyphal coils, depending on development and degradation. By counting the symbiotic cells within each stage, it was possible to quantitatively evaluate the orchid-fungus symbiosis. We describe a method for the quantitative evaluation of orchid-fungus symbiosis by integrating measurements of plant growth and fungal colonization. The current study revealed that although fungal colonization was observed in the symbiotic protocorms, the weight of the protocorms did not significantly increase, which is probably due to the incompatibility of the fungus in this symbiosis. These

  18. Systems and technologies for objective evaluation of technical skills in laparoscopic surgery.

    PubMed

    Sánchez-Margallo, Juan A; Sánchez-Margallo, Francisco M; Oropesa, Ignacio; Gómez, Enrique J

    2014-01-01

    Minimally invasive surgery is a highly demanding surgical approach regarding technical requirements for the surgeon, who must be trained in order to perform a safe surgical intervention. Traditional surgical education in minimally invasive surgery is commonly based on subjective criteria to quantify and evaluate surgical abilities, which could be potentially unsafe for the patient. Authors, surgeons and associations are increasingly demanding the development of more objective assessment tools that can accredit surgeons as technically competent. This paper describes the state of the art in objective assessment methods of surgical skills. It gives an overview on assessment systems based on structured checklists and rating scales, surgical simulators, and instrument motion analysis. As a future work, an objective and automatic assessment method of surgical skills should be standardized as a means towards proficiency-based curricula for training in laparoscopic surgery and its certification.

  19. Integrating regional conservation priorities for multiple objectives into national policy

    PubMed Central

    Beger, Maria; McGowan, Jennifer; Treml, Eric A.; Green, Alison L.; White, Alan T.; Wolff, Nicholas H.; Klein, Carissa J.; Mumby, Peter J.; Possingham, Hugh P.

    2015-01-01

    Multinational conservation initiatives that prioritize investment across a region invariably navigate trade-offs among multiple objectives. It seems logical to focus where several objectives can be achieved efficiently, but such multi-objective hotspots may be ecologically inappropriate, or politically inequitable. Here we devise a framework to facilitate a regionally cohesive set of marine-protected areas driven by national preferences and supported by quantitative conservation prioritization analyses, and illustrate it using the Coral Triangle Initiative. We identify areas important for achieving six objectives to address ecosystem representation, threatened fauna, connectivity and climate change. We expose trade-offs between areas that contribute substantially to several objectives and those meeting one or two objectives extremely well. Hence there are two strategies to guide countries choosing to implement regional goals nationally: multi-objective hotspots and complementary sets of single-objective priorities. This novel framework is applicable to any multilateral or global initiative seeking to apply quantitative information in decision making. PMID:26364769

  20. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging

    PubMed Central

    Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A.; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-01-01

    Background Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide the surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. Methods We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow’s disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparison was used to investigate the hallmarks associated to MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. Results On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Conclusions Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the

  1. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging.

    PubMed

    Sturla, Francesco; Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-04-01

    Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide the surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow's disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparison was used to investigate the hallmarks associated to MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the appropriate MV treatment.

  2. Quantitative evaluation of haze formation of koji and progression of internal haze by drying of koji during koji making.

    PubMed

    Ito, Kazunari; Gomi, Katsuya; Kariyama, Masahiro; Miyake, Tsuyoshi

    2017-07-01

    The construction of an experimental system that can mimic koji making in the manufacturing setting of a sake brewery is initially required for the quantitative evaluation of mycelia grown on/in koji pellets (haze formation). Koji making with rice was investigated with a solid-state fermentation (SSF) system using a non-airflow box (NAB), which produced uniform conditions in the culture substrate with high reproducibility and allowed for the control of favorable conditions in the substrate during culture. The SSF system using NAB accurately reproduced koji making in a manufacturing setting. To evaluate haze formation during koji making, surfaces and cross sections of koji pellets obtained from koji making tests were observed using a digital microscope. Image analysis was used to distinguish between haze and non-haze sections of koji pellets, enabling the evaluation of haze formation in a batch by measuring the haze rate of a specific number of koji pellets. This method allowed us to obtain continuous and quantitative data on the time course of haze formation. Moreover, drying koji during the late stage of koji making was revealed to cause further penetration of mycelia into koji pellets (internal haze). The koji making test with the SSF system using NAB and quantitative evaluation of haze formation in a batch by image analysis is a useful method for understanding the relations between haze formation and koji making conditions. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
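
    A minimal sketch of the image-analysis step described above: taking the haze rate of a pellet as the fraction of its cross-section classified as haze after a grayscale threshold, then averaging over the sampled pellets of a batch. The threshold value and the segmentation of pellet regions are assumptions for illustration, not the authors' procedure.

    ```python
    # Hedged sketch: per-pellet haze rate from a thresholded grayscale image,
    # and the batch haze rate as the mean over sampled pellets.
    import numpy as np

    def haze_rate(gray_pellet_image, pellet_mask, haze_threshold=180):
        """Fraction of pellet pixels at or above the (assumed) haze threshold."""
        pellet_pixels = gray_pellet_image[pellet_mask]
        return float((pellet_pixels >= haze_threshold).mean())

    def batch_haze_rate(rates):
        """Mean haze rate over the sampled pellets of one batch."""
        return float(np.mean(rates))
    ```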

  3. Thermal evaluation for exposed stone house with quantitative and qualitative approach in mountainous area, Wonosobo, Indonesia

    NASA Astrophysics Data System (ADS)

    Hermawan, Hermawan; Prianto, Eddy

    2017-12-01

    A building can be considered to have good thermal performance if it keeps its occupants comfortable. Thermal comfort can be assessed from the occupants' responses to the architectural elements and the environment, such as lighting, room crowding, air temperature, humidity, oxygen level, and occupant behaviour. The objective of this research is to analyse the thermal performance of houses with four different orientations in a mountainous area. The research was conducted on four exposed stone houses with different orientations on the slope of Sindoro Mountain, which has a relatively cool temperature of about 26°C. The measurements of the elements above were made quantitatively and qualitatively over 24 hours. The results are as follows. First, the most comfortable house is the west-oriented house. Second, the quantitative and qualitative observations show no significant difference (±5%). Third, the occupants' behaviours (caring and genen) are also factors influencing occupant comfort.

  4. A whole-cell bioreporter assay for quantitative genotoxicity evaluation of environmental samples.

    PubMed

    Jiang, Bo; Li, Guanghe; Xing, Yi; Zhang, Dayi; Jia, Jianli; Cui, Zhisong; Luan, Xiao; Tang, Hui

    2017-10-01

    Whole-cell bioreporters have emerged as promising tools for genotoxicity evaluation, due to their rapidity, cost-effectiveness, sensitivity and selectivity. In this study, a method for detecting genotoxicity in environmental samples was developed using the bioluminescent whole-cell bioreporter Escherichia coli recA::luxCDABE. To further test its performance in a real-world scenario, the E. coli bioreporter was applied in two cases: i) soil samples collected from chromium(VI) contaminated sites; ii) crude oil contaminated seawater collected after the Jiaozhou Bay oil spill that occurred in 2013. The chromium(VI) contaminated soils were pretreated by water extraction, and directly exposed to the bioreporter in two phases: aqueous soil extraction (water phase) and soil supernatant (solid phase). The results indicated that both extractable and soil-particle-fixed chromium(VI) were bioavailable to the bioreporter, and the solid-phase contact bioreporter assay provided a more precise evaluation of soil genotoxicity. For crude oil contaminated seawater, the response of the bioreporter clearly illustrated the spatial and temporal changes in genotoxicity around the spill site, suggesting that the crude oil degradation process decreased the genotoxic risk to the ecosystem. In addition, the performance of the bioreporter was simulated by a modified cross-regulation gene expression model, which quantitatively described the DNA damage response of the E. coli bioreporter. Accordingly, the bioluminescent response of the bioreporter was calculated as the mitomycin C equivalent, enabling quantitative comparison of genotoxicities between different environmental samples. This bioreporter assay provides a rapid and sensitive screening tool for direct genotoxicity assessment of environmental samples. Copyright © 2017. Published by Elsevier Ltd.
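
    A hedged illustration of the kind of calibration implied by the mitomycin C (MMC) equivalent: interpolating a sample's bioluminescent induction ratio onto an MMC dose-response curve measured with the same bioreporter. The calibration points and the use of simple linear interpolation are assumptions, not the authors' exact procedure.

    ```python
    # Illustrative MMC-equivalent calculation (assumed calibration approach).
    import numpy as np

    mmc_dose_ug_per_l = np.array([0.0, 1.0, 5.0, 10.0, 50.0])   # placeholder calibration doses
    induction_ratio = np.array([1.0, 1.4, 2.5, 3.8, 7.2])       # placeholder bioluminescence ratios

    def mmc_equivalent(sample_induction_ratio):
        """Genotoxicity of a sample expressed as the MMC dose giving the same response."""
        return float(np.interp(sample_induction_ratio, induction_ratio, mmc_dose_ug_per_l))

    print(mmc_equivalent(3.0))   # placeholder sample response
    ```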

  5. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography.

    PubMed

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A

    2014-03-01

    Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also verified. The maximum
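
    A minimal sketch of one of the Fourier-based IQ metrics mentioned above: an MTF estimate from an edge profile (ESF differentiated into an LSF, then the magnitude of its Fourier transform). Oversampling, windowing and the re-binning of sphere-edge profiles used in practice are omitted; this only illustrates the computation, not the authors' implementation.

    ```python
    # Hedged sketch: MTF from an edge spread function sampled across an edge.
    import numpy as np

    def mtf_from_edge_profile(esf, pixel_spacing_mm):
        lsf = np.gradient(np.asarray(esf, dtype=float))        # line spread function
        mtf = np.abs(np.fft.rfft(lsf))
        freqs = np.fft.rfftfreq(len(lsf), d=pixel_spacing_mm)   # spatial frequency, cycles/mm
        return freqs, mtf / mtf[0]                              # normalized so MTF(0) = 1
    ```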

  6. Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.

    PubMed

    He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan

    2009-07-01

    Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. Support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
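
    A minimal sketch of the main ingredients named above: uniform LBP histograms for the two facial halves, a resistor-average distance (RAD) as the asymmetry feature, and an SVM grading step. The region splitting, temporal block processing and the multiresolution LBP extension described in the paper are not reproduced; the neighbourhood parameters, the RAD formula and the placeholder data are assumptions.

    ```python
    # Hedged sketch: LBP-based asymmetry feature and SVM grading (not the authors' code).
    import numpy as np
    from skimage.feature import local_binary_pattern
    from sklearn.svm import SVC

    P, R = 8, 1  # neighbours and radius for uniform LBP (assumed values)

    def lbp_histogram(gray_region):
        """Normalized histogram of uniform LBP codes for one facial region."""
        codes = local_binary_pattern(gray_region, P, R, method="uniform")
        hist, _ = np.histogram(codes, bins=np.arange(P + 3), density=True)
        return hist + 1e-12  # avoid zeros in the KL terms below

    def resistor_average_distance(p, q):
        """RAD(p, q) = KL(p||q) * KL(q||p) / (KL(p||q) + KL(q||p))."""
        kl_pq = np.sum(p * np.log(p / q))
        kl_qp = np.sum(q * np.log(q / p))
        return (kl_pq * kl_qp) / (kl_pq + kl_qp)

    def asymmetry_feature(left_region, right_region):
        # Mirror the right half so corresponding regions line up before comparison.
        return resistor_average_distance(
            lbp_histogram(left_region), lbp_histogram(np.fliplr(right_region))
        )

    # Hypothetical training data: one asymmetry feature vector per video,
    # labelled with a House-Brackmann grade (1-6).
    X_train = np.random.rand(60, 5)           # placeholder feature vectors
    y_train = np.random.randint(1, 7, 60)     # placeholder H-B grades
    grader = SVC(kernel="rbf").fit(X_train, y_train)
    ```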

  7. Quantitative computed tomography and cranial burr holes: a model to evaluate the quality of cranial reconstruction in humans.

    PubMed

    Worm, Paulo Valdeci; Ferreira, Nelson Pires; Ferreira, Marcelo Paglioli; Kraemer, Jorge Luiz; Lenhardt, Rene; Alves, Ronnie Peterson Marcondes; Wunderlich, Ricardo Castilho; Collares, Marcus Vinicius Martins

    2012-05-01

    Current methods to evaluate the biologic development of bone grafts in human beings do not quantify results accurately. Cranial burr holes are standardized critical bone defects, and the differences between bone powder and bone grafts have been determined in numerous experimental studies. This study evaluated quantitative computed tomography (QCT) as a method to objectively measure cranial bone density after cranial reconstruction with autografts. In each of 8 patients, 2 of 4 surgical burr holes were reconstructed with autogenous wet bone powder collected during skull trephination, and the other 2 holes, with a circular cortical bone fragment removed from the inner table of the cranial bone flap. After 12 months, the reconstructed areas and a sample of normal bone were studied using three-dimensional QCT; bone density was measured in Hounsfield units (HU). Mean (SD) bone density was 1535.89 (141) HU for normal bone (P < 0.0001), 964 (176) HU for bone fragments, and 453 (241) HU for bone powder (P < 0.001). As expected, the density of the bone fragment graft was consistently greater than that of bone powder. Results confirm the accuracy and reproducibility of QCT, already demonstrated for bone in other locations, and suggest that it is an adequate tool to evaluate cranial reconstructions. The combination of QCT and cranial burr holes is an excellent model to accurately measure the quality of new bone in cranial reconstructions and also seems to be an appropriate choice of experimental model to clinically test any cranial bone or bone substitute reconstruction.

  8. Evaluation of Instructional Materials for Exceptional Children and Youth: A Preliminary Instrument.

    ERIC Educational Resources Information Center

    Eash, Maurice

    An instrument for the evaluation of instructional materials is presented. Evaluative items are arranged under four constructs: objectives, organization of material (both scope and sequence), methodology, and evaluation. A section is also provided for summary quantitative judgment. A glossary of terms used in the instrument is included. A training…

  9. Investigation of the feasibility of non-invasive optical sensors for the quantitative assessment of dehydration.

    PubMed

    Visser, Cobus; Kieser, Eduard; Dellimore, Kiran; van den Heever, Dawie; Smith, Johan

    2017-10-01

    This study explores the feasibility of prospectively assessing infant dehydration using four non-invasive, optical sensors based on the quantitative and objective measurement of various clinical markers of dehydration. The sensors were investigated to objectively and unobtrusively assess the hydration state of an infant based on the quantification of capillary refill time (CRT), skin recoil time (SRT), skin temperature profile (STP) and skin tissue hydration by means of infrared spectrometry (ISP). To evaluate the performance of the sensors a clinical study was conducted on a cohort of 10 infants (aged 6-36 months) with acute gastroenteritis. High sensitivity and specificity were exhibited by the sensors, in particular the STP and SRT sensors, when combined into a fusion regression model (sensitivity: 0.90, specificity: 0.78). The SRT and STP sensors and the fusion model all outperformed the commonly used "gold standard" clinical dehydration scales including the Gorelick scale (sensitivity: 0.56, specificity: 0.56), CDS scale (sensitivity: 1.0, specificity: 0.2) and WHO scale (sensitivity: 0.13, specificity: 0.79). These results suggest that objective and quantitative assessment of infant dehydration may be possible using the sensors investigated. However, further evaluation of the sensors on a larger sample population is needed before deploying them in a clinical setting. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  10. Objective Evaluation Tool for Texture-Modified Food (OET-TMF): Development of the Tool and Validation.

    PubMed

    Calleja-Fernández, Alicia; Pintor-de-la-Maza, Begoña; Vidal-Casariego, Alfonso; Cano-Rodríguez, Isidoro; Ballesteros-Pomar, María D

    2016-06-01

    Texture-modified diets (TMDs) should fulfil nutritional goals, guarantee homogenous texture, and meet food safety regulations. The food industry has created texture-modified food (TMF) that meets the TMD requirements of quality and safety for inpatients. To design and develop a tool that allows the objective selection of foodstuffs for TMDs that ensures nutritional requirements and swallowing safety of inpatients in order to improve their quality of life, especially regarding their food satisfaction. An evaluation tool was designed to objectively determine the adequacy of food included in the TMD menus of a hospital. The "Objective Evaluation Tool for Texture-Modified Food" (OET-TMF) consists of seven items that evaluate the food's nutritional quality (energy and protein input), presence of allergens, texture and viscosity, cooking, storage type, useful life, and patient acceptance. The total score ranged from 0 to 64 and was divided into four categories: high quality, good quality, medium quality, and low quality. Studying four different commercial TMFs contributed to the validation of the tool. All the evaluated products scored between high and good regarding quality. There was a tendency (p = 0.077) towards higher consumption and a higher overall quality of the product obtained with the OET-TMF. The product that scored highest with the tool was the best accepted; the product with the lowest score had the highest rate of refusal. The OET-TMF allows for the objective discrimination of the quality of TMF. In addition, it shows a certain relationship between the observed and assessed quality intake.

  11. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
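
    Since the coefficient of variation (CV) of the coating mass is the quantitative factor used for re-ranking criticality and risk priority, a tiny sketch of that calculation is given below. The simulated masses are made-up placeholders; the actual values would come from the pan-coater simulation.

    ```python
    # Illustrative only: CV of per-tablet coating mass as the uniformity metric.
    import numpy as np

    def coating_cv(coating_mass_per_tablet):
        m = np.asarray(coating_mass_per_tablet, dtype=float)
        return m.std(ddof=1) / m.mean()

    simulated_masses = np.random.normal(loc=5.0, scale=0.25, size=10_000)  # mg, placeholder
    print(f"coating mass CV = {coating_cv(simulated_masses):.3%}")
    ```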

  12. Miramar College Program Evaluation: Aviation Maintenance.

    ERIC Educational Resources Information Center

    Moriyama, Bruce; Brumley, Leslie

    Qualitative and quantitative data are presented in this evaluation of the curricular, personnel, and financial status of Miramar College's program in aviation maintenance. The report first provides the results of an interview with the program chairperson, which sought information on program objectives and goals and their determination, the extent…

  13. Efficacy of fluoride varnishes for preventing enamel demineralization after interproximal enamel reduction. Qualitative and quantitative evaluation

    PubMed Central

    González Paz, Belén Manuela; García López, José

    2017-01-01

    Objectives To evaluate quantitatively and qualitatively the changes produced to enamel after interproximal reduction and subjected to demineralization cycles, after applying a fluoride varnish (Profluorid) and a fluoride varnish containing tricalcium phosphate modified by fumaric acid (Clinpro White). Materials and methods 138 interproximal dental surfaces were divided into six groups: 1) Intact enamel; 2) Intact enamel + demineralization cycles (DC); 3) Interproximal Reduction (IR); 4) IR + DC; 5) IR + Profluorid + DC; 6) IR + Clinpro White + DC. IR was performed with a 0.5 mm cylindrical diamond bur. The weight percentage of calcium (Ca), phosphorous (P) and fluoride (F) were quantified by energy-dispersive X-ray spectrometry (EDX). Samples were examined under scanning electron microscopy (SEM). Results The weight percentage of Ca was significantly higher (p<0.05) in Groups 1, 2 and 5 than Groups 4 and 6. No significant differences were detected in the weight percentage of Ca between Group 3 and the other groups (p>0.05). The weight percentage of P was similar among all six groups (p>0.05). F was detected on 65% of Group 6 surfaces. SEM images of Groups 4 and 6 showed signs of demineralization, while Group 5 did not. Conclusions Profluorid application acts as a barrier against the demineralization of interproximally reduced enamel. PMID:28430810

  14. Quantitative Image Informatics for Cancer Research (QIICR) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Imaging has enormous untapped potential to improve cancer research through software to extract and process morphometric and functional biomarkers. In the era of non-cytotoxic treatment agents, multi-modality image-guided ablative therapies and rapidly evolving computational resources, quantitative imaging software can be transformative in enabling minimally invasive, objective and reproducible evaluation of cancer treatment response. Post-processing algorithms are integral to high-throughput analysis and fine-grained differentiation of multiple molecular targets.

  15. Qualitative analysis of student beliefs and attitudes after an objective structured clinical evaluation: implications for affective domain learning in undergraduate nursing education.

    PubMed

    Cazzell, Mary; Rodriguez, Amber

    2011-12-01

    This qualitative study explored the feelings, beliefs, and attitudes of senior-level undergraduate pediatric nursing students upon completion of a medication administration Objective Structured Clinical Evaluation (OSCE). The affective domain is the most neglected domain in higher education, although it is deemed the "gateway to learning." Quantitative assessments of clinical skills performed during OSCEs usually address two of the three domains of learning: cognitive (knowledge) and psychomotor skills. Twenty students volunteered to participate in focus groups (10 per group) and were asked three questions relevant to their feelings, beliefs, and attitudes about their OSCE experiences. Students integrated the attitude of safety first into future practice but felt that anxiety, loss of control, reaction under pressure, and no feedback affected their ability to connect the OSCE performance with future clinical practice. The findings affect future affective domain considerations in the development, modification, and assessment of OSCEs across the undergraduate nursing curriculum.

  16. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the asbestos content of rock matrices is a complex operation that is susceptible to substantial errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including better representativeness of the analyzed sample, more effective recognition of chrysotile, and lower cost. The DIATI LAA internal methodology for PCOM analysis is based on mild grinding of a rock sample, its subdivision into 5-6 grain-size classes smaller than 2 mm, and a subsequent microscopic analysis of a portion of each class. PCOM relies on the optical properties of asbestos and of the liquids of known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, unlike that of airborne filters, cannot be based on a statistical distribution. For airborne filters, a binomial (Poisson) distribution, which theoretically defines the variation in fiber counts obtained from analysis fields chosen randomly on the filter, can be applied. The analysis of rock matrices, by contrast, cannot rely on any statistical distribution, because the key quantities are the sizes of the asbestiform fibers and fiber bundles observed and the resulting ratio between the weight of the fibrous component and that of the granular component. The error estimates generally provided by public and private institutions range between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or link it to the asbestos content. Our work aims to provide a reliable estimate of the error in relation to the applied methodologies and to the total asbestos content, especially for values close to the legal limits. The error assessments must

  17. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    ERIC Educational Resources Information Center

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  18. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.

    PubMed

    Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun

    2016-12-01

    To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.
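
    A minimal sketch of the computer-assisted grayscale analysis mentioned above: echo intensity (EI) taken as the mean gray level (0-255) inside a region of interest on the ultrasound image. How the ROI is drawn and any scanner-specific correction are assumptions; this is illustrative only.

    ```python
    # Hedged sketch: echo intensity as the mean grayscale value inside an ROI mask.
    import numpy as np

    def echo_intensity(us_image, roi_mask):
        """Mean grayscale value of the ultrasound image inside the ROI mask."""
        pixels = us_image[roi_mask.astype(bool)]
        return float(pixels.mean())

    # Hypothetical usage with a placeholder image and rectangular ROI.
    img = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
    mask = np.zeros_like(img, dtype=bool)
    mask[200:300, 250:400] = True
    print(f"EI = {echo_intensity(img, mask):.1f}")
    ```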

  19. Overview of the JPEG XS objective evaluation procedures

    NASA Astrophysics Data System (ADS)

    Willème, Alexandre; Richter, Thomas; Rosewarne, Chris; Macq, Benoit

    2017-09-01

    JPEG XS is a standardization activity conducted by the Joint Photographic Experts Group (JPEG), formally known as the ISO/IEC SC29 WG1 group, that aims at standardizing a low-latency, lightweight and visually lossless video compression scheme. This codec is intended to be used in applications where image sequences would otherwise be transmitted or stored in uncompressed form, such as in live production (through SDI or IP transport), display links, or frame buffers. Support for compression ratios ranging from 2:1 to 6:1 allows significant bandwidth and power reduction for signal propagation. This paper describes the objective quality assessment procedures conducted as part of the JPEG XS standardization activity. Firstly, it discusses the objective part of the experiments that led to the technology selection during the 73rd WG1 meeting in late 2016. This assessment consists of PSNR measurements after single and multiple compression-decompression cycles at various compression ratios. After this assessment phase, two proposals among the six responses to the CfP were selected and merged to form the first JPEG XS test model (XSM). Later, the paper describes the core experiments (CEs) conducted so far on the XSM. These experiments are intended to evaluate its performance in more challenging scenarios, such as the insertion of picture overlays and robustness to frame editing, to assess the impact of the different algorithmic choices, and to measure the XSM performance using the HDR VDP metric.
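
    A minimal sketch of the PSNR measurement used in this kind of objective assessment, including the multi-generation case (PSNR after several successive compression-decompression cycles). The codec call is a placeholder; the JPEG XS reference software is not invoked here.

    ```python
    # Hedged sketch: PSNR after single or multiple encode/decode passes.
    import numpy as np

    def psnr(reference, test, peak=255.0):
        mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
        return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

    def multi_generation_psnr(frame, codec_roundtrip, cycles=10):
        """PSNR of the frame after `cycles` successive compression-decompression passes."""
        current = frame
        for _ in range(cycles):
            current = codec_roundtrip(current)   # placeholder for one encode+decode cycle
        return psnr(frame, current)
    ```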

  20. Objective Evaluation of Muscle Strength in Infants with Hypotonia and Muscle Weakness

    ERIC Educational Resources Information Center

    Reus, Linda; van Vlimmeren, Leo A.; Staal, J. Bart; Janssen, Anjo J. W. M.; Otten, Barto J.; Pelzer, Ben J.; Nijhuis-van der Sanden, Maria W. G.

    2013-01-01

    The clinical evaluation of an infant with motor delay, muscle weakness, and/or hypotonia would improve considerably if muscle strength could be measured objectively and normal reference values were available. The authors developed a method to measure muscle strength in infants and tested 81 typically developing infants, 6-36 months of age, and 17…

  1. Evaluating Students with Disabilities and Their Teachers: Use of Student Learning Objectives

    ERIC Educational Resources Information Center

    Joyce, Jeanette; Harrison, Judith R.; Murphy, Danielle

    2016-01-01

    Over the past decade, there has been a movement toward increased accountability, focusing on teacher performance, in U.S. education. The purpose of this chapter is to discuss student learning objectives (SLOs) as one component of high-stakes teacher evaluation systems, within the context of learners with special needs. We describe SLOs and their…

  2. Clusters of Insomnia Disorder: An Exploratory Cluster Analysis of Objective Sleep Parameters Reveals Differences in Neurocognitive Functioning, Quantitative EEG, and Heart Rate Variability.

    PubMed

    Miller, Christopher B; Bartlett, Delwyn J; Mullins, Anna E; Dodds, Kirsty L; Gordon, Christopher J; Kyle, Simon D; Kim, Jong Won; D'Rozario, Angela L; Lee, Rico S C; Comas, Maria; Marshall, Nathaniel S; Yee, Brendon J; Espie, Colin A; Grunstein, Ronald R

    2016-11-01

    To empirically derive and evaluate potential clusters of Insomnia Disorder through cluster analysis from polysomnography (PSG). We hypothesized that clusters would differ on neurocognitive performance, sleep-onset measures of quantitative (q)-EEG and heart rate variability (HRV). Research volunteers with Insomnia Disorder (DSM-5) completed a neurocognitive assessment and overnight PSG; measures of total sleep time (TST), wake time after sleep onset (WASO), and sleep onset latency (SOL) were used to determine clusters. From 96 volunteers with Insomnia Disorder, cluster analysis derived at least two clusters from objective sleep parameters: Insomnia with normal objective sleep duration (I-NSD: n = 53) and Insomnia with short sleep duration (I-SSD: n = 43). At sleep onset, differences in HRV between I-NSD and I-SSD clusters suggest attenuated parasympathetic activity in I-SSD (P < 0.05). Preliminary work suggested three clusters by retaining the I-NSD cluster and splitting the I-SSD cluster into two: I-SSD A (n = 29), defined by high WASO, and I-SSD B (n = 14), a second I-SSD cluster with high SOL and medium WASO. The I-SSD B cluster performed worse than I-SSD A and I-NSD for sustained attention (P ≤ 0.05). In an exploratory analysis, q-EEG also revealed reduced spectral power in I-SSD B before (Delta, Alpha, Beta-1) and after sleep onset (Beta-2) compared to I-SSD A and I-NSD (P ≤ 0.05). Two insomnia clusters derived from cluster analysis differ in sleep onset HRV. Preliminary data suggest evidence for three clusters in insomnia with differences for sustained attention and sleep-onset q-EEG. Insomnia 100 sleep study: Australia New Zealand Clinical Trials Registry (ANZCTR) identification number 12612000049875. URL: https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=347742. © 2016 Associated Professional Sleep Societies, LLC.

  3. Objective breast tissue image classification using Quantitative Transmission ultrasound tomography

    NASA Astrophysics Data System (ADS)

    Malik, Bilal; Klock, John; Wiskin, James; Lenox, Mark

    2016-12-01

    Quantitative Transmission Ultrasound (QT) is a powerful and emerging imaging paradigm which has the potential to perform true three-dimensional image reconstruction of biological tissue. Breast imaging is an important application of QT and allows non-invasive, non-ionizing imaging of whole breasts in vivo. Here, we report the first demonstration of breast tissue image classification in QT imaging. We systematically assess the ability of the QT images’ features to differentiate between normal breast tissue types. The three QT features were used in Support Vector Machines (SVM) classifiers, and classification of breast tissue as either skin, fat, glands, ducts or connective tissue was demonstrated with an overall accuracy of greater than 90%. Finally, the classifier was validated on whole breast image volumes to provide a color-coded breast tissue volume. This study serves as a first step towards a computer-aided detection/diagnosis platform for QT.
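
    A hedged sketch of the classification step described above: three per-sample QT features feeding an SVM that labels tissue as one of five classes (skin, fat, gland, duct, connective tissue), with accuracy estimated by cross-validation. The feature values, dataset and train/test protocol are placeholders, not the authors' pipeline.

    ```python
    # Illustrative SVM tissue classification on three QT features (placeholder data).
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 3))        # placeholder: 3 QT features per sample
    y = rng.integers(0, 5, size=5000)     # placeholder: 5 tissue classes

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"mean cross-validated accuracy: {scores.mean():.1%}")
    ```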

  4. The Use of Mouse Models of Breast Cancer and Quantitative Image Analysis to Evaluate Hormone Receptor Antigenicity after Microwave-assisted Formalin Fixation

    PubMed Central

    Engelberg, Jesse A.; Giberson, Richard T.; Young, Lawrence J.T.; Hubbard, Neil E.

    2014-01-01

    Microwave methods of fixation can dramatically shorten fixation times while preserving tissue structure; however, it remains unclear if adequate tissue antigenicity is preserved. To assess and validate antigenicity, robust quantitative methods and animal disease models are needed. We used two mouse mammary models of human breast cancer to evaluate microwave-assisted and standard 24-hr formalin fixation. The mouse models expressed four antigens prognostic for breast cancer outcome: estrogen receptor, progesterone receptor, Ki67, and human epidermal growth factor receptor 2. Using pathologist evaluation and novel methods of quantitative image analysis, we measured and compared the quality of antigen preservation, percentage of positive cells, and line plots of cell intensity. Visual evaluations by pathologists established that the amounts and patterns of staining were similar in tissues fixed by the different methods. The results of the quantitative image analysis provided a fine-grained evaluation, demonstrating that tissue antigenicity is preserved in tissues fixed using microwave methods. Evaluation of the results demonstrated that a 1-hr, 150-W fixation is better than a 45-min, 150-W fixation followed by a 15-min, 650-W fixation. The results demonstrated that microwave-assisted formalin fixation can standardize fixation times to 1 hr and produce immunohistochemistry that is in every way commensurate with longer conventional fixation methods. PMID:24682322

  5. Assessing ADHD symptoms in children and adults: evaluating the role of objective measures.

    PubMed

    Emser, Theresa S; Johnston, Blair A; Steele, J Douglas; Kooij, Sandra; Thorell, Lisa; Christiansen, Hanna

    2018-05-18

    Diagnostic guidelines recommend using a variety of methods to assess and diagnose ADHD. Applying subjective measures always incorporates risks such as informant biases or large differences between ratings obtained from diverse sources. Furthermore, it has been demonstrated that ratings and tests seem to assess somewhat different constructs. The use of objective measures might thus yield valuable information for diagnosing ADHD. This study aims at evaluating the role of objective measures when trying to distinguish between individuals with ADHD and controls. Our sample consisted of children (n = 60) and adults (n = 76) diagnosed with ADHD and matched controls who completed self- and observer ratings as well as objective tasks. Diagnosis was primarily based on clinical interviews. A popular pattern recognition approach, support vector machines, was used to predict the diagnosis. We observed relatively high accuracy of 79% (adults) and 78% (children) applying solely objective measures. Predicting an ADHD diagnosis using both subjective and objective measures exceeded the accuracy of objective measures for both adults (89.5%) and children (86.7%), with the subjective variables proving to be the most relevant. We argue that objective measures are more robust against rater bias and errors inherent in subjective measures and may be more replicable. Considering the high accuracy of objective measures only, we found in our study, we think that they should be incorporated in diagnostic procedures for assessing ADHD.

  6. Quantitation of aortic and mitral regurgitation in the pediatric population: evaluation by radionuclide angiocardiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurwitz, R.A.; Treves, S.; Freed, M.

    The ability to quantitate aortic (AR) or mitral regurgitation (MR), or both, by radionuclide angiocardiography was evaluated in children and young adults at rest and during isometric exercise. Regurgitation was estimated by determining the ratio of left ventricular stroke volume to right ventricular stroke volume obtained during equilibrium ventriculography. The radionuclide measurement was compared with results of cineangiography, with good correlation between both studies in 47 of 48 patients. Radionuclide stroke volume ratio was used to classify severity: the group with equivocal regurgitation differed from the group with mild regurgitation (p less than 0.02); patients with mild regurgitation differed from those with moderate regurgitation (p less than 0.001); and those with moderate regurgitation differed from those with severe regurgitation (p less than 0.01). The stroke volume ratio was responsive to isometric exercise, remaining constant or increasing in 16 of 18 patients. After surgery to correct regurgitation, the stroke volume ratio significantly decreased from preoperative measurements in all 7 patients evaluated. Results from the present study demonstrate that a stroke volume ratio greater than 2.0 is compatible with moderately severe regurgitation and that a ratio greater than 3.0 suggests the presence of severe regurgitation. Thus, radionuclide angiocardiography should be useful for noninvasive quantitation of AR or MR, or both, helping define the course of young patients with left-side valvular regurgitation.
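
    A minimal sketch of the severity cut-offs reported above: a left-to-right ventricular stroke volume ratio above 2.0 is compatible with moderately severe regurgitation, and above 3.0 suggests severe regurgitation. The label for the lowest band is an assumption for illustration; the abstract does not define it.

    ```python
    # Hedged sketch: severity banding from the radionuclide stroke volume ratio.
    def regurgitation_severity(lv_stroke_volume, rv_stroke_volume):
        ratio = lv_stroke_volume / rv_stroke_volume
        if ratio > 3.0:
            return ratio, "severe"
        if ratio > 2.0:
            return ratio, "moderately severe"
        return ratio, "mild to moderate (assumed label for lower ratios)"

    print(regurgitation_severity(120.0, 48.0))   # placeholder stroke volumes (mL)
    ```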

  7. Evaluating the Impact of Action Plans on Trainee Compliance with Learning Objectives

    ERIC Educational Resources Information Center

    Aumann, Michael J.

    2013-01-01

    This mixed methods research study evaluated the use of technology-based action plans as a way to help improve compliance with the learning objectives of an online training event. It explored how the action planning strategy impacted subjects in a treatment group and compared them to subjects in a control group who did not get the action plan. The…

  8. [Integral quantitative evaluation of working conditions in the construction industry].

    PubMed

    Guseĭnov, A A

    1993-01-01

    The present method of evaluating environmental quality (using MAC and MAL values) does not allow working conditions in the construction industry to be assessed completely and objectively, owing to multiple confounding factors. A solution to this complicated problem, which requires analysis of the various correlated elements of the "human--working conditions--environment" system, may be supported by a social norm of morbidity that is independent of the industrial and natural environment. A complete integral assessment makes it possible to see the whole situation and to reveal the points of risk.

  9. [Information value of "additional tasks" method to evaluate pilot's work load].

    PubMed

    Gorbunov, V V

    2005-01-01

    "Additional task" method was used to evaluate pilot's work load in prolonged flight. Calculated through durations of latent periods of motor responses, quantitative criterion of work load is more informative for objective evaluation of pilot's involvement in his piloting functions rather than of other registered parameters.

  10. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation): Comprehensive 3-year progress report for the period January 15, 1986-January 14, 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1988-06-01

    This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 58 refs., 15 figs., 4 tabs.

  11. Attitudes and evaluative practices: category vs. item and subjective vs. objective constructions in everyday food assessments.

    PubMed

    Wiggins, Sally; Potter, Jonathan

    2003-12-01

    In social psychology, evaluative expressions have traditionally been understood in terms of their relationship to, and as the expression of, underlying 'attitudes'. In contrast, discursive approaches have started to study evaluative expressions as part of varied social practices, considering what such expressions are doing rather than their relationship to attitudinal objects or other putative mental entities. In this study the latter approach will be used to examine the construction of food and drink evaluations in conversation. The data are taken from a corpus of family mealtimes recorded over a period of months. The aim of this study is to highlight two distinctions that are typically obscured in traditional attitude work ('subjective' vs. 'objective' expressions, category vs. item evaluations). A set of extracts is examined to document the presence of these distinctions in talk that evaluates food and the way they are used and rhetorically developed to perform particular activities (accepting/refusing food, complimenting the food provider, persuading someone to eat). The analysis suggests that researchers (a) should be aware of the potential significance of these distinctions; (b) should be cautious when treating evaluative terms as broadly equivalent and (c) should be cautious when blurring categories and instances. This analysis raises the broader question of how far evaluative practices may be specific to particular domains, and what this specificity might consist in. It is concluded that research in this area could benefit from starting to focus on the role of evaluations in practices and charting their association with specific topics and objects.

  12. Patient-reported outcome assessment and objective evaluation of chemotherapy-induced alopecia.

    PubMed

    Komen, Manon M C; van den Hurk, Corina J G; Nortier, Johan W R; van der Ploeg, T; Smorenburg, Carolien H; van der Hoeven, Jacobus J M

    2018-04-01

    Alopecia is one of the most distressing side effects of chemotherapy. Evaluating and comparing the efficacy of potential therapies to prevent chemotherapy-induced alopecia (CIA) has been complicated by the lack of a standardized measurement for hair loss. In this study, we investigated the correlation between patient-reported outcome assessments and quantitative measurement with the Hair Check to assess CIA in clinical practice. Scalp cooling efficacy was evaluated by patients using the World Health Organisation (WHO) grading of CIA, a Visual Analogue Scale (VAS), and wig use. The Hair Check was used to determine the amount of hair (in mm²) per unit of scalp skin area (in cm²) (Hair Mass Index, HMI). CIA was also evaluated by doctors, nurses and hairdressers. Baseline HMI was not predictive for hair loss. HMI declined throughout all chemotherapy cycles, which was not reflected by patient-reported measures. HMI correlated with patient-reported hair quantity before the start of the therapy, but not with WHO or VAS scores during therapy. Patients' opinions correlated moderately with those of doctors and nurses (ρ = 0.50 and 0.56, respectively), but strongly with those of hairdressers (ρ = 0.70). The Hair Check is suitable for quantifying the amount of hair loss and could complement research on refining scalp cooling outcomes, but the patient's opinion should be considered the best method to assess hair loss in clinical practice. Trialregister.nl NTR number 3082. Copyright © 2018 Elsevier Ltd. All rights reserved.
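
    An illustrative check of the kind of agreement reported above: a Spearman rank correlation between Hair Mass Index (HMI) measurements and patient-reported VAS hair-loss scores. The data arrays are placeholders; the actual study values are not reproduced.

    ```python
    # Hedged sketch: rank correlation between HMI and a patient-reported score.
    import numpy as np
    from scipy.stats import spearmanr

    hmi = np.array([38, 35, 30, 24, 22, 19, 18, 15], dtype=float)   # placeholder HMI values
    vas = np.array([1, 2, 2, 4, 5, 5, 7, 8], dtype=float)           # placeholder VAS scores

    rho, p_value = spearmanr(hmi, vas)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
    ```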

  13. Miramar College Program Evaluation: Criminal Justice.

    ERIC Educational Resources Information Center

    Moriyama, Bruce; Brumley, Leslie

    Qualitative and quantitative data are presented in this evaluation of the curricular, personnel, and financial status of Miramar College's program in criminal justice. The report first outlines the information gathered in an interview with the program chairperson, conducted to determine program objectives and goals and how they were determined,…

  14. Miramar College Program Evaluation: Fire Science.

    ERIC Educational Resources Information Center

    Moriyama, Bruce; Brumley, Leslie

    Qualitative and quantitative data are presented in this evaluation of the curricular, personnel, and financial status of Miramar College's program in fire sciences. The report first provides the results of an interview with the program chairperson, which sought information on program objectives and goals and their determination, the extent to…

  15. Quantitative evaluation method of the threshold adjustment and the flat field correction performances of hybrid photon counting pixel detectors

    NASA Astrophysics Data System (ADS)

    Medjoubi, K.; Dawiec, A.

    2017-12-01

    A simple method is proposed in this work for the quantitative evaluation of the quality of the threshold adjustment and the flat-field correction of Hybrid Photon Counting pixel (HPC) detectors. This approach is based on the Photon Transfer Curve (PTC) corresponding to the measurement of the standard deviation of the signal in flat-field images. Fixed pattern noise (FPN), easily identifiable in the curve, is linked to the residual threshold dispersion, sensor inhomogeneity and the remnant errors of the flat-fielding techniques. The analytical expression of the signal-to-noise ratio curve is developed for HPC detectors and successfully used as a fit function applied to experimental data obtained with the XPAD detector. The FPN, quantified by the photon response non-uniformity (PRNU), is measured for different configurations (threshold adjustment method and flat-fielding technique) and is shown to be usable for identifying the settings that give the best image quality from a commercial or R&D detector.
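
    A minimal sketch of a photon-transfer-style analysis on a stack of flat-field frames: temporal noise, fixed pattern noise (FPN) and a PRNU estimate. The exact PTC fit and signal-to-noise expression developed in the paper are not reproduced; the frame stack below is a placeholder and the separation into the three quantities is only illustrative.

    ```python
    # Hedged sketch: FPN and PRNU estimated from a stack of flat-field frames.
    import numpy as np

    def flat_field_statistics(frames):
        """frames: array of shape (n_frames, rows, cols) of flat-field images."""
        frames = np.asarray(frames, dtype=np.float64)
        per_pixel_mean = frames.mean(axis=0)
        temporal_noise = frames.std(axis=0, ddof=1).mean()   # frame-to-frame (shot-like) noise
        fpn = per_pixel_mean.std(ddof=1)                     # fixed pattern noise across pixels
        prnu = fpn / per_pixel_mean.mean()                   # photon response non-uniformity
        return temporal_noise, fpn, prnu

    frames = np.random.poisson(lam=1000, size=(50, 256, 256))   # placeholder flat fields
    noise, fpn, prnu = flat_field_statistics(frames)
    print(f"temporal noise {noise:.1f}, FPN {fpn:.1f}, PRNU {prnu:.2%}")
    ```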

  16. Quantitative evaluation of 3D dosimetry for stereotactic volumetric‐modulated arc delivery using COMPASS

    PubMed Central

    Manigandan, Durai; Karrthick, Karukkupalayam Palaniappan; Sambasivaselli, Raju; Senniandavar, Vellaingiri; Ramu, Mahendran; Rajesh, Thiyagarajan; Lutz, Muller; Muthukumaran, Manavalan; Karthikeyan, Nithyanantham; Tejinder, Kataria

    2014-01-01

    The purpose of this study was to quantitatively evaluate the patient-specific 3D dosimetry tool COMPASS with the 2D array MatriXX detector for stereotactic volumetric-modulated arc delivery. Twenty-five patients' CT images and RT structures from different sites (brain, head & neck, thorax, abdomen, and spine) were taken from the CyberKnife Multiplan planning system for this study. All of these patients had undergone radical stereotactic treatment with CyberKnife. For each patient, linac-based volumetric-modulated arc therapy (VMAT) stereotactic plans were generated in Monaco TPS v3.1 using the Elekta Beam Modulator MLC. Dose prescriptions were in the range of 5-20 Gy per fraction. Target prescriptions and critical organ constraints were matched as closely as possible to those of the delivered treatment plans. The quality of each plan was analyzed using the conformity index (CI), conformity number (CN), gradient index (GI), target coverage (TC), and dose to 95% of the volume (D95). The delivery accuracy of the Monaco Monte Carlo (MC)-calculated treatment plans was quantitatively evaluated against the COMPASS-calculated (CCA) dose and the COMPASS indirectly measured (CME) dose based on dose-volume histogram metrics. To ascertain the potential of COMPASS 3D dosimetry for stereotactic plan delivery, 2D fluence verification was performed with MatriXX using the MultiCube phantom. Routine quality assurance of absolute point-dose verification was performed to check the overall delivery accuracy. Quantitative analyses of dose delivery verification were compared against pass/fail criteria of 3 mm distance to agreement and 3% dose difference. The gamma passing rate was compared with that of the 2D fluence verification from MatriXX with MultiCube. The COMPASS dose reconstructed from measured fluence and the COMPASS computed dose showed very good agreement with the TPS-calculated dose. Each plan was evaluated based on dose-volume parameters for target volumes, such as the dose at 95% of the volume (D95) and the average dose. For critical organs dose at 20% of

  17. An object tracking method based on guided filter for night fusion image

    NASA Astrophysics Data System (ADS)

    Qian, Xiaoyan; Wang, Yuedong; Han, Lei

    2016-01-01

    Online object tracking is a challenging problem, as it entails learning an effective model to account for appearance changes caused by intrinsic and extrinsic factors. In this paper, we propose a novel online object tracking method with a guided image filter for accurate and robust tracking in night fusion images. First, frame differencing is applied to produce a coarse target, which helps to generate the observation models. Constrained by these models and the local source image, the guided filter generates a sufficiently accurate foreground target. Accurate boundaries of the target can then be extracted from the detection results. Finally, timely updating of the observation models helps to avoid tracking drift. Both qualitative and quantitative evaluations on challenging image sequences demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.
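
    A hedged sketch of the two ingredients named above: frame differencing to obtain a coarse target mask, and a guided filter (in the sense of He et al.) that refines the mask using the current frame as guide. The window size, epsilon and thresholds are assumed values, and the full tracking loop with model updating is omitted.

    ```python
    # Hedged sketch: coarse target from frame differencing, refined by a guided filter.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def guided_filter(guide, src, radius=8, eps=1e-3):
        """Edge-preserving filtering of `src` guided by `guide` (both float, range 0-1)."""
        size = 2 * radius + 1
        mean_i = uniform_filter(guide, size)
        mean_p = uniform_filter(src, size)
        corr_ip = uniform_filter(guide * src, size)
        corr_ii = uniform_filter(guide * guide, size)
        var_i = corr_ii - mean_i * mean_i
        cov_ip = corr_ip - mean_i * mean_p
        a = cov_ip / (var_i + eps)
        b = mean_p - a * mean_i
        return uniform_filter(a, size) * guide + uniform_filter(b, size)

    def coarse_target(prev_frame, curr_frame, thresh=0.1):
        """Binary mask from a thresholded frame difference (coarse foreground)."""
        return (np.abs(curr_frame - prev_frame) > thresh).astype(np.float64)

    def refined_target(prev_frame, curr_frame):
        mask = coarse_target(prev_frame, curr_frame)
        return guided_filter(curr_frame, mask) > 0.5   # refined foreground mask
    ```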

  18. Evaluation of New Zealand’s High-Seas Bottom Trawl Closures Using Predictive Habitat Models and Quantitative Risk Assessment

    PubMed Central

    Penney, Andrew J.; Guinotte, John M.

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) establishes three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine systems (VMEs) likely to occur?; 2) What is the likelihood of fisheries interaction with these VMEs?; and 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost : benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. Distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost : benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from closure of historically fished areas. PMID:24358162
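
    A hedged sketch of the bookkeeping implied above: discounting predicted coral habitat suitability by the fraction of each cell already swept by trawling, then comparing closure scenarios by protected VME likelihood against historical catch forgone. The discounting rule and all numbers are placeholders, not the paper's values.

    ```python
    # Illustrative discounting and cost:benefit comparison for closure scenarios.
    import numpy as np

    suitability = np.array([0.9, 0.7, 0.4, 0.8])      # predicted coral habitat suitability per cell
    swept_fraction = np.array([0.5, 0.1, 0.0, 0.8])   # fraction of each cell swept historically
    catch_t = np.array([120.0, 10.0, 5.0, 300.0])     # historical catch per cell (cost proxy)

    discounted = suitability * (1.0 - swept_fraction)  # coral VME likelihood after discounting

    def scenario_tradeoff(closed_cells):
        benefit = discounted[closed_cells].sum()       # protected VME likelihood
        cost = catch_t[closed_cells].sum()             # catch displaced by the closure
        return benefit, cost

    print(scenario_tradeoff([1, 2]))   # a cheap closure protecting lightly fished habitat
    ```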

  19. Standardizing evaluation of pQCT image quality in the presence of subject movement: qualitative versus quantitative assessment.

    PubMed

    Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B

    2014-02-01

    Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25 %. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25 % grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat.
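
    A minimal sketch of the quantitative movement check described above: the ratio of movement artifact to limb size (%Move) with a 25% cut-off for deciding whether a diaphyseal scan needs repeating. How the two quantities are measured on the pQCT image is an assumption here; only the ratio and threshold come from the abstract.

    ```python
    # Hedged sketch: %Move ratio and the 25% viability cut-off.
    def percent_move(movement_extent_mm, limb_size_mm):
        return 100.0 * movement_extent_mm / limb_size_mm

    def scan_is_viable(movement_extent_mm, limb_size_mm, cutoff=25.0):
        return percent_move(movement_extent_mm, limb_size_mm) < cutoff

    print(scan_is_viable(6.0, 70.0))   # placeholder measurements -> True (no repeat needed)
    ```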

  20. Standardizing Evaluation of pQCT Image Quality in the Presence of Subject Movement: Qualitative vs. Quantitative Assessment

    PubMed Central

    Blew, Robert M.; Lee, Vinson R.; Farr, Joshua N.; Schiferl, Daniel J.; Going, Scott B.

    2013-01-01

    Purpose Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale, and (2) establish a quantitative motion assessment methodology. Methods Scans were performed on 506 healthy girls (9–13yr) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n=46) was examined to determine %Move’s impact on bone parameters. Results Agreement between measurers was strong (ICC = .732 for tibia, .812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat or no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement, showed significant differences in the >25% grouping. Conclusions A pQCT visual inspection scale can be a reliable metric of image quality but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat. PMID:24077875

  1. A fuzzy MCDM model with objective and subjective weights for evaluating service quality in hotel industries

    NASA Astrophysics Data System (ADS)

    Zoraghi, Nima; Amiri, Maghsoud; Talebi, Golnaz; Zowghi, Mahdi

    2013-12-01

    This paper presents a fuzzy multi-criteria decision-making (FMCDM) model by integrating both subjective and objective weights for ranking and evaluating the service quality in hotels. The objective method selects weights of criteria through mathematical calculation, while the subjective method uses judgments of decision makers. In this paper, we use a combination of weights obtained by both approaches in evaluating service quality in hotel industries. A real case study that considered ranking five hotels is illustrated. Examples are shown to indicate capabilities of the proposed method.
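
    A minimal sketch of the weight-combination idea follows. The abstract does not specify the objective weighting formula, so entropy weighting is used here as one common objective method; the decision matrix, the subjective weights and the mixing parameter alpha are hypothetical, and no fuzzy arithmetic is reproduced.

        # Hedged sketch of combining objective and subjective criteria weights for an
        # MCDM-style hotel ranking. Entropy weighting stands in for the paper's
        # (unspecified) objective method; the decision matrix, subjective weights and
        # the mixing parameter alpha are hypothetical.
        import numpy as np

        # rows: 5 hotels, columns: 4 service-quality criteria (already benefit-oriented)
        X = np.array([[7, 8, 6, 9],
                      [6, 9, 7, 7],
                      [8, 7, 8, 6],
                      [5, 6, 9, 8],
                      [9, 5, 7, 7]], dtype=float)

        # Objective weights via Shannon entropy of the normalised decision matrix.
        P = X / X.sum(axis=0)
        entropy = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
        w_obj = (1 - entropy) / (1 - entropy).sum()

        # Subjective weights elicited from decision makers (hypothetical).
        w_subj = np.array([0.4, 0.3, 0.2, 0.1])

        alpha = 0.5                                  # balance between the two weight sets
        w = alpha * w_obj + (1 - alpha) * w_subj
        w /= w.sum()

        scores = (X / X.max(axis=0)) @ w             # simple weighted-sum ranking
        for rank, i in enumerate(np.argsort(scores)[::-1], start=1):
            print(f"rank {rank}: hotel {i + 1} (score {scores[i]:.3f})")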

  2. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data.

    PubMed

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G; Khanna, Sanjeev

    2017-06-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings.
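
    For readers unfamiliar with the setting, the sketch below illustrates in plain Python the kind of quantitative streaming query the paper targets: partition a stream by key and maintain an incremental per-key aggregate with bounded per-item work. It deliberately does not reproduce the StreamQRE combinators or API; all names and data are hypothetical.

        # Illustration only: this is NOT the StreamQRE API. It shows, in plain Python,
        # the kind of quantitative streaming query the paper targets -- partition a
        # stream of (patient_id, reading) items by key and maintain an incremental
        # per-key average with O(1) work per item. Field names are hypothetical.
        from collections import defaultdict

        class RunningMean:
            """Constant-time, constant-memory incremental average."""
            def __init__(self):
                self.count = 0
                self.total = 0.0
            def update(self, value: float) -> None:
                self.count += 1
                self.total += value
            @property
            def value(self) -> float:
                return self.total / self.count if self.count else float("nan")

        per_key = defaultdict(RunningMean)

        stream = [("p1", 98.2), ("p2", 101.4), ("p1", 99.0), ("p2", 100.1), ("p1", 97.5)]
        for key, reading in stream:
            per_key[key].update(reading)          # per-item processing time is bounded

        for key, agg in per_key.items():
            print(key, round(agg.value, 2))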

  3. Subjective and objective evaluation of visual fatigue on viewing 3D display continuously

    NASA Astrophysics Data System (ADS)

    Wang, Danli; Xie, Yaohua; Yang, Xinpan; Lu, Yang; Guo, Anxiang

    2015-03-01

    In recent years, three-dimensional (3D) displays have become more and more popular in many fields. Although they can provide a better viewing experience, they cause extra problems, e.g., visual fatigue. Subjective or objective methods are usually used in discrete viewing processes to evaluate visual fatigue. However, little research combines subjective indicators and objective ones in an entirely continuous viewing process. In this paper, we propose a method to evaluate real-time visual fatigue both subjectively and objectively. Subjects watch stereo contents on a polarized 3D display continuously. Visual Reaction Time (VRT), Critical Flicker Frequency (CFF), Punctum Maximum Accommodation (PMA) and subjective scores of visual fatigue are collected before and after viewing. During the viewing process, the subjects rate the visual fatigue whenever it changes, without breaking the viewing process. At the same time, the blink frequency (BF) and percentage of eye closure (PERCLOS) of each subject are recorded for comparison with previous research. The results show that subjective visual fatigue and PERCLOS increase with time and are greater in a continuous process than in a discrete one. The BF increased with time during the continuous viewing process. In addition, the visual fatigue also induced significant changes in VRT, CFF and PMA.
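
    The two objective eye measures mentioned above can be sketched as follows, assuming a per-frame eyelid-openness signal is available. The 20% openness threshold used to define a closed eye, the blink definition and the frame rate are common conventions and assumptions, not values taken from the paper.

        # Hedged sketch of the two objective eye measures mentioned in the abstract,
        # computed from a hypothetical per-frame eyelid-openness signal (1.0 = fully
        # open, 0.0 = fully closed). The closure threshold and the blink definition
        # are common conventions, not taken from the paper.
        import numpy as np

        fps = 30                                    # camera frame rate (assumed)
        openness = np.clip(np.random.default_rng(1).normal(0.85, 0.2, fps * 60), 0, 1)

        closed = openness < 0.2                     # eye treated as "closed" below 20% openness
        perclos = 100.0 * closed.mean()             # percentage of eye closure over the window

        # Blink frequency: count transitions from open to closed, per minute.
        transitions = np.flatnonzero(~closed[:-1] & closed[1:])
        blink_frequency = len(transitions) / (len(openness) / fps / 60.0)

        print(f"PERCLOS = {perclos:.1f}%  blink frequency = {blink_frequency:.1f} blinks/min")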

  4. Objective evaluation of acute adverse events and image quality of gadolinium-based contrast agents (gadobutrol and gadobenate dimeglumine) by blinded evaluation. Pilot study.

    PubMed

    Semelka, Richard C; Hernandes, Mateus de A; Stallings, Clifton G; Castillo, Mauricio

    2013-01-01

    The purpose was to objectively evaluate a recently FDA-approved gadolinium-based contrast agent (GBCA) in comparison to our standard GBCA for acute adverse events and image quality by blinded evaluation. Evaluation was made of a recently FDA-approved GBCA, gadobutrol (Gadavist; Bayer), in comparison to our standard GBCA, gadobenate dimeglumine (MultiHance; Bracco), in an IRB- and HIPAA-compliant study. Neither the imaging technologist nor the patient was aware of the brand of GBCA used. A total of 59 magnetic resonance studies were evaluated (59 patients, 31 men, 28 women, age range of 5-85 years, mean age of 52 years). Twenty-nine studies were performed with gadobutrol (22 abdominal and 7 brain studies), and 30 studies were performed with gadobenate dimeglumine (22 abdominal and 8 brain studies). Assessment was made of acute adverse events focusing on objective observations of vomiting, hives, and moderate and severe reactions. Adequacy of enhancement was rated as poor, fair and good by one of two experienced radiologists who were blinded to the type of agent evaluated. No patient experienced acute adverse events with either agent. The target minor adverse events of vomiting or hives, and moderate and severe reactions were not observed in any patient. Adequacy of enhancement was rated as good for both agents in all patients. Objective, blinded evaluation is feasible and readily performable for the evaluation of GBCAs. This proof-of-concept study showed that both GBCAs evaluated exhibited consistently good image quality and no noteworthy adverse events. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. How Much Are Harry Potter's Glasses Worth? Children's Monetary Evaluation of Authentic Objects

    ERIC Educational Resources Information Center

    Gelman, Susan A.; Frazier, Brandy N.; Noles, Nicholaus S.; Manczak, Erika M.; Stilwell, Sarah M.

    2015-01-01

    Adults attach special value to objects that link to notable people or events--authentic objects. We examined children's monetary evaluation of authentic objects, focusing on four kinds: celebrity possessions (e.g., Harry Potter's glasses), original creations (e.g., the very first teddy bear), personal possessions (e.g., your…

  6. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
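
    For orientation, a generic MAP estimate with an MRF smoothness prior, of the kind the abstract describes, can be written as below; the paper's specific partial-volume mixture likelihood and intensity-inhomogeneity model are not reproduced here.

        \hat{\mathbf{x}} = \arg\max_{\mathbf{x}} \; p(\mathbf{y} \mid \mathbf{x}) \, p(\mathbf{x}),
        \qquad
        p(\mathbf{x}) \propto \exp\Big( -\beta \sum_{(i,j) \in \mathcal{N}} V(x_i, x_j) \Big),

    where y is the observed MR intensity image (with smoothly varying inhomogeneity), x the tissue-mixture labels, N the set of neighbouring voxel pairs, V a penalty that discourages label disagreement between neighbours, and beta the strength of the MRF prior.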

  7. Credit assignment between body and object probed by an object transportation task.

    PubMed

    Kong, Gaiqing; Zhou, Zhihao; Wang, Qining; Kording, Konrad; Wei, Kunlin

    2017-10-17

    It has been proposed that learning from movement errors involves a credit assignment problem: did I misestimate properties of the object or those of my body? For example, an overestimate of arm strength and an underestimate of the weight of a coffee cup can both lead to coffee spills. Though previous studies have found signs of simultaneous learning of the object and of the body during object manipulation, there is little behavioral evidence about their quantitative relation. Here we employed a novel weight-transportation task, in which participants lift the first cup filled with liquid while assessing their learning from errors. Specifically, we examined their transfer of learning when switching to a contralateral hand, the second identical cup, or switching both hands and cups. By comparing these transfer behaviors, we found that 25% of the learning was attributed to the object (simply because of the use of the same cup) and 58% of the learning was attributed to the body (simply because of the use of the same hand). The nervous system thus seems to partition the learning of object manipulation between the object and the body.

  8. An approach to computing direction relations between separated object groups

    NASA Astrophysics Data System (ADS)

    Yan, H.; Wang, Z.; Li, J.

    2013-06-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on Gestalt principles and the idea of multi-directions. The approach firstly triangulates the two object groups; and then it constructs the Voronoi diagram between the two groups using the triangular network; after this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed; finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.
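
    The geometric core of the method can be sketched with SciPy, using the fact that for each Voronoi ridge the normal is parallel to the vector between its two generating points. This is not the authors' implementation: the point groups, the averaging of normals and the four-sector mapping to a qualitative direction are illustrative assumptions.

        # Hedged sketch of the core idea (not the authors' implementation): build the
        # Voronoi diagram over both point groups, keep the ridges that separate a point
        # of group A from a point of group B, and average their normals -- for a ridge,
        # the normal is parallel to the vector between its two generating points.
        # The final mapping to a qualitative direction uses four 90-degree sectors.
        import numpy as np
        from scipy.spatial import Voronoi

        group_a = np.array([[0.0, 0.0], [1.0, 0.5], [0.5, 1.0]])
        group_b = np.array([[4.0, 3.0], [5.0, 3.5], [4.5, 4.5]])
        points = np.vstack([group_a, group_b])
        is_b = np.arange(len(points)) >= len(group_a)

        vor = Voronoi(points)
        normals = []
        for p, q in vor.ridge_points:                 # generating-point pair of each ridge
            if is_b[p] != is_b[q]:                    # ridge separates the two groups
                a, b = (p, q) if not is_b[p] else (q, p)
                v = points[b] - points[a]             # ridge normal direction, A -> B
                normals.append(v / np.linalg.norm(v))

        mean_dir = np.mean(normals, axis=0)
        angle = np.degrees(np.arctan2(mean_dir[1], mean_dir[0])) % 360
        qualitative = ["east", "north", "west", "south"][int(((angle + 45) % 360) // 90)]
        print(f"group B lies roughly {qualitative} of group A ({angle:.1f} deg)")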

  9. An approach to computing direction relations between separated object groups

    NASA Astrophysics Data System (ADS)

    Yan, H.; Wang, Z.; Li, J.

    2013-09-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on gestalt principles and the idea of multi-directions. The approach firstly triangulates the two object groups, and then it constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.

  10. Pilot Program on Common Status Measures Objective-Referenced Tests. Colorado Evaluation Project, Report No. 1.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Education, Denver.

    The purpose of the Colorado Evaluation Project was to field test the Common Status Measures at grades four and eleven in conjunction with a statewide assessment program based on objective-referenced testing instruments developed by the Colorado Department of Education for grades kindergarten, three, six, nine, and twelve. The evaluation was…

  11. Development and evaluation of event-specific quantitative PCR method for genetically modified soybean A2704-12.

    PubMed

    Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from the pUC19 plasmid were integrated into A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared with both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we then constructed a plasmid using pBR322. The conversion factor (C(f)), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined C(f) values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSD(R)), and the determined bias and RSD(R) values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12.
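
    As background, conversion-factor-based real-time PCR quantification is commonly formulated as below; the exact expression used in the paper may differ, and the symbols (N_event, N_taxon, C_f) are generic rather than taken from the text.

        \mathrm{GMO}\,(\%) \;=\; \frac{N_{\mathrm{event}} / N_{\mathrm{taxon}}}{C_f} \times 100,
        \qquad
        C_f \;=\; \left. \frac{N_{\mathrm{event}}}{N_{\mathrm{taxon}}} \right|_{\text{100% GM reference material}}

    where N_event and N_taxon are the copy numbers of the event-specific and soybean taxon-specific sequences measured by real-time PCR in the sample.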

  12. Patterns of Learning Object Reuse in the Connexions Repository

    ERIC Educational Resources Information Center

    Duncan, S. M.

    2009-01-01

    Since the term "learning object" was first published, there has been either an explicit or implicit expectation of reuse. There has also been a lot of speculation about why learning objects are, or are not, reused. This study quantitatively examined the actual amount and type of learning object use, to include reuse, modification, and translation,…

  13. A Rotatable Quality Control Phantom for Evaluating the Performance of Flat Panel Detectors in Imaging Moving Objects.

    PubMed

    Haga, Yoshihiro; Chida, Koichi; Inaba, Yohei; Kaga, Yuji; Meguro, Taiichiro; Zuguchi, Masayuki

    2016-02-01

    As the use of diagnostic X-ray equipment with flat panel detectors (FPDs) has increased, so has the importance of proper management of FPD systems. To ensure quality control (QC) of FPD systems, an easy method for evaluating FPD imaging performance for both stationary and moving objects is required. Until now, simple rotatable QC phantoms have not been available for easy evaluation of the performance (spatial resolution and dynamic range) of FPDs in imaging moving objects. We developed a QC phantom for this purpose. It consists of three thicknesses of copper and a rotatable test pattern of piano wires of various diameters. Initial tests confirmed its stable performance. Our moving phantom is very useful for QC of FPD images of moving objects because it enables easy visual evaluation of imaging performance (spatial resolution and dynamic range).

  14. 3D Slicer as an Image Computing Platform for the Quantitative Imaging Network

    PubMed Central

    Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron

    2012-01-01

    Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of the clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future

  15. Simple Learned Weighted Sums of Inferior Temporal Neuronal Firing Rates Accurately Predict Human Core Object Recognition Performance

    PubMed Central

    Hong, Ha; Solomon, Ethan A.; DiCarlo, James J.

    2015-01-01

    database of images for evaluating object recognition performance. We used multielectrode arrays to characterize hundreds of neurons in the visual ventral stream of nonhuman primates and measured the object recognition performance of >100 human observers. Remarkably, we found that simple learned weighted sums of firing rates of neurons in monkey inferior temporal (IT) cortex accurately predicted human performance. Although previous work led us to expect that IT would outperform V4, we were surprised by the quantitative precision with which simple IT-based linking hypotheses accounted for human behavior. PMID:26424887
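
    A minimal version of a "learned weighted sum of firing rates" is sketched below: ridge-regularised least squares mapping an images-by-neurons firing-rate matrix to a per-image behavioural score. The data are synthetic and the train/test split, penalty and score definition are hypothetical; the authors' task battery and cross-validation procedure are not reproduced.

        # Minimal sketch of a "learned weighted sum of firing rates": fit linear
        # weights from an (images x neurons) firing-rate matrix to a per-image
        # behavioural score with ridge-regularised least squares. Data are synthetic;
        # this is not the authors' cross-validation or task battery.
        import numpy as np

        rng = np.random.default_rng(0)
        n_images, n_neurons = 200, 100
        rates = rng.poisson(5.0, size=(n_images, n_neurons)).astype(float)
        true_w = rng.normal(0, 1, n_neurons)
        behaviour = rates @ true_w + rng.normal(0, 5.0, n_images)   # d'-like score (synthetic)

        train, test = slice(0, 150), slice(150, None)
        lam = 10.0                                                  # ridge penalty
        A = rates[train].T @ rates[train] + lam * np.eye(n_neurons)
        w = np.linalg.solve(A, rates[train].T @ behaviour[train])

        pred = rates[test] @ w
        r = np.corrcoef(pred, behaviour[test])[0, 1]
        print(f"held-out correlation between predicted and observed scores: r = {r:.2f}")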

  16. Simple Learned Weighted Sums of Inferior Temporal Neuronal Firing Rates Accurately Predict Human Core Object Recognition Performance.

    PubMed

    Majaj, Najib J; Hong, Ha; Solomon, Ethan A; DiCarlo, James J

    2015-09-30

    database of images for evaluating object recognition performance. We used multielectrode arrays to characterize hundreds of neurons in the visual ventral stream of nonhuman primates and measured the object recognition performance of >100 human observers. Remarkably, we found that simple learned weighted sums of firing rates of neurons in monkey inferior temporal (IT) cortex accurately predicted human performance. Although previous work led us to expect that IT would outperform V4, we were surprised by the quantitative precision with which simple IT-based linking hypotheses accounted for human behavior. Copyright © 2015 the authors 0270-6474/15/3513402-17$15.00/0.

  17. Evaluating Robotic Surgical Skills Performance Under Distractive Environment Using Objective and Subjective Measures.

    PubMed

    Suh, Irene H; LaGrange, Chad A; Oleynikov, Dmitry; Siu, Ka-Chun

    2016-02-01

    Distractions are recognized as a significant factor affecting performance in safety-critical domains. Although operating rooms are generally full of distractions, the effect of distractions on robot-assisted surgical (RAS) performance is unclear. Our aim was to investigate the effect of distractions on RAS performance using both objective and subjective measures. Fifteen participants performed a knot-tying task using the da Vinci Surgical System and were exposed to 3 distractions: (1) passive distraction entailed listening to noise with a constant heart rate, (2) active distraction included listening to noise and acknowledging a change of random heart rate from 60 to 120 bpm, and (3) interactive distraction consisted of answering math questions. The objective kinematics of the surgical instrument tips were used to evaluate performance. Electromyography (EMG) signals from the forearm and hand muscles of the participants were collected. The median EMG frequency (EMG(fmed)) and the EMG envelope (EMG(env)) were analyzed. The NASA Task Load Index and Fundamentals of Laparoscopic Surgery score were used to evaluate subjective performance. One-way repeated-measures analysis of variance was applied to examine the effects of distraction on skills performance. Spearman's correlations were conducted to compare objective and subjective measures. A significant distraction effect was found for all objective kinematic measures (P < .05). There were significant distraction effects for EMG measures (EMG(env), P < .004; EMG(fmed), P = .031). Significant distraction effects were also found for subjective measurements. Distraction impairs surgical skills performance and increases muscle work. Understanding how surgeons cope with distractions is important in developing surgical education. © The Author(s) 2015.
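
    The two EMG summaries named above can be sketched as follows, assuming a raw EMG trace and its sampling rate are available. The median frequency is taken as the frequency splitting the Welch power spectrum into two halves of equal power, and the envelope as rectification followed by a moving average; the window lengths and the synthetic signal are assumptions, not the authors' processing pipeline.

        # Hedged sketch of the EMG median frequency (EMGfmed): the frequency that
        # splits the power spectrum into two halves of equal power. Welch's method is
        # used here for the spectrum; the signal and sampling rate are synthetic.
        import numpy as np
        from scipy.signal import welch

        fs = 1000.0                                  # Hz (assumed)
        t = np.arange(0, 5, 1 / fs)
        rng = np.random.default_rng(0)
        emg = rng.normal(0, 1, t.size) * np.sin(2 * np.pi * 80 * t)   # toy band-limited activity

        f, pxx = welch(emg, fs=fs, nperseg=1024)
        cum_power = np.cumsum(pxx)
        median_freq = f[np.searchsorted(cum_power, cum_power[-1] / 2.0)]

        # EMG envelope: full-wave rectification followed by a moving-average window.
        window = int(0.05 * fs)
        envelope = np.convolve(np.abs(emg), np.ones(window) / window, mode="same")

        print(f"EMG median frequency ~ {median_freq:.1f} Hz, peak envelope ~ {envelope.max():.2f}")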

  18. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  19. Serum Squamous Cell Carcinoma Antigen in Psoriasis: A Potential Quantitative Biomarker for Disease Severity.

    PubMed

    Sun, Ziwen; Shi, Xiaomin; Wang, Yun; Zhao, Yi

    2018-06-05

    An objective and quantitative method to evaluate psoriasis severity is important for practice and research in the precision care of psoriasis. We aimed to explore serum biomarkers quantitatively in association with disease severity and treatment response in psoriasis patients, with serum squamous cell carcinoma antigen (SCCA) evaluated in this pilot study. Fifteen psoriasis patients were treated with adalimumab. At different visits before and after treatment, quantitative body surface area (qBSA) was obtained from standardized digital body images of the patients, and the psoriasis area severity index (PASI) was also monitored. SCCA was detected using a microparticle enzyme immunoassay. The serum biomarkers were also tested in healthy volunteers as normal controls. Receiver-operating characteristic (ROC) curve analysis was used to explore the optimal cutoff point of SCCA to differentiate mild and moderate-to-severe psoriasis. The serum SCCA level in the psoriasis group was significantly higher (p < 0.05) than in the normal control group. After treatment, the serum SCCA levels were significantly decreased (p < 0.05). The SCCA level was well correlated with PASI and qBSA. In ROC analysis, when taking PASI = 10 or qBSA = 10% as the threshold, an optimal cutoff point of SCCA was found at 2.0 ng/mL with the highest Youden index. Serum SCCA might be a useful quantitative biomarker for psoriasis disease severity. © 2018 S. Karger AG, Basel.
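
    The cutoff-selection step can be illustrated with a short sketch: compute an ROC curve and pick the threshold maximising Youden's J = sensitivity + specificity - 1. The SCCA values and severity labels below are synthetic, so the resulting cutoff will not reproduce the 2.0 ng/mL reported above.

        # Hedged sketch of choosing an optimal biomarker cutoff by the Youden index
        # (J = sensitivity + specificity - 1) from an ROC curve, as described in the
        # abstract. SCCA values and severity labels below are synthetic.
        import numpy as np
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(0)
        scca_mild = rng.normal(1.2, 0.4, 40)          # ng/mL, mild psoriasis (synthetic)
        scca_severe = rng.normal(2.4, 0.7, 40)        # ng/mL, moderate-to-severe (synthetic)
        values = np.concatenate([scca_mild, scca_severe])
        labels = np.concatenate([np.zeros(40), np.ones(40)])   # 1 = moderate-to-severe

        fpr, tpr, thresholds = roc_curve(labels, values)
        youden = tpr - fpr                             # J at each candidate threshold
        best = np.argmax(youden)
        print(f"optimal cutoff ~ {thresholds[best]:.2f} ng/mL (J = {youden[best]:.2f})")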

  20. Image restoration using aberration taken by a Hartmann wavefront sensor on extended object, towards real-time deconvolution

    NASA Astrophysics Data System (ADS)

    Darudi, Ahmad; Bakhshi, Hadi; Asgari, Reza

    2015-05-01

    In this paper we present the results of image restoration using data taken by a Hartmann sensor. The aberration is measured by a Hartmann sensor in which the object itself is used as the reference. The Point Spread Function (PSF) is then simulated and used for image reconstruction using the Lucy-Richardson technique. A method is also presented for quantitative evaluation of the Lucy-Richardson deconvolution.
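
    The deconvolution step alone can be sketched with scikit-image, using a Gaussian point spread function as a stand-in for the PSF simulated from the Hartmann wavefront data; the test image, PSF width and iteration count are arbitrary, and the iteration-count argument is passed positionally because its keyword name differs between scikit-image versions.

        # Hedged sketch of the deconvolution step only: scikit-image's
        # Richardson-Lucy routine applied with a Gaussian PSF standing in for the
        # PSF simulated from the Hartmann wavefront data. The iteration count is
        # arbitrary.
        import numpy as np
        from scipy.signal import fftconvolve
        from skimage import data, img_as_float
        from skimage.restoration import richardson_lucy

        image = img_as_float(data.camera())

        # Toy PSF: normalised 2-D Gaussian (a real PSF would come from the sensor).
        x = np.arange(-7, 8)
        g = np.exp(-(x ** 2) / (2 * 2.0 ** 2))
        psf = np.outer(g, g)
        psf /= psf.sum()

        blurred = fftconvolve(image, psf, mode="same")
        restored = richardson_lucy(blurred, psf, 30)   # 30 iterations (arbitrary)
        print(blurred.shape, restored.shape)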

  1. A Component Analysis of the Impact of Evaluative and Objective Feedback on Performance

    ERIC Educational Resources Information Center

    Johnson, Douglas A.

    2013-01-01

    Despite the frequency with which performance feedback interventions are used in organizational behavior management, component analyses of such feedback are rare. It has been suggested that evaluation of performance and objective details about performance are two necessary components for performance feedback. The present study was designed to help…

  2. Quantitative reflectance spectroscopy of buddingtonite from the Cuprite mining district, Nevada

    NASA Technical Reports Server (NTRS)

    Felzer, Benjamin; Hauff, Phoebe; Goetz, Alexander F. H.

    1994-01-01

    Buddingtonite, an ammonium-bearing feldspar diagnostic of volcanic-hosted alteration, can be identified and, in some cases, quantitatively measured using short-wave infrared (SWIR) reflectance spectroscopy. In this study over 200 samples from Cuprite, Nevada, were evaluated by X-ray diffraction, chemical analysis, scanning electron microscopy, and SWIR reflectance spectroscopy with the objective of developing a quantitative remote-sensing technique for rapid determination of the amount of ammonium or buddingtonite present, and its distribution across the site. Based upon the Hapke theory of radiative transfer from particulate surfaces, spectra from quantitative, physical mixtures were compared with computed mixture spectra. We hypothesized that the concentration of ammonium in each sample is related to the size and shape of the ammonium absorption bands and tested this hypothesis for samples of relatively pure buddingtonite. We found that the band depth of the 2.12-micron NH4 feature is linearly related to the NH4 concentration for the Cuprite buddingtonite, and that the relationship is approximately exponential for a larger range of NH4 concentrations. Associated minerals such as smectite and jarosite suppress the depth of the 2.12-micron NH4 absorption band. Quantitative reflectance spectroscopy is possible when the effects of these associated minerals are also considered.
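
    The band-depth-to-concentration relationship can be sketched as follows, using a simple two-point continuum and a linear fit; the reflectance and NH4 values are synthetic, and the continuum construction is a simplification rather than the Hapke-based treatment used in the study.

        # Hedged sketch relating absorption band depth to concentration: band depth is
        # computed after continuum removal as D = 1 - R_band / R_continuum, and a
        # linear model is fitted to (depth, NH4 concentration) pairs. Reflectance and
        # concentration values are synthetic, not the Cuprite data.
        import numpy as np

        def band_depth(r_band: float, r_left: float, r_right: float) -> float:
            """Depth of the 2.12-um feature relative to a straight-line continuum."""
            r_continuum = 0.5 * (r_left + r_right)     # simple two-point continuum
            return 1.0 - r_band / r_continuum

        # Synthetic calibration set: NH4 concentration (wt%) vs. measured reflectances.
        nh4 = np.array([0.2, 0.5, 1.0, 1.5, 2.0])
        depths = np.array([band_depth(r, 0.55, 0.57) for r in (0.54, 0.51, 0.46, 0.41, 0.36)])

        slope, intercept = np.polyfit(depths, nh4, deg=1)
        unknown_depth = band_depth(0.44, 0.55, 0.57)
        print(f"estimated NH4 ~ {slope * unknown_depth + intercept:.2f} wt%")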

  3. Quantitative evaluation of analyte transport on microfluidic paper-based analytical devices (μPADs).

    PubMed

    Ota, Riki; Yamada, Kentaro; Suzuki, Koji; Citterio, Daniel

    2018-02-07

    The transport efficiency during capillary flow-driven sample transport on microfluidic paper-based analytical devices (μPADs) made from filter paper has been investigated for a selection of model analytes (Ni²⁺, Zn²⁺, Cu²⁺, PO₄³⁻, bovine serum albumin, sulforhodamine B, amaranth) representing metal cations, complex anions, proteins and anionic molecules. For the first time, the transport of the analytical target compounds, rather than the sample liquid, has been quantitatively evaluated by means of colorimetry and absorption spectrometry-based methods. The experiments have revealed that small paperfluidic channel dimensions, additional user operation steps (e.g. control of sample volume, sample dilution, washing step) as well as the introduction of sample liquid wicking areas make it possible to increase analyte transport efficiency. It is also shown that the interaction of analytes with the negatively charged cellulosic paper substrate surface is strongly influenced by the physico-chemical properties of the model analyte and can in some cases (Cu²⁺) result in nearly complete analyte depletion during sample transport. The quantitative information gained through these experiments is expected to contribute to the development of more sensitive μPADs.

  4. A GATE evaluation of the sources of error in quantitative {sup 90}Y PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydhorst, Jared, E-mail: jared.strydhorst@gmail.

    Purpose: Accurate reconstruction of the dose delivered by ⁹⁰Y microspheres using a postembolization PET scan would permit the establishment of more accurate dose–response relationships for treatment of hepatocellular carcinoma with ⁹⁰Y. However, the quality of the PET data obtained is compromised by several factors, including poor count statistics and a very high random fraction. This work uses Monte Carlo simulations to investigate what impact factors other than low count statistics have on the quantification of ⁹⁰Y PET. Methods: PET acquisitions of two phantoms (a NEMA PET phantom and the NEMA IEC PET body phantom) containing either ⁹⁰Y or ¹⁸F were simulated using GATE. Simulated projections were created with subsets of the simulation data allowing the contributions of random, scatter, and LSO background to be independently evaluated. The simulated projections were reconstructed using the commercial software for the simulated scanner, and the quantitative accuracy of the reconstruction and the contrast recovery of the reconstructed images were evaluated. Results: The quantitative accuracy of the ⁹⁰Y reconstructions was not strongly influenced by the high random fraction present in the projection data, and the activity concentration was recovered to within 5% of the known value. The contrast recovery measured for simulated ⁹⁰Y data was slightly poorer than that for simulated ¹⁸F data with similar count statistics. However, the degradation was not strongly linked to any particular factor. Using a more restricted energy range to reduce the random fraction in the projections had no significant effect. Conclusions: Simulations of ⁹⁰Y PET confirm that quantitative ⁹⁰Y is achievable with the same approach as that used for ¹⁸F, and that there is likely very little margin for improvement by attempting to model aspects unique to ⁹⁰Y, such as the much higher random fraction or the presence of

  5. Identifying Anomalous Citations for Objective Evaluation of Scholarly Article Impact.

    PubMed

    Bai, Xiaomei; Xia, Feng; Lee, Ivan; Zhang, Jun; Ning, Zhaolong

    2016-01-01

    Evaluating the impact of a scholarly article is of great significance and has attracted considerable attention. Although citation-based evaluation approaches have been widely used, these approaches face limitations, e.g. in identifying anomalous citation patterns. Overlooking such patterns would inevitably introduce unfairness and inaccuracy into article impact evaluation. In this study, in order to discover anomalous citations and ensure the fairness and accuracy of research outcome evaluation, we investigate the citation relationships between articles using the following factors: collaboration times, the time span of collaboration, citing times and the time span of citing, to weaken Conflict of Interest (COI) relationships in the citation network. Meanwhile, we study a special kind of COI, namely the suspected COI relationship. Based on the COI relationship, we further put forward the COIRank algorithm, an innovative scheme for accurately assessing the impact of an article. Our method distinguishes citation strength, and utilizes the PageRank and HITS algorithms to rank scholarly articles comprehensively. The experiments are conducted on the American Physical Society (APS) dataset. We find that, of 26,366 articles, about 80.88% contain citations contributed by co-authors, and 75.55% of these are cited by authors belonging to the same affiliation, indicating that COI and suspected COI should not be ignored when evaluating the impact of scientific papers objectively. Moreover, our experimental results demonstrate that the COIRank algorithm significantly outperforms state-of-the-art solutions. The validity of our approach is verified using the probability of Recommendation Intensity.
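
    The general idea of down-weighting conflict-of-interest citations before ranking can be sketched as below. This is not the COIRank algorithm: the toy citation graph, the COI flags and the 0.2 down-weighting factor are hypothetical, and only PageRank (not HITS) is run.

        # Hedged sketch of the general idea (not the COIRank algorithm itself): citation
        # edges flagged as conflict-of-interest are down-weighted before running
        # PageRank on the citation graph. The toy graph, the COI flags and the 0.2
        # down-weighting factor are all hypothetical.
        import networkx as nx

        citations = [                    # (citing, cited, is_coi)
            ("A", "B", False), ("A", "C", False), ("B", "C", True),
            ("C", "A", False), ("D", "C", False), ("D", "B", True), ("B", "D", True),
        ]

        G = nx.DiGraph()
        for citing, cited, is_coi in citations:
            weight = 0.2 if is_coi else 1.0          # weaken suspected self-serving citations
            G.add_edge(citing, cited, weight=weight)

        scores = nx.pagerank(G, alpha=0.85, weight="weight")
        for paper, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{paper}: {score:.3f}")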

  6. Comprehensive benefit analysis of regional water resources based on multi-objective evaluation

    NASA Astrophysics Data System (ADS)

    Chi, Yixia; Xue, Lianqing; Zhang, Hui

    2018-01-01

    The purpose of comprehensive water resources benefit analysis is to maximize the combined social, economic and ecological-environmental benefits. To address the defects of the traditional analytic hierarchy process in water resources evaluation, this paper proposed a comprehensive benefit evaluation index covering social, economic and environmental benefits across the social, economic and environmental systems; determined the index weights with an improved fuzzy analytic hierarchy process (AHP); calculated the relative index of comprehensive water resources benefit; and analyzed the comprehensive benefit of water resources in Xiangshui County with a multi-objective evaluation model. Based on water resources data for Xiangshui County, 20 main comprehensive benefit assessment factors for the 5 districts belonging to Xiangshui County were evaluated. The results showed that the comprehensive benefit of Xiangshui County was 0.7317, and that the social economy still has room for further development under the current water resources situation.
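
    For reference, the sketch below shows classic (crisp) AHP weight derivation, i.e. the principal eigenvector of a pairwise comparison matrix plus a consistency check; the paper's improved fuzzy AHP is not reproduced, and the 3x3 comparison matrix over the social, economic and environmental criteria is hypothetical.

        # Hedged sketch of classic AHP weight derivation (the paper uses an improved
        # fuzzy AHP, which is not reproduced here): weights are the principal
        # eigenvector of a pairwise comparison matrix, with a consistency check.
        # The 3x3 comparison matrix (social vs economic vs environmental) is hypothetical.
        import numpy as np

        A = np.array([[1.0, 2.0, 3.0],
                      [1/2, 1.0, 2.0],
                      [1/3, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()

        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)          # consistency index
        ri = 0.58                                      # random index for n = 3 (Saaty)
        print("weights:", np.round(weights, 3), " consistency ratio:", round(ci / ri, 3))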

  7. Quantitative electromyography in ambulatory boys with Duchenne muscular dystrophy.

    PubMed

    Verma, Sumit; Lin, Jenny; Travers, Curtis; McCracken, Courtney; Shah, Durga

    2017-12-01

    This study's objective was to evaluate quantitative electromyography (QEMG) using multiple-motor-unit (multi-MUP) analysis in Duchenne muscular dystrophy (DMD). Ambulatory DMD boys, aged 5-15 years, were evaluated with QEMG at 6-month intervals over 14 months. EMG was performed in the right biceps brachii (BB) and tibialis anterior (TA) muscles. Normative QEMG data were obtained from age-matched healthy boys. Wilcoxon signed-rank tests were performed. Eighteen DMD subjects were enrolled, with a median age of 7 (interquartile range 7-10) years. Six-month evaluations were performed on 14 subjects. QEMG showed significantly abnormal mean MUP duration in BB and TA muscles, with no significant change over 6 months. QEMG is a sensitive electrophysiological marker of myopathy in DMD. Preliminary data do not reflect a significant change in MUP parameters over a 6-month interval; long-term follow-up QEMG studies are needed to understand its role as a biomarker for disease progression. Muscle Nerve 56: 1361-1364, 2017. © 2017 Wiley Periodicals, Inc.

  8. TECHNOLOGICAL INNOVATION IN NEUROSURGERY: A QUANTITATIVE STUDY

    PubMed Central

    Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar

    2015-01-01

    Object Technological innovation within healthcare may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technologically intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical technique. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation respectively. Methods A patent database was searched between 1960 and 2010 using the search terms “neurosurgeon” OR “neurosurgical” OR “neurosurgery”. The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top performing technology cluster was then selected as an exemplar for more detailed analysis of individual patents. Results In all, 11,672 patents and 208,203 publications relating to neurosurgery were identified. The top performing technology clusters over the 50 years were: image guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes and endoscopes. Image guidance and neuromodulation devices demonstrated a highly correlated rapid rise in patents and publications, suggesting they are areas of technology expansion. In-depth analysis of neuromodulation patents revealed that the majority of high performing patents were related to Deep Brain Stimulation (DBS). Conclusions Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery. PMID:25699414

  9. Quantitative evaluation of the CEEM soil sampling intercomparison.

    PubMed

    Wagner, G; Lischer, P; Theocharopoulos, S; Muntau, H; Desaules, A; Quevauviller, P

    2001-01-08

    The aim of the CEEM soil project was to compare and to test the soil sampling and sample preparation guidelines used in the member states of the European Union and Switzerland for investigations of background and large-scale contamination of soils, soil monitoring and environmental risk assessments. The results of the comparative evaluation of the sampling guidelines demonstrated that, in soil contamination studies carried out with different sampling strategies and methods, comparable results can hardly be expected. Therefore, a reference database (RDB) was established by the organisers, which acted as a basis for the quantitative comparison of the participants' results. The detected deviations were related to the methodological details of the individual strategies. The comparative evaluation concept consisted of three steps: The first step was a comparison of the participants' samples (which were both centrally and individually analysed) between each other, as well as with the reference data base (RDB) and some given soil quality standards on the level of concentrations present. The comparison was made using the example of the metals cadmium, copper, lead and zinc. As a second step, the absolute and relative deviations between the reference database and the participants' results (both centrally analysed under repeatability conditions) were calculated. The comparability of the samples with the RDB was categorised on four levels. Methods of exploratory statistical analysis were applied to estimate the differential method bias among the participants. The levels of error caused by sampling and sample preparation were compared with those caused by the analytical procedures. As a third step, the methodological profiles of the participants were compiled to concisely describe the different procedures used. They were related to the results to find out the main factors leading to their incomparability. The outcome of this evaluation process was a list of strategies and

  10. A Quantitative Technique for Beginning Microscopists.

    ERIC Educational Resources Information Center

    Sundberg, Marshall D.

    1984-01-01

    Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)

  11. Quantitative MR imaging in fracture dating--Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determination of bone fractures in a forensic context (e.g. in cases of child abuse), improved knowledge of the time course of the healing process and the use of non-invasive modern imaging technology are of high importance. To date, fracture dating is based on radiographic methods by determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined for this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11 ♂: 6; median age 34 ± 15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or in the collar bone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area by defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7 ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895 ± 607 ms), which decreased over time to a value of 1094 ± 182 ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115 ± 80 ms) and decreased to 73 ± 33 ms within 21 days after the fracture event. After that time point, no

  12. Qualitative and quantitative evaluation of solvent systems for countercurrent separation.

    PubMed

    Friesen, J Brent; Ahmed, Sana; Pauli, Guido F

    2015-01-16

    Rational solvent system selection for countercurrent chromatography and centrifugal partition chromatography technology (collectively known as countercurrent separation) studies continues to be a scientific challenge as the fundamental questions of comparing polarity range and selectivity within a solvent system family and between putative orthogonal solvent systems remain unanswered. The current emphasis on metabolomic investigations and analysis of complex mixtures necessitates the use of successive orthogonal countercurrent separation (CS) steps as part of complex fractionation protocols. Addressing the broad range of metabolite polarities demands development of new CS solvent systems with appropriate composition, polarity (π), selectivity (σ), and suitability. In this study, a mixture of twenty commercially available natural products, called the GUESSmix, was utilized to evaluate both solvent system polarity and selectively characteristics. Comparisons of GUESSmix analyte partition coefficient (K) values give rise to a measure of solvent system polarity range called the GUESSmix polarity index (GUPI). Solvatochromic dye and electrical permittivity measurements were also evaluated in quantitatively assessing solvent system polarity. The relative selectivity of solvent systems were evaluated with the GUESSmix by calculating the pairwise resolution (αip), the number of analytes found in the sweet spot (Nsw), and the pairwise resolution of those sweet spot analytes (αsw). The combination of these parameters allowed for both intra- and inter-family comparison of solvent system selectivity. Finally, 2-dimensional reciprocal shifted symmetry plots (ReSS(2)) were created to visually compare both the polarities and selectivities of solvent system pairs. This study helps to pave the way to the development of new solvent systems that are amenable to successive orthogonal CS protocols employed in metabolomic studies. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

    Intraoperative evaluation of the efficacy of Deep Brain Stimulation includes evaluation of the effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was proposed previously, aiming to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, hence supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band, placed on the patient's hand, which communicated its measurements to a laptop. The latter computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on using a general rigidity reduction model, regardless of the initial severity of the symptom. Thus, to enhance the performance of the previously presented system, we aimed to develop models for high and low baseline rigidity, according to the examiner's assessment before any stimulation. This would allow a more patient-oriented approach. Additionally, usability was improved by having in situ processing in a smartphone, instead of a computer. The system has been shown to be reliable, presenting an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, supporting a physician in decision-making when setting stimulation parameters.

  14. A quantitative approach to evolution of music and philosophy

    NASA Astrophysics Data System (ADS)

    Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano

    2012-08-01

    The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.

  15. Technology Efficacy in Active Prosthetic Knees for Transfemoral Amputees: A Quantitative Evaluation

    PubMed Central

    El-Sayed, Amr M.; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prosthesis. Other studies have examined the amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights, that is, a technical examination of the components used, with an evaluation of how these improved the gait of respective users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prosthesis with respect to its functional walking performance amongst above-knee amputee users, to evaluate the system's efficacy in producing close-to-normal user performance. The performances of its actuator, sensory system, and control technique that are incorporated in each reported system were evaluated separately and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, the conclusion is limitedly extendable due to the small number of studies. Thus, more clinical validation of the active prosthetic knee technology is needed to better understand the extent of contribution of each component to the most functional development. PMID:25110727
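
    The comparison metric described above reduces to a percentage deviation from a normal gait profile point, as in the short sketch below; the gait parameter chosen (peak knee flexion) and all numbers are hypothetical.

        # Hedged sketch of the comparison metric described in the abstract: percentage
        # deviation of an amputee gait parameter from the corresponding normal gait
        # profile point. The example values (peak knee flexion during swing) are
        # hypothetical.
        def percent_deviation(measured: float, normal: float) -> float:
            return 100.0 * abs(measured - normal) / abs(normal)

        normal_peak_knee_flexion = 60.0       # degrees, normal gait profile point (assumed)
        for device, measured in [("active knee A", 57.5), ("active knee B", 52.0)]:
            print(f"{device}: {percent_deviation(measured, normal_peak_knee_flexion):.1f}% deviation")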

  16. Technology efficacy in active prosthetic knees for transfemoral amputees: a quantitative evaluation.

    PubMed

    El-Sayed, Amr M; Hamzaid, Nur Azah; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prosthesis. Other studies have examined the amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights, that is, a technical examination of the components used, with an evaluation of how these improved the gait of respective users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prosthesis with respect to its functional walking performance amongst above-knee amputee users, to evaluate the system's efficacy in producing close-to-normal user performance. The performances of its actuator, sensory system, and control technique that are incorporated in each reported system were evaluated separately and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified particular components that contributed closest to normal gait parameters. However, the conclusion is limitedly extendable due to the small number of studies. Thus, more clinical validation of the active prosthetic knee technology is needed to better understand the extent of contribution of each component to the most functional development.

  17. How much are Harry Potter’s glasses worth? Children’s monetary evaluation of authentic objects

    PubMed Central

    Gelman, Susan A.; Frazier, Brandy N.; Noles, Nicholaus S.; Manczak, Erika M.; Stilwell, Sarah M.

    2014-01-01

    Adults attach special value to objects that link to notable people or events – authentic objects. We examined children’s monetary evaluation of authentic objects, focusing on four kinds: celebrity possessions (e.g., Harry Potter’s glasses), original creations (e.g., the very first teddy bear), personal possessions (e.g., your grandfather’s baseball glove), and merely old items (e.g., an old chair). Children ages 4–12 years and adults (N= 151) were asked how much people would pay for authentic and control objects. Young children consistently placed greater monetary value on celebrity possessions than original creations, even when adults judged the two kinds of items to be equivalent. These results suggest that contact with a special individual may be the foundation for the value placed on authentic objects. PMID:25663829

  18. Quantitative evaluation of photoplethysmographic artifact reduction for pulse oximetry

    NASA Astrophysics Data System (ADS)

    Hayes, Matthew J.; Smith, Peter R.

    1999-01-01

    Motion artefact corruption of pulse oximeter output, causing both measurement inaccuracies and false alarm conditions, is a primary restriction in the current clinical practice and future applications of this useful technique. Artefact reduction in photoplethysmography (PPG), and therefore by application in pulse oximetry, is demonstrated using a novel non-linear methodology recently proposed by the authors. The significance of these processed PPG signals for pulse oximetry measurement is discussed, with particular attention to the normalization inherent in the artefact reduction process. Quantitative experimental investigation of the performance of PPG artefact reduction is then utilized to evaluate this technology for application to pulse oximetry. While the successfully demonstrated reduction of severe artefacts may widen the applicability of all PPG technologies and decrease the occurrence of pulse oximeter false alarms, the observed reduction of slight artefacts suggests that many such effects may go unnoticed in clinical practice. The signal processing and output averaging used in most commercial oximeters can incorporate these artefact errors into the output, while masking the true PPG signal corruption. It is therefore suggested that PPG artefact reduction should be incorporated into conventional pulse oximetry measurement, even in the absence of end-user artefact problems.

  19. Progress in quantitative GPR development at CNDE

    NASA Astrophysics Data System (ADS)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott

    2014-02-01

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.

  20. Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy

    NASA Astrophysics Data System (ADS)

    Sugiyama, Naruhisa; Shirakawa, Tomohiro

    2017-07-01

    The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.

  1. Evaluation of background parenchymal enhancement on breast MRI: a systematic review

    PubMed Central

    Signori, Alessio; Valdora, Francesca; Rossi, Federica; Calabrese, Massimo; Durando, Manuela; Mariscotto, Giovanna; Tagliafico, Alberto

    2017-01-01

    Objective: To perform a systematic review of the methods used for background parenchymal enhancement (BPE) evaluation on breast MRI. Methods: Studies dealing with BPE assessment on breast MRI were retrieved from major medical libraries independently by four reviewers up to 6 October 2015. The keywords used for database searching are “background parenchymal enhancement”, “parenchymal enhancement”, “MRI” and “breast”. The studies were included if qualitative and/or quantitative methods for BPE assessment were described. Results: Of the 420 studies identified, a total of 52 articles were included in the systematic review. 28 studies performed only a qualitative assessment of BPE, 13 studies performed only a quantitative assessment and 11 studies performed both qualitative and quantitative assessments. A wide heterogeneity was found in the MRI sequences and in the quantitative methods used for BPE assessment. Conclusion: A wide variability exists in the quantitative evaluation of BPE on breast MRI. More studies focused on a reliable and comparable method for quantitative BPE assessment are needed. Advances in knowledge: More studies focused on a quantitative BPE assessment are needed. PMID:27925480

  2. Semi-Automatic Segmentation Software for Quantitative Clinical Brain Glioblastoma Evaluation

    PubMed Central

    Zhu, Y; Young, G; Xue, Z; Huang, R; You, H; Setayesh, K; Hatabu, H; Cao, F; Wong, S.T.

    2012-01-01

    Rationale and Objectives Quantitative measurement provides essential information about disease progression and treatment response in patients with Glioblastoma multiforme (GBM). The goal of this paper is to present and validate a software pipeline for semi-automatic GBM segmentation, called AFINITI (Assisted Follow-up in NeuroImaging of Therapeutic Intervention), using clinical data from GBM patients. Materials and Methods Our software adopts the current state-of-the-art tumor segmentation algorithms and combines them into one clinically usable pipeline. Both the advantages of the traditional voxel-based and the deformable shape-based segmentation are embedded into the software pipeline. The former provides an automatic tumor segmentation scheme based on T1- and T2-weighted MR brain data, and the latter refines the segmentation results with minimal manual input. Results Twenty six clinical MR brain images of GBM patients were processed and compared with manual results. The results can be visualized using the embedded graphic user interface (GUI). Conclusion Validation results using clinical GBM data showed high correlation between the AFINITI results and manual annotation. Compared to the voxel-wise segmentation, AFINITI yielded more accurate results in segmenting the enhanced GBM from multimodality MRI data. The proposed pipeline could be used as additional information to interpret MR brain images in neuroradiology. PMID:22591720

  3. Establishment of a new method to quantitatively evaluate hyphal fusion ability in Aspergillus oryzae.

    PubMed

    Tsukasaki, Wakako; Maruyama, Jun-Ichi; Kitamoto, Katsuhiko

    2014-01-01

    Hyphal fusion is involved in the formation of an interconnected colony in filamentous fungi, and it is the first process in sexual/parasexual reproduction. However, evaluating hyphal fusion efficiency in Aspergillus oryzae has been difficult because of its low fusion frequency, despite the industrial significance of this species. Here, we established a method to quantitatively evaluate the hyphal fusion ability of A. oryzae using a mixed culture of two different auxotrophic strains, where the ratio of heterokaryotic conidia growing without the auxotrophic requirements reflects the hyphal fusion efficiency. By employing this method, it was demonstrated that AoSO and AoFus3 are required for hyphal fusion, and that the hyphal fusion efficiency of A. oryzae was increased by depleting the nitrogen source, including large amounts of carbon source, and adjusting the pH to 7.0.

  4. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A., E-mail: willi.kalender@imp.uni-erlangen.de

    Purpose: Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. Methods: The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. Results: The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations
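
    As a rough illustration of one step described above, computing an MTF from an edge profile, the following sketch differentiates an edge spread function and takes the magnitude of its Fourier transform. The synthetic edge and the pixel spacing are assumptions; this is not the authors' phantom data or software.

```python
import numpy as np

# Minimal sketch: MTF from an edge spread function (ESF).
# A smooth synthetic edge stands in for a profile sampled across an
# aluminum-sphere edge; the pixel spacing is an assumed value.
pixel_mm = 0.15
x = np.arange(-64, 64) * pixel_mm
esf = 1.0 / (1.0 + np.exp(-x / 0.3))           # synthetic blurred edge

lsf = np.gradient(esf, pixel_mm)               # line spread function
lsf /= lsf.sum()                               # normalize area to 1

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                                  # MTF(0) = 1
freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)  # cycles per mm

# Spatial frequency where the MTF first drops below 10%
f10 = freqs[np.argmax(mtf < 0.1)]
print(f"MTF10 at {f10:.2f} cycles/mm")
```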

  5. A simple hemostasis model for the quantitative evaluation of hydrogel-based local hemostatic biomaterials on tissue surface.

    PubMed

    Murakami, Yoshihiko; Yokoyama, Masayuki; Nishida, Hiroshi; Tomizawa, Yasuko; Kurosawa, Hiromi

    2008-09-01

    Several hemostat hydrogels are clinically used, and some other agents are studied for safer, more facile, and more efficient hemostasis. In the present paper, we proposed a novel method for evaluating local hemostat hydrogels on tissue surfaces. The procedure consisted of the following steps: (step 1) a mouse was fixed on a cork board, and its abdomen was incised; (step 2) serous fluid was carefully removed because it affected the estimation of the weight gained by the filter paper, and parafilm and preweighed filter paper were placed beneath the liver (parafilm prevented the filter paper's absorption of gradually oozing serous fluid); (step 3) the cork board was tilted and maintained at an angle of about 45 degrees so that the bleeding would more easily flow from the liver toward the filter paper; and (step 4) the bleeding lasted for 3 min. In this step, a hemostat was applied to the liver wound immediately after the liver was pricked with a needle. We found that (1) a careful removal of serous fluid prior to bleeding and (2) a quantitative determination of the amount of excess aqueous solution that oozed out from a hemostat were important to a rigorous evaluation of hemostat efficacy. We successfully evaluated the efficacy of a fibrin-based hemostat hydrogel by using our method. The method proposed in the present study enabled the quantitative, accurate, and easy evaluation of the efficacy of local hemostatic hydrogels that act as tissue-adhesive agents on biointerfaces.

  6. Quantitative T2 magnetic resonance imaging compared to morphological grading of the early cervical intervertebral disc degeneration: an evaluation approach in asymptomatic young adults.

    PubMed

    Chen, Chun; Huang, Minghua; Han, Zhihua; Shao, Lixin; Xie, Yan; Wu, Jianhong; Zhang, Yan; Xin, Hongkui; Ren, Aijun; Guo, Yong; Wang, Deli; He, Qing; Ruan, Dike

    2014-01-01

    The objective of this study was to evaluate the efficacy of quantitative T2 magnetic resonance imaging (MRI) for quantifying early cervical intervertebral disc (IVD) degeneration in asymptomatic young adults by correlating the T2 value with Pfirrmann grade, sex, and anatomic level. Seventy asymptomatic young subjects (34 men and 36 women; mean age, 22.80±2.11 yr; range, 18-25 years) underwent 3.0-T MRI to obtain morphological data (one T1-fast spin echo (FSE) and three-plane T2-FSE, used to assign a Pfirrmann grade (I-V)) and for T2 mapping (multi-echo spin echo). T2 values in the nucleus pulposus (NP, n = 350) and anulus fibrosus (AF, n = 700) were obtained. Differences in T2 values between sexes and anatomic level were evaluated, and linear correlation analysis of T2 values versus degenerative grade was conducted. Cervical IVDs of healthy young adults were commonly determined to be at Pfirrmann grades I and II. T2 values of NPs were significantly higher than those of AF at all anatomic levels (P < 0.001). The NP, anterior AF and posterior AF values did not differ significantly between genders at the same anatomic level (P > 0.05). T2 values decreased linearly with degenerative grade. Linear correlation analysis revealed a strong negative association between the Pfirrmann grade and the T2 values of the NP (P < 0.001) but not the T2 values of the AF (P = 0.854). However, non-degenerated discs (Pfirrmann grades I and II) showed a wide range of T2 relaxation time. T2 values according to disc degeneration level classification were as follows: grade I (>62.03 ms), grade II (54.60-62.03 ms), grade III (<54.60 ms). T2 quantitation provides a more sensitive and robust approach for detecting and characterizing the early stage of cervical IVD degeneration and for establishing a reliable quantitative reference in healthy young adults.

  7. Quantitative skeletal evaluation based on cervical vertebral maturation: a longitudinal study of adolescents with normal occlusion.

    PubMed

    Chen, L; Liu, J; Xu, T; Long, X; Lin, J

    2010-07-01

    The study aims were to investigate the correlation between vertebral shape and hand-wrist maturation and to select characteristic parameters of C2-C5 (the second to fifth cervical vertebrae) for cervical vertebral maturation determination by mixed longitudinal data. 87 adolescents (32 males, 55 females) aged 8-18 years with normal occlusion were studied. Sequential lateral cephalograms and hand-wrist radiographs were taken annually for 6 consecutive years. Lateral cephalograms were divided into 11 maturation groups according to Fishman Skeletal Maturity Indicators (SMI). 62 morphological measurements of C2-C5 at 11 different developmental stages (SMI1-11) were measured and analysed. Locally weighted scatterplot smoothing, correlation coefficient analysis and variable cluster analysis were used for statistical analysis. Of the 62 cervical vertebral parameters, 44 were positively correlated with SMI, 6 were negatively correlated and 12 were not correlated. The correlation coefficients between cervical vertebral parameters and SMI were relatively high. Characteristic parameters for quantitative analysis of cervical vertebral maturation were selected. In summary, cervical vertebral maturation could be used reliably to evaluate the skeletal stage instead of the hand-wrist radiographic method. Selected characteristic parameters offered a simple and objective reference for the assessment of skeletal maturity and timing of orthognathic surgery. Copyright 2010 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  8. Evaluation Report of the Presidential/Secretarial Objective -- School Management Options Available to Indian People. Research and Evaluation Report Series No. 29.08.

    ERIC Educational Resources Information Center

    National Indian Training and Research Center, Tempe, AZ.

    Presenting evaluations of the Bureau of Indian Affairs' (BIA) implementation of the School Management Option Project, a project designed to further American Indian control of schools and elevated to the status of Presidential/Secretarial Objective (P/SO) in 1974, this report includes separate evaluations by the National Indian Training and…

  9. Quantitative and Qualitative Evaluation of Iranian Researchers' Scientific Production in Dentistry Subfields.

    PubMed

    Yaminfirooz, Mousa; Motallebnejad, Mina; Gholinia, Hemmat; Esbakian, Somayeh

    2015-10-01

    As in other fields of medicine, scientific production in the field of dentistry occupies a significant place. This study aimed at quantitatively and qualitatively evaluating Iranian researchers' scientific output in the field of dentistry and determining their contribution to each of the dentistry subfields and branches. This research was a scientometric study that applied quantitative and qualitative indices of Web of Science (WoS). The research population consisted of 927 indexed documents published under the name of Iran in the time span of 1993-2012, which were extracted from WoS on 10 March 2013. The Mann-Whitney test and Pearson correlation coefficient were used for data analysis in SPSS 19. 777 (83.73%) of the indexed items of all scientific output in WoS were scientific articles. The highest growth rate of scientific production, 90%, belonged to the endodontics subfield. The correlation coefficient test showed that there was a significant positive relationship between the number of documents and their publication age (P < 0.0001). There was a significant difference between the mean number of published articles in the first ten-year period (1993-2003) and that of the second one (2004-2013), in favor of the latter (P = 0.001). The distribution frequencies of scientific production in various subfields of dentistry were very different. The infrastructure therefore needs to be reinforced to achieve more balanced scientific production in the field and its related subfields.

  10. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data*

    PubMed Central

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G.; Khanna, Sanjeev

    2017-01-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings. PMID:29151821
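
    To make the idea of an incremental, key-partitioned quantitative query concrete, here is a plain-Python baseline that computes a running mean per key with constant work per item. It is only a sketch of the query semantics, not the StreamQRE combinator API or its Java implementation described above.

```python
from collections import defaultdict
from typing import Iterable, Tuple

# Minimal sketch of an incremental, key-partitioned quantitative query
# (a running mean per key). This is a plain-Python baseline, not the
# StreamQRE language itself.
def running_mean_per_key(stream: Iterable[Tuple[str, float]]):
    count = defaultdict(int)
    total = defaultdict(float)
    for key, value in stream:            # constant work per item
        count[key] += 1
        total[key] += value
        yield key, total[key] / count[key]

events = [("sensorA", 2.0), ("sensorB", 10.0), ("sensorA", 4.0)]
for key, mean in running_mean_per_key(events):
    print(key, mean)
```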

  11. Automatized image processing of bovine blastocysts produced in vitro for quantitative variable determination

    NASA Astrophysics Data System (ADS)

    Rocha, José Celso; Passalia, Felipe José; Matos, Felipe Delestro; Takahashi, Maria Beatriz; Maserati, Marc Peter, Jr.; Alves, Mayra Fernanda; de Almeida, Tamie Guibu; Cardoso, Bruna Lopes; Basso, Andrea Cristina; Nogueira, Marcelo Fábio Gouveia

    2017-12-01

    There is currently no objective, real-time and non-invasive method for evaluating the quality of mammalian embryos. In this study, we processed images of in vitro produced bovine blastocysts to obtain a deeper comprehension of the embryonic morphological aspects that are related to the standard evaluation of blastocysts. Information was extracted from 482 digital images of blastocysts. The resulting imaging data were individually evaluated by three experienced embryologists who graded their quality. To avoid evaluation bias, each image was related to the modal value of the evaluations. Automated image processing produced 36 quantitative variables for each image. The images, the modal and individual quality grades, and the variables extracted could potentially be used in the development of artificial intelligence techniques (e.g., evolutionary algorithms and artificial neural networks), multivariate modelling and the study of defined structures of the whole blastocyst.
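
    For readers unfamiliar with automated image-feature extraction, the sketch below computes a few first-order quantitative variables from a grayscale image. The random image and the chosen features are illustrative assumptions; they are not the 36 variables extracted in the study.

```python
import numpy as np

# Minimal sketch: a few first-order quantitative variables from a
# grayscale blastocyst-like image. Image and features are illustrative.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(256, 256)).astype(float)

hist, _ = np.histogram(image, bins=256, range=(0, 256), density=True)
hist = hist[hist > 0]                      # drop empty bins before log

features = {
    "mean_intensity": image.mean(),
    "std_intensity": image.std(),
    "histogram_entropy": -(hist * np.log2(hist)).sum(),
}
print(features)
```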

  12. Speckle-free and halo-free low coherent Mach-Zehnder quantitative-phase-imaging module as a replacement of objective lens in conventional inverted microscopes

    NASA Astrophysics Data System (ADS)

    Yamauchi, Toyohiko; Yamada, Hidenao; Matsui, Hisayuki; Yasuhiko, Osamu; Ueda, Yukio

    2018-02-01

    We developed a compact Mach-Zehnder interferometer module to be used as a replacement of the objective lens in a conventional inverted microscope (Nikon, TS100-F) in order to make them quantitative phase microscopes. The module has a 90-degree-flipped U-shape; the dimensions of the module are 160 mm by 120 mm by 40 mm and the weight is 380 grams. The Mach-Zehnder interferometer equipped with the separate reference and sample arms was implemented in this U-shaped housing and the path-length difference between the two arms was manually adjustable. The sample under test was put on the stage of the microscope and a sample light went through it. Both arms had identical achromatic lenses for image formation and the lateral positions of them were also manually adjustable. Therefore, temporally and spatially low coherent illumination was applicable because the users were able to balance precisely the path length of the two arms and to overlap the two wavefronts. In the experiment, spectrally filtered LED light for illumination (wavelength = 633 nm and bandwidth = 3 nm) was input to the interferometer module via a 50 micrometer core optical fiber. We have successfully captured full-field interference images by a camera put on the trinocular tube of the microscope and constructed quantitative phase images of the cultured cells by means of the quarter-wavelength phase shifting algorithm. The resultant quantitative phase images were speckle-free and halo-free due to spectrally and spatially low coherent illumination.
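
    The quarter-wavelength (four-step) phase-shifting algorithm mentioned above has a standard closed form: four interferograms shifted by 0, pi/2, pi and 3*pi/2 yield the wrapped phase via an arctangent. The sketch below demonstrates it on a synthetic phase map; the object and modulation values are assumptions, not the authors' cell data.

```python
import numpy as np

# Minimal sketch of the four-step (quarter-wavelength) phase-shifting
# algorithm. The synthetic phase map stands in for a cell image.
shape = (128, 128)
yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
true_phase = 0.02 * ((xx - 64) ** 2 + (yy - 64) ** 2) ** 0.5  # toy object

shifts = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
frames = [1.0 + 0.8 * np.cos(true_phase + s) for s in shifts]
I1, I2, I3, I4 = frames

wrapped = np.arctan2(I4 - I2, I1 - I3)   # quantitative phase (wrapped)
print(wrapped.shape, wrapped.min(), wrapped.max())
```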

  13. An Evaluation of Learning Objects in Singapore Primary Education: A Case Study Approach

    ERIC Educational Resources Information Center

    Grace, Tay Pei Lyn; Suan, Ng Peck; Wanzhen, Liaw

    2008-01-01

    Purpose: The purpose of this paper is to evaluate the usability and interface design of e-learning portal developed for primary schools in Singapore. Design/methodology/approach: Using Singapore-based learning EDvantage (LEAD) portal as a case study, this paper reviews and analyses the usability and usefulness of embedded learning objects (LOs)…

  14. Quantitative contrast enhanced magnetic resonance imaging for the evaluation of peripheral arterial disease: a comparative study versus standard digital angiography.

    PubMed

    Pavlovic, Chris; Futamatsu, Hideki; Angiolillo, Dominick J; Guzman, Luis A; Wilke, Norbert; Siragusa, Daniel; Wludyka, Peter; Percy, Robert; Northrup, Martin; Bass, Theodore A; Costa, Marco A

    2007-04-01

    The purpose of this study is to evaluate the accuracy of semiautomated analysis of contrast enhanced magnetic resonance angiography (MRA) in patients who have undergone standard angiographic evaluation for peripheral vascular disease (PVD). Magnetic resonance angiography is an important tool for evaluating PVD. Although this technique is both safe and noninvasive, the accuracy and reproducibility of quantitative measurements of disease severity using MRA in the clinical setting have not been fully investigated. 43 lesions in 13 patients who underwent both MRA and digital subtraction angiography (DSA) of iliac and common femoral arteries within 6 months were analyzed using quantitative magnetic resonance angiography (QMRA) and quantitative vascular analysis (QVA). Analysis was repeated by a second operator and by the same operator after approximately 1 month. QMRA underestimated percent diameter stenosis (%DS) compared to measurements made with QVA by 2.47%. Limits of agreement between the two methods were +/- 9.14%. Interobserver variability in measurements of %DS was +/- 12.58% for QMRA and +/- 10.04% for QVA. Intraobserver variability of %DS for QMRA was +/- 4.6% and for QVA was +/- 8.46%. QMRA displays a high level of agreement with QVA when used to determine stenosis severity in iliac and common femoral arteries. Similar levels of interobserver and intraobserver variability are present with each method. Overall, QMRA represents a useful method to quantify severity of PVD.
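
    The bias and limits-of-agreement figures quoted above come from a standard Bland-Altman style calculation; a minimal sketch follows. The paired %DS values are illustrative, not the study's lesion measurements.

```python
import numpy as np

# Minimal sketch: bias and 95% limits of agreement between two methods
# measuring percent diameter stenosis. The paired values are illustrative.
qmra = np.array([42.0, 55.0, 63.0, 71.0, 38.0, 49.0])
qva  = np.array([45.0, 57.0, 64.0, 75.0, 41.0, 50.0])

diff = qmra - qva
bias = diff.mean()                     # mean difference (systematic offset)
loa = 1.96 * diff.std(ddof=1)          # half-width of 95% limits of agreement

print(f"bias = {bias:.2f}% DS, limits of agreement = +/- {loa:.2f}% DS")
```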

  15. Evaluating fuzzy operators of an object-based image analysis for detecting landslides and their changes

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Blaschke, Thomas; Tiede, Dirk; Moghaddam, Mohammad Hossein Rezaei

    2017-09-01

    This article presents a method of object-based image analysis (OBIA) for landslide delineation and landslide-related change detection from multi-temporal satellite images. It uses both spatial and spectral information on landslides, through spectral analysis, shape analysis, textural measurements using a gray-level co-occurrence matrix (GLCM), and fuzzy logic membership functionality. Following an initial segmentation step, particular combinations of various information layers were investigated to generate objects. This was achieved by applying multi-resolution segmentation to IRS-1D, SPOT-5, and ALOS satellite imagery in sequential steps of feature selection and object classification, and using slope and flow direction derivatives from a digital elevation model together with topographically-oriented gray level co-occurrence matrices. Fuzzy membership values were calculated for 11 different membership functions using 20 landslide objects from a landslide training dataset. Six fuzzy operators were used for the final classification and the accuracies of the resulting landslide maps were compared. A Fuzzy Synthetic Evaluation (FSE) approach was adapted for validation of the results and for an accuracy assessment using the landslide inventory database. The FSE approach revealed that the AND operator performed best with an accuracy of 93.87% for 2005 and 94.74% for 2011, closely followed by the MEAN Arithmetic operator, while the OR and AND (*) operators yielded relatively low accuracies. An object-based change detection was then applied to monitor landslide-related changes that occurred in northern Iran between 2005 and 2011. Knowledge rules to detect possible landslide-related changes were developed by evaluating all possible landslide-related objects for both time steps.
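
    Three of the operators compared above have simple standard definitions: fuzzy AND as the minimum, fuzzy OR as the maximum, and the arithmetic MEAN of the membership values. The sketch below applies them to one image object; the membership values and the classification threshold are invented for illustration.

```python
import numpy as np

# Minimal sketch of fuzzy-operator combination for one image object.
# Each value is the object's membership in one landslide-related fuzzy
# membership function (values and threshold are illustrative).
memberships = np.array([0.82, 0.64, 0.91, 0.70])

combined = {
    "AND (minimum)": memberships.min(),
    "OR (maximum)": memberships.max(),
    "MEAN (arithmetic)": memberships.mean(),
}
threshold = 0.6   # assumed classification cut-off
for name, value in combined.items():
    label = "landslide" if value >= threshold else "non-landslide"
    print(f"{name}: {value:.2f} -> {label}")
```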

  16. Ultrasonographic evaluation of equine tendons: a quantitative in vitro study of the effects of amplifier gain level, transducer-tilt, and transducer-displacement.

    PubMed

    van Schie, J T; Bakker, E M; van Weeren, P R

    1999-01-01

    The objective of the in vitro experiments described in this paper was to quantify the effects of some instrumental variables on the quantitative evaluation, by means of first-order gray-level statistics, of ultrasonographic images of equine tendons. The experiments were done on three isolated equine superficial digital flexor tendons that were mounted in a frame and submerged in a waterbath. Sections with either normal tendon tissue, an acute lesion, or a chronic scar, were selected. In these sections, the following experiments were done: 1) a gradual increase of total amplifier gain output subdivided in 12 equal steps; 2) a transducer tilt plus or minus 3 degrees from perpendicular, with steps of 1 degree; and 3) a transducer displacement along, and perpendicular to, the tendon long axis, with 16 steps of 0.25 mm each. Transverse ultrasonographic images were collected, and in the regions of interest (ROI) first-order gray-level statistics were calculated to quantify the effects of each experiment. Some important observations were: 1) the total amplifier gain output has a substantial influence on the ultrasonographic image; for example, in the case of an acute lesion, a low gain setting results in an almost completely black image; whereas, with higher gain settings, a marked "filling in" effect on the lesion can be observed; 2) the relative effects of the tilting of the transducer are substantial in normal tendon tissue (18%) and chronic scar (12%); whereas, in the event of an acute lesion, the effects on the mean gray level are dramatic (40%); and 3) the relative effects of displacement of the transducer are small in normal tendon tissue, but on the other hand, the mean gray-level changes 7% in chronic scar, and even 20% in an acute lesion. In general, slight variations in scanner settings and transducer handling can have considerable effects on the gray levels of the ultrasonographic image. Furthermore, there is a strong indication that this quantitative method
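
    First-order gray-level statistics of a region of interest, as used above, are straightforward to compute; a minimal sketch follows. The random image and ROI coordinates are assumptions, not the tendon data from the study.

```python
import numpy as np
from scipy import stats

# Minimal sketch: first-order gray-level statistics inside a rectangular
# ROI of an ultrasonographic image. Image and ROI are synthetic.
rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(480, 640))
roi = image[200:260, 300:380].astype(float).ravel()

print({
    "mean": roi.mean(),
    "std": roi.std(ddof=1),
    "skewness": stats.skew(roi),
    "kurtosis": stats.kurtosis(roi),
})
```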

  17. Relationships between objective acoustic indices and acoustic comfort evaluation in nonacoustic spaces

    NASA Astrophysics Data System (ADS)

    Kang, Jian

    2004-05-01

    Much attention has been paid to acoustic spaces such as concert halls and recording studios, whereas research on nonacoustic buildings/spaces has been rather limited, especially from the viewpoint of acoustic comfort. In this research a series of case studies has been carried out on this topic, considering various spaces including shopping mall atrium spaces, library reading rooms, football stadia, swimming spaces, churches, dining spaces, as well as urban open public spaces. The studies focus on the relationships between objective acoustic indices such as sound pressure level and reverberation time and perceptions of acoustic comfort. The results show that the acoustic atmosphere is an important consideration in such spaces and the evaluation of acoustic comfort may vary considerably even if the objective acoustic indices are the same. It is suggested that current guidelines and technical regulations are insufficient in terms of acoustic design of these spaces, and the relationships established from the case studies between objective and subjective aspects would be useful for developing further design guidelines. [Work supported partly by the British Academy.]

  18. Automatic smoke evacuation in laparoscopic surgery: a simplified method for objective evaluation.

    PubMed

    Takahashi, Hidekazu; Yamasaki, Makoto; Hirota, Masashi; Miyazaki, Yasuaki; Moon, Jeong Ho; Souma, Yoshihito; Mori, Masaki; Doki, Yuichiro; Nakajima, Kiyokazu

    2013-08-01

    Although its theoretical usefulness has been reported, the true value of automatic smoke evacuation system in laparoscopic surgery remains unknown. This is mainly due to the lack of objective evaluation. The purpose of this study was to determine the efficacy of the automatic smoke evacuator in laparoscopic surgery, by real-time objective evaluation system using an industrial smoke-detection device. Six pigs were used in this study. Three surgical ports were placed and electrosurgical smoke was generated in a standard fashion, using either a high-frequency electrosurgical unit (HF-ESU) or laparosonic coagulating shears (LCS). The smoke was evacuated immediately in the evacuation group but not in the control nonevacuation group. The laparoscopic field-of-view was subjectively evaluated by ten independent surgeons. The composition of the surgical smoke was analyzed by mass spectrometry. The residual smoke in the abdominal cavity was aspirated manually into a smoke tester, and stains on a filter paper were image captured, digitized, and semiquantified. Subjective evaluation indicated superior field-of-view in the evacuation group, compared with the control, at 15 s after activation of the HF-ESU (P < 0.05). The smoke comprised various chemical compounds, including known carcinogens. The estimated volume of intra-abdominal residual smoke after activation of HF-ESU was significantly lower in the evacuation group (47.4 ± 16.6) than the control (76.7 ± 2.4, P = 0.0018). Only marginal amount of surgical smoke was detected in both groups after LCS when the tissue pad was free from burnt tissue deposits. However, the amount was significantly lower in the evacuation group (21.3 ± 10.7) than the control (75 ± 39.9, P = 0.044) when the tissue pad contained tissue sludge. Automatic smoke evacuation provides better field-of-view and reduces the risk of exposure to harmful compounds.

  19. Quantitative assessment in thermal image segmentation for artistic objects

    NASA Astrophysics Data System (ADS)

    Yousefi, Bardia; Sfarra, Stefano; Maldague, Xavier P. V.

    2017-07-01

    The application of thermal and infrared technology in different areas of research is increasing considerably. These applications include non-destructive testing (NDT), medical analysis (computer-aided diagnosis/detection, CAD), and arts and archaeology, among many others. In the arts and archaeology field, infrared technology makes significant contributions to finding defects in possibly impaired regions, and this has been done through a wide range of thermographic experiments and infrared methods. The approach proposed here applies known factor analysis methods, namely standard Non-Negative Matrix Factorization (NMF) optimized by gradient-descent-based multiplicative rules (SNMF1), standard NMF optimized by the Non-negative least squares (NNLS) active-set algorithm (SNMF2), and eigendecomposition approaches such as Principal Component Thermography (PCT) and Candid Covariance-Free Incremental Principal Component Thermography (CCIPCT), to obtain thermal features. On one hand, these methods are usually applied as preprocessing before clustering for the segmentation of possible defects; on the other hand, a wavelet-based data fusion combines the output of each method with PCT to increase the accuracy of the algorithm. The quantitative assessment of these approaches indicates good segmentation performance at reasonable computational complexity, confirming the outlined properties. In particular, a polychromatic wooden statue and a fresco were analyzed using the above-mentioned methods and interesting results were obtained.
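
    The factor-analysis step above amounts to decomposing a flattened thermal image sequence into a few component maps. The sketch below uses scikit-learn's generic NMF and PCA as stand-ins for the SNMF and PCT variants named in the abstract; the data are synthetic and the component count is an assumption.

```python
import numpy as np
from sklearn.decomposition import NMF, PCA

# Minimal sketch: factor-analysis features from a thermal image sequence.
# Frames are flattened into a (pixels x frames) matrix; generic NMF and PCA
# stand in for the SNMF and PCT variants discussed above. Data are synthetic.
rng = np.random.default_rng(0)
frames, height, width = 30, 32, 32
sequence = rng.random((frames, height, width))
X = sequence.reshape(frames, -1).T             # shape: (pixels, frames)

nmf_maps = NMF(n_components=3, init="nndsvda", max_iter=500,
               random_state=0).fit_transform(X)
pct_maps = PCA(n_components=3).fit_transform(X - X.mean(axis=1, keepdims=True))

# Each column can be reshaped into a per-pixel feature map for segmentation.
print(nmf_maps.reshape(height, width, 3).shape,
      pct_maps.reshape(height, width, 3).shape)
```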

  20. Objective evaluation of situation awareness for dynamic decision makers in teleoperations

    NASA Technical Reports Server (NTRS)

    Endsley, Mica R.

    1991-01-01

    Situation awareness, a current mental model of the environment, is critical to the ability of operators to perform complex and dynamic tasks. This should be particularly true for teleoperators, who are separated from the situation they need to be aware of. The design of the man-machine interface must be guided by the goal of maintaining and enhancing situation awareness. The objective of this work has been to build a foundation upon which research in the area can proceed. A model of dynamic human decision making which is inclusive of situation awareness will be presented, along with a definition of situation awareness. A method for measuring situation awareness will also be presented as a tool for evaluating design concepts. The Situation Awareness Global Assessment Technique (SAGAT) is an objective measure of situation awareness originally developed for the fighter cockpit environment. The results of SAGAT validation efforts will be presented. Implications of this research for teleoperators and other operators of dynamic systems will be discussed.

  1. A Virtual Emergency Telemedicine Serious Game in Medical Training: A Quantitative, Professional Feedback-Informed Evaluation Study

    PubMed Central

    Constantinou, Riana; Marangos, Charis; Kyriacou, Efthyvoulos; Bamidis, Panagiotis; Dafli, Eleni; Pattichis, Constantinos S

    2015-01-01

    Background Serious games involving virtual patients in medical education can provide a controlled setting within which players can learn in an engaging way, while avoiding the risks associated with real patients. Moreover, serious games align with medical students’ preferred learning styles. The Virtual Emergency TeleMedicine (VETM) game is a simulation-based game that was developed in collaboration with the mEducator Best Practice network in response to calls to integrate serious games in medical education and training. The VETM game makes use of data from an electrocardiogram to train practicing doctors, nurses, or medical students for problem-solving in real-life clinical scenarios through a telemedicine system and virtual patients. The study responds to two gaps: the limited number of games in emergency cardiology and the lack of evaluations by professionals. Objective The objective of this study is a quantitative, professional feedback-informed evaluation of one scenario of VETM, involving cardiovascular complications. The study has the following research question: “What are professionals’ perceptions of the potential of the Virtual Emergency Telemedicine game for training people involved in the assessment and management of emergency cases?” Methods The evaluation of the VETM game was conducted with 90 professional ambulance crew nursing personnel specializing in the assessment and management of emergency cases. After collaboratively trying out one VETM scenario, participants individually completed an evaluation of the game (36 questions on a 5-point Likert scale) and provided written and verbal comments. The instrument assessed six dimensions of the game: (1) user interface, (2) difficulty level, (3) feedback, (4) educational value, (5) user engagement, and (6) terminology. Data sources of the study were 90 questionnaires, including written comments from 51 participants, 24 interviews with 55 participants, and 379 log files of their interaction with

  2. Using reusable learning objects (RLOs) in injection skills teaching: Evaluations from multiple user types.

    PubMed

    Williams, Julia; O'Connor, Mórna; Windle, Richard; Wharrad, Heather J

    2015-12-01

    Clinical skills are a critical component of pre-registration nurse education in the United Kingdom, yet there is widespread concern about the clinical skills displayed by newly-qualified nurses. Novel means of supporting clinical skills education are required to address this. A package of Reusable Learning Objects (RLOs) was developed to supplement pre-registration teaching on the clinical skill of administering injection medication. RLOs are electronic resources addressing a single learning objective whose interactivity facilitates learning. This article evaluates a package of five injection RLOs across three studies: (1) questionnaires administered to pre-registration nursing students at University of Nottingham (UoN) (n=46) evaluating the RLO package as a whole; (2) individual RLOs evaluated in online questionnaires by educators and students from UoN; from other national and international institutions; and healthcare professionals (n=265); (3) qualitative evaluation of the RLO package by UoN injection skills tutors (n=6). Data from all studies were assessed for (1) access to, (2) usefulness, (3) impact and (4) integration of the RLOs. Study one found that pre-registration nursing students rate the RLO package highly across all categories, particularly underscoring the value of their self-test elements. Study two found high ratings in online assessments of individual RLOs by multiple users. The global reach is particularly encouraging here. Tutors reported insufficient levels of student-RLO access, which might be explained by the timing of their student exposure. Tutors integrate RLOs into teaching and agree on their use as teaching supplements, not substitutes for face-to-face education. This evaluation encompasses the first years postpackage release. Encouraging data on evaluative categories in this early review suggest that future evaluations are warranted to track progress as the package is adopted and evaluated more widely. Copyright © 2015 Elsevier Ltd

  3. Quantitative evaluation of hidden defects in cast iron components using ultrasound activated lock-in vibrothermography.

    PubMed

    Montanini, R; Freni, F; Rossi, G L

    2012-09-01

    This paper reports one of the first experimental results on the application of ultrasound activated lock-in vibrothermography for quantitative assessment of buried flaws in complex cast parts. The use of amplitude-modulated ultrasonic heat generation allowed selective response of defective areas within the part, as the defect itself is turned into a local thermal wave emitter. Quantitative evaluation of hidden damage was accomplished by estimating independently both the area and the depth extension of the buried flaws, while x-ray 3D computed tomography was used as reference for sizing accuracy assessment. To retrieve the flaw area, a simple yet effective histogram-based phase image segmentation algorithm with automatic pixel classification has been developed. A clear correlation was found between the thermal (phase) signature measured by the infrared camera on the target surface and the actual mean cross-section area of the flaw. Due to the very fast cycle time (<30 s/part), the method could potentially be applied for 100% quality control of casting components.
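
    As an illustration of histogram-based flaw-area estimation from a phase image, the sketch below thresholds a synthetic phase map and converts the pixel count to an area. Otsu's method is used here as a generic stand-in for the authors' pixel-classification algorithm; the phase data and pixel calibration are assumptions.

```python
import numpy as np
from skimage.filters import threshold_otsu

# Minimal sketch: estimate flaw area from a lock-in phase image by
# histogram-based thresholding. Otsu's method is a stand-in for the
# authors' algorithm; phase data and pixel size are synthetic.
rng = np.random.default_rng(2)
phase = rng.normal(0.0, 0.05, size=(200, 200))
phase[80:120, 90:140] += 0.4          # simulated defect signature

threshold = threshold_otsu(phase)
defect_mask = phase > threshold

pixel_area_mm2 = 0.1 * 0.1            # assumed spatial calibration
print(f"estimated flaw area: {defect_mask.sum() * pixel_area_mm2:.1f} mm^2")
```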

  4. Adult Roles & Functions. Objective Based Evaluation System.

    ERIC Educational Resources Information Center

    West Virginia State Vocational Curriculum Lab., Cedar Lakes.

    This book of objective-based test items is designed to be used with the Adult Roles and Functions curriculum for a non-laboratory home economic course for grades eleven and twelve. It contains item banks for each cognitive objective in the curriculum. In addition, there is a form for the table of specifications to be developed for each unit. This…

  5. Performance evaluation method of electric energy data acquire system based on combination of subjective and objective weights

    NASA Astrophysics Data System (ADS)

    Gao, Chen; Ding, Zhongan; Deng, Bofa; Yan, Shengteng

    2017-10-01

    Based on the characteristics of the electric energy data acquire system (EEDAS), and considering the availability of each index and the connections among indices, a performance evaluation index system is established from three aspects: the master station system, the communication channel, and the terminal equipment. The comprehensive weight of each index is determined by combining a triangular fuzzy number analytic hierarchy process with the entropy weight method, so that both subjective preference and objective attributes are taken into consideration, making the comprehensive performance evaluation more reasonable and reliable. An example analysis shows that, by combining the analytic hierarchy process (AHP) and triangular fuzzy numbers (TFN) with an entropy-based comprehensive index evaluation system, the evaluation results are not only convenient and practical but also more objective and accurate.
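
    A minimal sketch of the idea of combining subjective and objective weights follows: entropy weights are derived from a decision matrix and then blended with subjective weights. The decision matrix, the subjective weights, and the 50/50 additive combination are all assumptions for illustration, not the paper's TFN-AHP procedure.

```python
import numpy as np

# Minimal sketch: entropy-based objective weights combined with subjective
# weights for three benefit-type indices (master station, channel, terminal).
# The decision matrix and the subjective weights are illustrative only.
X = np.array([[0.92, 0.85, 0.78],     # rows: systems under evaluation
              [0.88, 0.90, 0.82],
              [0.95, 0.80, 0.88]])

P = X / X.sum(axis=0)                              # normalize each index column
entropy = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
objective_w = (1 - entropy) / (1 - entropy).sum()  # entropy weights

subjective_w = np.array([0.5, 0.3, 0.2])           # e.g., from expert judgments
combined_w = 0.5 * subjective_w + 0.5 * objective_w
combined_w /= combined_w.sum()

scores = X @ combined_w
print("combined weights:", np.round(combined_w, 3))
print("evaluation scores:", np.round(scores, 3))
```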

  6. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data

    PubMed Central

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi

    2015-01-01

    Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366

  7. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data.

    PubMed

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H L; Onami, Shuichi

    2015-04-01

    Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  8. Evaluation of a web based informatics system with data mining tools for predicting outcomes with quantitative imaging features in stroke rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent

    2017-03-01

    Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging features extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model(GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging features extraction tools allow the user to collect imaging features and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.
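
    The regression step described above can be illustrated with a linear mixed-effects model relating an outcome to an imaging feature, with a random intercept per subject. The sketch below uses statsmodels as a simplified stand-in for the system's GLMM module; the variable names and synthetic data are assumptions, not the trial data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Minimal sketch: a linear mixed-effects model relating a rehabilitation
# outcome to an imaging feature, with a random intercept per subject.
# A simplified stand-in for the GLMM module; data are synthetic.
rng = np.random.default_rng(3)
n_subjects, n_visits = 20, 4
subject = np.repeat(np.arange(n_subjects), n_visits)
lesion_volume = rng.normal(15.0, 5.0, size=subject.size)        # mL, assumed
outcome = (60 - 1.2 * lesion_volume
           + rng.normal(0, 4, subject.size)
           + np.repeat(rng.normal(0, 3, n_subjects), n_visits))

data = pd.DataFrame({"subject": subject,
                     "lesion_volume": lesion_volume,
                     "motor_score": outcome})

model = smf.mixedlm("motor_score ~ lesion_volume", data, groups=data["subject"])
result = model.fit()
print(result.summary())
```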

  9. TU-AB-202-06: Quantitative Evaluation of Deformable Image Registration in MRI-Guided Adaptive Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mooney, K; Zhao, T; Green, O

    Purpose: To assess the performance of the deformable image registration algorithm used for MRI-guided adaptive radiation therapy using image feature analysis. Methods: MR images were collected from five patients treated on the MRIdian (ViewRay, Inc., Oakwood Village, OH), a three-head Cobalt-60 therapy machine with a 0.35 T MR system. The images were acquired immediately prior to treatment with a uniform 1.5 mm resolution. Treatment sites were as follows: head/neck, lung, breast, stomach, and bladder. Deformable image registration was performed using the ViewRay software between the first fraction MRI and the final fraction MRI, and the DICE similarity coefficient (DSC) for the skin contours was reported. The SIFT and Harris feature detection and matching algorithms identified point features in each image separately, then found matching features in the other image. The target registration error (TRE) was defined as the vector distance between matched features on the two image sets. Each deformation was evaluated based on comparison of average TRE and DSC. Results: Image feature analysis produced between 2000 and 9500 points for evaluation on the patient images. The average (± standard deviation) TRE for all patients was 3.3 mm (±3.1 mm), and the passing rate of TRE<3 mm was 60% on the images. The head/neck patient had the best average TRE (1.9 mm±2.3 mm) and the best passing rate (80%). The lung patient had the worst average TRE (4.8 mm±3.3 mm) and the worst passing rate (37.2%). DSC was not significantly correlated with either TRE (p=0.63) or passing rate (p=0.55). Conclusions: Feature matching provides a quantitative assessment of deformable image registration, with a large number of data points for analysis. The TRE of matched features can be used to evaluate the registration of many objects throughout the volume, whereas DSC mainly provides a measure of gross overlap. We have a research agreement with ViewRay Inc.
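
    Both metrics used above have simple definitions: the Dice similarity coefficient measures overlap of two binary masks, and the TRE is the Euclidean distance between matched feature positions. The sketch below computes both; the masks and point coordinates are synthetic, not the patient data.

```python
import numpy as np

# Minimal sketch: Dice similarity coefficient for two binary masks and
# target registration error (TRE) for matched feature points. All data
# below are synthetic.
def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

mask1 = np.zeros((64, 64), dtype=bool); mask1[10:50, 10:50] = True
mask2 = np.zeros((64, 64), dtype=bool); mask2[14:54, 12:52] = True

# Matched feature positions (mm) in the first-fraction and final-fraction images
pts_fx1 = np.array([[10.0, 22.0, 5.0], [40.5, 18.0, 12.0]])
pts_fxN = np.array([[11.2, 21.5, 5.4], [43.0, 19.1, 13.0]])
tre = np.linalg.norm(pts_fx1 - pts_fxN, axis=1)

print(f"DSC = {dice(mask1, mask2):.3f}")
print(f"mean TRE = {tre.mean():.2f} mm, pass rate (<3 mm) = {(tre < 3).mean():.0%}")
```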

  10. Evaluation of Content-Matched Range Monitoring Queries over Moving Objects in Mobile Computing Environments

    PubMed Central

    Jung, HaRim; Song, MoonBae; Youn, Hee Yong; Kim, Ung Mo

    2015-01-01

    A content-matched (CM) range monitoring query over moving objects continually retrieves the moving objects (i) whose non-spatial attribute values are matched to given non-spatial query values; and (ii) that are currently located within a given spatial query range. In this paper, we propose a new query indexing structure, called the group-aware query region tree (GQR-tree) for efficient evaluation of CM range monitoring queries. The primary role of the GQR-tree is to help the server leverage the computational capabilities of moving objects in order to improve the system performance in terms of the wireless communication cost and server workload. Through a series of comprehensive simulations, we verify the superiority of the GQR-tree method over the existing methods. PMID:26393613

  11. Evaluation of Content-Matched Range Monitoring Queries over Moving Objects in Mobile Computing Environments.

    PubMed

    Jung, HaRim; Song, MoonBae; Youn, Hee Yong; Kim, Ung Mo

    2015-09-18

    A content-matched (CM) range monitoring query over moving objects continually retrieves the moving objects (i) whose non-spatial attribute values are matched to given non-spatial query values; and (ii) that are currently located within a given spatial query range. In this paper, we propose a new query indexing structure, called the group-aware query region tree (GQR-tree) for efficient evaluation of CM range monitoring queries. The primary role of the GQR-tree is to help the server leverage the computational capabilities of moving objects in order to improve the system performance in terms of the wireless communication cost and server workload. Through a series of comprehensive simulations, we verify the superiority of the GQR-tree method over the existing methods.
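
    The semantics of a CM range query can be restated as a brute-force filter over all objects, as in the sketch below. A real system would use an index such as the GQR-tree to avoid scanning everything; the object attributes and coordinates here are illustrative.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

# Minimal sketch: brute-force evaluation of a content-matched (CM) range
# monitoring query. This baseline only restates the query semantics; it is
# not the GQR-tree index proposed in the paper. Data are illustrative.
@dataclass
class MovingObject:
    oid: int
    x: float
    y: float
    attrs: Dict[str, str]

def cm_range_query(objects: List[MovingObject],
                   query_attrs: Dict[str, str],
                   rect: Tuple[float, float, float, float]) -> List[int]:
    """Return ids of objects matching the attributes and inside rect."""
    x_min, y_min, x_max, y_max = rect
    return [o.oid for o in objects
            if all(o.attrs.get(k) == v for k, v in query_attrs.items())
            and x_min <= o.x <= x_max and y_min <= o.y <= y_max]

fleet = [MovingObject(1, 2.0, 3.0, {"type": "taxi"}),
         MovingObject(2, 8.0, 1.0, {"type": "bus"}),
         MovingObject(3, 4.5, 4.5, {"type": "taxi"})]
print(cm_range_query(fleet, {"type": "taxi"}, (0.0, 0.0, 5.0, 5.0)))  # -> [1, 3]
```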

  12. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect diagnosis that relies on size estimation and feature identification. A simple, quantitative distortion evaluation method is therefore needed by both the endoscope industry and medical device regulators, but no such method is available yet. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data with complex mathematical models, which makes them difficult to interpret. Some commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error introduced by deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion, and based on this method we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has a clear physical meaning across the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, it can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
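
    One plausible reading of local magnification is the ratio of imaged spacing to true spacing for adjacent points of a grid target, evaluated cell by cell across the field of view. The sketch below follows that reading with invented grid positions; it is not the paper's exact formulas.

```python
import numpy as np

# Minimal sketch: local magnification across the field of view, estimated
# from imaged grid-target points. Definitions and data are illustrative
# assumptions, not the published ML method.
grid_pitch_mm = 5.0                            # object-side spacing of the grid
# Imaged x-positions (pixels) of one row of grid dots, with barrel-like
# compression toward the edge of the field of view.
x_img = np.array([0.0, 52.0, 103.0, 152.0, 198.0, 241.0, 280.0])

local_mag = np.diff(x_img) / grid_pitch_mm     # pixels per mm, per grid cell
center_mag = local_mag[0]
relative_distortion = (local_mag - center_mag) / center_mag * 100.0

for i, d in enumerate(relative_distortion):
    print(f"grid cell {i}: ML = {local_mag[i]:.1f} px/mm, {d:+.1f}% vs. centre")
```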

  13. Quantitative Market Research Regarding Funding of District 8 Construction Projects

    DOT National Transportation Integrated Search

    1995-05-01

    The primary objective of this quantitative research is to provide information : for more effective decision making regarding the level of investment in various : transportation systems in District 8. : This objective was accomplished by establishing ...

  14. Psychosocial factors and sleep efficiency: discrepancies between subjective and objective evaluations of sleep.

    PubMed

    Jackowska, Marta; Dockray, Samantha; Hendrickx, Hilde; Steptoe, Andrew

    2011-01-01

    Self-reported sleep efficiency may not precisely reflect objective sleep patterns. We assessed whether psychosocial factors and affective responses are associated with discrepancies between subjective reports and objective measures of sleep efficiency. Participants were 199 working women aged 20 to 61 years. Standardized questionnaires were used to assess psychosocial characteristics and affect that included work stress, social support, happiness, and depressive symptoms. Objective measures of sleep were assessed on one week and one leisure night with an Actiheart monitor. Self-reported sleep efficiency was derived from the Jenkins Sleep Problems Scale. Discrepancies between self-reported and objective measures of sleep efficiency were computed by contrasting standardized measures of sleep problems with objectively measured sleep efficiency. Participants varied markedly in the discrepancies between self-reported and objective sleep measures. After adjustment for personal income, age, having children, marital status, body mass index, and negative affect, overcommitment (p = .002), low level of social support (p = .049), and poor self-rated heath (p = .02) were associated with overreporting of sleep difficulties and underestimation of sleep efficiency. Self-reported poor sleep efficiency was more prevalent among those more overcommitted at work (p = .009) and less happy (p = .02), as well as among those with lower level of social support (p = .03) and more depressive symptoms (p = .048), independently of covariates. Objective sleep efficiency was unrelated to psychosocial characteristics or affect. The extent to which self-reported evaluations of sleep efficiency reflect objective experience may be influenced by psychosocial characteristics and affect. Unless potential moderators of self-reported sleep efficiency are taken into account, associations between sleep and psychosocial factors relevant to health may be overestimated.

  15. A comparison of moving object detection methods for real-time moving object detection

    NASA Astrophysics Data System (ADS)

    Roshan, Aditya; Zhang, Yun

    2014-06-01

    Moving object detection has a wide variety of applications, from traffic monitoring, site monitoring, automatic theft identification, and face detection to military surveillance. Many methods have been developed for moving object detection, but it is very difficult to find one that works in all situations and with different types of video. The purpose of this paper is to evaluate existing moving object detection methods that can be implemented in software on a desktop or laptop for real-time object detection. Several moving object detection methods are noted in the literature, but few of them are suitable for real-time detection, and most of those that are remain limited by the number of objects and the scene complexity. This paper evaluates the four most commonly used moving object detection methods: the background subtraction technique, the Gaussian mixture model, and wavelet-based and optical-flow-based methods. The work evaluates these four methods using two different sets of cameras and two different scenes. The methods were implemented in MATLAB and the results are compared in terms of completeness of detected objects, noise, sensitivity to lighting changes, processing time, etc. The comparison shows that the optical-flow-based method took the least processing time and successfully detected the boundaries of moving objects, which implies that it can be used for real-time moving object detection.
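
    To make one of the compared families concrete, the sketch below runs Gaussian-mixture background subtraction (OpenCV's MOG2) on a synthetic frame sequence. It is not the paper's MATLAB implementation; real use would read frames from a video file or camera, and the moving square is an assumption.

```python
import cv2
import numpy as np

# Minimal sketch: Gaussian-mixture background subtraction applied to a
# synthetic grayscale sequence with one moving bright square.
subtractor = cv2.createBackgroundSubtractorMOG2(history=50, detectShadows=False)

for t in range(30):
    frame = np.full((240, 320), 30, dtype=np.uint8)       # static background
    x = 10 + 5 * t
    frame[100:140, x:x + 40] = 200                        # moving object
    mask = subtractor.apply(frame)                        # foreground mask

moving_pixels = int(np.count_nonzero(mask))
print(f"foreground pixels in last frame: {moving_pixels}")
```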

  16. Detecting Target Objects by Natural Language Instructions Using an RGB-D Camera

    PubMed Central

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Tang, Hongru; Xi, Ning

    2016-01-01

    Controlling robots by natural language (NL) is increasingly attracting attention for its versatility, convenience, and minimal training requirements for users. Grounding, which enables robots to understand NL instructions from humans, is a crucial challenge of this problem. This paper mainly explores the object grounding problem and concretely studies how to detect target objects specified by NL instructions using an RGB-D camera in robotic manipulation applications. In particular, a simple yet robust vision algorithm is applied to segment objects of interest. With the metric information of all segmented objects, the object attributes and relations between objects are further extracted. The NL instructions that incorporate multiple cues for object specifications are parsed into domain-specific annotations. The annotations from NL and the extracted information from the RGB-D camera are matched in a computational state estimation framework to search all possible object grounding states. The final grounding is accomplished by selecting the states with the maximum probabilities. An RGB-D scene dataset associated with different groups of NL instructions, based on different cognition levels of the robot, is collected. Quantitative evaluations on the dataset illustrate the advantages of the proposed method. Experiments on NL-controlled object manipulation and NL-based task programming using a mobile manipulator show its effectiveness and practicability in robotic applications. PMID:27983604
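    The matching idea can be illustrated with a toy Python sketch; the attribute set and the crude per-cue scoring below are assumptions for illustration only, not the authors' computational state estimation framework.

    ```python
    from dataclasses import dataclass

    @dataclass
    class SceneObject:
        name: str
        color: str
        size: str            # e.g. "small", "large", extracted from the RGB-D segmentation

    def grounding_probability(obj: SceneObject, annotation: dict) -> float:
        """Crude matching score: product of per-cue agreement terms (illustrative only)."""
        p = 1.0
        for cue in ("color", "size"):
            if cue in annotation:
                p *= 0.9 if getattr(obj, cue) == annotation[cue] else 0.1
        return p

    scene = [SceneObject("mug", "red", "small"),
             SceneObject("box", "blue", "large")]
    annotation = {"color": "red", "size": "small"}   # parsed from "pick up the small red mug"

    target = max(scene, key=lambda o: grounding_probability(o, annotation))
    print(target.name)   # -> "mug"
    ```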

  17. Quantitative assessment of upper extremities motor function in multiple sclerosis.

    PubMed

    Daunoraviciene, Kristina; Ziziene, Jurgita; Griskevicius, Julius; Pauk, Jolanta; Ovcinikova, Agne; Kizlaitiene, Rasa; Kaubrys, Gintaras

    2018-05-18

    Upper extremity (UE) motor function deficits are commonly noted in multiple sclerosis (MS) patients, and assessing them is challenging because of the lack of consensus regarding their definition. Instrumented biomechanical analysis of upper extremity movements can quantify coordination with different spatiotemporal measures and facilitate disability rating in MS patients. The aim was to identify objective quantitative parameters for more accurate evaluation of UE disability and to relate them to existing clinical scores. Thirty-four MS patients and 24 healthy controls (CG) performed a finger-to-nose test as fast as possible; in addition to clinical evaluation, kinematic parameters of the UE were measured using inertial sensors. Generally, a higher disability score was associated with an increase in several temporal parameters, such as slower task performance. The time taken to touch the nose was longer when the task was performed with eyes closed. Time to peak angular velocity changed significantly in MS patients (EDSS > 5.0). Inter-joint coordination decreased significantly in MS patients (EDSS 3.0-5.5). Spatial parameters indicated that the largest ROM changes were in elbow flexion. Our findings reveal that spatiotemporal parameters are related to UE motor function and MS disability level. Moreover, they facilitate clinical rating by supporting clinical decisions with quantitative data.

  18. OBJECTIVE EVALUATION OF HYPERACTIVATED MOTILITY IN RAT SPERMATOZOA USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    EPA Science Inventory

    Objective evaluation of hyperactivated motility in rat spermatozoa using computer-assisted sperm analysis.

    Cancel AM, Lobdell D, Mendola P, Perreault SD.

    Toxicology Program, University of North Carolina, Chapel Hill, NC 27599, USA.

    The aim of this study was t...

  19. The book availability study as an objective measure of performance in a health sciences library.

    PubMed Central

    Kolner, S J; Welch, E C

    1985-01-01

    In its search for an objective overall diagnostic evaluation, the University of Illinois Library of the Health Sciences' Program Evaluation Committee selected a book availability measure; it is easy to administer and repeat, results are reproducible, and comparable data exist for other academic and health sciences libraries. The study followed the standard methodology in the literature with minor modifications. Patrons searching for particular books were asked to record item(s) needed and the outcome of the search. Library staff members then determined the reasons for failures in obtaining desired items. The results of the study are five performance scores. The first four represent the percentage probability of a library's operating with ideal effectiveness; the last provides an overall performance score. The scores of the Library of the Health Sciences demonstrated no unusual availability problems. The study was easy to implement and provided meaningful, quantitative, and objective data. PMID:3995202
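    The arithmetic behind such availability scores follows the standard branching methodology in the literature; the following sketch (with made-up counts, not the Library of the Health Sciences data) shows how the branch probabilities combine into an overall performance score.

    ```python
    # Minimal sketch of the branching arithmetic used in standard book-availability
    # studies; the branch names and numbers are illustrative assumptions.
    searches = 400                 # patron searches recorded
    failed_not_owned = 40          # library never acquired the title
    failed_checked_out = 60        # title owned but in circulation
    failed_library_error = 20      # mis-shelved, at bindery, etc.
    failed_user_error = 30         # patron could not locate a shelved item

    p_acq = (searches - failed_not_owned) / searches
    p_circ = 1 - failed_checked_out / (searches - failed_not_owned)
    p_lib = 1 - failed_library_error / (searches - failed_not_owned - failed_checked_out)
    p_user = 1 - failed_user_error / (searches - failed_not_owned
                                      - failed_checked_out - failed_library_error)

    overall = p_acq * p_circ * p_lib * p_user   # overall performance score
    print(f"{overall:.0%} of searches end with the book in hand")
    ```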

  20. Quantitative Evaluation of PET Respiratory Motion Correction Using MR Derived Simulated Data

    NASA Astrophysics Data System (ADS)

    Polycarpou, Irene; Tsoumpas, Charalampos; King, Andrew P.; Marsden, Paul K.

    2015-12-01

    The impact of respiratory motion correction on quantitative accuracy in PET imaging is evaluated using simulations for variable patient-specific characteristics such as tumor uptake and respiratory pattern. Respiratory patterns from real patients were acquired, with long quiescent motion periods (type-1) as commonly observed in most patients and with long-term amplitude variability as is expected under conditions of difficult breathing (type-2). The respiratory patterns were combined with an MR-derived motion model to simulate real-time 4-D PET-MR datasets. Lung and liver tumors were simulated with diameters of 10 and 12 mm and tumor-to-background ratio ranging from 3:1 to 6:1. Projection data for 6- and 3-mm PET resolution were generated for the Philips Gemini scanner and reconstructed without and with motion correction using OSEM (2 iterations, 23 subsets). Motion correction was incorporated into the reconstruction process based on MR-derived motion fields. Tumor peak standardized uptake values (SUVpeak) were calculated from 30 noise realizations. Respiratory motion correction improves the quantitative performance, with the greatest benefit observed for patients of breathing type-2. For breathing type-1 after applying motion correction, the SUVpeak of a 12-mm liver tumor with 6:1 contrast was increased by 46% for a current PET resolution (i.e., 6 mm) and by 47% for a higher PET resolution (i.e., 3 mm). Furthermore, the results of this study indicate that the benefit of higher scanner resolution is small unless motion correction is applied. In particular, for a large liver tumor (12 mm) with low contrast (3:1) after motion correction, the SUVpeak was increased by 34% for 6-mm resolution and by 50% for a higher PET resolution (i.e., 3-mm resolution). This investigation indicates that there is a high impact of respiratory motion correction on tumor quantitative accuracy and that motion correction is important in order to benefit from the increased resolution of future PET
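    As an illustration of the figure of merit used above, a minimal SUVpeak-style calculation is sketched below; the spherical-neighbourhood definition, voxel size, and sphere diameter are assumptions for illustration rather than the authors' exact implementation.

    ```python
    import numpy as np

    def suv_peak(volume: np.ndarray, voxel_mm: float = 2.0, sphere_mm: float = 12.0) -> float:
        """Mean uptake in a small sphere centred on the hottest voxel of a tumour VOI."""
        centre = np.unravel_index(np.argmax(volume), volume.shape)
        radius_vox = (sphere_mm / 2.0) / voxel_mm
        zz, yy, xx = np.indices(volume.shape)
        mask = ((zz - centre[0]) ** 2 + (yy - centre[1]) ** 2
                + (xx - centre[2]) ** 2) <= radius_vox ** 2
        return float(volume[mask].mean())

    rng = np.random.default_rng(0)
    tumour_voi = rng.gamma(2.0, 1.0, size=(40, 40, 40))   # placeholder uptake volume
    print(suv_peak(tumour_voi))
    # a percent change of the kind reported above would be
    # 100 * (suv_peak(corrected) - suv_peak(uncorrected)) / suv_peak(uncorrected)
    ```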

  1. Quantitative fluorescence microscopy and image deconvolution.

    PubMed

    Swedlow, Jason R

    2013-01-01

    Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches--deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image. A very common image-processing algorithm, image deconvolution, is used
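    As a concrete example of a restoration-type algorithm, the following sketch applies scikit-image's Richardson-Lucy deconvolution to a synthetic blurred, photon-noise-limited image; the Gaussian PSF and iteration count are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from scipy.signal import fftconvolve
    from skimage.restoration import richardson_lucy

    rng = np.random.default_rng(0)
    truth = np.zeros((128, 128))
    truth[40:44, 60:64] = 1.0                     # a small, dim object

    psf = np.zeros((15, 15))
    psf[7, 7] = 1.0
    psf = gaussian_filter(psf, sigma=2.0)         # assumed point-spread function
    psf /= psf.sum()

    blurred = fftconvolve(truth, psf, mode="same")
    noisy = rng.poisson(blurred * 200) / 200.0    # photon (Poisson) noise

    restored = richardson_lucy(noisy, psf, 30)    # 30 iterations of restoration
    ```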

  2. Promoting ethical and objective practice in the medicolegal arena of disability evaluation.

    PubMed

    Martelli, M F; Zasler, N D; Johnson-Greene, D

    2001-08-01

    As providers of medical information and testimony, clinicians have ultimate responsibility for ethical conduct as it relates to this information. The authors offer the following recommendations for enhancing ethical relationships between expert clinicians and the courts. 1. Avoid or resist attorney efforts at enticement into joining the attorney-client team. Such compromises of scientific boundaries and ethical principles exist on a continuum ranging from standard attorney-client advocacy at the beginning of the expert consultation phase (e.g., promotional information at the forefront of retaining an expert, with either provision of selective or incomplete records or less than enthusiastic efforts to produce all records) and extending to completion of evaluation, when requests for changes in reports and documentation might be made. 2. Respect role boundaries and do not mix conflicting roles. Remember that the treating doctor possesses a bond with the patient but does not as a rule obtain complete preinjury and postinjury information in the context of assessing causality and apportionment. In contrast, the expert witness must conduct a thorough and multifaceted case analysis sans the physician-patient relationship in order to facilitate objectivity and allow optimum diagnostic formulations. Finally, the trial consultant's function in this adversarial process is to assist with critically scrutinizing and attacking positions of experts for the opposing side. These roles all represent inherently different interests, and mixing them can only reduce objectivity. 3. Insist on adequate time for thorough record review, evaluation, and report generation. Also insist on sufficient time and preparation for deposition and court appearances. 4. Work at building a reputation for general objectivity, reliance on multiple data sources, reaching opinions only after reviewing complete information from both sides, and completing the evaluation. 5. Spend a good amount of time actually

  3. Quantitative 4D Transcatheter Intraarterial Perfusion MR Imaging as a Method to Standardize Angiographic Chemoembolization Endpoints

    PubMed Central

    Jin, Brian; Wang, Dingxin; Lewandowski, Robert J.; Ryu, Robert K.; Sato, Kent T.; Larson, Andrew C.; Salem, Riad; Omary, Reed A.

    2011-01-01

    PURPOSE We aimed to test the hypothesis that subjective angiographic endpoints during transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC) exhibit consistency and correlate with objective intraprocedural reductions in tumor perfusion as determined by quantitative four dimensional (4D) transcatheter intraarterial perfusion (TRIP) magnetic resonance (MR) imaging. MATERIALS AND METHODS This prospective study was approved by the institutional review board. Eighteen consecutive patients underwent TACE in a combined MR/interventional radiology (MR-IR) suite. Three board-certified interventional radiologists independently graded the angiographic endpoint of each procedure based on a previously described subjective angiographic chemoembolization endpoint (SACE) scale. A consensus SACE rating was established for each patient. Patients underwent quantitative 4D TRIP-MR imaging immediately before and after TACE, from which mean whole tumor perfusion (Fρ) was calculated. Consistency of SACE ratings between observers was evaluated using the intraclass correlation coefficient (ICC). The relationship between SACE ratings and intraprocedural TRIP-MR imaging perfusion changes was evaluated using Spearman’s rank correlation coefficient. RESULTS The SACE rating scale demonstrated very good consistency among all observers (ICC = 0.80). The consensus SACE rating was significantly correlated with both absolute (r = 0.54, P = 0.022) and percent (r = 0.85, P < 0.001) intraprocedural perfusion reduction. CONCLUSION The SACE rating scale demonstrates very good consistency between raters, and significantly correlates with objectively measured intraprocedural perfusion reductions during TACE. These results support the use of the SACE scale as a standardized alternative method to quantitative 4D TRIP-MR imaging to classify patients based on embolic endpoints of TACE. PMID:22021520

  4. Methods for the field evaluation of quantitative G6PD diagnostics: a review.

    PubMed

    Ley, Benedikt; Bancone, Germana; von Seidlein, Lorenz; Thriemer, Kamala; Richards, Jack S; Domingo, Gonzalo J; Price, Ric N

    2017-09-11

    Individuals with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk of severe haemolysis following the administration of 8-aminoquinoline compounds. Primaquine is the only widely available 8-aminoquinoline for the radical cure of Plasmodium vivax. Tafenoquine is under development with the potential to simplify treatment regimens, but point-of-care (PoC) tests will be needed to provide quantitative measurement of G6PD activity prior to its administration. There is currently a lack of appropriate G6PD PoC tests, but a number of new tests are in development and are likely to enter the market in the coming years. As these are implemented, they will need to be validated in field studies. This article outlines the technical details for the field evaluation of novel quantitative G6PD diagnostics, such as sample handling, reference testing and statistical analysis. Field evaluation is based on the comparison of paired samples, including one sample tested by the new assay at point of care and one sample tested by the gold-standard reference method, UV spectrophotometry, in an established laboratory. Samples can be collected as capillary or venous blood; the existing literature suggests that potential differences between capillary and venous blood are unlikely to affect results substantially. The collection and storage of samples is critical to ensure preservation of enzyme activity; it is recommended that samples be stored at 4 °C and tested within 4 days of collection. Test results can be visually presented as a scatter plot, a Bland-Altman plot, and a histogram of the G6PD activity distribution of the study population. Calculating the adjusted male median allows results to be categorized according to G6PD activity, from which standard performance indicators can be calculated and receiver operating characteristic (ROC) analysis performed.
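    Two of the analyses named above can be sketched briefly in Python; the adjusted-male-median exclusion rule (10% of the crude median), the simulated activity values, and the Bland-Altman limits of agreement shown here are illustrative assumptions rather than prescriptions from the review.

    ```python
    import numpy as np

    def adjusted_male_median(male_activity_u_per_g_hb: np.ndarray) -> float:
        """Median of male samples after excluding values below 10% of the crude median."""
        crude = np.median(male_activity_u_per_g_hb)
        kept = male_activity_u_per_g_hb[male_activity_u_per_g_hb >= 0.1 * crude]
        return float(np.median(kept))

    rng = np.random.default_rng(1)
    reference = rng.normal(10.0, 2.5, 200).clip(0.5)   # UV spectrophotometry (placeholder)
    poc = reference + rng.normal(0.0, 1.0, 200)        # point-of-care test (placeholder)

    amm = adjusted_male_median(reference)
    print("30% activity cut-off:", 0.3 * amm)

    # Bland-Altman quantities: per-sample mean vs. difference, with limits of agreement.
    mean_ab = (reference + poc) / 2
    diff_ab = poc - reference
    limits = diff_ab.mean() + np.array([-1.96, 1.96]) * diff_ab.std(ddof=1)
    ```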

  5. Evaluation Techniques for the Sandy Point Discovery Center, Great Bay National Estuarine Research Reserve.

    ERIC Educational Resources Information Center

    Heffernan, Bernadette M.

    1998-01-01

    Describes work done to provide staff of the Sandy Point Discovery Center with methods for evaluating exhibits and interpretive programming. Quantitative and qualitative evaluation measures were designed to assess the program's objective of estuary education. Pretest-posttest questionnaires and interviews are used to measure subjects' knowledge and…

  6. An Analysis on Usage Preferences of Learning Objects and Learning Object Repositories among Pre-Service Teachers

    ERIC Educational Resources Information Center

    Yeni, Sabiha; Ozdener, Nesrin

    2014-01-01

    The purpose of the study is to investigate how pre-service teachers benefit from learning objects repositories while preparing course content. Qualitative and quantitative data collection methods were used in a mixed methods approach. This study was carried out with 74 teachers from the Faculty of Education. In the first phase of the study,…

  7. A Preliminary Evaluation of Written Individualized Habilitation Objectives and Their Correspondence with Direct Implementation.

    ERIC Educational Resources Information Center

    DePaepe, Paris; And Others

    1994-01-01

    The individualized habilitation plans (IHP) for 11 adults with moderate to profound mental retardation living in community group homes were evaluated for correspondences with a subset of implemented objectives. A high degree of correspondence was found for two quality indicators (age appropriateness, functionality) but lower levels of…

  8. Object detection in MOUT: evaluation of a hybrid approach for confirmation and rejection of object detection hypotheses

    NASA Astrophysics Data System (ADS)

    Manger, Daniel; Metzler, Jürgen

    2014-03-01

    Military Operations in Urban Terrain (MOUT) require the capability to perceive and to analyze the situation around a patrol in order to recognize potential threats. A permanent monitoring of the surrounding area is essential in order to appropriately react to the given situation, where one relevant task is the detection of objects that can pose a threat. Especially the robust detection of persons is important, as in MOUT scenarios threats usually arise from persons. This task can be supported by image processing systems. However, depending on the scenario, person detection in MOUT can be challenging, e.g. persons are often occluded in complex outdoor scenes and the person detection also suffers from low image resolution. Furthermore, there are several requirements on person detection systems for MOUT such as the detection of non-moving persons, as they can be a part of an ambush. Existing detectors therefore have to operate on single images with low thresholds for detection in order to not miss any person. This, in turn, leads to a comparatively high number of false positive detections which renders an automatic vision-based threat detection system ineffective. In this paper, a hybrid detection approach is presented. A combination of a discriminative and a generative model is examined. The objective is to increase the accuracy of existing detectors by integrating a separate hypotheses confirmation and rejection step which is built by a discriminative and generative model. This enables the overall detection system to make use of both the discriminative power and the capability to detect partly hidden objects with the models. The approach is evaluated on benchmark data sets generated from real-world image sequences captured during MOUT exercises. The extension shows a significant improvement of the false positive detection rate.
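    The two-stage idea (a permissive single-image detector followed by a separate confirmation/rejection step) can be sketched as follows; the OpenCV HOG person detector and the simple score-based second stage are stand-ins for illustration, not the discriminative/generative models used in the paper.

    ```python
    import cv2

    img = cv2.imread("mout_frame.png")            # hypothetical frame from a patrol camera
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    # Stage 1: permissive detector -- low threshold so non-moving, partly occluded
    # persons are not missed (at the cost of many false positives).
    rects, scores = hog.detectMultiScale(img, winStride=(8, 8), hitThreshold=-0.5)

    # Stage 2 (stand-in for the paper's hypothesis confirmation/rejection step):
    # keep only hypotheses whose score clears a second, stricter threshold.
    confirmed = [r for r, s in zip(rects, scores) if s.item() > 0.3]
    ```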

  9. Quantitative ptychographic reconstruction by applying a probe constraint

    NASA Astrophysics Data System (ADS)

    Reinhardt, J.; Schroer, C. G.

    2018-04-01

    The coherent scanning technique X-ray ptychography has become a routine tool for high-resolution imaging and nanoanalysis in various fields of research such as chemistry, biology and materials science. Often the ptychographic reconstruction results are analysed in order to yield absolute quantitative values for the object transmission and illuminating probe function. In this work, we address a common ambiguity encountered in scaling the object transmission and probe intensity via the application of an additional constraint to the reconstruction algorithm. A ptychographic measurement of a model sample containing nanoparticles is used as a test data set against which to benchmark the reconstruction results, depending on the type of constraint used. Achieving quantitative absolute values for the reconstructed object transmission is essential for advanced investigation of samples that are changing over time, e.g., during in-situ experiments, or in general when different data sets are compared.
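    The scaling ambiguity arises because the measured diffraction data constrain only the product of probe and object: multiplying the probe by a factor and dividing the object transmission by the same factor leaves the modelled intensities unchanged. A minimal sketch of a probe-power constraint that removes this freedom is shown below; the specific form of the constraint is an assumption, not necessarily the one applied by the authors.

    ```python
    import numpy as np

    def apply_probe_power_constraint(probe: np.ndarray, obj: np.ndarray, target_power: float):
        """Rescale the probe to a known total intensity and compensate the object,
        leaving the modelled exit waves (probe * object) unchanged."""
        scale = np.sqrt(target_power / np.sum(np.abs(probe) ** 2))
        return probe * scale, obj / scale

    rng = np.random.default_rng(0)
    probe = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))   # placeholder probe
    obj = np.ones((256, 256), dtype=complex)                             # placeholder object
    probe, obj = apply_probe_power_constraint(probe, obj, target_power=1.0e6)
    ```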

  10. Quantitative and Qualitative Evaluation of Iranian Researchers’ Scientific Production in Dentistry Subfields

    PubMed Central

    Yaminfirooz, Mousa; Motallebnejad, Mina; Gholinia, Hemmat; Esbakian, Somayeh

    2015-01-01

    Background: As in other fields of medicine, scientific production in the field of dentistry has significant placement. This study aimed at quantitatively and qualitatively evaluating Iranian researchers’ scientific output in the field of dentistry and determining their contribution in each of the dentistry subfields and branches. Methods: This research was a scientometric study that applied quantitative and qualitative indices of Web of Science (WoS). The research population consisted of 927 indexed documents published under the name of Iran in the time span of 1993-2012, extracted from WoS on 10 March 2013. The Mann-Whitney test and Pearson correlation coefficient were used for data analysis in SPSS 19. Results: 777 (83.73%) of the indexed items of all scientific output in WoS were scientific articles. The highest growth rate of scientific production, 90%, belonged to the endodontic subfield. The correlation coefficient test showed that there was a significant positive relationship between the number of documents and their publication age (P < 0.0001). There was a significant difference between the mean number of published articles in the first ten-year period (1993-2003) and that of the second one (2004-2013), in favor of the latter (P = 0.001). Conclusions: The distribution frequencies of scientific production in various subfields of dentistry were very different. The infrastructure needs to be reinforced for more balanced scientific production in the field and its related subfields. PMID:26635439

  11. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
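    The construction of a pseudo-predator signature can be sketched as follows; the prey types, diet proportions, and bootstrap sample sizes are placeholders, and the paper's algorithm for objectively choosing the bootstrap sample sizes is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_fa = 10                                        # number of fatty acids
    prey_library = {                                 # signatures sum to 1 per animal
        "arctic_cod": rng.dirichlet(np.ones(n_fa), size=60),
        "ringed_seal": rng.dirichlet(np.ones(n_fa), size=40),
    }
    diet = {"arctic_cod": 0.3, "ringed_seal": 0.7}   # "true" diet of the pseudo-predator
    boot_n = {"arctic_cod": 25, "ringed_seal": 25}   # bootstrap sample sizes (arbitrary here)

    pseudo = np.zeros(n_fa)
    for prey, proportion in diet.items():
        sigs = prey_library[prey]
        idx = rng.integers(0, len(sigs), size=boot_n[prey])   # sample with replacement
        pseudo += proportion * sigs[idx].mean(axis=0)

    pseudo /= pseudo.sum()        # renormalise to a proper signature
    ```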

  12. Using linked data to evaluate collisions with fixed objects in Pennsylvania : Crash Outcome Data Evaluation System (CODES) linked data demonstration project

    DOT National Transportation Integrated Search

    1998-10-01

    This report uses police-reported motor vehicle crash data linked to Emergency Medical Services data and hospital discharge data to evaluate the relative risk of injury posed by specific roadside objects in Pennsylvania. The report focuses primarily o...

  13. Structured decision making as a method for linking quantitative decision support to community fundamental objectives

    EPA Science Inventory

    Decision support intended to improve ecosystem sustainability requires that we link stakeholder priorities directly to quantitative tools and measures of desired outcomes. Actions taken at the community level can have large impacts on production and delivery of ecosystem service...

  14. Multi-sensor image fusion algorithm based on multi-objective particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Xie, Xia-zhu; Xu, Ya-wei

    2017-11-01

    On the basis of dual-tree complex wavelet transform (DT-CWT) theory, an approach based on the multi-objective particle swarm optimization algorithm (MOPSO) was proposed to objectively choose the fusion weights of the low-frequency sub-bands. High- and low-frequency sub-bands were produced by the DT-CWT. The absolute value of the coefficients was adopted as the fusion rule for the high-frequency sub-bands. The fusion weights of the low-frequency sub-bands were used as the particles in MOPSO, with spatial frequency and average gradient adopted as the two fitness functions. The experimental results show that the proposed approach performs better than average fusion and fusion methods based on local variance and local energy, respectively, in brightness, clarity and quantitative evaluation, which includes entropy, spatial frequency, average gradient and QAB/F.
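    The two fusion rules can be sketched in a few lines of Python; the arrays stand in for DT-CWT sub-bands computed elsewhere, and the fixed low-frequency weight replaces the value that MOPSO would optimize.

    ```python
    import numpy as np

    def fuse_highpass(h_a: np.ndarray, h_b: np.ndarray) -> np.ndarray:
        """Pick, per coefficient, whichever source has the larger absolute value."""
        return np.where(np.abs(h_a) >= np.abs(h_b), h_a, h_b)

    def fuse_lowpass(l_a: np.ndarray, l_b: np.ndarray, w: float) -> np.ndarray:
        """Weighted average of the low-frequency sub-bands; w is what MOPSO would tune."""
        return w * l_a + (1.0 - w) * l_b

    def spatial_frequency(img: np.ndarray) -> float:
        """One of the fitness functions named above (row/column difference energy)."""
        rf = np.diff(img, axis=1)
        cf = np.diff(img, axis=0)
        return float(np.sqrt(np.mean(rf ** 2) + np.mean(cf ** 2)))
    ```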

  15. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346

  16. An Evaluation of a Community Health Intervention Programme Aimed at Improving Health and Wellbeing

    ERIC Educational Resources Information Center

    Strachan, G.; Wright, G. D.; Hancock, E.

    2007-01-01

    Objective: The objective of this evaluation was to examine the extent to which participants in the Tailor Made Leisure Package programme experienced any improvement in their health and wellbeing. Design: A quantitative survey. Setting: The Healthy Living Centre initiative is an example of a community-based intervention which was formalized as part…

  17. Application of shift-and-add algorithms for imaging objects within biological media

    NASA Astrophysics Data System (ADS)

    Aizert, Avishai; Moshe, Tomer; Abookasis, David

    2017-01-01

    The Shift-and-Add (SAA) technique is a simple mathematical operation developed to reconstruct, at high spatial resolution, atmospherically degraded solar images obtained from stellar speckle interferometry systems. This method shifts and assembles individual degraded short-exposure images into a single average image with significantly improved contrast and detail. Since the inhomogeneous refractive indices of biological tissue causes light scattering similar to that induced by optical turbulence in the atmospheric layers, we assume that SAA methods can be successfully implemented to reconstruct the image of an object within a scattering biological medium. To test this hypothesis, five SAA algorithms were evaluated for reconstructing images acquired from multiple viewpoints. After successfully retrieving the hidden object's shape, quantitative image quality metrics were derived, enabling comparison of imaging error across a spectrum of layer thicknesses, demonstrating the relative efficacy of each SAA algorithm for biological imaging.
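    A basic Shift-and-Add step (the simplest of the SAA variants) can be sketched as follows; the brightest-pixel alignment rule is the classic formulation and is used here purely for illustration.

    ```python
    import numpy as np

    def shift_and_add(frames: np.ndarray) -> np.ndarray:
        """frames: (n, H, W) stack of degraded short-exposure images."""
        n, h, w = frames.shape
        centre = np.array([h // 2, w // 2])
        out = np.zeros((h, w), dtype=float)
        for frame in frames:
            peak = np.array(np.unravel_index(np.argmax(frame), frame.shape))
            shift = centre - peak
            out += np.roll(frame, tuple(shift), axis=(0, 1))   # align brightest speckle
        return out / n
    ```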

  18. Returning to Work after Cancer: Quantitative Studies and Prototypical Narratives

    PubMed Central

    Steiner, John F.; Nowels, Carolyn T.; Main, Deborah S.

    2009-01-01

    Objective A combination of quantitative data and illustrative narratives may allow cancer survivorship researchers to disseminate their research findings more broadly. We identified recent, methodologically rigorous quantitative studies on return to work after cancer, summarized the themes from these studies, and illustrated those themes with narratives of individual cancer survivors. Methods We reviewed English-language studies of return to work for adult cancer survivors through June, 2008, and identified 13 general themes from papers that met methodological criteria (population-based sampling, prospective and longitudinal assessment, detailed assessment of work, evaluation of economic impact, assessment of moderators of work return, and large sample size). We drew survivorship narratives from a prior qualitative research study to illustrate these themes. Results Nine quantitative studies met 4 or more of our 6 methodological criteria. These studies suggested that most cancer survivors could return to work without residual disabilities. Cancer site, clinical prognosis, treatment modalities, socioeconomic status, and attributes of the job itself influenced the likelihood of work return. Three narratives - a typical survivor who returned to work after treatment, an individual unable to return to work, and an inspiring survivor who returned to work despite substantial barriers - illustrated many of the themes from the quantitative literature while providing additional contextual details. Conclusion Illustrative narratives can complement the findings of cancer survivorship research if researchers are rigorous and transparent in the selection, analysis, and retelling of those stories. PMID:19507264

  19. The Effect of Air Density on Atmospheric Electric Fields Required for Lightning Initiation from a Long Airborne Object

    NASA Technical Reports Server (NTRS)

    Bazelyan, E. M.; Aleksandrov, N. L.; Raizer, Yu. Pl.; Konchankov, A. M.

    2006-01-01

    The purpose of the work was to determine minimum atmospheric electric fields required for lightning initiation from an airborne vehicle at various altitudes up to 10 km. The problem was reduced to the determination of a condition for initiation of a viable positive leader from a conductive object in an ambient electric field. It was shown that, depending on air density and shape and dimensions of the object, critical atmospheric fields are governed by the condition for leader viability or that for corona onset. To establish quantitative criteria for reduced air densities, available observations of spark discharges in long laboratory gaps were analyzed, the effect of air density on leader velocity was discussed and evolution in time of the properties of plasma in the leader channel was numerically simulated. The results obtained were used to evaluate the effect of pressure on the quantitative relationships between the potential difference near the leader tip, leader current and its velocity; based on these relationships, criteria for steady development of a leader were determined for various air pressures. Atmospheric electric fields required for lightning initiation from rods and ellipsoidal objects of various dimensions were calculated at different air densities. It was shown that there is no simple way to extend critical ambient fields obtained for some given objects and pressures to other objects and pressures.

  20. Framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms in conjunction with 3D landmark localization and registration

    NASA Astrophysics Data System (ADS)

    Wörz, Stefan; Hoegen, Philipp; Liao, Wei; Müller-Eschner, Matthias; Kauczor, Hans-Ulrich; von Tengg-Kobligk, Hendrik; Rohr, Karl

    2016-03-01

    We introduce a framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms. Phantoms are designed using a CAD system and created with a 3D printer, and comprise realistic shapes including branches and pathologies such as abdominal aortic aneurysms (AAA). To transfer ground truth information to the 3D image coordinate system, we use a landmark-based registration scheme utilizing fiducial markers integrated in the phantom design. For accurate 3D localization of the markers we developed a novel 3D parametric intensity model that is directly fitted to the markers in the images. We also performed a quantitative evaluation of different vessel segmentation approaches for a phantom of an AAA.
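    The landmark-based registration step amounts to a rigid least-squares fit between the designed fiducial-marker coordinates and their localized positions in the image; a sketch using the standard SVD (Kabsch) solution is given below, with placeholder marker coordinates.

    ```python
    import numpy as np

    def rigid_fit(src: np.ndarray, dst: np.ndarray):
        """Return R, t minimising ||R @ src_i + t - dst_i||^2 over all landmarks."""
        src_c = src - src.mean(axis=0)
        dst_c = dst - dst.mean(axis=0)
        u, _, vt = np.linalg.svd(src_c.T @ dst_c)
        d = np.sign(np.linalg.det(vt.T @ u.T))            # guard against reflections
        R = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
        t = dst.mean(axis=0) - R @ src.mean(axis=0)
        return R, t

    cad_markers = np.array([[0, 0, 0], [40, 0, 0], [0, 40, 0], [0, 0, 40]], float)
    img_markers = cad_markers @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]).T + [5, 2, -3]
    R, t = rigid_fit(cad_markers, img_markers)            # recovers the rotation and offset
    ```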

  1. Correlation of Objective Assessment Data With General Surgery Resident In-Training Evaluation Reports and Operative Volumes.

    PubMed

    Abdelsattar, Jad M; AlJamal, Yazan N; Ruparel, Raaj K; Rowse, Phillip G; Heller, Stephanie F; Farley, David R

    2018-05-14

    Faculty evaluations, ABSITE scores, and operative case volumes often tell little about true resident performance. We developed an objective structured clinical examination called the Surgical X-Games (5 rooms, 15 minutes each, 12-15 tests total, different for each postgraduate [PGY] level). We hypothesized that performance in X-Games would prove more useful in identifying areas of strength or weakness among general surgery (GS) residents than faculty evaluations, ABSITE scores, or operative case volumes. PGY 2 to 5 GS residents (n = 35) were tested in a semiannual X-Games assessment using multiple simulation tasks: laparoscopic skills, bowel anastomosis, CT/CXR analysis, chest tube placement, etc. over 1 academic year. Resident scores were compared to their ABSITE, in-training evaluation reports, and operating room case numbers. Academic medical center. PGY-2, 3, 4, and 5 GS residents at Mayo Clinic in Rochester, MN. Results varied greatly within each class except for staff evaluations: in-training evaluation report medians for PGY-2s were 5.3 (range: 5.0-6.0), PGY-3s 5.9 (5.5-6.3), PGY-4s 5.6 (5.0-6.0), and PGY-5s 6.1 (5.6-6.9). Although ABSITE scores and operating room case volumes fluctuated greatly with each PGY class, only X-Games scores (median: PGY-2 = 82, PGY-3 = 61, PGY-4 = 76, and PGY-5 = 60) correlated positively (p < 0.05) with operative case volume and negatively (p < 0.05) with staff evaluations. X-Games assessment generated wide differentiation of resident performance quickly, inexpensively, and objectively. Although "Minnesota-nice" surgical staff may feel all GS trainees are "above average," objective assessment tells us otherwise. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  2. Objective evaluation of surgical competency for minimally invasive surgery with a collection of simple tests

    PubMed Central

    Gonzalez-Neira, Eliana Maria; Jimenez-Mendoza, Claudia Patricia; Rugeles-Quintero, Saul

    2016-01-01

    Objective: This study aims at determining whether a collection of 16 motor tests on a physical simulator can objectively discriminate and evaluate practitioners' competency level, i.e., novice, resident, or expert. Methods: An experimental design with three study groups (novice, resident, and expert) was developed to test the evaluation power of each of the 16 simple tests. An ANOVA and a Student-Newman-Keuls (SNK) test were used to analyze the results of each test to determine which of them can discriminate participants' competency level. Results: Four of the 16 tests discriminated all three competency levels, and 15 discriminated at least two of the three groups (α = 0.05). Moreover, two other tests differentiated the beginner level from the intermediate level, and seven other tests differentiated the intermediate level from the expert level. Conclusion: The competency level of a practitioner of minimally invasive surgery can be evaluated by a specific collection of basic tests on a physical surgical simulator. Reducing the number of tests needed to discriminate the competency level of surgeons can be the aim of future research. PMID:27226664
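    The per-test group comparison can be sketched with SciPy's one-way ANOVA (the SNK post-hoc step is omitted here, and the completion times below are placeholders rather than study data).

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    # completion times (s) for one of the 16 simulator tests -- placeholder values
    novices   = np.array([182, 175, 169, 190, 201], dtype=float)
    residents = np.array([140, 152, 133, 147, 150], dtype=float)
    experts   = np.array([ 98, 105, 110,  95, 101], dtype=float)

    f_stat, p_value = f_oneway(novices, residents, experts)
    print(f"F = {f_stat:.1f}, p = {p_value:.4f}")   # p < 0.05 -> this test discriminates the groups
    ```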

  3. The Integration of Evaluation Paradigms Through Metaphor.

    ERIC Educational Resources Information Center

    Felker, Roberta M.

    The point of view is presented that evaluation projects can be enriched by not using either an exclusively quantitative model or an exclusively qualitative model but by combining both models in one project. The concept of metaphor is used to clarify the usefulness of the combination. Iconic or holistic metaphors describe an object or event as…

  4. Development and Evaluation of Event-Specific Quantitative PCR Method for Genetically Modified Soybean MON87701.

    PubMed

    Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument. The determined Cf for the real-time PCR instrument was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and the reproducibility expressed as the relative standard deviation (RSDr), respectively. The determined biases and the RSDr values were less than 30 and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method would thus be applicable for practical analyses for the detection and quantification of MON87701.

  5. C-reactive protein estimation: a quantitative analysis for three nonsteroidal anti-inflammatory drugs: a randomized control trial.

    PubMed

    Salgia, Gaurav; Kulkarni, Deepak G; Shetty, Lakshmi

    2015-01-01

    C-reactive protein (CRP) estimation was used as a quantitative measure of the anti-inflammatory action of nonsteroidal anti-inflammatory drugs (NSAIDs) after maxillofacial surgery. The aim of this study was to evaluate CRP as a quantitative, objective measure of the efficacy of three NSAIDs in postoperative inflammation and pain control. A parallel-group randomized design was used. A total of 60 patients were divided into three groups. CRP was evaluated at baseline and postoperatively (immediately and at 72 h) after surgical removal of an impacted lower third molar. Each group received its drug, assigned by random coding, postoperatively. Pain control and inflammation after surgical removal of the impacted lower third molar were assessed qualitatively and quantitatively using CRP levels. Blood samples were assessed immediately postoperatively and after 72 h. A visual analog scale (VAS) was used for assessment of pain and its correlation with CRP levels. The difference between immediate postoperative and baseline CRP levels was significant (P < 0.05). The association between duration of surgery and CRP levels was nonsignificant (P = 0.425). The pain score on the VAS was significantly higher with mefenamic acid (P = 0.003). Diclofenac had the best anti-inflammatory action. CRP levels increased significantly from the immediate postoperative measurement to 72 h. The CRP test proved to be a useful quantitative indicator for monitoring postsurgical inflammation and the therapeutic effects of various anti-inflammatory drugs, and for the comparative evaluation of NSAIDs.

  6. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  7. Quantitative evaluation of dual-flip-angle T1 mapping on DCE-MRI kinetic parameter estimation in head and neck

    PubMed Central

    Chow, Steven Kwok Keung; Yeung, David Ka Wai; Ahuja, Anil T; King, Ann D

    2012-01-01

    Purpose To quantitatively evaluate the kinetic parameter estimation for head and neck (HN) dynamic contrast-enhanced (DCE) MRI with dual-flip-angle (DFA) T1 mapping. Materials and methods Clinical DCE-MRI datasets of 23 patients with HN tumors were included in this study. T1 maps were generated based on the multiple-flip-angle (MFA) method and different DFA combinations. Tofts model parameter maps of kep, Ktrans and vp based on MFA and DFAs were calculated and compared. Fitted parameters by MFA and DFAs were quantitatively evaluated in primary tumor, salivary gland and muscle. Results T1 mapping deviations by DFAs produced remarkable kinetic parameter estimation deviations in head and neck tissues. In particular, the DFA of [2°, 7°] overestimated, while [7°, 12°] and [7°, 15°] underestimated, Ktrans and vp significantly (P<0.01). [2°, 15°] achieved the smallest but still statistically significant overestimation of Ktrans and vp in primary tumors, 32.1% and 16.2% respectively. kep fitting results by DFAs were relatively close to the MFA reference compared to Ktrans and vp. Conclusions T1 deviations induced by DFA could result in significant errors in kinetic parameter estimation, particularly of Ktrans and vp, through Tofts model fitting. The MFA method should be more reliable and robust for accurate quantitative pharmacokinetic analysis in the head and neck. PMID:23289084
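    For reference, the dual-flip-angle T1 estimate follows from the linearized spoiled gradient-echo signal equation, S/sin(a) = E1 * S/tan(a) + M0*(1 - E1) with E1 = exp(-TR/T1); the sketch below recovers T1 from two synthetic flip-angle measurements (TR, the flip angles, and the signal model values are placeholders).

    ```python
    import numpy as np

    def t1_from_two_flip_angles(s1, s2, a1_deg, a2_deg, tr_ms):
        """Slope of the linearised SPGR relation between the two flip angles equals E1."""
        a1, a2 = np.deg2rad(a1_deg), np.deg2rad(a2_deg)
        slope = (s1 / np.sin(a1) - s2 / np.sin(a2)) / (s1 / np.tan(a1) - s2 / np.tan(a2))
        return -tr_ms / np.log(slope)              # T1 in the units of TR

    # synthetic check: generate SPGR signals from a known T1 and recover it
    tr, t1_true, m0 = 5.0, 1000.0, 1.0
    e1 = np.exp(-tr / t1_true)
    spgr = lambda a: m0 * np.sin(np.deg2rad(a)) * (1 - e1) / (1 - e1 * np.cos(np.deg2rad(a)))
    print(t1_from_two_flip_angles(spgr(2), spgr(15), 2, 15, tr))   # ~1000 ms
    ```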

  8. Automatic trajectory measurement of large numbers of crowded objects

    NASA Astrophysics Data System (ADS)

    Li, Hui; Liu, Ye; Chen, Yan Qiu

    2013-06-01

    Complex motion patterns of natural systems, such as fish schools, bird flocks, and cell groups, have attracted great attention from scientists for years. Trajectory measurement of individuals is vital for quantitative and high-throughput study of their collective behaviors. However, such data are rare mainly due to the challenges of detection and tracking of large numbers of objects with similar visual features and frequent occlusions. We present an automatic and effective framework to measure trajectories of large numbers of crowded oval-shaped objects, such as fish and cells. We first use a novel dual ellipse locator to detect the coarse position of each individual and then propose a variance minimization active contour method to obtain the optimal segmentation results. For tracking, cost matrix of assignment between consecutive frames is trainable via a random forest classifier with many spatial, texture, and shape features. The optimal trajectories are found for the whole image sequence by solving two linear assignment problems. We evaluate the proposed method on many challenging data sets.
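    The frame-to-frame association step can be sketched with SciPy's linear assignment solver; Euclidean distance between centroids stands in here for the trained random-forest cost described above.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from scipy.spatial.distance import cdist

    prev_centroids = np.array([[10.0, 12.0], [55.0, 40.0], [80.0, 80.0]])   # tracks at frame t
    curr_centroids = np.array([[54.0, 42.0], [11.0, 13.0], [82.0, 79.0]])   # detections at frame t+1

    cost = cdist(prev_centroids, curr_centroids)          # stand-in for the learned cost matrix
    rows, cols = linear_sum_assignment(cost)              # optimal one-to-one matching
    for r, c in zip(rows, cols):
        print(f"track {r} -> detection {c} (cost {cost[r, c]:.1f})")
    ```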

  9. Comparison of semi-quantitative and quantitative dynamic contrast-enhanced MRI evaluations of vertebral marrow perfusion in a rat osteoporosis model.

    PubMed

    Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu

    2017-11-14

    This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and the ovariectomy (OVX) group, whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, Emax), quantitative DCE-MRI parameters (volume transfer constant, Ktrans; interstitial volume, Ve; and efflux rate constant, Kep), micro-CT parameter (BMD), and histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the Emax values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The Ktrans values decreased significantly compared with those of the control group from week 3 (p<0.05). However, the Ve values decreased significantly only at week 9 (p=0.032), and no difference in Kep was found between the two groups. The BMD values of the OVX group decreased significantly compared with those of the control group from week 3 (p<0.05). Transmission electron microscopy showed tighter gaps between vascular endothelial cells with swollen mitochondria

  10. Minimum Objectives: A Measurement System to Provide Evaluation of Special Education in Regular Classrooms.

    ERIC Educational Resources Information Center

    Christie, Lu S.; McKenzie, Hugh S.

    Discussed is the use of minimum behavioral objectives to provide evaluation of special education in regular classrooms. Literature which supports the mainstreaming of moderately handicapped children is reviewed briefly. Application of the behavioral model of education on the community level is considered in terms of the basic skills which comprise…

  11. Taxonomy based analysis of force exchanges during object grasping and manipulation

    PubMed Central

    Martin-Brevet, Sandra; Jarrassé, Nathanaël; Burdet, Etienne

    2017-01-01

    The flexibility of the human hand in object manipulation is essential for daily life activities, but remains relatively little explored with quantitative methods. On the one hand, recent taxonomies describe qualitatively the classes of hand postures for object grasping and manipulation. On the other hand, the quantitative analysis of hand function has been generally restricted to precision grip (with thumb and index opposition) during lifting tasks. The aim of the present study is to fill the gap between these two kinds of descriptions, by investigating quantitatively the forces exerted by the hand on an instrumented object in a set of representative manipulation tasks. The object was a parallelepiped able to measure the force exerted on each of its six faces as well as its acceleration. The grasping force was estimated from the lateral force and the unloading force from the bottom force. The protocol included eleven tasks with complementary constraints inspired by recent taxonomies: four tasks corresponding to lifting and holding the object with different grasp configurations, and seven to manipulating the object (rotation around each of its axes and translation). The grasping and unloading forces and object rotations were measured during the five phases of the actions: unloading, lifting, holding or manipulation, preparation to deposit, and deposit. The results confirm the tight regulation between grasping and unloading forces during lifting, and extend this to the deposit phase. In addition, they provide a precise description of the regulation of force exchanges during various manipulation tasks spanning representative actions of daily life. The timing of manipulation showed both sequential and overlapping organization of the different sub-actions, and micro-errors could be detected. This phenomenological study confirms the feasibility of using an instrumented object to investigate complex manipulative behavior in humans. This protocol will be used in the future to

  12. Clinical study of quantitative diagnosis of early cervical cancer based on the classification of acetowhitening kinetics

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.

    2010-03-01

    A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.

  13. Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.

    PubMed

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-02-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
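    The stiffness measure described above reduces to a linear fit; a minimal sketch with placeholder moment-displacement data is shown below.

    ```python
    import numpy as np

    # placeholder moment-displacement data from a four-point bend of one motion segment
    displacement_mm = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])
    moment_nmm = np.array([0.0, 4.1, 8.3, 12.0, 16.2, 19.9])

    stiffness, intercept = np.polyfit(displacement_mm, moment_nmm, deg=1)   # slope = stiffness
    print(f"bending stiffness ~ {stiffness:.1f} N*mm per mm of displacement")
    ```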

  14. Planning for people? An evaluation of objectives for managing visitors at wildlife refuges in the United States

    Treesearch

    Jeffrey J. Brooks; Robert Massengale

    2011-01-01

    This study evaluates the quality of planning objectives for visitor services as written in Comprehensive Conservation Plans for the National Wildlife Refuge System of the United States. Planners in the U.S. Fish and Wildlife Service are predominantly writing public use objectives that address wildlife recreation and education. Results indicate that planners are writing...

  15. Using Live-Crown Ratio to Control Wood Quality: An Example of Quantitative Silviculture

    Treesearch

    Thomas J. Dean

    1999-01-01

    Quantitative silviculture is the application of biological relationships in meeting specific, quantitative management objectives. It is a two-sided approach requiring the identification and application of biological relationships. An example of quantitative silviculture is presented that uses a relationship between average-live crown ratio and relative stand density...

  16. Quantitative evaluation of multi-parametric MR imaging marker changes post-laser interstitial ablation therapy (LITT) for epilepsy

    NASA Astrophysics Data System (ADS)

    Tiwari, Pallavi; Danish, Shabbar; Wong, Stephen; Madabhushi, Anant

    2013-03-01

    Laser-induced interstitial thermal therapy (LITT) has recently emerged as a new, less invasive alternative to craniotomy for treating epilepsy; which allows for focussed delivery of laser energy monitored in real time by MRI, for precise removal of the epileptogenic foci. Despite being minimally invasive, the effects of laser ablation on the epileptogenic foci (reflected by changes in MR imaging markers post-LITT) are currently unknown. In this work, we present a quantitative framework for evaluating LITT-related changes by quantifying per-voxel changes in MR imaging markers which may be more reflective of local treatment related changes (TRC) that occur post-LITT, as compared to the standard volumetric analysis which involves monitoring a more global volume change across pre-, and post-LITT MRI. Our framework focuses on three objectives: (a) development of temporal MRI signatures that characterize TRC corresponding to patients with seizure freedom by comparing differences in MR imaging markers and monitoring them over time, (b) identification of the optimal time point when early LITT induced effects (such as edema and mass effect) subside by monitoring TRC at subsequent time-points post-LITT, and (c) identification of contributions of individual MRI protocols towards characterizing LITT-TRC for epilepsy by identifying MR markers that change most dramatically over time and employ individual contributions to create a more optimal weighted MP-MRI temporal profile that can better characterize TRC compared to any individual imaging marker. A cohort of patients were monitored at different time points post-LITT via MP-MRI involving T1-w, T2-w, T2-GRE, T2-FLAIR, and apparent diffusion coefficient (ADC) protocols. Post affine registration of individual MRI protocols to a reference MRI protocol pre-LITT, differences in individual MR markers are computed on a per-voxel basis, at different time-points with respect to baseline (pre-LITT) MRI as well as across subsequent time

  17. Object Creation and Human Factors Evaluation for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1998-01-01

    The main objective of this project is to provide test objects for simulated environments utilized by the recently established Army/NASA Virtual Innovations Lab (ANVIL) at Marshall Space Flight Center, Huntsville, Al. The objective of the ANVIL lab is to provide virtual reality (VR) models and environments and to provide visualization and manipulation methods for the purpose of training and testing. Visualization equipment used in the ANVIL lab includes head-mounted and boom-mounted immersive virtual reality display devices. Objects in the environment are manipulated using data glove, hand controller, or mouse. These simulated objects are solid or surfaced three dimensional models. They may be viewed or manipulated from any location within the environment and may be viewed on-screen or via immersive VR. The objects are created using various CAD modeling packages and are converted into the virtual environment using dVise. This enables the object or environment to be viewed from any angle or distance for training or testing purposes.

  18. Quantitative Appearance Inspection for Film Coated Tablets.

    PubMed

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  19. Fractional versus ablative erbium:yttrium-aluminum-garnet laser resurfacing for facial rejuvenation: an objective evaluation.

    PubMed

    El-Domyati, Moetaz; Abd-El-Raheem, Talal; Abdel-Wahab, Hossam; Medhat, Walid; Hosam, Wael; El-Fakahany, Hasan; Al Anwer, Mustafa

    2013-01-01

    Laser is one of the main tools for skin resurfacing. Erbium:yttrium-aluminum-garnet (Er:YAG) was the second ablative laser, after carbon dioxide, emitting wavelength of 2940 nm. Fractional laser resurfacing has been developed to overcome the drawbacks of ablative lasers. We aimed to objectively evaluate the histopathological and immunohistochemical effects of Er:YAG 2940-nm laser for facial rejuvenation (multiple sessions of fractional vs single session of ablative Er:YAG laser). Facial resurfacing with single-session ablative Er:YAG laser was performed on 6 volunteers. Another 6 were resurfaced using fractional Er:YAG laser (4 sessions). Histopathological (hematoxylin-eosin, orcein, Masson trichrome, and picrosirius red stains) and immunohistochemical assessment for skin biopsy specimens were done before laser resurfacing and after 1 and 6 months. Histometry for epidermal thickness and quantitative assessment for neocollagen formation; collagen I, III, and VII; elastin; and tropoelastin were done for all skin biopsy specimens. Both lasers resulted in increased epidermal thickness. Dermal collagen showed increased neocollagen formation with increased concentration of collagen types I, III, and VII. Dermal elastic tissue studies revealed decreased elastin whereas tropoelastin concentration increased after laser resurfacing. Neither laser showed significant difference between their effects clinically and on dermal collagen. Changes in epidermal thickness, elastin, and tropoelastin were significantly more marked after ablative laser. The small number of patients is a limitation, yet the results show significant improvement. Multiple sessions of fractional laser have comparable effects to a single session of ablative Er:YAG laser on dermal collagen but ablative laser has more effect on elastic tissue and epidermis. Copyright © 2012 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.

  20. A Team Approach to Management by Objectives with Special Emphasis on Managerial Self-Evaluation.

    ERIC Educational Resources Information Center

    Alvir, Howard P.

    This kit contains everything needed to explain, criticize and plan, simulate, and evaluate a management by objectives (MBO) program. The kit has been field tested in state agencies, schools, businesses, and volunteer organizations. Rather than present only the strengths of MBO, this program defines MBO, presents its strong points in discussing the…

  1. Quantitative Analysis of Verbal Expressions in Comments from Evaluation Committee Reviewers in AIST between Fiscal Years 2001 and 2008

    ERIC Educational Resources Information Center

    Yamamoto, Tetsuya

    2010-01-01

    This article discusses the quantitative analysis of verbal expressions of comments from the evaluation committee reviewers for 8 years (FY2001-FY2008) at the Japanese Public Research Institute, National Institute of Advanced Industrial Science and Technology (AIST). First, the terms often appearing in the comment sheets were observed. Moreover,…

  2. Tophaceous gout: quantitative evaluation by direct physical measurement.

    PubMed

    Schumacher, H Ralph; Becker, Michael A; Palo, William A; Streit, Janet; MacDonald, Patricia A; Joseph-Ridge, Nancy

    2005-12-01

    The absence of accepted standardized methods for monitoring tophaceous gout limits the ability to track tophus progression or regression. This multicenter study assessed intra- and interrater reproducibility of a simple and direct physical measurement. The quantitative evaluation was the area (mm2) of each measurable tophus and was determined independently by 2 raters on 2 occasions within 10 days. Intra- and interrater reproducibilities were determined by calculating mean differences and average percentage differences (APD) in measurements of areas for the same tophus at each of 2 visits and by each rater, respectively. Fifty-two tophi were measured in 13 subjects: 22 on the hand/wrist, 16 on the elbow, and 14 on the foot/ankle. The mean (+/- SD) difference in tophus areas between visits was -0.2 +/- 835 mm2 (95% CI -162 to 162 mm2) and the mean (+/- SD) APD was 29% +/- 33%. The mean (+/- SD) APD between raters was 32% +/- 27%. The largest variations in measurements were noted for elbow tophi and variations were least for well demarcated tophi on the hands. This simple and reproducible method can be easily utilized in clinical trials and in practice as a measure of efficacy of urate-lowering treatment in tophaceous gout. Among factors contributing to variability in these measurements were the anatomic site of tophi and rater experience with the method. Restriction of measurements to well circumscribed hand or foot tophi could improve reliability, but major changes, as expected with effective therapy, can clearly be documented with this simple technique.
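
    The reproducibility statistics reported above, the mean difference and the average percentage difference (APD), can be sketched as follows; the tophus areas are hypothetical, and the APD is computed here as the absolute difference expressed as a percentage of the pair mean, which is one plausible reading of the abstract.

      import numpy as np

      # Hypothetical tophus areas (mm2) measured for the same tophi at two visits.
      visit1 = np.array([120.0, 340.0, 85.0, 510.0, 60.0])
      visit2 = np.array([135.0, 310.0, 92.0, 560.0, 55.0])

      diff = visit2 - visit1
      apd = np.abs(diff) / ((visit1 + visit2) / 2.0) * 100.0

      print(f"mean difference: {diff.mean():.1f} +/- {diff.std(ddof=1):.1f} mm2")
      print(f"mean APD: {apd.mean():.0f}% +/- {apd.std(ddof=1):.0f}%")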

  3. Quantitative Evaluation of Brain Stem Atrophy Using Magnetic Resonance Imaging in Adult Patients with Alexander Disease.

    PubMed

    Yoshida, Tomokatsu; Yasuda, Rei; Mizuta, Ikuko; Nakagawa, Masanori; Mizuno, Toshiki

    2017-01-01

    Brain MRI in adult patients with Alexander disease (AxD) mainly shows atrophy in the medulla oblongata. However, currently there is no quantitative standard for assessing this atrophy. In this study, we quantitatively evaluated the brain stem of AxD patients with glial fibrillary acidic protein (GFAP) mutation using conventional MRI to evaluate its usefulness as an aid to diagnosing AxD in daily clinical practice. Nineteen AxD patients with GFAP mutation were compared with 14 patients negative for GFAP mutation in whom AxD was suspected due to "atrophy of the medulla oblongata." In the GFAP mutation-positive group, the sagittal diameter of the medulla oblongata, the ratio of the diameter of the medulla oblongata to that of the midbrain (MO/MB), and the ratio of the sagittal diameter of the medulla oblongata to that of the pons (MO/Po) were significantly smaller compared to those of the GFAP mutation-negative group (p < 0.01). The sensitivity and specificity of each parameter were 87.5 and 92.3%, 91.7 and 81.3%, and 88.2 and 100% with a sagittal diameter of the medulla oblongata <9.0 mm, MO/MB <0.60, and sagittal MO/Po <0.46, respectively. These parameters can provide very useful information to differentially diagnose AxD from other disorders associated with brain stem atrophy in adult patients. © 2017 S. Karger AG, Basel.
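
    The cutoffs reported above translate directly into a simple rule-based check; the sketch below applies the three published thresholds to hypothetical measurements, and the function name and input fields are ours, not the authors'.

      def suggests_axd(mo_sagittal_mm, midbrain_mm, pons_mm):
          """Apply the reported cutoffs: medulla oblongata (MO) sagittal diameter
          < 9.0 mm, MO/midbrain ratio (MO/MB) < 0.60, sagittal MO/pons ratio (MO/Po) < 0.46."""
          flags = {
              "MO < 9.0 mm": mo_sagittal_mm < 9.0,
              "MO/MB < 0.60": mo_sagittal_mm / midbrain_mm < 0.60,
              "MO/Po < 0.46": mo_sagittal_mm / pons_mm < 0.46,
          }
          return flags

      # Hypothetical measurements (mm) from a mid-sagittal image.
      print(suggests_axd(mo_sagittal_mm=8.2, midbrain_mm=15.0, pons_mm=20.0))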

  4. Highly sensitive and quantitative evaluation of the EGFR T790M mutation by nanofluidic digital PCR.

    PubMed

    Iwama, Eiji; Takayama, Koichi; Harada, Taishi; Okamoto, Isamu; Ookubo, Fumihiko; Kishimoto, Junji; Baba, Eishi; Oda, Yoshinao; Nakanishi, Yoichi

    2015-08-21

    The mutation of T790M in EGFR is a major mechanism of resistance to treatment with EGFR-TKIs. Only qualitative detection (presence or absence) of T790M has been described to date, however. Digital PCR (dPCR) analysis has recently been applied to the quantitative detection of target molecules in cancer with high sensitivity. In the present study, 25 tumor samples (13 obtained before and 12 after EGFR-TKI treatment) from 18 NSCLC patients with activating EGFR mutations were evaluated for T790M with dPCR. The ratio of the number of T790M alleles to that of activating mutation alleles (T/A) was determined. dPCR detected T790M in all 25 samples. Although T790M was present in all pre-TKI samples from 13 patients, 10 of these patients had a low T/A ratio and manifested substantial tumor shrinkage during treatment with EGFR-TKIs. In six of seven patients for whom both pre- and post-TKI samples were available, the T/A ratio increased markedly during EGFR-TKI treatment. Highly sensitive dPCR thus detected T790M in all NSCLC patients harboring activating EGFR mutations whether or not they had received EGFR-TKI treatment. Not only highly sensitive but also quantitative detection of T790M is important for evaluation of the contribution of T790M to EGFR-TKI resistance.
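
    The T/A ratio used above is simply the number of T790M alleles divided by the number of activating-mutation alleles in the dPCR readout; the sketch below computes it for hypothetical pre- and post-TKI samples.

      # Hypothetical dPCR allele counts for one patient.
      samples = {
          "pre-TKI":  {"T790M": 12,  "activating": 4800},
          "post-TKI": {"T790M": 950, "activating": 3900},
      }

      for name, counts in samples.items():
          t_a = counts["T790M"] / counts["activating"]
          print(f"{name}: T/A = {t_a:.4f}")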

  5. Non-cardiopathic and cardiopathic beta-thalassemic patients: quantitative and qualitative cardiac iron deposition evaluation with MRI.

    PubMed

    Macarini, L; Marini, S; Pietrapertosa, A; Scardapane, A; Ettorre, G C

    2005-01-01

    Cardiomyopathy is one of the major complications of β-thalassaemia major as a result of transfusional iron overload. The aim of our study is to evaluate with MR if there is any difference of iron deposition signal intensity (SI) or distribution between non-cardiopathic and cardiopathic thalassaemic patients in order to establish if there is a relationship between cardiopathy and iron deposition. We studied 20 patients affected by β-thalassaemia major, of whom 10 were cardiopathic and 10 non-cardiopathic, and 10 healthy volunteers as a control group. Serum ferritin and left ventricular ejection fraction were calculated in thalassaemic patients. All patients were examined using a 1.5 T MR unit with ECG-gated GE cine-MR T2*-weighted, SE T1-weighted and GE T2*-weighted sequences. In all cases, using an adequate ROI, the myocardial and skeletal muscle signal intensity (SI), the myocardial/skeletal muscle signal intensity ratio (SIR) and the SI average of the myocardium and skeletal muscle were calculated for every study group. The qualitative evaluation of iron deposition distribution was independently performed by three radiologists who analyzed the extension, the site and the morphology of iron deposition on the MR images and reported their observations on the basis of a four-level rating scale: 0 (absent), 1 (limited), 2 (partial), 3 (widespread deposition). The results of the quantitative and qualitative evaluations were analysed with statistical tests. Cardiac iron deposition was found in 8/10 non-cardiopathic thalassaemic patients and in all cardiopathic thalassaemic patients. We noticed a significant SI difference (p>0.05) between the healthy volunteer control group and the thalassaemic patients with iron deposition, but no significant SI difference in iron deposition between non-cardiopathic and cardiopathic thalassaemic patients in the areas evaluated. The qualitative evaluation revealed a different distribution of iron deposition between the two thalassaemic groups, with

  6. Nuclear medicine and imaging research (quantitative studies in radiopharmaceutical science)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, M.D.; Beck, R.N.

    1990-09-01

    This is a report of progress in Year Two (January 1, 1990--December 31, 1990) of Grant FG02-86ER60438, "Quantitative Studies in Radiopharmaceutical Science," awarded for the three-year period January 1, 1989--December 31, 1991 as a competitive renewal following site visit in the fall of 1988. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 25 refs., 13 figs., 1 tab.

  7. Quantitative Evaluation of the Environmental Impact Quotient (EIQ) for Comparing Herbicides

    PubMed Central

    Kniss, Andrew R.; Coburn, Carl W.

    2015-01-01

    Various indicators of pesticide environmental risk have been proposed, and one of the most widely known and used is the environmental impact quotient (EIQ). The EIQ has been criticized by others in the past, but it continues to be used regularly in the weed science literature. The EIQ is typically considered an improvement over simply comparing the amount of herbicides applied by weight. Herbicides are treated differently compared to other pesticide groups when calculating the EIQ, and therefore, it is important to understand how different risk factors affect the EIQ for herbicides. The purpose of this work was to evaluate the suitability of the EIQ as an environmental indicator for herbicides. Simulation analysis was conducted to quantify relative sensitivity of the EIQ to changes in risk factors, and actual herbicide EIQ values were used to quantify the impact of herbicide application rate on the EIQ Field Use Rating. Herbicide use rate was highly correlated with the EIQ Field Use Rating (Spearman’s rho >0.96, P-value <0.001) for two herbicide datasets. Two important risk factors for herbicides, leaching and surface runoff potential, are included in the EIQ calculation but explain less than 1% of total variation in the EIQ. Plant surface half-life was the risk factor with the greatest relative influence on herbicide EIQ, explaining 26 to 28% of the total variation in EIQ for actual and simulated EIQ values, respectively. For herbicides, the plant surface half-life risk factor is assigned values without any supporting quantitative data, and can result in EIQ estimates that are contrary to quantitative risk estimates for some herbicides. In its current form, the EIQ is a poor measure of herbicide environmental impact. PMID:26121252
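
    The EIQ Field Use Rating referred to above is conventionally the product of a herbicide's EIQ, its fraction of active ingredient, and its application rate; the sketch below computes it for hypothetical products and checks the rank correlation with use rate, mirroring the Spearman analysis described in the abstract. All numbers are illustrative, not taken from the paper's datasets.

      import numpy as np
      from scipy.stats import spearmanr

      # Hypothetical herbicides: EIQ value, fraction of active ingredient, rate (kg product/ha).
      eiq = np.array([19.3, 25.0, 15.3, 20.1, 61.7])
      active_fraction = np.array([0.41, 0.48, 0.75, 0.50, 0.80])
      rate_kg_ha = np.array([0.86, 1.12, 0.07, 2.24, 0.02])

      field_use_rating = eiq * active_fraction * rate_kg_ha

      rho, p_value = spearmanr(rate_kg_ha, field_use_rating)
      print("Field Use Ratings:", np.round(field_use_rating, 2))
      print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3f}")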

  8. Quantitative Evaluation of the Environmental Impact Quotient (EIQ) for Comparing Herbicides.

    PubMed

    Kniss, Andrew R; Coburn, Carl W

    2015-01-01

    Various indicators of pesticide environmental risk have been proposed, and one of the most widely known and used is the environmental impact quotient (EIQ). The EIQ has been criticized by others in the past, but it continues to be used regularly in the weed science literature. The EIQ is typically considered an improvement over simply comparing the amount of herbicides applied by weight. Herbicides are treated differently compared to other pesticide groups when calculating the EIQ, and therefore, it is important to understand how different risk factors affect the EIQ for herbicides. The purpose of this work was to evaluate the suitability of the EIQ as an environmental indicator for herbicides. Simulation analysis was conducted to quantify relative sensitivity of the EIQ to changes in risk factors, and actual herbicide EIQ values were used to quantify the impact of herbicide application rate on the EIQ Field Use Rating. Herbicide use rate was highly correlated with the EIQ Field Use Rating (Spearman's rho >0.96, P-value <0.001) for two herbicide datasets. Two important risk factors for herbicides, leaching and surface runoff potential, are included in the EIQ calculation but explain less than 1% of total variation in the EIQ. Plant surface half-life was the risk factor with the greatest relative influence on herbicide EIQ, explaining 26 to 28% of the total variation in EIQ for actual and simulated EIQ values, respectively. For herbicides, the plant surface half-life risk factor is assigned values without any supporting quantitative data, and can result in EIQ estimates that are contrary to quantitative risk estimates for some herbicides. In its current form, the EIQ is a poor measure of herbicide environmental impact.

  9. Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.

    NASA Technical Reports Server (NTRS)

    Leonard, Desiree M.

    1991-01-01

    Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in three main phases of the application and are categorized as: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
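
    Counting objects in a binary image and computing simple geometric properties, as described above, can be sketched with standard connected-component labeling; this uses scipy.ndimage rather than the segmentation and recognition code of the original package, and the test image is made up.

      import numpy as np
      from scipy import ndimage

      # Hypothetical binary image: 1 = object pixel, 0 = background.
      image = np.zeros((10, 12), dtype=int)
      image[1:4, 1:5] = 1      # first object
      image[6:9, 7:11] = 1     # second object

      labels, n_objects = ndimage.label(image)
      index = list(range(1, n_objects + 1))
      areas = ndimage.sum(image, labels, index=index)
      centroids = ndimage.center_of_mass(image, labels, index=index)

      print("objects found:", n_objects)
      for i, (area, c) in enumerate(zip(areas, centroids), start=1):
          print(f"object {i}: area = {area:.0f} px, centroid = {c}")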

  10. Consumer-based technology for distribution of surgical videos for objective evaluation.

    PubMed

    Gonzalez, Ray; Martinez, Jose M; Lo Menzo, Emanuele; Iglesias, Alberto R; Ro, Charles Y; Madan, Atul K

    2012-08-01

    The Global Operative Assessment of Laparoscopic Skill (GOALS) is one validated metric utilized to grade laparoscopic skills and has been utilized to score recorded operative videos. To facilitate viewing of these recorded videos, we are developing novel techniques that enable surgeons to view them. The objective of this study is to determine the feasibility of utilizing widespread current consumer-based technology to assist in distributing appropriate videos for objective evaluation. Videos from residents were recorded through a direct connection from the camera processor's S-video output, via a cable and hub, to a standard laptop computer's universal serial bus (USB) port. A standard consumer-based video editing program was utilized to capture the video and record it in an appropriate format. We utilized mp4 format, and depending on the size of the file, the videos were scaled down (compressed), their format changed (using a standard video editing program), or sliced into multiple videos. Standard available consumer-based programs were utilized to convert the video into a more appropriate format for handheld personal digital assistants. In addition, the videos were uploaded to a social networking website and video sharing websites. Recorded cases of laparoscopic cholecystectomy in a porcine model were utilized. Compression was required for all formats. All formats were accessed from home computers, work computers, and iPhones without difficulty. Qualitative analyses by four surgeons demonstrated appropriate quality to grade for these formats. Our preliminary results show promise that, utilizing consumer-based technology, videos can be easily distributed to surgeons by various methods for grading via GOALS. Easy accessibility may help make evaluation of resident videos less complicated and cumbersome.

  11. TU-H-CAMPUS-IeP2-01: Quantitative Evaluation of PROPELLER DWI Using QIBA Diffusion Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yung, J; Ai, H; Liu, H

    Purpose: The purpose of this study is to determine the quantitative variability of apparent diffusion coefficient (ADC) values when varying imaging parameters in a diffusion-weighted (DW) fast spin echo (FSE) sequence with Periodically Rotated Overlapping ParallEL Lines with Enhanced Reconstruction (PROPELLER) k-space trajectory. Methods: Using a 3T MRI scanner, a NIST traceable, quantitative magnetic resonance imaging (MRI) diffusion phantom (High Precision Devices, Inc, Boulder, Colorado) consisting of 13 vials filled with various concentrations of polymer polyvinylpyrrolidone (PVP) in aqueous solution was imaged with a standard Quantitative Imaging Biomarkers Alliance (QIBA) DWI spin echo, echo planar imaging (SE EPI) acquisition. The same phantom was then imaged with a DWI PROPELLER sequence at varying echo train lengths (ETL) of 8, 20, and 32, as well as b-values of 400, 900, and 2000. QIBA DWI phantom analysis software was used to generate ADC maps and create regions of interest (ROIs) for quantitative measurements of each vial. Mean and standard deviations of the ROIs were compared. Results: The SE EPI sequence generated ADC values that showed very good agreement with the known ADC values of the phantom (r2 = 0.9995, slope = 1.0061). The ADC values measured from the PROPELLER sequences were inflated, but were highly correlated with an r2 range from 0.8754 to 0.9880. The PROPELLER sequence with an ETL=20 and b-value of 0 and 2000 showed the closest agreement (r2 = 0.9034, slope = 0.9880). Conclusion: The DW PROPELLER sequence is promising for quantitative evaluation of ADC values. A drawback of the PROPELLER sequence is the longer acquisition time. The 180° refocusing pulses may also cause the observed increase in ADC values compared to the standard SE EPI DW sequence. However, the FSE sequence offers an advantage with in-plane motion and geometric distortion which will be investigated in future studies.
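
    For reference, the ADC values discussed above come from the mono-exponential diffusion model S(b) = S0 * exp(-b * ADC); the sketch below estimates ADC from two hypothetical b-values and compares a set of measured ADCs against nominal phantom values with a linear fit, analogous to the slope and r2 agreement reported in the abstract. All signal intensities and ADC values are invented for illustration.

      import numpy as np

      # Two-point ADC estimate from hypothetical signal intensities at b = 0 and b = 2000 s/mm2.
      b0, b1 = 0.0, 2000.0
      s0, s1 = 1250.0, 215.0
      adc = np.log(s0 / s1) / (b1 - b0)   # mm2/s
      print(f"ADC ~ {adc:.2e} mm2/s")

      # Agreement of measured vs. nominal vial ADCs (hypothetical, units of 1e-3 mm2/s).
      nominal = np.array([0.40, 0.60, 0.80, 1.10, 2.00])
      measured = np.array([0.43, 0.64, 0.83, 1.15, 2.05])
      slope, intercept = np.polyfit(nominal, measured, 1)
      r2 = np.corrcoef(nominal, measured)[0, 1] ** 2
      print(f"slope = {slope:.4f}, r2 = {r2:.4f}")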

  12. Evaluation of acute ischemic stroke using quantitative EEG: a comparison with conventional EEG and CT scan.

    PubMed

    Murri, L; Gori, S; Massetani, R; Bonanni, E; Marcella, F; Milani, S

    1998-06-01

    The sensitivity of quantitative electroencephalogram (EEG) was compared with that of conventional EEG in patients with acute ischaemic stroke. In addition, a correlation between quantitative EEG data and computerized tomography (CT) scan findings was carried out for all the areas of lesion in order to reassess the actual role of EEG in the evaluation of stroke. Sixty-five patients were tested with conventional and quantitative EEG within 24 h from the onset of neurological symptoms, whereas CT scan was performed within 4 days from the onset of stroke. EEG was recorded from 19 electrodes placed upon the scalp according to the International 10-20 System. Spectral analysis was carried out on 30 artefact-free 4-sec epochs. For each channel, absolute and relative power were calculated for the delta, theta, alpha and beta frequency bands and such data were successively represented in colour-coded maps. Ten patients with extensive lesions documented by CT scan were excluded. The results indicated that conventional EEG revealed abnormalities in 40 of 55 cases, while EEG mapping showed abnormalities in 46 of 55 cases: it showed focal abnormalities in five cases and nonfocal abnormalities in one of six cases which had appeared to be normal according to visual inspection of EEG. In a further 11 cases, where the conventional EEG revealed abnormalities in one hemisphere, the quantitative EEG and maps allowed the abnormal activity to be localized more precisely. The sensitivity of both methods was higher for frontocentral, temporal and parieto-occipital cortical-subcortical infarctions than for basal ganglia and internal capsule lesions; however, quantitative EEG was more efficient for all areas of lesion in detecting cases that had appeared normal by visual inspection and was clearly superior in revealing focal abnormalities. When we considered the electrode at which the maximum power of the delta frequency band was recorded, a fairly close correlation was found
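
    The spectral quantities described above, absolute and relative power in the delta, theta, alpha and beta bands, can be sketched per channel with Welch's method; the synthetic one-channel signal, sampling rate, and band edges below are illustrative only and are not taken from the study.

      import numpy as np
      from scipy.signal import welch

      fs = 256                        # sampling rate (Hz), hypothetical
      t = np.arange(0, 4.0, 1 / fs)   # one 4-s epoch
      rng = np.random.default_rng(1)
      eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)  # alpha-dominated toy signal

      freqs, psd = welch(eeg, fs=fs, nperseg=fs)

      bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
      broad = (freqs >= 0.5) & (freqs < 30)
      total = np.trapz(psd[broad], freqs[broad])
      for name, (f_lo, f_hi) in bands.items():
          sel = (freqs >= f_lo) & (freqs < f_hi)
          absolute = np.trapz(psd[sel], freqs[sel])
          print(f"{name}: absolute = {absolute:.3f}, relative = {absolute / total:.2f}")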

  13. Qualitative and quantitative evaluation of human dental enamel after bracket debonding: a noncontact three-dimensional optical profilometry analysis.

    PubMed

    Ferreira, Fabiano G; Nouer, Darcy F; Silva, Nelson P; Garbui, Ivana U; Correr-Sobrinho, Lourenço; Nouer, Paulo R A

    2014-09-01

    The aim of this study was to undertake a qualitative and quantitative evaluation of changes on enamel surfaces after debonding of brackets followed by finishing procedures, using a high-resolution three-dimensional optical profiler and to investigate the accuracy of the technique. The labial surfaces of 36 extracted upper central incisors were examined. Before bonding, the enamel surfaces were subjected to profilometry, recording four amplitude parameters. Brackets were then bonded using two types of light-cured orthodontic adhesive: composite resin and resin-modified glass ionomer cement. Finishing was performed by three different methods: pumice on a rubber cup, fine and ultrafine aluminum oxide discs, and microfine diamond cups followed by silicon carbide brushes. The samples were subsequently re-analyzed by profilometry. Wilcoxon signed-rank test, Kruskal-Wallis test (p < 0.05) and a posteriori Mann-Whitney U test with Bonferroni correction (p < 0.0167) revealed a significant reduction of enamel roughness when diamond cups followed by silicon carbide brushes were used to finish surfaces that had remnants of resin-modified glass ionomer adhesive and when pumice was used to finish surfaces that had traces of composite resin. Enamel loss was minimal. The 3D optical profilometry technique was able to provide accurate qualitative and quantitative assessment of changes on the enamel surface after debonding. Morphological changes in the topography of dental surfaces, especially if related to enamel loss and roughness, are of considerable clinical importance. The quantitative evaluation method used herein enables a more comprehensive understanding of the effects of orthodontic bonding on teeth.
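
    Amplitude parameters of the kind recorded by the profilometer above can be computed directly from a height map; the particular parameter set below (Sa, Sq, Sp, Sv) and the synthetic surface are assumptions, since the abstract does not name the four amplitude parameters used.

      import numpy as np

      # Hypothetical enamel height map (micrometres) from an optical profiler scan.
      rng = np.random.default_rng(2)
      z = rng.normal(0.0, 0.15, size=(256, 256))

      z = z - z.mean()                  # reference heights to the mean plane
      sa = np.mean(np.abs(z))           # arithmetic mean height
      sq = np.sqrt(np.mean(z ** 2))     # root-mean-square height
      sp = z.max()                      # maximum peak height
      sv = -z.min()                     # maximum valley depth

      print(f"Sa = {sa:.3f} um, Sq = {sq:.3f} um, Sp = {sp:.3f} um, Sv = {sv:.3f} um")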

  14. Quantitative Hyperspectral Reflectance Imaging

    PubMed Central

    Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.

    2008-01-01

    Hyperspectral imaging is a non-destructive optical analysis technique that can for instance be used to obtain information from cultural heritage objects unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). By using a wavelength tunable narrow-bandwidth light-source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document, where different types of ink had been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects of artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms. PMID:27873831

  15. Combination of optically measured coordinates and displacements for quantitative investigation of complex objects

    NASA Astrophysics Data System (ADS)

    Andrae, Peter; Beeck, Manfred-Andreas; Jueptner, Werner P. O.; Nadeborn, Werner; Osten, Wolfgang

    1996-09-01

    Holographic interferometry makes it possible to measure high precision displacement data in the range of the wavelength of the used laser light. However, the determination of 3D-displacement vectors of objects with complex surfaces requires the measurement of 3D-object coordinates not only to consider local sensitivities but to distinguish between in-plane deformation, i.e. strains, and out-of-plane components, i.e. shears, too. To this purpose both the surface displacement and coordinates have to be combined and it is advantageous to make the data available for CAE-systems. The object surface has to be approximated analytically from the measured point cloud to generate a surface mesh. The displacement vectors can be assigned to the nodes of this surface mesh for visualization of the deformation of the object under test. They also can be compared to the results of FEM-calculations or can be used as boundary conditions for further numerical investigations. Here the 3D-object coordinates are measured in a separate topometric set-up using a modified fringe projection technique to acquire absolute phase values and a sophisticated geometrical model to map these phase data onto coordinates precisely. The determination of 3D-displacement vectors requires the measurement of several interference phase distributions for at least three independent sensitivity directions depending on the observation and illumination directions as well as the 3D-position of each measuring point. These geometric quantities have to be transformed into a reference coordinate system of the interferometric set-up in order to calculate the geometric matrix. The necessary transformation can be realized by means of a detection of object features in both data sets and a subsequent determination of the external camera orientation. This paper presents a consistent solution for the measurement and combination of shape and displacement data including their transformation into simulation systems. The
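
    The relation underlying the displacement evaluation described above is commonly written as delta_phi_i = (2*pi/lambda) * (k_obs,i - k_ill,i) · d, i.e. each unwrapped interference phase is the projection of the displacement vector d onto a sensitivity vector fixed by the illumination and observation directions; with three independent sensitivity directions, d follows from a small linear solve. The geometry, wavelength, and phase values below are hypothetical.

      import numpy as np

      wavelength = 532e-9  # m, hypothetical laser wavelength

      # Unit illumination and observation direction vectors for three sensitivities.
      k_ill = np.array([[0.0, 0.0, -1.0], [0.3, 0.0, -0.95], [0.0, 0.3, -0.95]])
      k_obs = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
      k_ill /= np.linalg.norm(k_ill, axis=1, keepdims=True)
      k_obs /= np.linalg.norm(k_obs, axis=1, keepdims=True)

      # Geometric (sensitivity) matrix, one row per interferometric measurement.
      g = (2 * np.pi / wavelength) * (k_obs - k_ill)

      # Hypothetical unwrapped interference phases (rad) at one measuring point.
      delta_phi = np.array([23.5, 21.1, 25.7])

      # 3D-displacement vector at that point (metres).
      d = np.linalg.solve(g, delta_phi)
      print("displacement (nm):", np.round(d * 1e9, 2))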

  16. Evaluation of Genetic Algorithm Concepts using Model Problems. Part 1; Single-Objective Optimization

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Pulliam, Thomas H.

    2003-01-01

    A genetic-algorithm-based optimization approach is described and evaluated using a simple hill-climbing model problem. The model problem utilized herein allows for the broad specification of a large number of search spaces including spaces with an arbitrary number of genes or decision variables and an arbitrary number of hills or modes. In the present study, only single-objective problems are considered. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all problems attempted. The most difficult problems - those with large hyper-volumes and multi-mode search spaces containing a large number of genes - require a large number of function evaluations for GA convergence, but they always converge.
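
    A minimal single-objective genetic algorithm of the kind evaluated above can be sketched in a few lines; the multi-modal test function, population size, and operator choices below are illustrative assumptions and are not the code described in the report.

      import numpy as np

      rng = np.random.default_rng(3)

      def fitness(x):
          """Multi-modal 'hill' landscape on [0, 1]^n, to be maximized."""
          return np.sum(np.sin(5 * np.pi * x) ** 2, axis=-1)

      n_genes, pop_size, n_gen = 4, 40, 100
      pop = rng.random((pop_size, n_genes))

      for _ in range(n_gen):
          f = fitness(pop)
          # Tournament selection of parent indices.
          a, b = rng.integers(pop_size, size=(2, pop_size))
          parents = pop[np.where(f[a] > f[b], a, b)]
          # Uniform crossover between randomly paired parents.
          mates = parents[rng.permutation(pop_size)]
          mask = rng.random(pop.shape) < 0.5
          children = np.where(mask, parents, mates)
          # Gaussian mutation on roughly 10% of genes, clipped to the search box.
          children = children + rng.normal(0, 0.02, children.shape) * (rng.random(children.shape) < 0.1)
          pop = np.clip(children, 0.0, 1.0)

      best = pop[np.argmax(fitness(pop))]
      print("best fitness:", round(float(fitness(best)), 3), "at", np.round(best, 3))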

  17. Design and evaluation of an ultra-slim objective for in-vivo deep optical biopsy

    PubMed Central

    Landau, Sara M.; Liang, Chen; Kester, Robert T.; Tkaczyk, Tomasz S.; Descour, Michael R.

    2010-01-01

    An estimated 1.6 million breast biopsies are performed in the US each year. In order to provide real-time, in-vivo imaging with sub-cellular resolution for optical biopsies, we have designed an ultra-slim objective to fit inside the 1-mm-diameter hypodermic needles currently used for breast biopsies to image tissue stained by the fluorescent probe proflavine. To ensure high-quality imaging performance, experimental tests were performed to characterize fiber bundle’s light-coupling efficiency and simulations were performed to evaluate the impact of candidate lens materials’ autofluorescence. A prototype of NA = 0.4, 250-µm field of view, ultra-slim objective optics was built and tested, yielding diffraction-limited performance and estimated resolution of 0.9 µm. When used in conjunction with a commercial coherent fiber bundle to relay the image formed by the objective, the measured resolution was 2.5 µm. PMID:20389489

  18. Radiographic changes of the pelvis in Labrador and Golden Retrievers after juvenile pubic symphysiodesis: objective and subjective evaluation.

    PubMed

    Boiocchi, S; Vezzoni, L; Vezzoni, A; Bronzo, V; Rossi, F

    2013-01-01

    The hypothesis of this study was that juvenile pubic symphysiodesis (JPS) results in pelvic changes that can be identified radiographically in adult dogs. The medical records at the Clinica Veterinaria Vezzoni were searched for standard ventro-dorsal views of the pelvis of adult Labrador and Golden Retrievers that had undergone JPS or had not undergone surgery. The objective assessment of radiographs included the analysis of various pelvic measurements. Subjective evaluation of radiographs was undertaken by 18 specialists and 21 general practitioners and was based on five criteria relating to 1) the acetabular fossae, 2) the pubic symphysis, 3) the margin of the cranial pubic area, 4) the pubic rami, and 5) the obturator foramen. The radiographs of 42 Labrador Retrievers and 16 Golden Retrievers were evaluated. The most useful criteria were the radiographic measurement of the shape of the obturator foramen and two different ratios of length to width of the pubic rami; these values were significantly smaller in dogs after JPS. The pelvic canal width was the same in both groups. All objective measurements were repeatable within and between evaluators. The most reliable subjective criterion was number 4, followed by number 5 in Golden Retrievers and by 2 in Labrador Retrievers. Our objective and subjective evaluations were simple and yielded useful and repeatable results. There was no significant difference between general practitioners and specialists with regard to subjective evaluation, which indicates that these evaluation criteria can be used by small animal clinicians after minimal training.

  19. 99mTc-sestamibi scintigraphy used to evaluate tumor response to neoadjuvant chemotherapy in locally advanced breast cancer: A quantitative analysis

    PubMed Central

    KOGA, KATIA HIROMOTO; MORIGUCHI, SONIA MARTA; NETO, JORGE NAHÁS; PERES, STELA VERZINHASSE; SILVA, EDUARDO TINÓIS DA; SARRI, ALMIR JOSÉ; MICHELIN, ODAIR CARLITO; MARQUES, MARIANGELA ESTHER ALENCAR; GRIVA, BEATRIZ LOTUFO

    2010-01-01

    To evaluate the tumor response to neoadjuvant chemotherapy, 99mTc-sestamibi breast scintigraphy was proposed as a quantitative method. Fifty-five patients with ductal carcinoma were studied. They underwent breast scintigraphy before and after neoadjuvant chemotherapy, along with clinical assessment and surgical specimen analysis. The regions of interest on the lesion and contralateral breast were identified, and the pixel counts were used to evaluate lesion uptake in relation to background radiation. The ratio of these counts before to after neoadjuvant chemotherapy was assessed. The decrease in uptake rate due to chemotherapy characterized the scintigraphy tumor response. The Kruskal-Wallis test was used to compare the mean scintigraphic tumor response and histological type. Dunn’s multiple comparison test was used to detect differences between histological types. The Mann-Whitney test was used to compare means between quantitative and qualitative variables: scintigraphic tumor response vs. clinical response and uptake before chemotherapy vs. scintigraphic tumor response. The Spearman’s test was used to correlate the quantitative variables of clinical reduction in tumor size and scintigraphic tumor response. All of the variables compared presented significant differences. The change in 99mTc-sestamibi uptake noted on breast scintigraphy, before to after neoadjuvant chemotherapy, may be used as an effective method for evaluating the response to neoadjuvant chemotherapy, since this quantification reflects the biological behavior of the tumor towards the chemotherapy regimen. Furthermore, additional analysis on the uptake rate before chemotherapy may accurately predict treatment response. PMID:22966312
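
    The quantification described above reduces to a lesion-to-background count ratio computed before and after chemotherapy, with the relative decrease in that ratio taken as the scintigraphic tumor response; the ROI counts below are hypothetical.

      # Hypothetical mean counts per pixel in the lesion ROI and in the mirror ROI
      # on the contralateral breast (background), before and after chemotherapy.
      pre = {"lesion": 185.0, "background": 52.0}
      post = {"lesion": 88.0, "background": 50.0}

      uptake_pre = pre["lesion"] / pre["background"]
      uptake_post = post["lesion"] / post["background"]
      response = (uptake_pre - uptake_post) / uptake_pre * 100.0

      print(f"uptake ratio pre = {uptake_pre:.2f}, post = {uptake_post:.2f}")
      print(f"scintigraphic tumor response: {response:.0f}% reduction in uptake")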

  20. Problems with Piagetian Conservation and Musical Objects.

    ERIC Educational Resources Information Center

    Bartholomew, Douglas

    1987-01-01

    Notes that Piaget's theory of cognitive development was based on the child's interaction with material objects and quantitative relationships. Examines the applicability of Piaget's concept of operational intelligence and conservation to music learning. Concludes that a theory of music learning must apply equally to the non-material and…