Sample records for objective quantitative evaluation

  1. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  2. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    NASA Astrophysics Data System (ADS)

    Li, Shunhe; Rao, Jianhua; Gui, Lin; Zhang, Weimin; Liu, Degang

    2017-11-01

    The result of remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured in the EOL stage of the machine tool lifecycle. The objectivity and accuracy of the evaluation are key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which brings the evaluation results into line with conventional human judgment. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.

  3. FE-ANN based modeling of 3D Simple Reinforced Concrete Girders for Objective Structural Health Evaluation: Tech Transfer Summary

    DOT National Transportation Integrated Search

    2017-06-01

    The objective of this study was to develop an objective, quantitative method for evaluating damage to bridge girders by using artificial neural networks (ANNs). This evaluation method, which is a supplement to visual inspection, requires only the res...

  4. Integrated Approach To Design And Analysis Of Systems

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Iverson, David L.

    1993-01-01

    Object-oriented fault-tree representation unifies evaluation of reliability and diagnosis of faults. Programming/fault tree described more fully in "Object-Oriented Algorithm For Evaluation Of Fault Trees" (ARC-12731). Augmented fault tree object contains more information than fault tree object used in quantitative analysis of reliability. Additional information needed to diagnose faults in system represented by fault tree.

  5. Quantitative evaluation of his-tag purification and immunoprecipitation of tristetraprolin and its mutant proteins from transfected human cells

    USDA-ARS's Scientific Manuscript database

    Histidine (His)-tag is widely used for affinity purification of recombinant proteins, but the yield and purity of expressed proteins are quite different. Little information is available about quantitative evaluation of this procedure. The objective of the current study was to evaluate the His-tag pr...

  6. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has left the concepts obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  7. Design and Implementation of Performance Metrics for Evaluation of Assessments Data

    ERIC Educational Resources Information Center

    Ahmed, Irfan; Bhatti, Arif

    2016-01-01

    Evocative evaluation of assessment data is essential to quantify the achievements at course and program levels. The objective of this paper is to design performance metrics and respective formulas to quantitatively evaluate the achievement of set objectives and expected outcomes at the course levels for program accreditation. Even though…

  8. The Positive Alternative Credit Experience (PACE) Program a Quantitative Comparative Study

    ERIC Educational Resources Information Center

    Warren, Rebecca Anne

    2011-01-01

    The purpose of this quantitative comparative study was to evaluate the Positive Alternative Credit Experience (PACE) Program using an objectives-oriented approach to a formative program evaluation. The PACE Program was a semester-long high school alternative education program designed to serve students at-risk for academic failure or dropping out…

  9. Matrix evaluation of science objectives

    NASA Technical Reports Server (NTRS)

    Wessen, Randii R.

    1994-01-01

    The most fundamental objective of all robotic planetary spacecraft is to return science data. To accomplish this, a spacecraft is fabricated and built, software is planned and coded, and a ground system is designed and implemented. However, the quantitative analysis required to determine how the collection of science data drives ground system capabilities has received very little attention. This paper defines a process by which science objectives can be quantitatively evaluated. By applying it to the Cassini Mission to Saturn, this paper further illustrates the power of this technique. The results show which science objectives drive specific ground system capabilities. In addition, this process can assist system engineers and scientists in the selection of the science payload during pre-project mission planning; ground system designers during ground system development and implementation; and operations personnel during mission operations.

  10. Object-Oriented Algorithm For Evaluation Of Fault Trees

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1992-01-01

    Algorithm for direct evaluation of fault trees incorporates techniques of object-oriented programming. Reduces number of calls needed to solve trees with repeated events. Provides significantly improved software environment for such computations as quantitative analyses of safety and reliability of complicated systems of equipment (e.g., spacecraft or factories).
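
    The record above describes the object-oriented fault-tree evaluator only in general terms. As a rough illustration of the idea, the sketch below models basic events and gates as objects and computes the top-event failure probability; it assumes independent, non-repeated events, whereas the main point of the ARC-12731 algorithm is its special handling of repeated events, which this sketch does not reproduce. All class names and event values are hypothetical.

    ```python
    class Event:
        """Basic (leaf) event with a known failure probability."""
        def __init__(self, name, p):
            self.name, self.p = name, p

        def probability(self):
            return self.p


    class AndGate:
        """Fails only if all inputs fail (independent events assumed)."""
        def __init__(self, *inputs):
            self.inputs = inputs

        def probability(self):
            p = 1.0
            for node in self.inputs:
                p *= node.probability()
            return p


    class OrGate:
        """Fails if any input fails (independent events assumed)."""
        def __init__(self, *inputs):
            self.inputs = inputs

        def probability(self):
            q = 1.0
            for node in self.inputs:
                q *= 1.0 - node.probability()
            return 1.0 - q


    # Hypothetical tree: the system fails if the pump fails OR both valves fail.
    top = OrGate(Event("pump", 0.01),
                 AndGate(Event("valve A", 0.05), Event("valve B", 0.05)))
    print(top.probability())  # ~0.0125
    ```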

  11. Quantitative endoscopy: initial accuracy measurements.

    PubMed

    Truitt, T O; Adelman, R A; Kelly, D H; Willging, J P

    2000-02-01

    The geometric optics of an endoscope can be used to determine the absolute size of an object in an endoscopic field without knowing the actual distance from the object. This study explores the accuracy of a technique that estimates absolute object size from endoscopic images. Quantitative endoscopy involves calibrating a rigid endoscope to produce size estimates from 2 images taken with a known traveled distance between the images. The heights of 12 samples, ranging in size from 0.78 to 11.80 mm, were estimated with this calibrated endoscope. Backup distances of 5 mm and 10 mm were used for comparison. The mean percent error for all estimated measurements when compared with the actual object sizes was 1.12%. The mean errors for 5-mm and 10-mm backup distances were 0.76% and 1.65%, respectively. The mean errors for objects <2 mm and ≥2 mm were 0.94% and 1.18%, respectively. Quantitative endoscopy estimates endoscopic image size to within 5% of the actual object size. This method remains promising for quantitatively evaluating object size from endoscopic images. It does not require knowledge of the absolute distance of the endoscope from the object; rather, only the distance traveled by the endoscope between images.
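
    The abstract states the principle (two images separated by a known backup distance) but not the calibration equations. The sketch below works through a simple pinhole-camera model of that principle; the calibration constant k_px, the function name, and the example numbers are assumptions for illustration, not values from the study.

    ```python
    def estimate_height_mm(h1_px, h2_px, backup_mm, k_px):
        """Estimate the true object height (mm) from two endoscopic images.

        h1_px, h2_px : apparent object heights (pixels) before and after
                       withdrawing the scope by `backup_mm` millimetres
                       (h1_px > h2_px: the object looks smaller from farther away).
        k_px         : calibration constant (effective focal length in pixels),
                       assumed to be measured once for the rigid endoscope by
                       imaging a target of known size at a known distance.

        Pinhole model: h_px = k_px * H / d, which gives
            H = backup_mm * h1_px * h2_px / (k_px * (h1_px - h2_px))
        """
        if h1_px <= h2_px:
            raise ValueError("object must appear smaller in the backed-up image")
        return backup_mm * h1_px * h2_px / (k_px * (h1_px - h2_px))


    # Hypothetical numbers: a lesion spans 420 px, then 300 px after a 10 mm
    # backup, with a scope calibrated at k_px = 900 px.
    print(round(estimate_height_mm(420.0, 300.0, 10.0, 900.0), 2))  # 11.67
    ```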

  12. Quantitative acoustic emission monitoring of fatigue cracks in fracture critical steel bridges.

    DOT National Transportation Integrated Search

    2014-01-01

    The objective of this research is to evaluate the feasibility of employing quantitative acoustic emission (AE) techniques for monitoring fatigue crack initiation and propagation in steel bridge members. Three A36 compact tension steel specimens w...

  13. A GIS-BASED METHOD FOR MULTI-OBJECTIVE EVALUATION OF PARK VEGETATION. (R824766)

    EPA Science Inventory

    In this paper we describe a method for evaluating the concordance between a set of mapped landscape attributes and a set of quantitatively expressed management priorities. The method has proved to be useful in planning urban green areas, allowing objectively d...

  14. A new method to evaluate image quality of CBCT images quantitatively without observers

    PubMed Central

    Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori

    2017-01-01

    Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes with different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. In detecting holes quantitatively, the threshold grey value (ΔG), which differentiated holes from the background, was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratio (CNR) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343
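
    A minimal sketch of the observer-free scoring step described above, under assumed details: holes are "extracted" when their mean grey value differs from the background by more than a threshold ΔG derived from a just-noticeable difference, and each hole's CNR can be checked against the reported ≥1.5 criterion. The JND fraction and all names here are placeholders, not values from the paper.

    ```python
    import numpy as np

    def count_extracted_holes(hole_means, background_roi, jnd_fraction=0.02):
        """Observer-free hole counting in the spirit of the JND method.

        hole_means     : mean grey value inside each candidate hole ROI
        background_roi : grey values sampled from the surrounding background
        jnd_fraction   : assumed just-noticeable relative difference; the value
                         actually used in the paper is not given in the abstract.

        A hole counts as "extracted" when its grey value differs from the
        background mean by more than the threshold grey value delta_g.
        """
        bg_mean = float(np.mean(background_roi))
        delta_g = jnd_fraction * bg_mean
        return int(np.sum(np.abs(np.asarray(hole_means) - bg_mean) > delta_g))

    def cnr(hole_roi, background_roi):
        """Contrast-to-noise ratio of one hole against the background
        (the paper reports CNR >= 1.5 for all extracted holes)."""
        return abs(np.mean(hole_roi) - np.mean(background_roi)) / np.std(background_roi)
    ```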

  15. Standardizing Quality Assessment of Fused Remotely Sensed Images

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites has led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment is done by different criteria. Depending on the criteria and indices, the result varies. It is therefore necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process for objectively comparing fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  16. An Evaluation of a Community Health Intervention Programme Aimed at Improving Health and Wellbeing

    ERIC Educational Resources Information Center

    Strachan, G.; Wright, G. D.; Hancock, E.

    2007-01-01

    Objective: The objective of this evaluation was to examine the extent to which participants in the Tailor Made Leisure Package programme experienced any improvement in their health and wellbeing. Design: A quantitative survey. Setting: The Healthy Living Centre initiative is an example of a community-based intervention which was formalized as part…

  17. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626

  18. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation.
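
    Records 17 and 18 above are the PubMed Central and PubMed listings of the same paper. The abstract names the noise-to-slope ratio (NSR) as the figure of merit but does not spell out the model; under the linear measurement model usually assumed in regression-without-truth work (a sketch, not a quotation from the paper), the NSR of method m would be written as:

    ```latex
    % Assumed linear measurement model (not quoted from the paper): method m
    % measures the true activity concentration a_p of patient/VOI p as
    \hat{a}_{m,p} = u_m\, a_p + v_m + \epsilon_{m,p},
    \qquad \epsilon_{m,p} \sim \mathcal{N}\!\left(0, \sigma_m^{2}\right),
    \qquad \mathrm{NSR}_m = \frac{\sigma_m}{u_m},
    % with (u_m, v_m, \sigma_m) estimated by maximum likelihood without access
    % to the true values a_p; smaller NSR means better precision.
    ```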

  19. CCTV Coverage Index Based on Surveillance Resolution and Its Evaluation Using 3D Spatial Analysis

    PubMed Central

    Choi, Kyoungah; Lee, Impyeong

    2015-01-01

    We propose a novel approach to evaluating how effectively a closed circuit television (CCTV) system can monitor a targeted area. With 3D models of the target area and the camera parameters of the CCTV system, the approach produces surveillance coverage index, which is newly defined in this study as a quantitative measure for surveillance performance. This index indicates the proportion of the space being monitored with a sufficient resolution to the entire space of the target area. It is determined by computing surveillance resolution at every position and orientation, which indicates how closely a specific object can be monitored with a CCTV system. We present full mathematical derivation for the resolution, which depends on the location and orientation of the object as well as the geometric model of a camera. With the proposed approach, we quantitatively evaluated the surveillance coverage of a CCTV system in an underground parking area. Our evaluation process provided various quantitative-analysis results, compelling us to examine the design of the CCTV system prior to its installation and understand the surveillance capability of an existing CCTV system. PMID:26389909
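
    The coverage index in this record is described only verbally in the abstract. The sketch below shows one way such an index could be computed once the surveillance resolution has been evaluated at sampled positions and orientations; the resolution model, threshold value, and names are assumptions, not the paper's formulas.

    ```python
    import numpy as np

    def surveillance_resolution(f_px, distance_m, incidence_rad):
        """Assumed pinhole-style resolution model (pixels per metre on the
        object): closer, more frontal views give higher resolution."""
        return f_px * np.cos(incidence_rad) / distance_m

    def coverage_index(resolutions, required_px_per_m=100.0):
        """Proportion of sampled positions/orientations whose best available
        resolution (max over all cameras) meets the required threshold."""
        r = np.asarray(resolutions, dtype=float)
        return float(np.mean(r >= required_px_per_m))
    ```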

  20. Quantitative phase-contrast digital holographic microscopy for cell dynamic evaluation

    NASA Astrophysics Data System (ADS)

    Yu, Lingfeng; Mohanty, Samarendra; Berns, Michael W.; Chen, Zhongping

    2009-02-01

    The laser microbeam uses lasers to alter and/or to ablate intracellular organelles and cellular and tissue samples, and, today, has become an important tool for cell biologists to study the molecular mechanism of complex biological systems by removing individual cells or sub-cellular organelles. However, absolute quantitation of the localized alteration/damage to transparent phase objects, such as the cell membrane or chromosomes, was not possible using conventional phase-contrast or differential interference contrast microscopy. We report the development of phase-contrast digital holographic microscopy for quantitative evaluation of cell dynamic changes in real time during laser microsurgery. Quantitative phase images are recorded during the process of laser microsurgery and thus, the dynamic change in phase can be continuously evaluated. Out-of-focus organelles are re-focused by numerical reconstruction algorithms.

  1. Breach Risk Magnitude: A Quantitative Measure of Database Security.

    PubMed

    Yasnoff, William A

    2016-01-01

    A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
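
    The BRM definition in this abstract translates directly into a few lines of code. The sketch below follows that stated definition; the example input is hypothetical, chosen only to reproduce the quoted upper bound of 6 for a one-million-record database behind a single authentication step.

    ```python
    import math

    def breach_risk_magnitude(users):
        """Breach Risk Magnitude as defined in the abstract: the maximum, over
        all system users, of log10(number of records accessible to that user)
        divided by the number of authentication steps needed for that access.

        `users` is a list of (accessible_records, auth_steps) pairs.
        """
        return max(math.log10(records) / steps
                   for records, steps in users if records > 0)

    # Illustrative only: one user reaching 1,000,000 records behind a single
    # authentication step gives BRM = 6.0, the upper end of the range quoted
    # for the one-million-record relational database.
    print(breach_risk_magnitude([(1_000_000, 1)]))  # 6.0
    ```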

  2. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.

  3. Objective, Quantitative, Data-Driven Assessment of Chemical Probes.

    PubMed

    Antolin, Albert A; Tym, Joseph E; Komianou, Angeliki; Collins, Ian; Workman, Paul; Al-Lazikani, Bissan

    2018-02-15

    Chemical probes are essential tools for understanding biological systems and for target validation, yet selecting probes for biomedical research is rarely based on objective assessment of all potential compounds. Here, we describe the Probe Miner: Chemical Probes Objective Assessment resource, capitalizing on the plethora of public medicinal chemistry data to empower quantitative, objective, data-driven evaluation of chemical probes. We assess >1.8 million compounds for their suitability as chemical tools against 2,220 human targets and dissect the biases and limitations encountered. Probe Miner represents a valuable resource to aid the identification of potential chemical probes, particularly when used alongside expert curation. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  4. Evaluation Processes Used to Assess the Effectiveness of Vocational-Technical Programs.

    ERIC Educational Resources Information Center

    Bruhns, Arthur E.

    Evaluation is quantitative or qualitative, the criteria determined by or given to the student. The criteria show how close he has come to the program's objectives and the ranking of individual performance. Vocational education programs susceptible to evaluation are listed and relevant evaluative techniques discussed. Graduate interviews concerning…

  5. Failure to Integrate Quantitative Measurement Methods of Ocular Inflammation Hampers Clinical Practice and Trials on New Therapies for Posterior Uveitis.

    PubMed

    Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc

    2017-05-01

    Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but also in parallel precise and quantitative measurement methods developed allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint; whereas a whole array of precise, quantitative, and objective modalities are available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.

  6. Evaluation of background parenchymal enhancement on breast MRI: a systematic review

    PubMed Central

    Signori, Alessio; Valdora, Francesca; Rossi, Federica; Calabrese, Massimo; Durando, Manuela; Mariscotto, Giovanna; Tagliafico, Alberto

    2017-01-01

    Objective: To perform a systematic review of the methods used for background parenchymal enhancement (BPE) evaluation on breast MRI. Methods: Studies dealing with BPE assessment on breast MRI were retrieved from major medical libraries independently by four reviewers up to 6 October 2015. The keywords used for database searching are “background parenchymal enhancement”, “parenchymal enhancement”, “MRI” and “breast”. The studies were included if qualitative and/or quantitative methods for BPE assessment were described. Results: Of the 420 studies identified, a total of 52 articles were included in the systematic review. 28 studies performed only a qualitative assessment of BPE, 13 studies performed only a quantitative assessment and 11 studies performed both qualitative and quantitative assessments. A wide heterogeneity was found in the MRI sequences and in the quantitative methods used for BPE assessment. Conclusion: A wide variability exists in the quantitative evaluation of BPE on breast MRI. More studies focused on a reliable and comparable method for quantitative BPE assessment are needed. Advances in knowledge: More studies focused on a quantitative BPE assessment are needed. PMID:27925480

  7. A quantitative method for evaluating alternatives. [aid to decision making]

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
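
    The abstract names the method ("hierarchical weighted average") without giving its formulas. The sketch below implements the generic idea of weighting criteria arranged in a tree and rolling scores up to a single number per alternative; the criteria, weights, and scores are invented for illustration, and the NASA report's exact scheme may differ.

    ```python
    def hierarchical_weighted_average(tree, scores):
        """Score one alternative with a hierarchical weighted average.

        tree   : {criterion: (weight, subtree_or_None)}; the weights at each
                 level are assumed to already sum to 1.
        scores : {leaf_criterion: rater's score for this alternative}
        """
        total = 0.0
        for name, (weight, children) in tree.items():
            value = (scores[name] if children is None
                     else hierarchical_weighted_average(children, scores))
            total += weight * value
        return total

    # Hypothetical criteria for comparing computer-system design alternatives.
    criteria = {
        "performance": (0.5, {"throughput": (0.6, None), "latency": (0.4, None)}),
        "cost":        (0.3, None),
        "reliability": (0.2, None),
    }
    scores = {"throughput": 8, "latency": 6, "cost": 7, "reliability": 9}
    print(round(hierarchical_weighted_average(criteria, scores), 3))  # 7.5
    ```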

  8. Conflicts Management Model in School: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The object of this study is to evaluate the reasons for conflicts occurring in school according to perceptions and views of teachers and resolution strategies used for conflicts and to build a model based on the results obtained. In the research, explanatory design including quantitative and qualitative methods has been used. The quantitative part…

  9. Evaluation of Instructional Materials for Exceptional Children and Youth: A Preliminary Instrument.

    ERIC Educational Resources Information Center

    Eash, Maurice

    An instrument for the evaluation of instructional materials is presented. Evaluative items are arranged under four constructs: objectives, organization of material (both scope and sequence), methodology, and evaluation. A section is also provided for summary quantitative judgment. A glossary of terms used in the instrument is included. A training…

  10. Teaching Research and Practice Evaluation Skills to Graduate Social Work Students

    ERIC Educational Resources Information Center

    Wong, Stephen E.; Vakharia, Sheila P.

    2012-01-01

    Objective: The authors examined outcomes of a graduate course on evaluating social work practice that required students to use published research, quantitative measures, and single-system designs in a simulated practice evaluation project. Method: Practice evaluation projects from a typical class were analyzed for the number of research references…

  11. [Information value of "additional tasks" method to evaluate pilot's work load].

    PubMed

    Gorbunov, V V

    2005-01-01

    "Additional task" method was used to evaluate pilot's work load in prolonged flight. Calculated through durations of latent periods of motor responses, quantitative criterion of work load is more informative for objective evaluation of pilot's involvement in his piloting functions rather than of other registered parameters.

  12. Compromise Programming in forest management

    Treesearch

    Boris A. Poff; Aregai Tecle; Daniel G. Neary; Brian Geils

    2010-01-01

    Multi-objective decision-making (MODM) is an appropriate approach for evaluating a forest management scenario involving multiple interests. Today's land managers must accommodate commercial as well as non-commercial objectives that may be expressed quantitatively and/or qualitatively, and respond to social, political, economic and cultural changes. The spatial and...

  13. Fuzzy object models for newborn brain MR image segmentation

    NASA Astrophysics Data System (ADS)

    Kobashi, Syoji; Udupa, Jayaram K.

    2013-03-01

    Newborn brain MR image segmentation is a challenging problem because of the variety of sizes, shapes and MR signals, although it is a fundamental step for quantitative radiology of brain MR images. Because of the large difference between the adult brain and the newborn brain, it is difficult to directly apply conventional methods to the newborn brain. Inspired by the original fuzzy object model introduced by Udupa et al. at SPIE Medical Imaging 2011, called the fuzzy shape object model (FSOM) here, this paper introduces the fuzzy intensity object model (FIOM) and proposes a new image segmentation method which combines the FSOM and FIOM into fuzzy connected (FC) image segmentation. The fuzzy object models are built from training datasets in which the cerebral parenchyma is delineated by experts. After registering the FSOM with the image under evaluation, the proposed method roughly recognizes the cerebral parenchyma region based on prior knowledge of location, shape, and MR signal given by the registered FSOM and FIOM. Then, FC image segmentation delineates the cerebral parenchyma using the fuzzy object models. The proposed method has been evaluated on 9 newborn brain MR images using the leave-one-out strategy. The revised age was between -1 and 2 months. Quantitative evaluation using false positive volume fraction (FPVF) and false negative volume fraction (FNVF) has been conducted. Using the evaluation data, an FPVF of 0.75% and an FNVF of 3.75% were achieved. More data collection and testing are underway.

  14. Products of combustion of non-metallic materials

    NASA Technical Reports Server (NTRS)

    Perry, Cortes L.

    1995-01-01

    The objective of this project is to evaluate methodologies for the qualitative and quantitative determination of the gaseous products of combustion of non-metallic materials of interest to the aerospace community. The goal is to develop instrumentation and analysis procedures which qualitatively and quantitatively identify gaseous products evolved by thermal decomposition and provide NASA a detailed system operating procedure.

  15. High Resolution Qualitative and Quantitative MR Evaluation of the Glenoid Labrum

    PubMed Central

    Iwasaki, Kenyu; Tafur, Monica; Chang, Eric Y.; Statum, Sheronda; Biswas, Reni; Tran, Betty; Bae, Won C.; Du, Jiang; Bydder, Graeme M.; Chung, Christine B.

    2015-01-01

    Objective To implement qualitative and quantitative MR sequences for the evaluation of labral pathology. Methods Six glenoid labra were dissected and the anterior and posterior portions were divided into normal, mildly degenerated, or severely degenerated groups using gross and MR findings. Qualitative evaluation was performed using T1-weighted, proton density-weighted (PD), spoiled gradient echo (SPGR) and ultra-short echo time (UTE) sequences. Quantitative evaluation included T2 and T1rho measurements as well as T1, T2*, and T1rho measurements acquired with UTE techniques. Results SPGR and UTE sequences best demonstrated labral fiber structure. Degenerated labra had a tendency towards decreased T1 values, increased T2/T2* values and increased T1 rho values. T2* values obtained with the UTE sequence allowed for delineation between normal, mildly degenerated and severely degenerated groups (p<0.001). Conclusion Quantitative T2* measurements acquired with the UTE technique are useful for distinguishing between normal, mildly degenerated and severely degenerated labra. PMID:26359581

  16. Introduction of a method for quantitative evaluation of spontaneous motor activity development with age in infants.

    PubMed

    Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter

    2012-04-01

    Coordination between perception and action is required to interact with the environment successfully. This is already trained by very young infants who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by the infants for this purpose change with age. Therefore, very early progresses in action control made by the infants can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced, which allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The introduced methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy influence development of motor activity significantly. Since the introduced methodology is objective and quantitative, it is suitable to monitor how newborns train their cognitive processes, which will enable them to cope with their environment by motor interaction.

  17. A new approach for the quantitative evaluation of drawings in children with learning disabilities.

    PubMed

    Galli, Manuela; Vimercati, Sara Laura; Stella, Giacomo; Caiazzo, Giorgia; Norveti, Federica; Onnis, Francesca; Rigoldi, Chiara; Albertini, Giorgio

    2011-01-01

    A new method for a quantitative and objective description of drawing and for the quantification of drawing ability in children with learning disabilities (LD) is hereby presented. Twenty-four normally developing children (N) (age 10.6 ± 0.5) and 18 children with learning disabilities (LD) (age 10.3 ± 2.4) took part in the study. The drawing tasks were chosen from among those already used in daily clinical experience (Denver Developmental Screening Test). Some parameters were defined in order to quantitatively describe the features of the children's drawings, introducing new objective measurements besides the subjective standard clinical evaluation. The experimental set-up proved to be valid for clinical application with LD children. The parameters highlighted the presence of differences in the drawing features of N and LD children. This paper suggests the applicability of this protocol to other fields of motor and cognitive evaluation, as well as the possibility of studying upper limb position and muscle activation during drawing. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Evaluation Techniques for the Sandy Point Discovery Center, Great Bay National Estuarine Research Reserve.

    ERIC Educational Resources Information Center

    Heffernan, Bernadette M.

    1998-01-01

    Describes work done to provide staff of the Sandy Point Discovery Center with methods for evaluating exhibits and interpretive programming. Quantitative and qualitative evaluation measures were designed to assess the program's objective of estuary education. Pretest-posttest questionnaires and interviews are used to measure subjects' knowledge and…

  19. Application of quantitative microbial risk assessments for estimation of risk management metrics: Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products as an example.

    PubMed

    Crouch, Edmund A; Labarre, David; Golden, Neal J; Kause, Janell R; Dearfield, Kerry L

    2009-10-01

    The U.S. Department of Agriculture, Food Safety and Inspection Service is exploring quantitative risk assessment methodologies to incorporate the use of the Codex Alimentarius' newly adopted risk management metrics (e.g., food safety objectives and performance objectives). It is suggested that use of these metrics would more closely tie the results of quantitative microbial risk assessments (QMRAs) to public health outcomes. By estimating the food safety objective (the maximum frequency and/or concentration of a hazard in a food at the time of consumption) and the performance objective (the maximum frequency and/or concentration of a hazard in a food at a specified step in the food chain before the time of consumption), risk managers will have a better understanding of the appropriate level of protection (ALOP) from microbial hazards for public health protection. We here demonstrate a general methodology that allows identification of an ALOP and evaluation of corresponding metrics at appropriate points in the food chain. It requires a two-dimensional probabilistic risk assessment, the example used being the Monte Carlo QMRA for Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products, with minor modifications to evaluate and abstract required measures. For demonstration purposes, the QMRA model was applied specifically to hot dogs produced and consumed in the United States. Evaluation of the cumulative uncertainty distribution for illness rate allows a specification of an ALOP that, with defined confidence, corresponds to current industry practices.
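
    As a rough sketch of the last step described above (reading an ALOP off the cumulative uncertainty distribution of the illness rate), assuming a two-dimensional Monte Carlo model that has already collapsed the variability dimension into one mean illness-rate estimate per uncertainty iteration; the confidence level and names are placeholders, not the paper's choices.

    ```python
    import numpy as np

    def alop_from_uncertainty(illness_rate_per_iteration, confidence=0.95):
        """Illness rate not exceeded, with the stated confidence, under the
        modelled baseline (current industry practice); one way to anchor an
        ALOP to the baseline. Input: one mean illness-rate estimate per
        uncertainty iteration of a 2-D probabilistic QMRA."""
        return float(np.quantile(np.asarray(illness_rate_per_iteration, dtype=float),
                                 confidence))
    ```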

  20. Objectivity and reliability in qualitative analysis: realist, contextualist and radical constructionist epistemologies.

    PubMed

    Madill, A; Jordan, A; Shirley, C

    2000-02-01

    The effect of the individual analyst on research findings can create a credibility problem for qualitative approaches from the perspective of evaluative criteria utilized in quantitative psychology. This paper explicates the ways in which objectivity and reliability are understood in qualitative analysis conducted from within three distinct epistemological frameworks: realism, contextual constructionism, and radical constructionism. It is argued that quality criteria utilized in quantitative psychology are appropriate to the evaluation of qualitative analysis only to the extent that it is conducted within a naive or scientific realist framework. The discussion is illustrated with reference to the comparison of two independent grounded theory analyses of identical material. An implication of this illustration is to identify the potential to develop a radical constructionist strand of grounded theory.

  1. Quantification of EEG reactivity in comatose patients

    PubMed Central

    Hermans, Mathilde C.; Westover, M. Brandon; van Putten, Michel J.A.M.; Hirsch, Lawrence J.; Gaspard, Nicolas

    2016-01-01

    Objective EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. Methods In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. Results The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed best accuracy (Median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet’s AC1: 65–70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-experts’ agreement regarding reactivity for each individual case. Conclusion Automated quantitative EEG approaches based on probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Significance Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. PMID:26183757
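
    The best classifier in this study compares the spectral characteristics of the minute before and the minute after stimulation in four bands. The sketch below computes a per-band before/after power comparison of that general kind; the Welch parameters, the log-ratio summary, and all names are assumptions standing in for the paper's 13 parameters and probabilistic combination.

    ```python
    import numpy as np
    from scipy.signal import welch

    BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(segment, fs):
        """Mean Welch power in each classical EEG band for one segment."""
        f, pxx = welch(segment, fs=fs, nperseg=int(4 * fs))
        return {name: pxx[(f >= lo) & (f < hi)].mean()
                for name, (lo, hi) in BANDS.items()}

    def spectral_asymmetry(pre, post, fs):
        """Per-band log-ratio of post- vs pre-stimulation power.

        Values near 0 indicate spectral temporal symmetry (no reactivity);
        larger magnitudes suggest a reactive EEG. This single log-ratio is an
        illustrative stand-in for the paper's combination of 13 parameters
        into a probability of reactivity.
        """
        p_pre, p_post = band_powers(pre, fs), band_powers(post, fs)
        return {b: float(np.log(p_post[b] / p_pre[b])) for b in BANDS}
    ```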

  2. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior.

    PubMed

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-01-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.

  3. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior

    NASA Astrophysics Data System (ADS)

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-11-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
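
    Records 2 and 3 above list the same DPDs paper (PubMed and SPIE/ADS versions). A minimal sketch of the core operation they describe (subtracting consecutive quantitative phase images and expressing the change in picograms) is given below; the conversion uses a typical specific refraction increment of about 0.18 µm³/pg and a simple absolute-sum summary, both of which are assumptions rather than the authors' exact choices.

    ```python
    import numpy as np

    ALPHA_UM3_PER_PG = 0.18  # typical specific refraction increment; an assumed
                             # value, not one quoted in the abstract

    def phase_to_dry_mass_pg(phase_rad, wavelength_um, pixel_area_um2,
                             alpha=ALPHA_UM3_PER_PG):
        """Convert a quantitative phase image (radians) to dry mass per pixel (pg)."""
        opd_um = phase_rad * wavelength_um / (2.0 * np.pi)  # optical path difference
        return opd_um * pixel_area_um2 / alpha

    def dynamic_phase_difference(phase_prev, phase_next, wavelength_um, pixel_area_um2):
        """Dynamic phase difference between consecutive time-lapse frames.

        Returns the per-pixel mass-change map (pg) and the total absolute mass
        redistribution (pg), a scalar that can be plotted against time.
        """
        dm = phase_to_dry_mass_pg(phase_next - phase_prev, wavelength_um, pixel_area_um2)
        return dm, float(np.abs(dm).sum())
    ```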

  4. Evaluation of the Impact of an Additive Manufacturing Enhanced CubeSat Architecture on the CubeSat Development Process

    DTIC Science & Technology

    2016-09-15

    Investigative Questions This research will quantitatively address the impact of proposed benefits of a 3D printed satellite architecture on the...subsystems of a CubeSat. The objective of this research is to bring a quantitative analysis to the discussion of whether a fully 3D printed satellite...manufacturers to quantitatively address what impact the architecture would have on the subsystems of a CubeSat. Summary of Research Gap, Research Questions, and

  5. Miramar College Program Evaluation: Aviation Maintenance.

    ERIC Educational Resources Information Center

    Moriyama, Bruce; Brumley, Leslie

    Qualitative and quantitative data are presented in this evaluation of the curricular, personnel, and financial status of Miramar College's program in aviation maintenance. The report first provides the results of an interview with the program chairperson, which sought information on program objectives and goals and their determination, the extent…

  6. A benchmark for comparison of dental radiography analysis algorithms.

    PubMed

    Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia

    2016-07-01

    Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made on developing computerized dental X-ray image analysis systems for clinical usages. A novel framework for objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray image and two automatic methods for detecting bitewing radiography caries have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  7. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  8. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  9. Miramar College Program Evaluation: Criminal Justice.

    ERIC Educational Resources Information Center

    Moriyama, Bruce; Brumley, Leslie

    Qualitative and quantitative data are presented in this evaluation of the curricular, personnel, and financial status of Miramar College's program in criminal justice. The report first outlines the information gathered in an interview with the program chairperson, conducted to determine program objectives and goals and how they were determined,…

  10. Miramar College Program Evaluation: Fire Science.

    ERIC Educational Resources Information Center

    Moriyama, Bruce; Brumley, Leslie

    Qualitative and quantitative data are presented in this evaluation of the curricular, personnel, and financial status of Miramar College's program in fire sciences. The report first provides the results of an interview with the program chairperson, which sought information on program objectives and goals and their determination, the extent to…

  11. Evaluating hybrid bermudagrass using spectral reflectance under different mowing heights and trinexapac-ethyl applications

    USDA-ARS's Scientific Manuscript database

    Quantitative spectral reflectance data has the potential to improve the evaluation of turfgrass variety trials when management practices are factors in the testing of turf aesthetics and functionality. However, the practical application of this methodology has not been well-developed. The objectives...

  12. Are E-Businesses Trustworthy?

    ERIC Educational Resources Information Center

    Hulsey, John D.

    2010-01-01

    This study uses a quantitative approach to evaluate the trustworthiness of e-businesses as measured by the E-business Trustworthy Index, EBTI, developed as part of this research. The problem is that despite the importance of e-business trustworthiness and the findings from many studies, there are few if any objective measures that evaluate the…

  13. Do Different Training Conditions Facilitate Team Implementation? A Quasi-Experimental Mixed Methods Study

    ERIC Educational Resources Information Center

    Nielsen, Karina; Randall, Raymond; Christensen, Karl B.

    2017-01-01

    A mixed methods approach was applied to examine the effects of a naturally occurring teamwork intervention supported with training. The first objective was to integrate qualitative process evaluation and quantitative effect evaluation to examine "how" and "why" the training influenced intervention outcomes. The intervention (N =…

  14. 78 FR 72119 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-02

    ... objective, independent, third party to evaluate that the questionnaire has a format and scope that minimizes... impact indicators. These indicators are both quantitative and descriptive and may include, for example...

  15. A New Approach for the Quantitative Evaluation of Drawings in Children with Learning Disabilities

    ERIC Educational Resources Information Center

    Galli, Manuela; Vimercati, Sara Laura; Stella, Giacomo; Caiazzo, Giorgia; Norveti, Federica; Onnis, Francesca; Rigoldi, Chiara; Albertini, Giorgio

    2011-01-01

    A new method for a quantitative and objective description of drawing and for the quantification of drawing ability in children with learning disabilities (LD) is hereby presented. Twenty-four normally developing children (N) (age 10.6 [plus or minus] 0.5) and 18 children with learning disabilities (LD) (age 10.3 [plus or minus] 2.4) took part to…

  16. An Objective Measure of Interconnection Usage for High Levels of Wind Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yasuda, Yoh; Gomez-Lazaro, Emilio; Holttinen, Hannele

    2014-11-13

    This paper analyzes selected interconnectors in Europe using several evaluation factors: capacity factor, congested time, and congestion ratio. In a quantitative and objective evaluation, the authors propose to use publicly available data on maximum net transmission capacity (NTC) levels during a single year to study congestion rates, realizing that the capacity factor depends upon the chosen capacity of the selected interconnector. This value will be referred to as 'the annual maximum transmission capacity (AMTC)', which gives a transparent and objective evaluation of interconnector usage based on the published grid data. While the method is general, its initial application is motivated by the transfer of renewable energy.
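
    To make the evaluation factors named above concrete, here is a minimal sketch that derives a capacity factor, congested time and congestion ratio from an hourly flow series. The exact definitions used by the authors are not reproduced in the abstract, so the formulas below (AMTC taken as the annual maximum of the absolute flow, congestion counted as hours at or above a chosen fraction of AMTC) are stated assumptions, and the flow data are synthetic.

    ```python
    import numpy as np

    def interconnector_metrics(hourly_flow_mw, congestion_fraction=0.99):
        """Illustrative interconnector usage metrics from an hourly flow series.

        AMTC is taken here as the annual maximum of the absolute transmitted
        power, roughly in the spirit of the 'annual maximum transmission
        capacity' described above (assumed definition, not the authors' exact one).
        """
        flow = np.abs(np.asarray(hourly_flow_mw, dtype=float))
        amtc = flow.max()
        capacity_factor = flow.mean() / amtc
        congested_hours = int(np.sum(flow >= congestion_fraction * amtc))
        congestion_ratio = congested_hours / flow.size
        return {"AMTC_MW": amtc,
                "capacity_factor": capacity_factor,
                "congested_hours": congested_hours,
                "congestion_ratio": congestion_ratio}

    # Synthetic year of hourly flows on a nominal 1000 MW link
    rng = np.random.default_rng(0)
    flows = np.clip(rng.normal(450, 250, 8760), 0, 1000)
    print(interconnector_metrics(flows))
    ```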

  17. Nuclear medicine and quantitative imaging research (instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1990-09-01

    This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418 entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P. I. and M. Cooper, Co-P.I. during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.

  18. Classroom versus Computer-Based CPR Training: A Comparison of the Effectiveness of Two Instructional Methods

    ERIC Educational Resources Information Center

    Rehberg, Robb S.; Gazzillo Diaz, Linda; Middlemas, David A.

    2009-01-01

    Objective: The objective of this study was to determine whether computer-based CPR training is comparable to traditional classroom training. Design and Setting: This study was quantitative in design. Data was gathered from a standardized examination and skill performance evaluation which yielded numerical scores. Subjects: The subjects were 64…

  19. Designing automation for human use: empirical studies and quantitative models.

    PubMed

    Parasuraman, R

    2000-07-01

    An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.

  20. Three-Dimensional Registration for Handheld Profiling Systems Based on Multiple Shot Structured Light

    PubMed Central

    Ayaz, Shirazi Muhammad; Kim, Min Young

    2018-01-01

    In this article, a multi-view registration approach for the 3D handheld profiling system based on the multiple shot structured light technique is proposed. The multi-view registration approach is categorized into coarse registration and point cloud refinement using the iterative closest point (ICP) algorithm. Coarse registration of multiple point clouds was performed using relative orientation and translation parameters estimated via homography-based visual navigation. The proposed system was evaluated using an artificial human skull and a paper box object. For the quantitative evaluation of the accuracy of a single 3D scan, a paper box was reconstructed, and the mean errors in its height and breadth were found to be 9.4 μm and 23 μm, respectively. A comprehensive quantitative evaluation and comparison of the proposed algorithm was performed with other variants of ICP. The root mean square error for the ICP algorithm to register a pair of point clouds of the skull object was also found to be less than 1 mm. PMID:29642552
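
    The abstract reports a root mean square registration error; a minimal way to compute such a score for two registered point clouds is a nearest-neighbour RMSE via a k-d tree, sketched below. This is an illustrative metric under stated assumptions, not the authors' exact evaluation pipeline, and the point clouds are random stand-ins.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def registration_rmse(source_pts, target_pts):
        """Root mean square distance from each source point to its nearest
        neighbour in the target cloud -- a common way to score how well two
        registered scans overlap (illustrative, not the authors' exact metric)."""
        tree = cKDTree(np.asarray(target_pts, dtype=float))
        dists, _ = tree.query(np.asarray(source_pts, dtype=float), k=1)
        return float(np.sqrt(np.mean(dists ** 2)))

    # Synthetic example: a cloud and a slightly perturbed, translated copy
    rng = np.random.default_rng(1)
    target = rng.uniform(-50, 50, size=(5000, 3))          # mm
    source = target + rng.normal(0.0, 0.3, size=target.shape) + [0.1, 0.0, 0.0]
    print(f"RMSE = {registration_rmse(source, target):.3f} mm")
    ```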

  1. Evaluating the role of landscape in the spread of invasive species: the case of the biomass crop

    USDA-ARS?s Scientific Manuscript database

    As the development and cultivation of new bioeconomy crops and in particular biofuel feedstocks expands there is a pressing need for objective and quantitative methods to evaluate risks and benefits of their production. In particular, the traits being selected for in biofuel crops are highly aligned...

  2. DEVELOPMENT OF CRITERIA AND METHODS FOR EVALUATING TRAINER AIRCRAFT EFFECTIVENESS.

    ERIC Educational Resources Information Center

    KUSEWITT, J.B.

    THE PURPOSE OF THIS STUDY WAS TO DEVELOP A METHOD FOR DETERMINING OBJECTIVE MEASURES OF TRAINER AIRCRAFT EFFECTIVENESS TO EVALUATE PROGRAM ALTERNATIVES FOR TRAINING PILOTS FOR FLEET FIGHTER AND ATTACK-TYPE AIRCRAFT. THE TRAINING SYLLABUS WAS BASED ON AVERAGE STUDENT ABILITY. THE BASIC PROBLEM WAS TO ESTABLISH QUANTITATIVE TIME-DIFFICULTY…

  3. The Integration of Evaluation Paradigms Through Metaphor.

    ERIC Educational Resources Information Center

    Felker, Roberta M.

    The point of view is presented that evaluation projects can be enriched by not using either an exclusively quantitative model or an exclusively qualitative model but by combining both models in one project. The concept of metaphor is used to clarify the usefulness of the combination. Iconic or holistic metaphors describe an object or event as…

  4. Process Evaluation of a Parenting Program for Low-Income Families in South Africa

    ERIC Educational Resources Information Center

    Lachman, Jamie M.; Kelly, Jane; Cluver, Lucie; Ward, Catherine L.; Hutchings, Judy; Gardner, Frances

    2018-01-01

    Objective: This mixed-methods process evaluation examined the feasibility of a parenting program delivered by community facilitators to reduce the risk of child maltreatment in low-income families with children aged 3-8 years in Cape Town, South Africa (N = 68). Method: Quantitative measures included attendance registers, fidelity checklists,…

  5. Habitat Features Affecting Smallmouth Bass Micropterus dolomieu Nesting Success in Four Northern Wisconsin Lakes

    Treesearch

    Rory Saunders; Michael A. Bozek; Clayton J. Edwards; Martin J. Jennings; Steven P. Newman

    2002-01-01

    Evaluating spawning success in relation to habitat characteristics of nest sites provides critical information necessary to assess the effects that riparian and littoral zone habitat alterations have on smallmouth bass Micropterus dolomieu survival and recruitment. The objective of this study was to quantitatively evaluate smallmouth bass nest site...

  6. Objective assessment of skin tightening in Asians using a water-filtered near-infrared (1,000–1,800 nm) device with contact-cooling and freezer-stored gel

    PubMed Central

    Tanaka, Yohei; Tsunemi, Yuichiro; Kawashima, Makoto; Tatewaki, Naoto; Nishida, Hiroshi

    2013-01-01

    Background Near-infrared has been shown to penetrate deeper than optical light sources independent of skin color, allowing safer treatment for the Asian skin type. Many studies have indicated the efficacy of various types of devices, but have not included a sufficiently objective evaluation. In this study, we used three-dimensional imaging for objective evaluation of facial skin tightening using a water-filtered near-infrared device. Methods Twenty Japanese patients were treated with the water-filtered near-infrared (1,000–1,800 nm) device using a contact-cooling and nonfreezing gel stored in a freezer. Three-dimensional imaging was performed, and quantitative volume measurements were taken to evaluate the change in post-treatment volume. The patients then provided their subjective assessments. Results Objective assessments of the treated cheek volume evaluated by a three-dimensional color schematic representation with quantitative volume measurements showed significant improvement 3 months after treatment. The mean volume reduction at the last post-treatment visit was 2.554 ± 0.999 mL. The post-treatment volume was significantly reduced compared with the pretreatment volume in all patients (P < 0.0001). Eighty-five percent of patients reported satisfaction with the improvement of skin laxity, and 80% of patients reported satisfaction with improvement of rhytids, such as the nasolabial folds. Side effects, such as epidermal burns and scar formation, were not observed throughout the study. Conclusion The advantages of this water-filtered near-infrared treatment are its high efficacy for skin tightening, associated with a minimal level of discomfort and minimal side effects. Together, these characteristics facilitate our ability to administer repeated treatments and provide alternative or adjunctive treatment for patients, with improved results. This study provides a qualitative and quantitative volumetric assessment, establishing the ability of this technology to reduce volume through noninvasive skin tightening. PMID:23837000

  7. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges.

    PubMed

    Ernst, Marielle; Kriston, Levente; Romero, Javier M; Frölich, Andreas M; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time.

  8. Content-Related Issues Pertaining to Teaching Statistics: Making Decisions about Educational Objectives in Statistics Courses.

    ERIC Educational Resources Information Center

    Bliss, Leonard B.; Tashakkori, Abbas

    This paper discusses the objectives that would be appropriate for statistics classes for students who are not majoring in statistics, evaluation, or quantitative research design. These "non-majors" should be able to choose appropriate analytical methods for specific sets of data based on the research question and the nature of the data, and they…

  9. Development and Implementation of a Learning Object Repository for French Teaching and Learning: Issues and Promises

    ERIC Educational Resources Information Center

    Caws, Catherine

    2008-01-01

    This paper discusses issues surrounding the development of a learning object repository (FLORE) for teaching and learning French at the postsecondary level. An evaluation based on qualitative and quantitative data was set up in order to better assess how second-language (L2) students in French perceived the integration of this new repository into…

  10. QUANTITATIVE EVALUATION OF THE HYPOTHESIS THAT BL LACERTAE OBJECTS ARE QSO REMNANTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borra, E. F.

    2014-11-20

    We evaluate with numerical simulations the hypothesis that BL Lacertae objects (BLLs) are the remnants of quasi-stellar objects. This hypothesis is based on their highly peculiar redshift evolution. They have a comoving space density that increases with decreasing redshift, contrary to all other active galactic nuclei. We assume that relativistic jets are below detection in young radio-quiet quasars and increase in strength with cosmic time so that they eventually are detected as BLLs. Our numerical simulations fit the observed redshift distributions of BLLs very well. There are strong indications that only the high-synchrotron-peaked BLLs could be QSO remnants.

  11. 3D methodology for evaluating rail crossing roughness.

    DOT National Transportation Integrated Search

    2015-03-02

    Description of Research Project: The overall objective of this project is to investigate and develop a quantitative method or measure for determining the need to rehabilitate rail crossings. The scope of the project includes investigation of sensor capabi...

  12. Towards standardized assessment of endoscope optical performance: geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Desai, Viraj N.; Ngo, Ying Z.; Cheng, Wei-Chung; Pfefer, Joshua

    2013-12-01

    Technological advances in endoscopes, such as capsule, ultrathin and disposable devices, promise significant improvements in safety, clinical effectiveness and patient acceptance. Unfortunately, the industry lacks test methods for preclinical evaluation of key optical performance characteristics (OPCs) of endoscopic devices that are quantitative, objective and well-validated. As a result, it is difficult for researchers and developers to compare image quality and evaluate equivalence to, or improvement upon, prior technologies. While endoscope OPCs include resolution, field of view, and depth of field, among others, our focus in this paper is geometric image distortion. We reviewed specific test methods for distortion and then developed an objective, quantitative test method based on well-defined experimental and data processing steps to evaluate radial distortion in the full field of view of an endoscopic imaging system. Our measurements and analyses showed that a second-degree polynomial equation could well describe the radial distortion curve of a traditional endoscope. The distortion evaluation method was effective for correcting the image and can be used to explain other widely accepted evaluation methods such as picture height distortion. Development of consensus standards based on promising test methods for image quality assessment, such as the method studied here, will facilitate clinical implementation of innovative endoscopic devices.
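
    The finding that a second-degree polynomial describes the radial distortion curve can be illustrated with a short fit: given radial positions on a grid target and the locally measured distortion, numpy.polyfit returns the three coefficients. The data points below are synthetic stand-ins, not measurements from the paper.

    ```python
    import numpy as np

    # Radial positions (normalized field height) and measured local distortion (%)
    # from a grid-target image -- values below are synthetic stand-ins.
    r = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
    distortion_pct = np.array([0.0, -1.1, -4.3, -9.8, -17.2, -26.5])

    # Second-degree polynomial fit, as suggested for traditional endoscopes
    coeffs = np.polyfit(r, distortion_pct, deg=2)
    fit = np.poly1d(coeffs)

    print("fitted coefficients (a2, a1, a0):", np.round(coeffs, 3))
    print("predicted distortion at r = 0.5:", round(float(fit(0.5)), 2), "%")
    ```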

  13. Validation of virtual learning object to support the teaching of nursing care systematization.

    PubMed

    Salvador, Pétala Tuani Candido de Oliveira; Mariz, Camila Maria Dos Santos; Vítor, Allyne Fortes; Ferreira Júnior, Marcos Antônio; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2018-01-01

    Objective: to describe the content validation process of a Virtual Learning Object to support the teaching of nursing care systematization to nursing professionals. Method: methodological study, with a quantitative approach, developed according to the methodological reference of Pasquali's psychometry and conducted from March to July 2016, using a two-stage Delphi procedure. Results: in the Delphi 1 stage, eight judges evaluated the Virtual Object; in the Delphi 2 stage, seven judges evaluated it. The seven screens of the Virtual Object were analyzed as to the suitability of their contents. The Virtual Learning Object to support the teaching of nursing care systematization was considered valid in its content, with a Total Content Validity Coefficient of 0.96. Conclusion: it is expected that the Virtual Object can support the teaching of nursing care systematization in light of appropriate and effective pedagogical approaches.
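
    The abstract reports a Total Content Validity Coefficient of 0.96 without giving the formula. As a rough illustration of how judge ratings are typically condensed into such an index, the sketch below computes a generic content-validity summary (proportion of judges rating each screen as adequate, averaged over screens); this is an assumed, simplified index, not necessarily the coefficient defined in Pasquali's framework, and the ratings are hypothetical.

    ```python
    import numpy as np

    def content_validity(ratings, adequate_threshold=3):
        """Simple content-validity summary from a judges x items rating matrix.

        Each rating is on a 1-4 scale; an item counts as endorsed by a judge when
        the rating is >= adequate_threshold. Returns per-item agreement and the
        overall mean (an assumed, generic index -- not necessarily the exact
        coefficient used in the study).
        """
        r = np.asarray(ratings)
        item_agreement = (r >= adequate_threshold).mean(axis=0)
        return item_agreement, float(item_agreement.mean())

    # Hypothetical ratings: 7 judges x 7 screens of the virtual learning object
    ratings = [[4, 4, 3, 4, 4, 4, 4],
               [4, 3, 4, 4, 4, 4, 3],
               [4, 4, 4, 4, 3, 4, 4],
               [3, 4, 4, 4, 4, 4, 4],
               [4, 4, 4, 3, 4, 4, 4],
               [4, 4, 4, 4, 4, 3, 4],
               [4, 4, 4, 4, 4, 4, 4]]
    per_item, overall = content_validity(ratings)
    print(per_item, round(overall, 2))
    ```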

  14. Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders

    PubMed Central

    Song, Yu; Kwak, Shin; Yoshida, Sohei; Yamamoto, Yoshiharu

    2014-01-01

    Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as an actigraph enable long-term recording of a patient's movement during activities and they can be used for quantitative assessment of symptoms due to various diseases. We reviewed some applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders such as motor and nonmotor disorders like Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) for vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods to assess more neurological or psychopathic disorders using actigraphy records. PMID:25214709

  15. Investigation of the feasibility of non-invasive optical sensors for the quantitative assessment of dehydration.

    PubMed

    Visser, Cobus; Kieser, Eduard; Dellimore, Kiran; van den Heever, Dawie; Smith, Johan

    2017-10-01

    This study explores the feasibility of prospectively assessing infant dehydration using four non-invasive, optical sensors based on the quantitative and objective measurement of various clinical markers of dehydration. The sensors were investigated to objectively and unobtrusively assess the hydration state of an infant based on the quantification of capillary refill time (CRT), skin recoil time (SRT), skin temperature profile (STP) and skin tissue hydration by means of infrared spectrometry (ISP). To evaluate the performance of the sensors a clinical study was conducted on a cohort of 10 infants (aged 6-36 months) with acute gastroenteritis. High sensitivity and specificity were exhibited by the sensors, in particular the STP and SRT sensors, when combined into a fusion regression model (sensitivity: 0.90, specificity: 0.78). The SRT and STP sensors and the fusion model all outperformed the commonly used "gold standard" clinical dehydration scales including the Gorelick scale (sensitivity: 0.56, specificity: 0.56), CDS scale (sensitivity: 1.0, specificity: 0.2) and WHO scale (sensitivity: 0.13, specificity: 0.79). These results suggest that objective and quantitative assessment of infant dehydration may be possible using the sensors investigated. However, further evaluation of the sensors on a larger sample population is needed before deploying them in a clinical setting. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
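
    For reference, the sensitivity and specificity figures quoted above come from a standard confusion-matrix calculation against the clinical reference label. The sketch below shows that calculation on invented labels chosen only so that the printed values resemble those reported for the fusion model.

    ```python
    import numpy as np

    def sensitivity_specificity(y_true, y_pred):
        """Sensitivity and specificity for binary labels (1 = dehydrated)."""
        y_true = np.asarray(y_true, dtype=bool)
        y_pred = np.asarray(y_pred, dtype=bool)
        tp = np.sum(y_pred & y_true)
        tn = np.sum(~y_pred & ~y_true)
        fn = np.sum(~y_pred & y_true)
        fp = np.sum(y_pred & ~y_true)
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical reference labels and fused-sensor predictions
    reference  = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
    prediction = [1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0]
    sens, spec = sensitivity_specificity(reference, prediction)
    print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")  # 0.90, 0.78
    ```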

  16. Evaluation of the TBET model for potential improvement of southern P indices

    USDA-ARS?s Scientific Manuscript database

    Due to a shortage of available phosphorus (P) loss data sets, simulated data from a quantitative P transport model could be used to evaluate a P-index. However, the model would need to accurately predict the P loss data sets that are available. The objective of this study was to compare predictions ...

  17. Quantitative relationship between crash risks and pavement skid resistance.

    DOT National Transportation Integrated Search

    2014-05-01

    Faced with continuously increasing maintenance due to aging infrastructure, the Texas Department of : Transportation (TxDOT) is evaluating the potential impact of reduced funding on highway safety. The main : objective of this report is to develop a ...

  18. A novel, objective, quantitative method of evaluation of the back pain component using comparative computerized multi-parametric tactile mapping before/after spinal cord stimulation and database analysis: the "Neuro-Pain't" software.

    PubMed

    Rigoard, P; Nivole, K; Blouin, P; Monlezun, O; Roulaud, M; Lorgeoux, B; Bataille, B; Guetarni, F

    2015-03-01

    One of the major challenges of neurostimulation is actually to address the back pain component in patients suffering from refractory chronic back and leg pain. Facing a tremendous expansion of neurostimulation techniques and available devices, implanters and patients can still remain confused as they need to select the right tool for the right indication. To be able to evaluate and compare objectively patient outcomes, depending on therapeutical strategies, it appears essential to develop a rational and quantitative approach to pain assessment for those who undergo neurostimulation implantation. We developed a touch screen interface, in Poitiers University Hospital and N(3)Lab, called the "Neuro-Pain'T", to detect, record and quantify the painful area surface and intensity changes in an implanted patient within time. The second aim of this software is to analyse the link between a paraesthesia coverage generated by a type of neurostimulation and a potential analgesic effect, measured by pain surface reduction, pain intensity reduction within the painful surface and local change in pain characteristics distribution. The third aim of Neuro-Pain'T is to correlate these clinical parameters to global patient data and functional outcome analysis, via a network database (Neuro-Database), to be able to provide a concise but objective approach of the neurostimulation efficacy, summarized by an index called "RFG Index". This software has been used in more than 190 patients since 2012, leading us to define three clinical parameters grouped as a clinical component of the RFG Index, which might be helpful to assess neurostimulation efficacy and compare implanted devices. The Neuro-Pain'T is an original software designed to objectively and quantitatively characterize reduction of a painful area in a given individual, in terms of intensity, surface and pain typology, in response to a treatment strategy or implantation of an analgesic device. Because pain is a physical sensation, which integrates a psychological dimension, its assessment justifies the use of multidimensional and global evaluation scales. However, in the context of neurostimulation and comparative clinical trials designed to test the technical efficacy of a given device, a simple, objective and quantitative evaluation tool could help to guide tomorrow's treatment options by transforming personal convictions into a more robust scientific rationale based on data collection and data mining techniques. Copyright © 2014. Published by Elsevier Masson SAS.
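
    The Neuro-Pain't software itself is not described in enough detail to reproduce here, but the two quantities it tracks, painful surface and pain intensity within that surface, can be illustrated with a short sketch operating on pre- and post-stimulation pain maps stored as 2D arrays. The threshold, the maps and the percentage summaries are assumptions for illustration; this is not the RFG Index.

    ```python
    import numpy as np

    def pain_map_change(pre_map, post_map, pain_threshold=1):
        """Percent change in painful surface and in mean intensity inside it.

        pre_map / post_map: 2D arrays of patient-reported pain intensity (0-10)
        painted on a body-chart grid. Illustrative only -- not the RFG Index.
        """
        pre, post = np.asarray(pre_map, float), np.asarray(post_map, float)
        pre_area = np.sum(pre >= pain_threshold)
        post_area = np.sum(post >= pain_threshold)
        surface_reduction = 100.0 * (pre_area - post_area) / pre_area
        pre_intensity = pre[pre >= pain_threshold].mean()
        post_intensity = post[post >= pain_threshold].mean() if post_area else 0.0
        intensity_reduction = 100.0 * (pre_intensity - post_intensity) / pre_intensity
        return surface_reduction, intensity_reduction

    # Toy 100x40 body-chart maps before and after stimulation
    rng = np.random.default_rng(2)
    pre = np.zeros((100, 40)); pre[30:70, 10:30] = rng.integers(4, 9, (40, 20))
    post = np.where(pre > 0, np.maximum(pre - 4, 0), 0)  # smaller, milder area
    print(pain_map_change(pre, post))
    ```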

  19. An experimental apparatus to simulate body-powered prosthetic usage: Development and preliminary evaluation.

    PubMed

    Gao, Fan; Rodriguez, Johanan; Kapp, Susan

    2016-06-01

    Harness fitting in the body-powered prosthesis remains more art than science due to a lack of consistent and quantitative evaluation. The aim of this study was to develop a mechanical, human-body-shaped apparatus to simulate body-powered upper limb prosthetic usage and evaluate its capability of quantitative examination of harness configuration. The apparatus was built upon a torso of a wooden mannequin and integrated major mechanical joints to simulate terminal device operation. Sensors were used to register cable tension, cable excursion, and grip force simultaneously. The apparatus allowed the scapula to move up to 127 mm laterally and the load cell can measure the cable tension up to 445 N. Our preliminary evaluation highlighted the needs and importance of investigating harness configurations in a systematic and controllable manner. The apparatus allows objective, systematic, and quantitative evaluation of effects of realistic harness configurations and will provide insightful and working knowledge on harness fitting in upper limb amputees using body-powered prosthesis. © The International Society for Prosthetics and Orthotics 2015.

  20. Lessons from mouse chimaera experiments with a reiterated transgene marker: revised marker criteria and a review of chimaera markers.

    PubMed

    Keighren, Margaret A; Flockhart, Jean; Hodson, Benjamin A; Shen, Guan-Yi; Birtley, James R; Notarnicola-Harwood, Antonio; West, John D

    2015-08-01

    Recent reports of a new generation of ubiquitous transgenic chimaera markers prompted us to consider the criteria used to evaluate new chimaera markers and develop more objective assessment methods. To investigate this experimentally we used several series of fetal and adult chimaeras, carrying an older, multi-copy transgenic marker. We used two additional independent markers and objective, quantitative criteria for cell selection and cell mixing to investigate quantitative and spatial aspects of developmental neutrality. We also suggest how the quantitative analysis we used could be simplified for future use with other markers. As a result, we recommend a five-step procedure for investigators to evaluate new chimaera markers based partly on criteria proposed previously but with a greater emphasis on examining the developmental neutrality of prospective new markers. These five steps comprise (1) review of published information, (2) evaluation of marker detection, (3) genetic crosses to check for effects on viability and growth, (4) comparisons of chimaeras with and without the marker and (5) analysis of chimaeras with both cell populations labelled. Finally, we review a number of different chimaera markers and evaluate them using the extended set of criteria. These comparisons indicate that, although the new generation of ubiquitous fluorescent markers are the best of those currently available and fulfil most of the criteria required of a chimaera marker, further work is required to determine whether they are developmentally neutral.

  1. Methodology for determining the investment attractiveness of construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana

    2018-03-01

    The article presents an analysis of existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the initial choice of sites and territories that are most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment to be made for a particular project when evaluating the efficiency of investment projects. The study aims to develop basic methodological concepts for a comparative evaluation of the prospects of high-rise construction projects, concepts that take the specific features of construction investment into consideration and enable quantitative evaluation of investment effectiveness in high-rise construction.

  2. Quantitative Image Informatics for Cancer Research (QIICR) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Imaging has enormous untapped potential to improve cancer research through software to extract and process morphometric and functional biomarkers. In the era of non-cytotoxic treatment agents, multi-modality image-guided ablative therapies and rapidly evolving computational resources, quantitative imaging software can be transformative in enabling minimally invasive, objective and reproducible evaluation of cancer treatment response. Post-processing algorithms are integral to high-throughput analysis and fine-grained differentiation of multiple molecular targets.

  3. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time. PMID:26848840

  4. Non-animal approaches for toxicokinetics in risk evaluations of food chemicals.

    PubMed

    Punt, Ans; Peijnenburg, Ad A C M; Hoogenboom, Ron L A P; Bouwmeester, Hans

    2017-01-01

    The objective of the present work was to review the availability and predictive value of non-animal toxicokinetic approaches and to evaluate their current use in European risk evaluations of food contaminants, additives and food contact materials, as well as pesticides and medicines. Results revealed little use of quantitative animal or human kinetic data in risk evaluations of food chemicals, compared with pesticides and medicines. Risk evaluations of medicines provided sufficient in vivo kinetic data from different species to evaluate the predictive value of animal kinetic data for humans. These data showed a relatively poor correlation between the in vivo bioavailability in rats and dogs versus that in humans. In contrast, in vitro (human) kinetic data have been demonstrated to provide adequate predictions of the fate of compounds in humans, using appropriate in vitro-in vivo scalers and by integration of in vitro kinetic data with in silico kinetic modelling. Even though in vitro kinetic data were found to be occasionally included within risk evaluations of food chemicals, particularly results from Caco-2 absorption experiments and in vitro data on gut-microbial conversions, only minor use of in vitro methods for metabolism and quantitative in vitro-in vivo extrapolation methods was identified. Yet, such quantitative predictions are essential in the development of alternatives to animal testing as well as to increase human relevance of toxicological risk evaluations. Future research should aim at further improving and validating quantitative alternative methods for kinetics, thereby increasing regulatory acceptance of non-animal kinetic data.

  5. A novel quantified bitterness evaluation model for traditional Chinese herbs based on an animal ethology principle.

    PubMed

    Han, Xue; Jiang, Hong; Han, Li; Xiong, Xi; He, Yanan; Fu, Chaomei; Xu, Runchun; Zhang, Dingkun; Lin, Junzhi; Yang, Ming

    2018-03-01

    Traditional Chinese herbs (TCH) are currently gaining attention in disease prevention and health care plans. However, their general bitter taste hinders their use. Despite the development of a variety of taste evaluation methods, it is still a major challenge to establish a quantitative detection technique that is objective, authentic and sensitive. Based on the two-bottle preference test (TBP), we proposed a novel quantitative strategy using a standardized animal test and a unified quantitative benchmark. To reduce the difference of results, the methodology of TBP was optimized. The relationship between the concentration of quinine and animal preference index (PI) was obtained. Then the PI of TCH was measured through TBP, and bitterness results were converted into a unified numerical system using the relationship of concentration and PI. To verify the authenticity and sensitivity of quantified results, human sensory testing and electronic tongue testing were applied. The quantified results showed a good discrimination ability. For example, the bitterness of Coptidis Rhizoma was equal to 0.0579 mg/mL quinine, and Nelumbinis Folium was equal to 0.0001 mg/mL. The validation results proved that the new assessment method for TCH was objective and reliable. In conclusion, this study provides an option for the quantification of bitterness and the evaluation of taste masking effects.
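
    The conversion step described above, from a measured preference index (PI) to a quinine-equivalent concentration, can be sketched as interpolation on a calibration curve. The calibration points below are invented and the study fitted its own concentration-PI relationship, so this is only a structural illustration.

    ```python
    import numpy as np

    # Hypothetical calibration: quinine concentration (mg/mL) vs. preference
    # index (PI) from the two-bottle preference test. Higher bitterness -> lower PI.
    quinine_mg_ml = np.array([0.0001, 0.001, 0.01, 0.05, 0.1])
    calibration_pi = np.array([0.48, 0.40, 0.28, 0.15, 0.08])

    def quinine_equivalent(measured_pi):
        """Convert a herb extract's PI into a quinine-equivalent concentration.

        np.interp needs increasing x, so interpolate on the PI axis (ascending)
        against log-concentration. Purely illustrative numbers.
        """
        order = np.argsort(calibration_pi)
        log_c = np.interp(measured_pi,
                          calibration_pi[order],
                          np.log10(quinine_mg_ml)[order])
        return 10 ** log_c

    print(f"{quinine_equivalent(0.20):.4f} mg/mL quinine-equivalent bitterness")
    ```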

  6. Automated characterization of normal and pathologic lung tissue by topological texture analysis of multidetector CT

    NASA Astrophysics Data System (ADS)

    Boehm, H. F.; Fink, C.; Becker, C.; Reiser, M.

    2007-03-01

    Reliable and accurate methods for objective quantitative assessment of parenchymal alterations in the lung are necessary for diagnosis, treatment and follow-up of pulmonary diseases. Two major types of alterations are pulmonary emphysema and fibrosis, emphysema being characterized by abnormal enlargement of the air spaces distal to the terminal, nonrespiratory bronchiole, accompanied by destructive changes of the alveolar walls. The main characteristic of fibrosis is coarsening of the interstitial fibers and compaction of the pulmonary tissue. With the ability to display anatomy free from superimposing structures and greater visual clarity, Multi-Detector-CT has been shown to be more sensitive than the chest radiograph in identifying alterations of lung parenchyma. In automated evaluation of pulmonary CT-scans, quantitative image processing techniques are applied for objective evaluation of the data. A number of methods have been proposed in the past, most of which utilize simple densitometric tissue features based on the mean X-ray attenuation coefficients expressed in terms of Hounsfield Units [HU]. Due to partial volume effects, most of the density-based methodologies tend to fail, particularly in cases where emphysema and fibrosis occur within narrow spatial limits. In this study, we propose a methodology based upon the topological assessment of graylevel distribution in the 3D image data of lung tissue which provides a way of improving quantitative CT evaluation. Results are compared to the more established density-based methods.

  7. Automatized image processing of bovine blastocysts produced in vitro for quantitative variable determination

    NASA Astrophysics Data System (ADS)

    Rocha, José Celso; Passalia, Felipe José; Matos, Felipe Delestro; Takahashi, Maria Beatriz; Maserati, Marc Peter, Jr.; Alves, Mayra Fernanda; de Almeida, Tamie Guibu; Cardoso, Bruna Lopes; Basso, Andrea Cristina; Nogueira, Marcelo Fábio Gouveia

    2017-12-01

    There is currently no objective, real-time and non-invasive method for evaluating the quality of mammalian embryos. In this study, we processed images of in vitro produced bovine blastocysts to obtain a deeper comprehension of the embryonic morphological aspects that are related to the standard evaluation of blastocysts. Information was extracted from 482 digital images of blastocysts. The resulting imaging data were individually evaluated by three experienced embryologists who graded their quality. To avoid evaluation bias, each image was related to the modal value of the evaluations. Automated image processing produced 36 quantitative variables for each image. The images, the modal and individual quality grades, and the variables extracted could potentially be used in the development of artificial intelligence techniques (e.g., evolutionary algorithms and artificial neural networks), multivariate modelling and the study of defined structures of the whole blastocyst.

  8. Histopathological image analysis of chemical-induced hepatocellular hypertrophy in mice.

    PubMed

    Asaoka, Yoshiji; Togashi, Yuko; Mutsuga, Mayu; Imura, Naoko; Miyoshi, Tomoya; Miyamoto, Yohei

    2016-04-01

    Chemical-induced hepatocellular hypertrophy is frequently observed in rodents, and is mostly caused by the induction of phase I and phase II drug metabolic enzymes and peroxisomal lipid metabolic enzymes. Liver weight is a sensitive and commonly used marker for detecting hepatocellular hypertrophy, but is also increased by a number of other factors. Histopathological observations subjectively detect changes such as hepatocellular hypertrophy based on the size of a hepatocyte. Therefore, quantitative microscopic observations are required to evaluate histopathological alterations objectively. In the present study, we developed a novel quantitative method for an image analysis of hepatocellular hypertrophy using liver sections stained with hematoxylin and eosin, and demonstrated its usefulness for evaluating hepatocellular hypertrophy induced by phenobarbital (a phase I and phase II enzyme inducer) and clofibrate (a peroxisomal enzyme inducer) in mice. The algorithm of this imaging analysis was designed to recognize an individual hepatocyte through a combination of pixel-based and object-based analyses. Hepatocellular nuclei and the surrounding non-hepatocellular cells were recognized by the pixel-based analysis, while the areas of the recognized hepatocellular nuclei were then expanded until they ran against their expanding neighboring hepatocytes and surrounding non-hepatocellular cells by the object-based analysis. The expanded area of each hepatocellular nucleus was regarded as the size of an individual hepatocyte. The results of this imaging analysis showed that changes in the sizes of hepatocytes corresponded with histopathological observations in phenobarbital and clofibrate-treated mice, and revealed a correlation between hepatocyte size and liver weight. In conclusion, our novel image analysis method is very useful for quantitative evaluations of chemical-induced hepatocellular hypertrophy. Copyright © 2015 Elsevier GmbH. All rights reserved.
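
    A rough analogue of the nucleus-seeded expansion described above can be sketched with scikit-image: threshold and label the nuclei, then grow each label outward with a marker-based watershed constrained to the tissue mask, and read off per-cell areas. This is a simplified stand-in for the authors' validated algorithm; the thresholding choice, the synthetic image and the helper's name are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import threshold_otsu
    from skimage.measure import label, regionprops
    from skimage.segmentation import watershed

    def hepatocyte_areas(nucleus_channel, tissue_mask):
        """Approximate per-hepatocyte areas by expanding each nucleus label
        until it meets its neighbours, within the tissue mask (illustrative)."""
        nuclei = nucleus_channel > threshold_otsu(nucleus_channel)
        markers = label(nuclei)                      # one seed per nucleus
        # Grow seeds outward: watershed on the distance to the nearest nucleus
        distance = ndi.distance_transform_edt(~nuclei)
        cells = watershed(distance, markers, mask=tissue_mask)
        return [r.area for r in regionprops(cells)]

    # Synthetic toy image: bright blobs standing in for stained nuclei
    img = np.zeros((200, 200))
    for y, x in [(50, 50), (60, 140), (150, 80), (140, 160)]:
        yy, xx = np.ogrid[:200, :200]
        img += np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / 80.0)
    areas = hepatocyte_areas(img, tissue_mask=np.ones_like(img, dtype=bool))
    print(sorted(areas))
    ```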

  9. Physical therapy in Huntington's disease--toward objective assessments?

    PubMed

    Bohlen, S; Ekwall, C; Hellström, K; Vesterlin, H; Björnefur, M; Wiklund, L; Reilmann, R

    2013-02-01

    Physical therapy is recommended for the treatment of Huntington's disease, but reliable studies investigating its efficacy are almost non-existent. This may in part be due to the lack of suitable outcome measures. Therefore, we investigated the applicability of novel quantitative and objective assessments of motor dysfunction in the evaluation of physical therapy interventions aimed at improving gait and posture. Twelve patients with Huntington disease received a predefined twice-weekly intervention focusing on posture and gait over 6 weeks. The GAITRite mat and a force plate were used for objective and quantitative assessments. The Unified Huntington's Disease Rating Scale Total Motor Score, the timed Up & Go test, and the Berg Balance Scale were used as clinical outcome measures. Significant improvements were seen in GAITRite measures after therapy. Improvements were also seen in the Up & Go test and Berg Balance Scale, whereas force plate measures and Total Motor Scores did not change. The results suggest that physical therapy has a positive effect on gait in Huntington's disease. The study shows that objective and quantitative measures of gait and posture may serve as endpoints in trials assessing the efficacy of physical therapy. They should be explored further in larger trials applying a randomized controlled setting. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.
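
    With twelve patients, a paired non-parametric test is a natural way to check whether a GAITRite measure changed after therapy. The sketch below applies the Wilcoxon signed-rank test from scipy to invented pre/post gait-velocity values; it illustrates the kind of analysis such a study might use, not the statistics actually reported.

    ```python
    import numpy as np
    from scipy.stats import wilcoxon

    # Hypothetical gait velocity (cm/s) for 12 patients before and after the
    # 6-week intervention -- numbers are invented for illustration only.
    pre  = np.array([82, 75, 90, 68, 71, 88, 79, 85, 73, 77, 80, 69])
    post = np.array([89, 80, 92, 75, 78, 91, 84, 88, 79, 81, 86, 74])

    stat, p_value = wilcoxon(pre, post)   # paired, non-parametric
    print(f"Wilcoxon W = {stat:.1f}, p = {p_value:.4f}")
    print(f"median change = {np.median(post - pre):.1f} cm/s")
    ```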

  10. Man-machine analysis of translation and work tasks of Skylab films

    NASA Technical Reports Server (NTRS)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.

  11. Subjective and objective scales to assess the development of children cerebral palsy.

    PubMed

    Pietrzak, S; Jóźwiak, M

    2001-01-01

    Many scoring systems have been constructed to assess the motor development of cerebral palsy children and to evaluate the effectiveness of treatment. According to the purposes they fulfill, these instruments may be divided into three types: discriminative, evaluative and predictive. The design and measurement methodology are the criteria that determine whether a given scale is quantitative or qualitative in nature, and whether it should be considered to be objective or subjective. The article presents the "reaching, losing and regaining" scale (constructed by the authors to assess functional development and its changes in certain periods of time), the Munich Functional Development Diagnostics, and the Gross Motor Function Measure (GMFM). Special attention is given to the GMFM, its methods, evaluation of results, and application. A comparison of subjective and objective assessment of two cerebral palsy children is included.

  12. Objective evaluation of cutaneous thermal sensitivity

    NASA Technical Reports Server (NTRS)

    Vanbeaumont, W.

    1972-01-01

    The possibility of obtaining reliable and objective quantitative responses was investigated under conditions where only temperature changes in localized cutaneous areas evoked measurable changes in remote sudomotor activity. Both male and female subjects were studied to evaluate sex differences in thermal sensitivity. The results discussed include: sweat rate responses to contralateral cooling, comparison of sweat rate responses between men and women to contralateral cooling, influence of the menstrual cycle on the sweat rate responses to contralateral cooling, comparison of threshold of sweating responses between men and women, and correlation of latency to threshold for whole body sweating. It is concluded that the quantitative aspects of the reflex response are affected by both the density and activation of receptors as well as the rate of heat loss; men responded 8-10% more frequently than women to thermode cooling, the magnitude of responses being greater for men; and women responded 7-9% more frequently to thermode cooling on day 1 of menstruation, as compared to day 15.

  13. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instrument (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
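
    The Flavors-based program itself is not reproduced here, but the core idea, fault-tree events as objects that carry their own reliability data and cache intermediate results during tree reduction, can be sketched in a few lines of Python (assuming independent basic events).

    ```python
    class BasicEvent:
        """Leaf event with an associated failure probability (reliability datum)."""
        def __init__(self, name, probability):
            self.name = name
            self.probability = probability

        def evaluate(self):
            return self.probability


    class Gate:
        """AND/OR gate object; caches its intermediate result after evaluation,
        mirroring the idea of storing results obtained during tree reduction
        inside the tree objects (independence of inputs assumed)."""
        def __init__(self, name, kind, children):
            self.name = name
            self.kind = kind              # "AND" or "OR"
            self.children = children
            self.result = None            # intermediate result cache

        def evaluate(self):
            probs = [child.evaluate() for child in self.children]
            p = 1.0
            if self.kind == "AND":
                for q in probs:
                    p *= q
            else:                          # OR: complement of "all inputs survive"
                for q in probs:
                    p *= (1.0 - q)
                p = 1.0 - p
            self.result = p
            return p


    # Tiny illustrative tree: top event fails if (A AND B) OR C
    tree = Gate("TOP", "OR", [
        Gate("G1", "AND", [BasicEvent("A", 0.01), BasicEvent("B", 0.02)]),
        BasicEvent("C", 0.005),
    ])
    print(f"P(top event) = {tree.evaluate():.6f}")   # ~0.005199
    ```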

  14. The Basic Shelf Experience: a comprehensive evaluation.

    PubMed

    Dewolfe, Judith A; Greaves, Gaye

    2003-01-01

    The Basic Shelf Experience is a program designed to assist people living on limited incomes to make better use of their food resources. The purpose of this research was to learn if the Basic Shelf Experience program helps such people to 1. utilize food resources more effectively and 2. cope, through group support, with poverty-associated stressors that influence food security. Both quantitative and qualitative methods were used to evaluate the program objectives. Participants completed a questionnaire at the beginning and end of the six-week program. The questionnaire asked about their food access, food security, and feelings about themselves. Participants returned for a focus group discussion and completed the questionnaire again three months after the program ended. The focus group was designed to elicit information about perceived changes, if any, attributed to the program. Forty-two people completed the questionnaires pre-program and 20 post-program; 17 participated in the three-month follow-up session. While results from quantitative data analysis indicate that program objectives were not met, qualitative data provide evidence that the program did achieve its stated objectives. Our results suggest such programs as the Basic Shelf Experience can assist people living on limited incomes to achieve food security.

  15. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  16. Evaluative procedures to detect, characterize, and assess the severity of diabetic neuropathy.

    PubMed

    Dyck, P J

    1991-01-01

    Minimal criteria for diabetic neuropathy need to be defined and universally applied. Standardized evaluative procedures need to be agreed and normal ranges determined from healthy volunteers. Types and stages of neuropathy should be established and assessments performed on representative populations of both Type 1 and Type 2 diabetic patients. Potential minimal criteria include absent ankle reflexes and vibratory sensation, and abnormalities of nerve conduction. However, the preferred criterion is the identification of more than two statistically defined abnormalities among symptoms and deficits, nerve conduction, quantitative sensory examination or quantitative autonomic examination. Various evaluative procedures are available. Symptoms should be assessed and scores can be assigned to neurological deficits. However, assessments of nerve conduction provide the most specific, objective, sensitive, and repeatable procedures, although these may be the least meaningful. Many techniques are available for quantitative sensory examination, but are poorly standardized and normal values are not available. For quantitative autonomic examination, tests are available for the adequacy of cardiovascular and peripheral vascular reflexes and increasingly for other autonomic functions. In any assessment of nerve function the conditions should be optimized and standardized, and stimuli defined. Specific instructions should be given and normal ranges established in healthy volunteers.

  17. Image restoration using aberration taken by a Hartmann wavefront sensor on extended object, towards real-time deconvolution

    NASA Astrophysics Data System (ADS)

    Darudi, Ahmad; Bakhshi, Hadi; Asgari, Reza

    2015-05-01

    In this paper we present the results of image restoration using the data taken by a Hartmann sensor. The aberration is measured by a Hartmann sensor in which the object itself is used as the reference. Then the Point Spread Function (PSF) is simulated and used for image reconstruction using the Lucy-Richardson technique. A technique is also presented for quantitative evaluation of the Lucy-Richardson deconvolution.
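
    A minimal version of the restoration step can be sketched with scikit-image's Richardson-Lucy routine: blur a synthetic object with a known PSF (here a Gaussian standing in for the PSF simulated from the Hartmann-sensor wavefront data) and deconvolve. This is an illustrative sketch, not the authors' pipeline.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve
    from skimage.restoration import richardson_lucy

    def gaussian_psf(size=15, sigma=2.0):
        """Gaussian kernel standing in for the PSF simulated from the
        Hartmann-sensor wavefront measurement (illustrative assumption)."""
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
        return psf / psf.sum()

    rng = np.random.default_rng(3)
    truth = np.zeros((64, 64))
    truth[20:30, 20:30] = 1.0                       # a simple extended object
    psf = gaussian_psf()

    blurred = fftconvolve(truth, psf, mode="same")  # simulate the aberrated image
    blurred += 0.001 * rng.standard_normal(blurred.shape)

    restored = richardson_lucy(np.clip(blurred, 0, None), psf, 30)  # 30 iterations
    print("mean absolute residual:", float(np.abs(restored - truth).mean()))
    ```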

  18. Relative Navigation Light Detection and Ranging (LIDAR) Sensor Development Test Objective (DTO) Performance Verification

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) received a request from the NASA Associate Administrator (AA) for Human Exploration and Operations Mission Directorate (HEOMD) to quantitatively evaluate the individual performance of three light detection and ranging (LIDAR) rendezvous sensors flown as an orbiter development test objective (DTO) on Space Transportation System (STS)-127, STS-133, STS-134, and STS-135. This document contains the outcome of the NESC assessment.

  19. Evaluation of macrozone dimensions by ultrasound and EBSD techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreau, Andre, E-mail: Andre.Moreau@cnrc-nrc.gc.ca; Toubal, Lotfi; Ecole de technologie superieure, 1100, rue Notre-Dame Ouest, Montreal, QC, Canada H3C 1K3

    2013-01-15

    Titanium alloys are known to have texture heterogeneities, i.e. regions much larger than the grain dimensions, where the local orientation distribution of the grains differs from one region to the next. The electron backscattering diffraction (EBSD) technique is the method of choice to characterize these macro regions, which are called macrozones. Qualitatively, the images obtained by EBSD show that these macrozones may be larger or smaller, elongated or equiaxed. However, often no well-defined boundaries are observed between the macrozones and it is very hard to obtain objective and quantitative estimates of the macrozone dimensions from these data. In the present work, we present a novel, non-destructive ultrasonic technique that provides objective and quantitative characteristic dimensions of the macrozones. The obtained dimensions are based on the spatial autocorrelation function of fluctuations in the sound velocity. Thus, a pragmatic definition of macrozone dimensions naturally arises from the ultrasonic measurement. This paper has three objectives: 1) to disclose the novel, non-destructive ultrasonic technique to measure macrozone dimensions, 2) to propose a quantitative and objective definition of macrozone dimensions adapted to and arising from the ultrasonic measurement, and which is also applicable to the orientation data obtained by EBSD, and 3) to compare the macrozone dimensions obtained using the two techniques on two samples of the near-alpha titanium alloy IMI834. In addition, it was observed that macrozones may present a semi-periodical arrangement.
    Highlights:
    - Discloses a novel, ultrasonic NDT technique to measure macrozone dimensions
    - Proposes a quantitative and objective definition of macrozone dimensions
    - Compares macrozone dimensions obtained using EBSD and ultrasonics on 2 Ti samples
    - Observes that macrozones may have a semi-periodical arrangement
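
    The characteristic dimension "based on the spatial autocorrelation function of fluctuations in the sound velocity" can be illustrated in one dimension: compute the autocorrelation of a velocity profile and take the lag at which it first falls below 1/e as the correlation length. Both the 1/e criterion and the synthetic profile below are assumptions for illustration, not the exact definition used in the paper.

    ```python
    import numpy as np

    def correlation_length(velocity_profile, step_mm):
        """Lag (in mm) at which the normalized autocorrelation of the velocity
        fluctuations first drops below 1/e -- one convenient, objective proxy
        for a characteristic macrozone dimension (assumed criterion)."""
        v = np.asarray(velocity_profile, dtype=float)
        fluct = v - v.mean()
        acf = np.correlate(fluct, fluct, mode="full")[fluct.size - 1:]
        acf /= acf[0]                                  # normalize to 1 at lag 0
        below = np.nonzero(acf < 1.0 / np.e)[0]
        return below[0] * step_mm if below.size else np.inf

    # Synthetic velocity profile with ~2 mm macrozones (0.1 mm scan step)
    rng = np.random.default_rng(4)
    x = np.arange(0, 50, 0.1)                          # mm
    profile = 6100 + np.interp(x, np.arange(0, 51, 2), rng.normal(0, 10, 26))
    print(f"correlation length ~ {correlation_length(profile, 0.1):.2f} mm")
    ```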

  20. Digital learning objects in nursing consultation: technology assessment by undergraduate students.

    PubMed

    Silveira, Denise Tolfo; Catalan, Vanessa Menezes; Neutzling, Agnes Ludwig; Martinato, Luísa Helena Machado

    2010-01-01

    This study followed the teaching-learning process on nursing consultation, based on digital learning objects developed through the active problem-based learning method. The goals were to evaluate the digital learning objects about nursing consultation, to develop cognitive skills on the subject using problem-based learning, and to identify the students' opinions on the use of technology. This is an exploratory and descriptive study with a quantitative approach. The sample consisted of 71 students in the sixth period of the nursing program at the Federal University of Rio Grande do Sul. The data were collected through a questionnaire evaluating the learning objects. The results showed positive agreement (58%) on the content, usability and didactics of the proposed computer-mediated activity regarding the nursing consultation. The application of the materials with the students was considered positive.

  1. Systems Operations Studies for Automated Guideway Transit Systems : Quantitative Analysis of Alternative AGT Operational Control Strategies

    DOT National Transportation Integrated Search

    1981-10-01

    The objectives of the Systems Operation Studies (SOS) for automated guideway transit (AGT) systems are to develop models for the analysis of system operations, to evaluate performance and cost, and to establish guidelines for the design and operation...

  2. Objective and quantitative equilibriometric evaluation of individual locomotor behaviour in schizophrenia: Translational and clinical implications.

    PubMed

    Haralanov, Svetlozar; Haralanova, Evelina; Milushev, Emil; Shkodrova, Diana; Claussen, Claus-Frenz

    2018-04-17

    Psychiatry is the only medical specialty that lacks clinically applicable biomarkers for objective evaluation of the existing pathology at a single-patient level. On the basis of an original translational equilibriometric method for evaluation of movement patterns, we have introduced in the everyday clinical practice of psychiatry an easy-to-perform computerized objective quantification of the individual locomotor behaviour during execution of the Unterberger stepping test. For the last 20 years, we have gradually collected a large database of more than 1000 schizophrenic patients, their relatives, and matched psychiatric, neurological, and healthy controls via cross-sectional and longitudinal investigations. Comparative analyses revealed transdiagnostic locomotor similarities among schizophrenic patients, high-risk schizotaxic individuals, and neurological patients with multiple sclerosis and cerebellar ataxia, thus suggesting common underlying brain mechanisms. In parallel, intradiagnostic dissimilarities were revealed, which make it possible to separate out subclinical locomotor subgroups within the diagnostic categories. Prototypical qualitative (dysmetric and ataxic) locomotor abnormalities in schizophrenic patients were differentiated from 2 atypical quantitative ones, manifested as either hypolocomotion or hyperlocomotion. Theoretical analyses suggested that these 3 subtypes of locomotor abnormalities could be conceived as objectively measurable biomarkers of 3 schizophrenic subgroups with dissimilar brain mechanisms, which require different treatment strategies. Analogies with the prominent role of locomotor measures in some well-known animal models of mental disorders advocate for promising objective translational research in the so far over-subjective field of psychiatry. Distinctions among prototypical, atypical, and diagnostic biomarkers, as well as between neuromotor and psychomotor locomotor abnormalities, are discussed. Conclusions are drawn about the translational and clinical implications of the new approach and its future perspectives. © 2018 John Wiley & Sons, Ltd.

  3. Quantitative evaluation of in vivo vital-dye fluorescence endoscopic imaging for the detection of Barrett’s-associated neoplasia

    PubMed Central

    Thekkek, Nadhi; Lee, Michelle H.; Polydorides, Alexandros D.; Rosen, Daniel G.; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2015-01-01

    Current imaging tools are associated with inconsistent sensitivity and specificity for detection of Barrett’s-associated neoplasia. Optical imaging has shown promise in improving the classification of neoplasia in vivo. The goal of this pilot study was to evaluate whether in vivo vital dye fluorescence imaging (VFI) has the potential to improve the accuracy of early-detection of Barrett’s-associated neoplasia. In vivo endoscopic VFI images were collected from 65 sites in 14 patients with confirmed Barrett’s esophagus (BE), dysplasia, or esophageal adenocarcinoma using a modular video endoscope and a high-resolution microendoscope (HRME). Qualitative image features were compared to histology; VFI and HRME images show changes in glandular structure associated with neoplastic progression. Quantitative image features in VFI images were identified for objective image classification of metaplasia and neoplasia, and a diagnostic algorithm was developed using leave-one-out cross validation. Three image features extracted from VFI images were used to classify tissue as neoplastic or not with a sensitivity of 87.8% and a specificity of 77.6% (AUC=0.878). A multimodal approach incorporating VFI and HRME imaging can delineate epithelial changes present in Barrett’s-associated neoplasia. Quantitative analysis of VFI images may provide a means for objective interpretation of BE during surveillance. PMID:25950645

  4. Quantitative evaluation of in vivo vital-dye fluorescence endoscopic imaging for the detection of Barrett's-associated neoplasia.

    PubMed

    Thekkek, Nadhi; Lee, Michelle H; Polydorides, Alexandros D; Rosen, Daniel G; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2015-05-01

    Current imaging tools are associated with inconsistent sensitivity and specificity for detection of Barrett's-associated neoplasia. Optical imaging has shown promise in improving the classification of neoplasia in vivo. The goal of this pilot study was to evaluate whether in vivo vital dye fluorescence imaging (VFI) has the potential to improve the accuracy of early-detection of Barrett's-associated neoplasia. In vivo endoscopic VFI images were collected from 65 sites in 14 patients with confirmed Barrett's esophagus (BE), dysplasia, or esophageal adenocarcinoma using a modular video endoscope and a high-resolution microendoscope (HRME). Qualitative image features were compared to histology; VFI and HRME images show changes in glandular structure associated with neoplastic progression. Quantitative image features in VFI images were identified for objective image classification of metaplasia and neoplasia, and a diagnostic algorithm was developed using leave-one-out cross validation. Three image features extracted from VFI images were used to classify tissue as neoplastic or not with a sensitivity of 87.8% and a specificity of 77.6% (AUC = 0.878). A multimodal approach incorporating VFI and HRME imaging can delineate epithelial changes present in Barrett's-associated neoplasia. Quantitative analysis of VFI images may provide a means for objective interpretation of BE during surveillance.
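
    The classification step described in both records (a small number of image features, leave-one-out cross validation, sensitivity/specificity/AUC) can be illustrated with the short Python sketch below. The features and labels are synthetic placeholders and logistic regression is used as a generic classifier; the study's actual features and algorithm are not reproduced here.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.metrics import confusion_matrix, roc_auc_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(65, 3))              # three image features per imaged site (synthetic)
        y = rng.integers(0, 2, size=65)           # 1 = neoplastic, 0 = not neoplastic (synthetic)

        # Leave-one-out cross-validated probabilities, thresholded at 0.5
        probs = cross_val_predict(LogisticRegression(), X, y,
                                  cv=LeaveOneOut(), method="predict_proba")[:, 1]
        pred = (probs >= 0.5).astype(int)
        tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
        print("sensitivity:", tp / (tp + fn))
        print("specificity:", tn / (tn + fp))
        print("AUC:", roc_auc_score(y, probs))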

  5. The SCHEIE Visual Field Grading System

    PubMed Central

    Sankar, Prithvi S.; O’Keefe, Laura; Choi, Daniel; Salowe, Rebecca; Miller-Ellis, Eydie; Lehman, Amanda; Addis, Victoria; Ramakrishnan, Meera; Natesh, Vikas; Whitehead, Gideon; Khachatryan, Naira; O’Brien, Joan

    2017-01-01

    Objective No method of grading visual field (VF) defects has been widely accepted throughout the glaucoma community. The SCHEIE (Systematic Classification of Humphrey visual fields-Easy Interpretation and Evaluation) grading system for glaucomatous visual fields was created to convey qualitative and quantitative information regarding visual field defects in an objective, reproducible, and easily applicable manner for research purposes. Methods The SCHEIE grading system is composed of a qualitative and quantitative score. The qualitative score consists of designation in one or more of the following categories: normal, central scotoma, paracentral scotoma, paracentral crescent, temporal quadrant, nasal quadrant, peripheral arcuate defect, expansive arcuate, or altitudinal defect. The quantitative component incorporates the Humphrey visual field index (VFI), location of visual defects for superior and inferior hemifields, and blind spot involvement. Accuracy and speed at grading using the qualitative and quantitative components was calculated for non-physician graders. Results Graders had a median accuracy of 96.67% for their qualitative scores and a median accuracy of 98.75% for their quantitative scores. Graders took a mean of 56 seconds per visual field to assign a qualitative score and 20 seconds per visual field to assign a quantitative score. Conclusion The SCHEIE grading system is a reproducible tool that combines qualitative and quantitative measurements to grade glaucomatous visual field defects. The system aims to standardize clinical staging and to make specific visual field defects more easily identifiable. Specific patterns of visual field loss may also be associated with genetic variants in future genetic analysis. PMID:28932621

  6. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  7. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder.

    PubMed

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-18

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7-11 years (27 males, six females) and twenty-five adult participants aged 21-29 years (19 males, six females) participated in our experiments. Our results suggested that the pronation and supination function in children with ADHD has a tendency to lag behind that of typically developing children by several years. From these results, our system shows potential for objectively evaluating the neurodevelopmental delay of children with ADHD.

  8. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment

    PubMed Central

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    Objectives This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. Methods The results collected were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. Results These weights may prove useful in avoiding a fallback to qualitative judgment when indicator-level quantitative assessment results are integrated without established weights between indicators. Conclusions This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for quantitative assessment of green chemistry technologies. PMID:26206364
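
    The analytic hierarchy process step reported in the Methods can be illustrated as follows: expert judgments form a reciprocal pairwise-comparison matrix, and priority weights are taken from its principal eigenvector. The Python sketch below shows that computation; the 3x3 matrix values are purely hypothetical and are not the study's data.

        import numpy as np

        def ahp_weights(pairwise):
            """Priority weights from a reciprocal pairwise-comparison matrix (principal eigenvector)."""
            A = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(A)
            principal = eigvecs[:, np.argmax(eigvals.real)].real
            w = np.abs(principal)
            return w / w.sum()

        # Hypothetical comparison of three indicators (Saaty 1-9 scale, illustrative values only)
        A = [[1,   3,   5],
             [1/3, 1,   2],
             [1/5, 1/2, 1]]
        print(ahp_weights(A))        # roughly [0.65, 0.23, 0.12]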

  9. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method

    PubMed Central

    2017-01-01

    Background The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). Conclusions This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. PMID:28292738

  10. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
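
    As a small illustration of the quantitative reassessment described above, the sketch below computes the coefficient of variation of simulated per-tablet coating masses and maps it onto a criticality level for risk ranking. The threshold values and the three-level scale are assumptions made for illustration, not those used in the study.

        import numpy as np

        def coefficient_of_variation(coating_mass):
            """CV of per-tablet coating mass; lower values indicate better coating uniformity."""
            m = np.asarray(coating_mass, dtype=float)
            return m.std(ddof=1) / m.mean()

        def criticality_from_cv(cv, thresholds=(0.05, 0.10)):
            """Map a CV value to a hypothetical 3-level criticality score for risk ranking."""
            low, high = thresholds
            return 1 if cv < low else 2 if cv < high else 3

        simulated_mass = np.random.default_rng(2).normal(loc=10.0, scale=0.8, size=500)
        cv = coefficient_of_variation(simulated_mass)
        print(cv, criticality_from_cv(cv))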

  11. 3D Imaging for Museum Artefacts: a Portable Test Object for Heritage and Museum Documentation of Small Objects

    NASA Astrophysics Data System (ADS)

    Hess, M.; Robson, S.

    2012-07-01

    3D colour image data generated for the recording of small museum objects and archaeological finds are highly variable in quality and fitness for purpose. Whilst current technology is capable of extremely high quality outputs, there are currently no common standards or applicable guidelines in either the museum or engineering domain suited to scientific evaluation, understanding and tendering for 3D colour digital data. This paper firstly explains the rationale for, and requirements of, 3D digital documentation in museums. Secondly it describes the design process, development and use of a new portable test object suited to sensor evaluation and the provision of user acceptance metrics. The test object is specifically designed for museums and heritage institutions and includes known surface and geometric properties which support quantitative and comparative imaging on different systems. The development of a supporting protocol will allow object reference data to be included in the data processing workflow with specific reference to conservation and curation.

  12. Fuzzy Logic Approaches to Multi-Objective Decision-Making in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1994-01-01

    Fuzzy logic allows for the quantitative representation of multi-objective decision-making problems which have vague or fuzzy objectives and parameters. As such, fuzzy logic approaches are well-suited to situations where alternatives must be assessed by using criteria that are subjective and of unequal importance. This paper presents an overview of fuzzy logic and provides sample applications from the aerospace industry. Applications include an evaluation of vendor proposals, an analysis of future space vehicle options, and the selection of a future space propulsion system. On the basis of the results provided in this study, fuzzy logic provides a unique perspective on the decision-making process, allowing the evaluator to assess the degree to which each option meets the evaluation criteria. Future decision-making should take full advantage of fuzzy logic methods to complement existing approaches in the selection of alternatives.
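
    A minimal Python sketch of the kind of fuzzy multi-objective scoring discussed here is shown below: each alternative's raw scores are converted to membership degrees in a fuzzy goal ("high score") and aggregated with criterion weights. The ramp membership function, the score scale, and the weights are illustrative assumptions, not the evaluation schemes used in the paper.

        import numpy as np

        def high(x, lo=0.0, hi=10.0):
            """Membership degree in the fuzzy set 'high score' (linear ramp from lo to hi)."""
            return np.clip((np.asarray(x, dtype=float) - lo) / (hi - lo), 0.0, 1.0)

        # Hypothetical raw scores of three alternatives on two criteria (0-10 scale)
        scores = np.array([[7.0, 4.0],
                           [5.0, 8.0],
                           [9.0, 3.0]])
        weights = np.array([0.6, 0.4])          # assumed relative importance of the criteria

        ranking = high(scores) @ weights        # weighted degree of satisfying the fuzzy goals
        print(ranking)                          # larger values indicate a better overall fit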

  13. Weaving a Stronger Fabric for Improved Outcomes

    ERIC Educational Resources Information Center

    Lobry de Bruyn, Lisa; Prior, Julian; Lenehan, Jo

    2014-01-01

    Purpose: To explain how training and education events (TEEs) can be designed to increase the likelihood of achieving behavioural objectives. Approach: The approach combined both a quantitative review of evaluation surveys undertaken at the time of the TEE, and qualitative telephone interviews with selected attendees (20-25% of the total population…

  14. The Intercultural Sensitivity of Chilean Teachers Serving an Immigrant Population in Schools

    ERIC Educational Resources Information Center

    Morales Mendoza, Karla; Sanhueza Henríquez, Susan; Friz Carrillo, Miguel; Riquelme Bravo, Paula

    2017-01-01

    The objective of this article is to evaluate the intercultural sensitivity of teachers working in culturally diverse classrooms, and to analyse differences in intercultural sensitivity based on the gender, age, training (advanced training courses), and intercultural experience of the teachers. A quantitative approach with a comparative descriptive…

  15. Effectiveness of Motivational Interviewing Interventions for Adolescent Substance Use Behavior Change: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Jensen, Chad D.; Cushing, Christopher C.; Aylward, Brandon S.; Craig, James T.; Sorell, Danielle M.; Steele, Ric G.

    2011-01-01

    Objective: This study was designed to quantitatively evaluate the effectiveness of motivational interviewing (MI) interventions for adolescent substance use behavior change. Method: Literature searches of electronic databases were undertaken in addition to manual reference searches of identified review articles. Databases searched include…

  16. Developing a multi-joint upper limb exoskeleton robot for diagnosis, therapy, and outcome evaluation in neurorehabilitation.

    PubMed

    Ren, Yupeng; Kang, Sang Hoon; Park, Hyung-Soon; Wu, Yi-Ning; Zhang, Li-Qun

    2013-05-01

    Arm impairments in patients post stroke involve the shoulder, elbow and wrist simultaneously. It is not very clear how patients develop spasticity and reduced range of motion (ROM) at the multiple joints and the abnormal couplings among the multiple joints and the multiple degrees-of-freedom (DOF) during passive movement. It is also not clear how they lose independent control of individual joints/DOFs and coordination among the joints/DOFs during voluntary movement. An upper limb exoskeleton robot, the IntelliArm, which can control the shoulder, elbow, and wrist, was developed, aiming to support clinicians and patients with the following integrated capabilities: 1) quantitative, objective, and comprehensive multi-joint neuromechanical pre-evaluation capabilities aiding multi-joint/DOF diagnosis for individual patients; 2) strenuous and safe passive stretching of hypertonic/deformed arm for loosening up muscles/joints based on the robot-aided diagnosis; 3) (assistive/resistive) active reaching training after passive stretching for regaining/improving motor control ability; and 4) quantitative, objective, and comprehensive neuromechanical outcome evaluation at the level of individual joints/DOFs, multiple joints, and whole arm. Feasibility of the integrated capabilities was demonstrated through experiments with stroke survivors and healthy subjects.

  17. Evaluating a nursing communication skills training course: The relationships between self-rated ability, satisfaction, and actual performance.

    PubMed

    Mullan, Barbara A; Kothe, Emily J

    2010-11-01

    Effective communication is a vital component of nursing care; however, nurses often lack the skills to communicate with patients, carers and other health care professionals. Communication skills training programs are frequently used to develop these skills. However, there is a paucity of data on how best to evaluate such courses. The aim of the current study was to evaluate the relationship between students' self-rating of their own ability and their satisfaction with a nurse training course as compared with an objective measure of communication skills. A total of 209 first-year nursing students completed a communication skills program. Both qualitative and quantitative data were collected and associations between measures were investigated. Paired samples t-tests showed significant improvement in self-rated ability over the course of the program. Students were generally very satisfied with the course, which was reflected in both qualitative and quantitative measures. However, neither self-rated ability nor satisfaction was significantly correlated with the objective measure of performance, but self-rated ability and satisfaction were highly correlated with one another. The importance of these findings is discussed and implications for nurse education are proposed. Copyright © 2010 Elsevier Ltd. All rights reserved.

  18. Association between quantitative measures obtained using fluorescence-based methods and activity status of occlusal caries lesions in primary molars.

    PubMed

    Novaes, Tatiane Fernandes; Reyes, Alessandra; Matos, Ronilza; Antunes-Pontes, Laura Regina; Marques, Renata Pereira de Samuel; Braga, Mariana Minatel; Diniz, Michele Baffi; Mendes, Fausto Medeiros

    2017-05-01

    Fluorescence-based methods (FBM) can add objectivity to caries diagnosis strategies. Few studies, however, have focused on the evaluation of caries activity. The aim was to evaluate the association between quantitative measures obtained with FBM, clinical parameters acquired from the patients, caries detection, and assessment of activity status in occlusal surfaces of primary molars. Six hundred and six teeth from 113 children (4-14 years) were evaluated. The presence of a biofilm, caries experience, and the number of active lesions were recorded. The teeth were assessed using FBM: DIAGNOdent pen (LFpen) and Quantitative light-induced fluorescence (QLF). As the reference standard, all teeth were evaluated using the ICDAS (International Caries Detection and Assessment System) associated with clinical activity assessments. Multilevel regressions compared the FBM values and evaluated the association between the FBM measures and clinical variables related to the caries activity. The measures from the FBM were higher in cavitated lesions. Only ∆F values distinguished active from inactive lesions. The LFpen measures were higher in active lesions, at the cavitated threshold (56.95 ± 29.60). Following regression analyses, only the presence of visible biofilm on occlusal surfaces (adjusted prevalence ratio = 1.43) and ∆R values of the teeth (adjusted prevalence ratio = 1.02) were associated with caries activity. Some quantitative measures from FBM parameters are associated with caries activity evaluation, which is similar to the clinical evaluation of the presence of visible biofilm. © 2016 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. The use of a battery of tracking tests in the quantitative evaluation of neurological function

    NASA Technical Reports Server (NTRS)

    Repa, B. S.; Albers, J. W.; Potvin, A. R.; Tourtellotte, W. W.

    1972-01-01

    A tracking test battery has been applied in a drug trial designed to compare the efficacy of L-DOPA and amantadine to that of L-DOPA and placebo in the treatment of 28 patients with Parkinson's disease. The drug trial provided an ideal opportunity for objectively evaluating the usefulness of tracking tests in assessing changes in neurologic function. Evaluating changes in patient performance resulting from disease progression and controlled clinical trials is of great importance in establishing effective treatment programs.

  20. Detection of blur artifacts in histopathological whole-slide images of endomyocardial biopsies.

    PubMed

    Hang Wu; Phan, John H; Bhatia, Ajay K; Cundiff, Caitlin A; Shehata, Bahig M; Wang, May D

    2015-01-01

    Histopathological whole-slide images (WSIs) have emerged as an objective and quantitative means for image-based disease diagnosis. However, WSIs may contain acquisition artifacts that affect downstream image feature extraction and quantitative disease diagnosis. We develop a method for detecting blur artifacts in WSIs using distributions of local blur metrics. As features, these distributions enable accurate classification of WSI regions as sharp or blurry. We evaluate our method using over 1000 portions of an endomyocardial biopsy (EMB) WSI. Results indicate that local blur metrics accurately detect blurry image regions.
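
    The idea of describing a region by the distribution of local blur metrics can be sketched in a few lines of Python. Here the variance of the Laplacian serves as the local blur metric and a region is summarized by simple distribution statistics; the specific metric, tile size, and summary statistics are assumptions for illustration rather than the features used in the paper.

        import numpy as np
        from scipy.ndimage import laplace

        def local_blur_metric(tile):
            """Variance of the Laplacian; low values suggest a blurry tile."""
            return laplace(tile.astype(float)).var()

        def region_blur_descriptor(gray_region, tile=256):
            """Summarize the distribution of local blur metrics over a WSI region (sketch)."""
            h, w = gray_region.shape
            metrics = np.array([local_blur_metric(gray_region[r:r + tile, c:c + tile])
                                for r in range(0, h - tile + 1, tile)
                                for c in range(0, w - tile + 1, tile)])
            return np.array([metrics.mean(), metrics.std(),
                             np.percentile(metrics, 10), np.percentile(metrics, 90)])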

  1. Fuzzy Performance between Surface Fitting and Energy Distribution in Turbulence Runner

    PubMed Central

    Liang, Zhongwei; Liu, Xiaochu; Ye, Bangyan; Brauwer, Richard Kars

    2012-01-01

    Because the application of surface fitting algorithms exerts a considerable fuzzy influence on the mathematical features of kinetic energy distribution, the mechanism relating them under different external conditional parameters must be quantitatively analyzed. Through determining the kinetic energy value of each selected representative position coordinate point by calculating kinetic energy parameters, several typical algorithms of complicated surface fitting are applied for constructing microkinetic energy distribution surface models in the objective turbulence runner with those obtained kinetic energy values. On the basis of calculating the newly proposed mathematical features, we construct a fuzzy evaluation data sequence and present a new three-dimensional fuzzy quantitative evaluation method; then the value change tendencies of kinetic energy distribution surface features can be clearly quantified, and the fuzzy performance mechanism between the performance results of surface fitting algorithms, the spatial features of the turbulence kinetic energy distribution surface, and their respective environmental parameter conditions can be quantitatively analyzed in detail, which leads to final conclusions concerning the inherent turbulence kinetic energy distribution performance mechanism and its mathematical relation. This enables further quantitative study of turbulence energy. PMID:23213287

  2. [Study on objectively evaluating skin aging according to areas of skin texture].

    PubMed

    Shan, Gaixin; Gan, Ping; He, Ling; Sun, Lu; Li, Qiannan; Jiang, Zheng; He, Xiangqian

    2015-02-01

    Skin aging principles play important roles in skin disease diagnosis, the evaluation of skin cosmetic effect, forensic identification and age identification in sports competition, etc. This paper proposes a new method to evaluate skin aging objectively and quantitatively from the skin texture area. First, a magnified skin image was acquired. Then, the skin texture image was segmented by using the iterative threshold method, and the skin ridge image was extracted according to the watershed algorithm. Finally, the skin ridge areas of the skin texture were extracted. The experimental data showed that the average areas of skin ridges, for both men and women, had a good correlation with age (r = 0.938 for males and r = 0.922 for females), and the regression curve of skin texture area against age showed that the skin texture area increased with age. Therefore, it is effective to evaluate skin aging objectively by the new method presented in this paper.
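
    The segmentation pipeline described here (threshold, watershed, ridge-area measurement) can be sketched with scikit-image as below. An Otsu threshold stands in for the paper's iterative threshold step, the marker choice is a simplification, and values such as the minimum object size and the pixel area are illustrative parameters.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage import filters, measure, morphology, segmentation

        def mean_ridge_area(gray, pixel_area_mm2):
            """Approximate mean skin-ridge area from a magnified grayscale skin image (sketch)."""
            binary = gray > filters.threshold_otsu(gray)             # stand-in for iterative threshold
            binary = morphology.remove_small_objects(binary, min_size=50)
            distance = ndi.distance_transform_edt(binary)
            markers, _ = ndi.label(distance > 0.5 * distance.max())  # crude markers for watershed
            labels = segmentation.watershed(-distance, markers, mask=binary)
            areas = [region.area for region in measure.regionprops(labels)]
            return np.mean(areas) * pixel_area_mm2 if areas else 0.0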

  3. Simple Learned Weighted Sums of Inferior Temporal Neuronal Firing Rates Accurately Predict Human Core Object Recognition Performance

    PubMed Central

    Hong, Ha; Solomon, Ethan A.; DiCarlo, James J.

    2015-01-01

    To go beyond qualitative models of the biological substrate of object recognition, we ask: can a single ventral stream neuronal linking hypothesis quantitatively account for core object recognition performance over a broad range of tasks? We measured human performance in 64 object recognition tests using thousands of challenging images that explore shape similarity and identity preserving object variation. We then used multielectrode arrays to measure neuronal population responses to those same images in visual areas V4 and inferior temporal (IT) cortex of monkeys and simulated V1 population responses. We tested leading candidate linking hypotheses and control hypotheses, each postulating how ventral stream neuronal responses underlie object recognition behavior. Specifically, for each hypothesis, we computed the predicted performance on the 64 tests and compared it with the measured pattern of human performance. All tested hypotheses based on low- and mid-level visually evoked activity (pixels, V1, and V4) were very poor predictors of the human behavioral pattern. However, simple learned weighted sums of distributed average IT firing rates exactly predicted the behavioral pattern. More elaborate linking hypotheses relying on IT trial-by-trial correlational structure, finer IT temporal codes, or ones that strictly respect the known spatial substructures of IT (“face patches”) did not improve predictive power. Although these results do not reject those more elaborate hypotheses, they suggest a simple, sufficient quantitative model: each object recognition task is learned from the spatially distributed mean firing rates (100 ms) of ∼60,000 IT neurons and is executed as a simple weighted sum of those firing rates. SIGNIFICANCE STATEMENT We sought to go beyond qualitative models of visual object recognition and determine whether a single neuronal linking hypothesis can quantitatively account for core object recognition behavior. To achieve this, we designed a database of images for evaluating object recognition performance. We used multielectrode arrays to characterize hundreds of neurons in the visual ventral stream of nonhuman primates and measured the object recognition performance of >100 human observers. Remarkably, we found that simple learned weighted sums of firing rates of neurons in monkey inferior temporal (IT) cortex accurately predicted human performance. Although previous work led us to expect that IT would outperform V4, we were surprised by the quantitative precision with which simple IT-based linking hypotheses accounted for human behavior. PMID:26424887
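
    The "simple learned weighted sum" linking hypothesis amounts to a linear readout fit by regression on trial-averaged firing rates. The Python sketch below illustrates that idea on synthetic data; the array sizes, the use of ridge regression, and the behavioural target are stand-ins and do not reproduce the study's data or fitting procedure.

        import numpy as np
        from sklearn.linear_model import RidgeCV

        rng = np.random.default_rng(1)
        n_images, n_neurons = 640, 168                 # synthetic sizes, not the study's
        rates = rng.poisson(5.0, size=(n_images, n_neurons)).astype(float)   # mean firing rates
        behaviour = rates @ rng.normal(size=n_neurons) + rng.normal(scale=5.0, size=n_images)

        # A learned weighted sum of firing rates is simply a linear readout fit by regression.
        readout = RidgeCV(alphas=np.logspace(-3, 3, 13)).fit(rates, behaviour)
        print("fitted weights shape:", readout.coef_.shape)
        print("training correlation:", np.corrcoef(readout.predict(rates), behaviour)[0, 1])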

  4. Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1989-09-01

    This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.

  5. Metrology Standards for Quantitative Imaging Biomarkers

    PubMed Central

    Obuchowski, Nancy A.; Kessler, Larry G.; Raunig, David L.; Gatsonis, Constantine; Huang, Erich P.; Kondratovich, Marina; McShane, Lisa M.; Reeves, Anthony P.; Barboriak, Daniel P.; Guimaraes, Alexander R.; Wahl, Richard L.

    2015-01-01

    Although investigators in the imaging community have been active in developing and evaluating quantitative imaging biomarkers (QIBs), the development and implementation of QIBs have been hampered by the inconsistent or incorrect use of terminology or methods for technical performance and statistical concepts. Technical performance is an assessment of how a test performs in reference objects or subjects under controlled conditions. In this article, some of the relevant statistical concepts are reviewed, methods that can be used for evaluating and comparing QIBs are described, and some of the technical performance issues related to imaging biomarkers are discussed. More consistent and correct use of terminology and study design principles will improve clinical research, advance regulatory science, and foster better care for patients who undergo imaging studies. © RSNA, 2015 PMID:26267831

  6. An experimental comparison of online object-tracking algorithms

    NASA Astrophysics Data System (ADS)

    Wang, Qing; Chen, Feng; Xu, Wenli; Yang, Ming-Hsuan

    2011-09-01

    This paper reviews and evaluates several state-of-the-art online object tracking algorithms. Notwithstanding decades of efforts, object tracking remains a challenging problem due to factors such as illumination, pose, scale, deformation, motion blur, noise, and occlusion. To account for appearance change, most recent tracking algorithms focus on robust object representations and effective state prediction. In this paper, we analyze the components of each tracking method and identify their key roles in dealing with specific challenges, thereby shedding light on how to choose and design algorithms for different situations. We compare state-of-the-art online tracking methods including the IVT, VRT, FragT, BoostT, SemiT, BeSemiT, L1T, MILT, VTD, and TLD algorithms on numerous challenging sequences, and evaluate them with different performance metrics. The qualitative and quantitative comparative results demonstrate the strength and weakness of these algorithms.

  7. Change Detection Algorithms for Surveillance in Visual IoT: A Comparative Study

    NASA Astrophysics Data System (ADS)

    Akram, Beenish Ayesha; Zafar, Amna; Akbar, Ali Hammad; Wajid, Bilal; Chaudhry, Shafique Ahmad

    2018-01-01

    The VIoT (Visual Internet of Things) connects the virtual information world with real-world objects using sensors and pervasive computing. For video surveillance in VIoT, ChD (Change Detection) is a critical component. ChD algorithms identify regions of change in multiple images of the same scene recorded at different time intervals for video surveillance. This paper presents a performance comparison of histogram thresholding and classification ChD algorithms using quantitative measures for video surveillance in VIoT based on salient features of datasets. The thresholding algorithms Otsu, Kapur, Rosin and classification methods k-means, EM (Expectation Maximization) were simulated in MATLAB using diverse datasets. For performance evaluation, the quantitative measures used include OSR (Overall Success Rate), YC (Yule's Coefficient) and JC (Jaccard's Coefficient), execution time and memory consumption. Experimental results showed that Kapur's algorithm performed better for both indoor and outdoor environments with illumination changes, shadowing and medium to fast moving objects. However, it reflected degraded performance for small object sizes with minor changes. Otsu's algorithm showed better results for indoor environments with slow to medium changes and nomadic object mobility. k-means showed good results in indoor environments with small object sizes producing slow change, no shadowing and scarce illumination changes.
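
    As a concrete example of the histogram-thresholding family compared here, the sketch below builds a binary change mask by applying Otsu's threshold to the absolute difference of two co-registered frames and computes an overall success rate against a ground-truth mask. This is a generic Python illustration rather than the MATLAB implementations evaluated in the paper, and the OSR definition used (fraction of correctly labelled pixels) is a common one assumed for illustration.

        import numpy as np
        from skimage.filters import threshold_otsu

        def change_mask_otsu(frame_t0, frame_t1):
            """Binary change mask: Otsu's threshold applied to the absolute difference image."""
            diff = np.abs(frame_t1.astype(float) - frame_t0.astype(float))
            return diff > threshold_otsu(diff)

        def overall_success_rate(predicted_mask, ground_truth_mask):
            """Fraction of pixels whose change/no-change label matches the ground truth."""
            return float(np.mean(predicted_mask == ground_truth_mask))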

  8. Holographic quantitative imaging of sample hidden by turbid medium or occluding objects

    NASA Astrophysics Data System (ADS)

    Bianco, V.; Miccio, L.; Merola, F.; Memmolo, P.; Gennari, O.; Paturzo, Melania; Netti, P. A.; Ferraro, P.

    2015-03-01

    Digital Holography (DH) numerical procedures have been developed to allow imaging through turbid media. A fluid is considered turbid when dispersed particles provoke strong light scattering, thus destroying the image formation by any standard optical system. Here we show that sharp amplitude imaging and phase-contrast mapping of objects hidden behind a turbid medium and/or occluding objects are possible in harsh noise conditions and with a large field of view by Multi-Look DH microscopy. In particular, it will be shown that both amplitude imaging and phase-contrast mapping of cells hidden behind a flow of Red Blood Cells can be obtained. This allows, in a noninvasive way, the quantitative evaluation of living processes in Lab on Chip platforms where conventional microscopy techniques fail. The combination of this technique with endoscopic imaging can pave the way for holographic blood vessel inspection, e.g. to look for settled cholesterol plaques as well as blood clots for rapid diagnostics of blood diseases.

  9. Quantitative 4D Transcatheter Intraarterial Perfusion MR Imaging as a Method to Standardize Angiographic Chemoembolization Endpoints

    PubMed Central

    Jin, Brian; Wang, Dingxin; Lewandowski, Robert J.; Ryu, Robert K.; Sato, Kent T.; Larson, Andrew C.; Salem, Riad; Omary, Reed A.

    2011-01-01

    PURPOSE We aimed to test the hypothesis that subjective angiographic endpoints during transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC) exhibit consistency and correlate with objective intraprocedural reductions in tumor perfusion as determined by quantitative four dimensional (4D) transcatheter intraarterial perfusion (TRIP) magnetic resonance (MR) imaging. MATERIALS AND METHODS This prospective study was approved by the institutional review board. Eighteen consecutive patients underwent TACE in a combined MR/interventional radiology (MR-IR) suite. Three board-certified interventional radiologists independently graded the angiographic endpoint of each procedure based on a previously described subjective angiographic chemoembolization endpoint (SACE) scale. A consensus SACE rating was established for each patient. Patients underwent quantitative 4D TRIP-MR imaging immediately before and after TACE, from which mean whole tumor perfusion (Fρ) was calculated. Consistency of SACE ratings between observers was evaluated using the intraclass correlation coefficient (ICC). The relationship between SACE ratings and intraprocedural TRIP-MR imaging perfusion changes was evaluated using Spearman’s rank correlation coefficient. RESULTS The SACE rating scale demonstrated very good consistency among all observers (ICC = 0.80). The consensus SACE rating was significantly correlated with both absolute (r = 0.54, P = 0.022) and percent (r = 0.85, P < 0.001) intraprocedural perfusion reduction. CONCLUSION The SACE rating scale demonstrates very good consistency between raters, and significantly correlates with objectively measured intraprocedural perfusion reductions during TACE. These results support the use of the SACE scale as a standardized alternative method to quantitative 4D TRIP-MR imaging to classify patients based on embolic endpoints of TACE. PMID:22021520

  10. Development of an Interactive Social Media Tool for Parents with Concerns about Vaccines

    ERIC Educational Resources Information Center

    Shoup, Jo Ann; Wagner, Nicole M.; Kraus, Courtney R.; Narwaney, Komal J.; Goddard, Kristin S.; Glanz, Jason M.

    2015-01-01

    Objective: Describe a process for designing, building, and evaluating a theory-driven social media intervention tool to help reduce parental concerns about vaccination. Method: We developed an interactive web-based tool using quantitative and qualitative methods (e.g., survey, focus groups, individual interviews, and usability testing). Results:…

  11. Sexual Health Promotion Programme: Participants' Perspectives on Capacity Building

    ERIC Educational Resources Information Center

    Keogh, Brian; Daly, Louise; Sharek, Danika; De Vries, Jan; McCann, Edward; Higgins, Agnes

    2016-01-01

    Objectives: The aim of this study was to evaluate a Health Service Executive (HSE) Foundation Programme in Sexual Health Promotion (FPSHP) with a specific emphasis on capacity building. Design: A mixed-method design using both quantitative and qualitative methods was used to collect the data. Setting: The FPSHP was delivered to staff working in…

  12. Force Exertion Capacity Measurements in Haptic Virtual Environments

    ERIC Educational Resources Information Center

    Munih, Marko; Bardorfer, Ales; Ceru, Bojan; Bajd, Tadej; Zupan, Anton

    2010-01-01

    An objective test for evaluating functional status of the upper limbs (ULs) in patients with muscular distrophy (MD) is presented. The method allows for quantitative assessment of the UL functional state with an emphasis on force exertion capacity. The experimental measurement setup and the methodology for the assessment of maximal exertable force…

  13. Evaluation of the ruminal bacterial diversity of cattle fed diets containing citrus pulp pellets

    USDA-ARS?s Scientific Manuscript database

    The rumen microbial ecosystem remains a mystery from a quantitative perspective. Dietary components and changes cause shifts in the ruminal microbial ecology that can play a role in animal health and productivity, but the magnitude of these changes remains unknown. The objective of this study was ...

  14. Nutrient uptake, biomass yield and quantitative analysis of aliphatic aldehydes in cilantro plants

    USDA-ARS?s Scientific Manuscript database

    The objective of this study was to evaluate the nutrient uptake, biomass production and yield of the major compounds in the essential oil of five genotypes of Coriandrum sativum L. The treatments were four accessions donated by the National Genetic Resources Advisory Council (NGRAC), U.S. Department...

  15. Model-assisted development of a laminography inspection system

    NASA Astrophysics Data System (ADS)

    Grandin, R.; Gray, J.

    2012-05-01

    Traditional computed tomography (CT) is an effective method of determining the internal structure of an object through non-destructive means; however, inspection of certain objects, such as those with planar geometries or with limited access, requires an alternate approach. One alternative is laminography, which has been the focus of a number of researchers in the past decade for both medical and industrial inspections. Many research efforts rely on geometrically-simple analytical models, such as the Shepp-Logan phantom, for the development of their algorithms. Recent work at the Center for Non-Destructive Evaluation makes extensive use of a forward model, XRSIM, to study artifacts arising from the reconstruction method, the effects of complex geometries and known issues such as high density features on the laminography reconstruction process. The use of a model provides full knowledge of all aspects of the geometry and provides a means to quantitatively evaluate the impact of methods designed to reduce artifacts generated by the reconstruction methods or that result from the part geometry. We will illustrate the use of forward simulations to quantitatively assess reconstruction algorithm development and artifact reduction.

  16. Near-infrared fluorescence image quality test methods for standardized performance evaluation

    NASA Astrophysics Data System (ADS)

    Kanniyappan, Udayakumar; Wang, Bohan; Yang, Charles; Ghassemi, Pejhman; Wang, Quanzeng; Chen, Yu; Pfefer, Joshua

    2017-03-01

    Near-infrared fluorescence (NIRF) imaging has gained much attention as a clinical method for enhancing visualization of cancers, perfusion and biological structures in surgical applications where a fluorescent dye is monitored by an imaging system. In order to address the emerging need for standardization of this innovative technology, it is necessary to develop and validate test methods suitable for objective, quantitative assessment of device performance. Towards this goal, we develop target-based test methods and investigate best practices for key NIRF imaging system performance characteristics including spatial resolution, depth of field and sensitivity. Characterization of fluorescence properties was performed by generating excitation-emission matrix properties of indocyanine green and quantum dots in biological solutions and matrix materials. A turbid, fluorophore-doped target was used, along with a resolution target for assessing image sharpness. Multi-well plates filled with either liquid or solid targets were generated to explore best practices for evaluating detection sensitivity. Overall, our results demonstrate the utility of objective, quantitative, target-based testing approaches as well as the need to consider a wide range of factors in establishing standardized approaches for NIRF imaging system performance.

  17. Positioning Animal Welfare in the One Health Concept through Evaluation of an Animal Welfare Center in Skopje, Macedonia.

    PubMed

    Radeski, Miroslav; O'Shea, Helen; De Meneghi, Daniele; Ilieski, Vlatko

    2017-01-01

    The Animal Welfare Center (AWC) in Macedonia was established in 2009. The objectives of the center are animal welfare (AW) education, research, raising public awareness of AW, and increasing cooperation between the stakeholders. One Health (OH) was not the major focus of the AWC work initially, but, rather, a focus that evolved recently. The objective of this study was to evaluate the AWC from the OH perspective as an example case for positioning the AW within the overall OH concept. Three types of evaluation were performed: (1) assessment of OH-ness, by quantitative measurement of the operational and infrastructural aspects of the AWC; (2) impact evaluation, by conducting quantitative surveys on stakeholders and students; and (3) transdisciplinary evaluation, using semi-quantitative evaluation of the links of cooperation between the AWC and the stakeholders in society by the custom designed CACA (Cooperation, Activities, Communication, and Agreement) scoring system. Results for the OH-ness of the AWC showed relatively high scores for OH thinking, planning and working and middle scores for OH learning and sharing dimensions, i.e., dominance of the operational over infrastructural aspects of the AWC. The impact evaluation of the AWC shows that familiarity with the OH concept among stakeholders was low (44% of the respondents). However, there was a commonality among stakeholder's interest about AW and OH. According to the stakeholders' and students' opinions, the influence of AW on Animal, Environmental, and Human Health is relatively high (in the upper third of the 1-10 scale). The transdisciplinary evaluation of the AWC indicated the presence of transdisciplinarity work by the AWC, with a higher focus on the Universities and Research Institutions and some governmental institutions, and less linked with the Non-Governmental Organizations and Professional Associations (Chambers), e.g., the Veterinary Chamber in Macedonia. The evaluations conducted indicated that the AWC's work is closely dedicated to improving animal, environmental, and human health and has a considerable OH role among the stakeholders in the society. This study describes the significant role and importance that AW has in OH.

  18. Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy

    NASA Astrophysics Data System (ADS)

    Sugiyama, Naruhisa; Shirakawa, Tomohiro

    2017-07-01

    The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that enables the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.

  19. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.

  20. CMEIAS color segmentation: an improved computing technology to process color images for quantitative microbial ecology studies at single-cell resolution.

    PubMed

    Gross, Colin A; Reddy, Chandan K; Dazzo, Frank B

    2010-02-01

    Quantitative microscopy and digital image analysis are underutilized in microbial ecology largely because of the laborious task of segmenting foreground object pixels from the background, especially in complex color micrographs of environmental samples. In this paper, we describe an improved computing technology developed to alleviate this limitation. The system's uniqueness is its ability to edit digital images accurately when presented with the difficult yet commonplace challenge of removing background pixels whose three-dimensional color space overlaps the range that defines foreground objects. Image segmentation is accomplished by utilizing algorithms that address color and spatial relationships of user-selected foreground object pixels. Performance of the color segmentation algorithm evaluated on 26 complex micrographs at single pixel resolution had an overall pixel classification accuracy of 99+%. Several applications illustrate how this improved computing technology can successfully resolve numerous challenges of complex color segmentation in order to produce images from which quantitative information can be accurately extracted, thereby gaining new perspectives on the in situ ecology of microorganisms. Examples include improvements in the quantitative analysis of (1) microbial abundance and phylotype diversity of single cells classified by their discriminating color within heterogeneous communities, (2) cell viability, (3) spatial relationships and intensity of bacterial gene expression involved in cellular communication between individual cells within rhizoplane biofilms, and (4) biofilm ecophysiology based on ribotype-differentiated radioactive substrate utilization. The stand-alone executable file plus user manual and tutorial images for this color segmentation computing application are freely available at http://cme.msu.edu/cmeias/. This improved computing technology opens new opportunities for imaging applications where discriminating colors really matter most, thereby strengthening quantitative microscopy-based approaches to advance microbial ecology in situ at individual single-cell resolution.
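
    The core idea of segmenting foreground by the colors of user-selected pixels can be illustrated with the generic Python sketch below, which keeps pixels whose CIELAB distance to any reference color falls below a tolerance. This is a simplified stand-in for illustration only; it does not implement the CMEIAS algorithm, and the distance threshold is an arbitrary assumed parameter.

        import numpy as np
        from skimage import color

        def segment_by_reference_colors(rgb_image, reference_rgb, max_dist=12.0):
            """Foreground mask: keep pixels close in CIELAB space to any user-selected color (sketch)."""
            rgb = rgb_image.astype(float) / 255.0 if rgb_image.dtype == np.uint8 else rgb_image
            lab = color.rgb2lab(rgb)
            refs = np.asarray(reference_rgb, dtype=float) / 255.0    # list of (R, G, B) picks, 0-255
            refs_lab = color.rgb2lab(refs[np.newaxis, :, :])[0]
            dists = np.stack([np.linalg.norm(lab - r, axis=-1) for r in refs_lab])
            return dists.min(axis=0) < max_dist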

  1. Ground-water hydraulics - A summary of lectures presented by John G. Ferris at short courses conducted by the Ground Water Branch, part 1, Theory

    USGS Publications Warehouse

    Knowles, D.B.

    1955-01-01

    The objective of the Ground Water Branch is to evaluate the occurrence, availability, and quality of ground water.  The science of ground-water hydrology is applied toward attaining that goal.  Although many ground-water investigations are of a qualitative nature, quantitative studies are necessarily an integral component of the complete evaluation of occurrence and availability.  The worth of an aquifer as a fully developed source of water depends largely on two inherent characteristics: its ability to store, and its ability to transmit water.  Furthermore, quantitative knowledge of these characteristics facilitates measurement of hydrologic entities such as recharge, leakage, evapotranspiration, etc.  It is recognized that these two characteristics, referred to as the coefficients of storage and transmissibility, generally provide the very foundation on which quantitative studies are constructed.  Within the science of ground-water hydrology, ground-water hydraulics methods are applied to determine these constants from field data.

  2. Clinical study of quantitative diagnosis of early cervical cancer based on the classification of acetowhitening kinetics

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.

    2010-03-01

    A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.

  3. Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.

    PubMed

    He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan

    2009-07-01

    Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. Support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
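
    A minimal sketch of two building blocks named in the abstract — uniform LBP histograms and the resistor-average distance between the two facial sides — using scikit-image and SciPy on synthetic patches. The block processing schemes, multiresolution extension, and SVM grading stage are omitted, and the image data below are assumptions, not patient videos.

```python
import numpy as np
from skimage.feature import local_binary_pattern
from scipy.stats import entropy

def lbp_histogram(region, P=8, R=1):
    """Uniform LBP histogram of a grayscale image region."""
    codes = local_binary_pattern(region, P, R, method='uniform')
    n_bins = P + 2                       # uniform patterns plus one non-uniform bin
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist + 1e-10                  # avoid zero bins in the KL divergence

def resistor_average_distance(p, q):
    """RAD(p, q) = D(p||q) D(q||p) / (D(p||q) + D(q||p))."""
    d_pq, d_qp = entropy(p, q), entropy(q, p)
    return 0.0 if d_pq + d_qp == 0 else d_pq * d_qp / (d_pq + d_qp)

# Symmetry score between the two sides of the face (synthetic crops)
rng = np.random.default_rng(0)
left = rng.integers(0, 256, (64, 64), dtype=np.uint8)
right = np.fliplr(left).copy()
right[::6, ::6] = 0                      # simulate slight asymmetry on one side
print(resistor_average_distance(lbp_histogram(left), lbp_histogram(np.fliplr(right))))
```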

  4. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data

    PubMed Central

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi

    2015-01-01

    Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366

  5. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data.

    PubMed

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H L; Onami, Shuichi

    2015-04-01

    Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  6. A Simple Configuration for Quantitative Phase Contrast Microscopy of Transmissible Samples

    NASA Astrophysics Data System (ADS)

    Sengupta, Chandan; Dasgupta, Koustav; Bhattacharya, K.

    Phase microscopy attempts to visualize and quantify the phase distribution of samples which are otherwise invisible under the microscope without the use of stains. The two principal approaches to phase microscopy are essentially those of Fourier plane modulation and interferometric techniques. Although the former, first proposed by Zernike, had been the harbinger of phase microscopy, it was the latter that allowed for quantitative evaluation of phase samples. However, interferometric techniques are fraught with associated problems such as a complicated setup involving mirrors and beam-splitters, the need for a matched objective in the reference arm, and the need for vibration isolation. The present work proposes a single element cube beam-splitter (CBS) interferometer combined with a microscope objective (MO) for interference microscopy. Because of the monolithic nature of the interferometer, the system is almost insensitive to vibrations and relatively simple to align. It will be shown that phase shifting properties may also be introduced by the suitable use of polarizing devices. Initial results showing the quantitative three-dimensional phase profiles of simulated and actual biological specimens are presented.

  7. MR Morphology of Triangular Fibrocartilage Complex: Correlation with Quantitative MR and Biomechanical Properties

    PubMed Central

    Bae, Won C.; Ruangchaijatuporn, Thumanoon; Chang, Eric Y; Biswas, Reni; Du, Jiang; Statum, Sheronda

    2016-01-01

    Objective To evaluate pathology of the triangular fibrocartilage complex (TFCC) using high resolution morphologic magnetic resonance (MR) imaging, and compare with quantitative MR and biomechanical properties. Materials and Methods Five cadaveric wrists (22 to 70 yrs) were imaged at 3T using morphologic (proton density weighted spin echo, PD FS, and 3D spoiled gradient echo, 3D SPGR) and quantitative MR sequences to determine T2 and T1rho properties. In eight geographic regions, morphology of TFC disc and laminae were evaluated for pathology and quantitative MR values. Samples were disarticulated and biomechanical indentation testing was performed on the distal surface of the TFC disc. Results On morphologic PD SE images, TFC disc pathology included degeneration and tears, while that of the laminae included degeneration, degeneration with superimposed tear, mucinous transformation, and globular calcification. Punctate calcifications were highly visible on 3D SPGR images and found only in pathologic regions. Disc pathology occurred more frequently in proximal regions of the disc than distal regions. Quantitative MR values were lowest in normal samples, and generally higher in pathologic regions. Biomechanical testing demonstrated an inverse relationship, with indentation modulus being high in normal regions with low MR values. The laminae studied were mostly pathologic, and additional normal samples are needed to discern quantitative changes. Conclusion These results show technical feasibility of morphologic MR, quantitative MR, and biomechanical techniques to characterize pathology of the TFCC. Quantitative MRI may be a suitable surrogate marker of soft tissue mechanical properties, and a useful adjunct to conventional morphologic MR techniques. PMID:26691643

  8. Statistical significance of trace evidence matches using independent physicochemical measurements

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George

    1997-02-01

    A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to,' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when that object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated given these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals: Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons also is discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach, and ways of overcoming these obstacles are presented.
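
    A minimal sketch of the chemometric grouping step on hypothetical glass measurements (refractive index plus a few element concentrations): standardize the variables so they contribute comparably, then cluster the fragments hierarchically. The numbers, the clustering settings, and the use of SciPy are illustrative assumptions, not casework data or the authors' exact procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical measurements for five glass fragments:
# rows = fragments, columns = [refractive index, Mg, Al, Ca, Fe] (ppm for metals)
X = np.array([
    [1.5181, 210., 95., 6400., 310.],
    [1.5183, 208., 97., 6420., 305.],   # likely same source as fragment 1
    [1.5215, 150., 240., 7100., 520.],
    [1.5218, 152., 238., 7090., 515.],  # likely same source as fragment 3
    [1.5302, 300., 60., 5200., 800.],
])

# Standardize each variable so RI and trace elements are on comparable scales
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Agglomerative clustering on Euclidean distances between standardized profiles
tree = linkage(pdist(Z), method='average')
labels = fcluster(tree, t=3, criterion='maxclust')
print(labels)   # fragments grouped into (at most) three clusters
```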

  9. Computer-aided analysis with Image J for quantitatively assessing psoriatic lesion area.

    PubMed

    Sun, Z; Wang, Y; Ji, S; Wang, K; Zhao, Y

    2015-11-01

    Body surface area is important in determining the severity of psoriasis. However, an objective, reliable, and practical method is still needed for this purpose. We performed a computer image analysis (CIA) of psoriatic area using the Image J freeware to determine whether this method could be used for objective evaluation of psoriatic area. Fifteen psoriasis patients were randomized to be treated with adalimumab or placebo in a clinical trial. At each visit, the psoriasis area of each body site was estimated by two physicians (E-method), and standard photographs were taken. The psoriasis area in the pictures was assessed with CIA using semi-automatic threshold selection (T-method), or manual selection (M-method, gold standard). The results assessed by the three methods were analyzed with reliability and affecting factors evaluated. Both T- and E-method correlated strongly with M-method, and T-method had a slightly stronger correlation with M-method. Both T- and E-methods had a good consistency between the evaluators. All three methods were able to detect the change in the psoriatic area after treatment, while the E-method tended to overestimate. The CIA with Image J freeware is reliable and practicable in quantitatively assessing the lesional area of psoriasis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
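
    A rough, hedged analogue of the semi-automatic threshold idea (T-method) in Python rather than Image J: derive a crude redness channel, threshold it with Otsu's method, and report the lesional pixel fraction. The channel definition, the synthetic image, and the use of scikit-image are assumptions for illustration, not the study's protocol.

```python
import numpy as np
from skimage.filters import threshold_otsu

def lesion_area_fraction(rgb):
    """Threshold a crude 'redness' channel with Otsu's method and return the
    fraction of pixels classified as lesion in the pictured body site."""
    rgb = rgb.astype(float)
    redness = rgb[..., 0] - 0.5 * (rgb[..., 1] + rgb[..., 2])  # crude erythema index
    mask = redness > threshold_otsu(redness)
    return mask.mean()

# Synthetic example: a 100x100 "photograph" with a 30x30 red plaque
img = np.full((100, 100, 3), [180, 150, 140], dtype=float)
img[20:50, 20:50] = [200, 90, 90]
print(f"lesional fraction: {lesion_area_fraction(img):.2%}")
```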

  10. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    PubMed

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
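
    As a concrete illustration of the baseline machinery such a meta-analysis builds on, the sketch below pools per-study estimates of a repeatability metric with the classical DerSimonian-Laird random-effects model. The study estimates and variances are hypothetical, and the paper's point is precisely that alternatives to this standard approach may be needed when individual studies are small.

```python
import numpy as np

def dersimonian_laird(estimates, variances):
    """Classical random-effects pooling of per-study estimates of a performance
    metric (e.g., a repeatability coefficient). Returns the pooled estimate,
    its standard error, and the between-study variance tau^2."""
    y, v = np.asarray(estimates, float), np.asarray(variances, float)
    w = 1.0 / v                                    # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)             # Cochran's heterogeneity statistic
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)                        # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical repeatability coefficients from four small test-retest studies
print(dersimonian_laird([0.21, 0.25, 0.18, 0.30], [0.004, 0.006, 0.003, 0.010]))
```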

  11. Meta-analysis of the technical performance of an imaging procedure: Guidelines and statistical methodology

    PubMed Central

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2017-01-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test–retest repeatability data for illustrative purposes. PMID:24872353

  12. Cost analysis of objective resident cataract surgery assessments.

    PubMed

    Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M

    2015-05-01

    To compare 8 ophthalmology resident surgical training tools to determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs in running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on cost of tools, cost of time in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation in operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analyses of ophthalmology resident surgical training tools are needed so residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  13. Toward uniform implementation of parametric map Digital Imaging and Communication in Medicine standard in multisite quantitative diffusion imaging studies.

    PubMed

    Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C

    2018-01-01

    This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
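
    The paper is about metadata standardization rather than a particular algorithm, but the mapped quantity is well defined. The sketch below, assuming the simple mono-exponential signal model, computes an ADC map from two b-values and bundles it with the kind of descriptive metadata (units, scale, model, source b-values) the project found missing; the field names are illustrative placeholders, not the DICOM coded terms adopted by the standard.

```python
import numpy as np

def adc_map(s_low, s_high, b_low=0.0, b_high=800.0):
    """Mono-exponential ADC estimate from two diffusion weightings:
        S(b) = S0 * exp(-b * ADC)  =>  ADC = ln(S_low / S_high) / (b_high - b_low)
    Returns the map in 10^-6 mm^2/s plus illustrative descriptive metadata."""
    s_low = np.clip(np.asarray(s_low, float), 1e-6, None)
    s_high = np.clip(np.asarray(s_high, float), 1e-6, None)
    adc = np.log(s_low / s_high) / (b_high - b_low)          # mm^2/s (b in s/mm^2)
    meta = {"units": "1e-6 mm2/s", "model": "mono-exponential",
            "b_values_s_per_mm2": [b_low, b_high]}
    return adc * 1e6, meta

# Hypothetical phantom voxels with true ADC = 1100 x 10^-6 mm^2/s
s0 = np.array([1000.0, 900.0])
s800 = s0 * np.exp(-800 * 1.1e-3)
print(adc_map(s0, s800))
```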

  14. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easy applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easy distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
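
    A minimal sketch of the two-step point counting described above, using hypothetical counts: a Cavalieri-type estimate of defect volume from systematically sampled sections, followed by tissue fractions from category-assigned points. The grid constants and counts below are assumptions for illustration only.

```python
import numpy as np

def cavalieri_volume(points_per_section, area_per_point_mm2, section_spacing_mm):
    """Cavalieri estimator: V = (sum of counted points) x (area per point) x (spacing)."""
    return np.sum(points_per_section) * area_per_point_mm2 * section_spacing_mm

def tissue_fractions(counts):
    """Second counting step: fraction of points assigned to each tissue category."""
    total = sum(counts.values())
    return {name: n / total for name, n in counts.items()}

# Hypothetical counts from 8 systematically sampled sections through a defect
points = [12, 18, 25, 30, 28, 22, 15, 9]
print("defect volume (mm^3):", cavalieri_volume(points, area_per_point_mm2=0.04,
                                                section_spacing_mm=0.7))
print(tissue_fractions({"hyaline cartilage": 55, "fibrocartilage": 60,
                        "fibrous tissue": 30, "bone": 10, "scaffold": 4}))
```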

  15. [Application of three risk assessment models in occupational health risk assessment of dimethylformamide].

    PubMed

    Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J

    2016-08-20

    Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method in occupational health risk in enterprises using dimethylformamide (DMF) in a certain area in Jiangsu, China, and to put forward related risk control measures. Methods: The industries involving DMF exposure in Jiangsu province were chosen as the evaluation objects in 2013 and three risk assessment models were used in the evaluation. EPA inhalation risk assessment model: HQ = EC/RfC; Singapore semi-quantitative risk assessment model: Risk = (HR × ER)^(1/2); occupational hazards risk assessment index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The results of the hazard quotient (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The results of the Singapore semi-quantitative risk assessment model indicated that the workshop risk levels of dry method, wet method and printing were 3.5 (high), 3.5 (high) and 2.8 (general), and the position risk levels of pasting, burdening, unreeling, rolling and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general) and 2.8 (general). The results of the occupational hazards risk assessment index method demonstrated that the position risk indexes of pasting, burdening, unreeling, rolling and assisting were 42 (high), 33 (high), 23 (middle), 21 (middle) and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated all the workshops and positions were high risk. Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions and can comprehensively and accurately evaluate occupational health risk caused by DMF.
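
    A minimal sketch of the three scoring rules as reconstructed from the abstract; the exponent form of the index and all example values below are assumptions for illustration, not the study's measurements or the original level definitions.

```python
import math

def epa_hazard_quotient(exposure_conc, reference_conc):
    """EPA inhalation model: HQ = EC / RfC (HQ > 1 flags unacceptable risk)."""
    return exposure_conc / reference_conc

def singapore_risk(hazard_rating, exposure_rating):
    """Singapore semi-quantitative model: Risk = sqrt(HR x ER), both rated 1-5."""
    return math.sqrt(hazard_rating * exposure_rating)

def hazard_index(health_effect_level, exposure_ratio_level, operation_level):
    """Occupational-hazards risk index as reconstructed above:
    Index = 2^(health effect level) x 2^(exposure ratio level) x operation level."""
    return 2 ** health_effect_level * 2 ** exposure_ratio_level * operation_level

# Illustrative values only
print(epa_hazard_quotient(exposure_conc=0.30, reference_conc=0.10))   # HQ = 3.0 -> high
print(round(singapore_risk(4, 3), 1))                                 # 3.5 -> high
print(round(hazard_index(3, 2, 1.3)))                                 # index around 42
```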

  16. A Quantitative and Model-Driven Approach to Assessing Higher Education in the United States of America

    ERIC Educational Resources Information Center

    Huang, Zuqing; Qiu, Robin G.

    2016-01-01

    University ranking or higher education assessment in general has been attracting more and more public attention over the years. However, the subjectivity-based evaluation index and indicator selections and weights that are widely adopted in most existing ranking systems have been called into question. In other words, the objectivity and…

  17. Further Evidence of Complex Motor Dysfunction in Drug Naive Children with Autism Using Automatic Motion Analysis of Gait

    ERIC Educational Resources Information Center

    Nobile, Maria; Perego, Paolo; Piccinini, Luigi; Mani, Elisa; Rossi, Agnese; Bellina, Monica; Molteni, Massimo

    2011-01-01

    In order to increase the knowledge of locomotor disturbances in children with autism, and of the mechanism underlying them, the objective of this exploratory study was to reliably and quantitatively evaluate linear gait parameters (spatio-temporal and kinematic parameters), upper body kinematic parameters, walk orientation and smoothness using an…

  18. User and System-Based Quality Criteria for Evaluating Information Resources and Services Available from Federal Websites: Final Report.

    ERIC Educational Resources Information Center

    Wyman, Steven K.; And Others

    This exploratory study establishes analytical tools (based on both technical criteria and user feedback) by which federal Web site administrators may assess the quality of their websites. The study combined qualitative and quantitative data collection techniques to achieve the following objectives: (1) identify and define key issues regarding…

  19. A Meta-Analysis of Interventions for Bereaved Children and Adolescents

    ERIC Educational Resources Information Center

    Rosner, Rita; Kruse, Joachim; Hagl, Maria

    2010-01-01

    The main objective of this review was to provide a quantitative and methodologically sound evaluation of existing treatments for bereavement and grief reactions in children and adolescents. Two meta-analyses were conducted: 1 on controlled studies and 1 on uncontrolled studies. The 2 meta-analyses were based on a total of 27 treatment studies…

  20. Discomfort Evaluation of Truck Ingress/Egress Motions Based on Biomechanical Analysis

    PubMed Central

    Choi, Nam-Chul; Lee, Sang Hun

    2015-01-01

    This paper presents a quantitative discomfort evaluation method based on biomechanical analysis results for human body movement, as well as its application to an assessment of the discomfort for truck ingress and egress. In this study, the motions of a human subject entering and exiting truck cabins with different types, numbers, and heights of footsteps were first measured using an optical motion capture system and load sensors. Next, the maximum voluntary contraction (MVC) ratios of the muscles were calculated through a biomechanical analysis of the musculoskeletal human model for the captured motion. Finally, the objective discomfort was evaluated using the proposed discomfort model based on the MVC ratios. To validate this new discomfort assessment method, human subject experiments were performed to investigate the subjective discomfort levels through a questionnaire for comparison with the objective discomfort levels. The validation results showed that the correlation between the objective and subjective discomforts was significant and could be described by a linear regression model. PMID:26067194

  1. The Impact of Quantitative Data Provided by a Multi-spectral Digital Skin Lesion Analysis Device on Dermatologists' Decisions to Biopsy Pigmented Lesions.

    PubMed

    Farberg, Aaron S; Winkelmann, Richard R; Tucker, Natalie; White, Richard; Rigel, Darrell S

    2017-09-01

    BACKGROUND: Early diagnosis of melanoma is critical to survival. New technologies, such as a multi-spectral digital skin lesion analysis (MSDSLA) device [MelaFind, STRATA Skin Sciences, Horsham, Pennsylvania], may be useful to enhance clinician evaluation of concerning pigmented skin lesions. Previous studies evaluated the effect of only the binary output. OBJECTIVE: The objective of this study was to determine how decisions dermatologists make regarding pigmented lesion biopsies are impacted by providing both the underlying classifier score (CS) and associated probability risk provided by multi-spectral digital skin lesion analysis. This outcome was also compared against the improvement reported with the provision of only the binary output. METHODS: Dermatologists attending an educational conference evaluated 50 pigmented lesions (25 melanomas and 25 benign lesions). Participants were asked if they would biopsy the lesion based on clinical images, and were asked this question again after being shown multi-spectral digital skin lesion analysis data that included the probability graphs and classifier score. RESULTS: Data were analyzed from a total of 160 United States board-certified dermatologists. Biopsy sensitivity for melanoma improved from 76 percent following clinical evaluation to 92 percent after quantitative multi-spectral digital skin lesion analysis information was provided (p < 0.0001). Specificity improved from 52 percent to 79 percent (p < 0.0001). The positive predictive value increased from 61 percent to 81 percent (p < 0.01) when the quantitative data were provided. Negative predictive value also increased (68% vs. 91%, p < 0.01), and overall biopsy accuracy was greater with multi-spectral digital skin lesion analysis (64% vs. 86%, p < 0.001). Interrater reliability improved (intraclass correlation 0.466 before, 0.559 after). CONCLUSION: Incorporating the classifier score and probability data into physician evaluation of pigmented lesions led to both increased sensitivity and specificity, thereby resulting in more accurate biopsy decisions.
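
    The headline numbers are standard confusion-matrix statistics. A minimal sketch with hypothetical single-reader counts, chosen only to roughly match the aggregate percentages reported above for the 25-melanoma/25-benign set, shows how sensitivity, specificity, PPV, NPV, and accuracy are derived; these are not the study's raw data.

```python
def biopsy_metrics(tp, fn, tn, fp):
    """Diagnostic metrics from biopsy decisions against histopathology truth."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fn + tn + fp),
    }

# Hypothetical counts for one reader evaluating 25 melanomas and 25 benign lesions
print(biopsy_metrics(tp=19, fn=6, tn=13, fp=12))   # before seeing the MSDSLA data
print(biopsy_metrics(tp=23, fn=2, tn=20, fp=5))    # after seeing the MSDSLA data
```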

  2. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  3. Quantification, validation, and follow-up of small bowel motility in Crohn's disease

    NASA Astrophysics Data System (ADS)

    Cerrolaza, Juan J.; Peng, Jennifer Q.; Safdar, Nabile M.; Conklin, Laurie; Sze, Raymond; Linguraru, Marius George

    2015-03-01

    The use of magnetic resonance enterography (MRE) has become a mainstay in the evaluation, assessment and follow up of inflammatory bowel diseases, such as Crohn's disease (CD), thanks to its high image quality and its non-ionizing nature. In particular, the advent of faster MRE sequences less sensitive to image-motion artifacts offers the possibility to obtain visual, structural and functional information of the patient's small bowel. However, the inherent subjectivity of the mere visual inspection of these images often hinders the accurate identification and monitoring of the pathological areas. In this paper, we present a framework that provides quantitative and objective motility information of the small bowel from free-breathing MRE dynamic sequences. After compensating for the breathing motion of the patient, we create personalized peristaltic activity maps via optical flow analysis. The result is the creation of a new set of images providing objective and precise functional information of the small bowel. The accuracy of the new method was also evaluated from two different perspectives: objective accuracy (1.1 ± 0.6 mm/s of error), i.e., the ability of the system to provide quantitative and accurate information about the motility of moving bowel landmarks, and subjective accuracy (avg. difference of 0.7 ± 0.7 in a range of 1 to 5), i.e., the degree of agreement with the subjective evaluation of an expert. Finally, the practical utility of the new method was successfully evaluated in a preliminary study with 32 studies of healthy and CD cases, showing its potential for the fast and accurate assessment and follow up of CD in the small bowel.
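
    A heavily simplified sketch of the optical-flow step, assuming pre-registered frames (no breathing-motion compensation) and using OpenCV's Farneback estimator as a stand-in for the authors' implementation; it returns a per-pixel average speed that plays the role of a peristaltic activity map. The frame data and parameter values are assumptions.

```python
import numpy as np
import cv2

def motility_map(frames):
    """Mean optical-flow speed per pixel across a dynamic series.

    frames : list of 2-D uint8 arrays (already registered time frames)
    Returns an array of average displacement magnitude (pixels/frame).
    """
    speeds = np.zeros(frames[0].shape, dtype=float)
    for prev, curr in zip(frames[:-1], frames[1:]):
        # Farneback dense optical flow (pyr_scale, levels, winsize, iterations,
        # poly_n, poly_sigma, flags) with generic default-style settings
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        speeds += np.linalg.norm(flow, axis=2)
    return speeds / (len(frames) - 1)

# Tiny synthetic example: a bright blob drifting one pixel per frame
frames = []
for t in range(5):
    f = np.zeros((64, 64), np.uint8)
    f[20:30, 20 + t:30 + t] = 255
    frames.append(f)
print(motility_map(frames).max())
```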

  4. Simple Learned Weighted Sums of Inferior Temporal Neuronal Firing Rates Accurately Predict Human Core Object Recognition Performance.

    PubMed

    Majaj, Najib J; Hong, Ha; Solomon, Ethan A; DiCarlo, James J

    2015-09-30

    To go beyond qualitative models of the biological substrate of object recognition, we ask: can a single ventral stream neuronal linking hypothesis quantitatively account for core object recognition performance over a broad range of tasks? We measured human performance in 64 object recognition tests using thousands of challenging images that explore shape similarity and identity preserving object variation. We then used multielectrode arrays to measure neuronal population responses to those same images in visual areas V4 and inferior temporal (IT) cortex of monkeys and simulated V1 population responses. We tested leading candidate linking hypotheses and control hypotheses, each postulating how ventral stream neuronal responses underlie object recognition behavior. Specifically, for each hypothesis, we computed the predicted performance on the 64 tests and compared it with the measured pattern of human performance. All tested hypotheses based on low- and mid-level visually evoked activity (pixels, V1, and V4) were very poor predictors of the human behavioral pattern. However, simple learned weighted sums of distributed average IT firing rates exactly predicted the behavioral pattern. More elaborate linking hypotheses relying on IT trial-by-trial correlational structure, finer IT temporal codes, or ones that strictly respect the known spatial substructures of IT ("face patches") did not improve predictive power. Although these results do not reject those more elaborate hypotheses, they suggest a simple, sufficient quantitative model: each object recognition task is learned from the spatially distributed mean firing rates (100 ms) of ∼60,000 IT neurons and is executed as a simple weighted sum of those firing rates. Significance statement: We sought to go beyond qualitative models of visual object recognition and determine whether a single neuronal linking hypothesis can quantitatively account for core object recognition behavior. To achieve this, we designed a database of images for evaluating object recognition performance. We used multielectrode arrays to characterize hundreds of neurons in the visual ventral stream of nonhuman primates and measured the object recognition performance of >100 human observers. Remarkably, we found that simple learned weighted sums of firing rates of neurons in monkey inferior temporal (IT) cortex accurately predicted human performance. Although previous work led us to expect that IT would outperform V4, we were surprised by the quantitative precision with which simple IT-based linking hypotheses accounted for human behavior. Copyright © 2015 the authors 0270-6474/15/3513402-17$15.00/0.
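
    A toy sketch of the kind of linking hypothesis described above, run entirely on simulated data (nothing below uses the actual neuronal recordings): a learned weighted sum of population firing rates, fit by least squares on a training split and evaluated on held-out images. The population size, noise levels, and task are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated "IT population": 200 neurons, mean firing rates for 300 images of
# two objects, with weak per-neuron object selectivity plus noise
n_neurons, n_images = 200, 300
labels = rng.integers(0, 2, n_images)                 # object identity per image
selectivity = rng.normal(0, 1, n_neurons)             # per-neuron object preference
rates = rng.normal(10, 2, (n_images, n_neurons)) + np.outer(labels - 0.5, selectivity)

# "Simple learned weighted sum": least-squares weights on a training split
train, test = slice(0, 200), slice(200, 300)
X_train = np.c_[rates[train], np.ones(200)]           # add a bias column
w, *_ = np.linalg.lstsq(X_train, labels[train] * 2.0 - 1.0, rcond=None)

X_test = np.c_[rates[test], np.ones(100)]
pred = (X_test @ w > 0).astype(int)
print("held-out accuracy of the weighted-sum readout:", (pred == labels[test]).mean())
```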

  5. Industrial application of thermal image processing and thermal control

    NASA Astrophysics Data System (ADS)

    Kong, Lingxue

    2001-09-01

    Industrial application of infrared thermography is virtually boundless as it can be used in any situations where there are temperature differences. This technology has particularly been widely used in automotive industry for process evaluation and system design. In this work, thermal image processing technique will be introduced to quantitatively calculate the heat stored in a warm/hot object and consequently, a thermal control system will be proposed to accurately and actively manage the thermal distribution within the object in accordance with the heat calculated from the thermal images.

  6. C-reactive protein estimation: a quantitative analysis for three nonsteroidal anti-inflammatory drugs: a randomized control trial.

    PubMed

    Salgia, Gaurav; Kulkarni, Deepak G; Shetty, Lakshmi

    2015-01-01

    C-reactive protein (CRP) estimation was used as a quantitative measure of the anti-inflammatory action of nonsteroidal anti-inflammatory drugs (NSAIDs) after maxillofacial surgery. This study evaluated the efficacy of CRP as a quantitative marker for objective assessment of the efficacy of three NSAIDs in postoperative inflammation and pain control. A parallel-group randomized design was used, with 60 patients divided into three groups. CRP was evaluated at baseline and postoperatively (immediately and at 72 h) after surgical removal of an impacted lower third molar. Each group received its drug by random coding postoperatively. Pain control and inflammation under the three NSAIDs after surgical removal of the impacted lower third molar were assessed qualitatively and quantitatively using CRP levels. Blood samples were assessed immediately postoperatively and after 72 h. The visual analog scale (VAS) was used for assessment of pain and its correlation with CRP levels. The difference between immediate postoperative and baseline CRP levels was significant (P < 0.05). The association between duration of surgery and CRP levels was nonsignificant (P = 0.425). The pain score was higher with mefenamic acid (P = 0.003), which was significant on the VAS. Diclofenac had the best anti-inflammatory action. There was a significant increase in CRP levels at the immediate postoperative and 72 h time points. The CRP test proved to be a useful indicator as a quantitative assessment tool for monitoring postsurgical inflammation and the therapeutic effects of various anti-inflammatory drugs, and a useful indicator for quantitative, comparative evaluation of NSAIDs.

  7. Multiple and mixed methods in formative evaluation: Is more better? Reflections from a South African study.

    PubMed

    Odendaal, Willem; Atkins, Salla; Lewin, Simon

    2016-12-15

    Formative programme evaluations assess intervention implementation processes, and are seen widely as a way of unlocking the 'black box' of any programme in order to explore and understand why a programme functions as it does. However, few critical assessments of the methods used in such evaluations are available, and there are especially few that reflect on how well the evaluation achieved its objectives. This paper describes a formative evaluation of a community-based lay health worker programme for TB and HIV/AIDS clients across three low-income communities in South Africa. It assesses each of the methods used in relation to the evaluation objectives, and offers suggestions on ways of optimising the use of multiple, mixed-methods within formative evaluations of complex health system interventions. The evaluation's qualitative methods comprised interviews, focus groups, observations and diary keeping. Quantitative methods included a time-and-motion study of the lay health workers' scope of practice and a client survey. The authors conceptualised and conducted the evaluation, and through iterative discussions, assessed the methods used and their results. Overall, the evaluation highlighted programme issues and insights beyond the reach of traditional single methods evaluations. The strengths of the multiple, mixed-methods in this evaluation included a detailed description and nuanced understanding of the programme and its implementation, and triangulation of the perspectives and experiences of clients, lay health workers, and programme managers. However, the use of multiple methods needs to be carefully planned and implemented as this approach can overstretch the logistic and analytic resources of an evaluation. For complex interventions, formative evaluation designs including multiple qualitative and quantitative methods hold distinct advantages over single method evaluations. However, their value is not in the number of methods used, but in how each method matches the evaluation questions and the scientific integrity with which the methods are selected and implemented.

  8. Estimation of 3D reconstruction errors in a stereo-vision system

    NASA Astrophysics Data System (ADS)

    Belhaoua, A.; Kohler, S.; Hirsch, E.

    2009-06-01

    The paper presents an approach for error estimation for the various steps of an automated 3D vision-based reconstruction procedure of manufactured workpieces. The process is based on a priori planning of the task and built around a cognitive intelligent sensory system using so-called Situation Graph Trees (SGT) as a planning tool. Such an automated quality control system requires the coordination of a set of complex processes performing sequentially data acquisition, its quantitative evaluation and the comparison with a reference model (e.g., CAD object model) in order to evaluate quantitatively the object. To ensure efficient quality control, the aim is to be able to state if reconstruction results fulfill tolerance rules or not. Thus, the goal is to evaluate independently the error for each step of the stereo-vision based 3D reconstruction (e.g., for calibration, contour segmentation, matching and reconstruction) and then to estimate the error for the whole system. In this contribution, we analyze particularly the segmentation error due to localization errors for extracted edge points supposed to belong to lines and curves composing the outline of the workpiece under evaluation. The fitting parameters describing these geometric features are used as quality measure to determine confidence intervals and finally to estimate the segmentation errors. These errors are then propagated through the whole reconstruction procedure, enabling to evaluate their effect on the final 3D reconstruction result, specifically on position uncertainties. Lastly, analysis of these error estimates enables to evaluate the quality of the 3D reconstruction, as illustrated by the shown experimental results.

  9. [Integral quantitative evaluation of working conditions in the construction industry].

    PubMed

    Guseĭnov, A A

    1993-01-01

    The present method of evaluating the quality of the environment (using MAC and MAL values) does not make it possible to assess the working conditions of the building industry completely and objectively, owing to multiple confounding elements. A solution to this complicated problem, which includes the analysis of the various correlated elements of the system "human--work conditions--environment", may be supported by a social norm of morbidity that is independent of the industrial and natural environment. The complete integral assessment makes it possible to see the whole situation and to reveal the points at risk.

  10. Quantitative Imaging Biomarkers of NAFLD

    PubMed Central

    Kinner, Sonja; Reeder, Scott B.

    2016-01-01

    Conventional imaging modalities, including ultrasonography (US), computed tomography (CT), and magnetic resonance (MR), play an important role in the diagnosis and management of patients with nonalcoholic fatty liver disease (NAFLD) by allowing noninvasive diagnosis of hepatic steatosis. However, conventional imaging modalities are limited as biomarkers of NAFLD for various reasons. Multi-parametric quantitative MRI techniques overcome many of the shortcomings of conventional imaging and allow comprehensive and objective evaluation of NAFLD. MRI can provide unconfounded biomarkers of hepatic fat, iron, and fibrosis in a single examination—a virtual biopsy has become a clinical reality. In this article, we will review the utility and limitation of conventional US, CT, and MR imaging for the diagnosis NAFLD. Recent advances in imaging biomarkers of NAFLD are also discussed with an emphasis in multi-parametric quantitative MRI. PMID:26848588

  11. The Effect of Air Density on Atmospheric Electric Fields Required for Lightning Initiation from a Long Airborne Object

    NASA Technical Reports Server (NTRS)

    Bazelyan, E. M.; Aleksandrov, N. L.; Raizer, Yu. Pl.; Konchankov, A. M.

    2006-01-01

    The purpose of the work was to determine minimum atmospheric electric fields required for lightning initiation from an airborne vehicle at various altitudes up to 10 km. The problem was reduced to the determination of a condition for initiation of a viable positive leader from a conductive object in an ambient electric field. It was shown that, depending on air density and shape and dimensions of the object, critical atmospheric fields are governed by the condition for leader viability or that for corona onset. To establish quantitative criteria for reduced air densities, available observations of spark discharges in long laboratory gaps were analyzed, the effect of air density on leader velocity was discussed and evolution in time of the properties of plasma in the leader channel was numerically simulated. The results obtained were used to evaluate the effect of pressure on the quantitative relationships between the potential difference near the leader tip, leader current and its velocity; based on these relationships, criteria for steady development of a leader were determined for various air pressures. Atmospheric electric fields required for lightning initiation from rods and ellipsoidal objects of various dimensions were calculated at different air densities. It was shown that there is no simple way to extend critical ambient fields obtained for some given objects and pressures to other objects and pressures.

  12. The book availability study as an objective measure of performance in a health sciences library.

    PubMed Central

    Kolner, S J; Welch, E C

    1985-01-01

    In its search for an objective overall diagnostic evaluation, the University of Illinois Library of the Health Sciences' Program Evaluation Committee selected a book availability measure; it is easy to administer and repeat, results are reproducible, and comparable data exist for other academic and health sciences libraries. The study followed the standard methodology in the literature with minor modifications. Patrons searching for particular books were asked to record item(s) needed and the outcome of the search. Library staff members then determined the reasons for failures in obtaining desired items. The results of the study are five performance scores. The first four represent the percentage probability of a library's operating with ideal effectiveness; the last provides an overall performance score. The scores of the Library of the Health Sciences demonstrated no unusual availability problems. The study was easy to implement and provided meaningful, quantitative, and objective data. PMID:3995202

  13. Quantitative evaluation of toothbrush and arm-joint motion during tooth brushing.

    PubMed

    Inada, Emi; Saitoh, Issei; Yu, Yong; Tomiyama, Daisuke; Murakami, Daisuke; Takemoto, Yoshihiko; Morizono, Ken; Iwasaki, Tomonori; Iwase, Yoko; Yamasaki, Youichi

    2015-07-01

    It is very difficult for dental professionals to objectively assess tooth brushing skill of patients, because an obvious index to assess the brushing motion of patients has not been established. The purpose of this study was to quantitatively evaluate toothbrush and arm-joint motion during tooth brushing. Tooth brushing motion, performed by dental hygienists for 15 s, was captured using a motion-capture system that continuously calculates the three-dimensional coordinates of object's motion relative to the floor. The dental hygienists performed the tooth brushing on the buccal and palatal sides of their right and left upper molars. The frequencies and power spectra of toothbrush motion and joint angles of the shoulder, elbow, and wrist were calculated and analyzed statistically. The frequency of toothbrush motion was higher on the left side (both buccal and palatal areas) than on the right side. There were no significant differences among joint angle frequencies within each brushing area. The inter- and intra-individual variations of the power spectrum of the elbow flexion angle when brushing were smaller than for any of the other angles. This study quantitatively confirmed that dental hygienists have individual distinctive rhythms during tooth brushing. All arm joints moved synchronously during brushing, and tooth brushing motion was controlled by coordinated movement of the joints. The elbow generated an individual's frequency through a stabilizing movement. The shoulder and wrist control the hand motion, and the elbow generates the cyclic rhythm during tooth brushing.
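
    A minimal sketch of the frequency analysis only, applied to a synthetic joint-angle trace rather than captured motion data: subtract the mean, take the power spectrum, and report the dominant brushing frequency. The sampling rate and signal are assumptions for illustration.

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Dominant frequency (Hz) and power spectrum of a motion-capture signal,
    e.g. a toothbrush coordinate or an elbow flexion angle sampled at fs Hz."""
    x = np.asarray(signal, float) - np.mean(signal)
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1], spectrum    # skip the DC bin

# Synthetic 15 s recording at 100 Hz: a brushing rhythm near 3.2 Hz plus noise
fs = 100
t = np.arange(0, 15, 1.0 / fs)
angle = 12 * np.sin(2 * np.pi * 3.2 * t) + np.random.default_rng(2).normal(0, 1, t.size)
f_dom, _ = dominant_frequency(angle, fs)
print(f"dominant brushing frequency: {f_dom:.2f} Hz")
```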

  14. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation): Comprehensive 3-year progress report for the period January 15, 1986-January 14, 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1988-06-01

    This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 58 refs., 15 figs., 4 tabs.

  15. Quantitative Evaluation of HHFKA Nutrition Standards for School Lunch Servings and Patterns of Consumption

    ERIC Educational Resources Information Center

    Echon, Roger M.

    2014-01-01

    Purpose/Objectives: The purpose of this paper is to provide baseline data and characteristics of food served and consumed prior to the recently mandated nutrition standards as authorized by the Healthy, Hunger-Free Kids Act of 2010 (HHFKA). Methods: Over 600,000 school lunch menus with associated food production records from 61 elementary schools…

  16. Development and Validation of a Quantitative Food Frequency Questionnaire among Rural- and Urban-Dwelling Adults in Colombia

    ERIC Educational Resources Information Center

    Dehghan, Mahshid; Lopez Jaramillo, Patricio; Duenas, Ruby; Anaya, Lilliam Lima; Garcia, Ronald G.; Zhang, Xiaohe; Islam, Shofiqul; Merchant, Anwar T.

    2012-01-01

    Objective: To validate a food frequency questionnaire (FFQ) against multiple 24-hour dietary recalls (DRs) that could be used for Colombian adults. Methods: A convenience sample of 219 individuals participated in the study. The validity of the FFQ was evaluated against multiple DRs. Four dietary recalls were collected during the year, and an FFQ…

  17. Assessment of Residual Stand Quality and Regeneration Following Shelterwood Cutting in Central Appalachian Hardwoods

    Treesearch

    James E. Johnson; Gary W. Miller; John E. Baumgras; Cynthia D. West

    1998-01-01

    Partial cutting to develop two-age stands is a relatively new practice in the central Appalachian region, and forest managers need quantitative information in order to evaluate how well it meets management objectives. Typically, this practice leaves a residual overstory of 10 to 40 ft² per acre of basal area and leads to regeneration of desirable...

  18. A Six‐Stage Workflow for Robust Application of Systems Pharmacology

    PubMed Central

    Gadkar, K; Kirouac, DC; Mager, DE; van der Graaf, PH

    2016-01-01

    Quantitative and systems pharmacology (QSP) is increasingly being applied in pharmaceutical research and development. One factor critical to the ultimate success of QSP is the establishment of commonly accepted language, technical criteria, and workflows. We propose an integrated workflow that bridges conceptual objectives with underlying technical detail to support the execution, communication, and evaluation of QSP projects. PMID:27299936

  19. Evaluation of the ruminal bacterial diversity of cattle fed diets containing citrus pulp pellets (CPP) using bacterial tag-encoded FLX amplicon pyrosequencing (bTEFAP)

    USDA-ARS?s Scientific Manuscript database

    The rumen microbial ecosystem has been extensively studied, but remains a mystery from a quantitative perspective. Dietary components and changes cause shifts in the ruminal microflora that can affect animal health and productivity, but the majority of these changes remain unknown. The objective of ...

  20. Serum Squamous Cell Carcinoma Antigen in Psoriasis: A Potential Quantitative Biomarker for Disease Severity.

    PubMed

    Sun, Ziwen; Shi, Xiaomin; Wang, Yun; Zhao, Yi

    2018-06-05

    An objective and quantitative method to evaluate psoriasis severity is important for practice and research in the precision care of psoriasis. We aimed to explore serum biomarkers quantitatively in association with disease severity and treatment response in psoriasis patients, with serum squamous cell carcinoma antigen (SCCA) evaluated in this pilot study. Fifteen psoriasis patients were treated with adalimumab. At different visits before and after treatment, quantitative body surface area (qBSA) was obtained from standardized digital body images of the patients, and the psoriasis area severity index (PASI) was also monitored. SCCA was measured using a microparticle enzyme immunoassay. The serum biomarkers were also tested in healthy volunteers as normal controls. Receiver-operating characteristic (ROC) curve analysis was used to explore the optimal cutoff point of SCCA to differentiate mild and moderate-to-severe psoriasis. The serum SCCA level in the psoriasis group was significantly higher (p < 0.05) than in the normal control group. After treatment, the serum SCCA levels were significantly decreased (p < 0.05). The SCCA level was well correlated with PASI and qBSA. In ROC analysis, when taking PASI = 10 or qBSA = 10% as the threshold, an optimal cutoff point of SCCA was found at 2.0 ng/mL with the highest Youden index. Serum SCCA might be a useful quantitative biomarker for psoriasis disease severity. © 2018 S. Karger AG, Basel.
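
    A minimal sketch of the cutoff selection step with hypothetical SCCA values rather than the study data: compute the ROC curve with scikit-learn (an assumed tool, not named in the paper) and pick the threshold that maximizes the Youden index.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical serum SCCA values (ng/mL) with severity labels:
# 1 = moderate-to-severe psoriasis (e.g., PASI >= 10), 0 = mild
scca = np.array([0.8, 1.1, 1.4, 1.6, 1.9, 2.1, 2.4, 2.8, 3.5, 4.2])
severe = np.array([0,   0,   0,   0,   1,   1,   0,   1,   1,   1])

fpr, tpr, thresholds = roc_curve(severe, scca)
youden = tpr - fpr                    # Youden index J = sensitivity + specificity - 1
best = np.argmax(youden)
print(f"optimal cutoff ~ {thresholds[best]:.1f} ng/mL (J = {youden[best]:.2f})")
```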

  1. Orthoclinostatic test as one of the methods for evaluating the human functional state

    NASA Technical Reports Server (NTRS)

    Doskin, V. A.; Gissen, L. D.; Bomshteyn, O. Z.; Merkin, E. N.; Sarychev, S. B.

    1980-01-01

    The possible use of different methods to evaluate autonomic regulation in hygienic studies was examined. The simplest and most objective tests were selected. It is shown that the use of the optimized standards not only makes it possible to detect unfavorable shifts earlier, but also permits a quantitative characterization of the degree of impairment in the state of the organism. Precise interpretation of the observed shifts is possible. Results indicate that the standards can serve as one of the criteria for evaluating the state and can be widely used in hygienic practice.

  2. Using normalization 3D model for automatic clinical brain quantitative analysis and evaluation

    NASA Astrophysics Data System (ADS)

    Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping

    2003-05-01

Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain, and has been broadly used in diagnosing brain disorders by clinically quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the corresponding functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied to realign functional medical images to standard structural medical images. Then, the standard 3D brain model, which shows well-defined brain regions, was used to replace the manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in a practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method agree with the clinical diagnosis evaluation score with less than 3% error on average. To sum up, the method automatically obtains precise VOI information from the well-defined standard 3D brain model, sparing the manual slice-by-slice drawing of ROIs on structural medical images required by the traditional procedure. That is, the method not only provides precise analysis results but also improves the processing rate for large volumes of medical images in clinical use.

  3. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040

  4. 3D Slicer as an Image Computing Platform for the Quantitative Imaging Network

    PubMed Central

    Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron

    2012-01-01

    Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of the clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690

  5. An electronic portfolio for quantitative assessment of surgical skills in undergraduate medical education.

    PubMed

    Sánchez Gómez, Serafín; Ostos, Elisa María Cabot; Solano, Juan Manuel Maza; Salado, Tomás Francisco Herrero

    2013-05-06

We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summed to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Thirty-seven of 112 students (33%) used the e-Portfolio, of whom 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. For complex skills, 62% of students achieved the maximum score for participation as an independent operator, and 98% achieved the maximum scores for participation as an observer or assistant. Medical students reported that use of an electronic portfolio that provided quantitative feedback on their progress was useful when the number and complexity of targets were appropriate, but not when the portfolio offered only formative evaluations based on reflection. Students felt that use of the e-Portfolio guided their learning process by indicating knowledge gaps to themselves and teachers.

  6. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    PubMed

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg and with mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported phase III clinical development of solifenacin and mirabegron combination therapy at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
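    As a rough illustration of the multicriteria approach described above, the sketch below computes a weighted-sum clinical utility from normalized attribute scores. The attribute names, weights, and scores are invented for illustration and are not taken from the study.

```python
# Minimal weighted-sum MCDA sketch (illustrative weights and scores, not study data).
def clinical_utility(scores, weights):
    """scores, weights: dicts keyed by attribute; scores normalized to 0-1,
    where 1 is the most preferred level of that attribute."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * scores[k] for k in weights)

weights = {"efficacy": 0.5, "safety": 0.3, "tolerability": 0.2}
treatments = {
    "solifenacin 5 mg":              {"efficacy": 0.55, "safety": 0.80, "tolerability": 0.75},
    "mirabegron 50 mg":              {"efficacy": 0.50, "safety": 0.85, "tolerability": 0.80},
    "solifenacin 5 + mirabegron 50": {"efficacy": 0.75, "safety": 0.70, "tolerability": 0.70},
}
for name, scores in treatments.items():
    print(f"{name}: utility = {clinical_utility(scores, weights):.2f}")
```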

  7. Evaluation of the pain and local tenderness in bone metastasis treated with magnetic resonance-guided focused ultrasound surgery (MRgFUS)

    NASA Astrophysics Data System (ADS)

    Namba, Hirofumi; Kawasaki, Motohiro; Kato, Tomonari; Tani, Toshikazu; Ushida, Takahiro; Koizumi, Norihiro

    2017-03-01

It has been reported that MRgFUS has palliative effects on local pain in patients with bone metastasis. In general, the severity of pain has been evaluated using only subjective methods such as the numerical rating scale (NRS) or visual analogue scale (VAS). It is important to evaluate the local pain-palliative effects of MRgFUS treatment with an objective and quantitative method. The aim of this study is to investigate changes in the severity of local pain of bone metastasis before and after MRgFUS treatment, measuring the pressure pain threshold (PPT) using a pressure algometer, and pain intensity using an electrical stimulation device (the Pain Vision system), at the most painful site of bone metastasis. We conducted MRgFUS for pain palliation of bone metastasis in 8 patients, evaluated local tenderness quantitatively in all 8 patients, and evaluated local pain intensity in 7 patients. Before the treatments, PPTs were 106.3 kPa [40.0-431.5] at the metastatic site and 344.8 kPa [206.0-667.0] at the normal control site, a significant difference. The PPT at the metastatic site showed a significant increase from 106.3 kPa [40.0-431.5] at baseline to 270.5 kPa [93.5-533.5] at 3 months after treatment. The NRS score showed a significant decrease from 6.0 [4-8] at baseline to 1 [0-3] at 3 months after treatment. Similarly, the pain intensity showed a significant decrease from 245 [96.3-888.7] at baseline to 55.9 [2.8-292] at 3 months after treatment. The results of our study illustrate the pain-relieving effects of MRgFUS for the treatment of painful bone metastasis. PPT might be a useful parameter not only for assessing a treatment's effect, but also for deciding which painful area to treat with MRgFUS. Pain Vision seems to be useful for quantitative and objective evaluation of local pain in painful bone metastasis.

  8. Computerized quantitative evaluation of mammographic accreditation phantom images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

Purpose: The objective was to develop and investigate an automated scoring scheme for American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of the region of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fiber, mass, and speck were 90%, 80%, and 98%, respectively. Contingency table analysis revealed significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may provide a stable assessment of the visibility of test objects in mammographic accreditation phantom images, in terms of whether a phantom image meets the ACR's criteria, although there is room for improvement in the approach for fiber and mass objects.
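    The fiber/mass classification step relies on the Mahalanobis distance between a candidate object's features and class statistics learned from examples. A generic sketch of this idea is shown below; the two-dimensional feature vectors are hypothetical and not the features used by the authors.

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of feature vector x from a class with given mean/covariance."""
    diff = x - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

def classify(x, class_stats):
    """Assign x to the class with the smallest Mahalanobis distance."""
    return min(class_stats, key=lambda c: mahalanobis(x, *class_stats[c]))

# Hypothetical 2-D features (e.g., elongation, mean contrast) for 'fiber' vs 'mass'.
rng = np.random.default_rng(0)
fibers = rng.normal([5.0, 0.2], [1.0, 0.05], size=(50, 2))
masses = rng.normal([1.5, 0.4], [0.3, 0.10], size=(50, 2))
stats = {"fiber": (fibers.mean(0), np.cov(fibers.T)),
         "mass":  (masses.mean(0), np.cov(masses.T))}
print(classify(np.array([4.2, 0.25]), stats))   # expected: 'fiber'
```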

  9. Benchmarking quantitative label-free LC-MS data processing workflows using a complex spiked proteomic standard dataset.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorssaeler, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-01-30

Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution and fast sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear as an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design allowed us to finely assess their performance in terms of sensitivity and false discovery rate, by measuring the numbers of true and false positives (respectively, UPS1 or yeast background proteins found to be differential). The spiked standard dataset has been deposited to the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated for their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performances of the data processing workflow. We provide here such a controlled standard dataset and used it to evaluate the performances of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for detection of variant proteins with different absolute expression levels and fold change values. The dataset presented here can be useful for tuning software tool parameters, and also for testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.
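    Because the ground truth of the spiked standard is known (UPS1 proteins are true variants, yeast proteins are not), the core benchmark metrics reduce to simple counting. The sketch below assumes hypothetical sets of protein identifiers; it is not tied to any of the evaluated software outputs.

```python
def benchmark(differential, ups1_proteins, background_proteins):
    """Compute sensitivity and false discovery rate for a set of proteins
    reported as differential, given the known spiked (UPS1) ground truth."""
    tp = len(differential & ups1_proteins)        # spiked proteins correctly found
    fp = len(differential & background_proteins)  # yeast proteins wrongly found
    sensitivity = tp / len(ups1_proteins)
    fdr = fp / max(tp + fp, 1)
    return sensitivity, fdr

# Hypothetical identifiers for illustration only.
ups1 = {f"UPS1_{i}" for i in range(48)}
yeast = {f"YEAST_{i}" for i in range(500)}
reported = {f"UPS1_{i}" for i in range(40)} | {"YEAST_3", "YEAST_7"}
print(benchmark(reported, ups1, yeast))   # -> (0.833..., 0.047...)
```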

  10. Network of TAMCNS: Identifying Influence Regions Within the GCSS-MC Database

    DTIC Science & Technology

    2017-06-01

... relationships between objects and provides tools to quantitatively determine objects whose influence impacts other objects or the system as a whole. This ... methodology identifies the most important TAMCN and provides a list of TAMCNs in order of importance. We also analyze the community and core structure of ...

  11. Quantitative electromyography in ambulatory boys with Duchenne muscular dystrophy.

    PubMed

    Verma, Sumit; Lin, Jenny; Travers, Curtis; McCracken, Courtney; Shah, Durga

    2017-12-01

    This study's objective was to evaluate quantitative electromyography (QEMG) using multiple-motor-unit (multi-MUP) analysis in Duchenne muscular dystrophy (DMD). Ambulatory DMD boys, aged 5-15 years, were evaluated with QEMG at 6-month intervals over 14 months. EMG was performed in the right biceps brachii (BB) and tibialis anterior (TA) muscles. Normative QEMG data were obtained from age-matched healthy boys. Wilcoxon signed-rank tests were performed. Eighteen DMD subjects were enrolled, with a median age of 7 (interquartile range 7-10) years. Six-month evaluations were performed on 14 subjects. QEMG showed significantly abnormal mean MUP duration in BB and TA muscles, with no significant change over 6 months. QEMG is a sensitive electrophysiological marker of myopathy in DMD. Preliminary data do not reflect a significant change in MUP parameters over a 6-month interval; long-term follow-up QEMG studies are needed to understand its role as a biomarker for disease progression. Muscle Nerve 56: 1361-1364, 2017. © 2017 Wiley Periodicals, Inc.

  12. Surface plasmon resonance microscopy: achieving a quantitative optical response

    PubMed Central

    Peterson, Alexander W.; Halter, Michael; Plant, Anne L.; Elliott, John T.

    2016-01-01

    Surface plasmon resonance (SPR) imaging allows real-time label-free imaging based on index of refraction, and changes in index of refraction at an interface. Optical parameter analysis is achieved by application of the Fresnel model to SPR data typically taken by an instrument in a prism based configuration. We carry out SPR imaging on a microscope by launching light into a sample, and collecting reflected light through a high numerical aperture microscope objective. The SPR microscope enables spatial resolution that approaches the diffraction limit, and has a dynamic range that allows detection of subnanometer to submicrometer changes in thickness of biological material at a surface. However, unambiguous quantitative interpretation of SPR changes using the microscope system could not be achieved using the Fresnel model because of polarization dependent attenuation and optical aberration that occurs in the high numerical aperture objective. To overcome this problem, we demonstrate a model to correct for polarization diattenuation and optical aberrations in the SPR data, and develop a procedure to calibrate reflectivity to index of refraction values. The calibration and correction strategy for quantitative analysis was validated by comparing the known indices of refraction of bulk materials with corrected SPR data interpreted with the Fresnel model. Subsequently, we applied our SPR microscopy method to evaluate the index of refraction for a series of polymer microspheres in aqueous media and validated the quality of the measurement with quantitative phase microscopy. PMID:27782542
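    The Fresnel-model interpretation mentioned above rests on the standard three-layer (prism/metal film/medium) Kretschmann calculation of p-polarized reflectance versus incidence angle. The basic calculation, without the polarization-diattenuation and aberration corrections the authors add for their high-NA objective, can be sketched as below; the wavelength, refractive indices, and film thickness are generic textbook-style values, not the calibration of the authors' instrument.

```python
import numpy as np

def spr_reflectance(theta_deg, wavelength_nm, n_prism, n_metal, n_medium, d_metal_nm):
    """p-polarized reflectance of a prism / metal-film / medium stack (Kretschmann SPR)."""
    k0 = 2 * np.pi / wavelength_nm
    kx = n_prism * k0 * np.sin(np.deg2rad(theta_deg))
    kz = [np.sqrt((n * k0) ** 2 - kx ** 2 + 0j) for n in (n_prism, n_metal, n_medium)]

    def r_p(ni, nj, kzi, kzj):
        # p-polarization Fresnel coefficient between media i and j
        return (nj**2 * kzi - ni**2 * kzj) / (nj**2 * kzi + ni**2 * kzj)

    r12 = r_p(n_prism, n_metal, kz[0], kz[1])
    r23 = r_p(n_metal, n_medium, kz[1], kz[2])
    phase = np.exp(2j * kz[1] * d_metal_nm)
    r = (r12 + r23 * phase) / (1 + r12 * r23 * phase)
    return np.abs(r) ** 2

# Generic parameters: BK7 prism, ~50 nm gold film, water, 633 nm illumination.
angles = np.linspace(60, 80, 5)
print(spr_reflectance(angles, 633.0, 1.515, 0.18 + 3.0j, 1.33, 50.0))
```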

  13. Sensitivity analysis, calibration, and testing of a distributed hydrological model using error‐based weighting and one objective function

    USGS Publications Warehouse

    Foglia, L.; Hill, Mary C.; Mehl, Steffen W.; Burlando, P.

    2009-01-01

    We evaluate the utility of three interrelated means of using data to calibrate the fully distributed rainfall‐runoff model TOPKAPI as applied to the Maggia Valley drainage area in Switzerland. The use of error‐based weighting of observation and prior information data, local sensitivity analysis, and single‐objective function nonlinear regression provides quantitative evaluation of sensitivity of the 35 model parameters to the data, identification of data types most important to the calibration, and identification of correlations among parameters that contribute to nonuniqueness. Sensitivity analysis required only 71 model runs, and regression required about 50 model runs. The approach presented appears to be ideal for evaluation of models with long run times or as a preliminary step to more computationally demanding methods. The statistics used include composite scaled sensitivities, parameter correlation coefficients, leverage, Cook's D, and DFBETAS. Tests suggest predictive ability of the calibrated model typical of hydrologic models.

  14. Stylization levels of industrial design objects

    NASA Astrophysics Data System (ADS)

    Kukhta, M. S.; Sokolov, A. P.; Krauinsh, D. P.; Bouchard, C.

    2017-01-01

The urgency of research into the problem of form making in design is associated with the need for a new understanding of visual culture and new approaches to design engineering that integrate artistic and design problems. The aim of this research is to study the levels of stylization of design objects and their dependence on specific project objectives and existing technologies. On the basis of quantitative evaluation, three levels of stylization are distinguished: figurative image, stylized image and abstract image. The theoretical conclusions are complemented by the practical solution of creating an openwork metal lantern. Variants of both a traditional mains supply for the lantern and an autonomous supply system based on solar energy were offered. The role of the semantic factor, which affects the depth of perception of the semantic space of design objects, is also represented in this paper.

  15. Quantification of EEG reactivity in comatose patients.

    PubMed

    Hermans, Mathilde C; Westover, M Brandon; van Putten, Michel J A M; Hirsch, Lawrence J; Gaspard, Nicolas

    2016-01-01

EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed the best accuracy (median AUC: 0.95) and was accompanied by substantial agreement with the individual opinions of experts (Gwet's AC1: 65-70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-expert agreement regarding reactivity for each individual case. Automated quantitative EEG approaches based on probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
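    A minimal version of the spectral temporal symmetry idea, comparing band power in the minute before and after stimulation onset, can be sketched as follows. The band limits and the simple log power-ratio statistic are illustrative assumptions; the study's actual parameters and probabilistic combination are more elaborate.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(segment, fs):
    """Average power in each band, estimated with Welch's method."""
    freqs, psd = welch(segment, fs=fs, nperseg=int(4 * fs))
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def reactivity_scores(pre, post, fs):
    """Log power ratio per band; values far from 0 indicate spectral change (reactivity)."""
    p_pre, p_post = band_powers(pre, fs), band_powers(post, fs)
    return {b: float(np.log(p_post[b] / p_pre[b])) for b in BANDS}

# Hypothetical 1-minute single-channel segments sampled at 250 Hz.
fs = 250
rng = np.random.default_rng(1)
pre = rng.normal(size=60 * fs)
post = rng.normal(size=60 * fs) * 0.5      # simulated attenuation after stimulation
print(reactivity_scores(pre, post, fs))
```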

  16. Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses

    PubMed Central

    Alexander, Elsinore; Wei, Xin; Lee, Shinwook

    2018-01-01

    Purpose Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting potential for halos is desirable when designing new multifocal IOLs. Halo images from 6 IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods One monofocal, 1 extended depth of focus, and 4 multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion The HDR system produced quantitative, preclinical metrics that correlated to patients’ subjective perception of halos. PMID:29503526
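    The proposed halo metric, an area under the curve of the logarithmic normalized halo luminance profile, can be approximated numerically as in the rough sketch below. The radial profile here is synthetic and the exact normalization may differ from the authors' definition; the real metric is computed from calibrated HDR photometer images.

```python
import numpy as np

def halo_auc(radius_deg, luminance):
    """Area under the log10 of the peak-normalized halo luminance profile."""
    norm = luminance / luminance.max()
    return float(np.trapz(np.log10(norm), radius_deg))

# Synthetic radial profile: bright core plus an exponentially decaying halo.
r = np.linspace(0.0, 2.0, 200)                      # degrees from the light source
profile = 1e4 * np.exp(-r / 0.05) + 50 * np.exp(-r / 0.8) + 1.0
print(halo_auc(r, profile))
```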

  17. In vivo measurements of proton relaxation times in human brain, liver, and skeletal muscle: a multicenter MRI study.

    PubMed

    de Certaines, J D; Henriksen, O; Spisni, A; Cortsen, M; Ring, P B

    1993-01-01

Quantitative magnetic resonance imaging may offer unique potential for tissue characterization in vivo. In this connection, texture analysis of quantitative MR images may be of special importance. Because evaluation of texture analysis requires large amounts of data, multicenter approaches become mandatory. Within the framework of the BME Concerted Action on Tissue Characterization by MRI and MRS, a pilot multicenter study was launched in order to evaluate the technical problems, including comparability of relaxation time measurements carried out at the individual sites. Human brain, skeletal muscle, and liver were used as models. A total of 218 healthy volunteers were studied. Fifteen MRI scanners with field strengths ranging from 0.08 T to 1.5 T were included. Measurement accuracy was tested on the Eurospin relaxation time test object (TO5), and the obtained calibration curve was used for correction of the in vivo data. The results established that, by following a standardized procedure, comparable quantitative measurements can be obtained in vivo from a number of MR sites. The overall coefficient of variation in vivo was of the same order of magnitude as for ex vivo relaxometry. Thus, it is possible to carry out international multicenter studies on quantitative imaging, provided that quality control with respect to measurement accuracy and calibration of the MR equipment is performed.

  18. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    PubMed

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for the treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarity with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach localizes the recovered regions with 93.4% accuracy. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
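    The core of a change-rate map is a voxelwise relative difference between co-registered follow-up and baseline scans. A minimal sketch under that assumption is given below (already-registered, intensity-normalized NumPy volumes); the real CAA pipeline adds registration, normalization, and region-level analysis on top of this.

```python
import numpy as np

def change_rate_map(baseline, followup, eps=1e-6, mask=None):
    """Voxelwise change rate (%) between co-registered SPECT volumes."""
    crm = 100.0 * (followup - baseline) / (baseline + eps)
    if mask is not None:
        crm = np.where(mask, crm, 0.0)
    return crm

# Hypothetical 3-D volumes: simulate 30% rCBF recovery inside a small region.
baseline = np.full((64, 64, 32), 100.0)
followup = baseline.copy()
followup[20:30, 20:30, 10:15] *= 1.30
crm = change_rate_map(baseline, followup)
print(crm.max(), crm.mean())   # ~30.0 inside the recovered region
```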

  19. Comparison of qualitative and quantitative analysis of T2-weighted MRI scans in chronic-progressive multiple sclerosis

    NASA Astrophysics Data System (ADS)

    Adams, Hans-Peter; Wagner, Simone; Koziol, James A.

    1998-06-01

Magnetic resonance imaging (MRI) is routinely used for the diagnosis of multiple sclerosis (MS), and for objective assessment of the extent of disease as a marker of treatment efficacy in MS clinical trials. The purpose of this study is to compare the evaluation of T2-weighted MRI scans in MS patients using a semi-automated quantitative technique with an independent assessment by a neurologist. Baseline, 6-month, and 12-month T2-weighted MRI scans from 41 chronic progressive MS patients were examined. The lesion volume ranged from 0.50 to 51.56 cm³ (mean: 8.08 cm³). Reproducibility of the quantitative technique was assessed by the re-evaluation of a random subset of 20 scans; the coefficient of variation of the replicate determinations was 8.2%. The reproducibility of the neurologist evaluations was assessed by the re-evaluation of a random subset of 10 patients. The rank correlation between the results of the two methods was 0.097, which did not significantly differ from zero. Disease-related activity in T2-weighted MRI scans is a multi-dimensional construct, and is not adequately summarized solely by determination of lesion volume. In this setting, image analysis software should not only support storage and retrieval as sets of pixels, but should also support links to an anatomical dictionary.

  20. Evaluating a Dutch cardiology primary care plus intervention on the Triple Aim outcomes: study design of a practice-based quantitative and qualitative research.

    PubMed

    Quanjel, Tessa C C; Spreeuwenberg, Marieke D; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk

    2017-09-06

    In an attempt to deal with the pressures on the health-care system and to guarantee sustainability, changes are needed. This study focuses on a cardiology primary care plus intervention. Primary care plus (PC+) is a new health-care delivery model focused on substitution of specialist care in the hospital setting with specialist care in the primary care setting. The intervention consists of a cardiology PC+ centre in which cardiologists, supported by other health-care professionals, provide consultations in a primary care setting. The PC+ centre aims to improve the health of the population and quality of care as experienced by patients, and reduce the number of referrals to hospital-based outpatient specialist care in order to reduce health-care costs. These aims reflect the Triple Aim principle. Hence, the objectives of the study are to evaluate the cardiology PC+ centre in terms of the Triple Aim outcomes and to evaluate the process of the introduction of PC+. The study is a practice-based, quantitative study with a longitudinal observational design, and an additional qualitative study to supplement, interpret and improve the quantitative study. The study population of the quantitative part will consist of adult patients (≥18 years) with non-acute and low-complexity cardiology-related health complaints, who will be referred to the cardiology PC+ centre (intervention group) or hospital-based outpatient cardiology care (control group). All eligible patients will be asked to complete questionnaires at three different time points consisting of questions about their demographics, health status and experience of care. Additionally, quantitative data will be collected about health-care utilization and related health-care costs at the PC+ centre and the hospital. The qualitative part, consisting of semi-structured interviews, focus groups, and observations, is designed to evaluate the process as well as to amplify, clarify and explain quantitative results. This study will evaluate a cardiology PC+ centre using quantitative and supplementary qualitative methods. The findings of both sub-studies will fill a gap in knowledge about the effects of PC+ and in particular whether PC+ is able to pursue the Triple Aim outcomes. NTR6629 (Data registered: 25-08-2017) (registered retrospectively).

  1. The Health Action Process Approach as a Motivational Model of Dietary Self-Management for People with Multiple Sclerosis: A Path Analysis

    ERIC Educational Resources Information Center

    Chiu, Chung-Yi; Lynch, Ruth Torkelson; Chan, Fong; Rose, Lindsey

    2012-01-01

    The main objective of this study was to evaluate the health action process approach (HAPA) as a motivational model for dietary self-management for people with multiple sclerosis (MS). Quantitative descriptive research design using path analysis was used. Participants were 209 individuals with MS recruited from the National MS Society and a…

  2. Physically adjusted NDF (paNDF) system for lactating dairy cow rations. I: Deriving equations that identify factors that influence effectiveness of fiber

    USDA-ARS?s Scientific Manuscript database

    Physically effective neutral detergent fiber (peNDF) is defined as the fraction of NDF that stimulates chewing activity and contributes to the floating mat of large particles in the rumen. The objective of this work was to re-evaluate the concept of peNDF by quantitatively relating physical and chem...

  3. Objective Quantification of Pre-and Postphonosurgery Vocal Fold Vibratory Characteristics Using High-Speed Videoendoscopy and a Harmonic Waveform Model

    ERIC Educational Resources Information Center

    Ikuma, Takeshi; Kunduk, Melda; McWhorter, Andrew J.

    2014-01-01

    Purpose: The model-based quantitative analysis of high-speed videoendoscopy (HSV) data at a low frame rate of 2,000 frames per second was assessed for its clinical adequacy. Stepwise regression was employed to evaluate the HSV parameters using harmonic models and their relationships to the Voice Handicap Index (VHI). Also, the model-based HSV…

  4. [Learning about death from the undergraduate: Evaluation of an educational intervention].

    PubMed

    Álvarez-del Río, Asunción; Torruco-García, Uri; Morales-Castillo, José Daniel; Varela-Ruiz, Margarita

    2015-01-01

From June to November 2013, an elective course entitled "The doctor before death" was offered at a public medical school. The aim of this report is to assess the achievement of the objectives of this course. The main objectives of the course were to develop competences, foster an aptitude for reflection on death, and encourage changes in attitude towards it. Each session was preceded by an article on its content; during sessions, interaction with physicians and patients facing the approach of death was encouraged, audiovisual and computer resources were used, and discussions were conducted. The course was evaluated with a retrospective questionnaire as a quantitative source, and with semi-structured interviews and essays as qualitative sources. The development of competences, the aptitude for reflection about death and attitude changes all showed an increase after the intervention (p < 0.01); competence development had the smallest increase. From the qualitative information, 11 categories were integrated; all showed positive changes in attitude towards death, aptitude for reflection and competences developed (although in this respect the impact was smaller). The educational intervention evaluated met its objectives; however, a future intervention should reinforce competence development.

  5. Operational models of infrastructure resilience.

    PubMed

    Alderson, David L; Brown, Gerald G; Carlyle, W Matthew

    2015-04-01

    We propose a definition of infrastructure resilience that is tied to the operation (or function) of an infrastructure as a system of interacting components and that can be objectively evaluated using quantitative models. Specifically, for any particular system, we use quantitative models of system operation to represent the decisions of an infrastructure operator who guides the behavior of the system as a whole, even in the presence of disruptions. Modeling infrastructure operation in this way makes it possible to systematically evaluate the consequences associated with the loss of infrastructure components, and leads to a precise notion of "operational resilience" that facilitates model verification, validation, and reproducible results. Using a simple example of a notional infrastructure, we demonstrate how to use these models for (1) assessing the operational resilience of an infrastructure system, (2) identifying critical vulnerabilities that threaten its continued function, and (3) advising policymakers on investments to improve resilience. © 2014 Society for Risk Analysis.
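    A toy version of the operational-model idea, measuring how much system function (here, deliverable flow from a source to a demand node) survives the loss of each component, can be sketched with a network-flow model. The network, the capacities, and the choice of maximum flow as the operator's objective are illustrative assumptions, not the authors' notional infrastructure.

```python
import networkx as nx

def build_network():
    G = nx.DiGraph()
    edges = [("source", "a", 10), ("source", "b", 8),
             ("a", "c", 7), ("b", "c", 6), ("a", "demand", 5),
             ("c", "demand", 12), ("b", "demand", 3)]
    for u, v, cap in edges:
        G.add_edge(u, v, capacity=cap)
    return G

def operational_resilience(G, s="source", t="demand"):
    """Fraction of nominal function retained after losing each single edge."""
    nominal, _ = nx.maximum_flow(G, s, t)
    results = {}
    for u, v in list(G.edges()):
        H = G.copy()
        H.remove_edge(u, v)
        flow, _ = nx.maximum_flow(H, s, t)
        results[(u, v)] = flow / nominal
    return results

# Components with the lowest retained fraction are the critical vulnerabilities.
for component, retained in sorted(operational_resilience(build_network()).items(),
                                  key=lambda kv: kv[1]):
    print(component, f"{retained:.2f}")
```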

  6. A quantitative approach to evaluating caring in nursing simulation.

    PubMed

    Eggenberger, Terry L; Keller, Kathryn B; Chase, Susan K; Payne, Linda

    2012-01-01

    This study was designed to test a quantitative method of measuring caring in the simulated environment. Since competency in caring is central to nursing practice, ways of including caring concepts in designing scenarios and in evaluation of performance need to be developed. Coates' Caring Efficacy scales were adapted for simulation and named the Caring Efficacy Scale-Simulation Student Version (CES-SSV) and Caring Efficacy Scale-Simulation Faculty Version (CES-SFV). A correlational study was designed to compare student self-ratings with faculty ratings on caring efficacy during an adult acute simulation experience with traditional and accelerated baccalaureate students in a nursing program grounded in caring theory. Student self-ratings were significantly correlated with objective ratings (r = 0.345, 0.356). Both the CES-SSV and the CES-SFV were found to have excellent internal consistency and significantly correlated interrater reliability. They were useful in measuring caring in the simulated learning environment.

  7. Sensory profile and acceptability for pitanga (Eugenia uniflora L.) nectar with different sweeteners.

    PubMed

    Freitas, Mírian Luisa Faria; Dutra, Mariana Borges de Lima; Bolini, Helena Maria André

    2016-12-01

    The objective of this study was to evaluate the sensory properties and acceptability of pitanga nectar samples prepared with sucrose and different sweeteners (sucralose, aspartame, stevia with 40% rebaudioside A, stevia with 95% rebaudioside A, neotame, and a 2:1 cyclamate/saccharin blend). A total of 13 assessors participated in a quantitative descriptive analysis and evaluated the samples in relation to the descriptor terms. The acceptability test was carried out by 120 fruit juice consumers. The results of the quantitative descriptive analysis of pitanga nectar showed that samples prepared with sucralose, aspartame, and the 2:1 cyclamate/saccharin blend had sensory profiles similar to that of the sample prepared with sucrose. Consumers' most accepted samples were prepared with sucrose, sucralose, aspartame, and neotame. The sweeteners that have the greatest potential to replace sucrose in pitanga nectar are sucralose and aspartame. © The Author(s) 2016.

  8. Descriptive quantitative analysis of hallux abductovalgus transverse plane radiographic parameters.

    PubMed

    Meyr, Andrew J; Myers, Adam; Pontious, Jane

    2014-01-01

    Although the transverse plane radiographic parameters of the first intermetatarsal angle (IMA), hallux abductus angle (HAA), and the metatarsal-sesamoid position (MSP) form the basis of preoperative procedure selection and postoperative surgical evaluation of the hallux abductovalgus deformity, the so-called normal values of these measurements have not been well established. The objectives of the present study were to (1) evaluate the descriptive statistics of the first IMA, HAA, and MSP from a large patient population and (2) to determine an objective basis for defining "normal" versus "abnormal" measurements. Anteroposterior foot radiographs from 373 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated for the measurements of the first IMA, HAA, and MSP. The results revealed a mean measurement of 9.93°, 17.59°, and position 3.63 for the first IMA, HAA, and MSP, respectively. An advanced descriptive analysis demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, clear differentiations in deformity progression were appreciated when the variables were graphically depicted against each other. This could represent a quantitative basis for defining "normal" versus "abnormal" values. From the results of the present study, we have concluded that these radiographic parameters can be more conservatively reported and analyzed using nonparametric descriptive and comparative statistics within medical studies and that the combination of a first IMA, HAA, and MSP at or greater than approximately 10°, 18°, and position 4, respectively, appears to be an objective "tipping point" in terms of deformity progression and might represent an upper limit of acceptable in terms of surgical deformity correction. Copyright © 2014 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  9. Improvement and Extension of Shape Evaluation Criteria in Multi-Scale Image Segmentation

    NASA Astrophysics Data System (ADS)

    Sakamoto, M.; Honda, Y.; Kondo, A.

    2016-06-01

Over the last decade, multi-scale image segmentation has attracted particular interest and is widely used in practice for object-based image analysis. In this study, we have addressed issues in multi-scale image segmentation, in particular improving the validity of merging and the variety of derived region shapes. Firstly, we introduced constraints on the application of the spectral criterion that suppress excessive merging between dissimilar regions. Secondly, we extended the evaluation of the smoothness criterion by modifying the definition of the extent of the object, which was introduced to control shape diversity. Thirdly, we developed a new shape criterion called aspect ratio. This criterion helps to improve the reproducibility of object shapes so that they match the actual objects of interest. It constrains the aspect ratio of the bounding box of the object while keeping the properties controlled by the conventional shape criteria. These improvements and extensions lead to more accurate, flexible, and diverse segmentation results according to the shape characteristics of the target of interest. Furthermore, we also investigated a technique for quantitative and automatic parameterization of multi-scale image segmentation. This approach compares the segmentation result with a training area specified in advance, either maximizing the average area of the derived objects or satisfying the evaluation index called the F-measure, as sketched below. It thus becomes possible to automate a parameterization suited to the objectives, especially from the viewpoint of shape reproducibility.
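    The two quantitative ingredients mentioned above, an aspect-ratio shape measure and an F-measure comparing a segmented object with a training area, can be sketched on binary masks as follows. The exact definitions used by the authors may differ; this is a generic formulation on hypothetical masks.

```python
import numpy as np

def aspect_ratio(mask):
    """Aspect ratio (long side / short side) of the bounding box of a binary object."""
    rows, cols = np.nonzero(mask)
    h = rows.max() - rows.min() + 1
    w = cols.max() - cols.min() + 1
    return max(h, w) / min(h, w)

def f_measure(segmented, training):
    """F-measure (harmonic mean of precision and recall) between two binary masks."""
    tp = np.logical_and(segmented, training).sum()
    precision = tp / max(segmented.sum(), 1)
    recall = tp / max(training.sum(), 1)
    return 2 * precision * recall / max(precision + recall, 1e-12)

# Hypothetical masks: a training rectangle and a slightly offset segmentation result.
training = np.zeros((100, 100), dtype=bool); training[20:60, 30:70] = True
segmented = np.zeros_like(training);         segmented[25:65, 30:70] = True
print(aspect_ratio(segmented), f_measure(segmented, training))
```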

  10. Characterization and Application of a Grazing Angle Objective for Quantitative Infrared Reflection Microspectroscopy

    NASA Technical Reports Server (NTRS)

    Pepper, Stephen V.

    1995-01-01

A grazing angle objective on an infrared microspectrometer is studied for quantitative spectroscopy by considering the angular dependence of the incident intensity within the objective's angular aperture. The assumption that there is no angular dependence is tested by comparing the experimental reflectance of Si and KBr surfaces with the reflectance calculated by integrating the Fresnel reflection coefficient over the angular aperture under this assumption. Good agreement was found, indicating that the specular reflectance of surfaces can be straightforwardly and quantitatively integrated over the angular aperture without considering non-uniform incident intensity. This quantitative approach is applied to the thickness determination of dip-coated Krytox on gold. The infrared optical constants of both materials are known, allowing the integration to be carried out. The thickness obtained is in fair agreement with the value determined by ellipsometry in the visible. Therefore, this paper illustrates a method for more quantitative use of a grazing angle objective for infrared reflectance microspectroscopy.
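    The check described above, integrating the Fresnel reflectance uniformly over the objective's angular aperture, can be reproduced numerically as in the sketch below. The aperture half-angles and the refractive index value are placeholders (a nearly real mid-infrared index is assumed for Si), not the parameters of the instrument in the paper.

```python
import numpy as np

def fresnel_R_unpolarized(theta_i, n1, n2):
    """Unpolarized single-interface Fresnel reflectance at incidence angle theta_i (rad)."""
    cos_i = np.cos(theta_i)
    sin_t = n1 / n2 * np.sin(theta_i)
    cos_t = np.sqrt(1 - sin_t**2 + 0j)
    rs = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)
    rp = (n2 * cos_i - n1 * cos_t) / (n2 * cos_i + n1 * cos_t)
    return (abs(rs)**2 + abs(rp)**2) / 2

def aperture_averaged_R(theta_min_deg, theta_max_deg, n_sample, n_points=500):
    """Average reflectance over the grazing-angle aperture, assuming uniform
    incident intensity versus angle (the assumption tested in the paper)."""
    thetas = np.deg2rad(np.linspace(theta_min_deg, theta_max_deg, n_points))
    R = np.array([fresnel_R_unpolarized(t, 1.0, n_sample) for t in thetas])
    return R.mean()

# Placeholder values: a 65-85 degree aperture and n ~ 3.42 for Si in the mid-IR.
print(aperture_averaged_R(65, 85, 3.42))
```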

  11. Contribution of flow-volume curves to the detection of central airway obstruction*

    PubMed Central

    Raposo, Liliana Bárbara Perestrelo de Andrade e; Bugalho, António; Gomes, Maria João Marques

    2013-01-01

OBJECTIVE: To assess the sensitivity and specificity of flow-volume curves in detecting central airway obstruction (CAO), and to determine whether their quantitative and qualitative criteria are associated with the location, type and degree of obstruction. METHODS: Over a four-month period, we consecutively evaluated patients for whom bronchoscopy was indicated. Over a one-week period, all patients underwent clinical evaluation, flow-volume curve testing, and bronchoscopy, and completed a dyspnea scale. Four reviewers, blinded to quantitative and clinical data and to the bronchoscopy results, classified the morphology of the curves. A fifth reviewer determined the morphological criteria, as well as the quantitative criteria. RESULTS: We studied 82 patients, 36 (44%) of whom had CAO. The sensitivity and specificity of the flow-volume curves in detecting CAO were, respectively, 88.9% and 91.3% (quantitative criteria) and 30.6% and 93.5% (qualitative criteria). The most prevalent quantitative criteria in our sample were FEF50%/FIF50% ≥ 1, in 83% of patients, and FEV1/PEF ≥ 8 mL·L⁻¹·min⁻¹, in 36%, both being associated with the type, location, and degree of obstruction (p < 0.05). There was concordance among the reviewers regarding the presence of CAO, and the degree of obstruction was related to dyspnea. CONCLUSIONS: The quantitative criteria should always be calculated for flow-volume curves in order to detect CAO, because of the low sensitivity of the qualitative criteria. Both FEF50%/FIF50% ≥ 1 and FEV1/PEF ≥ 8 mL·L⁻¹·min⁻¹ were associated with the location, type and degree of obstruction. PMID:24068266
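    The two quantitative criteria reported above can be applied to spirometry values with a few lines of code. The sketch below assumes FEV1 is supplied in mL and PEF in L/min so that the FEV1/PEF ratio carries the units quoted in the abstract; that unit pairing, and the example values, are assumptions for illustration.

```python
def cao_quantitative_flags(fev1_ml, pef_l_min, fef50, fif50):
    """Flag the two quantitative central-airway-obstruction criteria from the abstract."""
    criteria = {
        "FEF50/FIF50 >= 1": fef50 / fif50 >= 1.0,
        "FEV1/PEF >= 8":    fev1_ml / pef_l_min >= 8.0,
    }
    criteria["suggests CAO"] = any(criteria.values())
    return criteria

# Hypothetical spirometry values for one patient.
print(cao_quantitative_flags(fev1_ml=2800, pef_l_min=300, fef50=3.1, fif50=2.6))
```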

  12. [Assessment of research papers in medical university staff evaluation].

    PubMed

    Zhou, Qing-hui

    2012-06-01

Medical university staff evaluation is a substantial branch of education administration for medical universities. The number of research papers produced, as a direct index reflecting achievements in academic research, plays an important role in academic research evaluation. Another index, the influence of the research papers, is an indirect index for academic research evaluation. This paper mainly introduces some indexes commonly used in the evaluation of academic research papers at present, and analyzes the applicability and limitations of each index. The author argues that academic research evaluation in education administration, which is mainly based on the evaluation of academic research papers, should combine the evaluation of the journals in which the papers are published with peer review of the papers, and integrate qualitative with quantitative evaluation, in order to set up an objective academic research evaluation system for medical university staff.

  13. Wavelet-Based Visible and Infrared Image Fusion: A Comparative Study

    PubMed Central

    Sappa, Angel D.; Carvajal, Juan A.; Aguilera, Cristhian A.; Oliveira, Miguel; Romero, Dennis; Vintimilla, Boris X.

    2016-01-01

This paper evaluates different wavelet-based cross-spectral image fusion strategies adopted to merge visible and infrared images. The objective is to find the best setup independently of the evaluation metric used to measure the performance. Quantitative performance results are obtained with state-of-the-art approaches together with adaptations proposed in the current work. The options evaluated in the current work result from the combination of different setups in the wavelet image decomposition stage together with different fusion strategies for the final merging stage that generates the resulting representation. Most of the approaches evaluate results according to the application for which they are intended. Sometimes a human observer is selected to judge the quality of the obtained results. In the current work, quantitative values are considered in order to find correlations between setups and performance of obtained results; these correlations can be used to define a criterion for selecting the best fusion strategy for a given pair of cross-spectral images. The whole procedure is evaluated with a large set of correctly registered visible and infrared image pairs, including both Near InfraRed (NIR) and Long Wave InfraRed (LWIR). PMID:27294938

  14. Wavelet-Based Visible and Infrared Image Fusion: A Comparative Study.

    PubMed

    Sappa, Angel D; Carvajal, Juan A; Aguilera, Cristhian A; Oliveira, Miguel; Romero, Dennis; Vintimilla, Boris X

    2016-06-10

This paper evaluates different wavelet-based cross-spectral image fusion strategies adopted to merge visible and infrared images. The objective is to find the best setup independently of the evaluation metric used to measure the performance. Quantitative performance results are obtained with state-of-the-art approaches together with adaptations proposed in the current work. The options evaluated in the current work result from the combination of different setups in the wavelet image decomposition stage together with different fusion strategies for the final merging stage that generates the resulting representation. Most of the approaches evaluate results according to the application for which they are intended. Sometimes a human observer is selected to judge the quality of the obtained results. In the current work, quantitative values are considered in order to find correlations between setups and performance of obtained results; these correlations can be used to define a criterion for selecting the best fusion strategy for a given pair of cross-spectral images. The whole procedure is evaluated with a large set of correctly registered visible and infrared image pairs, including both Near InfraRed (NIR) and Long Wave InfraRed (LWIR).
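    A minimal wavelet-based fusion of a co-registered visible/infrared pair, using the common rule of averaging the approximation band and keeping the maximum-magnitude detail coefficients, can be sketched with PyWavelets as below. The wavelet family, decomposition depth, and fusion rule are generic choices, not necessarily those that ranked best in the study, and the input arrays are random placeholders for real NIR/LWIR images.

```python
import numpy as np
import pywt

def fuse_wavelet(visible, infrared, wavelet="db2", levels=3):
    """Fuse two co-registered grayscale images (2-D float arrays) in the wavelet domain."""
    c_vis = pywt.wavedec2(visible, wavelet, level=levels)
    c_ir = pywt.wavedec2(infrared, wavelet, level=levels)
    fused = [(c_vis[0] + c_ir[0]) / 2.0]                 # average the approximation band
    for dv, di in zip(c_vis[1:], c_ir[1:]):              # detail bands: max magnitude
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in zip(dv, di)))
    return pywt.waverec2(fused, wavelet)

# Hypothetical co-registered pair (random stand-ins for real visible/infrared images).
rng = np.random.default_rng(0)
vis = rng.random((128, 128))
ir = rng.random((128, 128))
print(fuse_wavelet(vis, ir).shape)
```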

  15. The effect of image sharpness on quantitative eye movement data and on image quality evaluation while viewing natural images

    NASA Astrophysics Data System (ADS)

    Vuori, Tero; Olkkonen, Maria

    2006-01-01

The aim of the study is to test both customer image quality ratings (subjective image quality) and physical measurement of user behavior (eye movement tracking) to find customer satisfaction differences between imaging technologies. The methodological aim is to find out whether eye movements can be used quantitatively in image quality preference studies. In general, we want to map objective or physically measurable image quality to subjective evaluations and eye movement data. We conducted a series of image quality tests, in which the test subjects evaluated image quality while we recorded their eye movements. Results show that eye movement parameters consistently change according to the instructions given to the user and according to physical image quality; e.g., saccade duration increased with increasing blur. Results indicate that eye movement tracking could be used to differentiate the image quality evaluation strategies that users have. Results also show that eye movements would help map between technological and subjective image quality. Furthermore, these results give some empirical emphasis to top-down perception processes in image quality perception and evaluation by showing differences between perceptual processes in situations where the cognitive task varies.

  16. Progress in quantitative GPR development at CNDE

    NASA Astrophysics Data System (ADS)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott

    2014-02-01

Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University, efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.

  17. Quantitative T2 Magnetic Resonance Imaging Compared to Morphological Grading of the Early Cervical Intervertebral Disc Degeneration: An Evaluation Approach in Asymptomatic Young Adults

    PubMed Central

    Han, Zhihua; Shao, Lixin; Xie, Yan; Wu, Jianhong; Zhang, Yan; Xin, Hongkui; Ren, Aijun; Guo, Yong; Wang, Deli; He, Qing; Ruan, Dike

    2014-01-01

Objective The objective of this study was to evaluate the efficacy of quantitative T2 magnetic resonance imaging (MRI) for quantifying early cervical intervertebral disc (IVD) degeneration in asymptomatic young adults by correlating the T2 value with Pfirrmann grade, sex, and anatomic level. Methods Seventy asymptomatic young subjects (34 men and 36 women; mean age, 22.80±2.11 yr; range, 18–25 years) underwent 3.0-T MRI to obtain morphological data (one T1-fast spin echo (FSE) and three-plane T2-FSE, used to assign a Pfirrmann grade (I–V)) and for T2 mapping (multi-echo spin echo). T2 values in the nucleus pulposus (NP, n = 350) and anulus fibrosus (AF, n = 700) were obtained. Differences in T2 values between sexes and anatomic levels were evaluated, and linear correlation analysis of T2 values versus degenerative grade was conducted. Findings Cervical IVDs of healthy young adults were commonly determined to be at Pfirrmann grades I and II. T2 values of the NP were significantly higher than those of the AF at all anatomic levels (P<0.000). The NP, anterior AF and posterior AF values did not differ significantly between genders at the same anatomic level (P>0.05). T2 values decreased linearly with degenerative grade. Linear correlation analysis revealed a strong negative association between the Pfirrmann grade and the T2 values of the NP (P = 0.000) but not the T2 values of the AF (P = 0.854). However, non-degenerated discs (Pfirrmann grades I and II) showed a wide range of T2 relaxation times. T2 values according to the disc degeneration level classification were as follows: grade I (>62.03 ms), grade II (54.60–62.03 ms), grade III (<54.60 ms). Conclusions T2 quantitation provides a more sensitive and robust approach for detecting and characterizing the early stage of cervical IVD degeneration and for establishing a reliable quantitative baseline in healthy young adults. PMID:24498384
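    The T2-based grading thresholds quoted in the abstract translate directly into a small classification helper. The sketch below simply encodes those cutoffs for a nucleus pulposus T2 value and is not a validated diagnostic rule.

```python
def pfirrmann_grade_from_t2(t2_ms):
    """Map a nucleus pulposus T2 value (ms) to the early-degeneration grades
    reported in the abstract: I (>62.03), II (54.60-62.03), III (<54.60)."""
    if t2_ms > 62.03:
        return "I"
    if t2_ms >= 54.60:
        return "II"
    return "III"

for t2 in (70.0, 58.5, 50.2):
    print(t2, "-> grade", pfirrmann_grade_from_t2(t2))
```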

  18. Optimizing oncology therapeutics through quantitative translational and clinical pharmacology: challenges and opportunities.

    PubMed

    Venkatakrishnan, K; Friberg, L E; Ouellet, D; Mettetal, J T; Stein, A; Trocóniz, I F; Bruno, R; Mehrotra, N; Gobburu, J; Mould, D R

    2015-01-01

    Despite advances in biomedical research that have deepened our understanding of cancer hallmarks, resulting in the discovery and development of targeted therapies, the success rates of oncology drug development remain low. Opportunities remain for objective dose selection informed by exposure-response understanding to optimize the benefit-risk balance of novel therapies for cancer patients. This review article discusses the principles and applications of modeling and simulation approaches across the lifecycle of development of oncology therapeutics. Illustrative examples are used to convey the value gained from integration of quantitative clinical pharmacology strategies from the preclinical-translational phase through confirmatory clinical evaluation of efficacy and safety. © 2014 American Society for Clinical Pharmacology and Therapeutics.

  19. TH-A-207B-00: Shear-Wave Imaging and a QIBA US Biomarker Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Imaging of tissue elastic properties is a relatively new and powerful approach to one of the oldest and most important diagnostic tools. Imaging of shear wave speed with ultrasound has been added to most high-end ultrasound systems. Understanding this exciting imaging mode and aiding its most effective use in medicine can be a rewarding effort for medical physicists and other medical imaging and treatment professionals. Assuring consistent, quantitative measurements across the many ultrasound systems in a typical imaging department will constitute a major step toward realizing the great potential of this technique and other quantitative imaging. This session will target these two goals with two presentations. A. Basics and Current Implementations of Ultrasound Imaging of Shear Wave Speed and Elasticity - Shigao Chen, Ph.D. Learning objectives, to understand: Introduction (importance of tissue elasticity measurement; strain vs. shear wave elastography (SWE) and beneficial features of SWE; the link between shear wave speed and material properties, influence of viscosity); Generation of shear waves (external vibration (Fibroscan); ultrasound radiation force: point push, supersonic push (Aixplorer), comb push (GE Logiq E9)); Detection of shear waves (motion detection from pulse-echo ultrasound; importance of frame rate for shear wave imaging; plane wave imaging detection; how to achieve a high effective frame rate using line-by-line scanners); Shear wave speed calculation (time to peak; random sample consensus (RANSAC); cross correlation); Sources of bias and variation in SWE (tissue viscosity; transducer compression or internal pressure of the organ; reflection of shear waves at boundaries). B. Elasticity Imaging System Biomarker Qualification and User Testing of Systems - Brian Garra, M.D. Learning objectives, to understand: Goals (review the need for quantitative medical imaging; provide examples of quantitative imaging biomarkers; acquaint the participant with the purpose of the RSNA Quantitative Imaging Biomarker Alliance (QIBA) and the need for such an organization; review the QIBA process for creating a quantitative biomarker; summarize steps needed to verify adherence of site, operators, and imaging systems to a QIBA profile); Underlying premise and assumptions (objective, quantifiable results are needed to enhance the value of diagnostic imaging in clinical practice); Reasons for quantification (evidence-based medicine requires objective, not subjective, observer data; computerized decision support tools (e.g., CAD) generally require quantitative input; quantitative, reproducible measures are more easily used to develop personalized molecular medical diagnostic and treatment systems); What is quantitative imaging? (definition from the Imaging Metrology Workshop); The Quantitative Imaging Biomarker Alliance (formation in 2008, mission, structure, example imaging biomarkers being explored, biomarker selection, groundwork, draft protocol for imaging and data evaluation, QIBA profile drafting, equipment and site validation: technical, clinical; site and equipment QA and compliance checking); Ultrasound elasticity estimation biomarker (US elasticity estimation background, current status and problems; biomarker selection: process and outcome); US SWS for liver fibrosis biomarker work (groundwork; literature search and analysis results; phase I phantom testing with elastic phantoms; phase II phantom testing with viscoelastic phantoms; digital simulated data); Protocol and profile drafting (protocol based on UPICT and existing literature and standards bodies' protocols; profile: current claims, manufacturer-specific appendices); What comes after the profile (profile validation: technical validation, clinical validation; QA and compliance, possible approaches: site operator testing, site protocol re-evaluation; imaging system: manufacturer testing and attestation, user acceptance testing and periodic QA, phantom tests, digital-phantom-based testing, standard QA testing, remediation schemes; profile evolution: towards additional applications, towards higher accuracy and precision). Supported in part by NIH contract HHSN268201300071C from NIBIB. Collaboration with GE Global Research, no personal support. S. Chen: some technologies described in this presentation have been licensed; Mayo Clinic and Dr. Chen have financial interests in these technologies.

  20. TH-A-207B-01: Basics and Current Implementations of Ultrasound Imaging of Shear Wave Speed and Elasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, S.

    Imaging of tissue elastic properties is a relatively new and powerful approach to one of the oldest and most important diagnostic tools. Imaging of shear wave speed with ultrasound has been added to most high-end ultrasound systems. Understanding this exciting imaging mode and aiding its most effective use in medicine can be a rewarding effort for medical physicists and other medical imaging and treatment professionals. Assuring consistent, quantitative measurements across the many ultrasound systems in a typical imaging department will constitute a major step toward realizing the great potential of this technique and other quantitative imaging. This session will target these two goals with two presentations. A. Basics and Current Implementations of Ultrasound Imaging of Shear Wave Speed and Elasticity - Shigao Chen, Ph.D. Learning objectives, to understand: Introduction (importance of tissue elasticity measurement; strain vs. shear wave elastography (SWE) and beneficial features of SWE; the link between shear wave speed and material properties, influence of viscosity); Generation of shear waves (external vibration (Fibroscan); ultrasound radiation force: point push, supersonic push (Aixplorer), comb push (GE Logiq E9)); Detection of shear waves (motion detection from pulse-echo ultrasound; importance of frame rate for shear wave imaging; plane wave imaging detection; how to achieve a high effective frame rate using line-by-line scanners); Shear wave speed calculation (time to peak; random sample consensus (RANSAC); cross correlation); Sources of bias and variation in SWE (tissue viscosity; transducer compression or internal pressure of the organ; reflection of shear waves at boundaries). B. Elasticity Imaging System Biomarker Qualification and User Testing of Systems - Brian Garra, M.D. Learning objectives, to understand: Goals (review the need for quantitative medical imaging; provide examples of quantitative imaging biomarkers; acquaint the participant with the purpose of the RSNA Quantitative Imaging Biomarker Alliance (QIBA) and the need for such an organization; review the QIBA process for creating a quantitative biomarker; summarize steps needed to verify adherence of site, operators, and imaging systems to a QIBA profile); Underlying premise and assumptions (objective, quantifiable results are needed to enhance the value of diagnostic imaging in clinical practice); Reasons for quantification (evidence-based medicine requires objective, not subjective, observer data; computerized decision support tools (e.g., CAD) generally require quantitative input; quantitative, reproducible measures are more easily used to develop personalized molecular medical diagnostic and treatment systems); What is quantitative imaging? (definition from the Imaging Metrology Workshop); The Quantitative Imaging Biomarker Alliance (formation in 2008, mission, structure, example imaging biomarkers being explored, biomarker selection, groundwork, draft protocol for imaging and data evaluation, QIBA profile drafting, equipment and site validation: technical, clinical; site and equipment QA and compliance checking); Ultrasound elasticity estimation biomarker (US elasticity estimation background, current status and problems; biomarker selection: process and outcome); US SWS for liver fibrosis biomarker work (groundwork; literature search and analysis results; phase I phantom testing with elastic phantoms; phase II phantom testing with viscoelastic phantoms; digital simulated data); Protocol and profile drafting (protocol based on UPICT and existing literature and standards bodies' protocols; profile: current claims, manufacturer-specific appendices); What comes after the profile (profile validation: technical validation, clinical validation; QA and compliance, possible approaches: site operator testing, site protocol re-evaluation; imaging system: manufacturer testing and attestation, user acceptance testing and periodic QA, phantom tests, digital-phantom-based testing, standard QA testing, remediation schemes; profile evolution: towards additional applications, towards higher accuracy and precision). Supported in part by NIH contract HHSN268201300071C from NIBIB. Collaboration with GE Global Research, no personal support. S. Chen: some technologies described in this presentation have been licensed; Mayo Clinic and Dr. Chen have financial interests in these technologies.

  1. TH-A-207B-02: QIBA Ultrasound Elasticity Imaging System Biomarker Qualification and User Testing of Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garra, B.

    Imaging of tissue elastic properties is a relatively new and powerful approach to one of the oldest and most important diagnostic tools. Imaging of shear wave speed with ultrasound has been added to most high-end ultrasound systems. Understanding this exciting imaging mode and aiding its most effective use in medicine can be a rewarding effort for medical physicists and other medical imaging and treatment professionals. Assuring consistent, quantitative measurements across the many ultrasound systems in a typical imaging department will constitute a major step toward realizing the great potential of this technique and other quantitative imaging. This session will target these two goals with two presentations. A. Basics and Current Implementations of Ultrasound Imaging of Shear Wave Speed and Elasticity - Shigao Chen, Ph.D. Learning objectives, to understand: Introduction (importance of tissue elasticity measurement; strain vs. shear wave elastography (SWE) and beneficial features of SWE; the link between shear wave speed and material properties, influence of viscosity); Generation of shear waves (external vibration (Fibroscan); ultrasound radiation force: point push, supersonic push (Aixplorer), comb push (GE Logiq E9)); Detection of shear waves (motion detection from pulse-echo ultrasound; importance of frame rate for shear wave imaging; plane wave imaging detection; how to achieve a high effective frame rate using line-by-line scanners); Shear wave speed calculation (time to peak; random sample consensus (RANSAC); cross correlation); Sources of bias and variation in SWE (tissue viscosity; transducer compression or internal pressure of the organ; reflection of shear waves at boundaries). B. Elasticity Imaging System Biomarker Qualification and User Testing of Systems - Brian Garra, M.D. Learning objectives, to understand: Goals (review the need for quantitative medical imaging; provide examples of quantitative imaging biomarkers; acquaint the participant with the purpose of the RSNA Quantitative Imaging Biomarker Alliance (QIBA) and the need for such an organization; review the QIBA process for creating a quantitative biomarker; summarize steps needed to verify adherence of site, operators, and imaging systems to a QIBA profile); Underlying premise and assumptions (objective, quantifiable results are needed to enhance the value of diagnostic imaging in clinical practice); Reasons for quantification (evidence-based medicine requires objective, not subjective, observer data; computerized decision support tools (e.g., CAD) generally require quantitative input; quantitative, reproducible measures are more easily used to develop personalized molecular medical diagnostic and treatment systems); What is quantitative imaging? (definition from the Imaging Metrology Workshop); The Quantitative Imaging Biomarker Alliance (formation in 2008, mission, structure, example imaging biomarkers being explored, biomarker selection, groundwork, draft protocol for imaging and data evaluation, QIBA profile drafting, equipment and site validation: technical, clinical; site and equipment QA and compliance checking); Ultrasound elasticity estimation biomarker (US elasticity estimation background, current status and problems; biomarker selection: process and outcome); US SWS for liver fibrosis biomarker work (groundwork; literature search and analysis results; phase I phantom testing with elastic phantoms; phase II phantom testing with viscoelastic phantoms; digital simulated data); Protocol and profile drafting (protocol based on UPICT and existing literature and standards bodies' protocols; profile: current claims, manufacturer-specific appendices); What comes after the profile (profile validation: technical validation, clinical validation; QA and compliance, possible approaches: site operator testing, site protocol re-evaluation; imaging system: manufacturer testing and attestation, user acceptance testing and periodic QA, phantom tests, digital-phantom-based testing, standard QA testing, remediation schemes; profile evolution: towards additional applications, towards higher accuracy and precision). Supported in part by NIH contract HHSN268201300071C from NIBIB. Collaboration with GE Global Research, no personal support. S. Chen: some technologies described in this presentation have been licensed; Mayo Clinic and Dr. Chen have financial interests in these technologies.

  2. Accurate object tracking system by integrating texture and depth cues

    NASA Astrophysics Data System (ADS)

    Chen, Ju-Chin; Lin, Yu-Hang

    2016-03-01

    A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminant texture information between the object and background data. Additionally, depth information, which is important to distinguish the object from a complicated background, is integrated. We propose two depth-based models that can complement texture information to cope with both appearance variations and background clutter. Moreover, to reduce the risk of drift, which increases for textureless depth templates, an update mechanism is proposed that selects more precise tracking results and avoids incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system provides the best success rate and more accurate tracking results than other well-known algorithms.

  3. Evaluation of image quality in terahertz pulsed imaging using test objects.

    PubMed

    Fitzgerald, A J; Berry, E; Miles, R E; Zinovev, N N; Smith, M A; Chamberlain, J M

    2002-11-07

    As with other imaging modalities, the performance of terahertz (THz) imaging systems is limited by factors of spatial resolution, contrast and noise. The purpose of this paper is to introduce test objects and image analysis methods to evaluate and compare THz image quality in a quantitative and objective way, so that alternative terahertz imaging system configurations and acquisition techniques can be compared, and the range of image parameters can be assessed. Two test objects were designed and manufactured, one to determine the modulation transfer function (MTF) and the other to derive the image signal-to-noise ratio (SNR) at a range of contrasts. As expected, the higher THz frequencies had larger MTFs and better spatial resolution, as determined by the spatial frequency at which the MTF dropped below the 20% threshold. Image SNR was compared for time-domain and frequency-domain image parameters, and time-delay-based images consistently demonstrated higher SNR than intensity-based parameters such as relative transmittance, because the latter are more strongly affected by sources of noise in the THz system such as laser fluctuations and detector shot noise.
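
    As a rough illustration of the MTF-based resolution criterion described above, the following sketch derives an MTF from a line-spread function and locates the 20% threshold. The Gaussian line-spread function, its width, and the sampling step are illustrative assumptions and do not represent the paper's test-object measurements.

    ```python
    import numpy as np

    # Synthetic line-spread function (LSF) standing in for a measured line
    # response of a test object; the width and sampling step are assumptions.
    dx = 0.05                           # spatial sampling in mm
    x = np.arange(-5, 5, dx)            # mm
    lsf = np.exp(-x**2 / (2 * 0.4**2))

    # MTF = magnitude of the Fourier transform of the LSF, normalised to 1 at DC.
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]
    freqs = np.fft.rfftfreq(lsf.size, d=dx)   # cycles / mm

    # Spatial resolution taken as the frequency where the MTF first drops below 20%.
    cutoff = freqs[np.argmax(mtf < 0.2)]
    print(f"20% MTF cutoff: {cutoff:.2f} cycles/mm")
    ```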

  4. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    PubMed

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by the combined use of multi-component simultaneous quantitative analysis and bioassay. The rhubarb dispensing granules were used as the model drug for demonstrative study. The ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneous quantitative determination of the 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model; blood-activating biopotency of different batches of rhubarb dispensing granules was determined based on an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and purgative biopotency and blood-activating biopotency. The results of multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood-activating biopotency among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood-activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality differences among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.
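
    A minimal sketch of the correlation step described above (component content versus biopotency) is shown below, assuming SciPy's Pearson correlation in place of SPSS; the per-batch values are invented placeholders, not the study's measurements.

    ```python
    from scipy.stats import pearsonr

    # Hypothetical per-batch data (10 batches): content of conjugated anthraquinone
    # glycosides (mg/g) and measured purgative biopotency (arbitrary units).
    glycoside_content = [12.1, 10.4, 15.3, 9.8, 14.0, 11.2, 13.5, 8.9, 12.8, 10.9]
    purgative_potency = [0.82, 0.70, 0.95, 0.66, 0.90, 0.75, 0.88, 0.60, 0.85, 0.72]

    # Pearson correlation between chemical content and biopotency across batches.
    r, p = pearsonr(glycoside_content, purgative_potency)
    print(f"Pearson r = {r:.3f}, P = {p:.4f}")
    ```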

  5. Designing a mixed methods study in primary care.

    PubMed

    Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V

    2004-01-01

    Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.

  6. Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes.

    PubMed

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning

    2015-08-27

    This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. The visual saliency scheme utilizing both color and depth cues is proposed to arouse the interests of the machine system for detecting unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from MRF are further refined by merging the labeled objects, which are spatially connected and have high correlation between color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. The experiments of object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications.

  7. Saliency-Guided Detection of Unknown Objects in RGB-D Indoor Scenes

    PubMed Central

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Xi, Ning

    2015-01-01

    This paper studies the problem of detecting unknown objects within indoor environments in an active and natural manner. The visual saliency scheme utilizing both color and depth cues is proposed to arouse the interests of the machine system for detecting unknown objects at salient positions in a 3D scene. The 3D points at the salient positions are selected as seed points for generating object hypotheses using the 3D shape. We perform multi-class labeling on a Markov random field (MRF) over the voxels of the 3D scene, combining cues from object hypotheses and 3D shape. The results from MRF are further refined by merging the labeled objects, which are spatially connected and have high correlation between color histograms. Quantitative and qualitative evaluations on two benchmark RGB-D datasets illustrate the advantages of the proposed method. The experiments of object detection and manipulation performed on a mobile manipulator validate its effectiveness and practicability in robotic applications. PMID:26343656

  8. Application of remote sensing to monitoring and studying dispersion in ocean dumping

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Ohlhorst, C. W.

    1981-01-01

    Remotely sensed wide area synoptic data provides information on ocean dumping that is not readily available by other means. A qualitative approach has been used to map features, such as river plumes. Results of quantitative analyses have been used to develop maps showing quantitative distributions of one or more water quality parameters, such as suspended solids or chlorophyll a. Joint NASA/NOAA experiments have been conducted at designated dump areas in the U.S. coastal zones to determine the applicability of aircraft remote sensing systems to map plumes resulting from ocean dumping of sewage sludge and industrial wastes. A second objective is related to the evaluation of previously developed quantitative analysis techniques for studying dispersion of materials in these plumes. It was found that plumes resulting from dumping of four waste materials have distinctive spectral characteristics. The development of a technology for use in a routine monitoring system, based on remote sensing techniques, is discussed.

  9. An Empirical Study of the Application of Decision Making Model Using Judgement in the Allocation of Resources to Competing Educational Programs. Final Report.

    ERIC Educational Resources Information Center

    Tuscher, Leroy J.

    The purpose of the study was to provide "baseline" data for determining the feasibility of further investigation into the use of quantitative judgmental data in evaluating school programs for determining program budget allocations. The specific objectives were to: 1) Apply a Cost-Utility Model to a "real world" situation in a public secondary…

  10. Regional fringe analysis for improving depth measurement in phase-shifting fringe projection profilometry

    NASA Astrophysics Data System (ADS)

    Chien, Kuang-Che Chang; Tu, Han-Yen; Hsieh, Ching-Huang; Cheng, Chau-Jern; Chang, Chun-Yen

    2018-01-01

    This study proposes a regional fringe analysis (RFA) method to detect the regions of a target object in captured shifted images to improve depth measurement in phase-shifting fringe projection profilometry (PS-FPP). In the RFA method, region-based segmentation is exploited to segment the de-fringed image of a target object, and a multi-level fuzzy-based classification with five presented features is used to analyze and discriminate the regions of an object from the segmented regions, which were associated with explicit fringe information. Then, in the experiment, the performance of the proposed method is tested and evaluated on 26 test cases made of five types of materials. The qualitative and quantitative results demonstrate that the proposed RFA method can effectively detect the desired regions of an object to improve depth measurement in the PS-FPP system.
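
    For readers unfamiliar with PS-FPP, the sketch below shows the standard four-step phase-shifting relation used to recover the wrapped phase from shifted fringe images. This is generic background rather than the authors' exact algorithm, and the intensity values are synthetic.

    ```python
    import numpy as np

    def four_step_phase(i1, i2, i3, i4):
        """Wrapped phase from four fringe images shifted by 0, pi/2, pi, 3*pi/2.
        Standard phase-shifting relation: phi = atan2(I4 - I2, I1 - I3)."""
        return np.arctan2(i4 - i2, i1 - i3)

    # Synthetic check on a single pixel with known phase phi0 = 1.0 rad.
    phi0, a, b = 1.0, 0.5, 0.4          # assumed background and modulation
    shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
    i1, i2, i3, i4 = [a + b * np.cos(phi0 + s) for s in shifts]
    print(four_step_phase(np.array(i1), np.array(i2), np.array(i3), np.array(i4)))  # ~1.0
    ```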

  11. Radiographic evaluation of nasal septal deviation from computed tomography correlates poorly with physical exam findings.

    PubMed

    Sedaghat, Ahmad R; Kieff, David A; Bergmark, Regan W; Cunnane, Mary E; Busaba, Nicolas Y

    2015-03-01

    Performance of septoplasty is dependent on objective evidence of nasal septal deviation. Although physical examination including anterior rhinoscopy and endoscopic examination is the gold standard for evaluation of septal deviation, third-party payors' reviews of septoplasty claims are often made on computed tomography (CT) findings. However, the correlation between radiographic evaluation of septal deviation with physical examination findings is unknown. Retrospective, blinded, independent evaluation of septal deviation in 39 consecutive patients from physical examination, including anterior rhinoscopy and endoscopic examination, by an otolaryngologist and radiographic evaluation of sinus CT scan by a neuroradiologist. Four distinct septal locations (nasal valve, cartilaginous, inferior/maxillary crest and osseous septum) were evaluated on a 4-point scale representing (1) 0% to 25%, (2) >25% to 50%, (3) >50% to 75%, and (4) >75% obstruction. Correlation between physical examination and radiographic evaluations was made by Pearson's correlation and quantitative agreement assessed by Krippendorf's alpha. Statistically significant correlation was detected between physical examination including nasal endoscopy and radiographic assessment of septal deviation only at the osseous septum (p = 0.007, r = 0.425) with low quantitative agreement (α = 0.290). No significant correlation was detected at the cartilaginous septum (p = 0.286, r = 0.175), inferior septum (p = 0.117, r = 0.255), or nasal valve (p = 0.174, r = 0.222). Quantitative agreement at the nasal valve suggested a bias in CT to underestimate physical exam findings (α = -0.490). CT is a poor substitute for physical examination, the gold standard, in assessment of septal deviation. Clinical decisions about pursuit of septoplasty or third-party payors' decisions to approve septoplasty should not be made on radiographic evidence. © 2014 ARS-AAOA, LLC.

  12. Quantitative Market Research Regarding Funding of District 8 Construction Projects

    DOT National Transportation Integrated Search

    1995-05-01

    The primary objective of this quantitative research is to provide information : for more effective decision making regarding the level of investment in various : transportation systems in District 8. : This objective was accomplished by establishing ...

  13. Integrating service development with evaluation in telehealthcare: an ethnographic study

    PubMed Central

    Finch, Tracy; May, Carl; Mair, Frances; Mort, Maggie; Gask, Linda

    2003-01-01

    Objectives To identify issues that facilitate the successful integration of evaluation and development of telehealthcare services. Design Ethnographic study using various qualitative research techniques to obtain data from several sources, including in-depth semistructured interviews, project steering group meetings, and public telehealthcare meetings. Setting Seven telehealthcare evaluation projects (four randomised controlled trials and three pragmatic service evaluations) in the United Kingdom, studied over two years. Projects spanned a range of specialties—dermatology, psychiatry, respiratory medicine, cardiology, and oncology. Participants Clinicians, managers, technical experts, and researchers involved in the projects. Results and discussion Key problems in successfully integrating evaluation and service development in telehealthcare are, firstly, defining existing clinical practices (and anticipating changes) in ways that permit measurement; secondly, managing additional workload and conflicting responsibilities brought about by combining clinical and research responsibilities (including managing risk); and, thirdly, understanding various perspectives on effectiveness and the limitations of evaluation results beyond the context of the research study. Conclusions Combined implementation and evaluation of telehealthcare systems is complex, and is often underestimated. The distinction between quantitative outcomes and the workability of the system is important for producing evaluative knowledge that is of practical value. More pragmatic approaches to evaluation, that permit both quantitative and qualitative methods, are required to improve the quality of such research and its relevance for service provision in the NHS. PMID:14630758

  14. A clustering approach to segmenting users of internet-based risk calculators.

    PubMed

    Harle, C A; Downs, J S; Padman, R

    2011-01-01

    Risk calculators are widely available Internet applications that deliver quantitative health risk estimates to consumers. Although these tools are known to have varying effects on risk perceptions, little is known about who will be more likely to accept objective risk estimates. To identify clusters of online health consumers that help explain variation in individual improvement in risk perceptions from web-based quantitative disease risk information. A secondary analysis was performed on data collected in a field experiment that measured people's pre-diabetes risk perceptions before and after visiting a realistic health promotion website that provided quantitative risk information. K-means clustering was performed on numerous candidate variable sets, and the different segmentations were evaluated based on between-cluster variation in risk perception improvement. Variation in responses to risk information was best explained by clustering on pre-intervention absolute pre-diabetes risk perceptions and an objective estimate of personal risk. Members of a high-risk overestimator cluster showed large improvements in their risk perceptions, but clusters of both moderate-risk and high-risk underestimators were much more muted in improving their optimistically biased perceptions. Cluster analysis provided a unique approach for segmenting health consumers and predicting their acceptance of quantitative disease risk information. These clusters suggest that health consumers were very responsive to good news, but tended not to incorporate bad news into their self-perceptions much. These findings help to quantify variation among online health consumers and may inform the targeted marketing of and improvements to risk communication tools on the Internet.
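
    A minimal sketch of the segmentation idea described above, assuming scikit-learn's KMeans and invented placeholder data for perceived risk, objective risk, and post-intervention improvement; it is not the study's analysis pipeline.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical features per user: perceived pre-diabetes risk (0-100) before
    # the intervention and an objective risk estimate (0-100).
    X = np.array([[80, 30], [75, 35], [20, 60], [25, 65], [70, 72], [65, 70],
                  [85, 28], [30, 58], [60, 75], [22, 62], [78, 33], [68, 71]])
    improvement = np.array([25, 22, 4, 6, 5, 7, 28, 3, 6, 5, 24, 8])  # placeholder change scores

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

    # Compare mean improvement in risk perception across clusters.
    for c in range(3):
        print(c, improvement[km.labels_ == c].mean().round(1))
    ```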

  15. A flexible skin piloerection monitoring sensor

    NASA Astrophysics Data System (ADS)

    Kim, Jaemin; Seo, Dae Geon; Cho, Young-Ho

    2014-06-01

    We have designed, fabricated, and tested a capacitive-type flexible micro sensor for measurement of human skin piloerection arising from sudden emotional and environmental changes. Existing skin piloerection monitoring methods are limited for objective and quantitative measurement because the bulky size and heavy weight of the measuring devices physically disturb and stimulate the skin. The proposed flexible skin piloerection monitoring sensor is composed of a 3 × 3 spiral coplanar capacitor array made of conductive polymer, providing high capacitance density and a thickness small enough for attachment to human skin. The performance of the skin piloerection monitoring sensor is characterized using an artificial bump representing a human skin goosebump, resulting in a sensitivity of -0.00252%/μm and a nonlinearity of 25.9% for artificial goosebump deformation in the range of 0-326 μm. We also verified successive human skin piloerection of 3.5 s duration on the subject's dorsal forearms, resulting in capacitance changes of -6.2 fF and -9.2 fF for piloerection intensities of 145 μm and 194 μm, respectively. It is demonstrated experimentally that the proposed sensor is capable of measuring human skin piloerection objectively and quantitatively, thereby suggesting a quantitative evaluation method for qualitative human emotional status in cognitive human-machine interface applications.
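
    A small worked example of what the reported sensitivity implies over the tested deformation range; treating the response as linear is only an approximation given the ~26% nonlinearity reported above.

    ```python
    # Reported sensitivity of the sensor and the largest artificial-bump deformation.
    sensitivity_pct_per_um = -0.00252   # % capacitance change per micrometre
    deformation_um = 326.0

    # Expected fractional capacitance change over the full deformation range,
    # assuming an approximately linear response.
    total_change_pct = sensitivity_pct_per_um * deformation_um
    print(f"Predicted capacitance change: {total_change_pct:.2f} %")   # about -0.82 %
    ```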

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pavlickova, Katarina; Vyskupova, Monika, E-mail: vyskupova@fns.uniba.sk

    Cumulative environmental impact assessment is only occasionally used in the practical application of the environmental impact assessment process. The main reasons are the difficulty of cumulative impact identification caused by lack of data, the inability to measure the intensity and spatial effect of all types of impacts, and the uncertainty of their future evolution. This work presents a method proposal to predict cumulative impacts on the basis of landscape vulnerability evaluation. For this purpose, qualitative assessment of landscape ecological stability is conducted and major vulnerability indicators of environmental and socio-economic receptors are specified and valuated. Potential cumulative impacts and the overall impact significance are predicted quantitatively in modified Argonne multiple matrices while considering the vulnerability of affected landscape receptors and the significance of impacts identified individually. The method was employed in a concrete environmental impact assessment process conducted in Slovakia. The results obtained in this case study reflect that this methodology is simple to apply, valid for all types of impacts and projects, inexpensive and not time-consuming. The objectivity of the partial methods used in this procedure is improved by quantitative landscape ecological stability evaluation, assignment of weights to vulnerability indicators based on the detailed characteristics of affected factors, and grading impact significance. - Highlights: • This paper suggests a method proposal for cumulative impact prediction. • The method includes landscape vulnerability evaluation. • The vulnerability of affected receptors is determined by their sensitivity. • This method can increase the objectivity of impact prediction in the EIA process.

  17. From partnerships to networks: new approaches for measuring U.S. National Heritage Area effectiveness.

    PubMed

    Laven, Daniel N; Krymkowski, Daniel H; Ventriss, Curtis L; Manning, Robert E; Mitchell, Nora J

    2010-08-01

    National Heritage Areas (NHAs) are an alternative and increasingly popular form of protected area management in the United States. NHAs seek to integrate environmental objectives with community and economic objectives at regional or landscape scales. NHA designations have increased rapidly in the last 20 years, generating a substantial need for evaluative information about (a) how NHAs work; (b) outcomes associated with the NHA process; and (c) the costs and benefits of investing public moneys into the NHA approach. Qualitative evaluation studies recently conducted at three NHAs have identified the importance of understanding network structure and function in the context of evaluating NHA management effectiveness. This article extends these case studies by examining quantitative network data from each of the sites. The authors analyze these data using both a descriptive approach and a statistically more robust approach known as exponential random graph modeling. Study findings indicate the presence of transitive structures and the absence of three-cycle structures in each of these networks. This suggests that these networks are relatively "open," which may be desirable, given the uncertainty of the environments in which they operate. These findings also suggest, at least at the sites reported here, that the NHA approach may be an effective way to activate and develop networks of intersectoral organizational partners. Finally, this study demonstrates the utility of using quantitative network analysis to better understand the effectiveness of protected area management models that rely on partnership networks to achieve their intended outcomes.

  18. Quantitative light-induced fluorescence technology for quantitative evaluation of tooth wear

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Kyeom; Lee, Hyung-Suk; Park, Seok-Woo; Lee, Eun-Song; de Josselin de Jong, Elbert; Jung, Hoi-In; Kim, Baek-Il

    2017-12-01

    Various technologies to objectively determine enamel thickness or dentin exposure have been suggested; however, most methods have clinical limitations. This study was conducted to confirm the potential of quantitative light-induced fluorescence (QLF), using the autofluorescence intensity of the occlusal surfaces of worn teeth as a function of enamel grinding depth in vitro. Sixteen permanent premolars were used. Each tooth was gradually ground down at the occlusal surface in the apical direction. QLF-digital and swept-source optical coherence tomography images were acquired at each grinding depth (in steps of 100 μm). All QLF images were converted to 8-bit grayscale images to calculate the fluorescence intensity. The maximum brightness (MB) values of the same sound region in the grayscale images were calculated before grinding and after each grinding step. Finally, 13 samples were evaluated. MB increased with grinding depth, showing a strong correlation (r=0.994, P<0.001). In conclusion, the fluorescence intensity of the teeth and the grinding depth were strongly correlated in the QLF images. Therefore, QLF technology may be a useful noninvasive tool to monitor the progression of tooth wear and to conveniently estimate enamel thickness.
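
    The relation between grinding depth and maximum brightness can be summarized with an ordinary linear regression, as sketched below; the depth and MB values are invented placeholders, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import linregress

    # Hypothetical measurements: grinding depth (um) and maximum brightness (MB)
    # of the same sound region in the 8-bit grayscale QLF images.
    depth_um = np.array([0, 100, 200, 300, 400, 500])
    mb = np.array([61, 74, 88, 101, 113, 127])

    # Linear fit of MB against grinding depth, reporting slope and correlation.
    fit = linregress(depth_um, mb)
    print(f"slope = {fit.slope:.3f} MB/um, r = {fit.rvalue:.3f}")
    ```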

  19. Quantitative assessment of upper extremities motor function in multiple sclerosis.

    PubMed

    Daunoraviciene, Kristina; Ziziene, Jurgita; Griskevicius, Julius; Pauk, Jolanta; Ovcinikova, Agne; Kizlaitiene, Rasa; Kaubrys, Gintaras

    2018-05-18

    Upper extremity (UE) motor function deficits are commonly noted in multiple sclerosis (MS) patients, and assessing them is challenging because of the lack of consensus regarding their definition. Instrumented biomechanical analysis of upper extremity movements can quantify coordination with different spatiotemporal measures and facilitate disability rating in MS patients. To identify objective quantitative parameters for more accurate evaluation of UE disability and relate them to existing clinical scores. Thirty-four MS patients and 24 healthy controls (CG) performed a finger-to-nose test as fast as possible and, in addition to clinical evaluation, kinematic parameters of the UE were measured using inertial sensors. Generally, a higher disability score was associated with an increase in several temporal parameters, i.e. slower task performance. The time taken to touch the nose was longer when the task was performed with eyes closed. Time to peak angular velocity changed significantly in MS patients (EDSS > 5.0). Inter-joint coordination decreased significantly in MS patients (EDSS 3.0-5.5). Spatial parameters indicated that the maximal ROM changes were in elbow flexion. Our findings reveal that spatiotemporal parameters are related to UE motor function and MS disability level. Moreover, they facilitate clinical rating by supporting clinical decisions with quantitative data.

  20. Carotid lesion characterization by synthetic-aperture-imaging techniques with multioffset ultrasonic probes

    NASA Astrophysics Data System (ADS)

    Capineri, Lorenzo; Castellini, Guido; Masotti, Leonardo F.; Rocchi, Santina

    1992-06-01

    This paper explores the application of a high-resolution imaging technique to vascular ultrasound diagnosis, with emphasis on investigation of the carotid vessel. With present diagnostic systems, it is difficult to measure quantitatively the extent of lesions and to characterize the tissue; quantitative images require sufficient spatial resolution and dynamic range to reveal fine high-risk pathologies. A broadband synthetic aperture technique with multi-offset probes is developed to improve lesion characterization by the evaluation of local scattering parameters. This technique works with weak scatterers embedded in a constant-velocity medium, a large aperture, and isotropic sources and receivers. The features of this technique are: axial and lateral spatial resolution of the order of the wavelength, high dynamic range, quantitative measurements of the size and scattering intensity of the inhomogeneities, and the capability to investigate inclined layers. The performance under realistic conditions is evaluated with a software simulator in which different experimental situations can be reproduced. Images of simulated anatomic test objects are presented. The images are obtained with an inversion process of the synthesized ultrasonic signals, collected on the linear aperture by a limited number of finite-size transducers.

  1. Using confidence intervals to evaluate the focus alignment of spectrograph detector arrays.

    PubMed

    Sawyer, Travis W; Hawkins, Kyle S; Damento, Michael

    2017-06-20

    High-resolution spectrographs extract detailed spectral information of a sample and are frequently used in astronomy, laser-induced breakdown spectroscopy, and Raman spectroscopy. These instruments employ dispersive elements such as prisms and diffraction gratings to spatially separate different wavelengths of light, which are then detected by a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) detector array. Precise alignment along the optical axis (focus position) of the detector array is critical to maximize the instrumental resolution; however, traditional approaches of scanning the detector through focus lack a quantitative measure of precision, limiting the repeatability and relying on one's experience. Here we propose a method to evaluate the focus alignment of spectrograph detector arrays by establishing confidence intervals to measure the alignment precision. We show that propagation of uncertainty can be used to estimate the variance in an alignment, thus providing a quantitative and repeatable means to evaluate the precision and confidence of an alignment. We test the approach by aligning the detector array of a prototype miniature echelle spectrograph. The results indicate that the procedure effectively quantifies alignment precision, enabling one to objectively determine when an alignment has reached an acceptable level. This quantitative approach also provides a foundation for further optimization, including automated alignment. Furthermore, the procedure introduced here can be extended to other alignment techniques that rely on numerically fitting data to a model, providing a general framework for evaluating the precision of alignment methods.
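
    A minimal sketch of the idea described above: fit a parabola to a through-focus scan of a focus metric and propagate the fit covariance to a confidence interval on the best-focus position. The focus metric, positions, and values are illustrative assumptions, not the instrument's data, and the parabola-plus-propagation recipe is a generic stand-in for the authors' procedure.

    ```python
    import numpy as np

    # Hypothetical through-focus scan: detector position (mm) vs. a focus metric
    # (e.g. mean FWHM of spectral lines, smaller is better).
    z = np.array([-0.30, -0.20, -0.10, 0.00, 0.10, 0.20, 0.30])
    fwhm = np.array([4.10, 3.42, 2.95, 2.81, 2.93, 3.45, 4.08])

    # Least-squares parabola fit y = a z^2 + b z + c; cov=True returns the
    # parameter covariance needed for uncertainty propagation.
    (a, b, c), cov = np.polyfit(z, fwhm, 2, cov=True)
    z_best = -b / (2 * a)                       # best-focus position

    # Propagate parameter uncertainty to z_best via the Jacobian of -b/(2a).
    J = np.array([b / (2 * a**2), -1 / (2 * a), 0.0])
    var_z = J @ cov @ J
    ci_95 = 1.96 * np.sqrt(var_z)               # approximate 95% confidence interval
    print(f"best focus = {z_best:+.4f} mm  (95% CI +/- {ci_95:.4f} mm)")
    ```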

  2. Quantitative analysis of comparative genomic hybridization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manoir, S. du; Bentz, M.; Joos, S.

    1995-01-01

    Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.
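
    A minimal numerical sketch of the profile-averaging and CVBS-style quality check described above, with invented ratio profiles standing in for real CGH measurements.

    ```python
    import numpy as np

    # Hypothetical test/control fluorescence ratio profiles along one chromosome,
    # extracted from several metaphase spreads (one row per spread).
    profiles = np.array([
        [1.02, 0.98, 1.05, 1.50, 1.48, 1.01],
        [0.97, 1.03, 0.99, 1.46, 1.52, 0.98],
        [1.01, 1.00, 1.04, 1.44, 1.49, 1.03],
    ])

    mean_profile = profiles.mean(axis=0)          # averaged ratio profile

    # Coefficient of variation over bins assumed to be in balanced state (first 3),
    # analogous to the CVBS quality criterion described above.
    balanced = profiles[:, :3].ravel()
    cvbs = balanced.std(ddof=1) / balanced.mean()
    print(mean_profile.round(2), f"CVBS = {cvbs:.3f}")

    # A gain might be flagged where the averaged ratio exceeds a chosen threshold.
    print(mean_profile > 1.25)
    ```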

  3. A Novel Pretreatment-Free Duplex Chamber Digital PCR Detection System for the Absolute Quantitation of GMO Samples.

    PubMed

    Zhu, Pengyu; Wang, Chenguang; Huang, Kunlun; Luo, Yunbo; Xu, Wentao

    2016-03-18

    Digital polymerase chain reaction (PCR) has developed rapidly since it was first reported in the 1990s. However, pretreatments are often required during preparation for digital PCR, which can increase operation error. The single-plex amplification of both the target and reference genes may cause uncertainties due to the different reaction volumes and the matrix effect. In the current study, a quantitative detection system based on the pretreatment-free duplex chamber digital PCR was developed. The dynamic range, limit of quantitation (LOQ), sensitivity and specificity were evaluated taking the GA21 event as the experimental object. Moreover, to determine the factors that may influence the stability of the duplex system, we evaluated whether the pretreatments, the primary and secondary structures of the probes and the SNP effect influence the detection. The results showed that the LOQ was 0.5% and the sensitivity was 0.1%. We also found that genome digestion and single nucleotide polymorphism (SNP) sites affect the detection results, whereas the unspecific hybridization within different probes had little side effect. This indicated that the detection system was suited for both chamber-based and droplet-based digital PCR. In conclusion, we have provided a simple and flexible way of achieving absolute quantitation for genetically modified organism (GMO) genome samples using commercial digital PCR detection systems.
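
    The absolute quantitation underlying chamber or droplet digital PCR follows the standard Poisson correction for partitions containing more than one copy. The sketch below shows that calculation with invented partition counts and an assumed partition volume; it is generic digital-PCR arithmetic, not the authors' specific workflow.

    ```python
    import math

    def dpcr_copies_per_ul(positive, total, partition_volume_nl):
        """Absolute target concentration from digital PCR partition counts,
        using the standard Poisson correction lambda = -ln(1 - p)."""
        p = positive / total
        lam = -math.log(1.0 - p)                   # mean copies per partition
        return lam / (partition_volume_nl * 1e-3)  # copies per microlitre

    # Illustrative counts (not from the study): 1300 positive of 20000 partitions
    # of 0.85 nL each.
    print(f"{dpcr_copies_per_ul(1300, 20000, 0.85):.1f} copies/uL")
    ```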

  4. A Novel Pretreatment-Free Duplex Chamber Digital PCR Detection System for the Absolute Quantitation of GMO Samples

    PubMed Central

    Zhu, Pengyu; Wang, Chenguang; Huang, Kunlun; Luo, Yunbo; Xu, Wentao

    2016-01-01

    Digital polymerase chain reaction (PCR) has developed rapidly since it was first reported in the 1990s. However, pretreatments are often required during preparation for digital PCR, which can increase operation error. The single-plex amplification of both the target and reference genes may cause uncertainties due to the different reaction volumes and the matrix effect. In the current study, a quantitative detection system based on the pretreatment-free duplex chamber digital PCR was developed. The dynamic range, limit of quantitation (LOQ), sensitivity and specificity were evaluated taking the GA21 event as the experimental object. Moreover, to determine the factors that may influence the stability of the duplex system, we evaluated whether the pretreatments, the primary and secondary structures of the probes and the SNP effect influence the detection. The results showed that the LOQ was 0.5% and the sensitivity was 0.1%. We also found that genome digestion and single nucleotide polymorphism (SNP) sites affect the detection results, whereas the unspecific hybridization within different probes had little side effect. This indicated that the detection system was suited for both chamber-based and droplet-based digital PCR. In conclusion, we have provided a simple and flexible way of achieving absolute quantitation for genetically modified organism (GMO) genome samples using commercial digital PCR detection systems. PMID:26999129

  5. Evaluation of changes in periodontal bacteria in healthy dogs over 6 months using quantitative real-time PCR.

    PubMed

    Maruyama, N; Mori, A; Shono, S; Oda, H; Sako, T

    2018-03-01

    Porphyromonas gulae, Tannerella forsythia and Campylobacter rectus are considered dominant periodontal pathogens in dogs. Recently, quantitative real-time PCR (qRT-PCR) methods have been used for absolute quantitative determination of oral bacterial counts. The purpose of the present study was to establish a standardized qRT-PCR procedure to quantify bacterial counts of the three target periodontal bacteria (P. gulae, T. forsythia and C. rectus). Copy numbers of the three target periodontal bacteria were evaluated in 26 healthy dogs. Then, changes in bacterial counts of the three target periodontal bacteria were evaluated for 24 weeks in 7 healthy dogs after periodontal scaling. Analytical evaluation of each self-designed primer indicated acceptable analytical imprecision. All 26 healthy dogs were found to be positive for P. gulae, T. forsythia and C. rectus. Median total bacterial counts (copies/ng) of each target gene were 385.612 for P. gulae, 25.109 for T. forsythia and 5.771 for C. rectus. Significant differences were observed between the copy numbers of the three target periodontal bacteria. Periodontal scaling reduced the median copy numbers of the three target periodontal bacteria in the 7 healthy dogs. However, after periodontal scaling, copy numbers of all three periodontal bacteria significantly increased over time (p<0.05, Kruskal-Wallis test) over the 24 weeks. In conclusion, our results demonstrated that qRT-PCR can accurately measure periodontal bacteria in dogs. Furthermore, the present study revealed that the qRT-PCR method can be considered a new objective evaluation system for canine periodontal disease. Copyright© by the Polish Academy of Sciences.
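
    Absolute copy numbers in qRT-PCR are commonly obtained from a standard curve relating the quantification cycle (Cq) to log10 copy number. The sketch below shows that generic calculation, normalized per nanogram of input DNA; the slope, intercept, Cq and template amounts are illustrative assumptions, not values from this study.

    ```python
    def copies_from_cq(cq, slope, intercept, input_dna_ng):
        """Absolute bacterial load from a qPCR standard curve
        Cq = slope * log10(copies) + intercept, expressed as copies per ng DNA.
        Slope and intercept would come from a dilution series of the target standard."""
        copies = 10 ** ((cq - intercept) / slope)
        return copies / input_dna_ng

    # Illustrative values: slope -3.32 (~100% efficiency), intercept 38.5,
    # observed Cq 27.4, 10 ng template per reaction.
    print(f"{copies_from_cq(27.4, -3.32, 38.5, 10.0):.1f} copies/ng")
    ```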

  6. Quantitative 3D Ultrashort Time-to-Echo (UTE) MRI and Micro-CT (μCT) Evaluation of the Temporomandibular Joint (TMJ) Condylar Morphology

    PubMed Central

    Geiger, Daniel; Bae, Won C.; Statum, Sheronda; Du, Jiang; Chung, Christine B.

    2014-01-01

    Objective Temporomandibular dysfunction involves osteoarthritis of the TMJ, including degeneration and morphologic changes of the mandibular condyle. The purpose of this study was to determine the accuracy of novel 3D-UTE MRI versus micro-CT (μCT) for quantitative evaluation of mandibular condyle morphology. Material & Methods Nine TMJ condyle specimens were harvested from cadavers (2M, 3F; age 85 ± 10 yrs., mean±SD). 3D-UTE MRI (TR=50 ms, TE=0.05 ms, 104 μm isotropic voxel) was performed using a 3-T MR scanner, and μCT (18 μm isotropic voxel) was performed. The MR datasets were spatially registered with the μCT dataset. Two observers segmented the bony contours of the condyles. Fibrocartilage was segmented on the MR dataset. Using a custom program, bone and fibrocartilage surface coordinates, Gaussian curvature, volume of segmented regions and fibrocartilage thickness were determined for quantitative evaluation of joint morphology. Agreement between techniques (MRI vs. μCT) and observers (MRI vs. MRI) for Gaussian curvature, mean curvature and segmented volume of the bone was determined using intraclass correlation coefficient (ICC) analyses. Results Between MRI and μCT, the average deviation of surface coordinates was 0.19±0.15 mm, slightly higher than the spatial resolution of MRI. The average deviations of the Gaussian curvature and volume of segmented regions, from MRI to μCT, were 5.7±6.5% and 6.6±6.2%, respectively. ICC coefficients (MRI vs. μCT) for Gaussian curvature, mean curvature and segmented volumes were 0.892, 0.893 and 0.972, respectively. Between observers (MRI vs. MRI), the ICC coefficients were 0.998, 0.999 and 0.997, respectively. Fibrocartilage thickness was 0.55±0.11 mm, as previously described in the literature for grossly normal TMJ samples. Conclusion 3D-UTE MR quantitative evaluation of TMJ condyle morphology ex vivo, including surface, curvature and segmented volume, shows high correlation with μCT and between observers. In addition, UTE MRI allows quantitative evaluation of the fibrocartilaginous condylar component. PMID:24092237

  7. Learning Grasp Context Distinctions that Generalize

    NASA Technical Reports Server (NTRS)

    Platt, Robert; Grupen, Roderic A.; Fagg, Andrew H.

    2006-01-01

    Control-based approaches to grasp synthesis create grasping behavior by sequencing and combining control primitives. In the absence of any other structure, these approaches must evaluate a large number of feasible control sequences as a function of object shape, object pose, and task. This work explores a new approach to grasp synthesis that limits consideration to variations on a generalized localize-reach-grasp control policy. A new learning algorithm, known as schema structured learning, is used to learn which instantiations of the generalized policy are most likely to lead to a successful grasp in different problem contexts. Two experiments are described where Dexter, a bimanual upper torso, learns to select an appropriate grasp strategy as a function of object eccentricity and orientation. In addition, it is shown that grasp skills learned in this way can generalize to new objects. Results are presented showing that after learning how to grasp a small, representative set of objects, the robot's performance quantitatively improves for similar objects that it has not experienced before.

  8. Direct quantitative evaluation of disease symptoms on living plant leaves growing under natural light.

    PubMed

    Matsunaga, Tomoko M; Ogawa, Daisuke; Taguchi-Shiobara, Fumio; Ishimoto, Masao; Matsunaga, Sachihiro; Habu, Yoshiki

    2017-06-01

    Leaf color is an important indicator when evaluating plant growth and responses to biotic/abiotic stress. Acquisition of images by digital cameras allows analysis and long-term storage of the acquired images. However, under field conditions, where light intensity can fluctuate and other factors (shade, reflection, background, etc.) vary, stable and reproducible measurement and quantification of leaf color are hard to achieve. Digital scanners provide fixed conditions for obtaining image data, allowing stable and reliable comparison among samples, but require detached plant materials to capture images, and the destructive processes involved often induce deformation of plant materials (curled leaves, faded colors, etc.). In this study, by using a lightweight digital scanner connected to a mobile computer, we obtained digital image data from intact plant leaves grown in natural-light greenhouses without detaching the targets. We took images of soybean leaves infected by Xanthomonas campestris pv. glycines, and distinctly quantified two disease symptoms (brown lesions and yellow halos) using freely available image processing software. The image data were amenable to quantitative and statistical analyses, allowing precise and objective evaluation of disease resistance.

  9. Devising tissue ingrowth metrics: a contribution to the computational characterization of engineered soft tissue healing.

    PubMed

    Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin

    2018-03-14

    The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently questions the value of histopathologic methods in the evaluation of biological changes. To date, the available tools of evaluation are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of the regenerated tissues. For example, the metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators, the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy property of the collagen (maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell specific) or Picrosirius Red F3BA (collagen specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that, quantitatively, the healing tissue would be significantly delayed and of poor quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats with the assumption that modulation of the ingrown tissue would be quantitatively present in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach. When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
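
    Because the reported relation TIR% = CIR% + TCC% is a simple per-area sum, the computation can be sketched in a few lines; the pixel counts below are hypothetical, standing in for segmented areas from the two stained serial sections.

```python
def tissue_ingrowth_rate(cell_pixels, collagen_pixels, ingrowth_region_pixels):
    """Tissue ingrowth rate as the sum of the cell ingrowth rate (CIR%)
    and total collagen content (TCC%), each expressed as a percentage of
    the available ingrowth region (TIR% = CIR% + TCC%)."""
    cir = 100.0 * cell_pixels / ingrowth_region_pixels
    tcc = 100.0 * collagen_pixels / ingrowth_region_pixels
    return cir + tcc, cir, tcc

# Hypothetical pixel counts from the cell-specific and collagen-specific
# sections of one scaffold explant:
tir, cir, tcc = tissue_ingrowth_rate(18_500, 42_300, 120_000)
print(f"CIR = {cir:.1f}%, TCC = {tcc:.1f}%, TIR = {tir:.1f}%")
```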

  10. Generating One Biometric Feature from Another: Faces from Fingerprints

    PubMed Central

    Ozkaya, Necla; Sagiroglu, Seref

    2010-01-01

    This study presents a new approach based on artificial neural networks for generating one biometric feature (faces) from another (only fingerprints). An automatic and intelligent system was designed and developed to analyze the relationships among fingerprints and faces and also to model and to improve the existence of the relationships. The new proposed system is the first study that generates all parts of the face including eyebrows, eyes, nose, mouth, ears and face border from only fingerprints. It is also unique and different from similar studies recently presented in the literature with some superior features. The parameter settings of the system were achieved with the help of Taguchi experimental design technique. The performance and accuracy of the system have been evaluated with 10-fold cross validation technique using qualitative evaluation metrics in addition to the expanded quantitative evaluation metrics. Consequently, the results were presented on the basis of the combination of these objective and subjective metrics for illustrating the qualitative properties of the proposed methods as well as a quantitative evaluation of their performances. Experimental results have shown that one biometric feature can be determined from another. These results have once more indicated that there is a strong relationship between fingerprints and faces. PMID:22399877

  11. Analysis and Evaluation of Processes and Equipment in Tasks 2 and 4 of the Low-cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Wolf, M.

    1979-01-01

    To facilitate the task of objectively comparing competing process options, a methodology was needed for the quantitative evaluation of their relative cost effectiveness. Such a methodology was developed and is described, together with three examples for its application. The criterion for the evaluation is the cost of the energy produced by the system. The method permits the evaluation of competing design options for subsystems, based on the differences in cost and efficiency of the subsystems, assuming comparable reliability and service life, or of competing manufacturing process options for such subsystems, which include solar cells or modules. This process option analysis is based on differences in cost, yield, and conversion efficiency contribution of the process steps considered.

  12. [Classical and molecular methods for identification and quantification of domestic moulds].

    PubMed

    Fréalle, E; Bex, V; Reboux, G; Roussel, S; Bretagne, S

    2017-12-01

    To study the impact of the constant and inevitable inhalation of moulds, it is necessary to sample, identify and count the spores. Environmental sampling methods can be separated into three categories: surface sampling, which is easy to perform but non-quantitative; air sampling, which is easy to calibrate but provides time-limited information; and dust sampling, which is more representative of long-term exposure to moulds. The sampling strategy depends on the objectives (evaluation of the risk of exposure for individuals; quantification of the household contamination; evaluation of the efficacy of remediation). The mould colonies obtained in culture are identified using microscopy, Maldi-TOF, and/or DNA sequencing. Electrostatic dust collectors are an alternative to older methods for identifying and quantifying household mould spores. They are easy to use and relatively cheap. Colony counting should be progressively replaced by quantitative real-time PCR, which is already validated, while waiting for more standardised high-throughput sequencing methods for assessment of mould contamination without technical bias. Despite some technical recommendations for obtaining reliable and comparable results, the huge diversity of environmental moulds, the variable quantity of spores inhaled and the association with other allergens (mites, plants) make the evaluation of their impact on human health difficult. Hence there is a need for reliable and generally applicable quantitative methods. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.

  13. Forest management and economics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buongiorno, J.; Gilless, J.K.

    1987-01-01

    This volume provides a survey of quantitative methods, guiding the reader through formulation and analysis of models that address forest management problems. The authors use simple mathematics, graphics, and short computer programs to explain each method. Emphasizing applications, they discuss linear, integer, dynamic, and goal programming; simulation; network modeling; and econometrics, as these relate to problems of determining economic harvest schedules in even-aged and uneven-aged forests, the evaluation of forest policies, multiple-objective decision making, and more.
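
    As a toy illustration of the linear-programming formulations the book covers (not an example taken from it), the sketch below maximizes timber revenue over three hypothetical age classes subject to an aggregate harvest limit, using scipy's linear-programming solver.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: hectares available and net revenue per hectare
# for three age classes in an even-aged forest.
available = np.array([50.0, 60.0, 40.0])    # ha in each age class
revenue = np.array([300.0, 450.0, 600.0])   # $/ha if harvested this period

# Decision variables: hectares harvested in each age class.
# linprog minimizes, so maximize revenue by negating the objective.
c = -revenue

# One aggregate constraint: harvest at most 100 ha this period
# (a stand-in for mill capacity or an allowable-cut policy).
A_ub = [[1.0, 1.0, 1.0]]
b_ub = [100.0]

# Bounds: cannot harvest more than exists in each class.
bounds = [(0.0, a) for a in available]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("harvest plan (ha):", np.round(res.x, 1))
print("revenue ($):", round(-res.fun, 2))
```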

  14. Applications of Doppler ultrasound in clinical vascular disease

    NASA Technical Reports Server (NTRS)

    Barnes, R. W.; Hokanson, D. E.; Sumner, D. S.; Strandness, D. E., Jr.

    1975-01-01

    Doppler ultrasound has become the most useful and versatile noninvasive technique for objective evaluation of clinical vascular disease. Commercially available continuous-wave instruments provide qualitative and quantitative assessment of venous and arterial disease. Pulsed Doppler ultrasound was developed to provide longitudinal and transverse cross-sectional images of the arterial lumen with a resolution approaching that of conventional X-ray techniques. Application of Doppler ultrasound in venous, peripheral arterial, and cerebrovascular diseases is reviewed.

  15. Objective Evaluation of Visual Fatigue Using Binocular Fusion Maintenance.

    PubMed

    Hirota, Masakazu; Morimoto, Takeshi; Kanda, Hiroyuki; Endo, Takao; Miyoshi, Tomomitsu; Miyagawa, Suguru; Hirohara, Yoko; Yamaguchi, Tatsuo; Saika, Makoto; Fujikado, Takashi

    2018-03-01

    In this study, we investigated whether an individual's visual fatigue can be evaluated objectively and quantitatively from their ability to maintain binocular fusion. Binocular fusion maintenance (BFM) was measured using a custom-made binocular open-view Shack-Hartmann wavefront aberrometer equipped with liquid crystal shutters, wherein eye movements and wavefront aberrations were measured simultaneously. Transmittance in the liquid crystal shutter in front of the subject's nondominant eye was reduced linearly, and BFM was determined from the transmittance at the point when binocular fusion was broken and vergence eye movement was induced. In total, 40 healthy subjects underwent the BFM test and completed a questionnaire regarding subjective symptoms before and after a visual task lasting 30 minutes. BFM was significantly reduced after the visual task (P < 0.001) and was negatively correlated with the total subjective eye symptom score (adjusted R² = 0.752, P < 0.001). Furthermore, the diagnostic accuracy for visual fatigue was significantly higher for BFM than for the conventional test results (aggregated fusional vergence range, near point of convergence, and the high-frequency component of accommodative microfluctuations; P = 0.007). These results suggest that BFM can be used as an indicator for evaluating visual fatigue, and that it can objectively evaluate the visual fatigue caused by new visual devices, such as head-mounted displays.
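
    To make the adjusted R² statistic quoted above concrete, a hedged sketch with synthetic data (not the study's measurements) fits symptom score against BFM and applies the standard correction R²_adj = 1 − (1 − R²)(n − 1)/(n − p − 1).

```python
import numpy as np

def adjusted_r2(y, y_pred, n_predictors):
    """Adjusted coefficient of determination:
    R2_adj = 1 - (1 - R2) * (n - 1) / (n - p - 1)."""
    y, y_pred = np.asarray(y, float), np.asarray(y_pred, float)
    n = y.size
    ss_res = np.sum((y - y_pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)

# Synthetic example: BFM (%) vs. total subjective symptom score,
# with a negative relationship built in.
rng = np.random.default_rng(0)
bfm = rng.uniform(20, 90, size=40)
score = 60 - 0.5 * bfm + rng.normal(0, 5, size=40)

slope, intercept = np.polyfit(bfm, score, 1)          # simple linear fit
print("adjusted R^2:", round(adjusted_r2(score, slope * bfm + intercept, 1), 3))
```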

  16. Computer-Assisted Digital Image Analysis of Plus Disease in Retinopathy of Prematurity.

    PubMed

    Kemp, Pavlina S; VanderVeen, Deborah K

    2016-01-01

    The objective of this study is to review the current state and role of computer-assisted analysis in diagnosis of plus disease in retinopathy of prematurity. Diagnosis and documentation of retinopathy of prematurity are increasingly being supplemented by digital imaging. The incorporation of computer-aided techniques has the potential to add valuable information and standardization regarding the presence of plus disease, an important criterion in deciding the necessity of treatment of vision-threatening retinopathy of prematurity. A review of literature found that several techniques have been published examining the process and role of computer aided analysis of plus disease in retinopathy of prematurity. These techniques use semiautomated image analysis techniques to evaluate retinal vascular dilation and tortuosity, using calculated parameters to evaluate presence or absence of plus disease. These values are then compared with expert consensus. The study concludes that computer-aided image analysis has the potential to use quantitative and objective criteria to act as a supplemental tool in evaluating for plus disease in the setting of retinopathy of prematurity.
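
    The published systems differ in how they define dilation and tortuosity; purely as an illustration, one widely used geometric index is the arc-to-chord ratio of a traced vessel centerline, sketched below with a hypothetical trace.

```python
import numpy as np

def tortuosity_index(points):
    """Arc-chord tortuosity of a traced vessel segment: the ratio of the
    path length along the centerline to the straight-line (chord) distance
    between its endpoints. A perfectly straight vessel gives 1.0."""
    p = np.asarray(points, dtype=float)
    arc = np.sum(np.linalg.norm(np.diff(p, axis=0), axis=1))
    chord = np.linalg.norm(p[-1] - p[0])
    return arc / chord

# Hypothetical centerline (pixel coordinates) of one retinal vessel:
trace = [(0, 0), (5, 2), (10, -1), (15, 3), (20, 0)]
print(round(tortuosity_index(trace), 3))
```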

  17. Maintenance = reuse-oriented software development

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1989-01-01

    Maintenance is viewed as a reuse process. In this context, a set of models that can be used to support the maintenance process is discussed. A high level reuse framework is presented that characterizes the object of reuse, the process for adapting that object for its target application, and the reused object within its target application. Based upon this framework, a qualitative comparison is offered of the three maintenance process models with regard to their strengths and weaknesses and the circumstances in which they are appropriate. To provide a more systematic, quantitative approach for evaluating the appropriateness of the particular maintenance model, a measurement scheme is provided, based upon the reuse framework, in the form of an organized set of questions that need to be answered. To support the reuse perspective, a set of reuse enablers are discussed.

  18. An object tracking method based on guided filter for night fusion image

    NASA Astrophysics Data System (ADS)

    Qian, Xiaoyan; Wang, Yuedong; Han, Lei

    2016-01-01

    Online object tracking is a challenging problem as it entails learning an effective model to account for appearance change caused by intrinsic and extrinsic factors. In this paper, we propose a novel online object tracking method based on the guided image filter for accurate and robust tracking in night fusion images. First, frame differencing is applied to produce a coarse target, which helps to generate the observation models. Under the constraints of these models and the local source image, the guided filter generates a sufficiently accurate foreground target. Accurate target boundaries can then be extracted from the detection results. Finally, timely updating of the observation models helps to avoid tracking drift. Both qualitative and quantitative evaluations on challenging image sequences demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.
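
    The paper's full pipeline is not reproduced here, but the operation it builds on, the guided image filter, is compact enough to sketch. The example refines a hypothetical coarse foreground mask under the structure of a fused frame; the radius and regularization settings are illustrative, not the authors' values.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=8, eps=1e-3):
    """Edge-preserving guided filter: each output pixel is a local linear
    function of the guidance image, q = mean(a)*I + mean(b), with the
    coefficients a, b fitted in square windows of side 2*radius + 1."""
    size = 2 * radius + 1
    mean = lambda x: uniform_filter(x, size=size)

    mean_i, mean_p = mean(guide), mean(src)
    corr_ip, corr_ii = mean(guide * src), mean(guide * guide)

    cov_ip = corr_ip - mean_i * mean_p
    var_i = corr_ii - mean_i * mean_i

    a = cov_ip / (var_i + eps)           # local linear coefficients
    b = mean_p - a * mean_i
    return mean(a) * guide + mean(b)

# Hypothetical use: refine a frame-difference mask with the fused frame.
fused = np.random.rand(120, 160)                  # stand-in for a fusion image
coarse_mask = (np.random.rand(120, 160) > 0.97).astype(float)
refined = guided_filter(fused, coarse_mask, radius=8, eps=1e-3)
print(refined.shape, refined.min(), refined.max())
```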

  19. Imaging Performance of Quantitative Transmission Ultrasound

    PubMed Central

    Lenox, Mark W.; Wiskin, James; Lewis, Matthew A.; Darrouzet, Stephen; Borup, David; Hsieh, Scott

    2015-01-01

    Quantitative Transmission Ultrasound (QTUS) is a tomographic transmission ultrasound modality that is capable of generating 3D speed-of-sound maps of objects in the field of view. It performs this measurement by propagating a plane wave through the medium from a transmitter on one side of a water tank to a high resolution receiver on the opposite side. This information is then used via inverse scattering to compute a speed map. In addition, the presence of reflection transducers allows the creation of a high resolution, spatially compounded reflection map that is natively coregistered to the speed map. A prototype QTUS system was evaluated for measurement and geometric accuracy as well as for the ability to correctly determine speed of sound. PMID:26604918

  20. Breast Retraction Assessment: an objective evaluation of cosmetic results of patients treated conservatively for breast cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pezner, R.D.; Patterson, M.P.; Hill, L.R.

    Breast Retraction Assessment (BRA) is an objective evaluation of the amount of cosmetic retraction of the treated breast in comparison to the untreated breast in patients who receive conservative treatment for breast cancer. A clear acrylic sheet supported vertically and marked as a grid at 1 cm intervals is employed to perform the measurements. Average BRA value in 29 control patients without breast cancer was 1.2 cm. Average BRA value in 27 patients treated conservatively for clinical Stage I or II unilateral breast cancer was 3.7 cm. BRA values in breast cancer patients ranged from 0.0 to 8.5 cm. Patients who received a local radiation boost to the primary tumor bed site had statistically significantly less retraction than those who did not receive a boost. Patients who had an extensive primary tumor resection had statistically significantly more retraction than those who underwent a more limited resection. In comparison to qualitative forms of cosmetic analysis, BRA is an objective test that can quantitatively evaluate factors which may be related to cosmetic retraction in patients treated conservatively for breast cancer.

  1. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1982-01-01

    Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.

  2. A longitudinal interprofessional simulation curriculum for critical care teams: Exploring successes and challenges.

    PubMed

    Leclair, Laurie W; Dawson, Mary; Howe, Alison; Hale, Sue; Zelman, Eric; Clouser, Ryan; Garrison, Garth; Allen, Gilman

    2018-05-01

    Interprofessional care teams are the backbone of intensive care units (ICUs) where severity of illness is high and care requires varied skills and experience. Despite this care model, longitudinal educational programmes for such workplace teams rarely include all professions. In this article, we report findings on the initial assessment and evaluation of an ongoing, longitudinal simulation-based curriculum for interprofessional workplace critical care teams. The study had two independent components, quantitative learner assessment and qualitative curricular evaluation. To assess curriculum effectiveness at meeting learning objectives, participant-reported key learning points identified using a self-assessment tool administered immediately following curricular participation were mapped to session learning objectives. To evaluate the curriculum, we conducted a qualitative study using a phenomenology approach involving purposeful sampling of nine curricular participants undergoing recorded semi-structured interviews. Verbatim transcripts were reviewed by two independent readers to derive themes further subdivided into successes and barriers. Learner self-assessment demonstrated that the majority of learners, across all professions, achieved at least one intended learning objective with senior learners more likely to report team-based objectives and junior learners more likely to report knowledge/practice objectives. Successes identified by curricular evaluation included authentic critical care curricular content, safe learning environment, and team comradery from shared experience. Barriers included unfamiliarity with the simulation environment and clinical coverage for curricular participation. This study suggests that a sustainable interprofessional curriculum for workplace ICU critical care teams can achieve the desired educational impact and effectively deliver authentic simulated work experiences if barriers to educational engagement and participation can be overcome.

  3. T2* Mapping Provides Information That Is Statistically Comparable to an Arthroscopic Evaluation of Acetabular Cartilage.

    PubMed

    Morgan, Patrick; Nissi, Mikko J; Hughes, John; Mortazavi, Shabnam; Ellerman, Jutta

    2017-07-01

    Objectives The purpose of this study was to validate T2* mapping as an objective, noninvasive method for the prediction of acetabular cartilage damage. Methods This is the second step in the validation of T2*. In a previous study, we established a quantitative predictive model for identifying and grading acetabular cartilage damage. In this study, the model was applied to a second cohort of 27 consecutive hips to validate the model. A clinical 3.0-T imaging protocol with T2* mapping was used. Acetabular regions of interest (ROI) were identified on magnetic resonance and graded using the previously established model. Each ROI was then graded in a blinded fashion by arthroscopy. Accurate surgical location of ROIs was facilitated with a 2-dimensional map projection of the acetabulum. A total of 459 ROIs were studied. Results When T2* mapping and arthroscopic assessment were compared, 82% of ROIs were within 1 Beck group (of a total 6 possible) and 32% of ROIs were classified identically. Disease prediction based on receiver operating characteristic curve analysis demonstrated a sensitivity of 0.713 and a specificity of 0.804. Model stability evaluation required no significant changes to the predictive model produced in the initial study. Conclusions These results validate that T2* mapping provides statistically comparable information regarding acetabular cartilage when compared to arthroscopy. In contrast to arthroscopy, T2* mapping is quantitative, noninvasive, and can be used in follow-up. Unlike research quantitative magnetic resonance protocols, T2* takes little time and does not require a contrast agent. This may facilitate its use in the clinical sphere.

  4. Designing A Mixed Methods Study In Primary Care

    PubMed Central

    Creswell, John W.; Fetters, Michael D.; Ivankova, Nataliya V.

    2004-01-01

    BACKGROUND Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. METHODS We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. RESULTS Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. DISCUSSION We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research. PMID:15053277

  5. Assessment of In-Stent Restenosis Using 64-MDCT: Analysis of the CORE-64 Multicenter International Trial

    PubMed Central

    Wykrzykowska, Joanna J.; Arbab-Zadeh, Armin; Godoy, Gustavo; Miller, Julie M.; Lin, Shezhang; Vavere, Andrea; Paul, Narinder; Niinuma, Hiroyuki; Hoe, John; Brinker, Jeffrey; Khosa, Faisal; Sarwar, Sheryar; Lima, Joao; Clouse, Melvin E.

    2012-01-01

    OBJECTIVE Evaluations of stents by MDCT from studies performed at single centers have yielded variable results with a high proportion of unassessable stents. The purpose of this study was to evaluate the accuracy of 64-MDCT angiography (MDCTA) in identifying in-stent restenosis in a multicenter trial. MATERIALS AND METHODS The Coronary Evaluation Using Multidetector Spiral Computed Tomography Angiography Using 64 Detectors (CORE-64) Multicenter Trial and Registry evaluated the accuracy of 64-MDCTA in assessing 405 patients referred for coronary angiography. A total of 75 stents in 52 patients were assessed: 48 of 75 stents (64%) in 36 of 52 patients (69%) could be evaluated. The prevalence of in-stent restenosis by quantitative coronary angiography (QCA) in this subgroup was 23% (17/75). Eighty percent of the stents were ≤ 3.0 mm in diameter. RESULTS The overall sensitivity, specificity, positive predictive value, and negative predictive value to detect 50% in-stent stenosis visually using MDCT compared with QCA was 33.3%, 91.7%, 57.1%, and 80.5%, respectively, with an overall accuracy of 77.1% for the 48 assessable stents. The ability to evaluate stents on MDCTA varied by stent type: Thick-strut stents such as Bx Velocity were assessable in 50% of the cases; Cypher, 62.5% of the cases; and thinner-strut stents such as Taxus, 75% of the cases. We performed quantitative assessment of in-stent contrast attenuation in Hounsfield units and correlated that value with the quantitative percentage of stenosis by QCA. The correlation coefficient between the average attenuation decrease and ≥ 50% stenosis by QCA was 0.25 (p = 0.073). Quantitative assessment failed to improve the accuracy of MDCT over qualitative assessment. CONCLUSION The results of our study showed that 64-MDCT has poor ability to detect in-stent restenosis in small-diameter stents. Evaluability and negative predictive value were better in large-diameter stents. Thus, 64-MDCT may be appropriate for stent assessment in only selected patients. PMID:20028909
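
    The reported figures can be reproduced from a 2×2 table; one set of counts consistent with the stated percentages (inferred from them, not taken from the paper's raw data) is 4 true positives, 3 false positives, 33 true negatives and 8 false negatives among the 48 assessable stents. A minimal sketch:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV and accuracy from a 2x2 table
    comparing an index test (MDCT reads) against a reference standard (QCA)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    acc = (tp + tn) / (tp + fp + tn + fn)
    return sens, spec, ppv, npv, acc

# Counts consistent with the percentages reported above (hypothetical split):
print([round(v, 3) for v in diagnostic_metrics(tp=4, fp=3, tn=33, fn=8)])
```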

  6. Establishment of a Quantitative Medical Technology Evaluation System and Indicators within Medical Institutions.

    PubMed

    Wu, Suo-Wei; Chen, Tong; Pan, Qi; Wei, Liang-Yu; Wang, Qin; Li, Chao; Song, Jing-Chen; Luo, Ji

    2018-06-05

    The development and application of medical technologies reflect the medical quality and clinical capacity of a hospital. They are also an effective means of upgrading medical service and core competitiveness among medical institutions. This study aimed to build a quantitative medical technology evaluation system, through a questionnaire survey within medical institutions, in order to assess medical technologies more objectively and accurately, promote the management of medical technology quality, and ensure the medical safety of various operations among hospitals. A two-level quantitative medical technology evaluation system was built through a two-round questionnaire survey of chosen experts. The Delphi method was applied in identifying the structure of the evaluation system and its indicators. The experts' judgments on the indicators were used to build the comparison matrices so that the weight coefficients, maximum eigenvalue (λmax), consistency index (CI), and random consistency ratio (CR) could be obtained. The results were verified through consistency tests, and the index weight coefficient of each indicator was calculated through the analytic hierarchy process. Twenty-six experts from different medical fields were involved in the questionnaire survey, 25 of whom successfully responded to the two-round research. Altogether, 4 primary indicators (safety, effectiveness, innovativeness, and benefits), as well as 13 secondary indicators, were included in the evaluation system. Matrices were built to obtain the λmax, CI, and CR for each expert in the survey; the index weight coefficients of the primary indicators were 0.33, 0.28, 0.27, and 0.12, respectively, and the index weight coefficients of the secondary indicators were calculated accordingly. As the two-round questionnaire survey of experts and the statistical analysis were performed, and the credibility of the results was verified through consistency tests, the study established a quantitative medical technology evaluation system model and assessment indicators within medical institutions based on the Delphi method and the analytic hierarchy process. Moreover, further verifications, adjustments, and optimizations of the system and indicators will be performed in follow-up studies.
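
    To make the λmax/CI/CR machinery concrete, the sketch below shows how one expert's pairwise-comparison matrix yields priority weights and a consistency ratio. The judgments are hypothetical (for the four primary indicators), and the random-consistency index values are the commonly tabulated Saaty constants.

```python
import numpy as np

# Random-consistency index (RI) for matrix orders 1..9 (Saaty's table).
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(pairwise):
    """Priority weights, lambda_max, CI and CR for one expert's
    pairwise-comparison matrix (principal-eigenvector method)."""
    a = np.asarray(pairwise, dtype=float)
    n = a.shape[0]
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                       # normalized priority weights
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)       # consistency index
    cr = ci / RI[n]                    # random consistency ratio
    return w, lam_max, ci, cr

# Hypothetical judgments for safety, effectiveness, innovativeness, benefits:
A = np.array([
    [1,   2,   1,   3],
    [1/2, 1,   1,   3],
    [1,   1,   1,   2],
    [1/3, 1/3, 1/2, 1],
])
w, lam, ci, cr = ahp_weights(A)
print(np.round(w, 2), round(lam, 3), round(ci, 3), round(cr, 3))
```

    A CR below 0.1 is conventionally taken to mean the judgments are acceptably consistent; group-level weights are then obtained by aggregating across experts.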

  7. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    PubMed

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. ©Hani Badran, Pierre Pluye, Roland Grad. Originally published in JMIR Medical Education (http://mededu.jmir.org), 14.03.2017.

  8. Multi-indicator Evaluation System for Broadsword, Rod, Sword and Spear Athletes Based on Analytic Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Luo, Lin

    2017-08-01

    In the practical selection of Wushu athletes, the objective evaluation of athlete level lacks sufficient technical indicators and often relies on the coach's subjective judgments. It is difficult to accurately and objectively reflect the overall quality of the athletes without a fully quantified indicator system, thus affecting the level improvement of Wushu competition. The analytic hierarchy process (AHP) is a systemic analysis method combining quantitative and qualitative analysis. This paper realizes a structured, hierarchical and quantified decision-making process for evaluating broadsword, rod, sword and spear athletes using the AHP. Combining the characteristics of the athletes, the analysis is carried out from three aspects, i.e., the athlete's body shape, physical function and sports quality, and 18 specific evaluation indicators are established; then, combining expert advice and practical experience, a pairwise comparison matrix is determined, and the indicator weights and a comprehensive evaluation coefficient are obtained to establish the evaluation model for the athletes, thus providing a scientific theoretical basis for the selection of Wushu athletes. The evaluation model proposed in this paper realizes the evaluation system for broadsword, rod, sword and spear athletes and has effectively improved the scientific level of Wushu athlete selection in practical application.

  9. Current perspectives of CASA applications in diverse mammalian spermatozoa.

    PubMed

    van der Horst, Gerhard; Maree, Liana; du Plessis, Stefan S

    2018-03-26

    Since the advent of computer-aided sperm analysis (CASA) some four decades ago, advances in computer technology and software algorithms have helped establish it as a research and diagnostic instrument for the analysis of spermatozoa. Despite mammalian spermatozoa being the most diverse cell type known, CASA is a great tool that has the capacity to provide rapid, reliable and objective quantitative assessment of sperm quality. This paper provides contemporary research findings illustrating the scientific and commercial applications of CASA and its ability to evaluate diverse mammalian spermatozoa (human, primates, rodents, domestic mammals, wildlife species) at both structural and functional levels. The potential of CASA to quantitatively measure essential aspects related to sperm subpopulations, hyperactivation, morphology and morphometry is also demonstrated. Furthermore, applications of CASA are provided for improved mammalian sperm quality assessment, evaluation of sperm functionality and the effect of different chemical substances or pathologies on sperm fertilising ability. It is clear that CASA has evolved significantly and is currently superior to many manual techniques in the research and clinical setting.

  10. Shearography for Non-Destructive Evaluation with Applications to BAT Mask Tile Adhesive Bonding and Specular Surface Honeycomb Panels

    NASA Technical Reports Server (NTRS)

    Lysak, Daniel B.

    2003-01-01

    In this report we examine the applicability of shearography techniques for nondestructive inspection and evaluation in two unique application areas. In the first application, shearography is used to evaluate the quality of adhesive bonds holding lead tiles to the BAT gamma ray mask for the NASA Swift program. By exciting the mask with a vibration, the more poorly bonded tiles can be distinguished by their greater displacement response, which is readily identifiable in the shearography image. A quantitative analysis is presented that compares the shearography results with a destructive pull test measuring the force at bond failure. Generally speaking, the results show good agreement. Further investigation would be useful to optimize certain test parameters such as vibration frequency and amplitude. The second application is to evaluate the bonding between the skin and core of a honeycomb structure with a specular (mirror-like) surface. In standard shearography techniques, the object under test must have a diffuse surface to generate the speckle patterns in laser light, which are then sheared. A novel configuration using the specular surface as a mirror to image speckles from a diffuser is presented, opening up the use of shearography to a new class of objects that could not have been examined with the traditional approach. This new technique readily identifies large scale bond failures in the panel, demonstrating the validity of this approach. For the particular panel examined here, some scaling issues should be examined further to resolve the measurement scale down to the very small size of the core cells. In addition, further development should be undertaken to determine the general applicability of the new approach and to establish a firm quantitative foundation.

  11. Developing a monitoring and evaluation framework to integrate and formalize the informal waste and recycling sector: the case of the Philippine National Framework Plan.

    PubMed

    Serrona, Kevin Roy B; Yu, Jeongsoo; Aguinaldo, Emelita; Florece, Leonardo M

    2014-09-01

    The Philippines has been making inroads in solid waste management with the enactment and implementation of the Republic Act 9003 or the Ecological Waste Management Act of 2000. Said legislation has had tremendous influence in terms of how the national and local government units confront the challenges of waste management in urban and rural areas using the reduce, reuse, recycle and recovery framework or 4Rs. One of the sectors needing assistance is the informal waste sector whose aspiration is legal recognition of their rank and integration of their waste recovery activities in mainstream waste management. To realize this, the Philippine National Solid Waste Management Commission initiated the formulation of the National Framework Plan for the Informal Waste Sector, which stipulates approaches, strategies and methodologies to concretely involve the said sector in different spheres of local waste management, such as collection, recycling and disposal. What needs to be fleshed out is the monitoring and evaluation component in order to gauge qualitative and quantitative achievements vis-a-vis the Framework Plan. In the process of providing an enabling environment for the informal waste sector, progress has to be monitored and verified qualitatively and quantitatively and measured against activities, outputs, objectives and goals. Using the Framework Plan as the reference, this article developed monitoring and evaluation indicators using the logical framework approach in project management. The primary objective is to institutionalize monitoring and evaluation, not just in informal waste sector plans, but in any waste management initiatives to ensure that envisaged goals are achieved. © The Author(s) 2014.

  12. Multicriteria decision analysis applied to Glen Canyon Dam

    USGS Publications Warehouse

    Flug, M.; Seitz, H.L.H.; Scott, J.F.

    2000-01-01

    Conflicts in water resources exist because river-reservoir systems are managed to optimize traditional benefits (e.g., hydropower and flood control), which are historically quantified in economic terms, whereas natural and environmental resources, including in-stream and riparian resources, are more difficult or impossible to quantify in economic terms. Multicriteria decision analysis provides a quantitative approach to evaluate resources subject to river basin management alternatives. This objective quantification method includes inputs from special interest groups, the general public, and concerned individuals, as well as professionals for each resource considered in a trade-off analysis. Multicriteria decision analysis is applied to resources and flow alternatives presented in the environmental impact statement for Glen Canyon Dam on the Colorado River. A numeric rating and priority-weighting scheme is used to evaluate 29 specific natural resource attributes, grouped into seven main resource objectives, for nine flow alternatives enumerated in the environmental impact statement.
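
    As an illustration of a rating-and-weighting scheme of this general kind (hypothetical numbers, and three alternatives rather than the nine flow alternatives in the study), a weighted additive score can be computed as follows.

```python
import numpy as np

# Hypothetical ratings: rows = flow alternatives, columns = main resource
# objectives, each rated on a common 1-10 scale.
ratings = np.array([
    [7, 4, 6, 8, 5, 6, 7],   # alternative A
    [5, 8, 7, 6, 6, 7, 5],   # alternative B
    [6, 6, 8, 5, 7, 5, 6],   # alternative C
])

# Priority weights for the seven main resource objectives (sum to 1).
weights = np.array([0.25, 0.20, 0.15, 0.15, 0.10, 0.10, 0.05])

composite = ratings @ weights                 # weighted additive score
ranking = np.argsort(composite)[::-1]         # best alternative first
print(np.round(composite, 2), ranking)
```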

  13. Construction and validation of clinical contents for development of learning objects.

    PubMed

    Hortense, Flávia Tatiana Pedrolo; Bergerot, Cristiane Decat; Domenico, Edvane Birelo Lopes de

    2018-01-01

    To describe the process of construction and validation of clinical contents for health learning objects, aimed at patients in the treatment of head and neck cancer. This was a descriptive, methodological study. The development of the script and the storyboard was based on scientific evidence and submitted to the appreciation of specialists for validation of content. The agreement index was checked quantitatively and the suggestions were qualitatively evaluated. The items described in the roadmap were approved by 99% of the expert evaluators. The suggestions for adjustments were inserted in their entirety in the final version. The free-marginal kappa statistic for multiple evaluators was 0.68, indicating substantial agreement. The steps taken in the construction and validation of the content for the production of educational material for patients with head and neck cancer were adequate, relevant and suitable for use in other subjects.
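
    The free-marginal (Randolph) kappa reported above has a closed form, κ_free = (P_o − 1/k)/(1 − 1/k), where P_o is the observed pairwise agreement and k the number of rating categories. A hedged sketch with hypothetical expert votes:

```python
import numpy as np

def free_marginal_kappa(counts, n_categories):
    """Randolph's free-marginal multirater kappa.

    counts: (n_items, n_categories) array where counts[i, j] is the number
    of evaluators who assigned item i to category j (the same number of
    evaluators rates every item)."""
    c = np.asarray(counts, dtype=float)
    n_raters = c[0].sum()
    # Observed agreement: proportion of agreeing rater pairs per item,
    # averaged over items.
    p_obs = np.mean(np.sum(c * (c - 1), axis=1) / (n_raters * (n_raters - 1)))
    p_exp = 1.0 / n_categories          # expected agreement with free margins
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical: 6 content items judged adequate/inadequate by 5 experts.
votes = np.array([[5, 0], [4, 1], [5, 0], [3, 2], [5, 0], [4, 1]])
print(round(free_marginal_kappa(votes, n_categories=2), 2))
```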

  14. Metabolomic Profiling as a Possible Reverse Engineering Tool for Estimating Processing Conditions of Dry-Cured Hams.

    PubMed

    Sugimoto, Masahiro; Obiya, Shinichi; Kaneko, Miku; Enomoto, Ayame; Honma, Mayu; Wakayama, Masataka; Soga, Tomoyoshi; Tomita, Masaru

    2017-01-18

    Dry-cured hams are popular among consumers. To increase the attractiveness of the product, objective analytical methods and algorithms to evaluate the relationship between observable properties and consumer acceptability are required. In this study, metabolomics, which is used for quantitative profiling of hundreds of small molecules, was applied to 12 kinds of dry-cured hams from Japan and Europe. In total, 203 charged metabolites, including amino acids, organic acids, nucleotides, and peptides, were successfully identified and quantified. Metabolite profiles were compared for the samples with different countries of origin and processing methods (e.g., smoking or use of a starter culture). Principal component analysis of the metabolite profiles with sensory properties revealed significant correlations for redness, homogeneity, and fat whiteness. This approach could be used to design new ham products by objective evaluation of various features.

  15. Optimization of Dual-Energy Xenon-CT for Quantitative Assessment of Regional Pulmonary Ventilation

    PubMed Central

    Fuld, Matthew K.; Halaweish, Ahmed; Newell, John D.; Krauss, Bernhard; Hoffman, Eric A.

    2013-01-01

    Objective Dual-energy X-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study we seek to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. Materials and Methods The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon-oxygen gas mixtures (0, 20, 25, 33, 50, 66, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved three-material decomposition calibration parameters. Additionally, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine in order to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Results Attenuation curves for xenon were obtained from the syringe test objects and were used to develop improved three-material decomposition parameters (HU enhancement per percent xenon: Within the chest phantom: 2.25 at 80kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; In open air: 2.5 at 80kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally non-dependent portion of the airway tree test-object, while not affecting quantitation of xenon in the three-material decomposition DECT. 40%Xe/40%He/20%O2 provided good signal-to-noise, greater than the Rose Criterion (SNR > 5), while avoiding gravitational effects of similar concentrations of xenon in a 60%O2 mixture. 80/140-kVp (tin-filtered) provided improved SNR compared with 100/140-kVp in a swine with an equivalent thoracic transverse density to a human subject with body mass index of 33. Airways were brighter in the 80/140 kVp scan (80/140Sn, 31.6%; 100/140Sn, 25.1%) with considerably lower noise (80/140Sn, CV of 0.140; 100/140Sn, CV of 0.216). Conclusion In order to provide a truly quantitative measure of regional lung function with xenon-DECT, the basic protocols and parameter calibrations needed to be better understood and quantified. It is critically important to understand the fundamentals of new techniques in order to allow for proper implementation and interpretation of their results prior to widespread usage. With the use of an in-house derived xenon calibration curve for three-material decomposition rather than the scanner-supplied calibration and a xenon/helium/oxygen mixture we demonstrate highly accurate quantitation of xenon gas volumes and avoid gravitational effects on gas distribution. This study provides a foundation for other researchers to use and test these methods with the goal of clinical translation. PMID:23571834
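
    The calibration slopes quoted above (HU enhancement per percent xenon) imply a simple linear conversion from measured enhancement to regional xenon concentration. The sketch below applies those slopes directly as a back-of-the-envelope check; it is not the study's full three-material decomposition.

```python
# HU enhancement per percent xenon for the chest-phantom geometry,
# as quoted in the record (140 kVp with tin filtration).
SLOPE_HU_PER_PCT_XE = {80: 2.25, 100: 1.70, 140: 0.76}

def xenon_percent(delta_hu, kvp):
    """Estimate regional xenon concentration (%) from a measured HU
    enhancement using the linear calibration slope for the given tube
    potential. Illustrative linear conversion only."""
    return delta_hu / SLOPE_HU_PER_PCT_XE[kvp]

# e.g. a 45 HU enhancement at 80 kVp corresponds to roughly 20% xenon:
print(round(xenon_percent(45.0, 80), 1))
```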

  16. 42 CFR 431.424 - Evaluation requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...

  17. 42 CFR 431.424 - Evaluation requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...

  18. 42 CFR 431.424 - Evaluation requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... evaluations. Demonstration evaluations will include the following: (1) Quantitative research methods. (i... of appropriate evaluation strategies (including experimental and other quantitative and qualitative... demonstration. (ii) CMS will consider alternative evaluation designs when quantitative designs are technically...

  19. POPA: A Personality and Object Profiling Assistant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreicer, J.S.

    POPA: A Personality and Object Profiling Assistant system utilizes an extension and variation of a process developed for decision analysis as a tool to quantify intuitive feelings and subjective judgments. The technique is based on a manipulation of the Analytical Hierarchy Process. The POPA system models an individual in terms of his character type, life orientation, and incentive (motivational) factors. Then an object (i.e., individual, project, situation, or policy) is modeled with respect to its three most important factors. The individual and object models are combined to indicate the influence each of the three object factors has on the individual. We have investigated this problem: 1) to develop a technique that models personality types in a quantitative and organized manner, 2) to develop a tool capable of evaluating the probable success of obtaining funding for proposed programs at Los Alamos National Laboratory, 3) to determine the feasibility of quantifying feelings and intuition, and 4) to better understand subjective knowledge acquisition (especially intuition). 49 refs., 10 figs., 5 tabs.

  20. Multi-objective decision-making under uncertainty: Fuzzy logic methods

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.

    1994-01-01

    Selecting the best option among alternatives is often a difficult process. This process becomes even more difficult when the evaluation criteria are vague or qualitative, and when the objectives vary in importance and scope. Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
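
    As a hedged toy example of this kind of fuzzy scoring (all membership functions, weights and alternative data below are hypothetical, not taken from the lander or aeration studies), vague objectives such as "low cost" and "high reliability" can be encoded as membership functions and aggregated into a single desirability value.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: 0 at or below a, rising to 1 on
    [b, c], falling back to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical alternatives scored on two fuzzy objectives:
# cost (in $M) and reliability (probability of success).
alternatives = {"option A": (45.0, 0.96), "option B": (30.0, 0.90)}

def desirability(cost, reliability):
    mu_low_cost = trapezoid(cost, 0, 0, 35, 60)            # fully "low" up to $35M
    mu_reliable = trapezoid(reliability, 0.85, 0.95, 1.0, 1.01)
    # Weighted aggregation of the two memberships (weights are illustrative).
    return 0.4 * mu_low_cost + 0.6 * mu_reliable

for name, (cost, rel) in alternatives.items():
    print(name, round(desirability(cost, rel), 3))
```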

  1. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation, volume 1

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    This document is the Executive Summary of a technical report on a probabilistic risk assessment (PRA) of the Space Shuttle vehicle performed under the sponsorship of the Office of Space Flight of the US National Aeronautics and Space Administration. It briefly summarizes the methodology and results of the Shuttle PRA. The primary objective of this project was to support management and engineering decision-making with respect to the Shuttle program by producing (1) a quantitative probabilistic risk model of the Space Shuttle during flight, (2) a quantitative assessment of in-flight safety risk, (3) an identification and prioritization of the design and operations that principally contribute to in-flight safety risk, and (4) a mechanism for risk-based evaluation of proposed modifications to the Shuttle System. Secondary objectives were to provide a vehicle for introducing and transferring PRA technology to the NASA community, and to demonstrate the value of PRA by applying it beneficially to a real program of great international importance.

  2. Magnetic resonance imaging of cartilage repair.

    PubMed

    Potter, Hollis G; Chong, Le Roy; Sneag, Darryl B

    2008-12-01

    Magnetic resonance imaging is an important noninvasive modality in characterizing cartilage morphology, biochemistry, and function. It serves as a valuable objective outcome measure in diagnosing pathology at the time of initial injury, guiding surgical planning, and evaluating postsurgical repair. This article reviews the current literature addressing the recent advances in qualitative and quantitative magnetic resonance imaging techniques in the preoperative setting, and in patients who have undergone cartilage repair techniques such as microfracture, autologous cartilage transplantation, or osteochondral transplantation.

  3. Quantitative Evaluations of the Effects of the Seabed Sediments on Scattering and Propagation of Acoustics Energy in Shallow Oceans

    DTIC Science & Technology

    1999-09-30

    ... the effects of scattering mechanisms (volume fluctuation, bottom and sub-bottom roughness) on the acoustic propagation and scattering, and the effects of poroelastic properties of the sediments on the propagation of acoustic waves. OBJECTIVES: To develop a universal (forward/inverse) model for the seafloor roughness ...

  4. Quantitative Analysis of Situational Awareness (QUASA): Applying Signal Detection Theory to True/False Probes and Self-Ratings

    DTIC Science & Technology

    2004-06-01

    ... obtained. Further refinements of the technique based on recent research in experimental psychology are also considered. INTRODUCTION: The key ... an established line of research in psychology in which objective and subjective metrics are combined to analyse the degree of ‘calibration’ in ... Creelman, 1991). A notable exception is the study by Kunimoto et al. (2001), in which confidence ratings were subjected to SDT analysis to evaluate the ...
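
    Since the record applies signal detection theory to true/false probes, a minimal sketch of the core sensitivity index may help: d′ = z(hit rate) − z(false-alarm rate), computed here from hypothetical probe counts with a log-linear correction so empty cells do not produce infinite z-scores.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' from true/false probe responses:
    d' = z(hit rate) - z(false-alarm rate). Adding 0.5 to each cell
    guards against hit or false-alarm rates of exactly 0 or 1."""
    h = (hits + 0.5) / (hits + misses + 1.0)
    f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(h) - norm.ppf(f)

# Hypothetical probe results for one participant:
print(round(d_prime(hits=38, misses=12, false_alarms=9, correct_rejections=41), 2))
```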

  5. TIES for Dummies 3rd Edition (Technology Identification, Evaluation, and Selection) Basic how to's to implement the TIES method

    NASA Technical Reports Server (NTRS)

    Kirby, Michelle R.

    2002-01-01

    The TIES method is a forecasting environment whereby the decision-maker has the ability to easily assess and trade off the impact of various technologies without sophisticated and time-consuming mathematical formulations. TIES provides a methodical approach in which technically feasible alternatives can be identified with accuracy and speed to reduce design cycle time and, subsequently, life cycle costs; this is achieved through the use of various probabilistic methods, such as Response Surface Methodology and Monte Carlo simulation. Furthermore, structured and systematic techniques from other fields are utilized to identify possible concepts and evaluation criteria by which comparisons can be made. This is achieved by employing Morphological Matrices and Multi-Attribute Decision Making techniques. Through the execution of each step, a family of design alternatives for a given set of customer requirements can be identified and assessed subjectively or objectively. This methodology allows more information (knowledge) to be brought into the earlier phases of the design process and will have direct implications for the affordability of the system. The increased knowledge allows for optimum allocation of company resources and quantitative justification for program decisions. Finally, the TIES method provided novel results and quantitative justification to facilitate decision making in the early stages of design so as to produce affordable and quality products.

  6. Ultrasound-based quantification of vitreous floaters correlates with contrast sensitivity and quality of life.

    PubMed

    Mamou, Jonathan; Wa, Christianne A; Yee, Kenneth M P; Silverman, Ronald H; Ketterling, Jeffrey A; Sadun, Alfredo A; Sebag, J

    2015-01-22

    Clinical evaluation of floaters lacks quantitative assessment of vitreous structure. This study used quantitative ultrasound (QUS) to measure vitreous opacities. Since floaters reduce contrast sensitivity (CS) and quality of life (Visual Function Questionnaire [VFQ]), it is hypothesized that QUS will correlate with CS and VFQ in patients with floaters. Twenty-two eyes (22 subjects; age = 57 ± 19 years) with floaters were evaluated with Freiburg acuity contrast testing (FrACT; %Weber) and VFQ. Ultrasonography used a customized probe (15-MHz center frequency, 20-mm focal length, 7-mm aperture) with longitudinal and transverse scans taken in primary gaze and a horizontal longitudinal scan through premacular vitreous in temporal gaze. Each scan set had 100 frames of log-compressed envelope data. Within each frame, two regions of interest (ROIs) were analyzed (whole-central and posterior vitreous) to yield three parameters (energy, E; mean amplitude, M; and percentage of vitreous filled by echodensities, P50) averaged over the entire 100-frame dataset. Statistical analyses evaluated E, M, and P50 correlations with CS and VFQ. Contrast sensitivity ranged from 1.19%W (normal) to 5.59%W. All QUS parameters in two scan positions within the whole-central ROI correlated with CS (R > 0.67, P < 0.001). P50 in the nasal longitudinal position had R = 0.867 (P < 0.001). Correlations with VFQ ranged from R = 0.52 (P < 0.013) to R = 0.65 (P < 0.001). Quantitative ultrasound provides quantitative measures of vitreous echodensity that correlate with CS and VFQ, providing objective assessment of vitreous structure underlying the functional disturbances induced by floaters, useful to quantify vitreous disease severity and the response to therapy. Copyright 2015 The Association for Research in Vision and Ophthalmology, Inc.

  7. Ultrasound-Based Quantification of Vitreous Floaters Correlates with Contrast Sensitivity and Quality of Life

    PubMed Central

    Mamou, Jonathan; Wa, Christianne A.; Yee, Kenneth M. P.; Silverman, Ronald H.; Ketterling, Jeffrey A.; Sadun, Alfredo A.; Sebag, J.

    2015-01-01

    Purpose. Clinical evaluation of floaters lacks quantitative assessment of vitreous structure. This study used quantitative ultrasound (QUS) to measure vitreous opacities. Since floaters reduce contrast sensitivity (CS) and quality of life (Visual Function Questionnaire [VFQ]), it is hypothesized that QUS will correlate with CS and VFQ in patients with floaters. Methods. Twenty-two eyes (22 subjects; age = 57 ± 19 years) with floaters were evaluated with Freiburg acuity contrast testing (FrACT; %Weber) and VFQ. Ultrasonography used a customized probe (15-MHz center frequency, 20-mm focal length, 7-mm aperture) with longitudinal and transverse scans taken in primary gaze and a horizontal longitudinal scan through premacular vitreous in temporal gaze. Each scan set had 100 frames of log-compressed envelope data. Within each frame, two regions of interest (ROIs) were analyzed (whole-central and posterior vitreous) to yield three parameters (energy, E; mean amplitude, M; and percentage of vitreous filled by echodensities, P50) averaged over the entire 100-frame dataset. Statistical analyses evaluated E, M, and P50 correlations with CS and VFQ. Results. Contrast sensitivity ranged from 1.19%W (normal) to 5.59%W. All QUS parameters in two scan positions within the whole-central ROI correlated with CS (R > 0.67, P < 0.001). P50 in the nasal longitudinal position had R = 0.867 (P < 0.001). Correlations with VFQ ranged from R = 0.52 (P < 0.013) to R = 0.65 (P < 0.001). Conclusions. Quantitative ultrasound provides quantitative measures of vitreous echodensity that correlate with CS and VFQ, providing objective assessment of vitreous structure underlying the functional disturbances induced by floaters, useful to quantify vitreous disease severity and the response to therapy. PMID:25613948

  8. Defining an Analytic Framework to Evaluate Quantitative MRI Markers of Traumatic Axonal Injury: Preliminary Results in a Mouse Closed Head Injury Model

    PubMed Central

    Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.

    2017-01-01

    Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in a mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. Additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains almost with the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures to detect brain injury. PMID:28966972

  9. Are we missing the boat? Current uses of long-term biological monitoring data in the evaluation and management of marine protected areas.

    PubMed

    Addison, P F E; Flander, L B; Cook, C N

    2015-02-01

    Protected area management agencies are increasingly using management effectiveness evaluation (MEE) to better understand, learn from and improve conservation efforts around the globe. Outcome assessment is the final stage of MEE, where conservation outcomes are measured to determine whether management objectives are being achieved. When quantitative monitoring data are available, best-practice examples of outcome assessments demonstrate that data should be assessed against quantitative condition categories. Such assessments enable more transparent and repeatable integration of monitoring data into MEE, which can promote evidence-based management and improve public accountability and reporting. We interviewed key informants from marine protected area (MPA) management agencies to investigate how scientific data sources, especially long-term biological monitoring data, are currently informing conservation management. Our study revealed that even when long-term monitoring results are available, management agencies are not using them for quantitative condition assessment in MEE. Instead, many agencies conduct qualitative condition assessments, where monitoring results are interpreted using expert judgment only. Whilst we found substantial evidence for the use of long-term monitoring data in the evidence-based management of MPAs, MEE is rarely the sole mechanism that facilitates the knowledge transfer of scientific evidence to management action. This suggests that the first goal of MEE (to enable environmental accountability and reporting) is being achieved, but the second and arguably more important goal of facilitating evidence-based management is not. Given that many MEE approaches are in their infancy, recommendations are made to help management agencies realize the full potential of long-term quantitative monitoring data for protected area evaluation and evidence-based management. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic analysis method for OCT images intended for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. The study analyzed 300 OCT images acquired with an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was presented. Two kinds of quantitative methods, based on geometric features and on morphological features, were then proposed, together with a retinal abnormality grading decision-making method that was used in the analysis and evaluation of multiple OCT images. The analysis process is illustrated in detail with four retinal OCT images of different degrees of abnormality, and the final grading results verified that the method can distinguish abnormality severity and lesion regions. In a simulation on 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status: it extracts parameters and features associated with retinal morphology, and quantitative analysis and evaluation of these features, combined with the reference model, supports abnormality judgment of the target image and provides a reference for disease diagnosis.
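
    The reported sensitivity and specificity follow the usual binary-classification definitions; the Python sketch below shows the computation on made-up normal/abnormal labels (the labels are hypothetical, not the study's test data).

        import numpy as np

        def sensitivity_specificity(y_true, y_pred):
            """Sensitivity and specificity for binary labels (1 = abnormal, 0 = normal)."""
            y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
            tp = np.sum((y_true == 1) & (y_pred == 1))
            tn = np.sum((y_true == 0) & (y_pred == 0))
            fn = np.sum((y_true == 1) & (y_pred == 0))
            fp = np.sum((y_true == 0) & (y_pred == 1))
            return tp / (tp + fn), tn / (tn + fp)

        # Toy example with made-up labels for 10 hypothetical test images
        y_true = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]
        y_pred = [1, 1, 0, 0, 0, 1, 0, 1, 1, 0]
        sens, spec = sensitivity_specificity(y_true, y_pred)
        print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")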

  11. Brain tumor response to nimotuzumab treatment evaluated on magnetic resonance imaging.

    PubMed

    Dalmau, Evelio Rafael González; Cabal Mirabal, Carlos; Martínez, Giselle Saurez; Dávila, Agustín Lage; Suárez, José Carlos Ugarte; Cabanas Armada, Ricardo; Rodriguez Cruz, Gretel; Darias Zayas, Daniel; Castillo, Martha Ríos; Valle Garrido, Luis; Sotolongo, Luis Quevedo; Fernández, Mercedes Monzón

    2014-02-01

    Nimotuzumab, a humanized monoclonal antibody anti-epidermal growth factor receptor, has been shown to improve survival and quality of life in patients with pediatric malignant brain tumor. It is necessary, however, to increase the objective response criteria to define the optimal therapeutic schedule. The aim of this study was to obtain magnetic resonance imaging (MRI) and magnetic resonance spectroscopy (MRS) quantitative information related to dimensions and morphology, molecular mobility and metabolic activity of the lesion and surroundings in order to evaluate any changes through time. Fourteen pediatric patients treated with nimotuzumab were evaluated on MRI and MRS for >2 years. Each patient was their own control. The MRI/MRS pulse sequence parameters were standardized to ensure experimental reproducibility. A total of 71.4% of patients had stable disease; 21.4% had objective response and 7.1% had progression of disease during the >2 year evaluation period. MRI/MRS data with clinical information provide a clearer picture of treatment response and confirm once again that nimotuzumab is effective in the treatment of pediatric brain tumor. These imaging procedures can be a useful tool for the clinical evaluation of study protocol in clinical practice. © 2013 The Authors. Pediatrics International © 2013 Japan Pediatric Society.

  12. Assessment and monitoring of forest ecosystem structure

    Treesearch

    Oscar A. Aguirre Calderón; Javier Jiménez Pérez; Horst Kramer

    2006-01-01

    Characterization of forest ecosystems structure must be based on quantitative indices that allow objective analysis of human influences or natural succession processes. The objective of this paper is the compilation of diverse quantitative variables to describe structural attributes from the arboreal stratum of the ecosystem, as well as different methods of forest...

  13. Definitions and validation criteria for biomarkers and surrogate endpoints: development and testing of a quantitative hierarchical levels of evidence schema.

    PubMed

    Lassere, Marissa N; Johnson, Kent R; Boers, Maarten; Tugwell, Peter; Brooks, Peter; Simon, Lee; Strand, Vibeke; Conaghan, Philip G; Ostergaard, Mikkel; Maksymowych, Walter P; Landewe, Robert; Bresnihan, Barry; Tak, Paul-Peter; Wakefield, Richard; Mease, Philip; Bingham, Clifton O; Hughes, Michael; Altman, Doug; Buyse, Marc; Galbraith, Sally; Wells, George

    2007-03-01

    There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity and systematic methods to evaluate these aspects hinder their efficient application. Our objective was to review the literature on biomarkers and surrogates to develop a hierarchical schema that systematically evaluates and ranks the surrogacy status of biomarkers and surrogates, and to obtain feedback from stakeholders. After a systematic search of Medline and Embase on biomarkers, surrogate (outcomes, endpoints, markers, indicators), intermediate endpoints, and leading indicators, a quantitative surrogate validation schema was developed and subsequently evaluated at a stakeholder workshop. The search identified several classification schema and definitions. Components of these were incorporated into a new quantitative surrogate validation level of evidence schema that evaluates biomarkers along 4 domains: Target, Study Design, Statistical Strength, and Penalties. Scores derived from three of the domains (the Target that the marker is being substituted for, the Design of the (best) evidence, and the Statistical Strength) are additive. Penalties are then applied if there is serious counterevidence. A total score (0 to 15) determines the level of evidence, with Level 1 the strongest and Level 5 the weakest. It was proposed that the term "surrogate" be restricted to markers attaining Levels 1 or 2 only. Most stakeholders agreed that this operationalization of the National Institutes of Health definitions of biomarker, surrogate endpoint, and clinical endpoint was useful. Further development and application of this schema provides incentives and guidance for effective biomarker and surrogate endpoint research, and more efficient drug discovery, development, and approval.
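
    The Python sketch below illustrates how an additive domain score with penalties could be mapped onto five levels of evidence. The domain ranges and the cutoffs used here are hypothetical and are not the ones defined in the published schema.

        def surrogacy_level(target, design, statistical, penalty):
            """Toy version of an additive level-of-evidence score (0-15).

            target, design, statistical : domain scores (assumed 0-5 each)
            penalty : points subtracted for serious counter-evidence
            The cutoffs mapping totals onto Levels 1-5 are hypothetical and
            are NOT those defined in the published schema.
            """
            total = max(0, target + design + statistical - penalty)
            cutoffs = [(13, 1), (10, 2), (7, 3), (4, 4)]   # assumed thresholds
            for threshold, level in cutoffs:
                if total >= threshold:
                    return total, level
            return total, 5

        print(surrogacy_level(target=5, design=5, statistical=4, penalty=0))  # strong evidence
        print(surrogacy_level(target=3, design=2, statistical=2, penalty=1))  # weak evidence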

  14. Balance Assessment in Deaf Children and Teenagers Prior to and Post Capoeira Practice through the Berg Balance Scale.

    PubMed

    Lima, Rubianne

    2017-12-01

    Hearing loss changes functionality and body structure, a disability that limits activity and restricts the individual's participation in situations of daily life. It is believed that capoeira can help people with such disabilities to minimize these deficits. The Berg Balance Scale (BBS) is a low-specificity scale that objectively and functionally evaluates aspects of balance and the risk of falls in the elderly and in children, including the effect of the environment on balance function. The objective of this research was to analyze deaf children and adolescents prior to and after the practice of capoeira using the BBS. The study was quantitative, clinical and observational. Twenty-five deaf children between 10 and 16 years old of both genders were assessed. The BBS was applied in two stages: before starting capoeira and after 6 months of training. One-hour classes were held once a week for purposes of quantitative evaluation, and the subjects were divided and evaluated in two groups (10-13 years old and 14-16 years old). There was a statistically significant difference in BBS scores: both the overall group (p = 0.0039) and the 10-13-year-old group (p = 0.0251) showed an increase in scores after practicing capoeira, whereas the difference in the 14-16-year-old group was not statistically significant (p = 0.0504). Using the Berg Balance Scale, it was possible to observe an improvement in the balance of the group of children and adolescents who practiced capoeira and, consequently, a decrease in their risk of falling.

  15. Design and control of RUPERT: a device for robotic upper extremity repetitive therapy.

    PubMed

    Sugar, Thomas G; He, Jiping; Koeneman, Edward J; Koeneman, James B; Herman, Richard; Huang, H; Schultz, Robert S; Herring, D E; Wanberg, J; Balasubramanian, Sivakumar; Swenson, Pete; Ward, Jeffrey A

    2007-09-01

    The structural design, control system, and integrated biofeedback for a wearable exoskeletal robot for upper extremity stroke rehabilitation are presented. With the assistance of clinical evaluation, designers, engineers, and scientists have built a device for robotic assisted upper extremity repetitive therapy (RUPERT). Intense, repetitive physical rehabilitation has been shown to be beneficial in overcoming upper extremity deficits, but such therapy is labor-intensive, expensive, and difficult to evaluate quantitatively and objectively. RUPERT was developed to provide a low-cost, safe, and easy-to-use robotic device to help the patient and therapist achieve more systematic therapy at home or in the clinic. RUPERT has four actuated degrees of freedom driven by compliant and safe pneumatic muscles (PMs) at the shoulder, elbow, and wrist, programmed to actuate the device to extend the arm and move it in 3-D space. Notably, gravity is not compensated, so daily tasks are practiced in a natural setting. Because the device is wearable and lightweight for portability, it can be worn standing or sitting, providing therapy tasks that better mimic activities of daily living. Sensors feed back position and force information for quantitative evaluation of task performance, and the device can also provide real-time, objective assessment of functional improvement. We have tested the device on stroke survivors performing two critical activities of daily living (ADL): reaching out and self-feeding. Future improvements to the device involve additional degrees of freedom and interactive control that adapts to the user's physical condition.

  16. Evaluating the Quality of Education at Dentistry School of Tehran University of Medical Sciences

    PubMed Central

    Farzianpour, Fereshteh; Monzavi, Abbas; Yassini, Esmaeil

    2011-01-01

    Background: Educational evaluation is a process which deals with data collection and assessment of academic activities’ progress. In this research, educational evaluation of Dentistry School of Tehran University of Medical Sciences, which trains students in undergraduate and residency courses, was studied. Methods: This descriptive study was done with a model of educational evaluation in ten steps and 13 fields including purposes and mission objectives, management and organization, academic board members, students, human resources and support, educational, research, health and treatment spaces, educational, diagnostic, research and laboratory tools, educational, research, health and treatment programs and courses, process of teaching and learning, evaluation and assessment, alumni, and patients satisfaction. Data were collected using observation, interviews, questionnaires, and checklists. Results: Results of the study were mainly qualitative and in some cases quantitative, based on defined optimal situation. The total mean of qualitative results of educational evaluation of dentistry school in all 13 fields was 55.98% which is relatively desirable. In the case of quantitative ones, results of some fields such as treatment quality of patients and education and learning of the students were relatively desirable (61.32% and 60.16% respectively). Conclusion: According to the results, educational goals and missions, educational and research facilities and spaces which were identified as the weakest areas need to be considered and paid more serious attention. PMID:22013466

  17. Standardized evaluation framework for evaluating coronary artery stenosis detection, stenosis quantification and lumen segmentation algorithms in computed tomography angiography.

    PubMed

    Kirişli, H A; Schaap, M; Metz, C T; Dharampal, A S; Meijboom, W B; Papadopoulou, S L; Dedic, A; Nieman, K; de Graaf, M A; Meijs, M F L; Cramer, M J; Broersen, A; Cetin, S; Eslami, A; Flórez-Valencia, L; Lor, K L; Matuszewski, B; Melki, I; Mohr, B; Oksüz, I; Shahzad, R; Wang, C; Kitslaar, P H; Unal, G; Katouzian, A; Örkisz, M; Chen, C M; Precioso, F; Najman, L; Masood, S; Ünay, D; van Vliet, L; Moreno, R; Goldenberg, R; Vuçini, E; Krestin, G P; Niessen, W J; van Walsum, T

    2013-12-01

    Though conventional coronary angiography (CCA) has been the standard of reference for diagnosing coronary artery disease in the past decades, computed tomography angiography (CTA) has rapidly emerged, and is nowadays widely used in clinical practice. Here, we introduce a standardized evaluation framework to reliably evaluate and compare the performance of the algorithms devised to detect and quantify the coronary artery stenoses, and to segment the coronary artery lumen in CTA data. The objective of this evaluation framework is to demonstrate the feasibility of dedicated algorithms to: (1) (semi-)automatically detect and quantify stenosis on CTA, in comparison with quantitative coronary angiography (QCA) and CTA consensus reading, and (2) (semi-)automatically segment the coronary lumen on CTA, in comparison with expert's manual annotation. A database consisting of 48 multicenter multivendor cardiac CTA datasets with corresponding reference standards are described and made available. The algorithms from 11 research groups were quantitatively evaluated and compared. The results show that (1) some of the current stenosis detection/quantification algorithms may be used for triage or as a second-reader in clinical practice, and that (2) automatic lumen segmentation is possible with a precision similar to that obtained by experts. The framework is open for new submissions through the website, at http://coronary.bigr.nl/stenoses/. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. A Metric-Based System for Evaluating the Productivity of Preclinical Faculty at an Academic Medical Center in the Era of Clinical and Translational Science.

    PubMed

    Wiegers, Susan E; Houser, Steven R; Pearson, Helen E; Untalan, Ann; Cheung, Joseph Y; Fisher, Susan G; Kaiser, Larry R; Feldman, Arthur M

    2015-08-01

    Academic medical centers are faced with increasing budgetary constraints due to a flat National Institutes of Health budget, lower reimbursements for clinical services, higher costs of technology including informatics and a changing competitive landscape. As such, institutional stakeholders are increasingly asking whether resources are allocated appropriately and whether there are objective methods for measuring faculty contributions and engagement. The complexities of translational research can be particularly challenging when trying to assess faculty contributions because of team science. For over a decade, we have used an objective scoring system called the Matrix to assess faculty productivity and engagement in four areas: research, education, scholarship, and administration or services. The Matrix was developed to be dynamic, quantitative, and able to insure that a fully engaged educator would have a Matrix score that was comparable to a fully engaged investigator. In this report, we present the Matrix in its current form in order to provide a well-tested objective system of performance evaluation for nonclinical faculty to help academic leaders in decision making. © 2015 Wiley Periodicals, Inc.

  19. Mixed methods evaluation of well-being benefits derived from a heritage-in-health intervention with hospital patients

    PubMed Central

    Paddon, Hannah L.; Thomson, Linda J.M.; Menon, Usha; Lanceley, Anne E.; Chatterjee, Helen J.

    2013-01-01

    Background This study sought to determine the effects of a heritage-in-health intervention on well-being. Benefits of arts-in-health interventions are relatively well-documented yet little robust research has been conducted using heritage-in-health interventions, such as those involving museum objects. Methods Hospital patients (n = 57) participated in semi-structured, 30–40 minute facilitated interview sessions, discussing and handling museum objects comprising selections of six artefacts and specimens loaned from archaeology, art, geology and natural history collections. Well-being measures (Positive Affect Negative Affect Scale, Visual Analogue Scales) evaluated the sessions while inductive and deductive thematic analysis investigated psycho-educational features accounting for changes. Results Comparison of pre- and post-session quantitative measures showed significant increases in well-being and happiness. Qualitative investigation revealed thinking and meaning-making opportunities for participants engaged with objects. Conclusions Heritage-in-health sessions enhanced positive mood and social interaction, endorsing the need for provision of well-being-related museum and gallery activities for socially excluded or vulnerable healthcare audiences. PMID:25621005

  20. The clinical effects of music therapy in palliative medicine.

    PubMed

    Gallagher, Lisa M; Lagman, Ruth; Walsh, Declan; Davis, Mellar P; Legrand, Susan B

    2006-08-01

    The aim of this study was to objectively assess the effect of music therapy on patients with advanced disease. Two hundred patients with chronic and/or advanced illnesses were prospectively evaluated, and the effects of music therapy on these patients are reported. Visual analog scales, the Happy/Sad Faces Assessment Tool, and a behavior scale recorded pre- and post-music-therapy scores on standardized data collection forms, and a computerized database was used to collect and analyze the data. Based on the Wilcoxon signed rank test and a paired t test, music therapy improved anxiety, body movement, facial expression, mood, pain, shortness of breath, and verbalizations. Sessions with family members were also evaluated; music therapy improved families' facial expressions, mood, and verbalizations. All improvements were statistically significant (P<0.001). Most patients and families had a positive subjective and objective response to music therapy. Objective data were obtained for a large number of patients with advanced disease, a significant addition to the quantitative literature on music therapy in this unique patient population. Our results suggest that music therapy is invaluable in palliative medicine.

  1. Predicting perceived visual complexity of abstract patterns using computational measures: The influence of mirror symmetry on complexity perception

    PubMed Central

    Leder, Helmut

    2017-01-01

    Visual complexity is relevant for many areas ranging from improving usability of technical displays or websites up to understanding aesthetic experiences. Therefore, many attempts have been made to relate objective properties of images to perceived complexity in artworks and other images. It has been argued that visual complexity is a multidimensional construct mainly consisting of two dimensions: A quantitative dimension that increases complexity through number of elements, and a structural dimension representing order negatively related to complexity. The objective of this work is to study human perception of visual complexity utilizing two large independent sets of abstract patterns. A wide range of computational measures of complexity was calculated, further combined using linear models as well as machine learning (random forests), and compared with data from human evaluations. Our results confirm the adequacy of existing two-factor models of perceived visual complexity consisting of a quantitative and a structural factor (in our case mirror symmetry) for both of our stimulus sets. In addition, a non-linear transformation of mirror symmetry giving more influence to small deviations from symmetry greatly increased explained variance. Thus, we again demonstrate the multidimensional nature of human complexity perception and present comprehensive quantitative models of the visual complexity of abstract patterns, which might be useful for future experiments and applications. PMID:29099832
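
    The two-factor idea described above can be illustrated with a small Python sketch that fits a linear model and a random forest to a quantitative feature (element count) and a non-linearly transformed symmetry feature. The transform, the simulated ratings and all coefficients are assumptions for illustration, not the study's data or model.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(1)

        # Hypothetical per-pattern measures: element count and mirror symmetry in [0, 1]
        n_elements = rng.integers(5, 200, size=300)
        symmetry = rng.uniform(0.0, 1.0, size=300)

        # Non-linear transform emphasising small deviations from perfect symmetry
        # (one possible choice; the published transform may differ)
        sym_nl = np.exp(-5.0 * (1.0 - symmetry))

        # Simulated human complexity ratings, just for the sketch
        ratings = 0.02 * n_elements - 2.0 * sym_nl + rng.normal(0, 0.3, size=300)

        X = np.column_stack([n_elements, sym_nl])
        linear = LinearRegression().fit(X, ratings)
        forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, ratings)
        print("linear R^2:", round(linear.score(X, ratings), 3))
        print("forest R^2:", round(forest.score(X, ratings), 3))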

  2. Advanced forensic validation for human spermatozoa identification using SPERM HY-LITER™ Express with quantitative image analysis.

    PubMed

    Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko

    2017-07-01

    Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful for detecting human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, these studies did not provide objective or quantitative descriptions of the results, nor clear criteria for the decisions reached, and the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. Use of this method enabled objective and specific identification of fluorescent sperm spots and quantitative comparisons of sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated under high-humidity and high-temperature conditions. Semen with quite low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases in application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide useful information, previously unobtainable, for the practical application of the SPERM HY-LITER™ Express kit. Moreover, the versatility of our image analysis method toward various complex cases was demonstrated.

  3. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
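
    The study's method combines a tissue mixture model with a partial volume model, inhomogeneity handling and an MRF prior; the Python sketch below shows only the simpler mixture-model core (a plain Gaussian mixture over synthetic voxel intensities) to illustrate how WM, GM and CSF volumes can be extracted from a segmentation. The intensity distributions and voxel size are assumptions.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(2)

        # Synthetic T1-like intensities for three tissue classes (CSF, GM, WM)
        voxels = np.concatenate([
            rng.normal(30, 5, 20000),    # CSF
            rng.normal(70, 6, 50000),    # GM
            rng.normal(110, 5, 60000),   # WM
        ]).reshape(-1, 1)

        gmm = GaussianMixture(n_components=3, random_state=0).fit(voxels)
        labels = gmm.predict(voxels)

        voxel_volume_ml = 1e-3                         # assumed 1 mm^3 voxels
        order = np.argsort(gmm.means_.ravel())         # classes sorted by mean intensity
        for name, k in zip(["CSF", "GM", "WM"], order):
            print(name, round(np.sum(labels == k) * voxel_volume_ml, 1), "ml")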

  4. Nuclear medicine and quantitative imaging research (quantitative studies in radiopharmaceutical science): Comprehensive progress report, April 1, 1986-December 31, 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, M.D.; Beck, R.N.

    1988-06-01

    This document describes several years' research to improve PET imaging and diagnostic techniques in man. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. The reports in the study were processed separately for the data bases. (TEM)

  5. Characterization and quantitation of anthocyanins and other phenolics in native Andean potatoes.

    PubMed

    Giusti, M Monica; Polit, Maria Fernanda; Ayvaz, Huseyin; Tay, David; Manrique, Ivan

    2014-05-14

    Andean potatoes are gaining popularity not only for their appealing colors and culinary uses but also for their potentially higher content of polyphenolic compounds. The objective of this study was to identify potato varieties with increased phenolic content. This was achieved through characterization and quantitation of the phenolic composition in 20 varieties of native Andean potatoes from 4 different Solanum species with different colors. Major quantitative and qualitative differences among evaluated samples depended more on the coloration of the extracted sample than on the species. The most predominant anthocyanidins were petunidin-3-coumaroylrutinoside-5-glucoside and pelargonidin-3-coumaroylrutinoside-5-glucoside in purple and red potato extracts, respectively, while chlorogenic acid and its isomers were the main phenolic compounds (43% of the total phenolic content). Our study suggested that the appropriate selection of native potatoes could provide new sources of polyphenolics with health-promoting properties and natural pigments with increased stability for food applications.

  6. Thermal evaluation for exposed stone house with quantitative and qualitative approach in mountainous area, Wonosobo, Indonesia

    NASA Astrophysics Data System (ADS)

    Hermawan, Hermawan; Prianto, Eddy

    2017-12-01

    A building can be considered to have good thermal performance if it keeps its occupants comfortable. Thermal comfort can be seen in the occupants' response to the architectural elements and the environment, such as lighting, room crowding, air temperature, humidity, oxygen level, and the occupants' behaviours. The objective of this research is to analyse the thermal performance of houses with four different orientations in a mountainous area. The research was conducted on four exposed stone houses with different orientations on the slope of Sindoro Mountain, which has a relatively cool temperature of about 26°C. The elements above were measured quantitatively and qualitatively over 24 hours. The results are as follows. First, the most comfortable house is the west-oriented house. Second, the quantitative and qualitative observations show no significant difference (±5%). Third, the occupants' behaviours (caring and genen) are also factors influencing comfort.

  7. Differences in standing and sitting postures of youth with idiopathic scoliosis from quantitative analysis of digital photographs.

    PubMed

    Fortin, Carole; Ehrmann Feldman, Debbie; Cheriet, Farida; Labelle, Hubert

    2013-08-01

    The objective of this study was to explore whether differences in standing and sitting postures of youth with idiopathic scoliosis could be detected from quantitative analysis of digital photographs. Standing and sitting postures of 50 participants aged 10-20-years-old with idiopathic scoliosis (Cobb angle: 15° to 60°) were assessed from digital photographs using a posture evaluation software program. Based on the XY coordinates of markers, 13 angular and linear posture indices were calculated in both positions. Paired t-tests were used to compare values of standing and sitting posture indices. Significant differences between standing and sitting positions (p < 0.05) were found for head protraction, shoulder elevation, scapula asymmetry, trunk list, scoliosis angle, waist angles, and frontal and sagittal plane pelvic tilt. Quantitative analysis of digital photographs is a clinically feasible method to measure standing and sitting postures among youth with scoliosis and to assist in decisions on therapeutic interventions.

  8. 12 CFR 614.4362 - Loan and lease concentration risk mitigation policy.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... include: (1) A purpose and objective; (2) Clearly defined and consistently used terms; (3) Quantitative... exceptions and reporting requirements. (b) Quantitative methods. (1) At a minimum, the quantitative methods...

  9. 12 CFR 614.4362 - Loan and lease concentration risk mitigation policy.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... include: (1) A purpose and objective; (2) Clearly defined and consistently used terms; (3) Quantitative... exceptions and reporting requirements. (b) Quantitative methods. (1) At a minimum, the quantitative methods...

  10. 12 CFR 614.4362 - Loan and lease concentration risk mitigation policy.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... include: (1) A purpose and objective; (2) Clearly defined and consistently used terms; (3) Quantitative... exceptions and reporting requirements. (b) Quantitative methods. (1) At a minimum, the quantitative methods...

  11. 2D shear-wave ultrasound elastography (SWE) evaluation of ablation zone following radiofrequency ablation of liver lesions: is it more accurate?

    PubMed Central

    Bo, Xiao W; Li, Xiao L; Guo, Le H; Li, Dan D; Liu, Bo J; Wang, Dan; He, Ya P; Xu, Xiao H

    2016-01-01

    Objective: To evaluate the usefulness of two-dimensional quantitative ultrasound shear-wave elastography (2D-SWE) [i.e. virtual touch imaging quantification (VTIQ)] in assessing the ablation zone after radiofrequency ablation (RFA) for ex vivo swine livers. Methods: RFA was performed in 10 pieces of fresh ex vivo swine livers with a T20 electrode needle and 20-W output power. Conventional ultrasound, conventional strain elastography (SE) and VTIQ were performed to depict the ablation zone 0 min, 10 min, 30 min and 60 min after ablation. On VTIQ, the ablation zones were evaluated qualitatively by evaluating the shear-wave velocity (SWV) map and quantitatively by measuring the SWV. The ultrasound, SE and VTIQ results were compared against gross pathological and histopathological specimens. Results: VTIQ SWV maps gave more details about the ablation zone, the central necrotic zone appeared as red, lateral necrotic zone as green and transitional zone as light green, from inner to exterior, while the peripheral unablated liver appeared as blue. Conventional ultrasound and SE, however, only marginally depicted the whole ablation zone. The volumes of the whole ablation zone (central necrotic zone + lateral necrotic zone + transitional zone) and necrotic zone (central necrotic zone + lateral necrotic zone) measured by VTIQ showed excellent correlation (r = 0.915, p < 0.001, and 0.856, p = 0.002, respectively) with those by gross pathological specimen, whereas both conventional ultrasound and SE underestimated the volume of the whole ablation zone. The SWV values of the central necrotic zone, lateral necrotic zone, transitional zone and unablated liver parenchyma were 7.54–8.03 m s−1, 5.13–5.28 m s−1, 3.31–3.53 m s−1 and 2.11–2.21 m s−1, respectively (p < 0.001 for all the comparisons). The SWV value for each ablation zone did not change significantly at different observation times within an hour after RFA (all p > 0.05). Conclusion: The quantitative 2D-SWE of VTIQ is useful for the depiction of the ablation zone after RFA and it facilitates discrimination of different areas in the ablation zone qualitatively and quantitatively. This elastography technique might be useful for the therapeutic response evaluation instantly after RFA. Advances in knowledge: A new quantitative 2D-SWE (i.e. VTIQ) for evaluation treatment response after RFA is demonstrated. It facilitates discrimination of the different areas in the ablation zone qualitatively and quantitatively and may be useful for the therapeutic response evaluation instantly after RFA in the future. PMID:26933911

  12. Multi-sensor image fusion algorithm based on multi-objective particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Xie, Xia-zhu; Xu, Ya-wei

    2017-11-01

    On the basis of dual-tree complex wavelet transform (DT-CWT) theory, an approach based on the multi-objective particle swarm optimization algorithm (MOPSO) was proposed to objectively choose the fusion weights of the low-frequency sub-bands. High- and low-frequency sub-bands were produced by the DT-CWT. The absolute value of the coefficients was adopted as the fusion rule for the high-frequency sub-bands, while the fusion weights of the low-frequency sub-bands were used as particles in MOPSO, with Spatial Frequency and Average Gradient adopted as the two fitness functions. The experimental results show that the proposed approach performs better than average fusion and than fusion methods based on local variance and local energy in brightness, clarity and quantitative evaluation, the latter comprising Entropy, Spatial Frequency, Average Gradient and QAB/F.
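
    Given sub-bands already produced by any DT-CWT implementation, the fusion rules and fitness functions described above can be sketched as follows in Python. The sub-band arrays here are random stand-ins, and the full MOPSO particle update loop is omitted; only a few candidate weights are evaluated.

        import numpy as np

        def fuse_high(h1, h2):
            """Absolute-maximum rule for a pair of high-frequency sub-bands."""
            return np.where(np.abs(h1) >= np.abs(h2), h1, h2)

        def fuse_low(l1, l2, w):
            """Weighted fusion of low-frequency sub-bands; w plays the role of a particle in [0, 1]."""
            return w * l1 + (1.0 - w) * l2

        def spatial_frequency(img):
            rf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))
            cf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))
            return np.sqrt(rf**2 + cf**2)

        def average_gradient(img):
            gy, gx = np.gradient(img.astype(float))
            return np.mean(np.sqrt((gx**2 + gy**2) / 2.0))

        rng = np.random.default_rng(3)
        h1, h2 = rng.normal(size=(64, 64)), rng.normal(size=(64, 64))   # stand-in high sub-bands
        l1, l2 = rng.random((64, 64)), rng.random((64, 64))             # stand-in low sub-bands
        hf = fuse_high(h1, h2)
        print("fused high-band energy:", round(float(np.sum(hf**2)), 2))
        for w in (0.3, 0.5, 0.7):                                       # candidate particles
            fused = fuse_low(l1, l2, w)
            print(w, round(spatial_frequency(fused), 4), round(average_gradient(fused), 4))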

  13. T1- or T2-weighted magnetic resonance imaging: what is the best choice to evaluate atrophy of the hippocampus?

    PubMed

    Fischbach-Boulanger, C; Fitsiori, A; Noblet, V; Baloglu, S; Oesterle, H; Draghici, S; Philippi, N; Duron, E; Hanon, O; Dietemann, J-L; Blanc, F; Kremer, S

    2018-05-01

    Magnetic resonance imaging is part of the diagnostic criteria for Alzheimer's disease (AD) through the evaluation of hippocampal atrophy. The objective of this study was to evaluate which sequence of T1-weighted (T1WI) and T2-weighted (T2WI) imaging allowed the best visual evaluation of hippocampal atrophy. Visual qualitative ratings of the hippocampus of 100 patients with mild cognitive impairment (MCI) and 50 patients with AD were made independently by four operators according to the medial temporal lobe atrophy score based either on T1WI or T2WI. These two evaluations were compared in terms of interobserver reproducibility, concordance with a quantitative volumetric measure, discrimination power between AD and MCI groups, and correlation with several neuropsychological tests. The medial temporal lobe atrophy score evaluated on either T1WI or T2WI exhibited similar interobserver variability and accordance with quantitative volumetric evaluation. However, the visual evaluation on T2WI seemed to provide better discrimination power between AD and MCI groups for both left (T1WI, P = 0.0001; T2WI, P = 7.072 × 10^-5) and right (T1WI, P = 0.008; T2WI, P = 0.001) hippocampus, and a higher overall correlation with neuropsychological tests. The present study suggests that T2WI provides a more adequate visual rating of hippocampal atrophy. © 2018 EAN.

  14. Assessment of the usability of a digital learning technology prototype for monitoring intracranial pressure 1

    PubMed Central

    de Carvalho, Lilian Regina; Évora, Yolanda Dora Martinez; Zem-Mascarenhas, Silvia Helena

    2016-01-01

    Objective: to assess the usability of a digital learning technology prototype as a new method for minimally invasive monitoring of intracranial pressure. Method: a descriptive study using a quantitative approach to assess the usability of the prototype against Nielsen's ten heuristics. Four experts in the area of human-computer interaction participated in the study. Results: the evaluation identified eight violated heuristics and 31 usability problems across the 32 screens of the prototype. Conclusion: the evaluators' suggestions were critical for developing an intuitive, user-friendly interface and will be incorporated into the final version of the digital learning technology. PMID:27579932

  15. Development and evaluation of a psychoeducation practitioner training program (PPTP).

    PubMed

    Matsuda, Mitsunobu; Kono, Ayumi

    2015-08-01

    The objective of this study was to develop a psychoeducation practitioner training program (PPTP) and to evaluate its usefulness with regard to nursing competencies (knowledge, self-efficacy, attitude, motivation, skills). A mixed-methods research design was applied; the quantitative component was a one-group pretest-posttest study. Forty nurses participated in the PPTP, of whom 38 (17 men and 21 women) completed a 2-consecutive-day curriculum (dropout rate: 5%). The PPTP significantly improved nurses' knowledge of, self-efficacy for, and attitude toward psychoeducation. However, the program did not lead to the acquisition of psychoeducational skills. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Estimation of masonry mechanical characteristics by ESPI fringe interpretation

    NASA Astrophysics Data System (ADS)

    Facchini, M.; Zanetta, P.; Binda, L.; Roberti, G. Mirabella; Tiraboschi, C.

    Electronic speckle pattern interferometry (ESPI) can be a powerful tool for efficient non-destructive testing and evaluation of micro-deformations of masonry materials and structures. Unlike traditional transducers, ESPI requires no direct contact with the object, and the full-field visualisation it offers provides for a better understanding of the surface behaviour. This paper describes an in-plane deformation inspection system which has been built up for an automatic acquisition of interferograms at different stages of a test. The system is applied to the evaluation of some mechanical characteristics of masonry components. Qualitative and quantitative results are obtained and an overall discussion is presented.

  17. Muscle MRI STIR signal intensity and atrophy are correlated to focal lower limb neuropathy severity.

    PubMed

    Deroide, N; Bousson, V; Mambre, L; Vicaut, E; Laredo, J D; Kubis, Nathalie

    2015-03-01

    The objective was to determine whether muscle MRI is useful for assessing neuropathy severity. Clinical, MRI and electromyography (EMG) examinations were performed in 17 patients with focal lower limb neuropathies. MRI Short Tau Inversion Recovery (STIR) signal intensity, amyotrophy, and muscle fatty infiltration measured after T1-weighted image acquisition, as well as EMG spontaneous activity (SA) and maximal voluntary contraction (MVC), were graded using semiquantitative scores (with quantitative scores for STIR signal intensity) and were correlated to the Medical Research Council (MRC) score for muscle strength. Within this population, subgroups were selected according to severity (mild versus severe), duration (subacute versus chronic), and topography (distal versus proximal) of the neuropathy. EMG SA and MVC, MRI amyotrophy, and quantitative scoring of muscle STIR intensity were correlated with the MRC score. Moreover, MRI amyotrophy was significantly increased in severe, chronic, and proximal neuropathies, along with fatty infiltration in chronic lesions. Muscle MRI atrophy and quantitative evaluation of signal intensity were correlated to the MRC score in our study. Semiquantitative evaluation of the muscle STIR signal was sensitive enough to detect the topography of the nerve lesion but was not suitable for assessing severity. Muscle MRI could thus support EMG, which showed poor sensitivity in these patients, in chronic and proximal neuropathy.

  18. Quantitative reflectance spectroscopy of buddingtonite from the Cuprite mining district, Nevada

    NASA Technical Reports Server (NTRS)

    Felzer, Benjamin; Hauff, Phoebe; Goetz, Alexander F. H.

    1994-01-01

    Buddingtonite, an ammonium-bearing feldspar diagnostic of volcanic-hosted alteration, can be identified and, in some cases, quantitatively measured using short-wave infrared (SWIR) reflectance spectroscopy. In this study over 200 samples from Cuprite, Nevada, were evaluated by X ray diffraction, chemical analysis, scanning electron microscopy, and SWIR reflectance spectroscopy with the objective of developing a quantitative remote-sensing technique for rapid determination of the amount of ammonium or buddingtonite present, and its distribution across the site. Based upon the Hapke theory of radiative transfer from particulate surfaces, spectra from quantitative, physical mixtures were compared with computed mixture spectra. We hypothesized that the concentration of ammonium in each sample is related to the size and shape of the ammonium absorption bands and tested this hypothesis for samples of relatively pure buddingtonite. We found that the band depth of the 2.12-micron NH4 feature is linearly related to the NH4 concentration for the Cuprite buddingtonite, and that the relationship is approximately exponential for a larger range of NH4 concentrations. Associated minerals such as smectite and jarosite suppress the depth of the 2.12-micron NH4 absorption band. Quantitative reflectance spectroscopy is possible when the effects of these associated minerals are also considered.
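
    The band-depth quantity referred to above is conventionally computed after continuum removal as D = 1 - R_band / R_continuum. The Python sketch below applies this definition to a synthetic spectrum with an absorption feature near 2.12 micrometers; the shoulder wavelengths and the spectrum itself are assumptions for illustration, not the Cuprite data.

        import numpy as np

        def band_depth(wavelengths, reflectance, band_center, shoulders):
            """Continuum-removed band depth: D = 1 - R_band / R_continuum.

            The continuum is a straight line between two shoulder wavelengths.
            """
            w = np.asarray(wavelengths)
            r = np.asarray(reflectance)
            r_left = np.interp(shoulders[0], w, r)
            r_right = np.interp(shoulders[1], w, r)
            slope = (r_right - r_left) / (shoulders[1] - shoulders[0])
            r_cont = r_left + slope * (band_center - shoulders[0])
            r_band = np.interp(band_center, w, r)
            return 1.0 - r_band / r_cont

        # Toy SWIR spectrum with an absorption feature near 2.12 um
        wl = np.linspace(2.0, 2.25, 100)
        refl = 0.55 - 0.12 * np.exp(-((wl - 2.12) / 0.015) ** 2)
        d = band_depth(wl, refl, band_center=2.12, shoulders=(2.05, 2.19))
        print(f"2.12-um band depth = {d:.3f}")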

  19. Using multi-species occupancy models in structured decision making on managed lands

    USGS Publications Warehouse

    Sauer, John R.; Blank, Peter J.; Zipkin, Elise F.; Fallon, Jane E.; Fallon, Frederick W.

    2013-01-01

    Land managers must balance the needs of a variety of species when manipulating habitats. Structured decision making provides a systematic means of defining choices and choosing among alternative management options; implementation of a structured decision requires quantitative approaches to predicting consequences of management on the relevant species. Multi-species occupancy models provide a convenient framework for making structured decisions when the management objective is focused on a collection of species. These models use replicate survey data that are often collected on managed lands. Occupancy can be modeled for each species as a function of habitat and other environmental features, and Bayesian methods allow for estimation and prediction of collective responses of groups of species to alternative scenarios of habitat management. We provide an example of this approach using data from breeding bird surveys conducted in 2008 at the Patuxent Research Refuge in Laurel, Maryland, evaluating the effects of eliminating meadow and wetland habitats on scrub-successional and woodland-breeding bird species using summed total occupancy of species as an objective function. Removal of meadows and wetlands decreased value of an objective function based on scrub-successional species by 23.3% (95% CI: 20.3–26.5), but caused only a 2% (0.5, 3.5) increase in value of an objective function based on woodland species, documenting differential effects of elimination of meadows and wetlands on these groups of breeding birds. This approach provides a useful quantitative tool for managers interested in structured decision making.
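
    The summed-occupancy objective described above can be illustrated with a small Python sketch that compares two management scenarios using a logistic occupancy model for a handful of species. The species names and coefficients are hypothetical, not the fitted Bayesian estimates from the refuge analysis.

        import numpy as np

        def occupancy_prob(beta0, beta_meadow, meadow_present):
            """Logistic occupancy model for one species (hypothetical coefficients)."""
            logit = beta0 + beta_meadow * meadow_present
            return 1.0 / (1.0 + np.exp(-logit))

        # Hypothetical coefficients for a small group of scrub-successional species
        species = {
            "species_A": (-0.5, 1.2),
            "species_B": (0.2, 0.8),
            "species_C": (-1.0, 1.5),
        }

        def summed_occupancy(meadow_present):
            return sum(occupancy_prob(b0, bm, meadow_present) for b0, bm in species.values())

        keep = summed_occupancy(1)      # meadows/wetlands retained
        remove = summed_occupancy(0)    # meadows/wetlands eliminated
        print(f"objective (keep) = {keep:.2f}, (remove) = {remove:.2f}, "
              f"change = {100 * (remove - keep) / keep:.1f}%")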

  20. Multi-objective optimization for evaluation of simulation fidelity for precipitation, cloudiness and insolation in regional climate models

    NASA Astrophysics Data System (ADS)

    Lee, H.

    2016-12-01

    Precipitation is one of the most important climate variables that are taken into account in studying regional climate. Nevertheless, how precipitation will respond to a changing climate and even its mean state in the current climate are not well represented in regional climate models (RCMs). Hence, comprehensive and mathematically rigorous methodologies to evaluate precipitation and related variables in multiple RCMs are required. The main objective of the current study is to evaluate the joint variability of climate variables related to model performance in simulating precipitation and condense multiple evaluation metrics into a single summary score. We use multi-objective optimization, a mathematical process that provides a set of optimal tradeoff solutions based on a range of evaluation metrics, to characterize the joint representation of precipitation, cloudiness and insolation in RCMs participating in the North American Regional Climate Change Assessment Program (NARCCAP) and Coordinated Regional Climate Downscaling Experiment-North America (CORDEX-NA). We also leverage ground observations, NASA satellite data and the Regional Climate Model Evaluation System (RCMES). Overall, the quantitative comparison of joint probability density functions between the three variables indicates that performance of each model differs markedly between sub-regions and also shows strong seasonal dependence. Because of the large variability across the models, it is important to evaluate models systematically and make future projections using only models showing relatively good performance. Our results indicate that the optimized multi-model ensemble always shows better performance than the arithmetic ensemble mean and may guide reliable future projections.
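
    One standard way to handle the trade-offs described above is to identify the non-dominated (Pareto-optimal) subset of models given several error metrics to be minimized. The Python sketch below does this for made-up metric values; the model names and numbers are placeholders, not NARCCAP or CORDEX-NA results.

        import numpy as np

        def pareto_front(scores):
            """Indices of non-dominated rows; all metrics are errors to minimize."""
            scores = np.asarray(scores, dtype=float)
            keep = []
            for i, row in enumerate(scores):
                dominated = any(
                    np.all(other <= row) and np.any(other < row)
                    for j, other in enumerate(scores) if j != i
                )
                if not dominated:
                    keep.append(i)
            return keep

        # Hypothetical (precip bias, cloud-fraction RMSE, insolation RMSE) per model
        models = ["RCM-1", "RCM-2", "RCM-3", "RCM-4"]
        errors = [
            [0.8, 0.15, 22.0],
            [0.5, 0.20, 18.0],
            [0.9, 0.25, 30.0],   # dominated by RCM-1 on every metric
            [0.6, 0.12, 25.0],
        ]
        best = pareto_front(errors)
        print("Pareto-optimal models:", [models[i] for i in best])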

  1. On Quantitative Biomarkers of VNS Therapy Using EEG and ECG Signals.

    PubMed

    Ravan, Maryam; Sabesan, Shivkumar; D'Cruz, O'Neill

    2017-02-01

    The goal of this work is to objectively evaluate the effectiveness of neuromodulation therapies, specifically, Vagus nerve stimulation (VNS) in reducing the severity of seizures in patients with medically refractory epilepsy. Using novel quantitative features obtained from combination of electroencephalographic (EEG) and electrocardiographic (ECG) signals around seizure events in 16 patients who underwent implantation of closed-loop VNS therapy system, namely AspireSR, we evaluated if automated delivery of VNS at the time of seizure onset reduces the severity of seizures by reducing EEG spatial synchronization as well as the duration and magnitude of heart rate increase. Unsupervised classification was subsequently applied to test the discriminative ability and validity of these features to measure responsiveness to VNS therapy. Results of application of this methodology to compare 105 pre-VNS treatment and 107 post-VNS treatment seizures revealed that seizures that were acutely stimulated using VNS had a reduced ictal spread as well as reduced impact on cardiovascular function compared to the ones that occurred prior to any treatment. Furthermore, application of an unsupervised fuzzy-c-mean classifier to evaluate the ability of the combined EEG-ECG based features to classify pre and post-treatment seizures achieved a classification accuracy of 85.85%. These results indicate the importance of timely delivery of VNS to reduce seizure severity and thus help achieve better seizure control for patients with epilepsy. The proposed set of quantitative features could be used as potential biomarkers for predicting long-term response to VNS therapy.

  2. Objective Evaluation of Visual Fatigue Using Binocular Fusion Maintenance

    PubMed Central

    Hirota, Masakazu; Morimoto, Takeshi; Kanda, Hiroyuki; Endo, Takao; Miyoshi, Tomomitsu; Miyagawa, Suguru; Hirohara, Yoko; Yamaguchi, Tatsuo; Saika, Makoto

    2018-01-01

    Purpose In this study, we investigated whether an individual's visual fatigue can be evaluated objectively and quantitatively from their ability to maintain binocular fusion. Methods Binocular fusion maintenance (BFM) was measured using a custom-made binocular open-view Shack–Hartmann wavefront aberrometer equipped with liquid crystal shutters, wherein eye movements and wavefront aberrations were measured simultaneously. Transmittance in the liquid crystal shutter in front of the subject's nondominant eye was reduced linearly, and BFM was determined from the transmittance at the point when binocular fusion was broken and vergence eye movement was induced. In total, 40 healthy subjects underwent the BFM test and completed a questionnaire regarding subjective symptoms before and after a visual task lasting 30 minutes. Results BFM was significantly reduced after the visual task (P < 0.001) and was negatively correlated with the total subjective eye symptom score (adjusted R2 = 0.752, P < 0.001). Furthermore, the diagnostic accuracy for visual fatigue was significantly higher in BFM than in the conventional test results (aggregated fusional vergence range, near point of convergence, and the high-frequency component of accommodative microfluctuations; P = 0.007). Conclusions These results suggest that BFM can be used as an indicator for evaluating visual fatigue. Translational Relevance BFM can be used to evaluate the visual fatigue caused by the new visual devices, such as head-mount display, objectively. PMID:29600117
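
    An adjusted R-squared such as the one reported above is obtained from an ordinary regression with a correction for the number of predictors. The Python sketch below shows the computation on made-up BFM and symptom-score values; the numbers are not the study's data.

        import numpy as np

        def adjusted_r2(y, y_pred, n_predictors):
            """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
            y, y_pred = np.asarray(y, float), np.asarray(y_pred, float)
            ss_res = np.sum((y - y_pred) ** 2)
            ss_tot = np.sum((y - y.mean()) ** 2)
            r2 = 1.0 - ss_res / ss_tot
            n = len(y)
            return 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)

        # Made-up BFM values (shutter transmittance at fusion break) vs symptom scores
        bfm = np.array([0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.35, 0.2])
        symptoms = np.array([2, 3, 4, 6, 7, 9, 10, 12])

        slope, intercept = np.polyfit(bfm, symptoms, 1)      # negative slope expected
        pred = slope * bfm + intercept
        print(f"slope = {slope:.2f}, adjusted R^2 = {adjusted_r2(symptoms, pred, 1):.3f}")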

  3. Quantitative and qualitative comparison of MR imaging of the temporomandibular joint at 1.5 and 3.0 T using an optimized high-resolution protocol

    PubMed Central

    Spinner, Georg; Wyss, Michael; Erni, Stefan; Ettlin, Dominik A; Nanz, Daniel; Ulbrich, Erika J; Gallo, Luigi M; Andreisek, Gustav

    2016-01-01

    Objectives: To quantitatively and qualitatively compare MRI of the temporomandibular joint (TMJ) using an optimized high-resolution protocol at 3.0 T and a clinical standard protocol at 1.5 T. Methods: A phantom and 12 asymptomatic volunteers were MR imaged using a 2-channel surface coil (standard TMJ coil) at 1.5 and 3.0 T (Philips Achieva and Philips Ingenia, respectively; Philips Healthcare, Best, Netherlands). Imaging protocol consisted of coronal and oblique sagittal proton density-weighted turbo spin echo sequences. For quantitative evaluation, a spherical phantom was imaged. Signal-to-noise ratio (SNR) maps were calculated on a voxelwise basis. For qualitative evaluation, all volunteers underwent MRI of the TMJ with the jaw in closed position. Two readers independently assessed visibility and delineation of anatomical structures of the TMJ and overall image quality on a 5-point Likert scale. Quantitative and qualitative measurements were compared between field strengths. Results: The quantitative analysis showed similar SNR for the high-resolution protocol at 3.0 T compared with the clinical protocol at 1.5 T. The qualitative analysis showed significantly better visibility and delineation of clinically relevant anatomical structures of the TMJ, including the TMJ disc and pterygoid muscle as well as better overall image quality at 3.0 T than at 1.5 T. Conclusions: The presented results indicate that expected gains in SNR at 3.0 T can be used to increase the spatial resolution when imaging the TMJ, which translates into increased visibility and delineation of anatomical structures of the TMJ. Therefore, imaging at 3.0 T should be preferred over 1.5 T for imaging the TMJ. PMID:26371077

  4. Music therapy for coma patients: preliminary results.

    PubMed

    Sun, J; Chen, W

    2015-04-01

    This study applied quantitative EEG (the δ+θ/α+β value) and the Glasgow Coma Scale (GCS) score to evaluate the role of music therapy for patients in coma after traumatic brain injury. Forty comatose patients with traumatic brain injury who met the inclusion criteria were selected. Twenty cases from the rehabilitation, neurology, and neurosurgery wards, whose families could cooperate actively and whose care was provided by a stable, long-term nursing staff delivering formal music therapy, formed the music group. Twenty cases from the intensive care units of the same wards, whose family members cooperated poorly, whose nursing staff changed frequently, and who received no formal music therapy, formed the control group. After a one-month follow-up, the GCS value and quantitative EEG (δ+θ/α+β value) were compared between the two groups. Apart from the presence or absence of formal music therapy, treatment did not differ significantly between the two groups, which were matched by age, gender, and injury type. Across the 40 patients with traumatic brain injury, the GCS value increased in the music group after treatment compared with the control group, and the difference between the two groups was significant (p < 0.05). The quantitative EEG value (δ+θ/α+β value) of the music group decreased after treatment, and the difference was significant compared with the control group (p < 0.05). As assessed by quantitative EEG (δ+θ/α+β value) and the GCS score, music therapy has a clear effect on promoting recovery of consciousness in comatose patients with craniocerebral trauma. The quantitative EEG (δ+θ/α+β value) can be used as an objective index to evaluate the state of brain function.

  5. Significant reduction of inflammation and sebaceous glands size in acne vulgaris lesions after intense pulsed light treatment.

    PubMed

    Barakat, Manal T; Moftah, Noha H; El Khayyat, Mohammad A M; Abdelhakim, Zainab A

    2017-01-01

    Intense pulsed light (IPL) has been used for years in the treatment of acne vulgaris. However, quantitative evaluation of histopathological changes after its use as a sole therapy has been poorly investigated. Accordingly, this study aims to objectively evaluate the inflammatory infiltrate and sebaceous glands in acne vulgaris after IPL. Twenty-four patients with acne were treated with six IPL sessions. Clinical evaluation was done 2 weeks after the last session by counting acne lesions. Patient satisfaction, assessed using the Cardiff Acne Disability Index (CADI), was recorded at baseline and at 2 weeks and 3 months after IPL. Using histopathological and computerized morphometric analysis, quantitative evaluation of the inflammatory infiltrate and measurement of the surface area of the sebaceous glands were performed on skin biopsies taken at baseline and 2 weeks after the last session. After IPL, there was a significant reduction of all acne lesions, especially the inflammatory variety, with a significant decrease of the CADI score at 2 weeks and 3 months after IPL (p < .05). Microscopically, there was a significant decrease in the density of the inflammatory infiltrate and the surface area of the sebaceous glands (p < .05). Thus, IPL is a fairly effective therapy for acne vulgaris, especially the inflammatory variety. The results suggest that IPL could improve acne lesions by targeting both inflammation and sebaceous glands. © 2016 Wiley Periodicals, Inc.

  6. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software where the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.
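
    A minimal sketch of the local magnification (ML) idea is given below, assuming the intersections of the square grid target have already been detected as pixel coordinates; the function names and the center-referenced distortion definition are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def local_magnification(image_pts, object_spacing_mm):
    """Estimate local magnification (ML) at each interior grid point.

    image_pts: (rows, cols, 2) array of detected grid-intersection pixel
    coordinates; object_spacing_mm: true spacing of the square grid target.
    ML is taken here as the mean pixel distance to the four nearest
    neighbours divided by the true spacing (an assumed definition).
    """
    rows, cols, _ = image_pts.shape
    ml = np.full((rows, cols), np.nan)
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            p = image_pts[r, c]
            neighbours = [image_pts[r - 1, c], image_pts[r + 1, c],
                          image_pts[r, c - 1], image_pts[r, c + 1]]
            dists = [np.linalg.norm(p - q) for q in neighbours]
            ml[r, c] = np.mean(dists) / object_spacing_mm  # pixels per mm
    return ml

def radial_distortion_percent(ml, center_rc):
    """Express distortion as percent deviation of ML from its value at the
    image centre; fisheye (barrel) distortion shows up as ML falling off
    toward the periphery."""
    ml_center = ml[center_rc]
    return 100.0 * (ml - ml_center) / ml_center
```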

  7. Simulation Training: Evaluating the Instructor’s Contribution to a Wizard of Oz Simulator in Obstetrics and Gynecology Ultrasound Training

    PubMed Central

    Tepper, Ronnie

    2017-01-01

    Background Workplaces today demand graduates who are prepared with field-specific knowledge, advanced social skills, problem-solving skills, and integration capabilities. Meeting these goals with didactic learning (DL) is becoming increasingly difficult. Enhanced training methods that would better prepare tomorrow’s graduates must be more engaging and game-like, such as feedback-based e-learning or simulation-based training, while saving time. Empirical evidence regarding the effectiveness of advanced learning methods is lacking. Objective quantitative research comparing advanced training methods with DL is sparse. Objectives This quantitative study assessed the effectiveness of a computerized interactive simulator coupled with an instructor who monitored students’ progress and provided immediate Web-based feedback. Methods A low-cost, globally accessible, telemedicine simulator, developed at the Technion—Israel Institute of Technology, Haifa, Israel—was used. A previous study in the field of interventional cardiology, evaluating the efficacy of the simulator in enhancing learning via knowledge exams, presented promising results, with average scores improving from 54% before training to 94% after training (n=20, P<.001). Two independent experiments involving obstetrics and gynecology (Ob-Gyn) physicians and senior ultrasound sonographers, with 32 subjects, were conducted using a new interactive concept of the WOZ (Wizard of OZ) simulator platform. The contribution of an instructor to learning outcomes was evaluated by comparing students’ knowledge before and after each interactive instructor-led session as well as after fully automated e-learning in the field of Ob-Gyn. Results from objective knowledge tests were analyzed using hypothesis testing and model fitting. Results A significant advantage (P=.01) was found in favor of the WOZ training approach. Content type and training audience were not significant. Conclusions This study evaluated the contribution of an integrated teaching environment using a computerized interactive simulator, with an instructor providing immediate Web-based feedback to trainees. Involvement of an instructor in the simulation-based training process provided better learning outcomes; varying the training content and trainee populations did not affect the overall learning gains. PMID:28432039

  8. Inside marginal adaptation of crowns by X-ray micro-computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dos Santos, T. M.; Lima, I.; Lopes, R. T.

    The objective of this work was to assess the dental arcade by using X-ray micro-computed tomography. For this purpose, a high-resolution system was used and three groups were studied: Zirkonzahn CAD-CAM system, IPS e.max Press, and metal ceramic. The three systems assessed in this study showed marginal and discrepancy gaps within clinically accepted limits. The 2D and 3D evaluations showed that the technique used is a powerful method to investigate quantitative characteristics of the dental arcade. (authors)

  9. Effect of environment on insulation materials, volume 1

    NASA Technical Reports Server (NTRS)

    Parmley, R. T.; Smith, F. J.; Glassford, A. P.; Coleman, J.; Stevenson, D. R.

    1973-01-01

    Twenty candidate multilayer insulation and insulation related materials were subjected to eight conditions that represent possible operational environments. These exposures include ground contaminants, various operational temperatures, space vacuum, space-vented propellants, and tank leakage. The objective of this program was to obtain and evaluate the data from these exposures to provide both a quantitative and qualitative description of the degradation to certain physical and thermal properties, and from this, to obtain a better understanding of the environmental effects on the insulation performance.

  10. Accelerated Irradiations for High Dose Microstructures in Fast Reactor Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiao, Zhijie

    The objective of this project is to determine the extent to which high dose rate, self-ion irradiation can be used as an accelerated irradiation tool to understand microstructure evolution at high doses and temperatures relevant to advanced fast reactors. We will accomplish the goal by evaluating phase stability and swelling of F-M alloys relevant to SFR systems at very high dose by combining experiment and modeling in an effort to obtain a quantitative description of the processes at high and low damage rates.

  11. Link-Based Similarity Measures Using Reachability Vectors

    PubMed Central

    Yoon, Seok-Ho; Kim, Ji-Soo; Ryu, Minsoo; Choi, Ho-Jin

    2014-01-01

    We present a novel approach for computing link-based similarities among objects accurately by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link based similarities that does not suffer from these problems. In the proposed approach each target object is represented by a vector. Each element of the vector corresponds to all the objects in the given data, and the value of each element denotes the weight for the corresponding object. As for this weight value, we propose to utilize the probability of reaching from the target object to the specific object, computed using the “Random Walk with Restart” strategy. Then, we define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures. PMID:24701188
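
    A compact sketch of the reachability-vector construction described above, under stated assumptions: row-normalized transition probabilities, a restart probability of 0.15, and plain power iteration; tolerances and parameter values are illustrative.

```python
import numpy as np

def rwr_vector(adj, start, restart=0.15, tol=1e-8, max_iter=1000):
    """Random Walk with Restart probabilities from node `start`.

    adj: (n, n) nonnegative adjacency matrix; rows are normalized to obtain
    transition probabilities. The returned stationary reachability vector
    is the representation of object `start`.
    """
    n = adj.shape[0]
    row_sums = adj.sum(axis=1, keepdims=True)
    P = np.divide(adj, row_sums, out=np.zeros_like(adj, dtype=float),
                  where=row_sums > 0)
    e = np.zeros(n)
    e[start] = 1.0
    r = e.copy()
    for _ in range(max_iter):
        r_new = (1 - restart) * P.T @ r + restart * e
        if np.linalg.norm(r_new - r, 1) < tol:
            return r_new
        r = r_new
    return r

def link_similarity(adj, i, j):
    """Cosine similarity between the reachability vectors of objects i and j."""
    ri, rj = rwr_vector(adj, i), rwr_vector(adj, j)
    return float(ri @ rj / (np.linalg.norm(ri) * np.linalg.norm(rj)))
```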

  12. Driving simulator validation of driver behavior with limited safe vantage points for data collection in work zones.

    PubMed

    Bham, Ghulam H; Leu, Ming C; Vallati, Manoj; Mathur, Durga R

    2014-06-01

    This study is aimed at validating a driving simulator (DS) for the study of driver behavior in work zones. A validation study requires field data collection. For studies conducted in highway work zones, the availability of safe vantage points for data collection at critical locations can be a significant challenge. A validation framework is therefore proposed in this paper, demonstrated using a fixed-based DS that addresses the issue by using a global positioning system (GPS). The validation of the DS was conducted using objective and subjective evaluations. The objective validation was divided into qualitative and quantitative evaluations. The DS was validated by comparing the results of simulation with the field data, which were collected using a GPS along the highway and video recordings at specific locations in a work zone. The constructed work zone scenario in the DS was subjectively evaluated with 46 participants. The objective evaluation established the absolute and relative validity of the DS. The mean speeds from the DS data showed excellent agreement with the field data. The subjective evaluation indicated realistic driving experience by the participants. The use of GPS showed that continuous data collected along the highway can overcome the challenges of unavailability of safe vantage points especially at critical locations. Further, a validated DS can be used for examining driver behavior in complex situations by replicating realistic scenarios. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Quantitative measurement of radiofrequency volumetric tissue reduction by multidetector CT in patients with inferior turbinate hypertrophy.

    PubMed

    Bahadir, Osman; Kosucu, Polat

    2012-12-01

    To objectively assess the efficacy of radiofrequency thermal ablation of inferior turbinate hypertrophy. Thirty-five patients with nasal obstruction secondary to inferior turbinate hypertrophy were prospectively enrolled. Radiofrequency energy was delivered to four sites in each inferior turbinate. Patients were evaluated before and 8 weeks after intervention. Subjective evaluation of nasal obstruction was performed using a visual analogue scale (VAS), and objective evaluation of the turbinate volume reduction was calculated using multidetector CT. Volumetric measurements of the preoperative inferior turbinate were compared with postoperative values on both sides. The great majority of patients (91.4%) exhibited subjective postoperative improvement. Mean obstruction (VAS) improved significantly from 7.45±1.48 to 3.54±1.96. Significant turbinate volume reduction was achieved by the surgery on both right and left sides [(preoperative vs. postoperative, right: 6.55±1.62cm(3) vs. 5.10±1.47cm(3), (P<0.01); left: 6.72±1.53cm(3) vs. 5.00±1.37cm(3), (P<0.01)] respectively. Radiofrequency is a safe and effective surgical procedure in reducing turbinate volume in patients with inferior turbinate hypertrophy. Multidetector CT is an objective method of assessment in detecting radiofrequency turbinate volume reduction. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  14. Quantitative assessment of participant knowledge and evaluation of participant satisfaction in the CARES training program.

    PubMed

    Goodman, Melody S; Si, Xuemei; Stafford, Jewel D; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2012-01-01

    The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyzed results from quantitative questions on the assessments, pre- and post-tests, and evaluations. CARES fellows' knowledge increased at follow-up (75% of questions were answered correctly on average) compared with the baseline assessment (38% of questions were answered correctly on average); post-test scores were higher than pre-test scores in 9 out of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. The CARES fellows training program was successful in achieving participant satisfaction and increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community-academic research partnerships.

  15. Issues in evaluation: evaluating assessments of elderly people using a combination of methods.

    PubMed

    McEwan, R T

    1989-02-01

    In evaluating a health service, individuals will give differing accounts of its performance, according to their experiences of the service, and the evaluative perspective they adopt. The value of a service may also change through time, and according to the particular part of the service studied. Traditional health care evaluations have generally not accounted for this variability because of the approaches used. Studies evaluating screening or assessment programmes for the elderly have focused on programme effectiveness and efficiency, using relatively inflexible quantitative methods. Evaluative approaches must reflect the complexity of health service provision, and methods must vary to suit the particular research objective. Under these circumstances, this paper presents the case for the use of multiple triangulation in evaluative research, where differing methods and perspectives are combined in one study. Emphasis is placed on the applications and benefits of subjectivist approaches in evaluation. An example of combined methods is provided in the form of an evaluation of the Newcastle Care Plan for the Elderly.

  16. Linking agent-based models and stochastic models of financial markets

    PubMed Central

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene

    2012-01-01

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086

  17. Linking agent-based models and stochastic models of financial markets.

    PubMed

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H Eugene

    2012-05-29

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that "fat" tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting.

  18. Effect of Genital Sampling Site on the Detection and Quantification of Ureaplasma Species with Quantitative Polymerase Chain Reaction during Pregnancy.

    PubMed

    Faron, Gilles; Vancutsem, Ellen; Naessens, Anne; Buyl, Ronald; Gucciardo, Leonardo; Foulon, Walter

    2017-01-01

    Objective. This study aimed to compare the qualitative and quantitative reproducibility of quantitative PCR (qPCR) for Ureaplasma species (Ureaplasma spp.) throughout pregnancy and according to the genital sampling site. Study Design. Between 5 and 14 weeks of gestation (T1), vaginal, fornix, and two cervical samples were taken. Sampling was repeated during the 2nd (T2) and 3rd (T3) trimester in randomly selected T1 positive and negative women. Qualitative and quantitative reproducibility were evaluated using, respectively, Cohen's kappa (κ) and interclass correlation coefficients (ICC) and repeated measures ANOVA on the log-transformed mean number of DNA copies for each sampling site. Results. During T1, 51/127 women were positive for U. parvum and 8 for U. urealyticum (4 patients for both). Sampling was repeated for 44/55 women at T2 and/or T3; 43 (97.7%) remained positive at the three timepoints. κ ranged between 0.83 and 0.95 and the ICC for cervical samples was 0.86. Conclusions. Colonization by Ureaplasma spp. seems to be very constant during pregnancy and vaginal samples have the highest detection rate.

  19. Control of Cattle Ticks and Tick-Borne Diseases by Acaricide in Southern Province of Zambia: A Retrospective Evaluation of Animal Health Measures According to Current One Health Concepts.

    PubMed

    Laing, Gabrielle; Aragrande, Maurizio; Canali, Massimo; Savic, Sara; De Meneghi, Daniele

    2018-01-01

    One Health thinking for health interventions is increasingly being used to capture previously unseen stakeholders and impacts across people, animals, and the environment. The Network for One Health Evaluation (NEOH) proposes a systems-based framework to quantitatively assess integration and highlight the added value (theory of change) that this approach will bring to a project. This case study retrospectively evaluates the pioneering use of a One Health (OH) approach during an international collaboration (a satellite project to tackle production losses due to tick-borne disease in cattle in Southern Zambia in the late 1980s). The objective of the evaluation is twofold: retrospective evaluation of the OH-ness of the satellite project and identification of costs and benefits. Data for the evaluation were recovered from publications, project documents, and witness interviews. A mixed qualitative and quantitative evaluation was undertaken. In this case study, a transdisciplinary approach allowed for the identification of a serious public health risk arising from the unexpected reuse of chemical containers by the local public against advice. Had this pioneering project not been completed, it is assumed that this behavior could have had a large impact on public wellbeing and ultimately reduced regional productivity and compromised welfare. From the economic evaluation, the costs of implementing this OH approach, helping to avoid harm, were small in comparison to overall project costs. The overall OH Index was 0.34. The satellite project demonstrated good OH operations by managing to incorporate input across multiple dimensions but was slightly weaker on OH infrastructures (OH Ratio = 1.20). These quantitative results can be used in the initial validation and benchmarking of this novel framework. Limitations of the evaluation were mainly a lack of data due to the length of time since project completion and a lack of formal monitoring of program impact. In future health strategy development and execution, routine monitoring and evaluation from an OH perspective (by utilizing the framework proposed by NEOH) could prove valuable, or could be used as a tool for retrospective evaluation of existing policies.

  20. Efficacy of mesotherapy in facial rejuvenation: a histological and immunohistochemical evaluation

    PubMed Central

    El-Domyati, Moetaz; El-Ammawi, Tarek S.; Moawad, Osama; El-Fakahany, Hasan; Medhat, Walid; Mahoney, Mỹ G.; Uitto, Jouni

    2012-01-01

    Background Mesotherapy, commonly known as “biorejuvenation” or “biorevitalization”, is a technique used to rejuvenate the skin by means of a transdermal injection of a multivitamin solution and natural plant extracts that are thought to improve the signs of skin aging. Objectives This prospective study aimed to evaluate the clinical effect of mesotherapy applied to periorbital wrinkles and to quantitatively evaluate histological changes in the skin occurring in response to the same treatment. Methods Six volunteers with Fitzpatrick skin types III or IV and Glogau class I–III wrinkles were subjected to a three-month course of mesotherapy injections in the periocular area (six sessions administered at two-week intervals). Standard photographs and skin biopsies were obtained from the treatment area at baseline, at the end of treatment, and at three months post-treatment. Quantitative evaluation of collagen types I, III, and VII, newly synthesized collagen, total elastin, and tropoelastin was performed using a computerized morphometric analysis. Results The clinical evaluation of volunteers at baseline, end of treatment, and three months post-treatment revealed no significant differences. Histological and immunostaining analysis of collagen types I, III, and VII, newly synthesized collagen, total elastin, and tropoelastin showed no statistically significant changes (P > 0.05) after mesotherapy injection. Conclusions The present study indicates that mesotherapy for skin rejuvenation does not result in statistically significant histological changes or clinical improvement. PMID:22788806

  1. Comparison of qualitative and quantitative evaluation of diffusion-weighted MRI and chemical-shift imaging in the differentiation of benign and malignant vertebral body fractures.

    PubMed

    Geith, Tobias; Schmidt, Gerwin; Biffar, Andreas; Dietrich, Olaf; Dürr, Hans Roland; Reiser, Maximilian; Baur-Melnyk, Andrea

    2012-11-01

    The objective of our study was to compare the diagnostic value of qualitative diffusion-weighted imaging (DWI), quantitative DWI, and chemical-shift imaging in a single prospective cohort of patients with acute osteoporotic and malignant vertebral fractures. The study group was composed of patients with 26 osteoporotic vertebral fractures (18 women, eight men; mean age, 69 years; age range, 31 years 6 months to 86 years 2 months) and 20 malignant vertebral fractures (nine women, 11 men; mean age, 63.4 years; age range, 24 years 8 months to 86 years 4 months). T1-weighted, STIR, and T2-weighted sequences were acquired at 1.5 T. A DW reverse fast imaging with steady-state free precession (PSIF) sequence at different delta values was evaluated qualitatively. A DW echo-planar imaging (EPI) sequence and a DW single-shot turbo spin-echo (TSE) sequence at different b values were evaluated qualitatively and quantitatively using the apparent diffusion coefficient. Opposed-phase sequences were used to assess signal intensity qualitatively. The signal loss between in- and opposed-phase images was determined quantitatively. Two-tailed Fisher exact test, Mann-Whitney test, and receiver operating characteristic analysis were performed. Sensitivities, specificities, and accuracies were determined. Qualitative DW-PSIF imaging (delta = 3 ms) showed the best performance for distinguishing between benign and malignant fractures (sensitivity, 100%; specificity, 88.5%; accuracy, 93.5%). Qualitative DW-EPI (b = 50 s/mm(2) [p = 1.00]; b = 250 s/mm(2) [p = 0.50]) and DW single-shot TSE imaging (b = 100 s/mm(2) [p = 1.00]; b = 250 s/mm(2) [p = 0.18]; b = 400 s/mm(2) [p = 0.18]; b = 600 s/mm(2) [p = 0.39]) did not indicate significant differences between benign and malignant fractures. DW-EPI using a b value of 500 s/mm(2) (p = 0.01) indicated significant differences between benign and malignant vertebral fractures. Quantitative DW-EPI (p = 0.09) and qualitative opposed-phase imaging (p = 0.06) did not exhibit significant differences, whereas quantitative DW single-shot TSE imaging (p = 0.002) and quantitative chemical-shift imaging (p = 0.01) showed significant differences between benign and malignant fractures. The DW-PSIF sequence (delta = 3 ms) had the highest accuracy in differentiating benign from malignant vertebral fractures. Quantitative chemical-shift imaging and quantitative DW single-shot TSE imaging had a lower accuracy than DW-PSIF imaging because of a large overlap. Qualitative assessment of opposed-phase, DW-EPI, and DW single-shot TSE sequences and quantitative assessment of the DW-EPI sequence were not suitable for distinguishing between benign and malignant vertebral fractures.

  2. Using Live-Crown Ratio to Control Wood Quality: An Example of Quantitative Silviculture

    Treesearch

    Thomas J. Dean

    1999-01-01

    Quantitative silviculture is the application of biological relationships in meeting specific, quantitative management objectives. It is a two-sided approach requiring the identification and application of biological relationships. An example of quantitative silviculture is presented that uses a relationship between average-live crown ratio and relative stand density...

  3. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  4. Attitudes of affiliate faculty members toward medical student summative evaluation for clinical clerkships: a qualitative analysis.

    PubMed

    Wang, Karen E; Fitzpatrick, Caroline; George, David; Lane, Lindsey

    2012-01-01

    Summative evaluation of medical students is a critical component of the educational process. Despite extensive literature on evaluation, few studies have centered on affiliate faculty members' attitudes toward summative evaluation of students, though it has been suggested that these attitudes influence their effectiveness as evaluators. The objective is to examine affiliate faculty members' attitudes toward clinical clerkship evaluation using primarily qualitative research methods. The study used a nonexperimental research design and employed mixed methods. Data were collected through interviews, focus groups, and a questionnaire from 11 affiliate faculty members. Themes emerging from the data fell into three broad categories: (a) factors that influence grading, (b) consequences of negative evaluations, and (c) disconnections in the grading process. The quantitative portion of the study revealed important discrepancies supporting the use of qualitative methods. The study highlights faculty members' struggles with the evaluative process and emphasizes the need for improvements in evaluation tools and faculty development.

  5. Detecting Target Objects by Natural Language Instructions Using an RGB-D Camera

    PubMed Central

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Tang, Hongru; Xi, Ning

    2016-01-01

    Controlling robots by natural language (NL) is increasingly attracting attention for its versatility, convenience, and the lack of need for extensive user training. Grounding—enabling robots to understand NL instructions from humans—is a crucial challenge in this problem. This paper mainly explores the object grounding problem and concretely studies how to detect target objects from NL instructions using an RGB-D camera in robotic manipulation applications. In particular, a simple yet robust vision algorithm is applied to segment objects of interest. With the metric information of all segmented objects, the object attributes and relations between objects are further extracted. The NL instructions that incorporate multiple cues for object specifications are parsed into domain-specific annotations. The annotations from NL and the extracted information from the RGB-D camera are matched in a computational state estimation framework to search all possible object grounding states. The final grounding is accomplished by selecting the states which have the maximum probabilities. An RGB-D scene dataset associated with different groups of NL instructions based on different cognition levels of the robot is collected. Quantitative evaluations on the dataset illustrate the advantages of the proposed method. Experiments on NL-controlled object manipulation and NL-based task programming using a mobile manipulator show its effectiveness and practicability in robotic applications. PMID:27983604
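
    The grounding step described above can be illustrated with a toy scoring function, shown below; the dictionary layout, the 0.9/0.1 match likelihoods, and the independence assumption are hypothetical stand-ins for the paper's computational state estimation framework.

```python
# Illustrative scoring of candidate groundings (data layout and weights assumed).
def grounding_score(candidate, annotation):
    """Probability-like score that a segmented object matches the parsed NL
    annotation; attribute and relation cues are treated as independent."""
    score = 1.0
    for attr, wanted in annotation.get("attributes", {}).items():
        score *= 0.9 if candidate["attributes"].get(attr) == wanted else 0.1
    for rel, other in annotation.get("relations", {}).items():
        score *= 0.9 if (rel, other) in candidate["relations"] else 0.1
    return score

def ground(candidates, annotation):
    """Select the candidate object with the maximum grounding score."""
    return max(candidates, key=lambda c: grounding_score(c, annotation))
```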

  6. Evaluation of self-esteem in cancer patients undergoing chemotherapy treatment1

    PubMed Central

    Leite, Marilia Aparecida Carvalho; Nogueira, Denismar Alves; Terra, Fábio de Souza

    2015-01-01

    Objective: to evaluate the self-esteem of cancer patients undergoing chemotherapy. Method: descriptive analytical cross-sectional study with a quantitative approach. In total, 156 patients who attended an oncology unit of a mid-sized hospital participated in the study. Results: we found a higher frequency of patients with high self-esteem, but some of them showed average or low self-esteem. The scale showed a Cronbach's alpha value of 0.746, indicating acceptable internal consistency for the evaluated items. No independent variables showed significant associations with self-esteem. Conclusion: the cancer patients evaluated presented high self-esteem; it is therefore crucial for nursing to plan the care of patients undergoing chemotherapy in a way that enables actions and strategies addressing their physical and psychosocial conditions, aiming to maintain and rehabilitate these patients' emotional well-being. PMID:26625999

  7. Design for sustainability of industrial symbiosis based on emergy and multi-objective particle swarm optimization.

    PubMed

    Ren, Jingzheng; Liang, Hanwei; Dong, Liang; Sun, Lu; Gao, Zhiqiu

    2016-08-15

    Industrial symbiosis provides a novel and practical pathway to design for sustainability. A decision support tool for its verification is necessary for practitioners and policy makers, but to date quantitative research is limited. The objective of this work is to present an innovative approach for supporting decision-making in design for sustainability through the implementation of industrial symbiosis in a chemical complex. By incorporating emergy theory, the model is formulated as a multi-objective approach that can optimize both the economic benefit and the sustainability performance of the integrated industrial system. A set of emergy-based evaluation indices is designed. A multi-objective Particle Swarm Optimization algorithm is proposed to solve the model, and decision-makers are allowed to choose suitable solutions from the Pareto solutions. An illustrative case has been studied with the proposed method; a number of compromises between high profitability and high sustainability can be obtained for decision-makers/stakeholders. Copyright © 2016 Elsevier B.V. All rights reserved.
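
    The abstract couples emergy-based objectives with multi-objective particle swarm optimization; the sketch below shows only the Pareto-dominance filter a decision-maker would apply to candidate solutions, assuming both objectives (e.g., economic benefit and an emergy-based sustainability index) are to be maximized. Names and example values are illustrative.

```python
import numpy as np

def pareto_front(objectives):
    """Return indices of non-dominated solutions.

    objectives: (n, m) array where higher is better for every column.
    A solution is dominated if another is at least as good in every
    objective and strictly better in at least one.
    """
    n = objectives.shape[0]
    keep = []
    for i in range(n):
        dominated = False
        for j in range(n):
            if j != i and np.all(objectives[j] >= objectives[i]) \
                    and np.any(objectives[j] > objectives[i]):
                dominated = True
                break
        if not dominated:
            keep.append(i)
    return keep

# Example: three candidate symbiosis designs scored on (profit, sustainability).
scores = np.array([[1.0, 0.8], [0.9, 0.9], [0.7, 0.6]])
print(pareto_front(scores))  # -> [0, 1]; the third design is dominated
```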

  8. On Multi-Objective Based Constitutive Modelling Methodology and Numerical Validation in Small-Hole Drilling of Al6063/SiCp Composites

    PubMed Central

    Xiang, Junfeng; Xie, Lijing; Gao, Feinong; Zhang, Yu; Yi, Jie; Wang, Tao; Pang, Siqin; Wang, Xibin

    2018-01-01

    Discrepancies in capturing the behavior of some materials, such as particulate-reinforced metal matrix composites, with the conventional ad hoc fitting strategy challenge the applicability of the Johnson-Cook constitutive model. Despite efforts to extend the formalism, adding more fitting parameters increases the difficulty of identifying the constitutive parameters. A weighted multi-objective strategy for identifying any constitutive formalism is developed to predict mechanical behavior under static and dynamic loading conditions equally well. The varying weights are based on evaluating the Gaussian-distributed noise of experimentally obtained stress-strain data in quasi-static or dynamic mode. This universal method can be used to determine quickly and directly, by measuring goodness-of-fit, whether a constitutive formalism is suitable to describe the material’s constitutive behavior. A quantitative comparison of different fitting strategies for identifying the material parameters of Al6063/SiCp is made in terms of performance evaluation, including noise elimination, correlation, and reliability. Finally, a three-dimensional (3D) FE model of small-hole drilling of Al6063/SiCp composites, using the multi-objective-identified constitutive formalism, is developed. Comparison with experimental observations of thrust force, torque, and chip morphology provides valid evidence for the applicability of the developed multi-objective identification strategy in identifying constitutive parameters. PMID:29324688
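
    A minimal sketch of a weighted multi-objective fit of the standard Johnson-Cook flow-stress law with SciPy is given below, assuming measured stress-strain data at a quasi-static and a dynamic strain rate; the thermal softening term is omitted and the noise-derived weights of the paper are replaced by user-supplied weights, so all names and numerical values are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def jc_stress(params, strain, strain_rate, ref_rate=1.0):
    """Johnson-Cook flow stress without the thermal softening term
    (a simplifying assumption for this sketch)."""
    A, B, n, C = params
    return (A + B * strain**n) * (1.0 + C * np.log(strain_rate / ref_rate))

def weighted_residuals(params, datasets):
    """Stack residuals from several loading conditions, each scaled by a
    weight reflecting confidence in (or noise of) that data set."""
    res = []
    for d in datasets:
        pred = jc_stress(params, d["strain"], d["rate"])
        res.append(d["weight"] * (pred - d["stress"]))
    return np.concatenate(res)

# Hypothetical quasi-static and dynamic data sets (stress in MPa, rate in 1/s).
quasi = {"strain": np.linspace(0.02, 0.2, 10), "rate": 1e-3,
         "stress": np.linspace(150, 230, 10), "weight": 1.0}
dyn = {"strain": np.linspace(0.02, 0.2, 10), "rate": 1e3,
       "stress": np.linspace(180, 270, 10), "weight": 0.5}

fit = least_squares(weighted_residuals, x0=[100.0, 300.0, 0.3, 0.01],
                    args=([quasi, dyn],), bounds=([0, 0, 0, 0], np.inf))
print(fit.x)  # fitted A, B, n, C
```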

  9. Optimization of dual-energy xenon-computed tomography for quantitative assessment of regional pulmonary ventilation.

    PubMed

    Fuld, Matthew K; Halaweish, Ahmed F; Newell, John D; Krauss, Bernhard; Hoffman, Eric A

    2013-09-01

    Dual-energy x-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study, we sought to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon/oxygen gas mixtures (0%, 20%, 25%, 33%, 50%, 66%, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved 3-material decomposition calibration parameters. In addition, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Attenuation curves for xenon were obtained from the syringe test-objects and were used to develop improved 3-material decomposition parameters (Hounsfield unit enhancement per percentage xenon: within the chest phantom, 2.25 at 80 kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; in open air, 2.5 at 80 kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally nondependent portion of the airway tree test-object, while not affecting the quantitation of xenon in the 3-material decomposition DECT. The mixture 40% Xe/40% He/20% O2 provided good signal-to-noise ratio (SNR), greater than the Rose criterion (SNR > 5), while avoiding gravitational effects of similar concentrations of xenon in a 60% O2 mixture. Compared with 100/140 Sn kVp, 80/140 Sn kVp (Sn = tin filtered) provided improved SNR in a swine with an equivalent thoracic transverse density to a human subject with a body mass index of 33 kg/m. Airways were brighter in the 80/140 Sn kVp scan (80/140 Sn, 31.6%; 100/140 Sn, 25.1%) with considerably lower noise (80/140 Sn, coefficient of variation of 0.140; 100/140 Sn, coefficient of variation of 0.216). To provide a truly quantitative measure of regional lung function with xenon-DECT, the basic protocols and parameter calibrations need to be better understood and quantified. It is critically important to understand the fundamentals of new techniques to allow for proper implementation and interpretation of their results before widespread usage. With the use of an in-house derived xenon calibration curve for 3-material decomposition rather than the scanner supplied calibration and a xenon/helium/oxygen mixture, we demonstrate highly accurate quantitation of xenon gas volumes and avoid gravitational effects on gas distribution. This study provides a foundation for other researchers to use and test these methods with the goal of clinical translation.
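
    Using the in-chest calibration slopes quoted in the abstract (HU enhancement per percent xenon), a minimal sketch of converting a xenon-induced enhancement value to xenon concentration follows; the enhancement map itself is assumed to come from the scanner's 3-material decomposition, and the helper names are illustrative.

```python
import numpy as np

# HU enhancement per percent xenon inside the chest phantom (from the abstract);
# 140 refers to the tin-filtered 140 kVp beam.
SLOPE_HU_PER_PCT_XE = {80: 2.25, 100: 1.70, 140: 0.76}

def xenon_percent(enhancement_hu, kvp):
    """Convert xenon-induced HU enhancement to xenon concentration (%).

    enhancement_hu: array of HU enhancement attributable to xenon (e.g. the
    xenon map from 3-material decomposition); kvp: tube potential key.
    """
    return np.asarray(enhancement_hu, dtype=float) / SLOPE_HU_PER_PCT_XE[kvp]

# Example: 45 HU of xenon enhancement at 80 kVp corresponds to ~20% xenon.
print(xenon_percent([45.0], 80))  # [20.]
```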

  10. Impact of wall shear stress on initial bacterial adhesion in rotating annular reactor

    PubMed Central

    Saur, Thibaut; Morin, Emilie; Habouzit, Frédéric; Bernet, Nicolas

    2017-01-01

    The objective of this study was to investigate bacterial adhesion under different wall shear stresses in turbulent flow, using a diverse bacterial consortium. A better understanding of the mechanisms governing microbial adhesion can be useful in diverse domains such as industrial processes, medical fields, or environmental biotechnologies. The impact of wall shear stress—four values ranging from 0.09 to 7.3 Pa, applied on polypropylene (PP) and polyvinyl chloride (PVC)—was evaluated in rotating annular reactors in terms of morphological and microbiological structure. A diverse inoculum consisting of activated sludge was used. Epifluorescence microscopy was used to quantitatively and qualitatively characterize the adhesion. Attached bacterial communities were assessed by molecular fingerprinting profiles (CE-SSCP). It was demonstrated that wall shear stress had a strong impact on both quantitative and qualitative aspects of bacterial adhesion. ANOVA tests also demonstrated the significant impact of wall shear stress on all three tested morphological parameters (surface coverage, number of objects, and size of objects) (p-values < 2 × 10(-16)). High wall shear stresses increased the quantity of attached bacteria but also altered their spatial distribution on the substratum surface. As the shear increased, aggregates or clusters appeared and grew in size. Concerning the microbiological composition, the adhered bacterial communities changed gradually with the applied shear. PMID:28207869

  11. Application of shift-and-add algorithms for imaging objects within biological media

    NASA Astrophysics Data System (ADS)

    Aizert, Avishai; Moshe, Tomer; Abookasis, David

    2017-01-01

    The Shift-and-Add (SAA) technique is a simple mathematical operation developed to reconstruct, at high spatial resolution, atmospherically degraded solar images obtained from stellar speckle interferometry systems. This method shifts and assembles individual degraded short-exposure images into a single average image with significantly improved contrast and detail. Since the inhomogeneous refractive indices of biological tissue cause light scattering similar to that induced by optical turbulence in the atmospheric layers, we assume that SAA methods can be successfully implemented to reconstruct the image of an object within a scattering biological medium. To test this hypothesis, five SAA algorithms were evaluated for reconstructing images acquired from multiple viewpoints. After successfully retrieving the hidden object's shape, quantitative image quality metrics were derived, enabling comparison of imaging error across a spectrum of layer thicknesses and demonstrating the relative efficacy of each SAA algorithm for biological imaging.
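
    A minimal brightest-pixel shift-and-add sketch is shown below, assuming a stack of short-exposure frames stored as a NumPy array; the five algorithm variants evaluated in the paper differ mainly in how the per-frame shift is estimated, which is not reproduced here.

```python
import numpy as np

def shift_and_add(frames):
    """Basic shift-and-add: shift each frame so its brightest pixel lands at
    the stack centre, then average the shifted frames.

    frames: (n_frames, H, W) array of short-exposure images.
    """
    n, h, w = frames.shape
    cy, cx = h // 2, w // 2
    acc = np.zeros((h, w), dtype=float)
    for frame in frames:
        # Estimate the shift from the brightest pixel of this frame.
        by, bx = np.unravel_index(np.argmax(frame), frame.shape)
        acc += np.roll(frame, shift=(cy - by, cx - bx), axis=(0, 1))
    return acc / n
```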

  12. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards.

    PubMed

    Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Haken, Randall Ten

    2017-01-01

    To develop statistical dose-volume histogram (DVH)-based metrics and a visualization method to quantify the comparison of treatment plans with historical experience and among different institutions. The descriptive statistical summary (ie, median, first and third quartiles, and 95% confidence intervals) of volume-normalized DVH curve sets of past experiences was visualized through the creation of statistical DVH plots. Detailed distribution parameters were calculated and stored in JavaScript Object Notation files to facilitate management, including transfer and potential multi-institutional comparisons. In the treatment plan evaluation, structure DVH curves were scored against computed statistical DVHs and weighted experience scores (WESs). Individual, clinically used, DVH-based metrics were integrated into a generalized evaluation metric (GEM) as a priority-weighted sum of normalized incomplete gamma functions. Historical treatment plans for 351 patients with head and neck cancer, 104 with prostate cancer who were treated with conventional fractionation, and 94 with liver cancer who were treated with stereotactic body radiation therapy were analyzed to demonstrate the usage of statistical DVH, WES, and GEM in a plan evaluation. A shareable dashboard plugin was created to display statistical DVHs and integrate GEM and WES scores into a clinical plan evaluation within the treatment planning system. Benchmarking with normal tissue complication probability scores was carried out to compare the behavior of GEM and WES scores. DVH curves from historical treatment plans were characterized and presented, with difficult-to-spare structures (ie, frequently compromised organs at risk) identified. Quantitative evaluations by GEM and/or WES compared favorably with the normal tissue complication probability Lyman-Kutcher-Burman model, transforming a set of discrete threshold-priority limits into a continuous model reflecting physician objectives and historical experience. Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.
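
    A sketch of the statistical-DVH summary (quantile curves over a set of historical DVHs) together with a toy priority-weighted score built from regularized incomplete gamma functions is given below; the abstract does not specify the exact GEM parameterization, so the functional form, parameters, and names here are assumptions.

```python
import numpy as np
from scipy.special import gammainc  # regularized lower incomplete gamma

def statistical_dvh(dvh_curves, quantiles=(0.25, 0.5, 0.75)):
    """Summarize volume-normalized DVH curves sampled on a common dose grid.

    dvh_curves: (n_plans, n_dose_bins) array of fractional volumes.
    Returns one quantile curve per requested quantile (rows follow `quantiles`).
    """
    return np.quantile(dvh_curves, quantiles, axis=0)

def gem_score(metric_values, thresholds, priorities, shape=2.0):
    """Toy generalized evaluation metric: a priority-weighted sum of
    regularized incomplete gamma functions of metric/threshold ratios
    (an assumed form, not the authors' exact definition)."""
    ratios = np.asarray(metric_values, float) / np.asarray(thresholds, float)
    penalties = gammainc(shape, ratios)  # rises smoothly toward 1 as limits are exceeded
    weights = np.asarray(priorities, float)
    return float(np.sum(weights * penalties) / np.sum(weights))

# Example: two OAR metrics at 80% and 110% of their clinical limits.
print(gem_score([40.0, 22.0], thresholds=[50.0, 20.0], priorities=[2.0, 1.0]))
```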

  13. Tooth color measurement using Chroma Meter: techniques, advantages, and disadvantages.

    PubMed

    Li, Yiming

    2003-01-01

    Tooth whitening has become a popular and routine dental procedure, and its efficacy and safety have been well documented. However, the measurement of tooth color, particularly in the evaluation of the efficacy of a system intended to enhance tooth whiteness, remains a challenge. One of the instruments used for assessing tooth color in clinical whitening studies is the Minolta Chroma Meter CR-321 (Minolta Corporation USA, Ramsey, NJ, USA). This article describes the instrument and discusses various measuring procedures and the Chroma Meter's advantages, limitations, and disadvantages. The available information indicates that, although Minolta Chroma Meter CR-321 provides quantitative and objective measurements of tooth color, it can be tedious to use with a custom alignment device. The Chroma Meter data are inconsistent with the commonly used visual instruments such as Vitapan Classical Shade Guide (Vita Zahnfabrik, Bad Säckingen, Germany), although in many cases the general trends are similar. It is also questionable whether the small area measured adequately represents the color of the whole tooth. A more critical challenge is the lack of methods for interpreting the Chroma Meter data regarding tooth color change in studies evaluating the efficacy of whitening systems. Consequently, at present the Chroma Meter data alone do not appear to be adequate for determining tooth color change in whitening research, although the quantitative measurements may be useful as supplemental or supportive data. Research is needed to develop and improve the instrument and technique for quantitative measurement of tooth color and interpretation of the data for evaluating tooth color change. This paper will help readers to understand the advantages and limitations of the Minolta Chroma Meter used for evaluating the efficacy of tooth-whitening systems so that proper judgment can be made in the interpretation of the results of clinical studies.

  14. A Hospice Rotation for Military Medical Residents: A Mixed Methods, Multi-Perspective Program Evaluation

    PubMed Central

    Boyden, Jackelyn Y.; Kalish, Virginia B.; Muir, J. Cameron; Richardson, Suzanne; Connor, Stephen R.

    2016-01-01

    Abstract Background: An estimated 6,000 to 18,000 additional hospice and palliative medicine (HPM) physicians are needed in the United States. A source could be the military graduate medical education system where 15% of U.S. medical residents are trained. A community-based hospice and palliative care organization created a one-week rotation for military residents including participation in interdisciplinary group visits at patients' homes, facilities, and an inpatient hospice unit. Objective: Our goal was to evaluate the effectiveness of a one-week community HPM rotation for military medical residents. Methods: A mixed-methods, multi-stakeholder perspective program evaluation model was used for program years 2011 to 2013. Data were managed and analyzed using Microsoft Excel and Atlas.ti. Participants in the rotation were residents training at two local military hospitals. Program evaluation data were collected from residents, military program liaisons, and hospice clinical preceptors. Quantitative data included pre- and post-tests based on Accreditation Council for Graduate Medical Education competencies completed by residents. Qualitative data included resident essays and semi-structured interviews with hospice preceptors and military program liaisons. Results: Quantitative and qualitative data suggested that the rotation increased military residents' knowledge, attitudes, and comfort level with HPM. Quantitative analysis of test scores indicated improvements from pre- to post-tests in each of five areas of learning. Qualitative data indicated the rotation created a greater appreciation for the overall importance of HPM and increased understanding of eligibility and methods for pain and symptom management. Conclusions: A one-week community hospice rotation for medical military residents impacts participant's knowledge of and attitudes toward HPM. PMID:27139524

  15. Quantifying Muscle Asymmetries in Cervical Dystonia with Electrical Impedance: A Preliminary Assessment

    PubMed Central

    Lungu, Codrin; Tarulli, Andrew W; Tarsy, Daniel; Mongiovi, Phillip; Vanderhorst, Veronique G; Rutkove, Seward B

    2010-01-01

    Objective Cervical dystonia (CD) lacks an objective quantitative measure. Electrical impedance myography (EIM) is a non-invasive assessment method sensitive to changes in muscle structure and physiology. We evaluated the potential role of EIM in quantifying CD, hypothesizing that patients would demonstrate differences in the symmetry of muscle electrical resistance compared to controls, and that this asymmetry would decrease after botulinum neurotoxin (BoNT) treatment. Methods EIM was performed on the sternocleidomastoid (SCM) and cervical paraspinal (PS) muscles of CD patients and age-matched controls. Resistance at 50 kHz was analyzed, comparing side-to-side asymmetry in patients and controls, and, in patients, before and after BoNT treatment. Results Sixteen patients and 10 controls were included. Resistance asymmetry was on average 3 to 5 times higher in patients than in controls. Receiver operating characteristic analysis demonstrated 91% accuracy in discriminating CD from normal. From pre-treatment to maximum BoNT effect, asymmetry decreased from 20.8 (13.9-26.1)% to 6.2 (3.1-9.9)% (SCM), and from 16.0 (14.3-16.0)% to 8.4 (7.0-9.2)% (PS), p<0.05 (median, interquartile range). Conclusions EIM effectively differentiates normal subjects from CD patients by revealing asymmetries in resistance values and detects improvement in muscle symmetry after treatment. Significance These results suggest that EIM, a painless, non-invasive measure, can provide a useful quantitative metric in CD evaluation and deserves further study. PMID:20943436
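
    The abstract reports side-to-side resistance asymmetry in percent but does not give the formula; the one-function sketch below uses an assumed definition (absolute difference divided by the mean of the two sides).

```python
def resistance_asymmetry_pct(r_left, r_right):
    """Side-to-side asymmetry of 50 kHz resistance, in percent
    (assumed definition: |difference| / mean of the two sides)."""
    return 100.0 * abs(r_left - r_right) / ((r_left + r_right) / 2.0)

# Example: 55 ohm vs 45 ohm gives 20% asymmetry.
print(resistance_asymmetry_pct(55.0, 45.0))  # 20.0
```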

  16. Athena: Providing Insight into the History of the Universe

    NASA Technical Reports Server (NTRS)

    Murphy, Gloria A.

    2010-01-01

    The American Institute of Aeronautics and Astronautics has provided a Request for Proposal that calls for a manned mission to a Near-Earth Object. The goal of Team COLBERT is to respond to this request by providing a reusable system that can serve as a solid stepping stone for future manned trips to Mars and beyond. Although Team COLBERT consists only of Aerospace Engineering students, achieving this feat requires the team to employ Systems Engineering. Systems Engineering tools and processes provide quantitative and semi-quantitative means for making design decisions and evaluating items such as budgets and schedules. This paper provides an in-depth look at some of the Systems Engineering processes employed and steps through the design process of a Human Asteroid Exploration System.

  17. [Application and Integration of Qualitative and Quantitative Research Methods in Intervention Studies in Rehabilitation Research].

    PubMed

    Wirtz, M A; Strohmer, J

    2016-06-01

    In order to develop and evaluate interventions in rehabilitation research, a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to the research objects. Accordingly, qualitative methods offer particular benefits when researchers strive to identify and organize unknown information aspects (inductive purpose). Quantitative research methods, in contrast, require a high degree of standardization and transparency of the research process, and a clear definition of efficacy and effectiveness exists (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes, and quality criteria. Hence, specific aspects have to be considered if researchers aim to select or combine these approaches in order to ensure an optimal gain in knowledge. © Georg Thieme Verlag KG Stuttgart · New York.

  18. Evaluation of colonoscopy technical skill levels by use of an objective kinematic-based system.

    PubMed

    Obstein, Keith L; Patil, Vaibhav D; Jayender, Jagadeesan; San José Estépar, Raúl; Spofford, Inbar S; Lengyel, Balazs I; Vosburgh, Kirby G; Thompson, Christopher C

    2011-02-01

    Colonoscopy requires training and experience to ensure accuracy and safety. Currently, no objective, validated process exists to determine when an endoscopist has attained technical competence. Kinematics data describing movements of laparoscopic instruments have been used in surgical skill assessment to define expert surgical technique. We have developed a novel system to record kinematics data during colonoscopy and quantitatively assess colonoscopist performance. To use kinematic analysis of colonoscopy to quantitatively assess endoscopic technical performance. Prospective cohort study. Tertiary-care academic medical center. This study involved physicians who perform colonoscopy. Application of a kinematics data collection system to colonoscopy evaluation. Kinematics data, validated task load assessment instrument, and technical difficulty visual analog scale. All 13 participants completed the colonoscopy to the terminal ileum on the standard colon model. Attending physicians reached the terminal ileum more quickly than fellows (median time, 150.19 seconds vs 299.86 seconds; P<.01), with reduced path lengths for all 4 sensors, decreased flex (1.75 m vs 3.14 m; P=.03), smaller tip angulation, reduced absolute roll, and lower curvature of the endoscope. With the performance of attending physicians serving as the expert reference standard, the mean kinematic score increased by 19.89 for each decrease in postgraduate year (P<.01). Overall, fellows experienced greater mental, physical, and temporal demand than did attending physicians. Small cohort size. Kinematic data and score calculation appear useful in the evaluation of colonoscopy technical skill levels. The kinematic score appears to vary consistently by year of training. Because this assessment is nonsubjective, it may be an improvement over current methods for determination of competence. Ongoing studies are establishing benchmarks and characteristic profiles of skill groups based on kinematics data. Copyright © 2011 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.

  19. Evaluations of UltraiQ software for objective ultrasound image quality assessment using images from a commercial scanner.

    PubMed

    Long, Zaiyang; Tradup, Donald J; Stekel, Scott F; Gorny, Krzysztof R; Hangiandreou, Nicholas J

    2018-03-01

    We evaluated a commercially available software package that uses B-mode images to semi-automatically measure quantitative metrics of ultrasound image quality, such as contrast response, depth of penetration (DOP), and spatial resolution (lateral, axial, and elevational). Since measurement of elevational resolution is not a part of the software package, we achieved it by acquiring phantom images with transducers tilted at 45 degrees relative to the phantom. Each measurement was assessed in terms of measurement stability, sensitivity, repeatability, and semi-automated measurement success rate. All assessments were performed on a GE Logiq E9 ultrasound system with linear (9L or 11L), curved (C1-5), and sector (S1-5) transducers, using a CIRS model 040GSE phantom. In stability tests, the measurements of contrast, DOP, and spatial resolution remained within a ±10% variation threshold in 90%, 100%, and 69% of cases, respectively. In sensitivity tests, contrast, DOP, and spatial resolution measurements followed the expected behavior in 100%, 100%, and 72% of cases, respectively. In repeatability testing, intra- and inter-individual coefficients of variations were equal to or less than 3.2%, 1.3%, and 4.4% for contrast, DOP, and spatial resolution (lateral and axial), respectively. The coefficients of variation corresponding to the elevational resolution test were all within 9.5%. Overall, in our assessment, the evaluated package performed well for objective and quantitative assessment of the above-mentioned image qualities under well-controlled acquisition conditions. We are finding it to be useful for various clinical ultrasound applications including performance comparison between scanners from different vendors. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  20. Objective evaluation of chemotherapy-induced peripheral neuropathy using quantitative pain measurement system (Pain Vision®), a pilot study.

    PubMed

    Sato, Junya; Mori, Megumi; Nihei, Satoru; Takeuchi, Satoshi; Kashiwaba, Masahiro; Kudo, Kenzo

    2017-01-01

    In an evaluation of chemotherapy-induced peripheral neuropathy (CIPN), objectivity may be poor because the evaluation depends on the patient's subjective assessment. In such cases, management of neuropathy may be delayed and CIPN symptoms may become severe. In this pilot study, we attempted an objective evaluation of CIPN using a quantitative pain measurement system (Pain Vision®). The subjects were patients with gynecologic cancer who underwent chemotherapy with taxane and platinum drugs. The grade of the peripheral sensory nerve disorder was based on the Common Terminology Criteria for Adverse Events (CTC-AE) ver. 4.0 and was evaluated before the initiation of therapy and over up to six chemotherapy cycles. A symptom scale assessed by the patients using a peripheral neuropathy questionnaire (PNQ) was also evaluated. At the same time as these evaluations, a graded electric current was applied from the probe to a fingertip, and both the lowest perceptible current and the lowest current perceived as pain were measured with Pain Vision®. From these values, the pain degree was calculated with the following formula: (pain perception current value - lowest perceptible current value) ÷ lowest perceptible current value × 100. We compared the pain degrees measured by Pain Vision® at the time of CIPN development with the value obtained before chemotherapy initiation. Forty-one patients were enrolled. In the evaluation by a medical professional, 28 (64.3%) patients developed CIPN during 2.5 ± 1.1 chemotherapy cycles (mean ± standard deviation). The pain degree by Pain Vision® at grade 1 and 2 CIPN development according to the medical evaluation (CTC-AE) was significantly decreased compared to that before chemotherapy initiation (126.0 ± 114.5 vs. 69.8 ± 46.8, p = 0.001, and 126.0 ± 114.5 vs. 32.8 ± 32.6, p = 0.004). Changes in the pain degree by Pain Vision® were also found at scale B and scales C-D CIPN development in the patient evaluation (PNQ) (115.9 ± 112.4 vs. 70.6 ± 56.5, p = 0.005, and 115.9 ± 112.4 vs. 46.3 ± 42.9, p = 0.004). In the 13 patients in whom CIPN did not occur, no significant decrease in the pain degree by Pain Vision® was detected (p = 0.764). There was no discontinuation of the measurements because of adverse events such as discomfort from the electric current. The decrease in the pain degree measured by Pain Vision® was associated with the onset of CIPN symptoms. In particular, detection of CIPN by Pain Vision® was possible even though most of the CIPN that occurred was low grade or mildly symptomatic. Pain Vision® might become a noninvasive and convenient objective CIPN detection tool to supplement subjective CIPN evaluation. The institutional study approval number is H25-140. Registered December 17, 2013.
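
    The pain-degree formula quoted above can be restated compactly; the symbols I_pain (pain perception current) and I_min (lowest perceptible current) are shorthand introduced here, not notation from the source:

      \mathrm{Pain\ degree} = \frac{I_{\mathrm{pain}} - I_{\mathrm{min}}}{I_{\mathrm{min}}} \times 100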

  1. Quantitative evaluation of the requirements for the promotion as associate professor at German medical faculties.

    PubMed

    Sorg, Heiko; Knobloch, Karsten

    2012-01-01

    First quantitative evaluation of the requirements for promotion to associate professor (AP) at German medical faculties. Analysis of the AP regulations of German medical faculties according to a validated scoring system, which was adapted to this study. The overall score for the AP requirements at 35 German medical faculties was 13.5±0.6 of 20 possible points (95% confidence interval 12.2-14.7). More than 88% of the AP regulations demand sufficient performance in teaching and research with adequate scientific publication. Furthermore, 83% of the faculties expect an expert review of the candidate's performance. Conference presentations required as an assistant professor, as well as a reduction of the minimum time as an assistant professor, play only minor roles. The requirements for assistant professors to be appointed associate professor at German medical faculties are high, with only a small range. In detail, however, large heterogeneity still exists, which hinders equal opportunities and career possibilities. These data might be used for the ongoing objective discussion.

  2. Pollen preservation and Quaternary environmental history in the southeastern United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delcourt, P.A.; Delcourt, H.R.

    Reconstructions of Quaternary environmental history based upon modern pollen/vegetation/climate calibrations are more tenable if the factors responsible for variation in pollen assemblages are evaluated. Examination of the state of preservation of Quaternary palynomorphs provides quantitative data concerning the degree of information loss due to alteration of pollen assemblages by syndepositional and post-depositional deterioration. The percentage, concentration, and influx values for total indeterminable pollen are useful criteria in providing an objective and quantitative basis for evaluating the comparability of pollen spectra within and between sites. Supporting data concerning sediment particle-size distribution, organic matter content, and concentration, influx, and taxonomic composition of both determinable pollen and plant macrofossils aid in reconstructing past depositional environments. The potential is high for deterioration of pollen in sediments from the southeastern United States, although considerable variation is found in both kind and degree of deterioration between lacustrine and alluvial sites of different ages and in different latitudes. Modern analogs are a basis for late Quaternary environmental reconstructions when pollen deterioration has not significantly biased the information content of fossil pollen assemblages.

  3. Gait disorders in patients with fibromyalgia.

    PubMed

    Auvinet, Bernard; Bileckot, Richard; Alix, Anne-Sophie; Chaleil, Denis; Barrey, Eric

    2006-10-01

    The objective of this study was to compare gait in patients with fibromyalgia and in matched controls. Measurements must be obtained in patients with fibromyalgia, as the evaluation scales for this disorder are semi-quantitative. We used a patented gait analysis system (Locometrix Centaure Metrix, France) developed by the French National Institute for Agricultural Research. Relaxed walking was evaluated in 14 women (mean age 50+/-5 years; mean height 162+/-5 cm; and mean body weight 68+/-13 kg) meeting American College of Rheumatology criteria for fibromyalgia and in 14 controls matched on sex, age, height, and body weight. Gait during stable walking was severely altered in the patients. Walking speed was significantly diminished (P<0.001) as a result of reductions in stride length (P<0.001) and cycle frequency (P<0.001). The resulting bradykinesia (P<0.001) was the best factor for separating the two groups. Regularity was affected in the patients (P<0.01); this variable is interesting because it is independent of age and sex in healthy, active adults. Measuring the variables that characterize relaxed walking provides useful quantitative data in patients with fibromyalgia.

  4. Quantitative evaluation of palatal bone thickness for the placement of orthodontic miniscrews in adults with different facial types

    PubMed Central

    Wang, Yunji; Qiu, Ye; Liu, Henglang; He, Jinlong; Fan, Xiaoping

    2017-01-01

    Objectives: To quantitatively evaluate palatal bone thickness in adults with different facial types using cone beam computed tomography (CBCT). Methods: The CBCT volumetric data of 123 adults (mean age, 26.8 years) collected between August 2014 and August 2016 was retrospectively studied. The subjects were divided into a low-angle group (39 subjects), a normal-angle group (48 subjects) and a high-angle group (36 subjects) based on facial types assigned by cephalometric radiography. The thickness of the palatal bone was assessed at designated points. A repeated-measure analysis of variance (rm-ANOVA) test was used to test the relationship between facial types and palatal bone thickness. Results: Compared to the low-angle group, the high-angle group had significantly thinner palatal bones (p<0.05), except for the anterior-midline, anterior-medial and middle-midline areas. Conclusion: The safest zone for the placement of microimplants is the anterior part of the paramedian palate. Clinicians should pay special attention to the probability of thinner bone plates and the risk of perforation in high-angle patients. PMID:28917071

  5. Proposed Objective Odor Control Test Methodology for Waste Containment

    NASA Technical Reports Server (NTRS)

    Vos, Gordon

    2010-01-01

    The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentration quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented human odor detection threshold of 0.025 ppm, and a limit of quantitation of 15 ppb.

  6. Quantitative evaluation of skeletal muscle defects in second harmonic generation images.

    PubMed

    Liu, Wenhua; Raben, Nina; Ralston, Evelyn

    2013-02-01

    Skeletal muscle pathologies cause irregularities in the normally periodic organization of the myofibrils. Objective grading of muscle morphology is necessary to assess muscle health, compare biopsies, and evaluate treatments and the evolution of disease. To facilitate such quantitation, we have developed a fast, sensitive, automatic imaging analysis software. It detects major and minor morphological changes by combining texture features and Fourier transform (FT) techniques. We apply this tool to second harmonic generation (SHG) images of muscle fibers which visualize the repeating myosin bands. Texture features are then calculated by using a Haralick gray-level cooccurrence matrix in MATLAB. Two scores are retrieved from the texture correlation plot by using FT and curve-fitting methods. The sensitivity of the technique was tested on SHG images of human adult and infant muscle biopsies and of mouse muscle samples. The scores are strongly correlated to muscle fiber condition. We named the software MARS (muscle assessment and rating scores). It is executed automatically and is highly sensitive even to subtle defects. We propose MARS as a powerful and unbiased tool to assess muscle health.
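
    The abstract outlines the MARS pipeline: a Haralick texture-correlation profile computed from the gray-level co-occurrence matrix, followed by a Fourier transform of that profile. The sketch below is only a rough, independent illustration of that idea in Python, not the authors' MATLAB code; the use of scikit-image (>= 0.19, for graycomatrix/graycoprops), the offset range, and the scoring rule are assumptions.

      # Illustrative sketch (not the authors' MARS code): texture-correlation
      # profile of an SHG muscle image, then a Fourier-based regularity score.
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      def texture_correlation_profile(image_8bit, max_offset=40):
          """Haralick correlation vs. pixel offset along the horizontal axis."""
          offsets = np.arange(1, max_offset + 1)
          profile = []
          for d in offsets:
              glcm = graycomatrix(image_8bit, distances=[d], angles=[0.0],
                                  levels=256, symmetric=True, normed=True)
              profile.append(graycoprops(glcm, 'correlation')[0, 0])
          return offsets, np.asarray(profile)

      def periodicity_score(profile):
          """Crude regularity score: relative power of the dominant non-DC peak
          in the Fourier spectrum of the correlation profile."""
          spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
          return spectrum[1:].max() / (spectrum[1:].sum() + 1e-12)

      # Example on a synthetic striped image standing in for an SHG fiber image.
      demo_row = (127 * (1 + np.sin(np.arange(256) * 2 * np.pi / 16))).astype(np.uint8)
      demo_image = np.tile(demo_row, (64, 1))
      offsets, profile = texture_correlation_profile(demo_image)
      print(f"regularity score: {periodicity_score(profile):.2f}")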

  7. Quantitative evaluation of skeletal muscle defects in second harmonic generation images

    NASA Astrophysics Data System (ADS)

    Liu, Wenhua; Raben, Nina; Ralston, Evelyn

    2013-02-01

    Skeletal muscle pathologies cause irregularities in the normally periodic organization of the myofibrils. Objective grading of muscle morphology is necessary to assess muscle health, compare biopsies, and evaluate treatments and the evolution of disease. To facilitate such quantitation, we have developed a fast, sensitive, automatic imaging analysis software. It detects major and minor morphological changes by combining texture features and Fourier transform (FT) techniques. We apply this tool to second harmonic generation (SHG) images of muscle fibers which visualize the repeating myosin bands. Texture features are then calculated by using a Haralick gray-level cooccurrence matrix in MATLAB. Two scores are retrieved from the texture correlation plot by using FT and curve-fitting methods. The sensitivity of the technique was tested on SHG images of human adult and infant muscle biopsies and of mouse muscle samples. The scores are strongly correlated to muscle fiber condition. We named the software MARS (muscle assessment and rating scores). It is executed automatically and is highly sensitive even to subtle defects. We propose MARS as a powerful and unbiased tool to assess muscle health.

  8. Derivation and evaluation of a labeled hedonic scale.

    PubMed

    Lim, Juyun; Wood, Alison; Green, Barry G

    2009-11-01

    The objective of this study was to develop a semantically labeled hedonic scale (LHS) that would yield ratio-level data on the magnitude of liking/disliking of sensation equivalent to that produced by magnitude estimation (ME). The LHS was constructed by having 49 subjects who were trained in ME rate the semantic magnitudes of 10 common hedonic descriptors within a broad context of imagined hedonic experiences that included tastes and flavors. The resulting bipolar scale is statistically symmetrical around neutral and has a unique semantic structure. The LHS was evaluated quantitatively by comparing it with ME and the 9-point hedonic scale. The LHS yielded nearly identical ratings to those obtained using ME, which implies that its semantic labels are valid and that it produces ratio-level data equivalent to ME. Analyses of variance conducted on the hedonic ratings from the LHS and the 9-point scale gave similar results, but the LHS showed much greater resistance to ceiling effects and yielded normally distributed data, whereas the 9-point scale did not. These results indicate that the LHS has significant semantic, quantitative, and statistical advantages over the 9-point hedonic scale.

  9. Assessing Microneurosurgical Skill with Medico-Engineering Technology.

    PubMed

    Harada, Kanako; Morita, Akio; Minakawa, Yoshiaki; Baek, Young Min; Sora, Shigeo; Sugita, Naohiko; Kimura, Toshikazu; Tanikawa, Rokuya; Ishikawa, Tatsuya; Mitsuishi, Mamoru

    2015-10-01

    Most methods currently used to assess surgical skill are rather subjective or not adequate for microneurosurgery. Objective and quantitative microneurosurgical skill assessment systems that are capable of accurate measurements are necessary for the further development of microneurosurgery. Infrared optical motion tracking markers, an inertial measurement unit, and strain gauges were mounted on tweezers to measure many parameters related to instrument manipulation. We then recorded the activity of 23 neurosurgeons. The task completion time, tool path, and needle-gripping force were evaluated for three stitches made in an anastomosis of 0.7-mm artificial blood vessels. Videos of the activity were evaluated by three blinded expert surgeons. Surgeons who had recently performed many bypass procedures demonstrated better skills. These skilled surgeons performed the anastomosis in a shorter time, with a shorter tool path, and with less force when extracting the needle. These results show the potential contribution of the system to microsurgical skill assessment. Quantitative and detailed analysis of surgical tasks helps surgeons better understand the key features of the required skills. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Assessing locomotor skills development in childhood using wearable inertial sensor devices: the running paradigm.

    PubMed

    Masci, Ilaria; Vannozzi, Giuseppe; Bergamini, Elena; Pesce, Caterina; Getchell, Nancy; Cappozzo, Aurelio

    2013-04-01

    Objective quantitative evaluation of motor skill development is of increasing importance for carefully guiding physical exercise programs in childhood. Running is a fundamental motor skill that humans adopt to accomplish locomotion and is linked to physical activity levels, although its assessment is traditionally carried out using qualitative evaluation tests. The present study aimed at investigating the feasibility of using inertial sensors to quantify developmental differences in the running pattern of young children. Qualitative and quantitative assessment tools were adopted to identify a skill-sensitive set of biomechanical parameters for running and to further our understanding of the factors that determine progression to skilled running performance. Running performances of 54 children between the ages of 2 and 12 years were submitted to both qualitative and quantitative analysis, the former using sequences of developmental level, the latter estimating temporal and kinematic parameters from inertial sensor measurements. Discriminant analysis with running developmental level as the dependent variable identified the set of temporal and kinematic parameters, among those obtained with the sensor, that best classified children into the qualitative developmental levels (accuracy higher than 67%). Multivariate analysis of variance with the quantitative parameters as dependent variables made it possible to identify whether, and which, specific parameters or parameter subsets were differentially sensitive to specific transitions between contiguous developmental levels. The findings showed that different sets of temporal and kinematic parameters are able to tap all steps of the transitional process in running skill described through qualitative observation and can prospectively be used for applied diagnostic and sport training purposes. Copyright © 2012 Elsevier B.V. All rights reserved.
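
    As a rough illustration of the classification step described above (not the authors' analysis; the data shown are placeholders), a linear discriminant analysis can map sensor-derived temporal and kinematic parameters onto qualitative developmental levels:

      # Minimal sketch: classify children into developmental levels from
      # inertial-sensor parameters. X and y are placeholders, not study data.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(54, 6))                   # placeholder: 54 children, 6 sensor parameters
      y = np.repeat(np.arange(4), [14, 14, 13, 13])  # placeholder developmental levels 0-3

      lda = LinearDiscriminantAnalysis()
      accuracy = cross_val_score(lda, X, y, cv=5).mean()  # cross-validated classification accuracy
      print(f"cross-validated accuracy: {accuracy:.2f}")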

  11. Robust human machine interface based on head movements applied to assistive robotics.

    PubMed

    Perez, Elisa; López, Natalia; Orosco, Eugenio; Soria, Carlos; Mut, Vicente; Freire-Bastos, Teodiano

    2013-01-01

    This paper presents an interface that uses two different sensing techniques and combines both results through a fusion process to obtain a minimum-variance estimate of the orientation of the user's head. The sensing techniques of the interface are based on an inertial sensor and artificial vision. The orientation of the user's head is used to steer the navigation of a robotic wheelchair. A control algorithm for the assistive technology system is also presented. The system was evaluated by four individuals with severe motor disabilities, and a quantitative index was developed in order to objectively evaluate its performance. The results obtained are promising, since most users could perform the proposed tasks with the robotic wheelchair.
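
    The abstract does not give the fusion equations; a standard inverse-variance (minimum-variance) combination of the two orientation estimates, stated here only as an assumed illustration, is

      \hat{\theta} = \frac{\sigma_v^2\,\theta_i + \sigma_i^2\,\theta_v}{\sigma_i^2 + \sigma_v^2}, \qquad \sigma_{\hat{\theta}}^2 = \frac{\sigma_i^2\,\sigma_v^2}{\sigma_i^2 + \sigma_v^2}

    where θ_i and θ_v are the head-orientation estimates from the inertial sensor and the vision system, and σ_i² and σ_v² are their variances.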

  12. Robust Human Machine Interface Based on Head Movements Applied to Assistive Robotics

    PubMed Central

    Perez, Elisa; López, Natalia; Orosco, Eugenio; Soria, Carlos; Mut, Vicente; Freire-Bastos, Teodiano

    2013-01-01

    This paper presents an interface that uses two different sensing techniques and combines both results through a fusion process to obtain a minimum-variance estimate of the orientation of the user's head. The sensing techniques of the interface are based on an inertial sensor and artificial vision. The orientation of the user's head is used to steer the navigation of a robotic wheelchair. A control algorithm for the assistive technology system is also presented. The system was evaluated by four individuals with severe motor disabilities, and a quantitative index was developed in order to objectively evaluate its performance. The results obtained are promising, since most users could perform the proposed tasks with the robotic wheelchair. PMID:24453877

  13. Measurement of erythema and tanning responses in human skin using a tri-stimulus colorimeter.

    PubMed

    Seitz, J C; Whitmore, C G

    1988-01-01

    A 'Minolta Tri-Stimulus Colorimeter II' was evaluated for obtaining objective measurements of early changes in erythema and tanning. The meter showed a subtle, continuous transition between the primary erythematous response and the delayed tanning of skin which was below the visual threshold for detection. Thereafter, the a* (redness) value of the meter showed a significant linear correlation with the dermatologist's perception of erythema while the b* (yellow) value showed a significant correlation with the perception of tanning. This capability of the tri-stimulus colorimeter to simultaneously evaluate the hue and saturation of skin color affords an improved opportunity to quantitate the transition from erythema to tanning without subjective bias.

  14. Assessment of the clinical relevance of quantitative sensory testing with Von Frey monofilaments in patients with allodynia and neuropathic pain. A pilot study.

    PubMed

    Keizer, D; van Wijhe, M; Post, W J; Uges, D R A; Wierda, J M K H

    2007-08-01

    Allodynia is a common and disabling symptom in many patients with neuropathic pain. Whereas quantification of pain mostly depends on subjective pain reports, allodynia can also be measured objectively with quantitative sensory testing. In this pilot study, we investigated the clinical relevance of quantitative sensory testing with Von Frey monofilaments in patients with allodynia as a consequence of a neuropathic pain syndrome, by means of correlating subjective pain scores with pain thresholds obtained with quantitative sensory testing. During a 4-week trial, we administered a cannabis extract to 17 patients with allodynia. We quantified the severity of the allodynia with Von Frey monofilaments before, during and after the patients finished the trial. We also asked the patients to rate their pain on a numeric rating scale at these three moments. We found that most of the effect of the cannabis occurred in the last 2 weeks of the trial. In this phase, we observed that the pain thresholds, as measured with Von Frey monofilaments, were inversely correlated with a decrease of the perceived pain intensity. These preliminary findings indicate clinical relevance of quantitative sensory testing with Von Frey monofilaments in the quantification of allodynia in patients with neuropathic pain, although confirmation of our data is still required in further studies to position this method of quantitative sensory testing as a valuable tool, for example, in the evaluation of therapeutic interventions for neuropathic pain.

  15. Process evaluation in a multisite, primary obesity-prevention trial in American Indian schoolchildren.

    PubMed

    Helitzer, D L; Davis, S M; Gittelsohn, J; Going, S B; Murray, D M; Snyder, P; Steckler, A B

    1999-04-01

    We describe the development, implementation, and use of the process evaluation component of a multisite, primary obesity prevention trial for American Indian schoolchildren. We describe the development and pilot testing of the instruments, provide some examples of the criteria for instrument selection, and provide examples of how process evaluation results were used to document and refine intervention components. The theoretical and applied framework of the process evaluation was based on diffusion theory, social learning theory, and the desire for triangulation of multiple modes of data collection. The primary objectives of the process evaluation were to systematically document the training process, content, and implementation of 4 components of the intervention. The process evaluation was developed and implemented collaboratively so that it met the needs of both the evaluators and those who would be implementing the intervention components. Process evaluation results revealed that observation and structured interviews provided the most informative data; however, these methods were the most expensive and time consuming and required the highest level of skill to undertake. Although the literature is full of idealism regarding the uses of process evaluation for formative and summative purposes, in reality, many persons are sensitive to having their work evaluated in such an in-depth, context-based manner as is described. For this reason, use of structured, quantitative, highly objective tools may be more effective than qualitative methods, which appear to be more dependent on the skills and biases of the researcher and the context in which they are used.

  16. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques

    PubMed Central

    Rosebrock, Adrian; Caban, Jesus J.; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2014-01-01

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to describe and classify TDLUs more objectively. To validate the accuracy of our system, we computed the morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared the results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute’s Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant. PMID:25722829
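
    The reported exponential relationship between acini count and TDLU diameter can be illustrated with a simple curve fit; this is a sketch on placeholder data, not the study's analysis.

      # Minimal sketch: fit acini count as an exponential function of TDLU
      # diameter to illustrate the reported trend (placeholder data only).
      import numpy as np
      from scipy.optimize import curve_fit

      def exponential(d, a, b):
          return a * np.exp(b * d)

      diameter_um = np.array([150, 200, 300, 400, 500, 700], dtype=float)  # placeholder
      acini_count = np.array([5, 8, 15, 30, 55, 140], dtype=float)         # placeholder

      params, _ = curve_fit(exponential, diameter_um, acini_count, p0=(1.0, 0.005))
      a, b = params
      print(f"fit: acini ~ {a:.2f} * exp({b:.4f} * diameter)")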

  17. A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images

    PubMed Central

    Robb, R.A.; Heffeman, P.B.; Camp, J.J.; Hanson, D.P.

    1986-01-01

    The capability to extract objective and quantitatively accurate information from 3-D radiographic biomedical images has not kept pace with the capabilities to produce the images themselves. This is rather an ironic paradox, since on the one hand the new 3-D and 4-D imaging capabilities promise significant potential for providing greater specificity and sensitivity (i.e., precise objective discrimination and accurate quantitative measurement of body tissue characteristics and function) in clinical diagnostic and basic investigative imaging procedures than ever possible before, but on the other hand, the momentous advances in computer and associated electronic imaging technology which have made these 3-D imaging capabilities possible have not been concomitantly developed for full exploitation of these capabilities. Therefore, we have developed a powerful new microcomputer-based system which permits detailed investigations and evaluation of 3-D and 4-D (dynamic 3-D) biomedical images. The system comprises a special workstation to which all the information in a large 3-D image data base is accessible for rapid display, manipulation, and measurement. The system provides important capabilities for simultaneously representing and analyzing both structural and functional data and their relationships in various organs of the body. This paper provides a detailed description of this system, as well as some of the rationale, background, theoretical concepts, and practical considerations related to system implementation.

  18. Content Validity of National Post Marriage Educational Program Using Mixed Methods

    PubMed Central

    MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali

    2015-01-01

    Background: Although the validity of the content of a program is mostly assessed with qualitative methods, this study used both qualitative and quantitative methods to validate the content of a post-marriage training program provided for newly married couples. Content validity is a preliminary step toward obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-methods content validation study was carried out in four steps, with three expert panels. Altogether 24 expert panelists were involved in the three qualitative and quantitative panels: 6 in the first, item-development panel; 12 in the reduction panel, 4 of whom were shared with the first panel; and 10 executive experts in the last panel, organized to evaluate the psychometric properties (CVR and CVI) and face validity of 57 educational objectives. Results: The raw data of the post-marriage program had been written by professional experts of the Ministry of Health; using the qualitative expert panel, the content was further developed by generating 3 topics and refining one topic and its respective content. In the second panel, a total of six other objectives were deleted, three for falling below the agreement cut-off point and three by experts' consensus. The validity of all items was above 0.8, and their content validity indices (0.8–1) were completely appropriate in the quantitative assessment. Conclusion: This study provides good evidence for the validation and accreditation of the national post-marriage program planned for newly married couples in the country's health centers in the near future. PMID:26056672
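
    The abstract cites CVR and CVI without giving formulas; the conventional Lawshe content validity ratio and item-level content validity index, which may differ in detail from what the authors used, are

      \mathrm{CVR} = \frac{n_e - N/2}{N/2}, \qquad \mathrm{I\text{-}CVI} = \frac{n_{3,4}}{N}

    where N is the number of panelists, n_e the number rating an item as essential, and n_{3,4} the number rating its relevance 3 or 4 on a 4-point scale.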

  19. Initial description of a quantitative, cross-species (chimpanzee-human) social responsiveness measure

    PubMed Central

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve; Constantino, John; Povinelli, Daniel; Pruett, John R.

    2011-01-01

    Objective Comparative studies of social responsiveness, an ability that is impaired in autistic spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species (human-chimpanzee) social responsiveness measure. Method We translated the Social Responsiveness Scale (SRS), an instrument that quantifies human social responsiveness, into an analogous instrument for chimpanzees. We then retranslated this "Chimp SRS" into a human "Cross-Species SRS" (XSRS). We evaluated three groups of chimpanzees (n=29) with the Chimp SRS and typical and autistic spectrum disorder (ASD) human children (n=20) with the XSRS. Results The Chimp SRS demonstrated strong inter-rater reliability at the three sites (ranges for individual ICCs: .534–.866 and mean ICCs: .851–.970). As has been observed in humans, exploratory principal components analysis of Chimp SRS scores supports a single factor underlying chimpanzee social responsiveness. Human subjects' XSRS scores were fully concordant with their SRS scores (r=.976, p=.001) and distinguished appropriately between typical and ASD subjects. One chimpanzee known for inappropriate social behavior displayed a significantly higher score than all other chimpanzees at its site, demonstrating the scale's ability to detect impaired social responsiveness in chimpanzees. Conclusion Our initial cross-species social responsiveness scale proved reliable and discriminated differences in social responsiveness across (in a relative sense) and within (in a more objectively quantifiable manner) humans and chimpanzees. PMID:21515200

  20. Current use of impact models for agri-environment schemes and potential for improvements of policy design and assessment.

    PubMed

    Primdahl, Jørgen; Vesterager, Jens Peter; Finn, John A; Vlahos, George; Kristensen, Lone; Vejre, Henrik

    2010-06-01

    Agri-Environment Schemes (AES) to maintain or promote environmentally-friendly farming practices were implemented on about 25% of all agricultural land in the EU by 2002. This article analyses and discusses the actual and potential use of impact models in supporting the design, implementation and evaluation of AES. Impact models identify and establish the causal relationships between policy objectives and policy outcomes. We review and discuss the role of impact models at different stages in the AES policy process, and present results from a survey of impact models underlying 60 agri-environmental schemes in seven EU member states. We distinguished among three categories of impact models (quantitative, qualitative or common sense), depending on the degree of evidence in the formal scheme description, additional documents, or key person interviews. The categories of impact models used mainly depended on whether scheme objectives were related to natural resources, biodiversity or landscape. A higher proportion of schemes dealing with natural resources (primarily water) were based on quantitative impact models, compared to those concerned with biodiversity or landscape. Schemes explicitly targeted either on particular parts of individual farms or specific areas tended to be based more on quantitative impact models compared to whole-farm schemes and broad, horizontal schemes. We conclude that increased and better use of impact models has significant potential to improve efficiency and effectiveness of AES. (c) 2009 Elsevier Ltd. All rights reserved.

  1. [Clinical research IV. Relevancy of the statistical test chosen].

    PubMed

    Talavera, Juan O; Rivas-Ruiz, Rodolfo

    2011-01-01

    When we look at the difference between two therapies or the association of a risk factor or prognostic indicator with its outcome, we need to evaluate the accuracy of the result. This assessment is based on a judgment that uses information about the study design and the statistical management of the information. This paper specifically addresses the relevance of the statistical test selected. Statistical tests are chosen mainly on the basis of two characteristics: the objective of the study and the type of variables. The objective can be divided into three groups of tests: a) those in which you want to show differences between groups, or within a group before and after a maneuver; b) those that seek to show the relationship (correlation) between variables; and c) those that aim to predict an outcome. The types of variables are divided in two: quantitative (continuous and discontinuous) and qualitative (ordinal and dichotomous). For example, if we seek to demonstrate differences in age (a quantitative variable) among patients with systemic lupus erythematosus (SLE) with and without neurological disease (two groups), the appropriate test is the "Student t test for independent samples." But if the comparison concerns the frequency of females (a binomial variable), then the appropriate statistical test is the χ² test.
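
    The two worked examples in the abstract translate directly into standard library calls; the sketch below uses SciPy with made-up data, purely to illustrate the mapping from variable type to test.

      import numpy as np
      from scipy import stats

      # Quantitative variable (age) compared between two independent groups:
      # Student t test for independent samples (placeholder data).
      age_with_neuro = np.array([34, 41, 29, 38, 45, 33], dtype=float)
      age_without_neuro = np.array([31, 36, 27, 30, 39, 35], dtype=float)
      t_stat, p_age = stats.ttest_ind(age_with_neuro, age_without_neuro)

      # Dichotomous variable (sex) compared between the same two groups:
      # chi-squared test on a 2x2 contingency table (placeholder counts).
      sex_table = np.array([[18, 7],
                            [22, 13]])
      chi2, p_sex, dof, expected = stats.chi2_contingency(sex_table)

      print(f"Student t test p = {p_age:.3f}; chi-squared p = {p_sex:.3f}")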

  2. Qualitative analysis of student beliefs and attitudes after an objective structured clinical evaluation: implications for affective domain learning in undergraduate nursing education.

    PubMed

    Cazzell, Mary; Rodriguez, Amber

    2011-12-01

    This qualitative study explored the feelings, beliefs, and attitudes of senior-level undergraduate pediatric nursing students upon completion of a medication administration Objective Structured Clinical Evaluation (OSCE). The affective domain is the most neglected domain in higher education, although it is deemed the "gateway to learning." Quantitative assessments of clinical skills performed during OSCEs usually address two of the three domains of learning: cognitive (knowledge) and psychomotor skills. Twenty students volunteered to participate in focus groups (10 per group) and were asked three questions relevant to their feelings, beliefs, and attitudes about their OSCE experiences. Students integrated the attitude of safety first into future practice but felt that anxiety, loss of control, reaction under pressure, and no feedback affected their ability to connect the OSCE performance with future clinical practice. The findings affect future affective domain considerations in the development, modification, and assessment of OSCEs across the undergraduate nursing curriculum.

  3. Returning to Work after Cancer: Quantitative Studies and Prototypical Narratives

    PubMed Central

    Steiner, John F.; Nowels, Carolyn T.; Main, Deborah S.

    2009-01-01

    Objective A combination of quantitative data and illustrative narratives may allow cancer survivorship researchers to disseminate their research findings more broadly. We identified recent, methodologically rigorous quantitative studies on return to work after cancer, summarized the themes from these studies, and illustrated those themes with narratives of individual cancer survivors. Methods We reviewed English-language studies of return to work for adult cancer survivors through June, 2008, and identified 13 general themes from papers that met methodological criteria (population-based sampling, prospective and longitudinal assessment, detailed assessment of work, evaluation of economic impact, assessment of moderators of work return, and large sample size). We drew survivorship narratives from a prior qualitative research study to illustrate these themes. Results Nine quantitative studies met 4 or more of our 6 methodological criteria. These studies suggested that most cancer survivors could return to work without residual disabilities. Cancer site, clinical prognosis, treatment modalities, socioeconomic status, and attributes of the job itself influenced the likelihood of work return. Three narratives - a typical survivor who returned to work after treatment, an individual unable to return to work, and an inspiring survivor who returned to work despite substantial barriers - illustrated many of the themes from the quantitative literature while providing additional contextual details. Conclusion Illustrative narratives can complement the findings of cancer survivorship research if researchers are rigorous and transparent in the selection, analysis, and retelling of those stories. PMID:19507264

  4. TECHNOLOGICAL INNOVATION IN NEUROSURGERY: A QUANTITATIVE STUDY

    PubMed Central

    Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar

    2015-01-01

    Object Technological innovation within healthcare may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technologically intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical technique. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation respectively. Methods A patent database was searched between 1960 and 2010 using the search terms “neurosurgeon” OR “neurosurgical” OR “neurosurgery”. The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top performing technology cluster was then selected as an exemplar for more detailed analysis of individual patents. Results In all, 11,672 patents and 208,203 publications relating to neurosurgery were identified. The top performing technology clusters over the 50 years were: image guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes and endoscopes. Image guidance and neuromodulation devices demonstrated a highly correlated rapid rise in patents and publications, suggesting they are areas of technology expansion. In-depth analysis of neuromodulation patents revealed that the majority of high performing patents were related to Deep Brain Stimulation (DBS). Conclusions Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery. PMID:25699414

  5. Quantitative assessment of background parenchymal enhancement in breast magnetic resonance images predicts the risk of breast cancer.

    PubMed

    Hu, Xiaoxin; Jiang, Luan; Li, Qiang; Gu, Yajia

    2017-02-07

    The objective of this study was to evaluate the association between the quantitative assessment of background parenchymal enhancement rate (BPER) and breast cancer. From 14,033 consecutive patients who underwent breast MRI in our center, we randomly selected 101 normal controls. We then selected 101 women with benign breast lesions and 101 women with breast cancer who were matched for age and menstruation status. We evaluated BPER at early (2 minutes), medium (4 minutes) and late (6 minutes) enhanced time phases of breast MRI for quantitative assessment. Odds ratios (ORs) for risk of breast cancer were calculated using the receiver operating characteristic curve. The BPER increased in a time-dependent manner after enhancement in both premenopausal and postmenopausal women. Premenopausal women had higher BPER than postmenopausal women at the early, medium and late enhanced phases. In the normal population, the OR for probability of breast cancer for premenopausal women with high BPER was 4.1 (95% CI: 1.7-9.7) and 4.6 (95% CI: 1.7-12.0) for postmenopausal women. The OR of breast cancer morbidity in premenopausal women with high BPER was 2.6 (95% CI: 1.1-6.4) and 2.8 (95% CI: 1.2-6.1) for postmenopausal women. The BPER was found to be a predictive factor of breast cancer morbidity. Different time phases should be used to assess BPER in premenopausal and postmenopausal women.
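
    For readers unfamiliar with the odds-ratio arithmetic behind such estimates, a minimal sketch with placeholder counts (not the study data) is:

      # Minimal sketch: odds ratio and 95% CI from a 2x2 table of
      # high/low BPER vs. cancer/control status (placeholder counts).
      import math

      a, b = 60, 41   # cancer: high BPER, low BPER (placeholder counts)
      c, d = 30, 71   # controls: high BPER, low BPER (placeholder counts)

      odds_ratio = (a * d) / (b * c)
      se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
      ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
      ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
      print(f"OR = {odds_ratio:.1f} (95% CI {ci_low:.1f}-{ci_high:.1f})")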

  6. Technical and clinical view on ambulatory assessment in Parkinson's disease.

    PubMed

    Hobert, M A; Maetzler, W; Aminian, K; Chiari, L

    2014-09-01

    With the progress of technology in recent years, methods have become available that use wearable sensors and ambulatory systems to measure aspects of motor function, particularly axial motor function. As Parkinson's disease (PD) can be considered a model disorder for motor impairment, a significant number of studies have already been performed with these patients using such techniques. In general, motion sensors such as accelerometers and gyroscopes are used, in combination with lightweight electronics that do not interfere with normal human motion. A fundamental advantage in comparison with usual clinical assessment is that these sensors allow a more quantitative, objective, and reliable evaluation of symptoms; they also have significant advantages compared to in-lab technologies (e.g., optoelectronic motion capture) as they allow long-term monitoring under real-life conditions. In addition, based on recent findings, particularly from studies using functional imaging, we have learned that non-motor symptoms, specifically cognitive aspects, may be at least indirectly assessable. It is hypothesized that ambulatory quantitative assessment strategies will in the future allow users, clinicians, and scientists to gain more quantitative, unobtrusive, and everyday-relevant data from their clinical evaluations, and can also be designed as pervasive (everywhere) and intensive (anytime) tools for ambulatory assessment and even rehabilitation of motor and (partly) non-motor symptoms in PD. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Quantitative Estimation of Plasma Free Drug Fraction in Patients With Varying Degrees of Hepatic Impairment: A Methodological Evaluation.

    PubMed

    Li, Guo-Fu; Yu, Guo; Li, Yanfei; Zheng, Yi; Zheng, Qing-Shan; Derendorf, Hartmut

    2018-07-01

    Quantitative prediction of the unbound drug fraction (fu) is essential for scaling pharmacokinetics through physiologically based approaches. However, few attempts have been made to evaluate the projection of fu values under pathological conditions. The primary objective of this study was to predict fu values (n = 105) of 56 compounds, with or without information on the predominant binding protein, in patients with varying degrees of hepatic insufficiency by accounting for quantitative changes in the molar concentrations of either the major binding protein or albumin plus alpha-1-acid glycoprotein associated with differing levels of hepatic dysfunction. For the purpose of scaling, data on albumin and alpha-1-acid glycoprotein levels in response to differing degrees of hepatic impairment were systematically collected from 919 adult donors. The results of the present study demonstrate for the first time the feasibility of physiologically based scaling of fu in hepatic dysfunction, verified against experimentally measured data for a wide variety of compounds from individuals with varying degrees of hepatic insufficiency. Furthermore, the high level of predictive accuracy indicates that the inter-relation between the severity of hepatic impairment and these plasma protein levels is physiologically accurate. The present study enhances confidence in predicting fu in hepatic insufficiency, particularly for albumin-bound drugs. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
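
    A widely used relationship for this kind of physiologically based scaling, assuming linear (non-saturable) binding to a single major protein and not necessarily the exact equation used by the authors, is

      f_{u,\mathrm{HI}} = \left[\, 1 + \frac{1 - f_{u,\mathrm{healthy}}}{f_{u,\mathrm{healthy}}} \cdot \frac{[P]_{\mathrm{HI}}}{[P]_{\mathrm{healthy}}} \,\right]^{-1}

    where [P] is the molar concentration of the major binding protein (albumin or alpha-1-acid glycoprotein) and HI denotes a given degree of hepatic impairment.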

  8. Methods for collecting algal samples as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop a list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.

  9. Evaluation of disease progression in INCL by MR spectroscopy

    PubMed Central

    Baker, Eva H; Levin, Sondra W; Zhang, Zhongjian; Mukherjee, Anil B

    2015-01-01

    Objective Infantile neuronal ceroid lipofuscinosis (INCL) is a devastating neurodegenerative storage disease caused by palmitoyl-protein thioesterase-1 deficiency, which impairs degradation of palmitoylated proteins (constituents of ceroid) by lysosomal hydrolases. Consequent lysosomal ceroid accumulation leads to neuronal injury. As part of a pilot study to evaluate treatment benefits of cysteamine bitartrate and N-acetylcysteine, we quantitatively measured brain metabolite levels using magnetic resonance spectroscopy (MRS). Methods A subset of two patients from a larger treatment and follow-up study underwent serial quantitative single-voxel MRS examinations of five anatomical sites. Three echo times were acquired in order to estimate metabolite T2. Measured metabolite levels included correction for partial volume of cerebrospinal fluid. Comparison of INCL patients was made to a reference group composed of asymptomatic and minimally symptomatic Niemann-Pick disease type C patients. Results In INCL patients, N-acetylaspartate (NAA) was abnormally low at all locations upon initial measurement, and further declined throughout the follow-up period. In the cerebrum (affected early in the disease course), choline and myo-inositol were initially elevated and fell during the follow-up period, whereas in the cerebellum and brainstem (affected later), choline and myo-inositol were initially normal and rose subsequently. Interpretation Choline and myo-inositol levels in our patients are consistent with patterns of neuroinflammation observed in two INCL mouse models. Low, persistently declining NAA was expected based on the progressive, irreversible nature of the disease. Progression of metabolite levels in INCL has not been previously quantified; therefore the results of this study serve as a reference for quantitative evaluation of future therapeutic interventions. PMID:26339674

  10. Evaluation of Nosocomial Infection Control Programs in health services 1

    PubMed Central

    Menegueti, Mayra Gonçalves; Canini, Silvia Rita Marin da Silva; Bellissimo-Rodrigues, Fernando; Laus, Ana Maria

    2015-01-01

    OBJECTIVES: to evaluate the Nosocomial Infection Control Programs in hospital institutions regarding structure and process indicators. METHOD: this is a descriptive, exploratory and quantitative study conducted in 2013. The study population comprised 13 Nosocomial Infection Control Programs of health services in a Brazilian city of the state of São Paulo. Public domain instruments available in the Manual of Evaluation Indicators of Nosocomial Infection Control Practices were used. RESULTS: The indicators with the highest average compliance were "Evaluation of the Structure of the Nosocomial Infection Control Programs" (75%) and "Evaluation of the Epidemiological Surveillance System of Nosocomial Infection" (82%) and those with the lowest mean compliance scores were "Evaluation of Operational Guidelines" (58.97%) and "Evaluation of Activities of Control and Prevention of Nosocomial Infection" (60.29%). CONCLUSION: The use of indicators identified that, despite having produced knowledge about prevention and control of nosocomial infections, there is still a large gap between the practice and the recommendations. PMID:25806637

  11. Nuclear medicine and imaging research (quantitative studies in radiopharmaceutical science)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, M.D.; Beck, R.N.

    1990-09-01

    This is a report of progress in Year Two (January 1, 1990 - December 31, 1990) of Grant FG02-86ER60438, "Quantitative Studies in Radiopharmaceutical Science," awarded for the three-year period January 1, 1989 - December 31, 1991 as a competitive renewal following a site visit in the fall of 1988. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 25 refs., 13 figs., 1 tab.

  12. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of a correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of the available approaches allowing a quantitative analysis of the human movement control system's performance, the clinical assessment and diagnostic approach to fall risk assessment still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits characterizing these scales, i.e., limited granularity and limited inter-/intra-examiner reliability, in order to obtain objective scores and more detailed information that allows fall risk to be predicted. We used Microsoft Kinect to record subjects' movements while performing challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained a good accuracy (~82%) and especially a high sensitivity (~83%).
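
    A minimal sketch of the final classification step described above, assuming hypothetical Kinect-derived movement parameters and dichotomized clinical scores; the feature definitions and the scikit-learn classifier choice are illustrative, not the authors' implementation.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical feature matrix: one row per recorded exercise, columns are
# movement parameters extracted from Kinect skeleton data (e.g. sway range,
# mean trunk velocity, completion time). Labels are dichotomized clinical scores.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))            # placeholder features
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=120) > 0).astype(int)

# Standardize features and train a support-vector classifier, evaluated with
# cross-validation, mirroring the supervised-classification step in the abstract.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```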

  13. Quantitative and Isolated Measurement of Far-Field Light Scattering by a Single Nanostructure

    NASA Astrophysics Data System (ADS)

    Kim, Donghyeong; Jeong, Kwang-Yong; Kim, Jinhyung; Ee, Ho-Seok; Kang, Ju-Hyung; Park, Hong-Gyu; Seo, Min-Kyo

    2017-11-01

    Light scattering by nanostructures has facilitated research on various optical phenomena and applications by interfacing the near fields and free-propagating radiation. However, direct quantitative measurement of far-field scattering by a single nanostructure on the wavelength scale or less is highly challenging. Conventional back-focal-plane imaging covers only a limited solid angle determined by the numerical aperture of the objectives and suffers from optical aberration and distortion. Here, we present a quantitative measurement of the differential far-field scattering cross section of a single nanostructure over the full hemisphere. In goniometer-based far-field scanning with a high signal-to-noise ratio of approximately 27.4 dB, weak scattering signals are efficiently isolated and detected under total-internal-reflection illumination. Systematic measurements reveal that the total and differential scattering cross sections of a Au nanorod are determined by the plasmonic Fabry-Perot resonances and the phase-matching conditions to the free-propagating radiation, respectively. We believe that our angle-resolved far-field measurement scheme provides a way to investigate and evaluate the physical properties and performance of nano-optical materials and phenomena.

  14. Quantitative phase retrieval with arbitrary pupil and illumination

    DOE PAGES

    Claus, Rene A.; Naulleau, Patrick P.; Neureuther, Andrew R.; ...

    2015-10-02

    We present a general algorithm for combining measurements taken under various illumination and imaging conditions to quantitatively extract the amplitude and phase of an object wave. The algorithm uses the weak object transfer function, which incorporates arbitrary pupil functions and partially coherent illumination. The approach is extended beyond the weak object regime using an iterative algorithm. Finally, we demonstrate the method on measurements of Extreme Ultraviolet Lithography (EUV) multilayer mask defects taken in an EUV zone plate microscope with both a standard zone plate lens and a zone plate implementing Zernike phase contrast.

  15. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

    Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface (ECBC-TR-1426; Vipin Rastogi et al.). Only fragments of the report body are recoverable: the introduction ("Members of the U.S. Environmental...") and part of the experimental design, which states that each quantitative method was performed three times on three consecutive days; for the CD runs, three...

  16. A diagnostic system for articular cartilage using non-destructive pulsed laser irradiation.

    PubMed

    Sato, Masato; Ishihara, Miya; Kikuchi, Makoto; Mochida, Joji

    2011-07-01

    Osteoarthritis involves dysfunction caused by cartilage degeneration, but objective evaluation methodologies based on the original function of the articular cartilage remain unavailable. Evaluations for osteoarthritis are mostly based simply on patient symptoms or the degree of joint space narrowing on X-ray images. Accurate measurement and quantitative evaluation of the mechanical characteristics of the cartilage is important, and the tissue properties of the original articular cartilage must be clarified to understand the pathological condition in detail and to correctly judge the efficacy of treatment. We have developed new methods to measure some essential properties of cartilage: a photoacoustic measurement method; and time-resolved fluorescence spectroscopy. A nanosecond-pulsed laser, which is completely non-destructive, is focused onto the target cartilage and induces a photoacoustic wave that will propagate with attenuation and is affected by the viscoelasticity of the surrounding cartilage. We also investigated whether pulsed laser irradiation and the measurement of excited autofluorescence allow real-time, non-invasive evaluation of tissue characteristics. The decay time, during which the amplitude of the photoacoustic wave is reduced by a factor of 1/e, represents the key numerical value used to characterize and evaluate the viscoelasticity and rheological behavior of the cartilage. Our findings show that time-resolved laser-induced autofluorescence spectroscopy (TR-LIFS) is useful for evaluating tissue-engineered cartilage. Photoacoustic measurement and TR-LIFS, predicated on the interactions between optics and living organs, is a suitable methodology for diagnosis during arthroscopy, allowing quantitative and multidirectional evaluation of the original function of the cartilage based on a variety of parameters. Copyright © 2011 Wiley-Liss, Inc.
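
    The 1/e decay time described above can be estimated by extracting the envelope of the photoacoustic trace and fitting an exponential to it. The following sketch uses a synthetic damped oscillation; the sampling rate, decay constant, and Hilbert-envelope approach are assumptions for illustration.

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic photoacoustic-like signal: a damped oscillation sampled at 100 MHz.
fs = 100e6                                   # sampling rate (Hz), illustrative
t = np.arange(0, 20e-6, 1 / fs)              # 20 microseconds of data
tau_true = 3e-6                              # true decay constant (s)
signal = np.exp(-t / tau_true) * np.sin(2 * np.pi * 1e6 * t)

# Estimate the envelope via the analytic signal (Hilbert transform), then fit
# an exponential to obtain the 1/e decay time used to characterize viscoelasticity.
envelope = np.abs(hilbert(signal))
mask = envelope > 0.05 * envelope.max()      # ignore the fully decayed tail
slope, _ = np.polyfit(t[mask], np.log(envelope[mask]), 1)
tau_est = -1.0 / slope

print(f"Estimated 1/e decay time: {tau_est * 1e6:.2f} microseconds")
```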

  17. Evaluation of a rule-based compositing technique for Landsat-5 TM and Landsat-7 ETM+ images

    NASA Astrophysics Data System (ADS)

    Lück, W.; van Niekerk, A.

    2016-05-01

    Image compositing is a multi-objective optimization process. Its goal is to produce a seamless, cloud- and artefact-free artificial image. This is achieved by aggregating image observations and by replacing poor and cloudy data with good observations from imagery acquired within the timeframe of interest. This compositing process aims to minimise the visual artefacts which could result from different radiometric properties, caused by atmospheric conditions, phenologic patterns and land cover changes. It has the following requirements: (1) the composite must be cloud free, which requires the detection of clouds and shadows, and (2) the composite must be seamless, minimizing artefacts visible across inter-image seams. This study proposes a new rule-based compositing technique (RBC) that combines the strengths of several existing methods. A quantitative and qualitative evaluation is made of the RBC technique by comparing it to the maximum NDVI (MaxNDVI), minimum red (MinRed) and maximum ratio (MaxRatio) compositing techniques. A total of 174 Landsat TM and ETM+ images, covering three study sites and three different timeframes for each site, are used in the evaluation. A new set of quantitative/qualitative evaluation techniques for compositing quality measurement was developed and showed that the RBC technique outperformed all other techniques, with MaxRatio, MaxNDVI, and MinRed techniques in order of performance from best to worst.
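
    For reference, a minimal sketch of the MaxNDVI baseline mentioned above: for each pixel, the observation with the highest NDVI among the cloud-free scenes is kept. Array shapes and the cloud-mask convention are assumptions; this is not the RBC technique itself.

```python
import numpy as np

def max_ndvi_composite(red_stack, nir_stack, cloud_mask_stack):
    """Per-pixel MaxNDVI compositing over a stack of co-registered scenes.

    red_stack, nir_stack: arrays of shape (n_scenes, rows, cols) with reflectance.
    cloud_mask_stack: boolean array, True where a pixel is cloudy/invalid.
    Returns composited red and NIR bands plus the index of the chosen scene.
    """
    ndvi = (nir_stack - red_stack) / (nir_stack + red_stack + 1e-6)
    ndvi = np.where(cloud_mask_stack, -np.inf, ndvi)   # never pick cloudy pixels
    best = np.argmax(ndvi, axis=0)                     # scene index per pixel
    rows, cols = np.indices(best.shape)
    return red_stack[best, rows, cols], nir_stack[best, rows, cols], best

# Tiny synthetic example: 3 scenes of a 2x2 tile, no clouds.
red = np.random.rand(3, 2, 2) * 0.1
nir = np.random.rand(3, 2, 2) * 0.5
clouds = np.zeros((3, 2, 2), dtype=bool)
composite_red, composite_nir, chosen = max_ndvi_composite(red, nir, clouds)
print(chosen)
```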

  18. Multiview photometric stereo.

    PubMed

    Hernández Esteban, Carlos; Vogiatzis, George; Cipolla, Roberto

    2008-03-01

    This paper addresses the problem of obtaining complete, detailed reconstructions of textureless shiny objects. We present an algorithm which uses silhouettes of the object, as well as images obtained under changing illumination conditions. In contrast with previous photometric stereo techniques, ours is not limited to a single viewpoint but produces accurate reconstructions in full 3D. A number of images of the object are obtained from multiple viewpoints, under varying lighting conditions. Starting from the silhouettes, the algorithm recovers camera motion and constructs the object's visual hull. This is then used to recover the illumination and initialise a multi-view photometric stereo scheme to obtain a closed surface reconstruction. There are two main contributions in this paper: Firstly we describe a robust technique to estimate light directions and intensities and secondly, we introduce a novel formulation of photometric stereo which combines multiple viewpoints and hence allows closed surface reconstructions. The algorithm has been implemented as a practical model acquisition system. Here, a quantitative evaluation of the algorithm on synthetic data is presented together with complete reconstructions of challenging real objects. Finally, we show experimentally how even in the case of highly textured objects, this technique can greatly improve on correspondence-based multi-view stereo results.
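
    The single-viewpoint Lambertian photometric stereo step that underlies such schemes can be sketched as a per-pixel least-squares problem, I = L * (albedo * n), solved for the albedo-scaled normal. The example below is a generic sketch with synthetic data, not the authors' multi-view formulation.

```python
import numpy as np

def photometric_stereo(intensities, light_dirs):
    """Classic Lambertian photometric stereo for one viewpoint.

    intensities: (n_lights, n_pixels) observed intensities.
    light_dirs:  (n_lights, 3) unit light direction vectors.
    Returns per-pixel unit normals and albedo, from I = L @ (albedo * n).
    """
    g, *_ = np.linalg.lstsq(light_dirs, intensities, rcond=None)  # (3, n_pixels)
    albedo = np.linalg.norm(g, axis=0)
    normals = g / (albedo + 1e-12)
    return normals, albedo

# Synthetic check: one pixel with known normal and albedo, four lights.
L = np.array([[0, 0, 1], [1, 0, 1], [0, 1, 1], [-1, 1, 1]], dtype=float)
L /= np.linalg.norm(L, axis=1, keepdims=True)
n_true = np.array([0.2, -0.1, 0.97]); n_true /= np.linalg.norm(n_true)
I = 0.8 * (L @ n_true).clip(min=0).reshape(-1, 1)     # Lambertian rendering
n_est, rho = photometric_stereo(I, L)
print(n_est.ravel(), rho)
```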

  19. A Quantitative Comparison of Calibration Methods for RGB-D Sensors Using Different Technologies.

    PubMed

    Villena-Martínez, Víctor; Fuster-Guilló, Andrés; Azorín-López, Jorge; Saval-Calvo, Marcelo; Mora-Pascual, Jeronimo; Garcia-Rodriguez, Jose; Garcia-Garcia, Alberto

    2017-01-27

    RGB-D (Red Green Blue and Depth) sensors are devices that can provide color and depth information from a scene at the same time. Recently, they have been widely used in many solutions due to their commercial growth from the entertainment market to many diverse areas (e.g., robotics, CAD, etc.). In the research community, these devices have had good uptake due to their acceptable level of accuracy for many applications and their low cost, but in some cases, they work at the limit of their sensitivity, near to the minimum feature size that can be perceived. For this reason, calibration processes are critical in order to increase their accuracy and enable them to meet the requirements of such kinds of applications. To the best of our knowledge, there is no comparative study of calibration algorithms evaluating their results on multiple RGB-D sensors. Specifically, in this paper, the three most commonly used calibration methods have been applied to three different RGB-D sensors based on structured light and time-of-flight. The comparison of methods has been carried out by a set of experiments to evaluate the accuracy of depth measurements. Additionally, an object reconstruction application has been used as an example of an application for which the sensor works at the limit of its sensitivity. The obtained results of reconstruction have been evaluated through visual inspection and quantitative measurements.

  20. Comparative evaluation of performance measures for shading correction in time-lapse fluorescence microscopy.

    PubMed

    Liu, L; Kan, A; Leckie, C; Hodgkin, P D

    2017-04-01

    Time-lapse fluorescence microscopy is a valuable technology in cell biology, but it suffers from the inherent problem of intensity inhomogeneity due to uneven illumination or camera nonlinearity, known as shading artefacts. This will lead to inaccurate estimates of single-cell features such as average and total intensity. Numerous shading correction methods have been proposed to remove this effect. In order to compare the performance of different methods, many quantitative performance measures have been developed. However, there is little discussion about which performance measure should be generally applied for evaluation on real data, where the ground truth is absent. In this paper, the state-of-the-art shading correction methods and performance evaluation methods are reviewed. We implement 10 popular shading correction methods on two artificial datasets and four real ones. In order to make an objective comparison between those methods, we employ a number of quantitative performance measures. Extensive validation demonstrates that the coefficient of joint variation (CJV) is the most applicable measure in time-lapse fluorescence images. Based on this measure, we have proposed a novel shading correction method that performs better compared to well-established methods for a range of real data tested. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
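
    The coefficient of joint variation highlighted above is commonly computed from two pixel classes (e.g. cell foreground vs. background) as CJV = (sigma_F + sigma_B) / |mu_F - mu_B|, with lower values indicating better shading correction. A minimal sketch with synthetic intensities, assuming this two-class definition:

```python
import numpy as np

def coefficient_of_joint_variation(foreground, background):
    """CJV = (sigma_F + sigma_B) / |mu_F - mu_B|; lower values indicate better
    separation of the two intensity classes after shading correction."""
    mu_f, mu_b = np.mean(foreground), np.mean(background)
    sd_f, sd_b = np.std(foreground), np.std(background)
    return (sd_f + sd_b) / abs(mu_f - mu_b)

# Illustrative use: pixel intensities inside cell masks vs. background pixels.
rng = np.random.default_rng(1)
cells = rng.normal(180, 12, size=5000)        # hypothetical corrected cell pixels
background = rng.normal(60, 10, size=20000)   # hypothetical background pixels
print(f"CJV = {coefficient_of_joint_variation(cells, background):.3f}")
```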

  1. Automated analysis of art object surfaces using time-averaged digital speckle pattern interferometry

    NASA Astrophysics Data System (ADS)

    Lukomski, Michal; Krzemien, Leszek

    2013-05-01

    Technical development and practical evaluation of a laboratory built, out-of-plane digital speckle pattern interferometer (DSPI) are reported. The instrument was used for non-invasive, non-contact detection and characterization of early-stage damage, like fracturing and layer separation, of painted objects of art. A fully automated algorithm was developed for recording and analysis of vibrating objects utilizing continuous-wave laser light. The algorithm uses direct, numerical fitting or Hilbert transformation for an independent, quantitative evaluation of the Bessel function at every point of the investigated surface. The procedure does not require phase modulation and thus can be implemented within any, even the simplest, DSPI apparatus. The proposed deformation analysis is fast and computationally inexpensive. Diagnosis of physical state of the surface of a panel painting attributed to Nicolaus Haberschrack (a late-mediaeval painter active in Krakow) from the collection of the National Museum in Krakow is presented as an example of an in situ application of the developed methodology. It has allowed the effectiveness of the deformation analysis to be evaluated for the surface of a real painting (heterogeneous colour and texture) in a conservation studio where vibration level was considerably higher than in the laboratory. It has been established that the methodology, which offers automatic analysis of the interferometric fringe patterns, has a considerable potential to facilitate and render more precise the condition surveys of works of art.

  2. An anthropomorphic phantom for quantitative evaluation of breast MRI.

    PubMed

    Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo

    2011-02-01

    In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. This phantom provides a platform for the optimization and standardization of breast MRI imaging protocols for lesion detection and characterization.

  3. Face to Face Communications in Space

    NASA Technical Reports Server (NTRS)

    Cohen, Malcolm M.; Davon, Bonnie P. (Technical Monitor)

    1999-01-01

    It has been reported that human face-to-face communications in space are compromised by facial edema, variations in the orientations of speakers and listeners, and background noises that are encountered in the shuttle and in space stations. To date, nearly all reports have been anecdotal or subjective, in the form of post-flight interviews or questionnaires; objective and quantitative data are generally lacking. Although it is acknowledged that efficient face-to-face communications are essential for astronauts to work safely and effectively, specific ways in which the space environment interferes with non-linguistic communication cues are poorly documented. Because we have only a partial understanding of how non-linguistic communication cues may change with mission duration, it is critically important to obtain objective data, and to evaluate these cues under well-controlled experimental conditions.

  4. Initial Description of a Quantitative, Cross-Species (Chimpanzee-Human) Social Responsiveness Measure

    ERIC Educational Resources Information Center

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E.; Constantino, John N.; Povinelli, Daniel J.; Pruett, John R., Jr.

    2011-01-01

    Objective: Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species…

  5. Comparison of DNA fragmentation and color thresholding for objective quantitation of apoptotic cells

    NASA Technical Reports Server (NTRS)

    Plymale, D. R.; Ng Tang, D. S.; Fermin, C. D.; Lewis, D. E.; Martin, D. S.; Garry, R. F.

    1995-01-01

    Apoptosis is a process of cell death characterized by distinctive morphological changes and fragmentation of cellular DNA. Using video imaging and color thresholding techniques, we objectively quantitated the number of cultured CD4+ T-lymphoblastoid cells (HUT78 cells, RH9 subclone) displaying morphological signs of apoptosis before and after exposure to gamma-irradiation. The numbers of apoptotic cells measured by objective video imaging techniques were compared to numbers of apoptotic cells measured in the same samples by sensitive apoptotic assays that quantitate DNA fragmentation. DNA fragmentation assays gave consistently higher values compared with the video imaging assays that measured morphological changes associated with apoptosis. These results suggest that substantial DNA fragmentation can precede or occur in the absence of the morphological changes which are associated with apoptosis in gamma-irradiated RH9 cells.
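
    A minimal sketch of the color-thresholding idea described above: threshold the color channels, label connected regions, and count objects above a minimum area. The image, stain color, thresholds, and size cutoff are all hypothetical.

```python
import numpy as np
from scipy import ndimage

# Hypothetical RGB image (H x W x 3, uint8) in which apoptotic cells appear
# strongly red after staining; values here are synthetic.
rng = np.random.default_rng(2)
image = rng.integers(0, 60, size=(128, 128, 3), dtype=np.uint8)
image[30:40, 30:40, 0] = 220    # paint two fake "apoptotic" regions in the red channel
image[80:95, 60:70, 0] = 210

# Simple color thresholding: red channel high AND green/blue channels low.
r, g, b = image[..., 0].astype(int), image[..., 1].astype(int), image[..., 2].astype(int)
mask = (r > 150) & (g < 100) & (b < 100)

# Label connected components and discard specks below a minimum area.
labels, n = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
n_cells = int(np.sum(sizes >= 20))
print(f"Apoptotic-appearing objects counted: {n_cells}")
```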

  6. [The functional independence of lexical numeric knowledge and the representation of magnitude: evidence from one case].

    PubMed

    Salguero-Alcañiz, M P; Lorca-Marín, J A; Alameda-Bailén, J R

    The ultimate purpose of cognitive neuropsychology is to find out how normal cognitive processes work. To this end, it studies subjects who have suffered brain damage but who, until their accident, were competent in the skills that are later to become the object of study. It is therefore necessary to study patients who have difficulty in processing numbers and in calculating in order to further our knowledge of these processes in the normal population. Our aim was to analyse the relationships between the different cognitive processes involved in numeric knowledge. We studied the case of a female patient who suffered an ischemic infarct in the perisylvian region, on both a superficial and deep level. She presented predominantly expressive mixed aphasia and predominantly brachial hemiparesis. Numeric processing and calculation were evaluated. The patient still had her lexical numeric knowledge but her quantitative numeric knowledge was impaired. These alterations in the quantitative numeric knowledge are evidenced by the difficulties the patient had in numeric comprehension tasks, as well as the severe impairments displayed in calculation. These findings allow us to conclude that quantitative numeric knowledge is functionally independent of lexical or non-quantitative numeric knowledge. From this functional autonomy, a possible structural independence can be inferred.

  7. A quantitative approach to evolution of music and philosophy

    NASA Astrophysics Data System (ADS)

    Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano

    2012-08-01

    The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.

  8. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
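
    A generic sketch of how pseudo-predator signatures are assembled in such simulations: bootstrap-sample each prey type's signatures, average them, and mix the means according to a known diet. The data, bootstrap sample sizes, and renormalization step are illustrative; the paper's contribution (objectively choosing the sample sizes) is not reproduced here.

```python
import numpy as np

def pseudo_predator_signature(prey_sigs, diet, sample_sizes, rng):
    """Construct one pseudo-predator fatty acid signature with a known diet.

    prey_sigs: dict mapping prey type -> (n_animals, n_fatty_acids) proportion matrix.
    diet: dict mapping prey type -> diet proportion (sums to 1).
    sample_sizes: dict mapping prey type -> bootstrap sample size for that prey.
    """
    n_fa = next(iter(prey_sigs.values())).shape[1]
    signature = np.zeros(n_fa)
    for prey, proportion in diet.items():
        sigs = prey_sigs[prey]
        idx = rng.integers(0, sigs.shape[0], size=sample_sizes[prey])  # bootstrap rows
        signature += proportion * sigs[idx].mean(axis=0)               # mean bootstrapped signature
    return signature / signature.sum()                                 # renormalize to proportions

# Tiny illustrative data set: two prey types, five fatty acids.
rng = np.random.default_rng(3)
prey = {"seal": rng.dirichlet(np.ones(5), size=30),
        "walrus": rng.dirichlet(np.ones(5), size=25)}
known_diet = {"seal": 0.7, "walrus": 0.3}
sizes = {"seal": 15, "walrus": 15}
print(pseudo_predator_signature(prey, known_diet, sizes, rng))
```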

  9. Calibration methods influence quantitative material decomposition in photon-counting spectral CT

    NASA Astrophysics Data System (ADS)

    Curtis, Tyler E.; Roeder, Ryan K.

    2017-03-01

    Photon-counting detectors and nanoparticle contrast agents can potentially enable molecular imaging and material decomposition in computed tomography (CT). Material decomposition has been investigated using both simulated and acquired data sets. However, the effect of calibration methods on material decomposition has not been systematically investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on quantitative material decomposition. A commercially available photon-counting spectral micro-CT (MARS Bioimaging) was used to acquire images with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material basis matrix values were determined using multiple linear regression models and material decomposition was performed using a maximum a posteriori estimator. The accuracy of quantitative material decomposition was evaluated by the root mean squared error (RMSE), specificity, sensitivity, and area under the curve (AUC). An increased maximum concentration (range) in the calibration significantly improved RMSE, specificity and AUC. The effects of an increased number of concentrations in the calibration were not statistically significant for the conditions in this study. The overall results demonstrated that the accuracy of quantitative material decomposition in spectral CT is significantly influenced by calibration methods, which must therefore be carefully considered for the intended diagnostic imaging application.
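
    A simplified sketch of the calibration-and-decomposition idea: estimate a material basis matrix from vials of known concentration by multiple linear regression, then invert it for a new measurement. All numbers are invented, and ordinary least squares stands in for the maximum a posteriori estimator used in the study.

```python
import numpy as np

# Hypothetical calibration: measured CT numbers in 3 energy bins for vials with
# known concentrations (mg/mL) of a contrast agent and of a calcium surrogate.
concentrations = np.array([[0, 0], [2, 0], [8, 0], [0, 50], [0, 150], [4, 75]], dtype=float)
ct_numbers = np.array([[  0,   0,   0],
                       [ 40,  70,  25],
                       [160, 280, 100],
                       [ 90,  60,  45],
                       [270, 180, 135],
                       [215, 230, 118]], dtype=float)   # (n_vials, 3 bins)

# Multiple linear regression gives the material basis matrix M (bins x materials):
# ct = concentrations @ M.T, so solve for M.T in the least-squares sense.
M_T, *_ = np.linalg.lstsq(concentrations, ct_numbers, rcond=None)   # (2, 3)
M = M_T.T                                                            # (3, 2)

# Decompose a new measurement back into material concentrations (least squares
# here; the cited study used a maximum a posteriori estimator instead).
measured = np.array([120.0, 155.0, 63.0])
est_conc, *_ = np.linalg.lstsq(M, measured, rcond=None)
print("Estimated concentrations (contrast agent, calcium):", est_conc)
```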

  10. Manned Versus Unmanned Risk and Complexity Considerations for Future Midsized X-Planes

    NASA Technical Reports Server (NTRS)

    Lechniak, Jason A.; Melton, John E.

    2017-01-01

    The objective of this work was to identify and estimate complexity and risks associated with the development and testing of new low-cost medium-scale X-plane aircraft primarily focused on air transport operations. Piloting modes that were evaluated for this task were manned, remotely piloted, and unmanned flight research programs. This analysis was conducted early in the data collection period for X-plane concept vehicles before preliminary designs were complete. Over 50 different aircraft and system topics were used to evaluate the three piloting control modes. Expert group evaluations from a diverse set of pilots, engineers, and other experts at Aeronautics Research Mission Directorate centers within the National Aeronautics and Space Administration provided qualitative reasoning on the many issues surrounding the decisions regarding piloting modes. The group evaluations were numerically rated to evaluate each topic quantitatively and were used to provide independent criteria for vehicle complexity and risk. An Edwards Air Force Base instruction document was identified that emerged as a source of the effects found in our qualitative and quantitative data. The study showed that a manned aircraft was the best choice to align with test activities for transport aircraft flight research from a low-complexity and low-risk perspective. The study concluded that a manned aircraft option would minimize the risk and complexity to improve flight-test efficiency and bound the cost of the flight-test portion of the program. Several key findings and discriminators between the three modes are discussed in detail.

  11. Occupational injury among migrant workers in China: a systematic review

    PubMed Central

    Fitzgerald, Simon; Chen, Xin; Qu, Hui; Sheff, Mira Grice

    2017-01-01

    Objectives This review considers the state of occupational injury surveillance and prevention among migrant workers in China and suggests areas of focus for future research on the topic. Methods Bibliographic databases were searched for qualitative and quantitative studies on surveillance of and interventions to prevent occupational injury among migrant workers in mainland China. Additional abstracts were identified from the citations of relevant articles from the database search. Studies fitting the inclusion criteria were evaluated, and findings were extracted and summarised. Results The search uncovered 726 studies in the English-language databases searched, and 3109 in the Chinese database. This article analyses a total of 19 research articles that fit the inclusion criteria with qualitative or quantitative data on occupational injury surveillance and prevention of migrant workers in China. Despite evidence of the vulnerability of migrant workers in the workplace, there is little systematic surveillance of occupational injury and few evaluated interventions. Conclusions Migrant workers account for a disproportionate burden of occupational injury morbidity and mortality in China. However, data are inconsistent and inadequate to detail injury incidence or to evaluate interventions. The following are suggestions to decrease injury incidence among migrants: strengthen the national system of occupational injury surveillance; focus surveillance and interventions on high-risk occupations employing migrants such as construction, manufacturing and small mining operations; improve occupational safety training and access to appropriate safety equipment; evaluate recent changes in occupational health and safety and evaluate outcome of multi-party interventions to reduce occupational injury among migrant workers. PMID:23710065

  12. Tomosynthesis can facilitate accurate measurement of joint space width under the condition of the oblique incidence of X-rays in patients with rheumatoid arthritis.

    PubMed

    Ono, Yohei; Kashihara, Rina; Yasojima, Nobutoshi; Kasahara, Hideki; Shimizu, Yuka; Tamura, Kenichi; Tsutsumi, Kaori; Sutherland, Kenneth; Koike, Takao; Kamishima, Tamotsu

    2016-06-01

    Accurate evaluation of joint space width (JSW) is important in the assessment of rheumatoid arthritis (RA). In clinical radiography of bilateral hands, the oblique incidence of X-rays is unavoidable, which may cause perceptional or measurement error of JSW. The objective of this study was to examine whether tomosynthesis, a recently developed modality, can facilitate a more accurate evaluation of JSW than radiography under the condition of oblique incidence of X-rays. We investigated quantitative errors derived from the oblique incidence of X-rays by imaging phantoms simulating various finger joint spaces using radiographs and tomosynthesis images. We then compared the qualitative results of the modified total Sharp score of a total of 320 joints from 20 patients with RA between these modalities. A quantitative error was prominent when the location of the phantom was shifted along the JSW direction. Modified total Sharp scores of tomosynthesis images were significantly higher than those of radiography, that is to say JSW was regarded as narrower in tomosynthesis than in radiography when finger joints were located where the oblique incidence of X-rays is expected in the JSW direction. Tomosynthesis can facilitate accurate evaluation of JSW in finger joints of patients with RA, even with oblique incidence of X-rays. Accurate evaluation of JSW is necessary for the management of patients with RA. Through phantom and clinical studies, we demonstrate that tomosynthesis may achieve more accurate evaluation of JSW.

  13. An approach to computing direction relations between separated object groups

    NASA Astrophysics Data System (ADS)

    Yan, H.; Wang, Z.; Li, J.

    2013-06-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on Gestalt principles and the idea of multi-directions. The approach firstly triangulates the two object groups; and then it constructs the Voronoi Diagram between the two groups using the triangular network; after this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed; finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.

  14. An approach to computing direction relations between separated object groups

    NASA Astrophysics Data System (ADS)

    Yan, H.; Wang, Z.; Li, J.

    2013-09-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on gestalt principles and the idea of multi-directions. The approach firstly triangulates the two object groups, and then it constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.
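
    The final step of the approach, converting quantitative directions into qualitative ones, can be sketched as mapping each Voronoi-edge normal to one of eight direction terms and tallying the results. The sector model and aggregation below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

SECTORS = ["east", "northeast", "north", "northwest",
           "west", "southwest", "south", "southeast"]

def qualitative_direction(dx, dy):
    """Map a direction vector (e.g. the normal of one Voronoi edge pointing from
    group A toward group B) to one of eight qualitative direction terms."""
    angle = np.degrees(np.arctan2(dy, dx)) % 360.0
    return SECTORS[int(((angle + 22.5) % 360.0) // 45.0)]

def aggregate_directions(edge_normals):
    """Aggregate the per-edge qualitative directions into a frequency table,
    a simple stand-in for the multi-direction description in the paper."""
    counts = {}
    for dx, dy in edge_normals:
        d = qualitative_direction(dx, dy)
        counts[d] = counts.get(d, 0) + 1
    return counts

# Illustrative normals of Voronoi edges between two object groups.
normals = [(1.0, 0.1), (0.9, 0.4), (1.0, -0.2), (0.7, 0.7)]
print(aggregate_directions(normals))
```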

  15. Evaluation Criteria for Nursing Student Application of Evidence-Based Practice: A Delphi Study.

    PubMed

    Bostwick, Lina; Linden, Lois

    2016-06-01

    Core clinical evaluation criteria do not exist for measuring prelicensure baccalaureate nursing students' application of evidence-based practice (EBP) during direct care assignments. The study objective was to achieve consensus among EBP nursing experts to create clinical criteria for faculty to use in evaluating students' application of EBP principles. A three-round Delphi method was used. Experts were invited to participate in Web-based surveys. Data were analyzed using qualitative coding and categorizing. Quantitative analyses were descriptive calculations for rating and ranking. Expert consensus occurred in the Delphi rounds. The study provides a set of 10 core clinical evaluation criteria for faculty evaluating students' progression toward competency in their application of EBP. A baccalaureate program curriculum requiring the use of Bostwick's EBP Core Clinical Evaluation Criteria will provide a clear definition for understanding basic core EBP competence as expected for the assessment of student learning. [J Nurs Educ. 2016;55(5):336-341.]. Copyright 2016, SLACK Incorporated.

  16. Integrating regional conservation priorities for multiple objectives into national policy

    PubMed Central

    Beger, Maria; McGowan, Jennifer; Treml, Eric A.; Green, Alison L.; White, Alan T.; Wolff, Nicholas H.; Klein, Carissa J.; Mumby, Peter J.; Possingham, Hugh P.

    2015-01-01

    Multinational conservation initiatives that prioritize investment across a region invariably navigate trade-offs among multiple objectives. It seems logical to focus where several objectives can be achieved efficiently, but such multi-objective hotspots may be ecologically inappropriate, or politically inequitable. Here we devise a framework to facilitate a regionally cohesive set of marine-protected areas driven by national preferences and supported by quantitative conservation prioritization analyses, and illustrate it using the Coral Triangle Initiative. We identify areas important for achieving six objectives to address ecosystem representation, threatened fauna, connectivity and climate change. We expose trade-offs between areas that contribute substantially to several objectives and those meeting one or two objectives extremely well. Hence there are two strategies to guide countries choosing to implement regional goals nationally: multi-objective hotspots and complementary sets of single-objective priorities. This novel framework is applicable to any multilateral or global initiative seeking to apply quantitative information in decision making. PMID:26364769

  17. Quantitative criteria for assessment of gamma-ray imager performance

    NASA Astrophysics Data System (ADS)

    Gottesman, Steve; Keller, Kristi; Malik, Hans

    2015-08-01

    In recent years gamma ray imagers such as the GammaCamTM and Polaris have demonstrated good imaging performance in the field. Imager performance is often summarized as "resolution", either angular or spatial at some distance from the imager; however, the definition of resolution is not always related to the ability to image an object. It is difficult to quantitatively compare imagers without a common definition of image quality. This paper examines three categories of definition: point source; line source; and area source. It discusses the details of those definitions and which ones are more relevant for different situations. Metrics such as Full Width Half Maximum (FWHM), variations on the Rayleigh criterion, and some analogous to the National Imagery Interpretability Rating Scale (NIIRS) are discussed. The performance against these metrics is evaluated for a high resolution coded aperture imager modeled using Monte Carlo N-Particle (MCNP), and for a medium resolution imager measured in the lab.
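
    As an example of the point-source metrics discussed above, the FWHM of a measured or simulated point-spread profile can be computed by locating the half-maximum crossings on either side of the peak. The Gaussian profile below is synthetic.

```python
import numpy as np

def fwhm(x, profile):
    """Full width at half maximum of a 1-D point-source profile, using linear
    interpolation to locate the half-maximum crossings on each side of the peak."""
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    i_left, i_right = above[0], above[-1]
    # Interpolate the left crossing between i_left-1 and i_left (profile rising).
    xl = np.interp(half, [profile[i_left - 1], profile[i_left]], [x[i_left - 1], x[i_left]])
    # Interpolate the right crossing between i_right+1 and i_right (profile falling).
    xr = np.interp(half, [profile[i_right + 1], profile[i_right]], [x[i_right + 1], x[i_right]])
    return xr - xl

# Synthetic Gaussian point-source response with sigma = 2 degrees.
x = np.linspace(-15, 15, 601)
psf = np.exp(-x**2 / (2 * 2.0**2))
print(f"FWHM = {fwhm(x, psf):.2f} deg (theory: {2.355 * 2.0:.2f} deg)")
```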

  18. Space Transportation Operations: Assessment of Methodologies and Models

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla

    2001-01-01

    The systems design process for future space transportation involves understanding multiple variables and their effect on lifecycle metrics. Variables such as technology readiness or potential environmental impact are qualitative, while variables such as reliability, operations costs or flight rates are quantitative. In deciding what new design concepts to fund, NASA needs a methodology that would assess the sum total of all relevant qualitative and quantitative lifecycle metrics resulting from each proposed concept. The objective of this research was to review the state of operations assessment methodologies and models used to evaluate proposed space transportation systems and to develop recommendations for improving them. It was found that, compared to the models available from other sources, the operations assessment methodology recently developed at Kennedy Space Center has the potential to produce a decision support tool that will serve as the industry standard. Towards that goal, a number of areas of improvement in the Kennedy Space Center's methodology are identified.

  20. Application of principal component analysis (PCA) as a sensory assessment tool for fermented food products.

    PubMed

    Ghosh, Debasree; Chattopadhyay, Parimal

    2012-06-01

    The objective of the work was to use the method of quantitative descriptive analysis (QDA) to describe the sensory attributes of the fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability and acidity, of fermented food products like cow milk curd and soymilk curd, idli, sauerkraut and probiotic ice cream. Principal component analysis (PCA) identified the six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of principal components using multiple least squares regression (R2 = 0.8). The result from PCA was statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.
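
    A minimal sketch of the PCA-plus-regression workflow described above, using made-up panel scores: retain enough principal components to explain 90% of the variance, then model overall quality as a function of the component scores.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Hypothetical QDA panel data: rows = product samples, columns = sensory
# attribute scores (color/appearance, body texture, flavor, acidity, ...).
rng = np.random.default_rng(4)
attributes = rng.normal(5, 1.5, size=(40, 8))
overall_quality = (0.6 * attributes[:, 2] + 0.3 * attributes[:, 0]
                   + rng.normal(scale=0.5, size=40))

# Principal component analysis of the attribute data, keeping enough
# components to explain at least 90% of the variance (as in the abstract).
pca = PCA(n_components=0.90)
scores = pca.fit_transform(attributes)
print("Components kept:", pca.n_components_,
      "explained variance:", pca.explained_variance_ratio_.sum().round(3))

# Model overall quality as a function of the principal components.
model = LinearRegression().fit(scores, overall_quality)
print("R^2 of quality model:", round(model.score(scores, overall_quality), 2))
```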

  1. Investigation on microfluidic particles manipulation by holographic 3D tracking strategies

    NASA Astrophysics Data System (ADS)

    Cacace, Teresa; Paturzo, Melania; Memmolo, Pasquale; Vassalli, Massimo; Fraldi, Massimiliano; Mensitieri, Giuseppe; Ferraro, Pietro

    2017-06-01

    We demonstrate a 3D holographic tracking method to investigate particle motion in a microfluidic channel, both unperturbed and while migration is induced through microfluidic manipulation. Digital holography (DH) in microscopy is a full-field, label-free imaging technique able to provide quantitative phase contrast. The employed 3D tracking method proceeds in steps. First, the displacements along the optical axis are assessed by numerical refocusing criteria. In particular, an automatic refocusing method to recover the particles' axial position is implemented employing a contrast-based refocusing criterion. Then, the transverse position of the in-focus object is evaluated through quantitative phase map segmentation methods and a centroid-based 2D tracking strategy. The introduction of DH is thus suggested as a powerful approach for controlling the manipulation of particles and biological samples, as well as a possible aid to the precise design and implementation of advanced lab-on-chip microfluidic devices.
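
    A compact sketch of the refocusing criterion described above: numerically propagate the reconstructed field over a range of distances (angular spectrum method) and pick the distance that maximizes an amplitude-contrast metric. The field here is a random placeholder, and the wavelength, pixel size, and contrast definition are assumptions.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Numerically refocus a complex field by distance z (angular spectrum method)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))  # evanescent part ignored
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

def contrast(amplitude):
    """Simple contrast metric (std/mean) used here as the refocusing criterion."""
    return amplitude.std() / amplitude.mean()

# Hypothetical reconstructed hologram field (random placeholder) scanned over z.
rng = np.random.default_rng(5)
field = rng.normal(size=(256, 256)) + 1j * rng.normal(size=(256, 256))
wavelength, pixel = 532e-9, 2e-6
z_range = np.linspace(-200e-6, 200e-6, 41)
scores = [contrast(np.abs(angular_spectrum_propagate(field, wavelength, pixel, z)))
          for z in z_range]
z_focus = z_range[int(np.argmax(scores))]
print(f"Estimated in-focus distance: {z_focus * 1e6:.1f} um")

# Once refocused, the particle's transverse (x, y) position can be taken as the
# intensity-weighted centroid of a segmented amplitude or phase map.
```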

  2. Nursing Activities Score: nursing work load in a burns Intensive Care Unit

    PubMed Central

    Camuci, Marcia Bernadete; Martins, Júlia Trevisan; Cardeli, Alexandrina Aparecida Maciel; Robazzi, Maria Lúcia do Carmo Cruz

    2014-01-01

    Objective to evaluate the nursing work load in a Burns Intensive Care Unit according to the Nursing Activities Score. Method an exploratory, descriptive cross-sectional study with a quantitative approach. The Nursing Activities Score was used for data collection between October 2011 and May 2012, totalling 1,221 measurements, obtained from 50 patients' hospital records. Data for qualitative variables was described in tables; for the quantitative variables, calculations using statistical measurements were used. Results the mean score for the Nursing Activities Score was 70.4% and the median was 70.3%, corresponding to the percentage of the time spent on direct care to the patient in 24 hours. Conclusion the Nursing Activities Score provided information which involves the process of caring for patients hospitalized in a Burns Intensive Care Unit, and indicated that there is a high work load for the nursing team of the sector studied. PMID:26107842

  3. Quantitative analysis of spatial variability of geotechnical parameters

    NASA Astrophysics Data System (ADS)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic parameters of geotechnical engineering design, yet they have strong regional characteristics. The spatial variability of geotechnical parameters has also been recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on the statistical theory of geostatistical spatial information, the spatial variability of geotechnical parameters is quantitatively analyzed; at the same time, descriptive statistics of the geotechnical parameters and the correlation coefficients between them are calculated. A residential district project of the Tianjin Survey Institute was selected as the research object. There are 68 boreholes in this area and 9 layers of mechanical stratification. The parameters are water content, natural gravity, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and SP index. According to the principle of statistical correlation, the correlation coefficients of the geotechnical parameters are calculated, and from these coefficients the relationships among the geotechnical parameters are obtained.
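
    A minimal sketch of the descriptive statistics and inter-parameter correlation step mentioned above, using invented borehole data; the geostatistical (variogram-based) spatial analysis itself is not shown.

```python
import numpy as np
import pandas as pd

# Hypothetical borehole-layer records of a few geotechnical parameters.
rng = np.random.default_rng(6)
n = 68
water_content = rng.normal(24, 3, n)                                     # %
void_ratio = 0.02 * water_content + rng.normal(0.2, 0.05, n)
compression_modulus = 12 - 0.15 * water_content + rng.normal(0, 0.8, n)  # MPa
cohesion = rng.normal(18, 4, n)                                          # kPa

data = pd.DataFrame({
    "water_content": water_content,
    "void_ratio": void_ratio,
    "compression_modulus": compression_modulus,
    "cohesion": cohesion,
})

# Descriptive statistics and Pearson correlation coefficients between parameters.
print(data.describe().round(2))
print(data.corr().round(2))
```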

  4. After-School Program for urban youth: Evaluation of a health careers course in New York City high schools.

    PubMed

    Holden, Lynne; Berger, Wallace; Zingarelli, Rebecca; Siegel, Elliot

    Mentoring in Medicine (MIM) addresses an urgent national need for minority health professionals and promotes careers in health care for urban youth. The MIM After School Program (ASP or The Course) has as its primary objectives to provide academic enrichment in human biology and motivate disadvantaged youth to pursue a career in the health professions. Secondary objectives of The Course, although not evaluated here, are to improve students' health literacy and knowledge of healthy living behaviors. Since 2009, over 1500 middle and high school students have completed the New York City based Course, which is offered once a week over a 10 week semester in an out-of-school venue. This study assesses the success of The Course in achieving its primary objectives with 84 students at five New York City high schools during the fall 2014 semester. The Course curriculum was created especially for MIM, comprises the body's 11 organ systems, and is presented in discrete modules (one each semester), along with complementary educational activities, including field trips and class projects. This study reports on a formal evaluation using quantitative and qualitative methods. The quantitative evaluation found that the students significantly increased their knowledge of the Gastrointestinal System. Students across the academic spectrum appeared to have learned the MIM ASP Course content - high school GPA was not a predictor of knowledge acquisition. The students also reported that The Course significantly increased their self-confidence in their ability to succeed (self-efficacy). The students expressed a significant increase in five health care related attitudes, an additional increase in their ability to overcome personal issues in order to succeed in their career, and a significant improvement in their feeling toward, and likely pursuit of, a health career. The students stated that The Course significantly increased their interest and intent to seek out more information about health care, participate in health care activities, and take more health care courses in high school. The qualitative evaluation found that the students and their parents were pleased with the MIM ASP Course's composition, presentation, and effectiveness. A large majority of the parents stated that their child got out of The Course what they had hoped for and that The Course made it more likely that they would recommend a health career for their child. The students and instructional staff also identified The Course elements that they felt were most and least effective. Best practices that were used in designing and conducting The Course were identified. The MIM ASP Course appears to have achieved its principal educational objectives of providing academic enrichment in human biology and improving attitudes towards a health career for a self-selected population of disadvantaged, underrepresented minority high school students in an urban setting.

  5. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years and, with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest priority locations within the river corridor to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Bechie et al. 2008). Applying natural resources management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but rather a scientific strategy that management needs to embrace and apply in its decision framework.

  6. Quantitative optical metrology with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.

    2004-08-01

    Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.

  7. [Design, implementation and evaluation of a health education program for the elderly].

    PubMed

    Pino, Margarita; Ricoy, Maria Carmen; Portela, Julio

    2010-09-01

    The objective was to design, implement and evaluate a health education program, based on an analysis of the habits that harm the health of people over 65 years old. An evaluative study was carried out based on a multiple case study in the North-West area of Spain, combining both quantitative and qualitative approaches. A questionnaire and interview were used as tools for data collection. The elderly take a lot of medicines and also self-medicate. A small group smokes and drinks alcohol. Over 25% have sedentary habits, and their average body mass index was 30.55. The implementation of the programme significantly influenced their quality of life. Elderly people have deep-rooted unhealthy habits. The achievement of the educational contents improved their quality of life. However, they are reluctant to adopt new habits, even healthy ones.

  8. Spacecraft-plasma-debris interaction in an ion beam shepherd mission

    NASA Astrophysics Data System (ADS)

    Cichocki, Filippo; Merino, Mario; Ahedo, Eduardo

    2018-05-01

    This paper presents a study of the interaction between a spacecraft, a plasma thruster plume and a free floating object, in the context of an active space debris removal mission based on the ion beam shepherd concept. The analysis is performed with the EP2PLUS hybrid code and includes the evaluation of the transferred force and torque to the target debris, its surface sputtering due to the impinging hypersonic ions, and the equivalent electric circuit of the spacecraft-plasma-debris interaction. The electric potential difference that builds up between the spacecraft and the debris, the ion backscattering and the backsputtering contamination of the shepherd satellite are evaluated for a nominal scenario. A sensitivity analysis is carried out to evaluate quantitatively the effects of electron thermodynamics, ambient plasma, heavy species collisions, and debris position.

  9. Biomarkers and Surrogate Endpoints in Uveitis: The Impact of Quantitative Imaging.

    PubMed

    Denniston, Alastair K; Keane, Pearse A; Srivastava, Sunil K

    2017-05-01

    Uveitis is a major cause of sight loss across the world. The reliable assessment of intraocular inflammation in uveitis ('disease activity') is essential in order to score disease severity and response to treatment. In this review, we describe how 'quantitative imaging', the approach of using automated analysis and measurement algorithms across both standard and emerging imaging modalities, can develop objective instrument-based measures of disease activity. This is a narrative review based on searches of the current world literature using terms related to quantitative imaging techniques in uveitis, supplemented by clinical trial registry data, and expert knowledge of surrogate endpoints and outcome measures in ophthalmology. Current measures of disease activity are largely based on subjective clinical estimation, and are relatively insensitive, with poor discrimination and reliability. The development of quantitative imaging in uveitis is most established in the use of optical coherence tomographic (OCT) measurement of central macular thickness (CMT) to measure severity of macular edema (ME). The transformative effect of CMT in clinical assessment of patients with ME provides a paradigm for the development and impact of other forms of quantitative imaging. Quantitative imaging approaches are now being developed and validated for other key inflammatory parameters such as anterior chamber cells, vitreous haze, retinovascular leakage, and chorioretinal infiltrates. As new forms of quantitative imaging in uveitis are proposed, the uveitis community will need to evaluate these tools against the current subjective clinical estimates and reach a new consensus for how disease activity in uveitis should be measured. The development, validation, and adoption of sensitive and discriminatory measures of disease activity is an unmet need that has the potential to transform both drug development and routine clinical care for the patient with uveitis.

  10. Statistical procedures for evaluating daily and monthly hydrologic model predictions

    USGS Publications Warehouse

    Coffey, M.E.; Workman, S.R.; Taraba, J.L.; Fogle, A.W.

    2004-01-01

    The overall study objective was to evaluate the applicability of different qualitative and quantitative methods for comparing daily and monthly SWAT computer model hydrologic streamflow predictions to observed data, and to recommend statistical methods for use in future model evaluations. Statistical methods were tested using daily streamflows and monthly equivalent runoff depths. The statistical techniques included linear regression, Nash-Sutcliffe efficiency, nonparametric tests, t-test, objective functions, autocorrelation, and cross-correlation. None of the methods specifically applied to the non-normal distribution and dependence between data points for the daily predicted and observed data. Of the tested methods, median objective functions, sign test, autocorrelation, and cross-correlation were most applicable for the daily data. The robust coefficient of determination (CD*) and robust modeling efficiency (EF*) objective functions were the preferred methods for daily model results due to the ease of comparing these values with a fixed ideal reference value of one. Predicted and observed monthly totals were more normally distributed, and there was less dependence between individual monthly totals than was observed for the corresponding predicted and observed daily values. More statistical methods were available for comparing SWAT model-predicted and observed monthly totals. The 1995 monthly SWAT model predictions and observed data had a regression R² of 0.70, a Nash-Sutcliffe efficiency of 0.41, and the t-test failed to reject the equal data means hypothesis. The Nash-Sutcliffe coefficient and the R² coefficient were the preferred methods for monthly results due to the ability to compare these coefficients to a set ideal value of one.
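
    The two preferred monthly statistics can be written down directly. A minimal sketch, assuming simple paired arrays of observed and predicted monthly runoff depths (the values below are hypothetical, not the study's data):

```python
import numpy as np

def nash_sutcliffe(observed, predicted):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit; 0.0 means the model
    predicts no better than the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 1.0 - np.sum((observed - predicted) ** 2) / np.sum((observed - observed.mean()) ** 2)

def r_squared(observed, predicted):
    """Coefficient of determination of the observed-versus-predicted regression."""
    r = np.corrcoef(observed, predicted)[0, 1]
    return r ** 2

# Hypothetical monthly runoff depths (mm)
obs = [12.0, 30.5, 22.1, 8.4, 15.9, 40.2]
sim = [10.5, 28.0, 25.3, 9.1, 14.2, 36.8]
print(nash_sutcliffe(obs, sim), r_squared(obs, sim))
```

    Both statistics equal one for a perfect fit, which is the property highlighted above for the preferred methods.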

  11. Bibliometrics: tracking research impact by selecting the appropriate metrics.

    PubMed

    Agarwal, Ashok; Durairajanayagam, Damayanthi; Tatagari, Sindhuja; Esteves, Sandro C; Harlev, Avi; Henkel, Ralf; Roychoudhury, Shubhadeep; Homa, Sheryl; Puchalt, Nicolás Garrido; Ramasamy, Ranjith; Majzoub, Ahmad; Ly, Kim Dao; Tvrda, Eva; Assidi, Mourad; Kesari, Kavindra; Sharma, Reecha; Banihani, Saleem; Ko, Edmund; Abu-Elmagd, Muhammad; Gosalvez, Jaime; Bashiri, Asher

    2016-01-01

    Traditionally, the success of a researcher is assessed by the number of publications he or she publishes in peer-reviewed, indexed, high impact journals. This essential yardstick, often referred to as the impact of a specific researcher, is assessed through the use of various metrics. While researchers may be acquainted with such metrics, many do not know how to use them to enhance their careers. In addition to these metrics, a number of other factors should be taken into consideration to objectively evaluate a scientist's profile as a researcher and academician. Moreover, each metric has its own limitations that need to be considered when selecting an appropriate metric for evaluation. This paper provides a broad overview of the wide array of metrics currently in use in academia and research. Popular metrics are discussed and defined, including traditional metrics and article-level metrics, some of which are applied to researchers for a greater understanding of a particular concept, including varicocele, which is the thematic area of this Special Issue of Asian Journal of Andrology. We recommend the combined use of quantitative and qualitative evaluation using judiciously selected metrics for a more objective assessment of scholarly output and research impact.
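
    As a concrete illustration of one traditional, researcher-level metric of the kind surveyed here, the sketch below computes an h-index from a list of per-paper citation counts; the function and the citation list are illustrative and are not drawn from the article.

```python
def h_index(citations):
    """Largest h such that the researcher has h papers with at least h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1]))  # -> 3
```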

  12. Shearography for Non-destructive Inspection with applications to BAT Mask Tile Adhesive Bonding and Specular Surface Honeycomb Panels

    NASA Technical Reports Server (NTRS)

    Lysak, Daniel B.

    2003-01-01

    The applicability of shearography techniques for non-destructive evaluation in two unique application areas is examined. In the first application, shearography is used to evaluate the quality of adhesive bonds holding lead tiles to the BAT gamma-ray mask for the NASA Swift program. Using vibration excitation, the more poorly bonded tiles are readily identifiable in the shearography image. A quantitative analysis is presented that compares the shearography results with a destructive pull test measuring the force at bond failure. The second application is to evaluate the bonding between the skin and core of a honeycomb structure with a specular (mirror-like) surface. In standard shearography techniques, the object under test must have a diffuse surface to generate the speckle patterns in laser light, which are then sheared. A novel configuration using the specular surface as a mirror to image speckles from a diffuser is presented, opening up the use of shearography to a new class of objects that could not have been examined with the traditional approach. This new technique readily identifies large-scale bond failures in the panel, demonstrating the validity of this approach.

  13. Physiology informed virtual surgical planning: a case study with a virtual airway surgical planner and BioGears

    NASA Astrophysics Data System (ADS)

    Potter, Lucas; Arikatla, Sreekanth; Bray, Aaron; Webb, Jeff; Enquobahrie, Andinet

    2017-03-01

    Stenosis of the upper airway affects approximately 1 in 200,000 adults per year [1], and occurs in neonates as well [2]. Its treatment is often dictated by institutional factors and clinicians' experience or preferences [3]. Objective and quantitative methods of evaluating treatment options hold the potential to improve care in stenosis patients. Virtual surgical planning software tools are critically important for this. The Virtual Pediatric Airway Workbench (VPAW) is a software platform designed and evaluated for upper airway stenosis treatment planning. It incorporates CFD simulation and geometric authoring with objective metrics from both that help in informed evaluation and planning. However, this planner currently lacks physiological information, which could impact the surgical planning outcomes. In this work, we integrated a lumped-parameter, model-based human physiological engine called BioGears with VPAW. We demonstrated the use of this physiology-informed virtual surgical planning platform for patient-specific stenosis treatment planning. The preliminary results show that incorporating patient-specific physiology in the pretreatment plan would play an important role in patient-specific surgical trainers and planners in airway surgery and other types of surgery that are significantly impacted by physiological conditions during surgery.

  14. Bibliometrics: tracking research impact by selecting the appropriate metrics

    PubMed Central

    Agarwal, Ashok; Durairajanayagam, Damayanthi; Tatagari, Sindhuja; Esteves, Sandro C; Harlev, Avi; Henkel, Ralf; Roychoudhury, Shubhadeep; Homa, Sheryl; Puchalt, Nicolás Garrido; Ramasamy, Ranjith; Majzoub, Ahmad; Ly, Kim Dao; Tvrda, Eva; Assidi, Mourad; Kesari, Kavindra; Sharma, Reecha; Banihani, Saleem; Ko, Edmund; Abu-Elmagd, Muhammad; Gosalvez, Jaime; Bashiri, Asher

    2016-01-01

    Traditionally, the success of a researcher is assessed by the number of publications he or she publishes in peer-reviewed, indexed, high impact journals. This essential yardstick, often referred to as the impact of a specific researcher, is assessed through the use of various metrics. While researchers may be acquainted with such metrics, many do not know how to use them to enhance their careers. In addition to these metrics, a number of other factors should be taken into consideration to objectively evaluate a scientist's profile as a researcher and academician. Moreover, each metric has its own limitations that need to be considered when selecting an appropriate metric for evaluation. This paper provides a broad overview of the wide array of metrics currently in use in academia and research. Popular metrics are discussed and defined, including traditional metrics and article-level metrics, some of which are applied to researchers for a greater understanding of a particular concept, including varicocele, which is the thematic area of this Special Issue of Asian Journal of Andrology. We recommend the combined use of quantitative and qualitative evaluation using judiciously selected metrics for a more objective assessment of scholarly output and research impact. PMID:26806079

  15. Validation of voxel-based morphometry (VBM) based on MRI

    NASA Astrophysics Data System (ADS)

    Yang, Xueyu; Chen, Kewei; Guo, Xiaojuan; Yao, Li

    2007-03-01

    Voxel-based morphometry (VBM) is an automated and objective image analysis technique for detecting differences in the regional concentration or volume of brain tissue composition based on structural magnetic resonance (MR) images. VBM has been widely used to evaluate brain morphometric differences between populations, but until now there has been no evaluation system for validating it. In this study, a quantitative and objective evaluation system was established in order to assess VBM performance. We recruited twenty normal volunteers (10 males and 10 females, age range 20-26 years, mean age 22.6 years). First, several focal lesions (hippocampus, frontal lobe, anterior cingulate, back of the hippocampus, back of the anterior cingulate) were simulated in selected brain regions using real MRI data. Second, optimized VBM was performed to detect structural differences between groups. Third, one-way ANOVA and post-hoc tests were used to assess the accuracy and sensitivity of the VBM analysis. The results revealed that VBM was a good detection tool in the majority of brain regions, even in controversial regions such as the hippocampus. Generally speaking, the more severe the focal lesion, the better the VBM performance; the size of the focal lesion, however, had little effect on the VBM analysis.

  16. Applications of Micro-CT scanning in medicine and dentistry: Microstructural analyses of a Wistar Rat mandible and a urinary tract stone

    NASA Astrophysics Data System (ADS)

    Latief, F. D. E.; Sari, D. S.; Fitri, L. A.

    2017-08-01

    High-resolution tomographic imaging by means of x-ray micro-computed tomography (μCT) has been widely utilized for morphological evaluations in dentistry and medicine. The use of μCT follows a standard procedure: image acquisition, reconstruction, processing, evaluation using image analysis, and reporting of results. This paper discusses methods of μCT using a specific scanning device, the Bruker SkyScan 1173 High Energy Micro-CT. We present a description of the general workflow, information on terminology for the measured parameters and corresponding units, and further analyses that can potentially be conducted with this technology. Brief qualitative and quantitative analyses, including basic image processing (VOI selection and thresholding) and measurement of several morphometrical variables (total VOI volume, object volume, percentage of total volume, total VOI surface, object surface, object surface/volume ratio, object surface density, structure thickness, structure separation, total porosity), were conducted on two samples, the mandible of a Wistar rat and a urinary tract stone, to illustrate the abilities of this device and its accompanying software package. The results of these analyses for both samples are reported, along with a discussion of the types of analyses that are possible using digital images obtained with a μCT scanning device, paying particular attention to non-diagnostic ex vivo research applications.

  17. Comprehensive benefit evaluation of direct power-purchase for large consumers

    NASA Astrophysics Data System (ADS)

    Liu, D. N.; Li, Z. H.; Zhou, H. M.; Zhao, Q.; Xu, X. F.

    2017-06-01

    Based on "several opinions of the CPC Central Committee and the State Council on further deepening the reform of electric power system" in 2015, this paper analyses the influence of direct power-purchase for large consumers on operation benefit of power grid. In three aspects, such as economic benefit, cleaning benefit and social benefit, the index system is proposed. In which, the profit of saving coal energy consumption, reducing carbon emissions and reducing pollutant emissions is quantitative calculated. Then the subjective and objective weights and index scores are figured out through the analytic hierarchy process, entropy weight method and interval number method. Finally, the comprehensive benefit is evaluated combined with the actual study, and some suggestions are made.

  18. Using GeoRePORT to report socio-economic potential for geothermal development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Katherine R.; Levine, Aaron

    The Geothermal Resource Portfolio Optimization and Reporting Tool (GeoRePORT, http://en.openei.org/wiki/GeoRePORT) was developed for reporting resource grades and project readiness levels, providing the U.S. Department of Energy a consistent and comprehensible means of evaluating projects. The tool helps funding organizations (1) quantitatively identify barriers, (2) develop measurable goals, (3) objectively evaluate proposals, including contribution to goals, (4) monitor progress, and (5) report portfolio performance. GeoRePORT assesses three categories: geological, technical, and socio-economic. Here, we describe GeoRePORT, then focus on the socio-economic assessment and its applications for assessing deployment potential in the U.S. Socio-economic attributes include land access, permitting, transmission, and market.

  19. A numerical algorithm with preference statements to evaluate the performance of scientists.

    PubMed

    Ricker, Martin

    Academic evaluation committees have become increasingly receptive to using the number of published indexed articles, as well as citations, to evaluate the performance of scientists. It is, however, impossible to develop a stand-alone, objective numerical algorithm for the evaluation of academic activities, because any evaluation necessarily includes subjective preference statements. In a market, prices represent preference statements, but scientists work largely in a non-market context. I propose a numerical algorithm that serves to determine the distribution of reward money in Mexico's evaluation system, using relative prices of scientific goods and services as input. The relative prices would be determined by an evaluation committee. In this way, large evaluation systems (like Mexico's Sistema Nacional de Investigadores) could work semi-automatically, but not arbitrarily or superficially, to determine quantitatively the academic performance of scientists every few years. Data for 73 scientists from the Biology Institute of Mexico's National University are analyzed, and it is shown that the reward assignment and academic priorities depend heavily on those preferences. A maximum number of products or activities to be evaluated is recommended, to encourage quality over quantity.
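
    A minimal sketch of the general idea, under the assumption that the committee supplies relative prices per product category and each scientist reports counts of products; the categories, prices, and budget below are hypothetical, not values from the article.

```python
# Hypothetical relative prices assigned by an evaluation committee
prices = {"indexed_article": 10.0, "book_chapter": 6.0, "phd_graduated": 8.0, "course_taught": 3.0}

# Hypothetical output counts per scientist over the evaluation period
output = {
    "scientist_A": {"indexed_article": 4, "course_taught": 2},
    "scientist_B": {"indexed_article": 2, "book_chapter": 1, "phd_graduated": 1},
}

budget = 100_000.0  # total reward money to distribute

# Score each scientist at the committee's relative prices, then distribute
# the budget in proportion to the scores.
scores = {name: sum(prices[p] * n for p, n in prods.items()) for name, prods in output.items()}
total = sum(scores.values())
rewards = {name: round(budget * s / total, 2) for name, s in scores.items()}
print(scores, rewards)
```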

  20. Occupational hazard evaluation model underground coal mine based on unascertained measurement theory

    NASA Astrophysics Data System (ADS)

    Deng, Quanlong; Jiang, Zhongan; Sun, Yaru; Peng, Ya

    2017-05-01

    In order to study how to comprehensively evaluate the influence of several occupational hazards on miners' physical and mental health, an occupational hazard evaluation indicator system was established on the basis of unascertained measurement theory to support quantitative and qualitative analysis. Each indicator weight was determined by information entropy, and the occupational hazard level was estimated with the credible degree recognition criterion; the evaluation model was programmed in Visual Basic and applied to the comprehensive evaluation of occupational hazards for six posts in an underground coal mine, and the occupational hazard degree was graded. The evaluation results are consistent with the actual situation. The results show that dust and noise are the most significant occupational hazard factors in the coal mine. Excavation face support workers are the most affected, followed by heading machine drivers, coal cutter drivers, and coalface move support workers; the occupational hazard degree of these four types of workers is level II (mild). The occupational hazard degree of ventilation workers and safety inspection workers is level I. The evaluation model can evaluate an underground coal mine objectively and accurately, and can be employed in actual engineering.
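
    The credible degree recognition criterion mentioned above is commonly stated as follows: with hazard grades ordered by severity and a credibility threshold (often 0.6-0.7), a post is assigned the first grade at which the cumulative unascertained measure reaches the threshold. A minimal sketch with a hypothetical measure vector (not the paper's data):

```python
def credible_degree_grade(measures, credibility=0.6):
    """Credible degree recognition: measures is the unascertained measure vector
    over ordered grades; return the index of the first grade whose cumulative
    measure reaches the credibility threshold."""
    cumulative = 0.0
    for k, m in enumerate(measures):
        cumulative += m
        if cumulative >= credibility:
            return k
    return len(measures) - 1

# Hypothetical measure vector over grades I (slight) to IV (severe)
mu = [0.15, 0.50, 0.25, 0.10]
grades = ["I", "II", "III", "IV"]
print(grades[credible_degree_grade(mu)])  # -> "II", i.e. mild occupational hazard
```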

  1. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, evaluation via user-defined feature extractors, and methods to assess the object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements of the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm based on template matching is presented to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on a camouflage effect in different environments. As the results show, the presented methods contribute a significant benefit in the field of camouflage assessment.
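
    The template-matching idea behind a structural-inconspicuity measure can be sketched generically with normalized cross-correlation: the more strongly a crop around the target correlates with its surrounding background texture, the less it stands out structurally. The synthetic images and interpretation below are illustrative assumptions, not the CART implementation.

```python
import cv2
import numpy as np

# Hypothetical grayscale inputs: a small crop around the target object and a
# larger patch of the surrounding background texture (synthetic stand-ins).
rng = np.random.default_rng(0)
background = rng.integers(0, 256, size=(200, 200)).astype(np.uint8)
target = background[80:112, 90:122].copy()  # a 32x32 crop; here trivially "camouflaged"

# Normalized cross-correlation of the target against the background; a high
# peak means the target's structure resembles its surroundings, i.e. the
# object is structurally inconspicuous.
response = cv2.matchTemplate(background, target, cv2.TM_CCOEFF_NORMED)
print("best structural similarity:", float(response.max()))
```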

  2. RNA-based determination of ESR1 and HER2 expression and response to neoadjuvant chemotherapy.

    PubMed

    Denkert, C; Loibl, S; Kronenwett, R; Budczies, J; von Törne, C; Nekljudova, V; Darb-Esfahani, S; Solbach, C; Sinn, B V; Petry, C; Müller, B M; Hilfrich, J; Altmann, G; Staebler, A; Roth, C; Ataseven, B; Kirchner, T; Dietel, M; Untch, M; von Minckwitz, G

    2013-03-01

    Hormone and human epidermal growth factor receptor 2 (HER2) receptors are the most important breast cancer biomarkers, and additional objective and quantitative test methods such as messenger RNA (mRNA)-based quantitative analysis are urgently needed. In this study, we investigated the clinical validity of RT-PCR-based evaluation of estrogen receptor (ESR1) and HER2 mRNA expression. A total of 1050 core biopsies from two retrospective (GeparTrio, GeparQuattro) and one prospective (PREDICT) neoadjuvant studies were evaluated by quantitative RT-PCR for ESR1 and HER2. ESR1 mRNA was significantly predictive for reduced response to neoadjuvant chemotherapy in univariate and multivariate analysis in all three cohorts. The complete pathologically documented response (pathological complete response, pCR) rate for ESR1+/HER2- tumors was 7.3%, 8.0% and 8.6%; for ESR1-/HER2- tumors it was 34.4%, 33.7% and 37.3% in GeparTrio, GeparQuattro and PREDICT, respectively (P < 0.001 in each cohort). In the Kaplan-Meier analysis in GeparTrio patients with ESR1+/HER2- tumors had the best prognosis, compared with ESR1-/HER2- and ESR1-/HER2+ tumors [disease-free survival (DFS): P < 0.0005, overall survival (OS): P < 0.0005]. Our results suggest that mRNA levels of ESR1 and HER2 predict response to neoadjuvant chemotherapy and are significantly associated with long-term outcome. As an additional option to standard immunohistochemistry and gene-array-based analysis, quantitative RT-PCR analysis might be useful for determination of the receptor status in breast cancer.

  3. Process evaluation in a multisite, primary obesity-prevention trial in American Indian schoolchildren

    PubMed Central

    Helitzer, Deborah L; Davis, Sally M; Gittelsohn, Joel; Going, Scott B; Murray, David M; Snyder, Patricia; Steckler, Allan B

    2016-01-01

    We describe the development, implementation, and use of the process evaluation component of a multisite, primary obesity prevention trial for American Indian schoolchildren. We describe the development and pilot testing of the instruments, provide some examples of the criteria for instrument selection, and provide examples of how process evaluation results were used to document and refine intervention components. The theoretical and applied framework of the process evaluation was based on diffusion theory, social learning theory, and the desire for triangulation of multiple modes of data collection. The primary objectives of the process evaluation were to systematically document the training process, content, and implementation of 4 components of the intervention. The process evaluation was developed and implemented collaboratively so that it met the needs of both the evaluators and those who would be implementing the intervention components. Process evaluation results revealed that observation and structured interviews provided the most informative data; however, these methods were the most expensive and time consuming and required the highest level of skill to undertake. Although the literature is full of idealism regarding the uses of process evaluation for formative and summative purposes, in reality, many persons are sensitive to having their work evaluated in such an in-depth, context-based manner as is described. For this reason, use of structured, quantitative, highly objective tools may be more effective than qualitative methods, which appear to be more dependent on the skills and biases of the researcher and the context in which they are used. PMID:10195608

  4. Interventions for preventing neuropathy caused by cisplatin and related compounds.

    PubMed

    Albers, James W; Chaudhry, Vinay; Cavaletti, Guido; Donehower, Ross C

    2014-03-31

    Cisplatin and several related antineoplastic drugs used to treat many types of solid tumours are neurotoxic, and most patients completing a full course of cisplatin chemotherapy develop a clinically detectable sensory neuropathy. Effective neuroprotective therapies have been sought. To examine the efficacy and safety of purported chemoprotective agents to prevent or limit the neurotoxicity of cisplatin and related drugs. On 4 March 2013, we searched the Cochrane Neuromuscular Disease Group Specialized Register, CENTRAL, MEDLINE, EMBASE, LILACS, and CINAHL Plus for randomised trials designed to evaluate neuroprotective agents used to prevent or limit neurotoxicity of cisplatin and related drugs among human patients. We included randomised controlled trials (RCTs) or quasi-RCTs in which the participants received chemotherapy with cisplatin or related compounds, with a potential chemoprotectant (acetylcysteine, amifostine, adrenocorticotrophic hormone (ACTH), BNP7787, calcium and magnesium (Ca/Mg), diethyldithiocarbamate (DDTC), glutathione, Org 2766, oxcarbazepine, or vitamin E) compared to placebo, no treatment, or other treatments. We considered trials in which participants underwent evaluation zero to six months after completing chemotherapy using quantitative sensory testing (the primary outcome) or other measures including nerve conduction studies or neurological impairment rating using validated scales (secondary outcomes). Two review authors assessed each study, extracted the data and reached consensus, according to standard Cochrane methodology. As of 2013, the review includes 29 studies describing nine possible chemoprotective agents, as well as description of two published meta-analyses. Among these trials, there were sufficient data in some instances to combine the results from different studies, most often using data from secondary non-quantitative measures. Nine of the studies were newly included at this update. Few of the included studies were at a high risk of bias overall, although often there was too little information to make an assessment. At least two review authors performed a formal review of an additional 44 articles but we did not include them in the final review for a variety of reasons.Of seven eligible amifostine trials (743 participants in total), one used quantitative sensory testing (vibration perception threshold) and demonstrated a favourable outcome in terms of amifostine neuroprotection, but the vibration perception threshold result was based on data from only 14 participants receiving amifostine who completed the post-treatment evaluation and should be regarded with caution. Furthermore the change measured was subclinical. None of the three eligible Ca/Mg trials (or four trials if a single retrospective study was included) described our primary outcome measures. The four Ca/Mg trials included a total of 886 participants. Of the seven eligible glutathione trials (387 participants), one used quantitative sensory testing but reported only qualitative analyses. Four eligible Org 2766 trials (311 participants) employed quantitative sensory testing but reported disparate results; meta-analyses of three of these trials using comparable measures showed no significant vibration perception threshold neuroprotection. The remaining trial reported only descriptive analyses. Similarly, none of the three eligible vitamin E trials (246 participants) reported quantitative sensory testing. 
The eligible single trials involving acetylcysteine (14 participants), diethyldithiocarbamate (195 participants), oxcarbazepine (32 participants), and retinoic acid (92 participants) did not perform quantitative sensory testing. In all, this review includes data from 2906 participants. However, only seven trials reported data for the primary outcome measure of this review (quantitative sensory testing), and only nine trials reported our objective secondary measure, nerve conduction test results. Additionally, methodological heterogeneity precluded pooling of the results in most cases. Nonetheless, a larger number of trials reported the results of secondary (non-quantitative and subjective) measures such as the National Cancer Institute Common Toxicity Criteria (NCI-CTC) for neuropathy (15 trials), and these results were pooled and reported as meta-analyses. Amifostine showed a significantly reduced risk of developing neurotoxicity of NCI-CTC (or equivalent) grade ≥ 2 compared to placebo (RR 0.26, 95% CI 0.11 to 0.61). Glutathione was also efficacious with an RR of 0.29 (95% CI 0.10 to 0.85). In three vitamin E studies, subjective measures not suitable for combination in meta-analysis each favoured vitamin E. For other interventions the qualitative toxicity measures were either negative (N-acetyl cysteine, Ca/Mg, DDTC and retinoic acid) or not evaluated (oxcarbazepine and Org 2766). Adverse events were infrequent or not reported for most interventions. Amifostine was associated with transient hypotension in 8% to 62% of participants, retinoic acid with hypocalcaemia in 11%, and approximately 20% of participants withdrew from treatment with DDTC because of toxicity. At present, the data are insufficient to conclude that any of the purported chemoprotective agents (acetylcysteine, amifostine, calcium and magnesium, diethyldithiocarbamate, glutathione, Org 2766, oxcarbazepine, retinoic acid, or vitamin E) prevent or limit the neurotoxicity of platin drugs among human patients, as determined using quantitative, objective measures of neuropathy. Amifostine, calcium and magnesium, glutathione, and vitamin E showed modest but promising (borderline statistically significant) results favouring their ability to reduce the neurotoxicity of cisplatin and related chemotherapies, as measured using secondary, non-quantitative and subjective measures such as the NCI-CTC neuropathy grading scale. Among these interventions, the efficacy of only vitamin E was evaluated using quantitative nerve conduction studies; the results were negative and did not support the positive findings based on the qualitative measures. In summary, the present studies are limited by the small number of participants receiving any particular agent, a lack of objective measures of neuropathy, and differing results among similar trials, which make it impossible to conclude that any of the neuroprotective agents tested prevent or limit the neurotoxicity of platinum drugs.

  5. Development of a quantitative assessment method of pigmentary skin disease using ultraviolet optical imaging.

    PubMed

    Lee, Onseok; Park, Sunup; Kim, Jaeyoung; Oh, Chilhwan

    2017-11-01

    The visual scoring method has been used as a subjective evaluation of pigmentary skin disorders. The severity of pigmentary skin disease, especially melasma, is evaluated using a visual scoring method, the MASI (melasma area and severity index). This study differentiates between epidermal and dermal pigmented disease and was undertaken to determine methods to quantitatively measure the severity of pigmentary skin disorders under ultraviolet illumination. The optical imaging system consists of illumination (white LED, UV-A lamp) and image acquisition (DSLR camera, air-cooled CMOS CCD camera). Each camera is equipped with a polarizing filter to remove glare. To analyze images under visible and UV light, images are divided into the frontal, cheek, and chin regions of melasma patients, and each image undergoes image processing. To reduce the curvature error in facial contours, a gradient mask is used. The new method of segmenting frontal and lateral facial images is more objective for face area measurement than the MASI score, and image analysis of darkness and homogeneity is adequate to quantify the conventional MASI score. Under visible light, active lesion margins appear in both epidermal and dermal melanin, whereas under UV light melanin is found in the epidermis. This study objectively analyzes the severity of melasma and attempts to develop new methods of image analysis with ultraviolet optical imaging equipment. Based on the results of this study, our optical imaging system could be used as a valuable tool to assess the severity of pigmentary skin disease. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. Fluorescence imaging of tryptophan and collagen cross-links to evaluate wound closure ex vivo

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Ortega-Martinez, Antonio; Farinelli, Bill; Anderson, R. R.; Franco, Walfre

    2016-02-01

    Wound size is a key parameter in monitoring healing. Current methods to measure wound size are often subjective, time-consuming and marginally invasive. Recently, we developed a non-invasive, non-contact, fast and simple but robust fluorescence imaging (u-FEI) method to monitor the healing of skin wounds. This method exploits the fluorescence of molecules native to tissue as functional and structural markers. The objective of the present study is to demonstrate the feasibility of using variations in the fluorescence intensity of tryptophan and collagen cross-links to evaluate the proliferation of keratinocyte cells and to quantify wound size during healing, respectively. Circular dermal wounds were created in ex vivo human skin and cultured in different media. Two serial fluorescence images of tryptophan and collagen cross-links were acquired every two days. Histology and immunohistology were used to validate the correlation between fluorescence and epithelialization. Images of collagen cross-links show the fluorescence of the exposed dermis and, hence, are a measure of wound area. Images of tryptophan show higher fluorescence intensity of proliferating keratinocytes forming new epithelium, as compared to surrounding keratinocytes not involved in epithelialization. These images are complementary, since collagen cross-links report on structure while tryptophan reports on function. H&E staining and immunohistology show that tryptophan fluorescence correlates with newly formed epidermis. We have established a fluorescence imaging method for studying epithelialization processes during wound healing in a skin organ culture model; our approach has the potential to provide a non-invasive, non-contact, quick, objective and direct method for quantitative measurements of wound healing in vivo.

  7. Fiber optic based multiparametric spectroscopy in vivo: Toward a new quantitative tissue vitality index

    NASA Astrophysics Data System (ADS)

    Kutai-Asis, Hofit; Barbiro-Michaely, Efrat; Deutsch, Assaf; Mayevsky, Avraham

    2006-02-01

    In our previous publication (Mayevsky et al., SPIE 5326: 98-105, 2004) we described a multiparametric fiber optic system enabling the evaluation of four physiological parameters as indicators of tissue vitality. Since the correlation between the various parameters may differ under various pathophysiological conditions, there is a need for an objective quantitative index that integrates the relative changes measured in real time by the multiparametric monitoring system into a single number, a vitality index. Such an approach to calculating a tissue vitality index is critical for the possibility of using such an instrument in clinical environments. In the current presentation we report preliminary results indicating that calculation of an objective tissue vitality index is feasible. We used an intuitive empirical approach based on comparison between the index calculated by the computer and the subjective evaluation made by an expert in the field of physiological monitoring. We used the in vivo rat brain as an animal model in the current studies. The rats were exposed to anoxia, ischemia and cortical spreading depression, and the responses were recorded in real time. At the end of the monitoring session the results were analyzed and the tissue vitality index was calculated offline. Mitochondrial NADH, tissue blood flow and oxy-hemoglobin were used to calculate the vitality index of the brain in vivo, where each parameter received a different weight in each experiment type, based on its significance. It was found that the mitochondrial NADH response was the main factor affecting the calculated vitality index.
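
    A minimal sketch of a weighted-sum index of the kind described, combining relative changes in mitochondrial NADH, tissue blood flow, and oxy-hemoglobin; the weights, sign convention, and example values are illustrative assumptions, not the authors' calibration.

```python
def vitality_index(delta_nadh, delta_tbf, delta_hbo2, weights=(0.5, 0.3, 0.2)):
    """Combine relative changes (fractions of baseline) of mitochondrial NADH,
    tissue blood flow and oxy-hemoglobin into a single 0-100 style index in
    which a rise in NADH (more reduced mitochondria) lowers vitality."""
    w_nadh, w_tbf, w_hbo2 = weights
    return 100.0 * (1.0 - w_nadh * delta_nadh + w_tbf * delta_tbf + w_hbo2 * delta_hbo2)

# Hypothetical ischemic episode: NADH up 40%, blood flow down 60%, HbO2 down 30%
print(vitality_index(0.40, -0.60, -0.30))  # -> 56.0
```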

  8. Evaluation of Stratospheric Transport in New 3D Models Using the Global Modeling Initiative Grading Criteria

    NASA Technical Reports Server (NTRS)

    Strahan, Susan E.; Douglass, Anne R.; Einaudi, Franco (Technical Monitor)

    2001-01-01

    The Global Modeling Initiative (GMI) Team developed objective criteria for model evaluation in order to identify the best representation of the stratosphere. This work created a method to quantitatively and objectively discriminate between different models. In the original GMI study, three different meteorological data sets were used to run an offline chemistry and transport model (CTM). Observationally based grading criteria were derived and applied to these simulations, various aspects of stratospheric transport were evaluated, and grades were assigned. Here we report on the application of the GMI evaluation criteria to CTM simulations integrated with a new assimilated wind data set and a new general circulation model (GCM) wind data set. The Finite Volume Community Climate Model (FV-CCM) is a new GCM developed at Goddard which uses the NCAR CCM physics and the Lin and Rood advection scheme. The FV-Data Assimilation System (FV-DAS) is a new data assimilation system which uses the FV-CCM as its core model. One-year CTM simulations at 2.5 degrees longitude by 2 degrees latitude resolution were run for each wind data set. We present the evaluation of temperature and annual transport cycles in the lower and middle stratosphere in the two new CTM simulations, including an evaluation of high-latitude transport, which was not part of the original GMI criteria. Grades for the new simulations will be compared with those assigned during the original GMI evaluations, and areas of improvement will be identified.

  9. A fuzzy logic expert system for evaluating policy progress towards sustainability goals.

    PubMed

    Cisneros-Montemayor, Andrés M; Singh, Gerald G; Cheung, William W L

    2017-12-16

    Evaluating progress towards environmental sustainability goals can be difficult due to a lack of measurable benchmarks and insufficient or uncertain data. Marine settings are particularly challenging, as stakeholders and objectives tend to be less well defined and ecosystem components have high natural variability and are difficult to observe directly. Fuzzy logic expert systems are useful analytical frameworks to evaluate such systems, and we develop such a model here to formally evaluate progress towards sustainability targets based on diverse sets of indicators. Evaluation criteria include recent (since policy enactment) and historical (from earliest known state) change, type of indicators (state, benefit, pressure, response), time span and spatial scope, and the suitability of an indicator in reflecting progress toward a specific objective. A key aspect of the framework is that all assumptions are transparent and modifiable to fit different social and ecological contexts. We test the method by evaluating progress towards four Aichi Biodiversity Targets in Canadian oceans, including quantitative progress scores, information gaps, and the sensitivity of results to model and data assumptions. For Canadian marine systems, national protection plans and biodiversity awareness show good progress, but species and ecosystem states overall do not show strong improvement. Well-defined goals are vital for successful policy implementation, as ambiguity allows for conflicting potential indicators, which in natural systems increases uncertainty in progress evaluations. Importantly, our framework can be easily adapted to assess progress towards policy goals with different themes, globally or in specific regions.
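
    The scoring step of such a framework can be illustrated with a toy fuzzification and defuzzification example: a normalised indicator value is mapped through overlapping membership functions and collapsed back to a single progress score. The linguistic sets, membership shapes, and indicator value below are illustrative assumptions and do not reproduce the authors' rule base.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return float(np.clip(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0, 1.0))

def progress_score(indicator_value):
    """Fuzzify a normalised indicator (0-1) into 'poor'/'fair'/'good' and
    defuzzify using a representative score for each set."""
    memberships = {
        "poor": tri(indicator_value, -0.01, 0.0, 0.5),
        "fair": tri(indicator_value, 0.0, 0.5, 1.0),
        "good": tri(indicator_value, 0.5, 1.0, 1.01),
    }
    representative = {"poor": 0.0, "fair": 0.5, "good": 1.0}
    total = sum(memberships.values())
    return sum(memberships[k] * representative[k] for k in memberships) / total

print(progress_score(0.7))  # hypothetical indicator of recent improvement
```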

  10. Clusters of Insomnia Disorder: An Exploratory Cluster Analysis of Objective Sleep Parameters Reveals Differences in Neurocognitive Functioning, Quantitative EEG, and Heart Rate Variability

    PubMed Central

    Miller, Christopher B.; Bartlett, Delwyn J.; Mullins, Anna E.; Dodds, Kirsty L.; Gordon, Christopher J.; Kyle, Simon D.; Kim, Jong Won; D'Rozario, Angela L.; Lee, Rico S.C.; Comas, Maria; Marshall, Nathaniel S.; Yee, Brendon J.; Espie, Colin A.; Grunstein, Ronald R.

    2016-01-01

    Study Objectives: To empirically derive and evaluate potential clusters of Insomnia Disorder through cluster analysis from polysomnography (PSG). We hypothesized that clusters would differ on neurocognitive performance, sleep-onset measures of quantitative (q)-EEG and heart rate variability (HRV). Methods: Research volunteers with Insomnia Disorder (DSM-5) completed a neurocognitive assessment and overnight PSG measures of total sleep time (TST), wake time after sleep onset (WASO), and sleep onset latency (SOL) were used to determine clusters. Results: From 96 volunteers with Insomnia Disorder, cluster analysis derived at least two clusters from objective sleep parameters: Insomnia with normal objective sleep duration (I-NSD: n = 53) and Insomnia with short sleep duration (I-SSD: n = 43). At sleep onset, differences in HRV between I-NSD and I-SSD clusters suggest attenuated parasympathetic activity in I-SSD (P < 0.05). Preliminary work suggested three clusters by retaining the I-NSD and splitting the I-SSD cluster into two: I-SSD A (n = 29): defined by high WASO and I-SSD B (n = 14): a second I-SSD cluster with high SOL and medium WASO. The I-SSD B cluster performed worse than I-SSD A and I-NSD for sustained attention (P ≤ 0.05). In an exploratory analysis, q-EEG revealed reduced spectral power also in I-SSD B before (Delta, Alpha, Beta-1) and after sleep-onset (Beta-2) compared to I-SSD A and I-NSD (P ≤ 0.05). Conclusions: Two insomnia clusters derived from cluster analysis differ in sleep onset HRV. Preliminary data suggest evidence for three clusters in insomnia with differences for sustained attention and sleep-onset q-EEG. Clinical Trial Registration: Insomnia 100 sleep study: Australia New Zealand Clinical Trials Registry (ANZCTR) identification number 12612000049875. URL: https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=347742. Citation: Miller CB, Bartlett DJ, Mullins AE, Dodds KL, Gordon CJ, Kyle SD, Kim JW, D'Rozario AL, Lee RS, Comas M, Marshall NS, Yee BJ, Espie CA, Grunstein RR. Clusters of Insomnia Disorder: an exploratory cluster analysis of objective sleep parameters reveals differences in neurocognitive functioning, quantitative EEG, and heart rate variability. SLEEP 2016;39(11):1993–2004. PMID:27568796
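
    A generic sketch of the clustering step: standardize the three PSG-derived parameters and partition volunteers into two clusters. The choice of k-means and the toy values below are assumptions for illustration, not the study's data or its specific clustering algorithm.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical PSG measures per volunteer: [TST (min), WASO (min), SOL (min)]
psg = np.array([
    [430, 35, 15], [410, 50, 20], [300, 120, 25],
    [320, 110, 30], [290, 60, 90], [310, 70, 80],
])

X = StandardScaler().fit_transform(psg)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # cluster assignments, e.g. normal vs short objective sleep duration
```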

  11. Quantitative ptychographic reconstruction by applying a probe constraint

    NASA Astrophysics Data System (ADS)

    Reinhardt, J.; Schroer, C. G.

    2018-04-01

    The coherent scanning technique X-ray ptychography has become a routine tool for high-resolution imaging and nanoanalysis in various fields of research such as chemistry, biology and materials science. The ptychographic reconstruction results are often analysed in order to yield absolute quantitative values for the object transmission and the illuminating probe function. In this work, we address a common ambiguity encountered in scaling the object transmission and probe intensity by applying an additional constraint in the reconstruction algorithm. A ptychographic measurement of a model sample containing nanoparticles is used as a test data set against which to benchmark the reconstruction results, depending on the type of constraint used. Achieving quantitative absolute values for the reconstructed object transmission is essential for the advanced investigation of samples that change over time, e.g., during in-situ experiments, or more generally when different data sets are compared.
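
    One common way to remove the object/probe scaling ambiguity is to rescale the reconstructed probe to a known incident photon flux after each update and to scale the object inversely, so that the modelled exit waves are unchanged. The sketch below illustrates that generic idea only; the constraint actually applied in the paper may differ, and the array sizes and flux value are arbitrary.

```python
import numpy as np

def apply_probe_power_constraint(probe, obj, incident_flux):
    """Rescale the complex probe so that its integrated intensity equals the
    known incident flux, and scale the complex object inversely so that the
    modelled exit waves (object * probe) are unchanged."""
    current_flux = np.sum(np.abs(probe) ** 2)
    s = np.sqrt(incident_flux / current_flux)
    return probe * s, obj / s

# Hypothetical complex probe/object arrays and a flux measured without the sample
rng = np.random.default_rng(0)
probe = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
obj = np.ones((128, 128), dtype=complex)
probe, obj = apply_probe_power_constraint(probe, obj, incident_flux=1e8)
print(np.sum(np.abs(probe) ** 2))  # -> 1e8
```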

  12. An evaluation of beta-hydroxybutyrate in milk and blood for prediction of subclinical ketosis in dairy cows.

    PubMed

    Samiei, A; Liang, J B; Ghorbani, G R; Hirooka, H; Yaakub, H; Tabatabaei, M

    2010-01-01

    The first objective of this study was to investigate the relationship between concentrations of beta-hydroxybutyrate (BHBA) in milk and blood in order to assess the reliability of milk BHBA concentrations, measured with a semi-quantitative keto-test paper, for detecting subclinical ketosis (SCK) in 50 fresh high-producing Iranian Holstein cows in Golestan Province, Iran. The second objective was to determine the effects of SCK on milk yield and composition. Concentrations of nonesterified fatty acids (NEFA) and BHBA were analyzed quantitatively in blood plasma, and commercial keto-test paper was used for semi-quantitative determination of the BHBA concentration in milk. Milk yield was measured until 60 d after calving, whereas milk composition was measured until 30 d after calving. The mean plasma BHBA, milk BHBA, plasma NEFA, milk yield, milk fat percentage and milk fat:protein ratio were 1,234 micromol/L, 145 micromol/L, 0.482 mEq/L, 29.5 kg, 3.9% and 1.4, respectively. Fifty-eight percent of the cows had SCK during the first month of lactation. High correlation coefficients were observed between blood BHBA and blood NEFA, and between blood and milk BHBA. The milk yield of cattle with SCK decreased (P < 0.01), but the fat percentage and milk fat:protein ratio increased (P < 0.01). The commercial keto-test paper had a low false-positive rate at a cut-off point of 200 micromol of BHBA/L of milk. The results showed that the best times to assess SCK using the commercial keto-test paper were d 10, 14 and 17 after calving.

  13. Lifestyle Factors and Visible Skin Aging in a Population of Japanese Elders

    PubMed Central

    Asakura, Keiko; Nishiwaki, Yuji; Milojevic, Ai; Michikawa, Takehiro; Kikuchi, Yuriko; Nakano, Makiko; Iwasawa, Satoko; Hillebrand, Greg; Miyamoto, Kukizo; Ono, Masaji; Kinjo, Yoshihide; Akiba, Suminori; Takebayashi, Toru

    2009-01-01

    Background The number of studies that use objective and quantitative methods to evaluate facial skin aging in elderly people is extremely limited, especially in Japan. Therefore, in this cross-sectional study we attempted to characterize the condition of facial skin (hyperpigmentation, pores, texture, and wrinkling) in Japanese adults aged 65 years or older by using objective and quantitative imaging methods. In addition, we aimed to identify lifestyle factors significantly associated with these visible signs of aging. Methods The study subjects were 802 community-dwelling Japanese men and women aged at least 65 years and living in the town of Kurabuchi (Takasaki City, Gunma Prefecture, Japan), a mountain community with a population of approximately 4800. The facial skin condition of subjects was assessed quantitatively using a standardized facial imaging system and subsequent computer image analysis. Lifestyle information was collected using a structured questionnaire. The association between skin condition and lifestyle factors was examined using multivariable regression analysis. Results Among women, the mean values for facial texture, hyperpigmentation, and pores were generally lower than those among age-matched men. There was no significant difference between sexes in the severity of facial wrinkling. Older age was associated with worse skin condition among women only. After adjusting for age, smoking status and topical sun protection were significantly associated with skin condition among both men and women. Conclusions Our study revealed significant differences between sexes in the severity of hyperpigmentation, texture, and pores, but not wrinkling. Smoking status and topical sun protection were significantly associated with signs of visible skin aging in this study population. PMID:19700917

  14. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    PubMed

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

    Risk Assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has focused on modeling techniques like Bayesian Networks because of their capability of (1) combining, within a probabilistic framework, different types of evidence, including both expert judgments and objective data; (2) overturning previous beliefs in the light of new information being received; and (3) making predictions even with incomplete data. In this work, we propose a comparison among Bayesian Networks and other classical Quantitative Risk Assessment techniques such as Neural Networks, Classification Trees, Random Forests and Logistic Regression models. Hybrid approaches, combining both Classification Trees and Bayesian Networks, were also considered. Among Bayesian Networks, a clear distinction is made between the purely data-driven approach and the combination of expert knowledge with objective data. The aim of this paper is to evaluate which of these models can best be applied, in the framework of Quantitative Risk Assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be of great usefulness in addressing product safety design regulation. Data from the European Registry of Foreign Bodies Injuries formed the starting evidence for risk assessment. Results showed that Bayesian Networks had both ease of interpretability and accuracy in making predictions, even if simpler models like logistic regression still performed well. © The Author(s) 2013.
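
    A generic sketch of how such a model comparison can be run: cross-validate two of the named classifiers on a synthetic stand-in dataset and compare a common performance score. The features, outcome, and resulting scores are hypothetical and do not reproduce the registry data or the paper's models.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

# Hypothetical product features (e.g. size, shape, consistency codes) and a
# binary injury-severity outcome; the registry variables are not reproduced here.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(name, scores.mean().round(3))  # mean cross-validated AUC per model
```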

  15. 40 CFR 35.102 - Definitions of terms.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...

  16. 40 CFR 35.102 - Definitions of terms.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...

  17. 40 CFR 35.102 - Definitions of terms.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...

  18. 40 CFR 35.102 - Definitions of terms.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...

  19. 40 CFR 35.102 - Definitions of terms.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...

  20. Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.

    NASA Technical Reports Server (NTRS)

    Leonard, Desiree M.

    1991-01-01

    Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in the three main phases of the application, which are categorized as image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
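
    The object-recognition and quantitative-analysis phases can be illustrated in a few lines with standard tools: connected-component labelling followed by per-object area and centroid measurements. The sketch below uses scipy.ndimage rather than the package described in the report, and the toy binary image is hypothetical.

```python
import numpy as np
from scipy import ndimage

# Hypothetical binary image containing two objects
image = np.zeros((10, 10), dtype=int)
image[1:4, 1:4] = 1          # a 3x3 square
image[6:9, 5:9] = 1          # a 3x4 rectangle

# Object recognition: label connected components
labels, count = ndimage.label(image)

# Quantitative analysis: area (pixel count) and centroid per object
index = list(range(1, count + 1))
areas = ndimage.sum(image, labels, index=index)
centroids = ndimage.center_of_mass(image, labels, index=index)
print(count, areas, centroids)
```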

  1. Outcome assessment for spasticity management in the patient with traumatic brain injury: the state of the art.

    PubMed

    Elovic, Elie P; Simone, Lisa K; Zafonte, Ross

    2004-01-01

    The objectives of this article were to (1) review the engineering and medical literature to structure the available information concerning the assessment of spasticity in the neurological population; (2) discuss the strengths and weaknesses of the different methods currently in use for spasticity assessment; and (3) make recommendations for future efforts in spasticity outcome assessment. Spasticity textbooks, Web sites, and OVID, IEEE, and Medline searches from 1966 through 2003 for spasticity, quantitative measures, or outcome assessment in the rehabilitation population were used as data sources. Over 500 articles were reviewed. Articles that discussed outcome measures used to assess interventions and evaluate spasticity were included. The authors reviewed the articles, examining inclusion criteria, data collection, methodology, assessment methods, and conclusions for validity and relevance. Clinical relevance, real-world function, lack of objectivity, and the time consumed during administration are important issues for spasticity assessment. Some measures, such as the Ashworth Scale, remain in common use because of their ease of use, despite their obvious functional limitations. More functional outcome measures are hampered by being more time-consuming and by a general inability to demonstrate changes after an intervention; this may be secondary to the other factors that combine with spasticity to cause dysfunction at that level. Quantitative metrics can provide more objective measurements, but their clinical relevance is sometimes problematic. The assessment of spasticity outcomes remains somewhat problematic, and further work is necessary to develop measures that have real-world functional significance to both the individuals being treated and the clinicians. A lack of objectivity is still a problem. In the future it will be important for clinicians and engineers to work together on the development of better outcome measures.

  2. Generating standardized image data for testing and calibrating quantification of volumes, surfaces, lengths, and object counts in fibrous and porous materials using X-ray microtomography.

    PubMed

    Jiřík, Miroslav; Bartoš, Martin; Tomášek, Petr; Malečková, Anna; Kural, Tomáš; Horáková, Jana; Lukáš, David; Suchý, Tomáš; Kochová, Petra; Hubálek Kalbáčová, Marie; Králíčková, Milena; Tonar, Zbyněk

    2018-06-01

    Quantification of the structure and composition of biomaterials using micro-CT requires image segmentation due to the low contrast and overlapping radioopacity of biological materials. The amount of bias introduced by segmentation procedures is generally unknown. We aim to develop software that generates three-dimensional models of fibrous and porous structures with known volumes, surfaces, lengths, and object counts in fibrous materials and to provide a software tool that calibrates quantitative micro-CT assessments. Virtual image stacks were generated using the newly developed software TeIGen, enabling the simulation of micro-CT scans of unconnected tubes, connected tubes, and porosities. A realistic noise generator was incorporated. Forty image stacks were evaluated using micro-CT, and the error between the true known and estimated data was quantified. Starting with geometric primitives, the error of the numerical estimation of surfaces and volumes was eliminated, thereby enabling the quantification of volumes and surfaces of colliding objects. Analysis of the sensitivity of the thresholding upon parameters of generated testing image sets revealed the effects of decreasing resolution and increasing noise on the accuracy of the micro-CT quantification. The size of the error increased with decreasing resolution when the voxel size exceeded 1/10 of the typical object size, which simulated the effect of the smallest details that could still be reliably quantified. Open-source software for calibrating quantitative micro-CT assessments by producing and saving virtually generated image data sets with known morphometric data was made freely available to researchers involved in morphometry of three-dimensional fibrillar and porous structures in micro-CT scans. © 2018 Wiley Periodicals, Inc.
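
    The calibration idea, generating a digital object whose true morphometric values are known analytically and comparing them with values estimated from the voxel data, can be illustrated with a much simpler stand-in than TeIGen. A minimal sketch, assuming a single solid cylinder voxelized on an isotropic grid; the radius, length, and voxel size are arbitrary choices, not values from the paper.

```python
import numpy as np

# Voxelize a solid cylinder of known radius and length on an isotropic grid
voxel_size = 0.01            # mm per voxel edge (hypothetical)
radius, length = 0.25, 1.0   # mm

nz = int(length / voxel_size)
nxy = int(2 * radius / voxel_size) + 4
y, x = np.ogrid[:nxy, :nxy]
cx = cy = nxy / 2.0
disk = (x + 0.5 - cx) ** 2 + (y + 0.5 - cy) ** 2 <= (radius / voxel_size) ** 2
stack = np.broadcast_to(disk, (nz, nxy, nxy))

estimated_volume = stack.sum() * voxel_size ** 3          # voxel-counted volume
true_volume = np.pi * radius ** 2 * length                # analytic ground truth
print(estimated_volume, true_volume, abs(estimated_volume - true_volume) / true_volume)
```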

  3. Quantitative parameters of CT texture analysis as potential markers for early prediction of spontaneous intracranial hemorrhage enlargement.

    PubMed

    Shen, Qijun; Shan, Yanna; Hu, Zhengyu; Chen, Wenhui; Yang, Bing; Han, Jing; Huang, Yanfang; Xu, Wen; Feng, Zhan

    2018-04-30

    To objectively quantify intracranial hematoma (ICH) enlargement by analysing the image texture of head CT scans, and to provide objective and quantitative imaging parameters for predicting early hematoma enlargement. We retrospectively studied 108 ICH patients with baseline non-contrast computed tomography (NCCT) and 24-h follow-up CT available. Image data were assessed by a chief radiologist and a resident radiologist, and consistency between the observers was tested. The patients were divided into a training set (75%) and a validation set (25%) by stratified sampling. Patients in the training set were dichotomized according to 24-h hematoma expansion ≥ 33%. Using the Laplacian of Gaussian bandpass filter, we chose different anatomical spatial domains ranging from fine to coarse texture to obtain a series of derived parameters (mean grayscale intensity, variance, uniformity) in order to quantify and evaluate all data. The parameters were externally validated on the validation set. Significant differences were found between the two groups of patients in variance at V1.0 and in uniformity at U1.0, U1.8 and U2.5. The intraclass correlation coefficients for the texture parameters were between 0.67 and 0.99. The area under the ROC curve between the two groups of ICH cases was between 0.77 and 0.92. The accuracy of CTTA on the validation set was 0.59-0.85. NCCT texture analysis can objectively quantify the heterogeneity of ICH and independently predict early hematoma enlargement. • Heterogeneity is helpful in predicting ICH enlargement. • CTTA could play an important role in predicting early ICH enlargement. • After filtering, fine texture had the best diagnostic performance. • The histogram-based uniformity parameters can independently predict ICH enlargement. • CTTA is more objective, more comprehensive, and more independently operable than previous methods.
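
    A minimal sketch of the filtration-histogram approach described above: Laplacian-of-Gaussian filtering of the hematoma region at a chosen spatial scale, followed by the mean, variance, and histogram uniformity of the filtered values. The sigma value, bin count, and synthetic region of interest are illustrative assumptions, not the study's settings or data.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def texture_parameters(roi, sigma, bins=64):
    """Filtration-histogram texture features inside a CT region of interest:
    Laplacian-of-Gaussian filtering at spatial scale sigma (in voxels),
    followed by mean, variance and histogram uniformity of the filtered values."""
    filtered = gaussian_laplace(roi.astype(float), sigma=sigma)
    hist, _ = np.histogram(filtered, bins=bins)
    p = hist / hist.sum()
    uniformity = np.sum(p ** 2)
    return filtered.mean(), filtered.var(), uniformity

# Hypothetical 2D hematoma region of interest in Hounsfield units
roi = np.random.default_rng(0).normal(loc=60, scale=8, size=(40, 40))
print(texture_parameters(roi, sigma=1.0))
```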

  4. Using a modified Learning Potential Assessment Device and Mediated Learning Experiences to Assess Minority Student Progress and Program Goals in an Undergraduate Research Based Geoscience Program Serving American Indians

    NASA Astrophysics Data System (ADS)

    Mitchell, L. W.

    2002-12-01

    During the initiation of a new program at the University of North Dakota designed to promote American Indians to engage in geoscience research and complete geoscience related degrees, an evaluation procedure utilizing a modified Learning Potential Assessment Device (LPAD) and Mediated Learning Experiences (MLE) to assess minority student progress was implemented. The program, called Indians Into Geosciences (INGEOS), utilized a modified form of the Learning Potential Assessment Device first to assess cultural factors, determination, and other baseline information, and second, utilized a series of Mediated Learning Experiences to enhance minority students' opportunities in a culturally appropriate, culturally diverse, and scientifically challenging manner in an effort to prepare students for competitive research careers in the geosciences. All of the LPADs and MLEs corresponded directly to the three goals or eight objectives of INGEOS. The three goals of the INGEOS program are: 1) increasing the number of American Indians earning degrees at all levels, 2) engaging American Indians in challenging and technically based scientific research, and 3) preparing American Indians for successful geoscience careers through multicultural community involvement. The eight objectives of the INGEOS program, called the Eight Points of Success, are: 1) spiritual health, 2) social health, 3) physical health, 4) mental health, 5) financial management, 6) research involvement, 7) technical exposure, and 8) multicultural community education. The INGEOS program goals were evaluated strictly quantitatively utilizing a variety of data sources such as grade point averages, number of credits earned, research project information, and developed products. The INGEOS Program goals reflected a combined quantitative score of all participants, whereas the objectives reflected qualitative measures and are specific for each INGEOS participant. Initial results indicate that those participants which show progress through Mediated Learning Experiences within all of the Eight Points of Success, have a higher likelihood of contributing to all three of the INGEOS programs goals.

  5. Quantitative Assessment of In-solution Digestion Efficiency Identifies Optimal Protocols for Unbiased Protein Analysis*

    PubMed Central

    León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.

    2013-01-01

    The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921

  6. Use of the Oslo-Potsdam Solution to test the effect of an environmental education model on tangible measures of environmental protection

    NASA Astrophysics Data System (ADS)

    Short, Philip Craig

    The fundamental goals of environmental education include the creation of an environmentally literate citizenry possessing the knowledge, skills, and motivation to objectively analyze environmental issues and engage in responsible behaviors leading to issue resolution and improved or maintained environmental quality. No existing research, however, has linked educational practices and environmental protection. In an original attempt to quantify the pedagogy - environmental protection relationship, both qualitative and quantitative methods were used to investigate local environmental records and environmental quality indices that reflected the results of student actions. The data were analyzed using an educational adaptation of the "Oslo-Potsdam Solution for International Environmental Regime Effectiveness." The new model, termed the Environmental Education Performance Indicator (EEPI), was developed and evaluated as a quantitative tool for testing and fairly comparing the efficacy of student-initiated environmental projects in terms of environmental quality measures. Five case studies were developed from descriptions of student actions and environmental impacts as revealed by surveys and interviews with environmental education teachers using the IEEIA (Investigating and Evaluating Environmental Issues and Actions) curriculum, former students, community members, and agency officials. Archival information was also used to triangulate the data. In addition to evaluating case study data on the basis of the EEPI model, an expert panel of evaluators consisting of professionals from environmental education, natural sciences, environmental policy, and environmental advocacy provided subjective assessments on the effectiveness of each case study. The results from this study suggest that environmental education interventions can equip and empower students to act on their own conclusions in a manner that leads to improved or maintained environmental conditions. The EEPI model shows promise in providing a more consistent, accurate and objective evaluation than is possible with subjective analysis. Recommendations are offered to guide further research on establishing the environmental education - environmental quality link. Ultimately, a research framework for determining which educational strategies are most effectively linked to demonstrable environmental quality outcomes will have utility in both educational and public policy arenas.

  7. [Evaluation of YAG-laser vitreolysis effectiveness based on quantitative characterization of vitreous floaters].

    PubMed

    Shaimova, V A; Shaimov, T B; Shaimov, R B; Galin, A Yu; Goloshchapova, Zh A; Ryzhkov, P K; Fomin, A V

    2018-01-01

    To develop methods for evaluating effectiveness of YAG-laser vitreolysis of vitreous floaters. The study included 144 patients (173 eyes) who had underwent YAG-laser vitreolysis and were under observation from 01.09.16 to 31.01.18. The patients were 34 to 86 years old (mean age 62.7±10.2 years), 28 (19.4%) patients were male, 116 (80.6%) - female. All patients underwent standard and additional examination: ultrasonography (Accutome B-scan plus, U.S.A.), optic biometry (Lenstar 900, Haag-Streit, Switzerland), spectral optical coherence tomography using RTVue XR Avanti scanner (Optovue, U.S.A.) in modes Enhanced HD Line, 3D Retina, 3D Widefield MCT, Cross Line, Angio Retina, and scanning laser ophthalmoscopy (SLO) using Navilas 577s system. Laser vitreolysis was performed using the Ultra Q Reflex laser (Ellex, Australia). This paper presents methods of objective quantitative and qualitative assessment of artifactual shadows of vitreous floaters with spectral optical coherence tomographic scanner RTVue xR Avanti employing an algorithm of automatic detection of non-perfusion zones in modes Angio Retina, HD Angio Retina, as well as foveal avascular zone (FAZ) measurement with Angio Analytics® software. SLO performed with Navilas 577s was used as method of visualizing floaters and artifactual shadows in retinal surface layers prior to surgical treatment and after YAG-laser vitreolysis. Suggested methods of quantitative and qualitative assessment of artifactual shadows of the floaters in retinal layers are promising and may prove to be highly relevant for clinical monitoring of patients, optimization of treatment indications and evaluating effectiveness of YAG-laser vitreolysis. Further research of laser vitreolysis effectiveness in patients with vitreous floaters is necessary.

  8. Diffusion-weighted imaging: Apparent diffusion coefficient histogram analysis for detecting pathologic complete response to chemoradiotherapy in locally advanced rectal cancer.

    PubMed

    Choi, Moon Hyung; Oh, Soon Nam; Rha, Sung Eun; Choi, Joon-Il; Lee, Sung Hak; Jang, Hong Seok; Kim, Jun-Gi; Grimm, Robert; Son, Yohan

    2016-07-01

    To investigate the usefulness of apparent diffusion coefficient (ADC) values derived from histogram analysis of the whole rectal cancer as a quantitative parameter to evaluate pathologic complete response (pCR) on preoperative magnetic resonance imaging (MRI). We enrolled a total of 86 consecutive patients who had undergone surgery for rectal cancer after neoadjuvant chemoradiotherapy (CRT) at our institution between July 2012 and November 2014. Two radiologists who were blinded to the final pathological results reviewed post-CRT MRI to evaluate tumor stage. Quantitative image analysis was performed using T2 -weighted and diffusion-weighted images independently by two radiologists using dedicated software that performed histogram analysis to assess the distribution of ADC in the whole tumor. After surgery, 16 patients were confirmed to have achieved pCR (18.6%). All parameters from pre- and post-CRT ADC histogram showed good or excellent agreement between two readers. The minimum, 10th, 25th, 50th, and 75th percentile and mean ADC from post-CRT ADC histogram were significantly higher in the pCR group than in the non-pCR group for both readers. The 25th percentile value from ADC histogram in post-CRT MRI had the best diagnostic performance for detecting pCR, with an area under the receiver operating characteristic curve of 0.796. Low percentile values derived from the ADC histogram analysis of rectal cancer on MRI after CRT showed a significant difference between pCR and non-pCR groups, demonstrating the utility of the ADC value as a quantitative and objective marker to evaluate complete pathologic response to preoperative CRT in rectal cancer. J. Magn. Reson. Imaging 2016;44:212-220. © 2015 Wiley Periodicals, Inc.

  9. ANTONIA perfusion and stroke. A software tool for the multi-purpose analysis of MR perfusion-weighted datasets and quantitative ischemic stroke assessment.

    PubMed

    Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J

    2014-01-01

    The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.

  10. Saving Educational Dollars through Quality Objectives.

    ERIC Educational Resources Information Center

    Alvir, Howard P.

    This document is a collection of working papers written to meet the specific needs of teachers who are starting to think about and write performance objectives. It emphasizes qualitative objectives as opposed to quantitative classroom goals. The author describes quality objectives as marked by their clarity, accessibility, accountability, and…

  11. Tracking-Learning-Detection.

    PubMed

    Kalal, Zdenek; Mikolajczyk, Krystian; Matas, Jiri

    2012-07-01

    This paper investigates long-term tracking of unknown objects in a video stream. The object is defined by its location and extent in a single frame. In every frame that follows, the task is to determine the object's location and extent or indicate that the object is not present. We propose a novel tracking framework (TLD) that explicitly decomposes the long-term tracking task into tracking, learning, and detection. The tracker follows the object from frame to frame. The detector localizes all appearances that have been observed so far and corrects the tracker if necessary. The learning estimates the detector's errors and updates it to avoid these errors in the future. We study how to identify the detector's errors and learn from them. We develop a novel learning method (P-N learning) which estimates the errors by a pair of "experts": (1) P-expert estimates missed detections, and (2) N-expert estimates false alarms. The learning process is modeled as a discrete dynamical system and the conditions under which the learning guarantees improvement are found. We describe our real-time implementation of the TLD framework and the P-N learning. We carry out an extensive quantitative evaluation which shows a significant improvement over state-of-the-art approaches.

  12. Assessment of calcium scoring performance in cardiac computed tomography.

    PubMed

    Ulzheimer, Stefan; Kalender, Willi A

    2003-03-01

    Electron beam tomography (EBT) has been used for cardiac diagnosis and the quantitative assessment of coronary calcium since the late 1980s. The introduction of mechanical multi-slice spiral CT (MSCT) scanners with shorter rotation times opened new possibilities of cardiac imaging with conventional CT scanners. The purpose of this work was to qualitatively and quantitatively evaluate the performance for EBT and MSCT for the task of coronary artery calcium imaging as a function of acquisition protocol, heart rate, spiral reconstruction algorithm (where applicable) and calcium scoring method. A cardiac CT semi-anthropomorphic phantom was designed and manufactured for the investigation of all relevant image quality parameters in cardiac CT. This phantom includes various test objects, some of which can be moved within the anthropomorphic phantom in a manner that mimics realistic heart motion. These tools were used to qualitatively and quantitatively demonstrate the accuracy of coronary calcium imaging using typical protocols for an electron beam (Evolution C-150XP, Imatron, South San Francisco, Calif.) and a 0.5-s four-slice spiral CT scanner (Sensation 4, Siemens, Erlangen, Germany). A special focus was put on the method of quantifying coronary calcium, and three scoring systems were evaluated (Agatston, volume, and mass scoring). Good reproducibility in coronary calcium scoring is always the result of a combination of high temporal and spatial resolution; consequently, thin-slice protocols in combination with retrospective gating on MSCT scanners yielded the best results. The Agatston score was found to be the least reproducible scoring method. The hydroxyapatite mass, being better reproducible and comparable on different scanners and being a physical quantitative measure, appears to be the method of choice for future clinical studies. The hydroxyapatite mass is highly correlated to the Agatston score. The introduced phantoms can be used to quantitatively assess the performance characteristics of, for example, different scanners, reconstruction algorithms, and quantification methods in cardiac CT. This is especially important for quantitative tasks, such as the determination of the amount of calcium in the coronary arteries, to achieve high and constant quality in this field.

  13. Automatic trajectory measurement of large numbers of crowded objects

    NASA Astrophysics Data System (ADS)

    Li, Hui; Liu, Ye; Chen, Yan Qiu

    2013-06-01

    Complex motion patterns of natural systems, such as fish schools, bird flocks, and cell groups, have attracted great attention from scientists for years. Trajectory measurement of individuals is vital for quantitative and high-throughput study of their collective behaviors. However, such data are rare mainly due to the challenges of detection and tracking of large numbers of objects with similar visual features and frequent occlusions. We present an automatic and effective framework to measure trajectories of large numbers of crowded oval-shaped objects, such as fish and cells. We first use a novel dual ellipse locator to detect the coarse position of each individual and then propose a variance minimization active contour method to obtain the optimal segmentation results. For tracking, cost matrix of assignment between consecutive frames is trainable via a random forest classifier with many spatial, texture, and shape features. The optimal trajectories are found for the whole image sequence by solving two linear assignment problems. We evaluate the proposed method on many challenging data sets.

  14. Correlation of radiologists' image quality perception with quantitative assessment parameters: just-noticeable difference vs. peak signal-to-noise ratios

    NASA Astrophysics Data System (ADS)

    Siddiqui, Khan M.; Siegel, Eliot L.; Reiner, Bruce I.; Johnson, Jeffrey P.

    2005-04-01

    The authors identify a fundamental disconnect between the ways in which industry and radiologists assess and even discuss product performance. What is needed is a quantitative methodology that can assess both subjective image quality and observer task performance. In this study, we propose and evaluate the use of a visual discrimination model (VDM) that assesses just-noticeable differences (JNDs) to serve this purpose. The study compares radiologists' subjective perceptions of image quality of computer tomography (CT) and computed radiography (CR) images with quantitative measures of peak signal-to-noise ratio (PSNR) and JNDs as measured by a VDM. The study included 4 CT and 6 CR studies with compression ratios ranging from lossless to 90:1 (total of 80 sets of images were generated [n = 1,200]). Eleven radiologists reviewed the images and rated them in terms of overall quality and readability and identified images not acceptable for interpretation. Normalized reader scores were correlated with compression, objective PSNR, and mean JND values. Results indicated a significantly higher correlation between observer performance and JND values than with PSNR methods. These results support the use of the VDM as a metric not only for the threshold discriminations for which it was calibrated, but also as a general image quality metric. This VDM is a highly promising, reproducible, and reliable adjunct or even alternative to human observer studies for research or to establish clinical guidelines for image compression, dose reductions, and evaluation of various display technologies.

  15. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    PubMed Central

    SAVAS, Selcuk; KAVRÌK, Fevzi; KUCUKYÌLMAZ, Ebru

    2016-01-01

    ABSTRACT Objective The aim of this study was to evaluate the efficacy of CPP-ACP containing fluoride varnish for remineralizing white spot lesions (WSLs) with four different quantitative methods. Material and Methods Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF pen). The data were statistically analyzed (α=0.05). Results While the LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05), the difference between the 1- and 4-week was not significant (p>0.05). With regards to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). After the 1- and 4-week treatment periods, the calcium (Ca) and phosphate (P) concentrations and Ca/P ratio were higher compared to those of the demineralization surfaces (p<0.05). Conclusion CPP-ACP containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use. PMID:27383699

  16. Quantitative and perceived visual changes of the nasolabial fold following orthodontic retraction of lip protrusion.

    PubMed

    Baek, Eui Seon; Hwang, Soonshin; Choi, Yoon Jeong; Roh, Mi Ryung; Nguyen, Tung; Kim, Kyung-Ho; Chung, Chooryung J

    2018-07-01

    The objectives of this study were to evaluate the quantitative and perceived visual changes of the nasolabial fold (NLF) after maximum retraction in adults and to determine its contributing factors. A total of 39 adult women's cone-beam computed tomography images were collected retrospectively and divided into the retraction group (age 26.9 ± 8.80) that underwent maximum retraction following 4 premolar extraction and the control group (age 24.6 ± 5.36) with minor changes of the incisors. Three-dimensional morphologic changes of hard and soft tissue including NLF were measured by pre- and posttreatment cone-beam computed tomography. In addition, perceived visual change of the NLF was monitored using the modified Global Aesthetic Improvement Scale. The influence of age, initial severity of NLF, and initial soft tissue thickness was evaluated. Anterior retraction induced significant changes of the facial soft tissue including the lips, perioral, and the NLF when compared with the controls ( P < .01). Perceived visual changes of the NLF was noted only in women younger than age 30 ( P < .05), with the odds ratio (95% confidence interval) of 2.44 (1.3461-4.4226), indicating greater possibility for improvement of NLF esthetics in young women of the retraction group when compared with the controls. Orthodontic retraction induced quantitative and perceived visual changes of the NLF. For adult women younger than age 30, the appearance of the NLF improved after maximum retraction despite the greater posterior change of the NLF.

  17. Techniques and Methods for Testing the Postural Function in Healthy and Pathological Subjects

    PubMed Central

    Paillard, Thierry; Noé, Frédéric

    2015-01-01

    The different techniques and methods employed as well as the different quantitative and qualitative variables measured in order to objectify postural control are often chosen without taking into account the population studied, the objective of the postural test, and the environmental conditions. For these reasons, the aim of this review was to present and justify the different testing techniques and methods with their different quantitative and qualitative variables to make it possible to precisely evaluate each sensory, central, and motor component of the postural function according to the experiment protocol under consideration. The main practical and technological methods and techniques used in evaluating postural control were explained and justified according to the experimental protocol defined. The main postural conditions (postural stance, visual condition, balance condition, and test duration) were also analyzed. Moreover, the mechanistic exploration of the postural function often requires implementing disturbing postural conditions by using motor disturbance (mechanical disturbance), sensory stimulation (sensory manipulation), and/or cognitive disturbance (cognitive task associated with maintaining postural balance) protocols. Each type of disturbance was tackled in order to facilitate understanding of subtle postural control mechanisms and the means to explore them. PMID:26640800

  18. Techniques and Methods for Testing the Postural Function in Healthy and Pathological Subjects.

    PubMed

    Paillard, Thierry; Noé, Frédéric

    2015-01-01

    The different techniques and methods employed as well as the different quantitative and qualitative variables measured in order to objectify postural control are often chosen without taking into account the population studied, the objective of the postural test, and the environmental conditions. For these reasons, the aim of this review was to present and justify the different testing techniques and methods with their different quantitative and qualitative variables to make it possible to precisely evaluate each sensory, central, and motor component of the postural function according to the experiment protocol under consideration. The main practical and technological methods and techniques used in evaluating postural control were explained and justified according to the experimental protocol defined. The main postural conditions (postural stance, visual condition, balance condition, and test duration) were also analyzed. Moreover, the mechanistic exploration of the postural function often requires implementing disturbing postural conditions by using motor disturbance (mechanical disturbance), sensory stimulation (sensory manipulation), and/or cognitive disturbance (cognitive task associated with maintaining postural balance) protocols. Each type of disturbance was tackled in order to facilitate understanding of subtle postural control mechanisms and the means to explore them.

  19. Physics-Based Image Segmentation Using First Order Statistical Properties and Genetic Algorithm for Inductive Thermography Imaging.

    PubMed

    Gao, Bin; Li, Xiaoqing; Woo, Wai Lok; Tian, Gui Yun

    2018-05-01

    Thermographic inspection has been widely applied to non-destructive testing and evaluation with the capabilities of rapid, contactless, and large surface area detection. Image segmentation is considered essential for identifying and sizing defects. To attain a high-level performance, specific physics-based models that describe defects generation and enable the precise extraction of target region are of crucial importance. In this paper, an effective genetic first-order statistical image segmentation algorithm is proposed for quantitative crack detection. The proposed method automatically extracts valuable spatial-temporal patterns from unsupervised feature extraction algorithm and avoids a range of issues associated with human intervention in laborious manual selection of specific thermal video frames for processing. An internal genetic functionality is built into the proposed algorithm to automatically control the segmentation threshold to render enhanced accuracy in sizing the cracks. Eddy current pulsed thermography will be implemented as a platform to demonstrate surface crack detection. Experimental tests and comparisons have been conducted to verify the efficacy of the proposed method. In addition, a global quantitative assessment index F-score has been adopted to objectively evaluate the performance of different segmentation algorithms.

  20. Shear-Wave Elastography: Basic Physics and Musculoskeletal Applications.

    PubMed

    Taljanovic, Mihra S; Gimber, Lana H; Becker, Giles W; Latt, L Daniel; Klauser, Andrea S; Melville, David M; Gao, Liang; Witte, Russell S

    2017-01-01

    In the past 2 decades, sonoelastography has been progressively used as a tool to help evaluate soft-tissue elasticity and add to information obtained with conventional gray-scale and Doppler ultrasonographic techniques. Recently introduced on clinical scanners, shear-wave elastography (SWE) is considered to be more objective, quantitative, and reproducible than compression sonoelastography with increasing applications to the musculoskeletal system. SWE uses an acoustic radiation force pulse sequence to generate shear waves, which propagate perpendicular to the ultrasound beam, causing transient displacements. The distribution of shear-wave velocities at each pixel is directly related to the shear modulus, an absolute measure of the tissue's elastic properties. Shear-wave images are automatically coregistered with standard B-mode images to provide quantitative color elastograms with anatomic specificity. Shear waves propagate faster through stiffer contracted tissue, as well as along the long axis of tendon and muscle. SWE has a promising role in determining the severity of disease and treatment follow-up of various musculoskeletal tissues including tendons, muscles, nerves, and ligaments. This article describes the basic ultrasound physics of SWE and its applications in the evaluation of various traumatic and pathologic conditions of the musculoskeletal system. © RSNA, 2017.

  1. Growing community: the impact of the Stephanie Alexander Kitchen Garden Program on the social and learning environment in primary schools.

    PubMed

    Block, Karen; Gibbs, Lisa; Staiger, Petra K; Gold, Lisa; Johnson, Britt; Macfarlane, Susie; Long, Caroline; Townsend, Mardie

    2012-08-01

    This article presents results from a mixed-method evaluation of a structured cooking and gardening program in Australian primary schools, focusing on program impacts on the social and learning environment of the school. In particular, we address the Stephanie Alexander Kitchen Garden Program objective of providing a pleasurable experience that has a positive impact on student engagement, social connections, and confidence within and beyond the school gates. Primary evidence for the research question came from qualitative data collected from students, parents, teachers, volunteers, school principals, and specialist staff through interviews, focus groups, and participant observations. This was supported by analyses of quantitative data on child quality of life, cooperative behaviors, teacher perceptions of the school environment, and school-level educational outcome and absenteeism data. Results showed that some of the program attributes valued most highly by study participants included increased student engagement and confidence, opportunities for experiential and integrated learning, teamwork, building social skills, and connections and links between schools and their communities. In this analysis, quantitative findings failed to support findings from the primary analysis. Limitations as well as benefits of a mixed-methods approach to evaluation of complex community interventions are discussed.

  2. Quantitative multi-modal NDT data analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, it is often resorted to multi-modal testing, where complementary and overlapping information from different NDT techniques are combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundantmore » information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.« less

  3. Shear-Wave Elastography: Basic Physics and Musculoskeletal Applications

    PubMed Central

    Gimber, Lana H.; Becker, Giles W.; Latt, L. Daniel; Klauser, Andrea S.; Melville, David M.; Gao, Liang; Witte, Russell S.

    2017-01-01

    In the past 2 decades, sonoelastography has been progressively used as a tool to help evaluate soft-tissue elasticity and add to information obtained with conventional gray-scale and Doppler ultrasonographic techniques. Recently introduced on clinical scanners, shear-wave elastography (SWE) is considered to be more objective, quantitative, and reproducible than compression sonoelastography with increasing applications to the musculoskeletal system. SWE uses an acoustic radiation force pulse sequence to generate shear waves, which propagate perpendicular to the ultrasound beam, causing transient displacements. The distribution of shear-wave velocities at each pixel is directly related to the shear modulus, an absolute measure of the tissue’s elastic properties. Shear-wave images are automatically coregistered with standard B-mode images to provide quantitative color elastograms with anatomic specificity. Shear waves propagate faster through stiffer contracted tissue, as well as along the long axis of tendon and muscle. SWE has a promising role in determining the severity of disease and treatment follow-up of various musculoskeletal tissues including tendons, muscles, nerves, and ligaments. This article describes the basic ultrasound physics of SWE and its applications in the evaluation of various traumatic and pathologic conditions of the musculoskeletal system. ©RSNA, 2017 PMID:28493799

  4. An importance-performance analysis of hospital information system attributes: A nurses' perspective.

    PubMed

    Cohen, Jason F; Coleman, Emma; Kangethe, Matheri J

    2016-02-01

    Health workers have numerous concerns about hospital IS (HIS) usage. Addressing these concerns requires understanding the system attributes most important to their satisfaction and productivity. Following a recent HIS implementation, our objective was to identify priorities for managerial intervention based on user evaluations of the performance of the HIS attributes as well as the relative importance of these attributes to user satisfaction and productivity outcomes. We collected data along a set of attributes representing system quality, data quality, information quality, and service quality from 154 nurse users. Their quantitative responses were analysed using the partial least squares approach followed by an importance-performance analysis. Qualitative responses were analysed using thematic analysis to triangulate and supplement the quantitative findings. Two system quality attributes (responsiveness and ease of learning), one information quality attribute (detail), one service quality attribute (sufficient support), and three data quality attributes (records complete, accurate and never missing) were identified as high priorities for intervention. Our application of importance-performance analysis is unique in HIS evaluation and we have illustrated its utility for identifying those system attributes for which underperformance is not acceptable to users and therefore should be high priorities for intervention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. Often VSA is regarded as subjective, so there is a need to verify VSA. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess accuracy of VSA, while taking into account soil type. The first objective was to test whether quantitative visual field observations, which form the basis in many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible, when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located at sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. Quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, from which half of the correlations were significant. For the reproducibility study, a group of 9 soil scientists and 7 farmers carried out quantitative visual observations all independently from each other. All observers assessed five sites, having a sand, peat or clay soil. For almost all quantitative visual observations the spread of observed values was low (coefficient of variation < 1.0), except for the number of biopores and gley mottles. Furthermore, farmers' observed mean values were significantly higher than soil scientists' mean values, for soil structure, amount of gley mottles and compaction. This study showed that VSA could be a valuable tool to assess soil quality. Subjectivity, due to the background of the observer, might influence the outcome of visual assessment of some soil properties. In countries where soil analyses can easily be carried out, VSA might be a good replenishment to available soil chemical analyses, and in countries where it is not feasible to carry out soil analyses, VSA might be a good start to assess soil quality.

  6. Interspecies physiological variation as a tool for cross-species assessments of global warming-induced endangerment: validation of an intrinsic determinant of macroecological and phylogeographic structure.

    PubMed

    Bernardo, Joseph; Ossola, Ryan J; Spotila, James; Crandall, Keith A

    2007-12-22

    Global warming is now recognized as the dominant threat to biodiversity because even protected populations and habitats are susceptible. Nonetheless, current criteria for evaluating species' relative endangerment remain purely ecological, and the accepted conservation strategies of habitat preservation and population management assume that species can mount ecological responses if afforded protection. The insidious threat from climate change is that it will attenuate or preclude ecological responses by species that are physiologically constrained; yet, quantitative, objective criteria for assessing relative susceptibility of diverse taxa to warming-induced stress are wanting. We explored the utility of using interspecies physiological variation for this purpose by relating species' physiological phenotypes to landscape patterns of ecological and genetic exchange. Using a salamander model system in which ecological, genetic and physiological diversity are well characterized, we found strong quantitative relationships of basal metabolic rates (BMRs) to both macroecological and phylogeographic patterns, with decreasing BMR leading to dispersal limitation (small contemporary ranges with marked phylogeographic structure). Measures of intrinsic physiological tolerance, which vary systematically with macroecological and phylogeographic patterns, afford objective criteria for assessing endangerment across a wide range of species and should be incorporated into conservation assessment criteria that currently rely exclusively upon ecological predictors.

  7. Evaluation and comparison of current fetal ultrasound image segmentation methods for biometric measurements: a grand challenge.

    PubMed

    Rueda, Sylvia; Fathima, Sana; Knight, Caroline L; Yaqub, Mohammad; Papageorghiou, Aris T; Rahmatullah, Bahbibi; Foi, Alessandro; Maggioni, Matteo; Pepe, Antonietta; Tohka, Jussi; Stebbing, Richard V; McManigle, John E; Ciurte, Anca; Bresson, Xavier; Cuadra, Meritxell Bach; Sun, Changming; Ponomarev, Gennady V; Gelfand, Mikhail S; Kazanov, Marat D; Wang, Ching-Wei; Chen, Hsiang-Chou; Peng, Chun-Wei; Hung, Chu-Mei; Noble, J Alison

    2014-04-01

    This paper presents the evaluation results of the methods submitted to Challenge US: Biometric Measurements from Fetal Ultrasound Images, a segmentation challenge held at the IEEE International Symposium on Biomedical Imaging 2012. The challenge was set to compare and evaluate current fetal ultrasound image segmentation methods. It consisted of automatically segmenting fetal anatomical structures to measure standard obstetric biometric parameters, from 2D fetal ultrasound images taken on fetuses at different gestational ages (21 weeks, 28 weeks, and 33 weeks) and with varying image quality to reflect data encountered in real clinical environments. Four independent sub-challenges were proposed, according to the objects of interest measured in clinical practice: abdomen, head, femur, and whole fetus. Five teams participated in the head sub-challenge and two teams in the femur sub-challenge, including one team who tackled both. Nobody attempted the abdomen and whole fetus sub-challenges. The challenge goals were two-fold and the participants were asked to submit the segmentation results as well as the measurements derived from the segmented objects. Extensive quantitative (region-based, distance-based, and Bland-Altman measurements) and qualitative evaluation was performed to compare the results from a representative selection of current methods submitted to the challenge. Several experts (three for the head sub-challenge and two for the femur sub-challenge), with different degrees of expertise, manually delineated the objects of interest to define the ground truth used within the evaluation framework. For the head sub-challenge, several groups produced results that could be potentially used in clinical settings, with comparable performance to manual delineations. The femur sub-challenge had inferior performance to the head sub-challenge due to the fact that it is a harder segmentation problem and that the techniques presented relied more on the femur's appearance.

  8. A Quantitative Technique for Beginning Microscopists.

    ERIC Educational Resources Information Center

    Sundberg, Marshall D.

    1984-01-01

    Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)

  9. Rational drug therapy education in clinical phase carried out by task-based learning

    PubMed Central

    Bilge, S. Sırrı; Akyüz, Bahar; Ağrı, Arzu Erdal; Özlem, Mıdık

    2017-01-01

    Objectives: Irrational drug use results in drug interactions, treatment noncompliance, and drug resistance. Rational pharmacotherapy education is being implemented in many faculties of medicine. Our aim is to introduce rational pharmacotherapy education by clinicians and to evaluate task-based rational drug therapy education in the clinical context. Methods: The Kirkpatrick's evaluation model was used for the evaluation of the program. The participants evaluated the program in terms of constituents of the program, utilization, and contribution to learning. Voluntary participants responded to the evaluation forms after the educational program. Data are evaluated using both quantitative and qualitative tools. SPSS (version 21) used for quantitative data for determining mean and standard deviation values. Descriptive qualitative analysis approach is used for the analysis of open-ended questions. Results: It was revealed that the program and its components have been favorable. A total 95.9% of the students consider the education to be beneficial. Simulated patients practice and personal drug choice/problem-based learning sessions were appreciated by the students in particular. 93.9% of the students stated that all students of medicine should undergo this educational program. Among the five presentations contained in the program, “The Principles of Prescribing” received the highest points (9 ± 1.00) from participating students in general evaluation of the educational program. Conclusion: This study was carried out to improve task-based rational drug therapy education. According to feedback from the students concerning content, method, resource, assessment, and program design; some important changes, especially in number of facilitators and indications, are made in rational pharmacotherapy education in clinical task-based learning program. PMID:28458432

  10. Factors affecting adoption, implementation fidelity, and sustainability of the Redesigned Community Health Fund in Tanzania: a mixed methods protocol for process evaluation in the Dodoma region

    PubMed Central

    Kalolo, Albino; Radermacher, Ralf; Stoermer, Manfred; Meshack, Menoris; De Allegri, Manuela

    2015-01-01

    Background Despite the implementation of various initiatives to address low enrollment in voluntary micro health insurance (MHI) schemes in sub-Saharan Africa, the problem of low enrollment remains unresolved. The lack of process evaluations of such interventions makes it difficult to ascertain whether their poor results are because of design failures or implementation weaknesses. Objective In this paper, we describe a process evaluation protocol aimed at opening the ‘black box’ to evaluate the implementation processes of the Redesigned Community Health Fund (CHF) program in the Dodoma region of Tanzania. Design The study employs a cross-sectional mixed methods design and is being carried out 3 years after the launch of the Redesigned CHF program. The study is grounded in a conceptual framework which rests on the Diffusion of Innovation Theory and the Implementation Fidelity Framework. The study utilizes a mixture of quantitative and qualitative data collection tools (questionnaires, focus group discussions, in-depth interviews, and document review), and aligns the evaluation to the Theory of Intervention developed by our team. Quantitative data will be used to measure program adoption, implementation fidelity, and their moderating factors. Qualitative data will be used to explore the responses of stakeholders to the intervention, contextual factors, and moderators of adoption, implementation fidelity, and sustainability. Discussion This protocol describes a systematic process evaluation in relation to the implementation of a reformed MHI. We trust that the theoretical approaches and methodologies described in our protocol may be useful to inform the design of future process evaluations focused on the assessment of complex interventions, such as MHI schemes. PMID:26679408

  11. Implementation and Evaluation of a Smartphone-Based Telemonitoring Program for Patients With Heart Failure: Mixed-Methods Study Protocol

    PubMed Central

    Ross, Heather J; Cafazzo, Joseph A; Laporte, Audrey; Seto, Emily

    2018-01-01

    Background Meta-analyses of telemonitoring for patients with heart failure conclude that it can lower the utilization of health services and improve health outcomes compared with the standard of care. A smartphone-based telemonitoring program is being implemented as part of the standard of care at a specialty care clinic for patients with heart failure in Toronto, Canada. Objective The objectives of this study are to (1) evaluate the impact of the telemonitoring program on health service utilization, patient health outcomes, and their ability to self-care; (2) identify the contextual barriers and facilitators of implementation at the physician, clinic, and institutional level; (3) describe patient usage patterns to determine adherence and other behaviors in the telemonitoring program; and (4) evaluate the costs associated with implementation of the telemonitoring program from the perspective of the health care system (ie, public payer), hospital, and patient. Methods The evaluation will use a mixed-methods approach. The quantitative component will include a pragmatic pre- and posttest study design for the impact and cost analyses, which will make use of clinical data and questionnaires administered to at least 108 patients at baseline and 6 months. Furthermore, outcome data will be collected at 1, 12, and 24 months to explore the longitudinal impact of the program. In addition, quantitative data related to implementation outcomes and patient usage patterns of the telemonitoring system will be reported. The qualitative component involves an embedded single case study design to identify the contextual factors that influenced the implementation. The implementation evaluation will be completed using semistructured interviews with clinicians, and other program staff at baseline, 4 months, and 12 months after the program start date. Interviews conducted with patients will be triangulated with usage data to explain usage patterns and adherence to the system. Results The telemonitoring program was launched in August 2016 and patient enrollment is ongoing. Conclusions The methods described provide an example for conducting comprehensive evaluations of telemonitoring programs. The combination of impact, implementation, and cost evaluations will inform the quality improvement of the existing program and will yield insights into the sustainability of smartphone-based telemonitoring programs for patients with heart failure within a specialty care setting. PMID:29724704

  12. Dual-channel in-line digital holographic double random phase encryption

    PubMed Central

    Das, Bhargab; Yelleswarapu, Chandra S; Rao, D V G L N

    2012-01-01

    We present a robust encryption method for the encoding of 2D/3D objects using digital holography and virtual optics. Using our recently developed dual-plane in-line digital holography technique, two in-line digital holograms are recorded at two different planes and are encrypted using two different double random phase encryption configurations, independently. The process of using two mutually exclusive encryption channels makes the system more robust against attacks since both the channels should be decrypted accurately in order to get a recognizable reconstruction. Results show that the reconstructed object is unrecognizable even when the portion of the correct phase keys used during decryption is close to 75%. The system is verified against blind decryptions by evaluating the SNR and MSE. Validation of the proposed method and sensitivities of the associated parameters are quantitatively analyzed and illustrated. PMID:23471012

  13. Incremental Structured Dictionary Learning for Video Sensor-Based Object Tracking

    PubMed Central

    Xue, Ming; Yang, Hua; Zheng, Shibao; Zhou, Yi; Yu, Zhenghua

    2014-01-01

    To tackle robust object tracking for video sensor-based applications, an online discriminative algorithm based on incremental discriminative structured dictionary learning (IDSDL-VT) is presented. In our framework, a discriminative dictionary combining both positive, negative and trivial patches is designed to sparsely represent the overlapped target patches. Then, a local update (LU) strategy is proposed for sparse coefficient learning. To formulate the training and classification process, a multiple linear classifier group based on a K-combined voting (KCV) function is proposed. As the dictionary evolves, the models are also trained to timely adapt the target appearance variation. Qualitative and quantitative evaluations on challenging image sequences compared with state-of-the-art algorithms demonstrate that the proposed tracking algorithm achieves a more favorable performance. We also illustrate its relay application in visual sensor networks. PMID:24549252

  14. [Simultaneous quantitative analysis of five alkaloids in Sophora flavescens by multi-components assay by single marker].

    PubMed

    Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang

    2013-05-01

    To establish a new method for quality evaluation and validate its feasibilities by simultaneous quantitative assay of five alkaloids in Sophora flavescens. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids, oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine, were selected as analytes to evaluate the quality of rhizome of S. flavescens, and the relative correction factor has good repeatibility. Their contents in 21 batches of samples, collected from different areas, were determined by both external standard method and QAMS. The method was evaluated by comparison of the quantitative results between external standard method and QAMS. No significant differences were found in the quantitative results of five alkaloids in 21 batches of S. flavescens determined by external standard method and QAMS. It is feasible and suitable to evaluate the quality of rhizome of S. flavescens by QAMS.

  15. The systematic development of ROsafe: an intervention to promote STI testing among vocational school students.

    PubMed

    Wolfers, Mireille; de Zwart, Onno; Kok, Gerjo

    2012-05-01

    This article describes the development of ROsafe, an intervention to promote sexually transmitted infection (STI) testing at vocational schools in the Netherlands. Using the planning model of intervention mapping (IM), an educational intervention was designed that consisted of two lessons, an Internet site, and sexual health services at the school sites. IM is a stepwise approach for theory- and evidence-based development and implementation of interventions. It includes six steps: needs assessment, specification of the objectives in matrices, selection of theoretical methods and practical strategies, program design, implementation planning, and evaluation. The processes and outcomes that are performed during Steps 1 to 4 of IM are presented, that is, literature review and qualitative and quantitative research in needs assessment, leading to the definition of the desired behavioral outcomes and objectives. The matrix of change objectives for STI-testing behavior is presented, and then the development of theory into program is described, using examples from the program. Finally, the planning for implementation and evaluation is discussed. The educational intervention used methods that were derived from the social cognitive theory, the elaboration likelihood model, the persuasive communication matrix, and theories about risk communication. Strategies included short movies, discussion, knowledge quiz, and an interactive behavioral self-test through the Internet.

  16. The Impact of the Condenser on Cytogenetic Image Quality in Digital Microscope System

    PubMed Central

    Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong

    2013-01-01

    Background: Optimizing the operational parameters of a digital microscope system is an important technique for acquiring high-quality cytogenetic images and facilitating the karyotyping process, so that the efficiency and accuracy of diagnosis can be improved. Objective: This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Methods: Both theoretical analysis and experimental validation were conducted, by objectively evaluating a resolution test chart and subjectively observing large numbers of specimens. Results: The results show that optimal image quality and a large depth of field (DOF) are obtained simultaneously when the numerical aperture of the condenser is set to 60%–70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and more diagnostic information are obtained. As a result, the system shows higher working stability and fewer restrictions on the implementation of algorithms such as autofocusing, especially when the system is designed for high-throughput continuous image scanning. Conclusions: Although the above quantitative results were obtained with a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging with high-throughput continuous scanning microscopes in clinical practice. PMID:23676284
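
    The reported trade-off can be made concrete with a rough calculation: stopping the condenser down below the objective's numerical aperture costs some lateral resolution but increases depth of field. The sketch below uses standard textbook approximations (the Rayleigh criterion with the condenser contributing to the illumination cone, and a crude effective-NA estimate for the DOF); it is not the paper's analysis, and the objective NA and wavelength are assumed examples.

        def lateral_resolution_um(wavelength_um, na_objective, na_condenser):
            # Rayleigh criterion with the condenser contributing to the illumination cone
            return 1.22 * wavelength_um / (na_objective + na_condenser)

        def depth_of_field_um(wavelength_um, effective_na, refractive_index=1.0):
            # Classical diffraction-limited DOF estimate
            return refractive_index * wavelength_um / effective_na**2

        na_obj = 0.95                  # assumed dry objective, e.g., 40x/0.95
        wavelength = 0.55              # green light, in micrometers
        for ratio in (1.0, 0.7, 0.6):  # condenser NA as a fraction of the objective NA
            na_cond = ratio * na_obj
            na_eff = 0.5 * (na_obj + na_cond)  # crude effective NA with a stopped-down condenser
            print(f"condenser at {ratio:.0%} of objective NA: "
                  f"resolution ~{lateral_resolution_um(wavelength, na_obj, na_cond):.2f} um, "
                  f"DOF ~{depth_of_field_um(wavelength, na_eff):.2f} um")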

  17. Participatory modeling and structured decision making

    USGS Publications Warehouse

    Robinson, Kelly F.; Fuller, Angela K.

    2016-01-01

    Structured decision making (SDM) provides a framework for making sound decisions even when faced with uncertainty, and is a transparent, defensible, and replicable method used to understand complex problems. A hallmark of SDM is the explicit incorporation of values and science, which often includes participation from multiple stakeholders, helping to garner trust and ultimately resulting in a decision that is more likely to be implemented. The core steps in the SDM process are used to structure thinking about natural resources management choices, and include: (1) properly defining the problem and the decision context, (2) determining the objectives that describe the aspirations of the decision maker, (3) devising management actions or alternatives that can achieve those objectives, (4) evaluating the outcomes or consequences of each alternative on each of the objectives, (5) evaluating trade-offs, and (6) implementing the decision. Participatory modeling for SDM includes engaging stakeholders in some or all of the steps of the SDM process listed above. In addition, participatory modeling is often crucial for creating qualitative and quantitative models of how the system works, providing data for these models, and eliciting expert opinion when data are unavailable. In these ways, SDM provides a framework for decision making in natural resources management that includes participation from stakeholder groups throughout the process, including the modeling phase.
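
    The trade-off step (step 5) is often supported by a consequence table combined with a simple weighted-sum value model, as in the toy sketch below. The objectives, weights, alternatives, and scores are illustrative assumptions, and real SDM applications may use other value models.

        # Consequence table: predicted outcome of each alternative on each objective,
        # already rescaled so that 0 = worst plausible outcome and 1 = best (hypothetical values).
        weights = {"habitat quality": 0.5, "cost": 0.3, "public support": 0.2}
        consequences = {
            "do nothing":          {"habitat quality": 0.2, "cost": 1.0, "public support": 0.4},
            "habitat restoration": {"habitat quality": 0.9, "cost": 0.3, "public support": 0.7},
            "harvest regulation":  {"habitat quality": 0.6, "cost": 0.8, "public support": 0.5},
        }

        def weighted_score(outcomes, weights):
            return sum(weights[objective] * outcomes[objective] for objective in weights)

        for alternative, outcomes in consequences.items():
            print(f"{alternative}: {weighted_score(outcomes, weights):.2f}")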

  18. VASSAR: Value assessment of system architectures using rules

    NASA Astrophysics Data System (ADS)

    Selva, D.; Crawley, E. F.

    A key step of the mission development process is the selection of a system architecture, i.e., the layout of the major high-level system design decisions. This step typically involves the identification of a set of candidate architectures and a cost-benefit analysis to compare them. Computational tools have been used in the past to bring rigor and consistency into this process. These tools can automatically generate architectures by enumerating different combinations of decisions and options. They can also evaluate these architectures by applying cost models and simplified performance models. Current performance models are purely quantitative tools that are best suited for evaluating the technical performance of a mission design. However, assessing the relative merit of a system architecture is a much more holistic task than evaluating the performance of a mission design. Indeed, the merit of a system architecture comes from satisfying a variety of stakeholder needs, some of which are easy to quantify, and some of which are harder to quantify (e.g., elegance, scientific value, political robustness, flexibility). Moreover, assessing the merit of a system architecture at these very early stages of design often requires dealing with a mix of quantitative and semi-qualitative data, and of objective and subjective information. Current computational tools are poorly suited for these purposes. In this paper, we propose a general methodology that can be used to assess the relative merit of several candidate system architectures in the presence of objective, subjective, quantitative, and qualitative stakeholder needs. The methodology is called VASSAR (Value ASsessment of System Architectures using Rules). The major underlying assumption of the VASSAR methodology is that the merit of a system architecture can be assessed by comparing the capabilities of the architecture with the stakeholder requirements. Hence, for example, a candidate architecture that fully satisfies all critical stakeholder requirements is a good architecture. The assessment process is thus fundamentally seen as a pattern matching process in which capabilities match requirements, which motivates the use of rule-based expert systems (RBES). This paper describes the VASSAR methodology and shows how it can be applied to a large complex space system, namely an Earth observation satellite system. Companion papers show its applicability to the NASA space communications and navigation program and the joint NOAA-DoD NPOESS program.
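
    The capability-to-requirement matching idea can be sketched with a few toy rules: each stakeholder requirement fires when an architecture's capabilities satisfy it and contributes a weighted score. The rule format, weights, instruments, and cost threshold below are illustrative assumptions, not the actual VASSAR rule base.

        # Each "rule" pairs a stakeholder requirement with a weight and a predicate
        # over an architecture's capabilities (all values are hypothetical).
        requirements = [
            ("global soil moisture mapping", 0.40,
             lambda cap: "L-band radiometer" in cap["instruments"] and cap["revisit_days"] <= 3),
            ("precipitation monitoring", 0.35,
             lambda cap: "Ku-band radar" in cap["instruments"]),
            ("affordable lifecycle cost", 0.25,
             lambda cap: cap["cost_musd"] <= 900),
        ]

        def architecture_merit(capabilities):
            """Weighted fraction of stakeholder requirements satisfied."""
            return sum(weight for _, weight, rule in requirements if rule(capabilities))

        arch_a = {"instruments": {"L-band radiometer", "Ku-band radar"},
                  "revisit_days": 2, "cost_musd": 950}
        arch_b = {"instruments": {"L-band radiometer"},
                  "revisit_days": 3, "cost_musd": 800}
        print("A:", architecture_merit(arch_a), " B:", architecture_merit(arch_b))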

  19. Application of Nemerow Index Method and Integrated Water Quality Index Method in Water Quality Assessment of Zhangze Reservoir

    NASA Astrophysics Data System (ADS)

    Zhang, Qian; Feng, Minquan; Hao, Xiaoyan

    2018-03-01

    [Objective] Based on historical water quality data from the Zhangze Reservoir for the last five years, the water quality was assessed with the integrated water quality identification index method and the Nemerow pollution index method; the results of the different evaluation methods were analyzed and compared, and the characteristics of each method were identified. [Methods] The suitability of the water quality assessment methods was compared and analyzed on the basis of these results. [Results] The water quality tended to decrease over time, with 2016 being the year with the worst water quality; the sections with the worst water quality were the southern and northern sections. [Conclusion] The results produced by the traditional Nemerow index method fluctuated greatly across the water quality monitoring sections and therefore could not effectively reveal the trend in water quality at each section. The combination of qualitative and quantitative measures in the comprehensive pollution index identification method meant that it could evaluate the degree of water pollution as well as determine that the river water was black and odorous; however, its evaluation results indicated relatively low water pollution. The results from the improved Nemerow index evaluation were better, as the single indicators and the evaluation results are in strong agreement; the method is therefore able to objectively reflect the water quality of each monitoring section and is more suitable for the water quality evaluation of the reservoir.
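
    The classical Nemerow pollution index referenced above combines single-factor indices P_i = C_i/S_i through their mean and their maximum, so one badly exceeded standard dominates the result. The sketch below uses hypothetical concentrations and standards, not Zhangze Reservoir data.

        import math

        def nemerow_index(concentrations, standards):
            single = [c / s for c, s in zip(concentrations, standards)]  # P_i = C_i / S_i
            p_avg = sum(single) / len(single)
            p_max = max(single)
            return math.sqrt((p_avg**2 + p_max**2) / 2.0)

        # Hypothetical measurements (mg/L) and standard limits for three indicators
        measured  = [1.2, 0.08, 5.5]   # e.g., NH3-N, TP, COD(Mn)
        standards = [1.0, 0.05, 6.0]
        print(f"Nemerow index: {nemerow_index(measured, standards):.2f}")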

  20. Accuracy of lung nodule density on HRCT: analysis by PSF-based image simulation.

    PubMed

    Ohno, Ken; Ohkubo, Masaki; Marasinghe, Janaka C; Murao, Kohei; Matsumoto, Toru; Wada, Shinichi

    2012-11-08

    A computed tomography (CT) image simulation technique based on the point spread function (PSF) was applied to analyze the accuracy of CT-based clinical evaluations of lung nodule density. The PSF of the CT system was measured and used to perform the lung nodule image simulation. The simulated image was then resampled at intervals equal to the pixel size and the slice interval found in clinical high-resolution CT (HRCT) images. On those images, the nodule density was measured by placing a region of interest (ROI) as commonly used in routine clinical practice, and the measured value was compared with the true value (the known density of the object function used in the image simulation). It was quantitatively determined that the measured nodule density depended on the nodule diameter and on the image reconstruction parameters (kernel and slice thickness). In addition, the measured density fluctuated depending on the offset between the nodule center and the image voxel center. This fluctuation was reduced by decreasing the slice interval (i.e., with the use of overlapping reconstruction), leading to a stable density evaluation. Our proposed method of PSF-based image simulation with resampling enables a quantitative analysis of the accuracy of CT-based evaluations of lung nodule density. These results could potentially reveal clinical misreadings in diagnosis and lead to more accurate and precise density evaluations. They would also be of value for determining the optimum scan and reconstruction parameters, such as image reconstruction kernels and slice thicknesses/intervals.
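
    A toy one-dimensional analogue of the PSF-based simulation illustrates the reported effect: a uniform nodule profile is blurred by a PSF, resampled at a clinical voxel spacing, and the ROI mean falls below the true density for small nodules. The Gaussian PSF, nodule size, and spacings are illustrative assumptions, not the measured PSF or scan protocol of the study.

        import numpy as np
        from scipy.ndimage import gaussian_filter1d

        true_density = 100.0             # nodule density in HU-like units (assumed)
        background = -800.0              # surrounding lung parenchyma (assumed)
        fine_dx = 0.05                   # mm, fine grid of the simulation
        x = np.arange(-20.0, 20.0, fine_dx)

        nodule_diameter = 4.0            # mm
        profile = np.where(np.abs(x) <= nodule_diameter / 2, true_density, background)

        psf_sigma_mm = 0.6               # Gaussian standing in for the measured system PSF
        blurred = gaussian_filter1d(profile, sigma=psf_sigma_mm / fine_dx)

        voxel_mm = 0.6                   # clinical sampling interval
        step = int(round(voxel_mm / fine_dx))
        samples, x_s = blurred[::step], x[::step]

        roi = samples[np.abs(x_s) <= 1.0]  # small central ROI, as in routine practice
        print(f"true density {true_density:.0f}, measured ROI mean {roi.mean():.1f}")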
