Science.gov

Sample records for image analysis criteria

  1. Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD): Development of Image Analysis Criteria and Examiner Reliability for Image Analysis

    PubMed Central

    Ahmad, Mansur; Hollender, Lars; Odont; Anderson, Quentin; Kartha, Krishnan; Ohrbach, Richard K.; Truelove, Edmond L.; John, Mike T.; Schiffman, Eric L.

    2011-01-01

    Introduction As a part of a multi-site RDC/TMD Validation Project, comprehensive TMJ diagnostic criteria were developed for image analysis using panoramic radiography, magnetic resonance imaging (MRI), and computed tomography (CT). Methods Inter-examiner reliability was estimated using the kappa (k) statistic, and agreement between rater pairs was characterized by overall, positive, and negative percent agreement. CT was the reference standard for assessing validity of other imaging modalities for detecting osteoarthritis (OA). Results For the radiological diagnosis of OA, reliability of the three examiners was poor for panoramic radiography (k = 0.16), fair for MRI (k = 0.46), and close to the threshold for excellent for CT (k = 0.71). Using MRI, reliability was excellent for diagnosing disc displacements (DD) with reduction (k = 0.78) and for DD without reduction (k = 0.94), and was good for effusion (k = 0.64). Overall percent agreement for pair-wise ratings was ≥ 82% for all conditions. Positive percent agreement for diagnosing OA was 19% for panoramic radiography, 59% for MRI, and 84% for CT. Using MRI, positive percent agreement for diagnoses of any DD was 95% and for effusion was 81%. Negative percent agreement was ≥ 88% for all conditions. Compared to CT, panoramic radiography and MRI had poor and marginal sensitivity, respectively, but excellent specificity, in detecting OA. Conclusion Comprehensive image analysis criteria for the RDC/TMD Validation Project were developed, which can reliably be employed for assessing OA using CT, and for disc position and effusion using MRI. PMID:19464658
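
    The kappa and percent-agreement statistics quoted above can be reproduced from paired examiner ratings. The following Python sketch is illustrative only (it is not the Validation Project's analysis code, and the example ratings are invented); it computes Cohen's kappa and the overall, positive, and negative percent agreement for one examiner pair.

      import numpy as np

      def pairwise_agreement(a, b):
          # a, b: binary ratings from two examiners (1 = condition present)
          a, b = np.asarray(a), np.asarray(b)
          n = len(a)
          pp = np.sum((a == 1) & (b == 1))      # both positive
          nn = np.sum((a == 0) & (b == 0))      # both negative
          po = (pp + nn) / n                    # observed (overall) agreement
          pe = np.mean(a) * np.mean(b) + (1 - np.mean(a)) * (1 - np.mean(b))  # chance agreement
          kappa = (po - pe) / (1 - pe)
          ppa = 2 * pp / (np.sum(a == 1) + np.sum(b == 1))   # positive percent agreement
          npa = 2 * nn / (np.sum(a == 0) + np.sum(b == 0))   # negative percent agreement
          return kappa, po, ppa, npa

      # Example: two raters scoring 10 joints for osteoarthritis (invented data)
      print(pairwise_agreement([1, 0, 1, 1, 0, 0, 1, 0, 0, 1],
                               [1, 0, 0, 1, 0, 0, 1, 0, 1, 1]))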

  2. Difference image analysis: automatic kernel design using information criteria

    NASA Astrophysics Data System (ADS)

    Bramich, D. M.; Horne, Keith; Alsubai, K. A.; Bachelet, E.; Mislis, D.; Parley, N.

    2016-03-01

    We present a selection of methods for automatically constructing an optimal kernel model for difference image analysis which require very few external parameters to control the kernel design. Each method consists of two components; namely, a kernel design algorithm to generate a set of candidate kernel models, and a model selection criterion to select the simplest kernel model from the candidate models that provides a sufficiently good fit to the target image. We restricted our attention to the case of solving for a spatially invariant convolution kernel composed of delta basis functions, and we considered 19 different kernel solution methods including six employing kernel regularization. We tested these kernel solution methods by performing a comprehensive set of image simulations and investigating how their performance in terms of model error, fit quality, and photometric accuracy depends on the properties of the reference and target images. We find that the irregular kernel design algorithm employing unregularized delta basis functions, combined with either the Akaike or Takeuchi information criterion, is the best kernel solution method in terms of photometric accuracy. Our results are validated by tests performed on two independent sets of real data. Finally, we provide some important recommendations for software implementations of difference image analysis.
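
    The model-selection step described above can be illustrated with a toy Akaike information criterion (AIC) comparison: assuming Gaussian errors, -2 ln L equals the chi-squared fit statistic up to a constant, so AIC = chi^2 + 2k for k kernel parameters. The candidate kernels and numbers below are hypothetical, and the kernel fitting itself is omitted.

      import numpy as np

      def aic(chi2, k):
          # For Gaussian errors, -2 ln L = chi^2 up to a constant, so AIC = chi^2 + 2k
          return chi2 + 2 * k

      def select_kernel_model(candidates):
          # candidates: list of (name, chi2, number_of_kernel_parameters)
          scored = [(aic(chi2, k), name) for name, chi2, k in candidates]
          return min(scored)    # smallest AIC wins

      # Hypothetical candidate kernel models of increasing size
      candidates = [("3x3 delta basis", 1250.0, 9),
                    ("5x5 delta basis", 1100.0, 25),
                    ("7x7 delta basis", 1095.0, 49)]
      print(select_kernel_model(candidates))   # -> the 5x5 model; extra parameters not justified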

  3. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    NASA Astrophysics Data System (ADS)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

    Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Because of spatial non-uniformity, different locations in an image differ in importance for perception of the image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or by objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF on the information that the human brain can ideally gain from the reference image, and FSIM uses low-level features to assign different importance to each location in the image. Still, none of these objective metrics incorporates an analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper, the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed to reconstruct the ROI at fine quality while the rest of the image is reconstructed at low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and by subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and that new criteria are needed.
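
    As a point of reference for the metrics named above, SSIM in its simplest single-window (global) form can be written down in a few lines; this is a simplified sketch of the published formula, not the windowed SSIM or MS-SSIM implementations evaluated in the paper, and the test images are synthetic.

      import numpy as np

      def global_ssim(x, y, data_range=255.0):
          # Single-window SSIM between two grayscale images of equal shape
          x = x.astype(np.float64)
          y = y.astype(np.float64)
          c1 = (0.01 * data_range) ** 2
          c2 = (0.03 * data_range) ** 2
          mx, my = x.mean(), y.mean()
          vx, vy = x.var(), y.var()
          cov = ((x - mx) * (y - my)).mean()
          return ((2 * mx * my + c1) * (2 * cov + c2)) / ((mx**2 + my**2 + c1) * (vx + vy + c2))

      rng = np.random.default_rng(0)
      ref = rng.integers(0, 256, (64, 64))
      noisy = np.clip(ref + rng.normal(0, 10, (64, 64)), 0, 255)
      print(global_ssim(ref, ref), global_ssim(ref, noisy))   # 1.0 and a value below 1.0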

  4. Terahertz Wide-Angle Imaging and Analysis on Plane-wave Criteria Based on Inverse Synthetic Aperture Techniques

    NASA Astrophysics Data System (ADS)

    Gao, Jing Kun; Qin, Yu Liang; Deng, Bin; Wang, Hong Qiang; Li, Jin; Li, Xiang

    2016-04-01

    This paper presents two parts of work on terahertz imaging applications. The first part aims at solving the problems that arise as the rotation angle increases. To compensate for the nonlinearity of terahertz radar systems, a calibration signal acquired from a bright target is always used. Generally, this compensation inserts an extra linear phase term into the intermediate frequency (IF) echo signal, which is undesirable in large-rotation-angle imaging applications. We carried out a detailed theoretical analysis of this problem, and a minimum entropy criterion was employed to estimate and compensate for the linear-phase errors. In the second part, the effects of spherical waves on terahertz inverse synthetic aperture imaging are analyzed. Analytic criteria for the plane-wave approximation were derived for different rotation angles. Experimental results for corner reflectors and an aircraft model, obtained with a 330-GHz linear frequency-modulated continuous wave (LFMCW) radar system, validated the necessity and effectiveness of the proposed compensation. A comparison of the experimental images obtained under the plane-wave assumption and with spherical-wave correction also showed high consistency with the analytic criteria we derived.
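
    The minimum-entropy estimation step can be illustrated with a generic autofocus sketch (not the authors' exact estimator): a trial linear phase ramp is removed from the echo data, the entropy of the resulting image is computed, and the ramp that minimizes entropy is retained. The toy data below are invented.

      import numpy as np

      def image_entropy(img):
          p = np.abs(img) ** 2
          p = p / p.sum()
          p = p[p > 0]
          return -np.sum(p * np.log(p))

      def min_entropy_linear_phase(echo, slopes):
          # echo: 2-D complex array (pulses x samples); search a linear phase ramp along pulses
          n = echo.shape[0]
          t = np.arange(n)[:, None]
          return min(slopes, key=lambda a: image_entropy(np.fft.fft(echo * np.exp(-1j * a * t), axis=0)))

      # Toy data: a single scatterer corrupted by a linear phase error of 0.05 rad/pulse
      n = 128
      t = np.arange(n)
      echo = np.exp(1j * 0.05 * t)[:, None] * np.ones((n, 8))
      print(min_entropy_linear_phase(echo, np.linspace(0, 0.1, 101)))   # recovers ~0.05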

  5. Epilepsy Imaging Study Guideline Criteria

    PubMed Central

    Gaillard, William D; Cross, J Helen; Duncan, John S; Stefan, Hermann; Theodore, William H

    2011-01-01

    Recognition of limited economic resources, as well as potential adverse effects of ‘over testing,’ has increased interest in ‘evidence-based’ assessment of new medical technology. This creates a particular problem for evaluation and treatment of epilepsy, increasingly dependent on advanced imaging and electrophysiology, since there is a marked paucity of epilepsy diagnostic and prognostic studies that meet rigorous standards for evidence classification. The lack of high quality data reflects fundamental weaknesses in many imaging studies but also limitations in the assumptions underlying evidence classification schemes as they relate to epilepsy, and to the practicalities of conducting adequately powered studies of rapidly evolving technologies. We review the limitations of current guidelines and propose elements for imaging studies that can contribute meaningfully to the epilepsy literature. PMID:21740417

  6. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

    Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best performing pair of quantitative features was chosen based on its ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved average sensitivity and specificity of 87% and 61%, respectively. The best performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.
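
    The sensitivity and specificity figures above follow from a confusion table against the histopathology gold standard. The snippet below is a generic illustration (the features, weights, threshold, and labels are invented, not the study's classifier) of thresholding a two-feature linear score and computing both statistics.

      import numpy as np

      def sens_spec(pred, truth):
          # pred, truth: binary arrays, 1 = neoplastic
          pred, truth = np.asarray(pred), np.asarray(truth)
          tp = np.sum((pred == 1) & (truth == 1))
          tn = np.sum((pred == 0) & (truth == 0))
          return tp / np.sum(truth == 1), tn / np.sum(truth == 0)

      # Hypothetical linear score on two textural features f1, f2 with an illustrative threshold
      f1 = np.array([0.2, 0.8, 0.7, 0.1, 0.9, 0.3])
      f2 = np.array([0.1, 0.6, 0.8, 0.2, 0.7, 0.2])
      truth = np.array([0, 1, 1, 0, 1, 0])
      pred = ((0.5 * f1 + 0.5 * f2) > 0.45).astype(int)
      print(sens_spec(pred, truth))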

  7. Criteria for phonological process analysis.

    PubMed

    McReynolds, L V; Elbert, M

    1981-05-01

    Investigators have proposed that children with functional articulation disorders should be relabelled phonologically disordered. To support this proposal, evidence has been presented in the literature demonstrating that children's error patterns reflect the operation of phonological processes. No quantitative or qualitative criteria have been offered to differentiate these processes from surface error patterns. The purpose of the present descriptive study was to determine if differences would be found when two kinds of process analyses were employed: a nonquantitative criteria analysis as conducted in the studies reported in the literature, and a quantitative criteria analysis. Speech samples were obtained from 13 children with functional articulation problems. Their errors were submitted to the two analysis procedures. Results indicated that the number of identified processes was reduced when minimum quantitative criteria were used, compared with the number identified when no quantitative criteria were imposed. The decrease occurred in individual children's patterns as well as across the patterns of the 13 children. It is suggested that there is a need to establish reasonable quantitative and qualitative criteria for phonological process identification.

  8. Accident analysis and DOE criteria

    SciTech Connect

    Graf, J.M.; Elder, J.C.

    1982-01-01

    In analyzing the radiological consequences of major accidents at DOE facilities, one finds that many facilities fall so far below the limits of DOE Order 6430 that compliance is easily demonstrated by simple analysis. For those cases where the amount of radioactive material and the dispersive energy available are enough for accident consequences to approach the limits, the models and assumptions used become critical. In some cases the models themselves are the difference between meeting the criteria and not meeting them. Further, in one case, we found that not only did the selection of models determine compliance, but the selection of applicable criteria from different chapters of Order 6430 also made the difference. DOE has recognized the problem of different criteria in different chapters applying to one facility, and has proceeded to make changes for the sake of consistency. We have proposed to outline the specific steps needed in an accident analysis and suggest appropriate models, parameters, and assumptions. As a result, we feel DOE siting and design criteria will be more fairly and consistently applied.

  9. Assessing the performance of four different categories of histological criteria in brain tumours grading by means of a computer-aided diagnosis image analysis system.

    PubMed

    Kostopoulos, S; Konstandinou, C; Sidiropoulos, K; Ravazoula, P; Kalatzis, I; Asvestas, P; Cavouras, D; Glotsos, D

    2015-10-01

    Brain tumours are considered one of the most lethal and difficult to treat forms of cancer, with unknown aetiology and lack of any realistic screening. In this study, we examine whether the combination of descriptive criteria, used by expert histopathologists in assessing histologic tissue samples, and quantitative image analysis features may improve the diagnostic accuracy of brain tumour grading. Data comprised 61 cases of brain cancers (astrocytomas, oligodendrogliomas, meningiomas) collected from the archives of the University Hospital of Patras, Greece. Incorporating the physician's descriptive criteria and quantitative image analysis features into a discriminant function, a computer-aided diagnosis system was designed for discriminating low-grade from high-grade brain tumours. The physician's descriptive features, when used alone in the system, achieved high discrimination accuracy (93.4%). When verbal descriptive features were combined with quantitative image analysis features in the system, discrimination accuracy improved to 98.4%. The generalization of the proposed system to unseen data converged to an overall prediction accuracy of 86.7% ± 5.4%. Considering that histological grading affects treatment selection and that diagnostic errors may be notable in clinical practice, the utilization of the proposed system may safeguard against diagnostic misinterpretations in everyday clinical practice. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.

  10. Users' Relevance Criteria in Image Retrieval in American History.

    ERIC Educational Resources Information Center

    Choi, Youngok; Rasmussen, Edie M.

    2002-01-01

    Discussion of the availability of digital images focuses on a study of American history faculty and graduate students that investigated the criteria which image users apply when making judgments about the relevance of an image. Considers topicality and image quality and suggests implications for image retrieval system design. (Contains 63…

  11. Evaluation of refocus criteria for holographic particle imaging

    NASA Astrophysics Data System (ADS)

    Picart, Pascal; Kara Mohammed, Soumaya; Bouamama, Larbi; Bahloul, Derradji

    2017-06-01

    This paper proposes a quality assessment of focusing criteria for imaging in digital off-axis holography. In the literature, several refocus criteria have been proposed to find the best refocus distance in digital holography. As a general rule, the best focusing plane is determined by the reconstruction distance for which the criterion function presents a maximum or a minimum. To evaluate the robustness of these criteria, a set of thirteen criteria is compared by applying them to both amplitude and phase images from off-axis holographic data. The experimental results lead to a general rule and identify the most robust criteria for accurate and rapid refocusing in digital holography.

  12. Repository operational criteria comparative analysis

    SciTech Connect

    Hageman, J.P.; Chowdhury, A.H.

    1994-06-01

    The objective of the "Repository Operational Criteria (ROC) Feasibility Studies" (or ROC task) was to conduct comprehensive and integrated analyses of repository design, construction, and operations criteria in 10 CFR Part 60 regulations, considering the interfaces among the components of the regulations and the impacts of any potential changes to those regulations. The ROC task addresses regulatory criteria and uncertainties related to the preclosure aspects of the geologic repository. Those parts of 10 CFR Part 60 that require routine guidance or minor changes to the rule were addressed in Hageman and Chowdhury, 1992. The ROC task shows a possible need for further regulatory clarity, through major changes to the rule, related to the design bases and siting of a geologic repository operations area and to radiological emergency planning in order to assure defense-in-depth. The analyses presented in this report resulted in the development and refinement of regulatory concepts and their supporting rationale for recommendations for potential major changes to 10 CFR Part 60 regulations.

  13. Mangrove vulnerability modelling in parts of Western Niger Delta, Nigeria using satellite images, GIS techniques and Spatial Multi-Criteria Analysis (SMCA).

    PubMed

    Omo-Irabor, Omo O; Olobaniyi, Samuel B; Akunna, Joe; Venus, Valentijn; Maina, Joseph M; Paradzayi, Charles

    2011-07-01

    Mangroves are known for their global environmental and socioeconomic value. Despite their importance, mangroves, like other ecosystems, are now being threatened by natural and human-induced processes that damage them at alarming rates, thereby diminishing the limited extent of existing mangrove vegetation. The development of a spatial vulnerability assessment model that takes into consideration environmental and socioeconomic criteria, in spatial and non-spatial formats, has been attempted in this study. According to the model, 11 different input parameters are required in modelling mangrove vulnerability. These parameters and their effects on mangrove vulnerability were selected and weighted by experts in the related fields. Criteria identification and selection were mainly based on the effects of environmental and socioeconomic changes associated with mangrove survival. The results obtained revealed the dominance of socioeconomic criteria such as population pressure and deforestation, with a high vulnerability index of 0.75. The environmental criteria were broadly dispersed in the study area and represent vulnerability indices ranging from 0.00 to 0.75. This category reflects the greater influence of pollutant input from oil wells and pipelines and the minimal contribution from climatic factors. This project has produced an integrated spatial management framework for mangrove vulnerability assessment that utilises information technology in conjunction with expert knowledge and multi-criteria analysis to aid planners and policy/decision makers in the protection of this very fragile ecosystem.
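
    At its core, the spatial multi-criteria step combines normalized criterion layers with expert weights into a single vulnerability index per cell. A minimal raster sketch follows, with invented layer names and weights rather than the study's expert-derived ones.

      import numpy as np

      def vulnerability_index(layers, weights):
          # layers: dict name -> 2-D array scaled to 0..1; weights: dict name -> weight (sums to 1)
          idx = np.zeros_like(next(iter(layers.values())), dtype=float)
          for name, layer in layers.items():
              idx += weights[name] * layer
          return idx

      rng = np.random.default_rng(1)
      shape = (50, 50)
      layers = {"population_pressure": rng.random(shape),
                "deforestation":       rng.random(shape),
                "oil_infrastructure":  rng.random(shape),
                "climatic_factors":    rng.random(shape)}
      weights = {"population_pressure": 0.35, "deforestation": 0.30,
                 "oil_infrastructure": 0.25, "climatic_factors": 0.10}
      index = vulnerability_index(layers, weights)
      print(index.min(), index.max())   # per-cell indices between 0 and 1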

  14. Quality criteria for simulator images - A literature review

    NASA Astrophysics Data System (ADS)

    Padmos, Pieter; Milders, Maarten V.

    1992-12-01

    Quality criteria are presented for each of about 30 different outside-world image features of computer-generated image systems on vehicle simulators (e.g., airplane, tank, ship). Criteria derived are based on a literature review. In addition to purely physical properties related to image presentation (e.g., field size, contrast ratio, update frequency), attention is paid to image content (e.g., number of polygons, surface treatments, moving objects) and various other features (e.g., electro-optical aids, vehicle-terrain interactions, modeling tools, instruction tools). Included in this paper are an introduction on visual perception, separate discussions of each image feature including terminology definitions, and suggestions for further research.

  15. Prioritization criteria of objective index for disaster management by satellite image processing

    NASA Astrophysics Data System (ADS)

    Poursaber, Mohammad R.; Ariki, Yasuo; Safi, Mohammad

    2014-10-01

    The outputs obtained from satellite image processing generally present various information depending on the interpretation technique, the objects selected for object-based processing, the precision of processing, and the number and timing of the images used. This issue should be managed well during a disaster management process based on satellite images. Very high resolution (VHR) optical satellite data are potential sources of detailed information on damage and geological changes over a large area in a short time. In this paper, we studied the tsunami-affected area caused by the Tohoku earthquake of 11 March 2011, using VHR data from GeoEye-1 satellite images. A set of pre- and post-earthquake images was used to perform visual change analysis through comparison of these data. These images include data of the same area before the disaster, in normal condition, and after the disaster, which caused changes and also some modifications to that area. Upon occurrence of a disaster, the images are used to estimate the extent of the damage. Then, based on disaster management criteria and the needs for recovery and reconstruction, the priorities for object-based classification indexes are defined. In post-disaster management, they are used for reconstruction and sustainable development activities. Finally, a classified characteristic definition has been proposed which can be used as sample index prioritization criteria for disaster management based on satellite image processing. These prioritization criteria are based on an object-based processing technique and can be further developed for other image processing methods.

  16. Vestibular migraine: comparative analysis between diagnostic criteria.

    PubMed

    Salmito, Márcio Cavalcante; Morganti, Lígia Oliveira Gonçalves; Nakao, Bruno Higa; Simões, Juliana Caminha; Duarte, Juliana Antoniolli; Ganança, Fernando Freitas

    2015-01-01

    There is a strong association between vertigo and migraine. Vestibular migraine (VM) was described in 1999, and diagnostic criteria were proposed in 2001 and revised in 2012. To compare the diagnostic criteria for VM proposed in 2001 with 2012 criteria with respect to their diagnostic power and therapeutic effect of VM prophylaxis. Clinical chart review of patients attended to in a VM clinic. The 2012 criteria made the diagnosis more specific, restricting the diagnosis of VM to a smaller number of patients, such that 87.7% of patients met 2001 criteria and 77.8% met 2012 criteria. Prophylaxis for VM was effective both for patients diagnosed by either set of criteria and for those who did not meet any of the criteria. The 2012 diagnostic criteria for VM limited the diagnosis of the disease to a smaller number of patients, mainly because of the type, intensity, and duration of dizziness. Patients diagnosed with migraine and associated dizziness demonstrated improvement after prophylactic treatment of VM, even when they did not meet diagnostic criteria. Copyright © 2015 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  17. HER2 amplification in gastroesophageal adenocarcinoma: correlation of two antibodies using gastric cancer scoring criteria, H score, and digital image analysis with fluorescence in situ hybridization.

    PubMed

    Radu, Oana M; Foxwell, Tyler; Cieply, Kathleen; Navina, Sarah; Dacic, Sanja; Nason, Katie S; Davison, Jon M

    2012-04-01

    We assessed 103 resected gastroesophageal adenocarcinomas for HER2 amplification by fluorescence in situ hybridization (FISH) and 2 commercial immunohistochemical assays. Of 103, 30 (29%) were FISH-amplified. Both immunohistochemical assays had greater than 95% concordance with FISH. However, as a screening test for FISH amplification, the Ventana Medical Systems (Tucson, AZ) 4B5 antibody demonstrated superior sensitivity (87%) compared with the DAKO (Carpinteria, CA) A0485 (70%). Of the cases, 28 were immunohistochemically 3+ or immunohistochemically 2+/FISH-amplified with the 4B5 assay compared with only 22 cases with the A0485 assay, representing a large potential difference in patient eligibility for anti-HER2 therapy. Cases with low-level FISH amplification (HER2/CEP17, 2.2-4.0) express lower levels of HER2 protein compared with cases with high-level amplification (HER2/CEP17, ≥4.0), raising the possibility of a differential response to anti-HER2 therapy. The H score and digital image analysis may have a limited role in improving HER2 test performance.

  18. Retinal Imaging and Image Analysis

    PubMed Central

    Abràmoff, Michael D.; Garvin, Mona K.; Sonka, Milan

    2011-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of blindness in the industrialized world that includes age-related macular degeneration, diabetic retinopathy, and glaucoma, the review is devoted to retinal imaging and image analysis methods and their clinical implications. Methods for 2-D fundus imaging and techniques for 3-D optical coherence tomography (OCT) imaging are reviewed. Special attention is given to quantitative techniques for analysis of fundus photographs with a focus on clinically relevant assessment of retinal vasculature, identification of retinal lesions, assessment of optic nerve head (ONH) shape, building retinal atlases, and to automated methods for population screening for retinal diseases. A separate section is devoted to 3-D analysis of OCT images, describing methods for segmentation and analysis of retinal layers, retinal vasculature, and 2-D/3-D detection of symptomatic exudate-associated derangements, as well as to OCT-based analysis of ONH morphology and shape. Throughout the paper, aspects of image acquisition, image analysis, and clinical relevance are treated together considering their mutually interlinked relationships. PMID:21743764

  19. Analysis of the impact of safeguards criteria

    SciTech Connect

    Mullen, M.F.; Reardon, P.T.

    1981-01-01

    As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables. That is, the effects of a given variable often depend on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed; reasonably good approximate prediction equations can be developed using the methodology described here.

  20. Morphological criteria of feminine upper eyelashes, quantified by a new semi-automatized image analysis: Application to the objective assessment of mascaras.

    PubMed

    Shaiek, A; Flament, F; François, G; Vicic, M; Cointereau-Chardron, S; Curval, E; Canevet-Zaida, S; Coubard, O; Idelcaid, Y

    2017-09-24

    The wide diversity of feminine eyelashes in shape, length, and curvature makes them a complex domain that remains to be quantified in vivo, together with the changes brought about by the application of mascaras, which are visually assessed by women themselves or by make-up experts. Dedicated software was developed to semi-automatically extract and quantify, from digital images (frontal and lateral pictures), the major parameters of the eyelashes of Mexican and Caucasian women and to record the changes brought by the application of various mascaras and their brushes, whether self-applied or professionally applied. The diversity of feminine eyelashes appears to be a major influencing factor in the application of mascaras and their results. Eight marketed mascaras and their respective brushes were tested, and their quantitative profiles in terms of coverage, morphology, and curvature were assessed. Standard applications by trained aestheticians led to higher and more homogeneous deposits of mascara compared with those resulting from self-application. The developed software appears to be a valuable tool both for quantifying the major characteristics of eyelashes and for assessing the make-up results produced by mascaras and their associated brushes. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Algorithms and Array Design Criteria for Robust Imaging in Interferometry

    DTIC Science & Technology

    2016-04-01

    of a pattern family popular in the literature. The end result of this set of analysis is, to the best of our knowledge, the first sufficient condition...scenes which closely match its point-source-collection assumption. The astronomical community has hence been led to consider more generally-applicable ...a compact image intensity. This kind of joint metric has been suggested for optical interferometric applications before in the work of Thiébaut (2013

  2. An Analysis of Stopping Criteria in Artificial Neural Networks

    DTIC Science & Technology

    1994-03-01

    I’AD-A278 491(1 AN ANALYSIS OF STOPPING CRITERIA IN ARTIFICIAL NEURAL NETWORKS THESIS Bruce Kostal Captain, USAF AFIT/GST/ENS/94M 07 D I ELECTE APR...ANALYSIS OF STOPPING CRITERIA IN ARTIFICIAL NEURAL NETWORKS THESIS Bruce Kostal Captain, USAF AFIT/GST/ENS/94M-07 ETIC ELECTE 94-12275 APR2 1994 U Approved...for public release; distributi6 unlimited D94󈧮i •6 AFIT/GST/ENS/94M-07 AN ANALYSIS OF STOPPING CRITERIA IN ARTIFICIAL NEURAL NETWORKS THESIS

  3. On Model Selection Criteria in Multimodel Analysis

    NASA Astrophysics Data System (ADS)

    Meyer, P. D.; Ye, M.; Neuman, S. P.

    2007-12-01

    Hydrologic systems are open and complex, rendering them prone to multiple conceptualizations and mathematical descriptions. There has been a growing tendency to postulate several alternative hydrologic models for a site and use model selection criteria to (a) rank these models, (b) eliminate some of them and/or (c) weigh and average predictions and statistics generated by multiple models. This has led to some debate among hydrogeologists about the merits and demerits of common model selection (also known as model discrimination or information) criteria such as AIC, AICc, BIC, and KIC and some lack of clarity about the proper interpretation and mathematical representation of each criterion. In particular, whereas we [Neuman, 2003; Ye et al., 2004, 2005; Meyer et al., 2007] have based our approach to multimodel hydrologic ranking and inference on the Bayesian criterion KIC (which reduces asymptotically to BIC), Poeter and Anderson [2005] have voiced a strong preference for the information-theoretic criterion AICc (which reduces asymptotically to AIC). Their preference stems in part from a perception that KIC and BIC require a "true" or "quasi-true" model to be in the set of alternatives while AIC and AICc are free of such an unreasonable requirement. We examine the model selection literature to find that (a) all published rigorous derivations of AIC and AICc require that the (true) model having generated the observational data be in the set of candidate models; (b) though BIC and KIC were originally derived by assuming that such a model is in the set, BIC has been rederived by Cavanaugh and Neath [1999] without the need for such an assumption; (c) KIC reduces to BIC as the number of observations becomes large relative to the number of adjustable model parameters, implying that it likewise does not require the existence of a true model in the set of alternatives; (d) if a true model is in the set, BIC and KIC select with probability one the true model as sample size
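
    For reference, the standard forms of three of the criteria discussed above, for n observations, k adjustable parameters, and maximized log-likelihood ln L, are AIC = -2 ln L + 2k, AICc = AIC + 2k(k + 1)/(n - k - 1), and BIC = -2 ln L + k ln n; KIC additionally carries a Fisher-information term and is omitted here. A small sketch with hypothetical fit results:

      import numpy as np

      def aic(loglik, k):
          return -2.0 * loglik + 2.0 * k

      def aicc(loglik, k, n):
          # small-sample correction to AIC
          return aic(loglik, k) + 2.0 * k * (k + 1) / (n - k - 1)

      def bic(loglik, k, n):
          return -2.0 * loglik + k * np.log(n)

      # Two hypothetical alternative models fit to n = 40 observations
      n = 40
      models = {"simple (k=3)":  {"loglik": -95.0, "k": 3},
                "complex (k=8)": {"loglik": -90.0, "k": 8}}
      for name, m in models.items():
          print(name, aic(m["loglik"], m["k"]), aicc(m["loglik"], m["k"], n), bic(m["loglik"], m["k"], n))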

  4. ACR Appropriateness Criteria(®) Imaging of Possible Tuberculosis.

    PubMed

    Ravenel, James G; Chung, Jonathan H; Ackman, Jeanne B; de Groot, Patricia M; Johnson, Geoffrey B; Jokerst, Clinton; Maldonado, Fabien; McComb, Barbara L; Steiner, Robert M; Mohammed, Tan-Lucien

    2017-05-01

    Pulmonary tuberculosis remains a major cause of disease worldwide and an important public health hazard in the United States. The imaging evaluation depends to a large degree on clinical symptoms and whether active disease is suspected or a subject is at high risk for developing active disease. The American College of Radiology Appropriateness Criteria are evidence-based guidelines for specific clinical conditions that are reviewed annually by a multidisciplinary expert panel. The guideline development and revision include an extensive analysis of current medical literature from peer reviewed journals and the application of well-established methodologies (RAND/UCLA Appropriateness Method and Grading of Recommendations Assessment, Development, and Evaluation or GRADE) to rate the appropriateness of imaging and treatment procedures for specific clinical scenarios. In those instances where evidence is lacking or equivocal, expert opinion may supplement the available evidence to recommend imaging or treatment. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  5. A new tool for analysis of cleanup criteria decisions.

    SciTech Connect

    Klemic, G. A.; Bailey, P.; Elcock, D.; Environmental Assessment; USDOE

    2003-08-01

    Radionuclides and other hazardous materials resulting from processes used in nuclear weapons production contaminate soil, groundwater, and buildings around the United States. Cleanup criteria for environmental contaminants are agreed on prior to remediation and underpin the scope and legacy of the cleanup process. Analysis of cleanup criteria can be relevant for future agreements and may also provide insight into a complex decision making process where science and policy issues converge. An Internet accessible database has been established to summarize cleanup criteria and related factors involved in U.S. Department of Energy remediation decisions. This paper reports on a new user interface for the database that is designed to integrate related information into graphic displays and tables with interactive features that allow exploratory data analysis of cleanup criteria. Analysis of 137Cs in surface soil is presented as an example.

  6. Image Analysis and Modeling

    DTIC Science & Technology

    1976-03-01

    This report summarizes the results of the research program on Image Analysis and Modeling supported by the Defense Advanced Research Projects Agency...The objective is to achieve a better understanding of image structure and to use this knowledge to develop improved image models for use in image ... analysis and processing tasks such as information extraction, image enhancement and restoration, and coding. The ultimate objective of this research is

  7. Image-analysis library

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The MATHPAC image-analysis library is a collection of general-purpose mathematical and statistical routines and special-purpose data-analysis and pattern-recognition routines for image analysis. The MATHPAC library consists of Linear Algebra, Optimization, Statistical-Summary, Densities and Distribution, Regression, and Statistical-Test packages.

  8. Basics of image analysis

    USDA-ARS?s Scientific Manuscript database

    Hyperspectral imaging technology has emerged as a powerful tool for quality and safety inspection of food and agricultural products and in precision agriculture over the past decade. Image analysis is a critical step in implementing hyperspectral imaging technology; it is aimed to improve the qualit...

  9. Evaluation of Dairy Effluent Management Options Using Multiple Criteria Analysis

    NASA Astrophysics Data System (ADS)

    Hajkowicz, Stefan A.; Wheeler, Sarah A.

    2008-04-01

    This article describes how options for managing dairy effluent on the Lower Murray River in South Australia were evaluated using multiple criteria analysis (MCA). Multiple criteria analysis is a framework for combining multiple environmental, social, and economic objectives in policy decisions. At the time of the study, dairy irrigation in the region was based on flood irrigation which involved returning effluent to the river. The returned water contained nutrients, salts, and microbial contaminants leading to environmental, human health, and tourism impacts. In this study MCA was used to evaluate 11 options against 6 criteria for managing dairy effluent problems. Of the 11 options, the MCA model selected partial rehabilitation of dairy paddocks with the conversion of remaining land to other agriculture. Soon after, the South Australian Government adopted this course of action and is now providing incentives for dairy farmers in the region to upgrade irrigation infrastructure and/or enter alternative industries.
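
    A weighted-sum MCA of the kind described above reduces to scoring each option on each criterion, weighting, and ranking. The option names, criteria, scores, and weights below are purely illustrative, not the values used in the study.

      import numpy as np

      options = ["full rehabilitation", "partial rehab + other agriculture", "no change"]
      criteria = ["water quality", "cost", "farm income", "tourism"]
      weights = np.array([0.4, 0.2, 0.2, 0.2])          # must sum to 1
      scores = np.array([[0.9, 0.2, 0.5, 0.8],          # rows: options, cols: criteria (0..1, higher is better)
                         [0.7, 0.6, 0.7, 0.7],
                         [0.1, 1.0, 0.6, 0.2]])

      totals = scores @ weights
      for total, name in sorted(zip(totals, options), reverse=True):
          print(f"{name}: {total:.2f}")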

  10. Evaluation of dairy effluent management options using multiple criteria analysis.

    PubMed

    Hajkowicz, Stefan A; Wheeler, Sarah A

    2008-04-01

    This article describes how options for managing dairy effluent on the Lower Murray River in South Australia were evaluated using multiple criteria analysis (MCA). Multiple criteria analysis is a framework for combining multiple environmental, social, and economic objectives in policy decisions. At the time of the study, dairy irrigation in the region was based on flood irrigation which involved returning effluent to the river. The returned water contained nutrients, salts, and microbial contaminants leading to environmental, human health, and tourism impacts. In this study MCA was used to evaluate 11 options against 6 criteria for managing dairy effluent problems. Of the 11 options, the MCA model selected partial rehabilitation of dairy paddocks with the conversion of remaining land to other agriculture. Soon after, the South Australian Government adopted this course of action and is now providing incentives for dairy farmers in the region to upgrade irrigation infrastructure and/or enter alternative industries.

  11. GIS Based Multi-Criteria Decision Analysis For Cement Plant Site Selection For Cuddalore District

    NASA Astrophysics Data System (ADS)

    Chhabra, A.

    2015-12-01

    India's cement industry is a vital part of its economy, providing employment to more than a million people. On the back of growing demand due to increased construction and infrastructural activities, the cement market in India is expected to grow at a compound annual growth rate (CAGR) of 8.96 percent during the period 2014-2019. In this study, GIS-based spatial Multi Criteria Decision Analysis (MCDA) is used to determine the optimum and alternative sites to set up a cement plant. This technique contains a set of evaluation criteria which are quantifiable indicators of the extent to which decision objectives are realized. In combination with available GIS (Geographical Information System) and local ancillary data, the outputs of image analysis serve as input for the multi-criteria decision making system. The following steps were performed to represent the criteria as GIS layers, which then underwent GIS analysis in order to obtain several potential sites. Satellite imagery from LANDSAT 8 and ASTER DEM were used for the analysis. Cuddalore District in Tamil Nadu was selected as the study site because limestone mining is already being carried out in that region, which meets the raw material criterion for cement production. Several other criteria considered were land use land cover (LULC) classification (built-up area, river, forest cover, wet land, barren land, harvest land and agriculture land), slope, and proximity to road, railway and drainage networks.

  12. 10 CFR 434.607 - Life cycle cost analysis criteria.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Life cycle cost analysis criteria. 434.607 Section 434.607 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.607 Life cycle cost...

  13. 10 CFR 434.607 - Life cycle cost analysis criteria.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Life cycle cost analysis criteria. 434.607 Section 434.607 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.607 Life cycle cost...

  14. 10 CFR 434.607 - Life cycle cost analysis criteria.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Life cycle cost analysis criteria. 434.607 Section 434.607 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.607 Life cycle cost...

  15. 10 CFR 434.607 - Life cycle cost analysis criteria.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Life cycle cost analysis criteria. 434.607 Section 434.607 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.607 Life cycle cost...

  16. 10 CFR 434.607 - Life cycle cost analysis criteria.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Life cycle cost analysis criteria. 434.607 Section 434.607 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.607 Life cycle cost...

  17. Resolution criteria in double-slit microscopic imaging experiments

    NASA Astrophysics Data System (ADS)

    You, Shangting; Kuang, Cuifang; Zhang, Baile

    2016-09-01

    Double-slit imaging is widely used for verifying the resolution of high-resolution and super-resolution microscopies. However, due to the fabrication limits, the slit width is generally non-negligible, which can affect the claimed resolution. In this paper we theoretically calculate the electromagnetic field distribution inside and near the metallic double slit using waveguide mode expansion method, and acquire the far-field image by vectorial Fourier optics. We find that the slit width has minimal influence when the illuminating light is polarized parallel to the slits. In this case, the claimed resolution should be based on the center-to-center distance of the double-slit.

  18. Forensic video image analysis

    NASA Astrophysics Data System (ADS)

    Edwards, Thomas R.

    1997-02-01

    Forensic video image analysis is a new scientific tool for perpetrator enhancement and identification in poorly recorded crime scene situations. It is an emerging technology for law enforcement, industrial security, and surveillance, addressing the following problems often found in such poor-quality video-recorded incidents.

  19. A Study on the Basic Criteria for Selecting Heterogeneity Parameters of F18-FDG PET Images.

    PubMed

    Forgacs, Attila; Pall Jonsson, Hermann; Dahlbom, Magnus; Daver, Freddie; D DiFranco, Matthew; Opposits, Gabor; K Krizsan, Aron; Garai, Ildiko; Czernin, Johannes; Varga, Jozsef; Tron, Lajos; Balkay, Laszlo

    2016-01-01

    Textural analysis might give new insights into the quantitative characterization of metabolically active tumors. More than thirty textural parameters have already been investigated in previous F18-FDG studies. The purpose of this paper is to set out basic requirements, as a selection strategy, for identifying the most appropriate heterogeneity parameters to measure textural features. Our predefined requirements were: a reliable heterogeneity parameter has to be volume independent, reproducible, and suitable for expressing the degree of heterogeneity quantitatively. Based on these criteria, we compared various suggested measures of homogeneity. A homogeneous cylindrical phantom was measured on three different PET/CT scanners using the commonly used protocol. In addition, a custom-made inhomogeneous tumor insert placed into the NEMA image quality phantom was imaged with a set of acquisition times and several different reconstruction protocols. PET data of 65 patients with proven lung lesions were retrospectively analyzed as well. Four heterogeneity parameters out of 27 were found to be the most attractive for characterizing the textural properties of metabolically active tumors in FDG PET images. These four parameters were Entropy, Contrast, Correlation, and Coefficient of Variation. These parameters were independent of the delineated tumor volume (larger than 25-30 ml), provided reproducible values (relative standard deviation < 10%), and showed high sensitivity to changes in heterogeneity. Phantom measurements are a viable way to test the reliability of heterogeneity parameters that would be of interest to nuclear imaging clinicians.
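
    The four selected parameters can be computed from a gray-level co-occurrence matrix (GLCM) together with simple voxel statistics. The following plain-numpy sketch uses a 2-D slice and a single horizontal offset for illustration; the study itself worked on delineated 3-D tumor volumes, so this is the textbook definition rather than the authors' implementation, and the input slice is synthetic.

      import numpy as np

      def glcm_features(img, levels=16):
          # Entropy, contrast and correlation from a symmetric, normalized GLCM (offset = 1 voxel right)
          q = np.floor(img / img.max() * (levels - 1)).astype(int)      # quantize to gray levels
          glcm = np.zeros((levels, levels))
          for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):          # horizontally adjacent pairs
              glcm[a, b] += 1
              glcm[b, a] += 1                                            # make it symmetric
          p = glcm / glcm.sum()
          i, j = np.indices(p.shape)
          entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))
          contrast = np.sum(p * (i - j) ** 2)
          mu_i, mu_j = np.sum(i * p), np.sum(j * p)
          sd_i = np.sqrt(np.sum((i - mu_i) ** 2 * p))
          sd_j = np.sqrt(np.sum((j - mu_j) ** 2 * p))
          correlation = np.sum((i - mu_i) * (j - mu_j) * p) / (sd_i * sd_j)
          return entropy, contrast, correlation

      def coefficient_of_variation(voxels):
          return np.std(voxels) / np.mean(voxels)

      rng = np.random.default_rng(2)
      slice_ = rng.random((32, 32)) * 10 + 1        # toy "uptake" slice
      print(glcm_features(slice_), coefficient_of_variation(slice_))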

  20. Improvement and Extension of Shape Evaluation Criteria in Multi-Scale Image Segmentation

    NASA Astrophysics Data System (ADS)

    Sakamoto, M.; Honda, Y.; Kondo, A.

    2016-06-01

    Over the last decade, multi-scale image segmentation has attracted particular interest and is being used in practice for object-based image analysis. In this study, we have addressed issues in multi-scale image segmentation, especially in improving the validity of merging and the variety of derived region shapes. Firstly, we introduced constraints on the application of the spectral criterion that can suppress excessive merging between dissimilar regions. Secondly, we extended the evaluation of the smoothness criterion by modifying the definition of the extent of the object, introduced for controlling shape diversity. Thirdly, we developed a new shape criterion called aspect ratio. This criterion helps to improve the reproducibility of object shapes so that they match the actual objects of interest. It constrains the aspect ratio of the object's bounding box while keeping the properties controlled by conventional shape criteria. These improvements and extensions lead to more accurate, flexible, and diverse segmentation results according to the shape characteristics of the target of interest. Furthermore, we also investigated a technique for quantitative and automatic parameterization in multi-scale image segmentation. This approach compares the segmentation result with a training area specified in advance, maximizing the average area of the derived objects or satisfying the evaluation index called the F-measure. Thus, it has been possible to automate a parameterization suited to the objectives, especially from the viewpoint of shape reproducibility.
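
    The aspect-ratio criterion can be illustrated by a bounding-box measurement on a single segmented object; this simplified sketch ignores how the ratio is combined with the spectral and smoothness terms during merging, and the example mask is invented.

      import numpy as np

      def bounding_box_aspect_ratio(mask):
          # mask: boolean 2-D array marking one segmented object
          r = np.flatnonzero(np.any(mask, axis=1))
          c = np.flatnonzero(np.any(mask, axis=0))
          height = r.max() - r.min() + 1
          width = c.max() - c.min() + 1
          long_side, short_side = max(height, width), min(height, width)
          return long_side / short_side      # 1.0 for square-ish objects, large for elongated ones

      mask = np.zeros((100, 100), dtype=bool)
      mask[40:45, 10:90] = True              # an elongated object (5 x 80 pixels)
      print(bounding_box_aspect_ratio(mask)) # -> 16.0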

  1. A Study on the Basic Criteria for Selecting Heterogeneity Parameters of F18-FDG PET Images

    PubMed Central

    Forgacs, Attila; Pall Jonsson, Hermann; Dahlbom, Magnus; Daver, Freddie; D. DiFranco, Matthew; Opposits, Gabor; K. Krizsan, Aron; Garai, Ildiko; Czernin, Johannes; Varga, Jozsef; Tron, Lajos; Balkay, Laszlo

    2016-01-01

    Textural analysis might give new insights into the quantitative characterization of metabolically active tumors. More than thirty textural parameters have already been investigated in previous F18-FDG studies. The purpose of this paper is to set out basic requirements, as a selection strategy, for identifying the most appropriate heterogeneity parameters to measure textural features. Our predefined requirements were: a reliable heterogeneity parameter has to be volume independent, reproducible, and suitable for expressing the degree of heterogeneity quantitatively. Based on these criteria, we compared various suggested measures of homogeneity. A homogeneous cylindrical phantom was measured on three different PET/CT scanners using the commonly used protocol. In addition, a custom-made inhomogeneous tumor insert placed into the NEMA image quality phantom was imaged with a set of acquisition times and several different reconstruction protocols. PET data of 65 patients with proven lung lesions were retrospectively analyzed as well. Four heterogeneity parameters out of 27 were found to be the most attractive for characterizing the textural properties of metabolically active tumors in FDG PET images. These four parameters were Entropy, Contrast, Correlation, and Coefficient of Variation. These parameters were independent of the delineated tumor volume (larger than 25–30 ml), provided reproducible values (relative standard deviation < 10%), and showed high sensitivity to changes in heterogeneity. Phantom measurements are a viable way to test the reliability of heterogeneity parameters that would be of interest to nuclear imaging clinicians. PMID:27736888

  2. Endovascular Stroke Treatment Outcomes After Patient Selection Based on Magnetic Resonance Imaging and Clinical Criteria.

    PubMed

    Leslie-Mazwi, Thabele M; Hirsch, Joshua A; Falcone, Guido J; Schaefer, Pamela W; Lev, Michael H; Rabinov, James D; Rost, Natalia S; Schwamm, Lee; González, R Gilberto

    2016-01-01

    Which imaging modality is optimal to select patients for endovascular stroke treatment remains unclear. To evaluate the effectiveness of specific magnetic resonance imaging (MRI) and clinical criteria in the selection of patients with acute ischemic stroke for thrombectomy. In this observational, single-center, prospective cohort study, we studied 72 patients with middle cerebral artery or terminal internal carotid artery occlusion using computed tomographic angiography, followed by core infarct volume determination by diffusion weighted MRI, who underwent thrombectomy after meeting institutional criteria from January 1, 2012, through December 31, 2014. In this period, 31 patients with similar ischemic strokes underwent endovascular treatment without MRI and are categorized as computed tomography only and considered in a secondary analysis. Patients were prospectively classified as likely to benefit (LTB) or uncertain to benefit (UTB) using diffusion-weighted imaging lesion volume and clinical criteria (age, National Institutes of Health Stroke Scale score, time from onset, baseline modified Rankin Scale [mRS] score, life expectancy). The 90-day mRS score, with favorable defined as a 90-day mRS score of 2 or less. Forty patients were prospectively classified as LTB and 32 as UTB. Reperfusion (71 of 103 patients) and prospective categorization as LTB (40 of 103 patients) were associated with favorable outcomes (P < .001 and P < .005, respectively). Successful reperfusion positively affected the distribution of mRS scores of the LTB cohort (P < .001). Reperfusion was achieved in 27 LTB patients (67.5%) and 24 UTB patients (75.0%) (P = .86). Favorable outcomes were obtained in 21 (52.5%) and 8 (25.0%) of LTB and UTB patients who were treated, respectively (P = .02). Favorable outcomes were observed in 20 of the 27 LTB patients (74.1%) who had successful reperfusion compared with 8 of the 24 UTB patients (33.3%) who had successful reperfusion (P

  3. Improving diagnostic criteria for Propionibacterium acnes osteomyelitis: a retrospective analysis.

    PubMed

    Asseray, Nathalie; Papin, Christophe; Touchais, Sophie; Bemer, Pascale; Lambert, Chantal; Boutoille, David; Tequi, Brigitte; Gouin, François; Raffi, François; Passuti, Norbert; Potel, Gilles

    2010-07-01

    The identification of Propionibacterium acnes in cultures of bone and joint samples is always difficult to interpret because of the ubiquity of this microorganism. The aim of this study was to propose a diagnostic strategy to distinguish infections from contaminations. This was a retrospective analysis of all patient charts of those patients with ≥1 deep samples culture-positive for P. acnes. Every criterion was tested for sensitivity, specificity, and positive likelihood ratio, and then the diagnostic probability of combinations of criteria was calculated. Among 65 patients, 52 (80%) were considered truly infected with P. acnes, a diagnosis based on a multidisciplinary process. The most valuable diagnostic criteria were: ≥2 positive deep samples, peri-operative findings (necrosis, hardware loosening, etc.), and ≥2 surgical procedures. However, no single criterion was sufficient to ascertain the diagnosis. The following combinations of criteria had a diagnostic probability of >90%: ≥2 positive cultures + 1 criterion among: peri-operative findings, local signs of infection, ≥2 previous operations, orthopaedic devices; 1 positive culture + 3 criteria among: peri-operative findings, local signs of infection, ≥2 previous surgical operations, orthopaedic devices, inflammatory syndrome. The diagnosis of P. acnes osteomyelitis was greatly improved by combining different criteria, allowing differentiation between infection and contamination.

  4. Resolution criteria in double-slit microscopic imaging experiments

    PubMed Central

    You, Shangting; Kuang, Cuifang; Zhang, Baile

    2016-01-01

    Double-slit imaging is widely used for verifying the resolution of high-resolution and super-resolution microscopies. However, due to the fabrication limits, the slit width is generally non-negligible, which can affect the claimed resolution. In this paper we theoretically calculate the electromagnetic field distribution inside and near the metallic double slit using waveguide mode expansion method, and acquire the far-field image by vectorial Fourier optics. We find that the slit width has minimal influence when the illuminating light is polarized parallel to the slits. In this case, the claimed resolution should be based on the center-to-center distance of the double-slit. PMID:27640808

  5. Multisensor Image Analysis System

    DTIC Science & Technology

    1993-04-15

    AD-A263 679: Multisensor Image Analysis System, Final Report. Authors: Dr. G. M. Flachs, Dr. Michael Giles, Dr. Jay Jordan, Dr. Eric... Report type and dates covered: Final.

  6. A Comparative Investigation of Rotation Criteria Within Exploratory Factor Analysis.

    PubMed

    Sass, Daniel A; Schmitt, Thomas A

    2010-01-29

    Exploratory factor analysis (EFA) is a commonly used statistical technique for examining the relationships between variables (e.g., items) and the factors (e.g., latent traits) they depict. There are several decisions that must be made when using EFA, with one of the more important being choice of the rotation criterion. This selection can be arduous given the numerous rotation criteria available and the lack of research/literature that compares their function and utility. Historically, researchers have chosen rotation criteria based on whether or not factors are correlated and have failed to consider other important aspects of their data. This study reviews several rotation criteria, demonstrates how they may perform with different factor pattern structures, and highlights for researchers subtle but important differences between each rotation criterion. The choice of rotation criterion is critical to ensure researchers make informed decisions as to when different rotation criteria may or may not be appropriate. The results suggest that depending on the rotation criterion selected and the complexity of the factor pattern matrix, the interpretation of the interfactor correlations and factor pattern loadings can vary substantially. Implications and future directions are discussed.

  7. Coastal zone management with stochastic multi-criteria analysis.

    PubMed

    Félix, A; Baquerizo, A; Santiago, J M; Losada, M A

    2012-12-15

    The methodology for coastal management proposed in this study takes into account the physical processes of the coastal system and the stochastic nature of forcing agents. Simulation techniques are used to assess the uncertainty in the performance of a set of predefined management strategies based on different criteria representing the main concerns of interest groups. This statistical information as well as the distribution function that characterizes the uncertainty regarding the preferences of the decision makers is fed into a stochastic multi-criteria acceptability analysis that provides the probability of alternatives obtaining certain ranks and also calculates the preferences of a typical decision maker who supports an alternative. This methodology was applied as a management solution for Playa Granada in the Guadalfeo River Delta (Granada, Spain), where the construction of a dam in the river basin is causing severe erosion. The analysis of shoreline evolution took into account the coupled action of atmosphere, ocean, and land agents and their intrinsic stochastic character. This study considered five different management strategies. The criteria selected for the analysis were the economic benefits for three interest groups: (i) indirect beneficiaries of tourist activities; (ii) beach homeowners; and (iii) the administration. The strategies were ranked according to their effectiveness, and the relative importance given to each criterion was obtained.
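    A minimal sketch of the core of stochastic multi-criteria acceptability analysis is given below: criterion values and preference weights are sampled repeatedly, and the frequency with which each strategy attains each rank is recorded. The strategies, benefit values, and uncertainty level are hypothetical placeholders, not figures from the Playa Granada case study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean benefits (rows: 5 strategies, cols: 3 interest groups);
# none of these numbers come from the Playa Granada study.
mean_benefit = np.array([[3.0, 2.0, 1.0],
                         [2.5, 2.5, 1.5],
                         [1.0, 3.0, 2.0],
                         [2.0, 1.5, 3.0],
                         [1.5, 1.0, 2.5]])
n_strategies, n_criteria = mean_benefit.shape
n_sim = 10_000
rank_counts = np.zeros((n_strategies, n_strategies), dtype=int)

for _ in range(n_sim):
    # Stochastic performance (forcing uncertainty) and unknown preferences (weights).
    performance = rng.normal(mean_benefit, 0.5)
    weights = rng.dirichlet(np.ones(n_criteria))
    scores = performance @ weights
    order = np.argsort(-scores)                 # best strategy first
    for rank, strategy in enumerate(order):
        rank_counts[strategy, rank] += 1

rank_acceptability = rank_counts / n_sim        # P(strategy attains each rank)
print(np.round(rank_acceptability, 2))
```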

  8. Release criteria and pathway analysis for radiological remediation

    SciTech Connect

Subbaraman, G.; Tuttle, R.J.; Oliver, B.M. (Rocketdyne Div.); Devgun, J.S.

    1991-01-01

Site-specific activity concentrations were derived for soils contaminated with mixed fission products (MFP), or uranium-processing residues, using the Department of Energy (DOE) pathway analysis computer code RESRAD at four different sites. The concentrations and other radiological parameters, such as limits on background-subtracted gamma exposure rate, were used as the basis to arrive at release criteria for two of the sites. Valid statistical parameters, calculated for the distribution of radiological data obtained from site surveys, were then compared with the criteria to determine releasability or the need for further decontamination. For the other two sites, RESRAD has been used as a preremediation planning tool to derive residual material guidelines for uranium. 11 refs., 4 figs., 3 tabs.

  9. Investigation of various criteria for evaluation of aluminum thin foil ''smart sensors'' images

    NASA Astrophysics Data System (ADS)

    Panin, S. V.; Eremin, A. V.; Lyubutin, P. S.; Burkov, M. V.

    2014-10-01

Various criteria for processing images of aluminum foil ''smart sensors'' for fatigue evaluation of carbon fiber reinforced polymer (CFRP) were analyzed. These informative parameters are used to assess image quality and surface relief and, accordingly, to characterize the fatigue damage state of the CFRP. The sensitivity of all criteria to distortions, particularly Gaussian noise, blurring, and JPEG compression, was investigated. The main purpose of the research is to identify informative parameters for fatigue evaluation that are the least sensitive to such distortions.
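    One way to probe the sensitivity of an informative parameter to distortion is to compute it on a clean image and on a degraded copy. The sketch below does this for a simple stand-in parameter (mean gradient magnitude) under Gaussian noise; the image and the parameter choice are assumptions for illustration, not those of the paper.

```python
import numpy as np

def mean_gradient_magnitude(img):
    """Simple surface-relief proxy: average magnitude of finite-difference gradients."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

rng = np.random.default_rng(1)
clean = rng.random((128, 128))                       # stand-in for a foil-surface image
noisy = clean + rng.normal(0.0, 0.05, clean.shape)   # Gaussian-noise distortion

print("clean:", round(mean_gradient_magnitude(clean), 4))
print("noisy:", round(mean_gradient_magnitude(noisy), 4))
```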

  10. Description, Recognition and Analysis of Biological Images

    SciTech Connect

    Yu Donggang; Jin, Jesse S.; Luo Suhuai; Pham, Tuan D.; Lai Wei

    2010-01-25

Description, recognition, and analysis of biological images play an important role in describing and understanding the related biological information. The color images are separated by color reduction. A new and efficient linearization algorithm is introduced based on criteria of the difference chain code. A series of critical points is obtained from the linearized lines. The curvature angle, linearity, maximum linearity, convexity, concavity, and bend angle of the linearized lines are calculated from the starting line to the end line along all smoothed contours. This method can be used for shape description and recognition. The analysis, decision, and classification of the biological images are based on the description of morphological structures, color information, and prior knowledge, which are associated with each other. The efficiency of the algorithms is demonstrated by two applications. One application is the description, recognition, and analysis of color flower images; the other is the dynamic description, recognition, and analysis of cell-cycle images.

  11. [Semen analysis: spermiogram according to WHO 2010 criteria].

    PubMed

    Gottardo, F; Kliesch, S

    2011-01-01

    Semen analysis plays a key role in the diagnostics of male infertility. Semen analysis has to be performed according to World Health Organisation (WHO) criteria. The updated version of the WHO manual was completed at the end of 2009 and published in 2010. Standard procedures in semen analysis include evaluation of sperm concentration, motility, morphology and vitality. In this new version particular attention has been paid to internal and external quality control, helping to identify and correct incidental and systematic errors both in routine analysis as well as in the field of research. The new manual describes all laboratory solutions, procedures and calculation formulas, and focuses on the definition of cryptozoospermia or azoospermia. A chapter concerning cryopreservation of spermatozoa has been newly integrated. The following overview presents the most important aspects of the updated WHO manual.

  12. Discrimination of Different Brain Metastases and Primary CNS Lymphomas Using Morphologic Criteria and Diffusion Tensor Imaging.

    PubMed

    Bette, S; Wiestler, B; Delbridge, C; Huber, T; Boeckh-Behrens, T; Meyer, B; Zimmer, C; Gempt, J; Kirschke, J

    2016-12-01

Purpose: Brain metastases are a common complication of cancer and occur in about 15-40% of patients with malignancies. The aim of this retrospective study was to differentiate between metastases from different primary tumors/CNS lymphomas using morphologic criteria, fractional anisotropy (FA) and apparent diffusion coefficient (ADC). Materials and Methods: Morphologic criteria such as hemorrhage, cysts, pattern of contrast enhancement and location were reported in 200 consecutive patients with brain metastases/primary CNS lymphomas. FA and ADC values were measured in regions of interest (ROIs) placed in the contrast-enhancing tumor part, the necrosis and the non-enhancing peritumoral region (NEPTR). Differences between histopathological subtypes of metastases were analyzed using non-parametric tests, decision trees and hierarchical clustering analysis. Results: Significant differences were found in morphologic criteria such as hemorrhage or pattern of contrast enhancement. In diffusion measurements, significant differences between the different tumor entities were only found in ADC analyzed in the contrast-enhancing tumor part. Among single tumor entities, primary CNS lymphomas showed significantly lower median ADC values in the contrast-enhancing tumor part (ADClymphoma 0.92 [0.83 - 1.07] vs. ADCno_lymphoma 1.35 [1.10 - 1.64], P = 0.001). Further differentiation between types of metastases was not possible using FA and ADC. Conclusion: There were morphologic differences among the main subtypes of brain metastases/CNS lymphomas. However, due to a high variability of common types of metastases and low specificity, prospective differentiation remained challenging. DTI including FA and ADC was not a reliable tool for differentiation between different histopathological subtypes of brain metastases except for CNS lymphomas showing lower ADC values. Biopsy, surgery and staging remain essential for diagnosis. Key Points:

  13. Adherence to criteria for transvaginal ultrasound imaging and measurement of cervical length.

    PubMed

    Iams, Jay D; Grobman, William A; Lozitska, Albina; Spong, Catherine Y; Saade, George; Mercer, Brian M; Tita, Alan T; Rouse, Dwight J; Sorokin, Yoram; Wapner, Ronald J; Leveno, Kenneth J; Blackwell, Sean C; Esplin, M Sean; Tolosa, Jorge E; Thorp, John M; Caritis, Steve N; Van Dorsten, Peter J

    2013-10-01

    Adherence to published criteria for transvaginal imaging and measurement of cervical length is uncertain. We sought to assess adherence by evaluating images submitted to certify research sonographers for participation in a clinical trial. We reviewed qualifying test results of sonographers seeking certification to image and measure cervical length in a clinical trial. Participating sonographers were required to access training materials and submit 15 images, 3 each from 5 pregnant women not enrolled in the trial. One of 2 sonologists reviewed all qualifying images. We recorded the proportion of images that did not meet standard criteria (excess compression, landmarks not seen, improper image size, or full maternal bladder) and the proportion in which the cervical length was measured incorrectly. Failure for a given patient was defined as >1 unacceptable image, or >2 acceptable images with incorrect caliper placement or erroneous choice of the "shortest best" cervical length. Certification required satisfactory images and cervical length measurement from ≥4 patients. A total of 327 sonographers submitted 4905 images. A total of 271 sonographers (83%) were certified on the first, 41 (13%) on the second, and 2 (0.6%) on the third submission. Thirteen never achieved certification. Of 314 who passed, 196 submitted 15 acceptable images that were appropriately measured for all 5 women. There were 1277 deficient images: 493 were acceptable but incorrectly measured images from sonographers who passed certification because mismeasurement occurred no more than twice. Of 784 deficient images submitted by sonographers who failed the certification, 471 were rejected because of improper measurement (caliper placement and/or failure to identify the shortest best image), and 313 because of failure to obtain a satisfactory image (excessive compression, required landmarks not visible, incorrect image size, brief examination, and/or full maternal bladder). Although 83% of

  14. Use of Model-Segmentation Criteria in Clustering and Segmentation of Time Series and Digital Images.

    DTIC Science & Technology

    1983-05-05

This article discusses the development and use of ... multidimensional ... and of the number of segment classes in the segmentation of time series and digital images. Criteria such as those of Akaike... USE OF MODEL-SEGMENTATION CRITERIA IN CLUSTERING AND SEGMENTATION OF TIME SERIES AND DIGITAL IMAGES by STANLEY L

  15. Comparison of magnetic resonance imaging mismatch criteria to select patients for endovascular stroke therapy.

    PubMed

    Mishra, Nishant K; Albers, Gregory W; Christensen, Søren; Marks, Michael; Hamilton, Scott; Straka, Matus; Liggins, John T P; Kemp, Stephanie; Mlynash, Michael; Bammer, Roland; Lansberg, Maarten G

    2014-05-01

    The Diffusion and Perfusion Imaging Evaluation for Understanding Stroke Evolution 2 (DEFUSE 2) study has shown that clinical response to endovascular reperfusion differs between patients with and without perfusion-diffusion (perfusion-weighted imaging-diffusion-weighted imaging, PWI-DWI) mismatch: patients with mismatch have a favorable clinical response to reperfusion, whereas patients without mismatch do not. This study examined whether alternative mismatch criteria can also differentiate patients according to their response to reperfusion. Patients from the DEFUSE 2 study were categorized according to vessel occlusion on magnetic resonance angiography (MRA) and DWI lesion volume criteria (MRA-DWI mismatch) and symptom severity and DWI criteria (clinical-DWI mismatch). Favorable clinical response was defined as an improvement of ≥8 points on the National Institutes of Health Stroke Scale (NIHSS) by day 30 or an NIHSS score of ≤1 at day 30. We assessed, for each set of criteria, whether the association between reperfusion and favorable clinical response differed according to mismatch status. A differential response to reperfusion was observed between patients with and without MRA-DWI mismatch defined as an internal carotid artery or M1 occlusion and a DWI lesion<50 mL. Reperfusion was associated with good functional outcome in patients who met these MRA-DWI mismatch criteria (odds ratio [OR], 8.5; 95% confidence interval [CI], 2.3-31.3), whereas no association was observed in patients who did not meet these criteria (OR, 0.5; 95% CI, 0.08-3.1; P for difference between the odds, 0.01). No differential response to reperfusion was observed with other variations of the MRA-DWI or clinical-DWI mismatch criteria. The MRA-DWI mismatch is a promising alternative to DEFUSE 2's PWI-DWI mismatch for patient selection in endovascular stroke trials.

  16. Systemic Sclerosis Classification Criteria: Developing methods for multi-criteria decision analysis with 1000Minds

    PubMed Central

    Johnson, Sindhu R.; Naden, Raymond P.; Fransen, Jaap; van den Hoogen, Frank; Pope, Janet E.; Baron, Murray; Tyndall, Alan; Matucci-Cerinic, Marco; Denton, Christopher P.; Distler, Oliver; Gabrielli, Armando; van Laar, Jacob M.; Mayes, Maureen; Steen, Virginia; Seibold, James R.; Clements, Phillip; Medsger, Thomas A.; Carreira, Patricia E.; Riemekasten, Gabriela; Chung, Lorinda; Fessler, Barri J.; Merkel, Peter A.; Silver, Richard; Varga, John; Allanore, Yannick; Mueller-Ladner, Ulf; Vonk, Madelon C.; Walker, Ulrich A.; Cappelli, Susanna; Khanna, Dinesh

    2014-01-01

Objective Classification criteria for systemic sclerosis (SSc) are being developed. The objectives were to: develop an instrument for collating case-data and evaluate its sensibility; use forced-choice methods to reduce and weight criteria; and explore agreement between experts on the probability that cases were classified as SSc. Study Design and Setting A standardized instrument was tested for sensibility. The instrument was applied to 20 cases covering a range of probabilities that each had SSc. Experts rank-ordered cases from highest to lowest probability; reduced and weighted the criteria using forced-choice methods; and re-ranked the cases. Consistency in rankings was evaluated using intraclass correlation coefficients (ICC). Results Experts endorsed clarity (83%), comprehensibility (100%), face and content validity (100%). Criteria were weighted (points): finger skin thickening (14–22), finger-tip lesions (9–21), friction rubs (21), finger flexion contractures (16), pulmonary fibrosis (14), SSc-related antibodies (15), Raynaud’s phenomenon (13), calcinosis (12), pulmonary hypertension (11), renal crisis (11), telangiectasia (10), abnormal nailfold capillaries (10), esophageal dilation (7) and puffy fingers (5). The ICC across experts was 0.73 (95%CI 0.58,0.86) and improved to 0.80 (95%CI 0.68,0.90). Conclusions Using a sensible instrument and forced-choice methods, the number of criteria was reduced by 39% (from 23 to 14) and the criteria were weighted. Our methods reflect the rigors of measurement science and serve as a template for developing classification criteria. PMID:24721558

  17. Knowledge based imaging for terrain analysis

    NASA Technical Reports Server (NTRS)

    Holben, Rick; Westrom, George; Rossman, David; Kurrasch, Ellie

    1992-01-01

    A planetary rover will have various vision based requirements for navigation, terrain characterization, and geological sample analysis. In this paper we describe a knowledge-based controller and sensor development system for terrain analysis. The sensor system consists of a laser ranger and a CCD camera. The controller, under the input of high-level commands, performs such functions as multisensor data gathering, data quality monitoring, and automatic extraction of sample images meeting various criteria. In addition to large scale terrain analysis, the system's ability to extract useful geological information from rock samples is illustrated. Image and data compression strategies are also discussed in light of the requirements of earth bound investigators.

  18. Image Retrieval: Theoretical Analysis and Empirical User Studies on Accessing Information in Images.

    ERIC Educational Resources Information Center

    Ornager, Susanne

    1997-01-01

    Discusses indexing and retrieval for effective searches of digitized images. Reports on an empirical study about criteria for analysis and indexing digitized images, and the different types of user queries done in newspaper image archives in Denmark. Concludes that it is necessary that the indexing represent both a factual and an expressional…

  20. Methods and criteria for safety analysis (FIN L2535)

    SciTech Connect

    Not Available

    1992-12-01

In response to the NRC request for a proposal dated October 20, 1992, Westinghouse Savannah River Company (WSRC) submits this proposal to provide contractual assistance for FIN L2535, "Methods and Criteria for Safety Analysis," as specified in the Statement of Work attached to the request for proposal. The Statement of Work involves development of safety analysis guidance for NRC licensees, arranging a workshop on this guidance, and revising NRC Regulatory Guide 3.52. This response to the request for proposal offers for consideration the following advantages of WSRC in performing this work: Experience, Qualification of Personnel and Resource Commitment, Technical and Organizational Approach, Mobilization Plan, and Key Personnel and Resumes. In addition, attached are the following items required by the NRC: Schedule II, Savannah River Site - Job Cost Estimate; NRC Form 189, Project and Budget Proposal for NRC Work, page 1; NRC Form 189, Project and Budget Proposal for NRC Work, page 2; and Project Description.

  1. Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties

    SciTech Connect

    Kujawski, Edouard

    2003-02-01

    The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
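    The point that different aggregation models can rank the same alternatives differently is easy to reproduce. The sketch below scores three hypothetical alternatives with an additive weighted-sum model and a multiplicative weighted-product model under identical weights; the scores are invented, not the figure-skating data.

```python
import numpy as np

# Hypothetical normalized scores for 3 alternatives on 2 criteria (not the skating data).
scores = np.array([[0.90, 0.40],
                   [0.62, 0.66],
                   [0.50, 0.50]])
weights = np.array([0.5, 0.5])

additive = scores @ weights                               # weighted-sum model
multiplicative = np.prod(scores ** weights, axis=1)       # weighted-product model

# The two models disagree on the top-ranked alternative for this data set.
print("weighted-sum ranking:    ", np.argsort(-additive))
print("weighted-product ranking:", np.argsort(-multiplicative))
```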

  2. [New ASAS criteria for the diagnosis of spondyloarthritis: diagnosing sacroiliitis by magnetic resonance imaging].

    PubMed

    Banegas Illescas, M E; López Menéndez, C; Rozas Rodríguez, M L; Fernández Quintero, R M

    2014-01-01

    Radiographic sacroiliitis has been included in the diagnostic criteria for spondyloarthropathies since the Rome criteria were defined in 1961. However, in the last ten years, magnetic resonance imaging (MRI) has proven more sensitive in the evaluation of the sacroiliac joints in patients with suspected spondyloarthritis and symptoms of sacroiliitis; MRI has proven its usefulness not only for diagnosis of this disease, but also for the follow-up of the disease and response to treatment in these patients. In 2009, The Assessment of SpondyloArthritis international Society (ASAS) developed a new set of criteria for classifying and diagnosing patients with spondyloarthritis; one important development with respect to previous classifications is the inclusion of MRI positive for sacroiliitis as a major diagnostic criterion. This article focuses on the radiologic part of the new classification. We describe and illustrate the different alterations that can be seen on MRI in patients with sacroiliitis, pointing out the limitations of the technique and diagnostic pitfalls.

  3. Clinical, Neurocognitive, Structural Imaging and Dermatogliphics in Schizophrenia According to Kraepelin Criteria

    PubMed Central

GÜLEÇ, Hüseyin; ULUSOY KAYMAK, Semra; BİLİCİ, Mustafa; GANGAL, Ali; KAYIKÇIOĞLU, Temel; SARI, Ahmet; TAN, Üner

    2013-01-01

    Introduction A century ago, Kraepelin stated that the distinctive feature of schizophrenia was progressive deterioration. Kraepelin criteria for schizophrenia are: (1) continuous hospitalization or complete dependence on others for obtaining basic necessities of life, (2) unemployment and (3) no remission for the past five years. We aimed to determine the clinical appearance and structural biological features of Kraepelinian schizophrenia. Methods The sample consisted of 17 Kraepelinian patients, 30 non-Kraepelinian schizophrenic patients and 43 healthy controls. The Clinical Global Impressions (CGI) and the Positive and Negative Syndrome Scales (PANSS) were used for clinical assessment. The Frontal Assessment Battery (FAB) and the Verbal Fluency and Color Trail Test (CTT) were included in the cognitive battery. Brain magnetic resonance imaging and dermatoglyphic measurements were performed for structural features. Result Duration of illness, hospitalization, suicide attempts, admission type, presence of a stressor and treatment choice were similar between the two patient groups. Treatment resistance and family history of schizophrenia were more common in Kraepelinian patients. PANSS and CGI subscales scores were also higher in this group. Only the category fluency and CTT-I were different in Kraepelinian patients in comparison to the other patient group. Structural findings were not different between the three groups. Conclusion Category fluency, which was lower in Kraepelinian patients, is an important marker of a degenerative process. The collection of severe clinical symptoms, family history of psychiatric illness and nonresponse to treatment in this particular group of patients points to the need to conduct further studies including cluster analysis in methodology. PMID:28360552

  4. Clinical, Neurocognitive, Structural Imaging and Dermatogliphics in Schizophrenia According to Kraepelin Criteria.

    PubMed

Güleç, Hüseyin; Ulusoy Kaymak, Semra; Bilici, Mustafa; Gangal, Ali; Kayikçioğlu, Temel; Sari, Ahmet; Tan, Üner

    2013-09-01

    A century ago, Kraepelin stated that the distinctive feature of schizophrenia was progressive deterioration. Kraepelin criteria for schizophrenia are: (1) continuous hospitalization or complete dependence on others for obtaining basic necessities of life, (2) unemployment and (3) no remission for the past five years. We aimed to determine the clinical appearance and structural biological features of Kraepelinian schizophrenia. The sample consisted of 17 Kraepelinian patients, 30 non-Kraepelinian schizophrenic patients and 43 healthy controls. The Clinical Global Impressions (CGI) and the Positive and Negative Syndrome Scales (PANSS) were used for clinical assessment. The Frontal Assessment Battery (FAB) and the Verbal Fluency and Color Trail Test (CTT) were included in the cognitive battery. Brain magnetic resonance imaging and dermatoglyphic measurements were performed for structural features. Duration of illness, hospitalization, suicide attempts, admission type, presence of a stressor and treatment choice were similar between the two patient groups. Treatment resistance and family history of schizophrenia were more common in Kraepelinian patients. PANSS and CGI subscales scores were also higher in this group. Only the category fluency and CTT-I were different in Kraepelinian patients in comparison to the other patient group. Structural findings were not different between the three groups. Category fluency, which was lower in Kraepelinian patients, is an important marker of a degenerative process. The collection of severe clinical symptoms, family history of psychiatric illness and nonresponse to treatment in this particular group of patients points to the need to conduct further studies including cluster analysis in methodology.

  5. A Speedy Cardiovascular Diseases Classifier Using Multiple Criteria Decision Analysis

    PubMed Central

    Lee, Wah Ching; Hung, Faan Hei; Tsang, Kim Fung; Tung, Hoi Ching; Lau, Wing Hong; Rakocevic, Veselin; Lai, Loi Lei

    2015-01-01

Each year, some 30 percent of global deaths are caused by cardiovascular diseases. This figure is worsening due to both the increasing elderly population and severe shortages of medical personnel. The development of a cardiovascular diseases classifier (CDC) for auto-diagnosis will help address the problem. Former CDCs did not achieve quick evaluation of cardiovascular diseases. In this letter, a new CDC to achieve speedy detection is investigated. This investigation incorporates the analytic hierarchy process (AHP)-based multiple criteria decision analysis (MCDA) to develop feature vectors using a Support Vector Machine. The MCDA facilitates the efficient assignment of appropriate weightings to potential patients, thus scaling down the number of features. Since the new CDC will only adopt the most meaningful features for discrimination between healthy persons versus cardiovascular disease patients, a speedy detection of cardiovascular diseases has been successfully implemented. PMID:25587978
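    The AHP step of the approach, deriving criterion weights from pairwise comparisons, can be sketched as follows. The comparison matrix and the feature names in the comments are hypothetical; the principal-eigenvector weighting and the consistency ratio are the standard AHP calculations, not the authors' exact pipeline.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three features on Saaty's 1-9 scale
# (e.g., blood pressure vs. cholesterol vs. heart rate); values are illustrative only.
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(pairwise)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()                       # normalized AHP criterion weights

# Consistency ratio (RI = 0.58 for a 3x3 matrix, from Saaty's random-index table).
ci = (eigvals.real[principal] - 3) / (3 - 1)
print("weights:", np.round(weights, 3), "CR:", round(ci / 0.58, 3))
```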

  6. Image analysis library software development

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Bryant, J.

    1977-01-01

    The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.

  7. Medical Image Analysis Facility

    NASA Technical Reports Server (NTRS)

    1978-01-01

To improve the quality of photos sent to Earth by unmanned spacecraft, NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical Image Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is study of the effects of stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical Image Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to diagnosis and treatment of human coronary disease. Other uses of the facility's image processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.

  8. Digital Image Analysis of Cereals

    USDA-ARS?s Scientific Manuscript database

    Image analysis is the extraction of meaningful information from images, mainly digital images by means of digital processing techniques. The field was established in the 1950s and coincides with the advent of computer technology, as image analysis is profoundly reliant on computer processing. As t...

  9. Analysis of eligibility criteria from ClinicalTrials.gov.

    PubMed

    Doods, Justin; Dugas, Martin; Fritz, Fleur

    2014-01-01

    Electronic health care records are being used more and more for patient documentation. This electronic data can be used for secondary purposes, for example through systems that support clinical research. Eligibility criteria have to be processable for such systems to work, but criteria published on ClinicalTrials.gov have been shown to be complex, making them challenging to re-use. We analysed the eligibility criteria on ClinicalTrials.gov using automatic methods to determine whether the criteria definition and number changed over time. From 1998 to 2012 the average number of words used to describe eligibility criteria per year increased by 46%, while the average number of lines used per year only slightly increases until 2000 and stabilizes afterwards. Whether the increase of words resulted in increased criteria complexity or whether more data elements are used to describe eligibility needs further investigation.
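    The trend reported here, average words per eligibility-criteria text by registration year, can be outlined with a few lines of code. The records below are fabricated stand-ins; a real analysis would read the criteria field from downloaded ClinicalTrials.gov records.

```python
from collections import defaultdict

# Hypothetical records: (registration_year, eligibility_criteria_text).
records = [
    (1999, "Inclusion: age 18-65. Exclusion: pregnancy."),
    (2005, "Inclusion: age 18-65, confirmed diagnosis, informed consent. "
           "Exclusion: pregnancy, renal failure, prior surgery."),
    (2012, "Inclusion: age 18-65, confirmed diagnosis by biopsy, ECOG 0-1, "
           "adequate organ function. Exclusion: pregnancy, renal failure, "
           "prior surgery, concurrent malignancy, uncontrolled infection."),
]

words_per_year = defaultdict(list)
for year, criteria_text in records:
    words_per_year[year].append(len(criteria_text.split()))

for year in sorted(words_per_year):
    counts = words_per_year[year]
    print(year, round(sum(counts) / len(counts), 1), "words on average")
```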

  10. Engineering design criteria for an image intensifier/image converter camera

    NASA Technical Reports Server (NTRS)

    Sharpsteen, J. T.; Lund, D. L.; Stoap, L. J.; Solheim, C. D.

    1976-01-01

The design, display, and evaluation of an image intensifier/image converter camera which can be utilized to meet various requirements of space shuttle experiments are described. An image intensifier tube was used in combination with two brassboards serving as the power supply, and was evaluated for night photography in the field. Pictures were obtained showing field details which would have been indistinguishable to the naked eye or to an ordinary camera.

  11. Criteria for High Quality Biology Teaching: An Analysis

    ERIC Educational Resources Information Center

    Tasci, Guntay

    2015-01-01

    This study aims to analyze the process under which biology lessons are taught in terms of teaching quality criteria (TQC). Teaching quality is defined as the properties of efficient teaching and is considered to be the criteria used to measure teaching quality both in general and specific to a field. The data were collected through classroom…

  12. Interactive Image Analysis System Design,

    DTIC Science & Technology

    1982-12-01

This report describes a design for an interactive image analysis system (IIAS), which implements terrain data extraction techniques. The design employs commercially available, state of the art minicomputers and image display devices with proven software to achieve a cost effective, reliable image analysis system. Additionally, the system is fully capable of supporting many generic types of image analysis and data processing, and is modularly...

  13. Parallel Algorithms for Image Analysis.

    DTIC Science & Technology

    1982-06-01

Parallel Algorithms for Image Analysis. Technical report TR-1180; author: Azriel Rosenfeld; contract AFOSR-77-3271. Keywords: image processing; image analysis; parallel processing; cellular computers.

  14. Multiple criteria decision analysis for health technology assessment.

    PubMed

    Thokala, Praveen; Duenas, Alejandra

    2012-12-01

    Multicriteria decision analysis (MCDA) has been suggested by some researchers as a method to capture the benefits beyond quality adjusted life-years in a transparent and consistent manner. The objectives of this article were to analyze the possible application of MCDA approaches in health technology assessment and to describe their relative advantages and disadvantages. This article begins with an introduction to the most common types of MCDA models and a critical review of state-of-the-art methods for incorporating multiple criteria in health technology assessment. An overview of MCDA is provided and is compared against the current UK National Institute for Health and Clinical Excellence health technology appraisal process. A generic MCDA modeling approach is described, and the different MCDA modeling approaches are applied to a hypothetical case study. A comparison of the different MCDA approaches is provided, and the generic issues that need consideration before the application of MCDA in health technology assessment are examined. There are general practical issues that might arise from using an MCDA approach, and it is suggested that appropriate care be taken to ensure the success of MCDA techniques in the appraisal process. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  15. Moving Image Analysis System

    NASA Astrophysics Data System (ADS)

    Shifley, Loren A.

    1989-02-01

The recent introduction of a two dimensional interactive software package provides a new technique for quantitative analysis. Integrated with its corresponding peripherals, the same software offers either film or video data reduction. Digitized data points measured from the images are stored in the computer. With this data, a variety of information can be displayed, printed or plotted in a graphical form. The resultant graphs could determine such factors as: displacement, force, velocity, momentum, angular acceleration, center of gravity, energy, length, angle and time to name a few. Simple, efficient and precise analysis can now be quantified and documented. This paper will describe the detailed capabilities of the software along with a variety of applications where it might be used.

  16. Moving image analysis system

    NASA Astrophysics Data System (ADS)

    Shifley, Loren A.

    1990-08-01

    The recent introduction of a two dimensional interactive software package provides a new technique for quantitative analysis. Integrated with its corresponding peripherals, the same software offers either film or video data reduction. Digitized data points measured from the images are stored in the computer. With this data, a variety of information can be displayed, printed or plotted in a graphical form. The resultant graphs could determine such factors as: displacement, force, velocity, momentum, angular acceleration, center of gravity, energy, length, angle and time to name a few. Simple, efficient and precise analysis can now be quantified and documented. This paper will describe the detailed capabilities of the software along with a variety of applications where it might be used.

  17. Efficiency of model selection criteria in flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Calenda, G.; Volpi, E.

    2009-04-01

    The estimation of high flood quantiles requires the extrapolation of the probability distributions far beyond the usual sample length, involving high estimation uncertainties. The choice of the probability law, traditionally based on the hypothesis testing, is critical to this point. In this study the efficiency of different model selection criteria, seldom applied in flood frequency analysis, is investigated. The efficiency of each criterion in identifying the probability distribution of the hydrological extremes is evaluated by numerical simulations for different parent distributions, coefficients of variation and skewness, and sample sizes. The compared model selection procedures are the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), the Anderson Darling Criterion (ADC) recently discussed by Di Baldassarre et al. (2008) and Sample Quantile Criterion (SQC), recently proposed by the authors (Calenda et al., 2009). The SQC is based on the principle of maximising the probability density of the elements of the sample that are considered relevant to the problem, and takes into account both the accuracy and the uncertainty of the estimate. Since the stress is mainly on extreme events, the SQC involves upper-tail probabilities, where the effect of the model assumption is more critical. The proposed index is equal to the sum of logarithms of the inverse of the sample probability density of the observed quantiles. The definition of this index is based on the principle that the more centred is the sample value in respect to its density distribution (accuracy of the estimate) and the less spread is this distribution (uncertainty of the estimate), the greater is the probability density of the sample quantile. Thus, lower values of the index indicate a better performance of the distribution law. This criterion can operate the selection of the optimum distribution among competing probability models that are estimated using different samples. The
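    A sketch of how such criteria might be computed for two candidate distributions fitted to a synthetic annual-maximum sample is given below. AIC and BIC follow their usual definitions; the SQC-like index follows the verbal description in the abstract (sum of the logarithms of the inverse density of the observed upper-tail sample quantiles, i.e., of their order statistics), so it is an interpretation rather than the authors' exact formula. All data are simulated.

```python
import numpy as np
from scipy import stats
from math import lgamma, log

# Synthetic "annual maxima" drawn from a GEV distribution (illustrative only).
sample = stats.genextreme.rvs(c=-0.15, loc=100, scale=30, size=60, random_state=42)
sample.sort()
n = len(sample)

def order_stat_logpdf(dist, x, r, n):
    """Log density of the r-th of n order statistics under a fitted distribution."""
    log_comb = lgamma(n + 1) - lgamma(r) - lgamma(n - r + 1)
    return (log_comb + (r - 1) * dist.logcdf(x)
            + (n - r) * dist.logsf(x) + dist.logpdf(x))

candidates = {"gumbel": stats.gumbel_r, "gev": stats.genextreme}
for name, family in candidates.items():
    params = family.fit(sample)
    fitted = family(*params)
    k = len(params)
    loglik = fitted.logpdf(sample).sum()
    aic = 2 * k - 2 * loglik
    bic = k * log(n) - 2 * loglik
    # SQC-like index over the upper-tail order statistics (here the top 10);
    # lower values indicate a better fit, as described in the abstract.
    upper = range(n - 9, n + 1)
    sqc = -sum(order_stat_logpdf(fitted, sample[r - 1], r, n) for r in upper)
    print(f"{name:7s} AIC={aic:7.1f} BIC={bic:7.1f} SQC(top 10)={sqc:6.1f}")
```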

  18. Brain Imaging Analysis

    PubMed Central

    BOWMAN, F. DUBOIS

    2014-01-01

    The increasing availability of brain imaging technologies has led to intense neuroscientific inquiry into the human brain. Studies often investigate brain function related to emotion, cognition, language, memory, and numerous other externally induced stimuli as well as resting-state brain function. Studies also use brain imaging in an attempt to determine the functional or structural basis for psychiatric or neurological disorders and, with respect to brain function, to further examine the responses of these disorders to treatment. Neuroimaging is a highly interdisciplinary field, and statistics plays a critical role in establishing rigorous methods to extract information and to quantify evidence for formal inferences. Neuroimaging data present numerous challenges for statistical analysis, including the vast amounts of data collected from each individual and the complex temporal and spatial dependence present. We briefly provide background on various types of neuroimaging data and analysis objectives that are commonly targeted in the field. We present a survey of existing methods targeting these objectives and identify particular areas offering opportunities for future statistical contribution. PMID:25309940

  19. DIDA - Dynamic Image Disparity Analysis.

    DTIC Science & Technology

    1982-12-31

Understanding, Dynamic Image Analysis, Disparity Analysis, Optical Flow, Real-Time Processing. ...three aspects of dynamic image analysis must be studied: effectiveness, generality, and efficiency. In addition, efforts must be made to understand the...environment. A better understanding of the need for these limiting constraints is required. Efficiency is obviously important if dynamic image analysis is

  20. Image-guided Tumor Ablation: Standardization of Terminology and Reporting Criteria

    PubMed Central

    Goldberg, S. Nahum; Grassi, Clement J.; Cardella, John F.; Charboneau, J. William; Dodd, Gerald D.; Dupuy, Damian E.; Gervais, Debra A.; Gillams, Alice R.; Kane, Robert A.; Lee, Fred T.; Livraghi, Tito; McGahan, John; Phillips, David A.; Rhim, Hyunchul; Silverman, Stuart G.; Solbiati, Luigi; Vogl, Thomas J.; Wood, Bradford J.; Vedantham, Suresh; Sacks, David

    2012-01-01

    The field of interventional oncology with use of image-guided tumor ablation requires standardization of terminology and reporting criteria to facilitate effective communication of ideas and appropriate comparison between treatments that use different technologies, such as chemical (ethanol or acetic acid) ablation, and thermal therapies, such as radiofrequency (RF), laser, microwave, ultrasound, and cryoablation. This document provides a framework that will hopefully facilitate the clearest communication between investigators and will provide the greatest flexibility in comparison between the many new, exciting, and emerging technologies. An appropriate vehicle for reporting the various aspects of image-guided ablation therapy, including classification of therapies and procedure terms, appropriate descriptors of imaging guidance, and terminology to define imaging and pathologic findings, are outlined. Methods for standardizing the reporting of follow-up findings and complications and other important aspects that require attention when reporting clinical results are addressed. It is the group’s intention that adherence to the recommendations will facilitate achievement of the group’s main objective: improved precision and communication in this field that lead to more accurate comparison of technologies and results and, ultimately, to improved patient outcomes. The intent of this standardization of terminology is to provide an appropriate vehicle for reporting the various aspects of image-guided ablation therapy. PMID:15845798

  1. Image-guided tumor ablation: standardization of terminology and reporting criteria.

    PubMed

    Goldberg, S Nahum; Grassi, Clement J; Cardella, John F; Charboneau, J William; Dodd, Gerald D; Dupuy, Damian E; Gervais, Debra A; Gillams, Alice R; Kane, Robert A; Lee, Fred T; Livraghi, Tito; McGahan, John; Phillips, David A; Rhim, Hyunchul; Silverman, Stuart G; Solbiati, Luigi; Vogl, Thomas J; Wood, Bradford J; Vedantham, Suresh; Sacks, David

    2009-07-01

    The field of interventional oncology with use of image-guided tumor ablation requires standardization of terminology and reporting criteria to facilitate effective communication of ideas and appropriate comparison between treatments that use different technologies, such as chemical (ethanol or acetic acid) ablation, and thermal therapies, such as radiofrequency (RF), laser, microwave, ultrasound, and cryoablation. This document provides a framework that will hopefully facilitate the clearest communication between investigators and will provide the greatest flexibility in comparison between the many new, exciting, and emerging technologies. An appropriate vehicle for reporting the various aspects of image-guided ablation therapy, including classification of therapies and procedure terms, appropriate descriptors of imaging guidance, and terminology to define imaging and pathologic findings, are outlined. Methods for standardizing the reporting of follow-up findings and complications and other important aspects that require attention when reporting clinical results are addressed. It is the group's intention that adherence to the recommendations will facilitate achievement of the group's main objective: improved precision and communication in this field that lead to more accurate comparison of technologies and results and, ultimately, to improved patient outcomes. The intent of this standardization of terminology is to provide an appropriate vehicle for reporting the various aspects of image-guided ablation therapy.

  2. Regional Analysis of Self-Reported Personality Disorder Criteria

    PubMed Central

    Turkheimer, Eric; Ford, Derek C.; Oltmanns, Thomas F.

    2010-01-01

    Building on the theoretical work of Louis Guttman, we propose that the core problem facing research into the multidimensional structure of the personality disorders is not the identification of factorial simple structure but rather detailed characterization of the multivariate configuration of the diagnostic criteria. Dimensions rotated to orthogonal or oblique simple structure are but one way out of many to characterize a multivariate map, and their current near universal application represents a choice for a very particular set of interpretive advantages and disadvantages. We use multidimensional scaling and regional interpretation to investigate the structure of 78 self-reported personality disorder criteria from a large sample of military recruits and college students. Results suggest that the criteria have a three-dimensional radex structure that conforms only loosely to the 10 existing personality disorder (PD) categories. Regional interpretation in three dimensions elucidates several important aspects of PDs and their interrelationships. PMID:19012659
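    The general workflow, turning inter-criterion associations into a low-dimensional spatial configuration via multidimensional scaling, can be sketched as follows. The response matrix is synthetic and much smaller than the study's 78 criteria, and the metric MDS call is a generic stand-in for the authors' scaling and regional-interpretation procedure.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(3)

# Synthetic binary responses: 500 respondents x 12 criteria (stand-in for the 78 PD items).
responses = (rng.random((500, 12)) < 0.3).astype(float)
corr = np.corrcoef(responses, rowvar=False)
dissimilarity = 1.0 - corr                        # turn correlations into distances

mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)         # 12 criteria placed in 3-D space
print(coords.shape, round(mds.stress_, 2))
```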

  3. Reflections on ultrasound image analysis.

    PubMed

    Alison Noble, J

    2016-10-01

    Ultrasound (US) image analysis has advanced considerably in twenty years. Progress in ultrasound image analysis has always been fundamental to the advancement of image-guided interventions research due to the real-time acquisition capability of ultrasound and this has remained true over the two decades. But in quantitative ultrasound image analysis - which takes US images and turns them into more meaningful clinical information - thinking has perhaps more fundamentally changed. From roots as a poor cousin to Computed Tomography (CT) and Magnetic Resonance (MR) image analysis, both of which have richer anatomical definition and thus were better suited to the earlier eras of medical image analysis which were dominated by model-based methods, ultrasound image analysis has now entered an exciting new era, assisted by advances in machine learning and the growing clinical and commercial interest in employing low-cost portable ultrasound devices outside traditional hospital-based clinical settings. This short article provides a perspective on this change, and highlights some challenges ahead and potential opportunities in ultrasound image analysis which may both have high impact on healthcare delivery worldwide in the future but may also, perhaps, take the subject further away from CT and MR image analysis research with time. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. A multiple criteria-based spectral partitioning method for remotely sensed hyperspectral image classification

    NASA Astrophysics Data System (ADS)

    Liu, Yi; Li, Jun; Plaza, Antonio; Sun, Yanli

    2016-10-01

Hyperspectral remote sensing offers a powerful tool in many different application contexts. The imbalance between the high dimensionality of the data and the limited availability of training samples calls for the need to perform dimensionality reduction in practice. Among traditional dimensionality reduction techniques, feature extraction is one of the most widely used approaches due to its flexibility to transform the original spectral information into a subspace. In turn, band selection is important when the application requires preserving the original spectral information (especially the physically meaningful information) for the interpretation of the hyperspectral scene. In the case of hyperspectral image classification, both techniques need to discard most of the original features/bands in order to perform the classification using a feature set with much lower dimensionality. However, the discriminative information that allows a classifier to provide good performance is usually class-dependent and the relevant information may live in weak features/bands that are usually discarded or lost through subspace transformation or band selection. As a result, in practice, it is challenging to use either feature extraction or band selection for classification purposes. Relevant lines of attack to address this problem have focused on multiple feature selection aiming at a suitable fusion of diverse features in order to provide relevant information to the classifier. In this paper, we present a new dimensionality reduction technique, called multiple criteria-based spectral partitioning, which is embedded in an ensemble learning framework to perform advanced hyperspectral image classification. Driven by the use of multiple band priority criteria derived from classic band selection techniques, we obtain multiple spectral partitions from the original hyperspectral data that correspond to several band subgroups with much lower spectral dimensionality as compared with
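    A toy version of the ensemble idea, splitting a ranked band list into several spectral partitions and letting one base classifier per partition vote, is sketched below. The data, the variance-based band priority criterion, and the random-forest base learner are all assumptions for illustration; they are not the specific criteria or classifiers used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)

# Synthetic "hyperspectral" pixels: 600 samples x 100 bands, 3 classes (illustrative only).
X = rng.normal(size=(600, 100))
y = rng.integers(0, 3, size=600)
X[y == 1, :30] += 0.8          # give the classes some band-dependent structure
X[y == 2, 60:] += 0.8

# One simple band-priority criterion: rank bands by variance, then split the ranked
# list into contiguous subgroups (a stand-in for the paper's multiple criteria).
order = np.argsort(-X.var(axis=0))
partitions = np.array_split(order, 5)

# Ensemble: one base classifier per spectral partition, combined by majority vote.
votes = np.zeros((X.shape[0], 3))
for bands in partitions:
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[:, bands], y)
    preds = clf.predict(X[:, bands])
    votes[np.arange(X.shape[0]), preds.astype(int)] += 1

accuracy = np.mean(votes.argmax(axis=1) == y)
print("ensemble training accuracy:", round(accuracy, 3))
```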

  5. Spreadsheet-Like Image Analysis

    DTIC Science & Technology

    1992-08-01

Technical Report ARPAD-TR-92002: Spreadsheet-Like Image Analysis. Paul Willson, August 1992, U.S. Army... Subject terms: image analysis, nondestructive inspection, spreadsheet, Macintosh software, neural network, signal processing.

  6. Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1981-04-01

ETL-0258: Knowledge-Based Image Analysis. George C. Stockman, Barbara A. Lambird, David Lavine, Laveen N. Kanal. Keywords: extraction, verification, region classification, pattern recognition, image analysis.

  7. Analysis of proposed criteria for human response to vibration

    NASA Technical Reports Server (NTRS)

    Janeway, R. N.

    1975-01-01

The development of criteria for human vibration response is reviewed, including the evolution of the ISO standard 2631. The document is analyzed to show why its application to vehicle ride evaluation is strongly opposed. Alternative vertical and horizontal limits for comfort are recommended in the ground vehicle ride frequency range above 1 Hz. These values are derived by correlating the absorbed power findings of Pradko and Lee with other established criteria. Special emphasis is placed on working limits in the frequency range of 1 to 10 Hz, since this is the most significant area in ground vehicle ride evaluation.

  8. Performance of the ASAS classification criteria for axial and peripheral spondyloarthritis: a systematic literature review and meta-analysis.

    PubMed

    Sepriano, Alexandre; Rubio, Roxana; Ramiro, Sofia; Landewé, Robert; van der Heijde, Désirée

    2017-05-01

    To summarise the evidence on the performance of the Assessment of SpondyloArthritis international Society (ASAS) classification criteria for axial spondyloarthritis (axSpA) (also imaging and clinical arm separately), peripheral (p)SpA and the entire set, when tested against the rheumatologist's diagnosis ('reference standard'). A systematic literature review was performed to identify eligible studies. Raw data on SpA diagnosis and classification were extracted or, if necessary, obtained from the authors of the selected publications. A meta-analysis was performed to obtain pooled estimates for sensitivity, specificity, positive and negative likelihood ratios, by fitting random effects models. Nine papers fulfilled the inclusion criteria (N=5739 patients). The entire set of the ASAS SpA criteria yielded a high pooled sensitivity (73%) and specificity (88%). Similarly, good results were found for the axSpA criteria (sensitivity: 82%; specificity: 88%). Splitting the axSpA criteria in 'imaging arm only' and 'clinical arm only' resulted in much lower sensitivity (30% and 23% respectively), but very high specificity was retained (97% and 94% respectively). The pSpA criteria were less often tested than the axSpA criteria and showed a similarly high pooled specificity (87%) but lower sensitivity (63%). Accumulated evidence from studies with more than 5500 patients confirms the good performance of the various ASAS SpA criteria as tested against the rheumatologist's diagnosis. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
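    A simplified sketch of pooling sensitivity and specificity across studies is shown below; it uses fixed-effect inverse-variance pooling on the logit scale with fabricated 2x2 counts, whereas the review itself fitted random-effects models, so this is an outline of the arithmetic rather than a reproduction of the analysis.

```python
import numpy as np

# Hypothetical per-study 2x2 counts (TP, FN, TN, FP) for the axSpA criteria versus
# the rheumatologist's diagnosis; not the actual studies from the review.
studies = np.array([
    [120, 30, 200, 25],
    [ 80, 15, 150, 20],
    [200, 45, 310, 40],
], dtype=float)

def pooled_logit(successes, failures):
    """Fixed-effect inverse-variance pooling on the logit scale (simplified;
    the review itself used random-effects models)."""
    p = successes / (successes + failures)
    logit = np.log(p / (1 - p))
    var = 1 / successes + 1 / failures          # approximate variance of each logit
    pooled = np.sum(logit / var) / np.sum(1 / var)
    return 1 / (1 + np.exp(-pooled))

tp, fn, tn, fp = studies.T
print("pooled sensitivity:", round(pooled_logit(tp, fn), 3))
print("pooled specificity:", round(pooled_logit(tn, fp), 3))
```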

  9. A critical overview of the imaging arm of the ASAS criteria for diagnosing axial spondyloarthritis: what the radiologist should know.

    PubMed

    Aydingoz, Ustun; Yildiz, Adalet Elcin; Ozdemir, Zeynep Maras; Yildirim, Seray Akcalar; Erkus, Figen; Ergen, Fatma Bilge

    2012-01-01

The Assessment in SpondyloArthritis international Society (ASAS) defined new criteria in 2009 for the classification of axial spondyloarthritis (SpA) in patients with ≥ 3 months of back pain who were aged <45 years at the onset of back pain. This represents a culmination of a number of efforts in the last 30 years starting with the 1984 modified New York criteria for ankylosing spondylitis, followed by the 1990 Amor criteria and the 1991 European Spondyloarthropathy Study Group criteria for SpA. The importance of the new ASAS criteria for radiologists is that magnetic resonance imaging (MRI) takes center stage and is one of the major criteria for the diagnosis of axial SpA when active (or acute) inflammation is present on MRI that is highly suggestive of sacroiliitis associated with SpA. According to the new criteria, sacroiliitis on imaging plus ≥1 SpA feature (such as inflammatory back pain, arthritis, heel enthesitis, uveitis, dactylitis, psoriasis, Crohn's disease/colitis, good response to non-steroidal anti-inflammatory drugs, family history for SpA, HLA-B27 positivity, or elevated C-reactive protein) is sufficient to make the diagnosis of axial SpA. A number of rules and pitfalls, however, are present in the diagnosis of active sacroiliitis on MRI. These points are highlighted in this review, and a potential shortcoming of the imaging arm of the ASAS criteria is addressed.

  10. Air Pollution Monitoring Site Selection by Multiple Criteria Decision Analysis

    EPA Science Inventory

    Criteria air pollutants (particulate matter, sulfur dioxide, oxides of nitrogen, volatile organic compounds, and carbon monoxide) as well as toxic air pollutants are a global concern. A particular scenario that is receiving increased attention in the research is the exposure to t...

  11. A multivariate analysis of choice criteria for hospitals.

    PubMed

    Heischmidt, K A; Hekmat, F; Gordon, P

    1993-01-01

This study determines the choice criteria used by consumers in selecting a hospital and provides information useful to hospital administrators in planning and implementing marketing efforts directed toward potential consumers. The findings suggest that physical plant, previous experience with the hospital, location of the hospital, overall cost, and reputation of the hospital were important factors in selecting a hospital.

  13. Image-guided tumor ablation: standardization of terminology and reporting criteria--a 10-year update.

    PubMed

    Ahmed, Muneeb; Solbiati, Luigi; Brace, Christopher L; Breen, David J; Callstrom, Matthew R; Charboneau, J William; Chen, Min-Hua; Choi, Byung Ihn; de Baère, Thierry; Dodd, Gerald D; Dupuy, Damian E; Gervais, Debra A; Gianfelice, David; Gillams, Alice R; Lee, Fred T; Leen, Edward; Lencioni, Riccardo; Littrup, Peter J; Livraghi, Tito; Lu, David S; McGahan, John P; Meloni, Maria Franca; Nikolic, Boris; Pereira, Philippe L; Liang, Ping; Rhim, Hyunchul; Rose, Steven C; Salem, Riad; Sofocleous, Constantinos T; Solomon, Stephen B; Soulen, Michael C; Tanaka, Masatoshi; Vogl, Thomas J; Wood, Bradford J; Goldberg, S Nahum

    2014-10-01

Image-guided tumor ablation has become a well-established hallmark of local cancer therapy. The breadth of options available in this growing field increases the need for standardization of terminology and reporting criteria to facilitate effective communication of ideas and appropriate comparison among treatments that use different technologies, such as chemical (eg, ethanol or acetic acid) ablation, thermal therapies (eg, radiofrequency, laser, microwave, focused ultrasound, and cryoablation) and newer ablative modalities such as irreversible electroporation. This updated consensus document provides a framework that will facilitate the clearest communication among investigators regarding ablative technologies. An appropriate vehicle is proposed for reporting the various aspects of image-guided ablation therapy including classification of therapies, procedure terms, descriptors of imaging guidance, and terminology for imaging and pathologic findings. Methods are addressed for standardizing reporting of technique, follow-up, complications, and clinical results. As noted in the original document from 2003, adherence to the recommendations will improve the precision of communications in this field, leading to more accurate comparison of technologies and results, and ultimately to improved patient outcomes. Online supplemental material is available for this article.

  14. Image-guided tumor ablation: standardization of terminology and reporting criteria--a 10-year update.

    PubMed

    Ahmed, Muneeb; Solbiati, Luigi; Brace, Christopher L; Breen, David J; Callstrom, Matthew R; Charboneau, J William; Chen, Min-Hua; Choi, Byung Ihn; de Baère, Thierry; Dodd, Gerald D; Dupuy, Damian E; Gervais, Debra A; Gianfelice, David; Gillams, Alice R; Lee, Fred T; Leen, Edward; Lencioni, Riccardo; Littrup, Peter J; Livraghi, Tito; Lu, David S; McGahan, John P; Meloni, Maria Franca; Nikolic, Boris; Pereira, Philippe L; Liang, Ping; Rhim, Hyunchul; Rose, Steven C; Salem, Riad; Sofocleous, Constantinos T; Solomon, Stephen B; Soulen, Michael C; Tanaka, Masatoshi; Vogl, Thomas J; Wood, Bradford J; Goldberg, S Nahum

    2014-11-01

    Image-guided tumor ablation has become a well-established hallmark of local cancer therapy. The breadth of options available in this growing field increases the need for standardization of terminology and reporting criteria to facilitate effective communication of ideas and appropriate comparison among treatments that use different technologies, such as chemical (eg, ethanol or acetic acid) ablation, thermal therapies (eg, radiofrequency, laser, microwave, focused ultrasound, and cryoablation) and newer ablative modalities such as irreversible electroporation. This updated consensus document provides a framework that will facilitate the clearest communication among investigators regarding ablative technologies. An appropriate vehicle is proposed for reporting the various aspects of image-guided ablation therapy including classification of therapies, procedure terms, descriptors of imaging guidance, and terminology for imaging and pathologic findings. Methods are addressed for standardizing reporting of technique, follow-up, complications, and clinical results. As noted in the original document from 2003, adherence to the recommendations will improve the precision of communications in this field, leading to more accurate comparison of technologies and results, and ultimately to improved patient outcomes.

  15. [MR imaging of the Achilles tendon: evaluation of criteria for the differentiation of asymptomatic and symptomatic tendons].

    PubMed

    Weber, C; Wedegärtner, U; Maas, L C; Buchert, R; Adam, G; Maas, R

    2011-07-01

    The purpose of this study was to develop quantitative and qualitative MRI criteria to differentiate between healthy and pathological Achilles tendons. 364 Achilles tendons were examined on a 1.5 T MRI scanner; 264 patients had Achilles tendon complaints, and 100 asymptomatic Achilles tendons served as controls. T1-weighted, T2-weighted, and STIR sequences were performed in sagittal and axial orientation. Images were evaluated in consensus by two radiologists, and quantitative and qualitative criteria were assessed. A Mann-Whitney U test and a regression analysis were used for statistical analysis. There were statistically significant differences between the patients with disorders and the control group concerning the depth (12.0 mm vs. 6.3 mm, p < 0.001) and length (83.2 mm vs. 45.9 mm, p < 0.001) of the tendon, the cross-sectional area of the tendon (1.60 mm² vs. 0.61 mm², p < 0.001), and the length of the retrocalcaneal bursa (8.3 mm vs. 5.3 mm, p < 0.001). A formula combining the three criteria tendon depth (A4), bursa length (A5), and tendon area (F) yielded a sensitivity of 97 % and a specificity of 91 %. Measurement of the Achilles tendon and binary logistic regression analysis allow differentiation between normal and pathological Achilles tendons. © Georg Thieme Verlag KG Stuttgart · New York.
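
    The classification rule described in this abstract is a binary logistic regression over three measurements (tendon depth A4, bursa length A5, and cross-sectional area F). The published coefficients are not given here, so the sketch below fits a stand-in model on synthetic data loosely centred on the group means quoted above; it illustrates the technique, not the authors' model.

```python
# Hedged sketch: a binary logistic regression combining tendon depth (A4),
# bursa length (A5) and cross-sectional area (F). The training data below are
# synthetic placeholders, not the published study data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "healthy" and "pathological" tendons loosely centred on the group
# means quoted in the abstract (depth in mm, bursa length in mm, area).
healthy = np.column_stack([rng.normal(6.3, 1.0, 100),
                           rng.normal(5.3, 1.0, 100),
                           rng.normal(0.61, 0.10, 100)])
diseased = np.column_stack([rng.normal(12.0, 2.0, 100),
                            rng.normal(8.3, 1.5, 100),
                            rng.normal(1.60, 0.30, 100)])
X = np.vstack([healthy, diseased])
y = np.array([0] * 100 + [1] * 100)          # 0 = asymptomatic, 1 = symptomatic

model = LogisticRegression().fit(X, y)
prob = model.predict_proba([[11.5, 8.0, 1.4]])[0, 1]
print(f"P(pathological tendon) = {prob:.2f}")
```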

  16. Spotlight-8 Image Analysis Software

    NASA Technical Reports Server (NTRS)

    Klimek, Robert; Wright, Ted

    2006-01-01

    Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.

  17. Oncological image analysis: medical and molecular image analysis

    NASA Astrophysics Data System (ADS)

    Brady, Michael

    2007-03-01

    This paper summarises the work we have been doing on joint projects with GE Healthcare on colorectal and liver cancer, and with Siemens Molecular Imaging on dynamic PET. First, we recall the salient facts about cancer and oncological image analysis. Then we introduce some of the work that we have done on analysing clinical MRI images of colorectal and liver cancer, specifically the detection of lymph nodes and segmentation of the circumferential resection margin. In the second part of the paper, we shift attention to the complementary aspect of molecular image analysis, illustrating our approach with some recent work on: tumour acidosis, tumour hypoxia, and multiply drug resistant tumours.

  18. Imaging based enrichment criteria using deep learning algorithms for efficient clinical trials in MCI

    PubMed Central

    Ithapu, Vamsi K.; Okonkwo, Ozioma C.; Chappell, Richard J.; Dowling, N. Maritza; Johnson, Sterling C.

    2015-01-01

    The Mild Cognitive Impairment (MCI) stage of AD may be optimal for clinical trials to test potential treatments for preventing or delaying decline to dementia. However, MCI is heterogeneous in that not all cases progress to dementia within the time frame of a trial, and some may not have underlying AD pathology. Identifying those MCIs who are most likely to decline during a trial and thus most likely to benefit from treatment will improve trial efficiency and power to detect treatment effects. To this end, employing multi-modal imaging-derived inclusion criteria may be especially beneficial. Here, we present a novel multi-modal imaging marker that predicts future cognitive and neural decline from [F-18]fluorodeoxyglucose positron emission tomography (PET), amyloid florbetapir PET, and structural magnetic resonance imaging (MRI), based on a new deep learning algorithm (randomized denoising autoencoder marker, rDAm). Using ADNI2 MCI data, we show that employing rDAm as a trial enrichment criterion reduces the required sample estimates by at least five times compared to the no-enrichment regime, and leads to smaller trials with high statistical power, compared to existing methods. PMID:26093156

  19. Mapping tropical dry forest succession using multiple criteria spectral mixture analysis

    NASA Astrophysics Data System (ADS)

    Cao, Sen; Yu, Qiuyan; Sanchez-Azofeifa, Arturo; Feng, Jilu; Rivard, Benoit; Gu, Zhujun

    2015-11-01

    Tropical dry forests (TDFs) in the Americas are considered the first frontier of economic development with less than 1% of their total original coverage under protection. Accordingly, accurate estimates of their spatial extent, fragmentation, and degree of regeneration are critical in evaluating the success of current conservation policies. This study focused on a well-protected secondary TDF in Santa Rosa National Park (SRNP) Environmental Monitoring Super Site, Guanacaste, Costa Rica. We used spectral signature analysis of TDF ecosystem succession (early, intermediate, and late successional stages), and its intrinsic variability, to propose a new multiple criteria spectral mixture analysis (MCSMA) method on the shortwave infrared (SWIR) of HyMap image. Unlike most existing iterative mixture analysis (IMA) techniques, MCSMA tries to extract and make use of representative endmembers with spectral and spatial information. MCSMA then considers three criteria that influence the comparative importance of different endmember combinations (endmember models): root mean square error (RMSE); spatial distance (SD); and fraction consistency (FC), to create an evaluation framework to select a best-fit model. The spectral analysis demonstrated that TDFs have a high spectral variability as a result of biomass variability. By adopting two search strategies, the unmixing results showed that our new MCSMA approach had a better performance in root mean square error (early: 0.160/0.159; intermediate: 0.322/0.321; and late: 0.239/0.235); mean absolute error (early: 0.132/0.128; intermediate: 0.254/0.251; and late: 0.191/0.188); and systematic error (early: 0.045/0.055; intermediate: -0.211/-0.214; and late: 0.161/0.160), compared to the multiple endmember spectral mixture analysis (MESMA). This study highlights the importance of SWIR in differentiating successional stages in TDFs. The proposed MCSMA provides a more flexible and generalized means for the best-fit model determination
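
    The core computation behind this kind of mixture-model selection is linear unmixing of each pixel spectrum against candidate endmember sets, scored by a fit criterion such as RMSE. The hedged sketch below uses synthetic spectra and only the RMSE criterion, omitting the spatial-distance and fraction-consistency terms described in the abstract.

```python
# Hedged sketch of the core loop of an iterative spectral mixture analysis:
# for each candidate endmember combination, solve a non-negative linear
# unmixing and keep the model with the lowest RMSE. Spectra are synthetic.
import itertools
import numpy as np
from scipy.optimize import nnls

n_bands = 30
rng = np.random.default_rng(1)
library = {name: rng.random(n_bands) for name in
           ["early_1", "early_2", "intermediate_1", "late_1", "late_2"]}

# Synthetic pixel: 60 % of one endmember + 40 % of another, plus noise.
pixel = 0.6 * library["early_1"] + 0.4 * library["late_1"] \
        + rng.normal(0, 0.01, n_bands)

best = None
for combo in itertools.combinations(library, 2):       # two-endmember models
    A = np.column_stack([library[name] for name in combo])
    fractions, _ = nnls(A, pixel)                       # non-negative fractions
    rmse = np.sqrt(np.mean((A @ fractions - pixel) ** 2))
    if best is None or rmse < best[2]:
        best = (combo, fractions, rmse)

print("best-fit endmember model:", best[0],
      "fractions:", np.round(best[1], 2), "RMSE:", round(best[2], 4))
```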

  20. Image interpretation criteria for FDG PET/CT in multiple myeloma: a new proposal from an Italian expert panel. IMPeTUs (Italian Myeloma criteria for PET USe).

    PubMed

    Nanni, Cristina; Zamagni, Elena; Versari, Annibale; Chauvie, Stephane; Bianchi, Andrea; Rensi, Marco; Bellò, Marilena; Rambaldi, Ilaria; Gallamini, Andrea; Patriarca, Francesca; Gay, Francesca; Gamberi, Barbara; Cavo, Michele; Fanti, Stefano

    2016-03-01

    .43, 0.22 and 0.21, respectively, and on PET-EoT, the alpha coefficients were 0.07, 0.28, 0.25 and 0.21, respectively. BM was generally difficult to score since grades 2 and 3 are difficult to discriminate. However, since neither of the two grades is related to BM myelomatous involvement, the difference was not clinically relevant. Agreement on focal lesion scores and on the number of focal lesions was good. The new visual criteria for interpreting FDG PET/CT imaging in MM patients, IMPeTUs, were found to be feasible in clinical practice.

  1. Appendage modal coordinate truncation criteria in hybrid coordinate dynamic analysis. [for spacecraft attitude control

    NASA Technical Reports Server (NTRS)

    Likins, P.; Ohkami, Y.; Wong, C.

    1976-01-01

    The paper examines the validity of the assumption that certain appendage-distributed (modal) coordinates can be truncated from a system model without unacceptable degradation of fidelity in hybrid coordinate dynamic analysis for attitude control of spacecraft with flexible appendages. Alternative truncation criteria are proposed and their interrelationships defined. Particular attention is given to truncation criteria based on eigenvalues, eigenvectors, and controllability and observability. No definitive resolution of the problem is advanced, and exhaustive study is required to obtain ultimate truncation criteria.

  2. Paraxial ghost image analysis

    NASA Astrophysics Data System (ADS)

    Abd El-Maksoud, Rania H.; Sasian, José M.

    2009-08-01

    This paper develops a methodology to model ghost images that are formed by two reflections between the surfaces of a multi-element lens system in the paraxial regime. An algorithm is presented to generate the ghost layouts from the nominal layout. For each possible ghost layout, paraxial ray tracing is performed to determine the ghost Gaussian cardinal points, the size of the ghost image at the nominal image plane, the location and diameter of the ghost entrance and exit pupils, and the location and diameter for the ghost entrance and exit windows. The paraxial ghost irradiance point spread function is obtained by adding up the irradiance contributions for all ghosts. Ghost simulation results for a simple lens system are provided. This approach provides a quick way to analyze ghost images in the paraxial regime.
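
    The paraxial analysis described here rests on 2 x 2 ray-transfer (ABCD) matrices. The hedged sketch below only composes the nominal layout of a single thin lens and locates its image plane (where the B element of the composite matrix vanishes); the ghost layouts of the paper would be generated by inserting two reflection matrices into such a chain, which this toy example does not attempt. The focal length and object distance are arbitrary.

```python
# Hedged sketch of the paraxial (ABCD) machinery: rays are propagated with
# 2x2 ray-transfer matrices, and an image plane is found where the B element
# of the composite matrix vanishes. Only the nominal layout is traced here.
import numpy as np

def translation(d):
    """Propagation over a distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """Refraction by a thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

f, s_object = 50.0, 100.0                      # mm; object 100 mm in front of the lens
M = thin_lens(f) @ translation(s_object)       # object plane -> just after the lens

# Image plane: distance d behind the lens where the composite B element is zero.
B, D = M[0, 1], M[1, 1]
d_image = -B / D
print(f"nominal image plane {d_image:.1f} mm behind the lens")   # expect 100 mm for s = 2f
```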

  3. Radiologist and automated image analysis

    NASA Astrophysics Data System (ADS)

    Krupinski, Elizabeth A.

    1999-07-01

    Significant advances are being made in the area of automated medical image analysis. Part of the progress is due to the general advances being made in the types of algorithms used to process images and perform various detection and recognition tasks. A more important driver of this growth in medical image analysis, however, may be something quite different. The use of computer workstations, digital image acquisition technologies and CRT monitors for the display of medical images for primary diagnostic reading is becoming more prevalent in radiology departments around the world. With the advance in computer-based displays, however, has come the realization that displaying images on a CRT monitor is not the same as displaying film on a viewbox. There are perceptual, cognitive and ergonomic issues that must be considered if radiologists are to accept this change in technology and display. The bottom line is that radiologists' performance must be evaluated with these new technologies and image analysis techniques in order to verify that diagnostic performance is at least as good with these new technologies and image analysis procedures as with film-based displays. The goal of this paper is to address some of the perceptual, cognitive and ergonomic issues associated with reading radiographic images from digital displays.

  4. 75 FR 69140 - NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-10

    NUREG-1953, Confirmatory Thermal-Hydraulic Analysis to Support Specific Success Criteria in the Standardized Plant Analysis Risk Models...

  5. Criteria for Comparing Domain Analysis Approaches Version 01.00.00

    DTIC Science & Technology

    1991-12-01

    ...all domain analysis approaches. Section 4 presents the comparison criteria. Section 5 applies these criteria to the six domain analysis approaches surveyed (among them KAPTUR, Prieto-Diaz, and FODA)...

  6. Histopathological Image Analysis: A Review

    PubMed Central

    Gurcan, Metin N.; Boucheron, Laura; Can, Ali; Madabhushi, Anant; Rajpoot, Nasir; Yener, Bulent

    2010-01-01

    Over the past decade, dramatic increases in computational power and improvements in image analysis algorithms have allowed the development of powerful computer-assisted analytical approaches to radiological data. With the recent advent of whole slide digital scanners, tissue histopathology slides can now be digitized and stored in digital image form. Consequently, digitized tissue histopathology has now become amenable to the application of computerized image analysis and machine learning techniques. Analogous to the role of computer-assisted diagnosis (CAD) algorithms in medical imaging to complement the opinion of a radiologist, CAD algorithms have begun to be developed for disease detection, diagnosis, and prognosis prediction to complement the opinion of the pathologist. In this paper, we review the recent state-of-the-art CAD technology for digitized histopathology. This paper also briefly describes the development and application of novel image analysis technology for a few specific histopathology-related problems being pursued in the United States and Europe. PMID:20671804

  7. Multispectral analysis of multimodal images.

    PubMed

    Kvinnsland, Yngve; Brekke, Njål; Taxt, Torfinn M; Grüner, Renate

    2009-01-01

    An increasing number of multimodal images represent a valuable increase in available image information, but at the same time they complicate the extraction of diagnostic information across the images. Multispectral analysis (MSA) has the potential to simplify this problem substantially, as an unlimited number of images can be combined and tissue properties across the images can be extracted automatically. We have developed a software solution for MSA containing two algorithms for unsupervised classification, an EM algorithm finding multinormal class descriptions and the k-means clustering algorithm, and two for supervised classification, a Bayesian classifier using multinormal class descriptions and a kNN algorithm. The software has an efficient user interface for the creation and manipulation of class descriptions, and it has proper tools for displaying the results. The software has been tested on different sets of images. One application is to segment cross-sectional images of brain tissue (T1- and T2-weighted MR images) into the main normal tissues and brain tumors. Another interesting set of images comprises the perfusion maps and diffusion maps, which are derived from raw MR images. The software returns segmentations that seem to be sensible. The MSA software appears to be a valuable tool for image analysis with multimodal images at hand. It readily gives a segmentation of image volumes that visually seems to be sensible. However, to really learn how to use MSA, it will be necessary to gain more insight into what tissues the different segments contain, and the upcoming work will therefore be focused on examining the tissues through, for example, histological sections.
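
    As one concrete illustration of the unsupervised route mentioned above, the hedged sketch below clusters synthetic two-channel voxel vectors (a stand-in for co-registered T1- and T2-weighted intensities) with k-means; the class count and the data are placeholders, and this is not the authors' software.

```python
# Hedged sketch: k-means clustering of voxel feature vectors assembled from
# two co-registered image volumes. All data below are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Three synthetic "tissues", each with its own mean (T1, T2) intensity pair.
tissue_means = np.array([[0.2, 0.8], [0.5, 0.5], [0.9, 0.3]])
voxels = np.vstack([rng.normal(m, 0.05, size=(500, 2)) for m in tissue_means])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(voxels)
for k in range(3):
    print(f"class {k}: {np.sum(labels == k)} voxels, "
          f"mean feature vector {voxels[labels == k].mean(axis=0).round(2)}")
```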

  8. Vessel Labeling in Combined Confocal Scanning Laser Ophthalmoscopy and Optical Coherence Tomography Images: Criteria for Blood Vessel Discrimination

    PubMed Central

    Motte, Jeremias; Alten, Florian; Ewering, Carina; Osada, Nani; Kadas, Ella M.; Brandt, Alexander U.; Oberwahrenbrock, Timm; Clemens, Christoph R.; Eter, Nicole; Paul, Friedemann; Marziniak, Martin

    2014-01-01

    Introduction The diagnostic potential of optical coherence tomography (OCT) in neurological diseases is intensively discussed. Besides the sectional view of the retina, modern OCT scanners produce a simultaneous top-view confocal scanning laser ophthalmoscopy (cSLO) image, including the option to evaluate retinal vessels. A correct discrimination between arteries and veins (labeling) is vital for detecting vascular differences between healthy subjects and patients. Up to now, criteria for labeling cSLO images generated by OCT scanners do not exist. Objective This study reviewed labeling criteria originally developed for color fundus photography (CFP) images. Methods The criteria were modified to reflect the cSLO technique, followed by development of a protocol for labeling blood vessels. These criteria were based on main aspects such as central light reflex, brightness, and vessel thickness, as well as on some additional criteria such as vascular crossing patterns and the context of the vessel tree. Results and Conclusion The criteria demonstrated excellent inter-rater agreement and validity, which seems to indicate that labeling of images might no longer require more than one rater. This algorithm extends the diagnostic possibilities offered by OCT investigations. PMID:25203135

  9. Flightspeed Integral Image Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image," a caching method that permits fast summation of values within rectangular regions of an image and thereby facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational restraints. The software provides an order of magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
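
    The "integral image" structure named in this abstract is simple to reproduce: one pass of cumulative sums lets the sum over any axis-aligned rectangle be read off from four lookups. The NumPy sketch below illustrates the idea; it is a stand-in, not the FIIAT C API.

```python
# Hedged sketch of the integral-image caching trick: after one cumulative-sum
# pass, any rectangular region sum is obtained from four table lookups.
import numpy as np

def integral_image(img):
    """Cumulative sum over rows and columns, padded with a leading zero row/col."""
    ii = img.cumsum(axis=0).cumsum(axis=1)
    return np.pad(ii, ((1, 0), (1, 0)))

def rect_sum(ii, top, left, bottom, right):
    """Sum of img[top:bottom, left:right] using four integral-image lookups."""
    return ii[bottom, right] - ii[top, right] - ii[bottom, left] + ii[top, left]

img = np.arange(36, dtype=np.int64).reshape(6, 6)
ii = integral_image(img)
assert rect_sum(ii, 1, 2, 4, 5) == img[1:4, 2:5].sum()
print("rectangle sum via integral image:", rect_sum(ii, 1, 2, 4, 5))
```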

  10. Decerns: A framework for multi-criteria decision analysis

    DOE PAGES

    Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...

    2015-02-27

    A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical risk-management problems is introduced. The Decerns framework contains a library of modules that form the basis for two scalable systems: DecernsMCDA, for analysis of multicriteria problems, and DecernsSDSS, for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods as well as original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on the analysis of a multicriteria location problem.
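
    A minimal building block of MCDA of the kind Decerns packages is a normalized weighted sum across criteria. The hedged sketch below ranks three hypothetical alternatives with placeholder scores and weights; Decerns' actual methods, including its uncertainty treatment, are more elaborate.

```python
# Hedged sketch of a normalized weighted-sum MCDA ranking. Alternatives,
# criteria, scores and weights are illustrative placeholders.
import numpy as np

alternatives = ["site A", "site B", "site C"]
criteria = ["risk reduction", "cost", "public acceptance"]
benefit = np.array([True, False, True])          # cost is minimized, others maximized
scores = np.array([[0.70, 1.2e6, 0.6],
                   [0.55, 0.8e6, 0.8],
                   [0.85, 2.0e6, 0.4]])
weights = np.array([0.5, 0.3, 0.2])              # placeholder decision-maker weights

# Min-max normalize each criterion to [0, 1], flipping cost-type criteria.
lo, hi = scores.min(axis=0), scores.max(axis=0)
norm = (scores - lo) / (hi - lo)
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

overall = norm @ weights
for name, s in sorted(zip(alternatives, overall), key=lambda t: -t[1]):
    print(f"{name}: {s:.2f}")
```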

  11. Comparison of the EORTC criteria and PERCIST in solid tumors: a pooled analysis and review

    PubMed Central

    Kim, Jung Han

    2016-01-01

    Two sets of response criteria using PET are currently available to monitor metabolic changes in solid tumors: the criteria developed by the European Organization for Research and Treatment of Cancer (EORTC criteria) and the PET Response Criteria in Solid Tumors (PERCIST). We conducted this pooled study to investigate the strength of agreement between the EORTC criteria and PERCIST in the assessment of tumor response. We surveyed MEDLINE, EMBASE and PUBMED for articles with terms of the EORTC criteria and PERCIST between 2009 and January 2016. We searched for all the references of relevant articles and reviews using the ‘related articles’ feature in the PUBMED. There were six articles with the data on the comparison of the EORTC criteria and PERCIST. A total of 348 patients were collected; 190 (54.6%) with breast cancer, 81 with colorectal cancer, 45 with lung cancer, 14 with basal cell carcinoma in the skin, 12 with stomach cancer, and 6 with head and neck cancer. The agreement of tumor response between the EORTC criteria and PERCIST was excellent (k = 0.946). Of 348 patients, only 12 (3.4%) showed disagreement between the two criteria in the assessment of tumor response. The shift of tumor response between the EORTC criteria and PERCIST occurred mostly in patients with PMR and SMD. The estimated overall response rates were not significantly different between the two criteria (72.7% by EORTC vs. 73.6% by PERCIST). In conclusion, this pooled analysis demonstrates that the EORTC criteria and PERCIST showed almost perfect agreement in the assessment of tumor response. PMID:27517621

  12. Comparison of the EORTC criteria and PERCIST in solid tumors: a pooled analysis and review.

    PubMed

    Kim, Jung Han

    2016-09-06

    Two sets of response criteria using PET are currently available to monitor metabolic changes in solid tumors: the criteria developed by the European Organization for Research and Treatment of Cancer (EORTC criteria) and the PET Response Criteria in Solid Tumors (PERCIST). We conducted this pooled study to investigate the strength of agreement between the EORTC criteria and PERCIST in the assessment of tumor response. We surveyed MEDLINE, EMBASE and PUBMED for articles with terms of the EORTC criteria and PERCIST between 2009 and January 2016. We searched for all the references of relevant articles and reviews using the 'related articles' feature in the PUBMED. There were six articles with the data on the comparison of the EORTC criteria and PERCIST. A total of 348 patients were collected; 190 (54.6%) with breast cancer, 81 with colorectal cancer, 45 with lung cancer, 14 with basal cell carcinoma in the skin, 12 with stomach cancer, and 6 with head and neck cancer. The agreement of tumor response between the EORTC criteria and PERCIST was excellent (k = 0.946). Of 348 patients, only 12 (3.4%) showed disagreement between the two criteria in the assessment of tumor response. The shift of tumor response between the EORTC criteria and PERCIST occurred mostly in patients with PMR and SMD. The estimated overall response rates were not significantly different between the two criteria (72.7% by EORTC vs. 73.6% by PERCIST). In conclusion, this pooled analysis demonstrates that the EORTC criteria and PERCIST showed almost perfect agreement in the assessment of tumor response.
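
    The agreement statistic reported in these two records is Cohen's kappa. The hedged sketch below computes it from an illustrative cross-tabulation of response categories assigned by the two criteria sets; the counts are invented for the example and are not the pooled study data.

```python
# Hedged sketch: Cohen's kappa from a cross-tabulation of response categories.
# Rows: EORTC assessment (CR, PR, SD, PD); columns: PERCIST assessment.
# The counts below are illustrative placeholders.
import numpy as np

table = np.array([[40,  2,  0,  0],
                  [ 3, 60,  4,  0],
                  [ 0,  5, 30,  1],
                  [ 0,  0,  2, 25]], dtype=float)

n = table.sum()
p_observed = np.trace(table) / n
p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / n**2
kappa = (p_observed - p_expected) / (1.0 - p_expected)
print(f"observed agreement {p_observed:.3f}, Cohen's kappa {kappa:.3f}")
```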

  13. Analysis and performance of various classification criteria sets in a Colombian cohort of patients with spondyloarthritis.

    PubMed

    Bautista-Molano, Wilson; Landewé, Robert B M; Londoño, John; Romero-Sanchez, Consuelo; Valle-Oñate, Rafael; van der Heijde, Désirée

    2016-07-01

    The objective of this study was to investigate the performance of classification criteria sets (Assessment of SpondyloArthritis international Society (ASAS), European Spondylarthropathy Study Group (ESSG), and Amor) for spondyloarthritis (SpA) in a clinical practice cohort in Colombia and provide insight into how rheumatologists follow the diagnostic path in patients suspected of SpA. Patients with a rheumatologist's diagnosis of SpA were retrospectively classified according to three criteria sets. Classification rate was defined as the proportion of patients fulfilling a particular criterion. Characteristics of patients fulfilling and not fulfilling each criterion were compared. The ASAS criteria classified 81 % of all patients (n = 581) as having either axial SpA (44 %) or peripheral SpA (37 %), whereas a lower proportion met ESSG criteria (74 %) and Amor criteria (53 %). There was a high degree of overlap among the different criteria, and 42 % of the patients met all three criteria. Patients fulfilling all three criteria sets were older (36 vs. 30 years), had more SpA features (3 vs. 1 features), and more frequently had a current or past history of back pain (77 vs. 43 %), inflammatory back pain (47 vs. 13 %), enthesitis (67 vs. 26 %), and buttock pain (37 vs. 13 %) vs. those not fulfilling any criteria. HLA-B27, radiographs, and MRI-SI were performed in 77, 59, and 24 % of the patients, respectively. The ASAS criteria classified more patients as having SpA in this Colombian cohort when the rheumatologist's diagnosis is used as an external standard. Although physicians do not perform HLA-B27 or imaging in all patients, they do require these tests if the clinical symptoms fall short of confirming SpA and suspicion remains.

  14. Distributed multi-criteria model evaluation and spatial association analysis

    NASA Astrophysics Data System (ADS)

    Scherer, Laura; Pfister, Stephan

    2015-04-01

    Model performance, if evaluated, is often communicated by a single indicator and at an aggregated level; however, it does not embrace the trade-offs between different indicators and the inherent spatial heterogeneity of model efficiency. In this study, we simulated the water balance of the Mississippi watershed using the Soil and Water Assessment Tool (SWAT). The model was calibrated against monthly river discharge at 131 measurement stations. Its time series were bisected to allow for subsequent validation at the same gauges. Furthermore, the model was validated against evapotranspiration which was available as a continuous raster based on remote sensing. The model performance was evaluated for each of the 451 sub-watersheds using four different criteria: 1) Nash-Sutcliffe efficiency (NSE), 2) percent bias (PBIAS), 3) root mean square error (RMSE) normalized to standard deviation (RSR), as well as 4) a combined indicator of the squared correlation coefficient and the linear regression slope (bR2). Conditions that might lead to a poor model performance include aridity, a very flat and steep relief, snowfall and dams, as indicated by previous research. In an attempt to explain spatial differences in model efficiency, the goodness of the model was spatially compared to these four phenomena by means of a bivariate spatial association measure which combines Pearson's correlation coefficient and Moran's index for spatial autocorrelation. In order to assess the model performance of the Mississippi watershed as a whole, three different averages of the sub-watershed results were computed by 1) applying equal weights, 2) weighting by the mean observed river discharge, 3) weighting by the upstream catchment area and the square root of the time series length. Ratings of model performance differed significantly in space and according to efficiency criterion. The model performed much better in the humid Eastern region than in the arid Western region which was confirmed by the
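
    Three of the per-gauge efficiency criteria listed above (NSE, PBIAS and RSR) reduce to short formulas over the observed and simulated series. The hedged sketch below evaluates them for one synthetic discharge record; it is not the SWAT calibration code.

```python
# Hedged sketch of three model-efficiency criteria (NSE, PBIAS, RSR) computed
# for a synthetic pair of observed and simulated monthly discharge series.
import numpy as np

rng = np.random.default_rng(3)
observed = 100 + 30 * np.sin(np.linspace(0, 6 * np.pi, 120)) + rng.normal(0, 5, 120)
simulated = observed + rng.normal(2, 10, 120)      # biased, noisy model output

residuals = observed - simulated
nse = 1.0 - np.sum(residuals**2) / np.sum((observed - observed.mean())**2)
pbias = 100.0 * residuals.sum() / observed.sum()
rsr = np.sqrt(np.mean(residuals**2)) / observed.std()

print(f"NSE = {nse:.2f}, PBIAS = {pbias:.1f} %, RSR = {rsr:.2f}")
```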

  15. Image analysis for DNA sequencing

    NASA Astrophysics Data System (ADS)

    Palaniappan, Kannappan; Huang, Thomas S.

    1991-07-01

    There is a great deal of interest in automating the process of DNA (deoxyribonucleic acid) sequencing to support the analysis of genomic DNA such as the Human and Mouse Genome projects. In one class of gel-based sequencing protocols, autoradiograph images are generated in the final step and usually require manual interpretation to reconstruct the DNA sequence represented by the image. The need to handle a large volume of sequence information necessitates automation of the manual autoradiograph reading step through image analysis in order to reduce the length of time required to obtain sequence data and to reduce transcription errors. Various adaptive image enhancement, segmentation and alignment methods were applied to autoradiograph images. The methods are adaptive to the local characteristics of the image such as noise, background signal, or presence of edges. Once the two-dimensional data is converted to a set of aligned one-dimensional profiles, waveform analysis is used to determine the location of each band which represents one nucleotide in the sequence. Different classification strategies, including a rule-based approach, are investigated to map the profile signals, augmented with the original two-dimensional image data as necessary, to textual DNA sequence information.

  16. Errors from Image Analysis

    SciTech Connect

    Wood, William Monford

    2015-02-23

    This report presents a systematic study of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, and makes suggestions for improving the methodology of obtaining quantitative information from radiographed objects.

  17. Evaluation of expert criteria for preoperative magnetic resonance imaging of newly diagnosed breast cancer.

    PubMed

    Behrendt, Carolyn E; Tumyan, Lusine; Gonser, Laura; Shaw, Sara L; Vora, Lalit; Paz, I Benjamin; Ellenhorn, Joshua D I; Yim, John H

    2014-08-01

    Despite 2 randomized trials reporting no reduction in operations or local recurrence at 1 year, preoperative magnetic resonance imaging (MRI) is increasingly used in the diagnostic workup of breast cancer. We evaluated 5 utilization criteria recently proposed by experts. Of women (n = 340) newly diagnosed with unilateral breast cancer who underwent bilateral MRI, most (69.4%) met at least 1 criterion before MRI: mammographic density (44.4%), under consideration for partial breast irradiation (PBI) (19.7%), genetic-familial risk (12.9%), invasive lobular carcinoma (11.8%), and multifocal/multicentric disease (10.6%). MRI detected an occult malignant lesion or an extension of the index lesion in 21.2% of index breasts and 3.3% of contralateral breasts. No expert criterion was associated with an MRI-detected malignant lesion, which was associated instead with a pre-MRI plan of lumpectomy without PBI (48.2% of subjects): odds ratio 3.05, 95% CI 1.57-5.91 (p adjusted for multiple hypothesis testing = 0.007, adjusted for index-vs-contralateral breast and covariates). The expert guidelines were not confirmed by clinical evidence.

  18. Anmap: Image and data analysis

    NASA Astrophysics Data System (ADS)

    Alexander, Paul; Waldram, Elizabeth; Titterington, David; Rees, Nick

    2014-11-01

    Anmap analyses and processes images and spectral data. Originally written for use in radio astronomy, much of its functionality is applicable to other disciplines; additional algorithms and analysis procedures allow direct use in, for example, NMR imaging and spectroscopy. Anmap emphasizes the analysis of data to extract quantitative results for comparison with theoretical models and/or other experimental data. To achieve this, Anmap provides a wide range of tools for analysis, fitting and modelling (including standard image and data processing algorithms). It also provides a powerful environment for users to develop their own analysis/processing tools either by combining existing algorithms and facilities with the very powerful command (scripting) language or by writing new routines in FORTRAN that integrate seamlessly with the rest of Anmap.

  19. Evaluation of low-energy contrast-enhanced spectral mammography images by comparing them to full-field digital mammography using EUREF image quality criteria.

    PubMed

    Lalji, U C; Jeukens, C R L P N; Houben, I; Nelemans, P J; van Engen, R E; van Wylick, E; Beets-Tan, R G H; Wildberger, J E; Paulis, L E; Lobbes, M B I

    2015-10-01

    Contrast-enhanced spectral mammography (CESM) examination results in a low-energy (LE) and contrast-enhanced image. The LE appears similar to a full-field digital mammogram (FFDM). Our aim was to evaluate LE CESM image quality by comparing it to FFDM using criteria defined by the European Reference Organization for Quality Assured Breast Screening and Diagnostic Services (EUREF). A total of 147 cases with both FFDM and LE images were independently scored by two experienced radiologists using these (20) EUREF criteria. Contrast detail measurements were performed using a dedicated phantom. Differences in image quality scores, average glandular dose, and contrast detail measurements between LE and FFDM were tested for statistical significance. No significant differences in image quality scores were observed between LE and FFDM images for 17 out of 20 criteria. LE scored significantly lower on one criterion regarding the sharpness of the pectoral muscle (p < 0.001), and significantly better on two criteria on the visualization of micro-calcifications (p = 0.02 and p = 0.034). Dose and contrast detail measurements did not reveal any physical explanation for these observed differences. Low-energy CESM images are non-inferior to FFDM images. From this perspective FFDM can be omitted in patients with an indication for CESM. • Low-energy CESM images are non-inferior to FFDM images. • Micro-calcifications are significantly more visible on LE CESM than on FFDM. • There is no physical explanation for this improved visibility of micro-calcifications. • There is no need for an extra FFDM when CESM is indicated.

  20. Performance criteria for emergency medicine residents: a job analysis.

    PubMed

    Blouin, Danielle; Dagnone, Jeffrey Damon

    2008-11-01

    A major role of admission interviews is to assess a candidate's suitability for a residency program. Structured interviews have greater reliability and validity than do unstructured ones. The development of content for a structured interview is typically based on the dimensions of performance that are perceived as important to succeed in a particular line of work. A formal job analysis is normally conducted to determine these dimensions. The dimensions essential to succeed as an emergency medicine (EM) resident have not yet been studied. We aimed to analyze the work of EM residents to determine these essential dimensions. The "critical incident technique" was used to generate scenarios of poor and excellent resident performance. Two reviewers independently read each scenario and labelled the performance dimensions that were reflected in each. All labels assigned to a particular scenario were pooled and reviewed again until a consensus was reached. Five faculty members (25% of our total faculty) comprised the subject experts. Fifty-one incidents were generated and 50 different labels were applied. Eleven dimensions of performance applied to at least 5 incidents. "Professionalism" was the most valued performance dimension, represented in 56% of the incidents, followed by "self-confidence" (22%), "experience" (20%) and "knowledge" (20%). "Professionalism," "self-confidence," "experience" and "knowledge" were identified as the performance dimensions essential to succeed as an EM resident based on our formal job analysis using the critical incident technique. Performing a formal job analysis may assist training program directors with developing admission interviews.

  1. Data selection criteria in star-based monitoring of GOES imager visible-channel responsivities

    NASA Astrophysics Data System (ADS)

    Chang, I.-Lok; Crosby, David; Dean, Charles; Weinreb, Michael; Baltimore, Perry; Baucom, Jeanette; Han, Dejiang

    2004-10-01

    Monitoring the responsivities of the visible channels of the operational Geostationary Operational Environmental Satellites (GOES) is an on-going effort at NOAA. Various techniques are being used. In this paper we describe the technique based on the analysis of star signals that are used in the GOES Orbit and Attitude Tracking System (OATS) for satellite attitude and orbit determination. Time series of OATS star observations give information on the degradation of the detectors of a visible channel. Investigations of star data from the past three years have led to several modifications of the method we initially used to calculate the exponential degradation coefficient of a star-signal time series. First we observed that different patterns of detector output versus time result when star images drift across the detector array along different trajectories. We found that certain trajectories should be rejected in the data analysis. We found also that some detector-dependent weighting coefficients used in the OATS analysis tend to scatter the star signals measured by different detectors. We present a set of modifications to our star monitoring algorithms for resolving such problems. Other simple enhancements on the algorithms will also be described. With these modifications, the time series of the star signals show less scatter. This allows for more confidence in the estimated degradation rates and a more realistic statistical analysis on the extent of uncertainty in those rates. The resulting time series and estimated degradation rates for the visible channels of GOES-8 and GOES-10 Imagers will be presented.
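
    Estimating an exponential degradation coefficient from a star-signal time series can be illustrated with a log-linear fit, as in the hedged sketch below; the signal is synthetic and the procedure is a simplification of the OATS-based analysis described in the abstract.

```python
# Hedged sketch: fit an exponential degradation coefficient to a star-signal
# time series by linear regression on the log of the signal. Data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
t_years = np.linspace(0.0, 3.0, 150)                     # observation epochs
true_rate = 0.05                                         # 5 % loss per year
signal = np.exp(-true_rate * t_years) * (1 + rng.normal(0, 0.01, t_years.size))

slope, intercept = np.polyfit(t_years, np.log(signal), 1)
print(f"estimated degradation rate: {-slope * 100:.2f} % per year "
      f"(initial responsivity {np.exp(intercept):.3f})")
```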

  2. Multispectral Imaging Broadens Cellular Analysis

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Amnis Corporation, a Seattle-based biotechnology company, developed ImageStream to produce sensitive fluorescence images of cells in flow. The company responded to an SBIR solicitation from Ames Research Center, and proposed to evaluate several methods of extending the depth of field for its ImageStream system and implement the best as an upgrade to its commercial products. This would allow users to view whole cells at the same time, rather than just one section of each cell. Through Phase I and II SBIR contracts, Ames provided Amnis the funding the company needed to develop this extended functionality. For NASA, the resulting high-speed image flow cytometry process made its way into Medusa, a life-detection instrument built to collect, store, and analyze sample organisms from erupting hydrothermal vents, and has the potential to benefit space flight health monitoring. On the commercial end, Amnis has implemented the process in ImageStream, combining high-resolution microscopy and flow cytometry in a single instrument, giving researchers the power to conduct quantitative analyses of individual cells and cell populations at the same time, in the same experiment. ImageStream is also built for many other applications, including cell signaling and pathway analysis; classification and characterization of peripheral blood mononuclear cell populations; quantitative morphology; apoptosis (cell death) assays; gene expression analysis; analysis of cell conjugates; molecular distribution; and receptor mapping and distribution.

  3. [Diagnosis of pelvic inflammatory disease. Which clinical and paraclinical criteria? Role of imaging and laparoscopy?].

    PubMed

    Bouquier, J; Fauconnier, A; Fraser, W; Dumont, A; Huchon, C

    2012-12-01

    Diagnosis of pelvic inflammatory disease is difficult. We focus on a systematic literature review to study diagnostic values of history-taking, clinical examination, laboratory tests and imagery. After this literature review, we build a diagnostic model for pelvic inflammatory disease. This diagnostic model is built on two major criteria: presence of adnexal tenderness or cervical motion tenderness. Additional minor criteria, increasing the likelihood of the diagnosis of pelvic inflammatory disease were added based on their specificity and their positive likelihood ratio. These minor criteria are supported by history-taking, clinical examination, laboratory tests and also on relevant ultrasonographic criteria. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  4. Multi-criteria analysis for PM10 planning

    NASA Astrophysics Data System (ADS)

    Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa

    To implement sound air quality policies, Regulatory Agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations, due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
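
    The end product described above is a set of non-dominated (Pareto-efficient) scenarios trading off emission-reduction cost against the air-quality indicator. The hedged sketch below extracts that set from synthetic scenario data; it stands in for the multi-objective machinery, not the GAMES-based source-receptor models.

```python
# Hedged sketch: from a set of candidate emission-reduction scenarios, keep
# only the non-dominated ones (lower cost and lower PM10 indicator are both
# preferred). Scenario values are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(5)
cost = rng.uniform(0, 100, 200)                       # emission-reduction cost
pm10 = 60 - 0.4 * cost + rng.normal(0, 4, 200)        # air-quality indicator

points = np.column_stack([cost, pm10])
pareto = []
for i, p in enumerate(points):
    # p is dominated if some other point is <= in both objectives and < in one.
    dominated = np.any(np.all(points <= p, axis=1) & np.any(points < p, axis=1))
    if not dominated:
        pareto.append(i)

print(f"{len(pareto)} non-dominated scenarios out of {len(points)}")
```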

  5. Multivariate image analysis in biomedicine.

    PubMed

    Nattkemper, Tim W

    2004-10-01

    In recent years, multivariate imaging techniques have been developed and applied in biomedical research to an increasing degree. In research projects as well as in clinical studies, m-dimensional multivariate images (MVI) are recorded and stored in databases for subsequent analysis. The complexity of the m-dimensional data and the growing number of high-throughput applications call for new strategies for applying image processing and data mining to support direct interactive analysis by human experts. This article provides an overview of proposed approaches for MVI analysis in biomedicine. After summarizing the biomedical MVI techniques, the two-level framework for MVI analysis is illustrated. Following this framework, state-of-the-art solutions from the fields of image processing and data mining are reviewed and discussed. Motivations for MVI data mining in biology and medicine are characterized, followed by an overview of graphical and auditory approaches for interactive data exploration. The paper concludes by summarizing open problems in MVI analysis and remarks on the future development of biomedical MVI analysis.

  6. Family-based association analysis of alcohol dependence criteria and severity

    PubMed Central

    Wetherill, Leah; Kapoor, Manav; Agrawal, Arpana; Bucholz, Kathleen; Koller, Daniel; Bertelsen, Sarah E.; Le, Nhung; Wang, Jen-Chyong; Almasy, Laura; Hesselbrock, Victor; Kramer, John; Nurnberger, John I.; Schuckit, Marc; Tischfield, Jay A.; Xuei, Xiaoling; Porjesz, Bernice; Edenberg, Howard J.; Goate, Alison M.; Foroud, Tatiana

    2013-01-01

    Background Despite the high heritability of alcohol dependence (AD), the genes found to be associated with it account for only a small proportion of its total variability. The goal of this study was to identify and analyze phenotypes based on homogeneous classes of individuals to increase the power to detect genetic risk factors contributing to the risk of AD. Methods The 7 individual DSM-IV criteria for AD were analyzed using latent class analysis (LCA) to identify classes defined by the pattern of endorsement of the criteria. A genome-wide association study was performed in 118 extended European American families (n = 2,322 individuals) densely affected with AD to identify genes associated with AD, with each of the seven DSM-IV criteria, and with the probability of belonging to two of three latent classes. Results Heritability for DSM-IV AD was 61%, and ranged from 17% to 60% for the other phenotypes. A SNP in the olfactory receptor OR51L1 was significantly associated (7.3 × 10⁻⁸) with the DSM-IV criterion of persistent desire to, or inability to, cut down on drinking. LCA revealed a three-class model: the "low risk" class (50%) rarely endorsed any criteria, and none met criteria for AD; the "moderate risk" class (33%) endorsed primarily 4 DSM-IV criteria, and 48% met criteria for AD; the "high risk" class (17%) manifested high endorsement probabilities for most criteria, and nearly all (99%) met criteria for AD. One single nucleotide polymorphism (SNP) in the sodium leak channel NALCN demonstrated genome-wide significance with the high risk class (p = 4.1 × 10⁻⁸). Analyses in an independent sample did not replicate these associations. Conclusion We explored the genetic contribution to several phenotypes derived from the DSM-IV alcohol dependence criteria. The strongest evidence of association was with SNPs in NALCN and OR51L1. PMID:24015780

  7. Priority setting of health interventions: the need for multi-criteria decision analysis

    PubMed Central

    Baltussen, Rob; Niessen, Louis

    2006-01-01

    Priority setting of health interventions is often ad hoc and resources are not used to an optimal extent. The underlying problem is that multiple criteria play a role and decisions are complex. Interventions may be chosen to maximize general population health, to reduce health inequalities of disadvantaged or vulnerable groups, and/or to respond to life-threatening situations, all with respect to practical and budgetary constraints. This is the type of problem that policy makers are typically bad at solving rationally, unaided. They tend to use heuristic or intuitive approaches to simplify complexity, and in the process, important information is ignored. Policy makers may also select interventions on political grounds alone. This indicates the need for rational and transparent approaches to priority setting. Over the past decades, a number of approaches have been developed, including evidence-based medicine, burden of disease analyses, cost-effectiveness analyses, and equity analyses. However, these approaches concentrate on single criteria only, whereas in reality, policy makers need to make choices taking into account multiple criteria simultaneously. Moreover, they do not cover all criteria that are relevant to policy makers. Therefore, the development of a multi-criteria approach to priority setting is necessary, and this has indeed recently been identified as one of the most important issues in health system research. In other scientific disciplines, multi-criteria decision analysis is well developed, has gained widespread acceptance and is routinely used. This paper presents the main principles of multi-criteria decision analysis. There are only very few applications to guide resource allocation decisions in health. We call for a shift away from present priority-setting tools in health, which tend to focus on single criteria, towards transparent and systematic approaches that take into account all relevant criteria simultaneously. PMID:16923181

  8. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC; Catalogue identifier: ADXG_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0; Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland; Computers: Mobile Intel Pentium III, AMD Duron; Installations: no installation necessary, executable file together with necessary files for LabVIEW Run-time engine; Operating systems or monitors under which the program has been tested: Windows ME/2000/XP; Programming language used: LabVIEW 7.0; Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
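
    The statistics the program reports are standard moments of the brightness histogram. The hedged sketch below reproduces a few of them (mean, standard deviation, skewness, excess kurtosis) for a random synthetic image after an RGB-to-grey conversion with the common luminance coefficients; it is an illustration, not the LabVIEW routine.

```python
# Hedged sketch of basic histogram statistics for a greyscale image obtained
# from an RGB image with selectable conversion coefficients. The image is
# random noise and the coefficients are the usual luminance weights.
import numpy as np

rng = np.random.default_rng(6)
rgb = rng.integers(0, 256, size=(128, 128, 3)).astype(float)

coeffs = np.array([0.299, 0.587, 0.114])        # selectable conversion coefficients
grey = rgb @ coeffs

mean = grey.mean()
std = grey.std()
skewness = np.mean(((grey - mean) / std) ** 3)
kurtosis = np.mean(((grey - mean) / std) ** 4) - 3.0   # excess kurtosis

print(f"mean {mean:.1f}, std {std:.1f}, min {grey.min():.0f}, max {grey.max():.0f}")
print(f"skewness {skewness:.3f}, excess kurtosis {kurtosis:.3f}")
```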

  9. A mathematical analysis of the ABCD criteria for diagnosing malignant melanoma

    NASA Astrophysics Data System (ADS)

    Lee, Hyunju; Kwon, Kiwoon

    2017-03-01

    The medical community currently employs the ABCD (asymmetry, border irregularity, color variegation, and diameter of the lesion) criteria in the early diagnosis of a malignant melanoma. Although many image segmentation and classification methods are used to analyze the ABCD criteria, it is rare to see a study containing mathematical justification of the parameters that are used to quantify the ABCD criteria. In this paper, we suggest new parameters to assess asymmetry, border irregularity, and color variegation, and explain the mathematical meaning of the parameters. The suggested parameters are then tested with 24 skin samples. The parameters suggested for the 24 skin samples are displayed in three-dimensional coordinates and are compared to those presented in other studies (Ercal et al 1994 IEEE Trans. Biomed. Eng. 41 837–45, Cheerla and Frazier 2014 Int. J. Innovative Res. Sci., Eng. Technol. 3 9164–83) in terms of Pearson correlation coefficient and classification accuracy in determining the malignancy of the lesions.
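
    In the same spirit as the asymmetry and border-irregularity parameters discussed above, the hedged sketch below computes two generic shape descriptors from a synthetic binary lesion mask: the mismatch between the mask and its mirror image about the centroid axis, and the compactness ratio perimeter^2 / (4*pi*area). These are illustrative descriptors, not the specific parameters proposed in the paper.

```python
# Hedged sketch: generic asymmetry and border-irregularity descriptors for a
# synthetic binary lesion mask (a filled ellipse), not the published ABCD
# parameters.
import numpy as np

h, w = 200, 200
yy, xx = np.mgrid[0:h, 0:w]
mask = (((xx - 100) / 60.0) ** 2 + ((yy - 100) / 35.0) ** 2) <= 1.0

area = mask.sum()
# Crude perimeter estimate: lesion pixels with at least one background neighbour.
padded = np.pad(mask, 1)
interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
            padded[1:-1, :-2] & padded[1:-1, 2:])
perimeter = (mask & ~interior).sum()
irregularity = perimeter ** 2 / (4.0 * np.pi * area)     # ~1 for a circle

# Asymmetry about the vertical axis through the centroid column.
cx = int(round(xx[mask].mean()))
flipped = np.zeros_like(mask)
cols = np.clip(2 * cx - xx, 0, w - 1)
flipped[yy, cols] = mask
asymmetry = np.logical_xor(mask, flipped).sum() / (2.0 * area)

print(f"area {area}, border irregularity {irregularity:.2f}, asymmetry {asymmetry:.2f}")
```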

  10. A comparative analysis of the D-criteria used to determine genetic links of small bodies

    NASA Astrophysics Data System (ADS)

    Sokolova, M. G.; Kondratyeva, E. D.; Nefedyev, Y. A.

    2013-10-01

    In this article the D-criteria, which can be used to determine the genetic relationships of small bodies with their parent bodies in the solar system, are evaluated. The Drummond (1981), Southworth and Hawkins (1963), Jopek (1993), and dynamic (Kalinin and Kulikova, 2007; Holshevnikov and Titov, 2007) D-criteria were analysed. It was found that the Drummond criterion is less sensitive to errors of observation and its upper limit does not exceed 0.2. The Southworth-Hawkins and Jopek D-criteria are more stable and have good convergence. Limiting values, which vary in the range of 0.3-0.6 (except for the Lyrids), were determined on the basis of the analysis of six meteor showers for the Southworth-Hawkins and Jopek criteria.
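
    Of the criteria compared above, the Southworth-Hawkins D-criterion is the most widely quoted and can be written down compactly. The hedged sketch below implements its standard form for two arbitrary test orbits; the special sign handling for widely separated nodes is omitted, and the orbital elements are placeholders.

```python
# Hedged sketch of the Southworth-Hawkins D-criterion in its standard form
# (perihelion distance q in AU, eccentricity e; inclination i, node Omega and
# argument of perihelion omega in radians). Test orbits are arbitrary.
import numpy as np

def d_sh(orbit1, orbit2):
    q1, e1, i1, node1, peri1 = orbit1
    q2, e2, i2, node2, peri2 = orbit2

    # Angle I21 between the two orbital planes.
    sin_half_I_sq = (np.sin((i2 - i1) / 2) ** 2
                     + np.sin(i1) * np.sin(i2) * np.sin((node2 - node1) / 2) ** 2)
    two_sin_half_I = 2 * np.sqrt(sin_half_I_sq)

    # Difference of the longitudes of perihelion measured from the common node.
    pi21 = (peri2 - peri1
            + 2 * np.arcsin(np.cos((i2 + i1) / 2) * np.sin((node2 - node1) / 2)
                            / np.sqrt(1 - sin_half_I_sq)))

    d2 = ((q2 - q1) ** 2 + (e2 - e1) ** 2 + two_sin_half_I ** 2
          + ((e1 + e2) / 2 * 2 * np.sin(pi21 / 2)) ** 2)
    return np.sqrt(d2)

# Two similar test orbits (q, e, i, Omega, omega), angles in radians.
stream_orbit = (0.95, 0.90, np.radians(113.0), np.radians(139.0), np.radians(150.0))
candidate    = (0.93, 0.88, np.radians(111.5), np.radians(141.0), np.radians(152.0))
print(f"D_SH = {d_sh(stream_orbit, candidate):.3f}")
```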

  11. [Analysis of radiological criteria correlation and clinical manifestation of central lumbosacral spinal stenosis].

    PubMed

    Shevelev, I N; Kornienko, V N; Konovalov, N A; Cherkashov, A M; Nazarenko, A G; Asiutin, D S

    2012-01-01

    The aim was to assess the correlation between radiologic criteria of central degenerative spinal stenosis and the intensity of its clinical manifestation. Retrospective cohort data were collected from 2010 to 2011 on 27 patients who underwent surgical treatment of central spinal stenosis at the Burdenko Neurosurgical Institute. 16 male and 11 female patients were included in the present study. Mean age of the patients at the time of surgery was 57.9 years. All patients had spinal canal decompression and transpedicular or oblique transcorporal fusion. Stabilization included different types of pedicle screws, including transcutaneous stabilization systems. Interbody fusion was achieved by a posterolateral transforaminal approach (TLIF, transforaminal lumbar interbody fusion). 13 cases combined interbody fusion with guided oblique lumbar interbody fusion ("GO-LIF"), which could not be managed without robotic assistance. All patients underwent full preoperative examination. MR image evaluation included the antero-posterior diameter of the spinal canal (mm), the interfacet interval (mm), and the cross-sectional area of the spinal canal (mm²). Patients were evaluated with outcome analysis scales: Degenerative Disease Intensity Level (DDIL) and the Swiss Spinal Stenosis Score (Zurich Claudication Questionnaire, Brigham spinal stenosis questionnaire). Surgical outcomes were evaluated according to the modified classes of Kawabata et al. Analysis of our patient group demonstrated an absence of correlation between the severity of degenerative central spinal stenosis on neuroimaging and the intensity of its clinical manifestation. Pearson's correlation coefficient and Spearman's rank correlation between the variable evaluating clinical signs (DDIL in %) and the neuroimaging measures (antero-posterior diameter of the spinal canal, interfacet interval, and cross-sectional area of the spinal canal) did not differ significantly from zero (p > 0.2).

  12. A Unified Mathematical Approach to Image Analysis.

    DTIC Science & Technology

    1987-08-31

    describes four instances of the paradigm in detail. Directions for ongoing and future research are also indicated. Keywords: Image processing; Algorithms; Segmentation; Boundary detection; Tomography; Global image analysis.

  13. Multifluorescence 2D gel imaging and image analysis.

    PubMed

    Vormbrock, Ingo; Hartwig, Sonja; Lehr, Stefan

    2012-01-01

    Although image acquisition and analysis are crucial steps within the multifluorescence two-dimensional gel electrophoresis workflow, some basics are frequently not carried out with the necessary diligence. This chapter should help to prevent easily avoidable failures during imaging and image preparation for comparative protein analysis.

  14. UV imaging in pharmaceutical analysis.

    PubMed

    Østergaard, Jesper

    2017-08-01

    UV imaging provides spatially and temporally resolved absorbance measurements, which are highly useful in pharmaceutical analysis. Commercial UV imaging instrumentation was originally developed as a detector for separation sciences, but the main use is in the area of in vitro dissolution and release testing studies. The review covers the basic principles of the technology and summarizes the main applications in relation to intrinsic dissolution rate determination, excipient compatibility studies and in vitro release characterization of drug substances and vehicles intended for parenteral administration. UV imaging has potential for providing new insights to drug dissolution and release processes in formulation development by real-time monitoring of swelling, precipitation, diffusion and partitioning phenomena. Limitations of current instrumentation are discussed and a perspective to new developments and opportunities given as new instrumentation is emerging. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Minimizing impacts of land use change on ecosystem services using multi-criteria heuristic analysis.

    PubMed

    Keller, Arturo A; Fournier, Eric; Fox, Jessica

    2015-06-01

    Development of natural landscapes to support human activities impacts the capacity of the landscape to provide ecosystem services. Typically, several ecosystem services are impacted at a single development site and various footprint scenarios are possible, thus a multi-criteria analysis is needed. Restoration potential should also be considered for the area surrounding the permanent impact site. The primary objective of this research was to develop a heuristic approach to analyze multiple criteria (e.g. impacts to various ecosystem services) in a spatial configuration with many potential development sites. The approach was to: (1) quantify the magnitude of terrestrial ecosystem service (biodiversity, carbon sequestration, nutrient and sediment retention, and pollination) impacts associated with a suite of land use change scenarios using the InVEST model; (2) normalize results across categories of ecosystem services to allow cross-service comparison; (3) apply the multi-criteria heuristic algorithm to select sites with the least impact to ecosystem services, including a spatial criterion (separation between sites). As a case study, the multi-criteria impact minimization algorithm was applied to InVEST output to select 25 potential development sites out of 204 possible locations (selected by other criteria) within a 24,000 ha property. This study advanced a generally applicable spatial multi-criteria approach for 1) considering many land use footprint scenarios, 2) balancing impact decisions across a suite of ecosystem services, and 3) determining the restoration potential of ecosystem services after impacts.
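
    A minimal sketch of the normalise-then-select idea outlined above, with synthetic impact scores and coordinates standing in for model output; the site counts, weights and separation distance are illustrative assumptions, not the study's values:

        import numpy as np

        rng = np.random.default_rng(0)
        impacts = rng.random((204, 5))           # 204 candidate sites x 5 ecosystem services
        coords = rng.random((204, 2)) * 10_000   # site coordinates in metres

        # 1) normalise each service to [0, 1] so impacts are comparable across services
        norm = (impacts - impacts.min(axis=0)) / (np.ptp(impacts, axis=0) + 1e-12)
        score = norm.sum(axis=1)                 # equal weights across services

        # 2) greedy heuristic: pick the lowest-impact sites subject to a minimum
        #    separation distance between selected sites (the spatial criterion)
        def select_sites(score, coords, n_sites=25, min_sep=500.0):
            chosen = []
            for idx in np.argsort(score):
                if all(np.linalg.norm(coords[idx] - coords[j]) >= min_sep for j in chosen):
                    chosen.append(idx)
                if len(chosen) == n_sites:
                    break
            return chosen

        selected = select_sites(score, coords)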

  16. Selecting Potential Targetable Biomarkers for Imaging Purposes in Colorectal Cancer Using TArget Selection Criteria (TASC): A Novel Target Identification Tool.

    PubMed

    van Oosten, Marleen; Crane, Lucia Ma; Bart, Joost; van Leeuwen, Fijs W; van Dam, Gooitzen M

    2011-04-01

    Peritoneal carcinomatosis (PC) of colorectal origin is associated with a poor prognosis. However, cytoreductive surgery combined with hyperthermic intraperitoneal chemotherapy is available for a selected group of PC patients, which significantly increases overall survival rates up to 30%. As a consequence, there is substantial room for improvement. Tumor targeting is expected to improve the treatment efficacy of colorectal cancer (CRC) further through 1) more sensitive preoperative tumor detection, thus reducing overtreatment; 2) better intraoperative detection and surgical elimination of residual disease using tumor-specific intraoperative imaging; and 3) tumor-specific targeted therapeutics. This review focuses, in particular, on the development of tumor-targeted imaging agents. A large number of biomarkers are known to be upregulated in CRC. However, to date, no validated criteria have been described for the selection of the most promising biomarkers for tumor targeting. Such a scoring system might improve the selection of the correct biomarker for imaging purposes. In this review, we present the TArget Selection Criteria (TASC) scoring system for selection of potential biomarkers for tumor-targeted imaging. By applying TASC to biomarkers for CRC, we identified seven biomarkers (carcinoembryonic antigen, CXC chemokine receptor 4, epidermal growth factor receptor, epithelial cell adhesion molecule, matrix metalloproteinases, mucin 1, and vascular endothelial growth factor A) that seem most suitable for tumor-targeted imaging applications in colorectal cancer. Further cross-validation studies in CRC and other tumor types are necessary to establish its definitive value.

  17. A Survey of Enterprise Architecture Analysis Using Multi Criteria Decision Making Models (MCDM)

    NASA Astrophysics Data System (ADS)

    Zia, Mehmooda Jabeen; Azam, Farooque; Allauddin, Maria

    System design has become very important for software production due to the continuous increase in the size and complexity of software systems. Building the architecture of systems such as large enterprises is a complex design activity, so selecting the correct architecture is a critical issue in the software engineering domain. Moreover, enterprise architecture selection is a multi-criteria decision making problem in which different goals and objectives must be taken into consideration. The field of enterprise architecture analysis has progressed from the application of linear weighting, through integer and linear programming, to multi-criteria decision making (MCDM) models. In this paper we survey two MCDM models (AHP and ANP) to determine to what extent they have been used to make decisions in complex enterprise architecture analysis. We found that by using the ANP model, decision makers in an enterprise can make more precise and suitable decisions when selecting enterprise architecture styles.
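
    As background, a minimal sketch of how AHP derives priority weights from a pairwise comparison matrix; the criteria and judgments below are hypothetical, not taken from the surveyed studies:

        import numpy as np

        # Pairwise comparisons of three hypothetical criteria on Saaty's 1-9 scale
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                         # priority weights of the criteria

        lam_max = eigvals.real[k]
        n = A.shape[0]
        ci = (lam_max - n) / (n - 1)         # consistency index
        ri = 0.58                            # Saaty's random index for n = 3
        cr = ci / ri                         # consistency ratio; < 0.1 is usually acceptable
        print(w, cr)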

  18. A computational image analysis glossary for biologists.

    PubMed

    Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M

    2012-09-01

    Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies.

  19. 75 FR 80544 - NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ... COMMISSION NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the..., ``Confirmatory Thermal-Hydraulic Analysis to Support Specific Success Criteria in the Standardized Plant Analysis... . SUPPLEMENTARY INFORMATION: NUREG-1953, ``Confirmatory Thermal-Hydraulic Analysis to Support Specific...

  20. Do choosing wisely tools meet criteria for patient decision aids? A descriptive analysis of patient materials.

    PubMed

    Légaré, France; Hébert, Jessica; Goh, Larissa; Lewis, Krystina B; Leiva Portocarrero, Maria Ester; Robitaille, Hubert; Stacey, Dawn

    2016-08-26

    Choosing Wisely is a remarkable physician-led campaign to reduce unnecessary or harmful health services. Some of the literature identifies Choosing Wisely as a shared decision-making approach. We evaluated the patient materials developed by Choosing Wisely Canada to determine whether they meet the criteria for shared decision-making tools known as patient decision aids. Descriptive analysis of all Choosing Wisely Canada patient materials. In May 2015, we selected all Choosing Wisely Canada patient materials from its official website. Four team members independently extracted characteristics of the English materials using the International Patient Decision Aid Standards (IPDAS) modified 16-item minimum criteria for qualifying and certifying patient decision aids. The research team discussed discrepancies between data extractors and reached a consensus. Descriptive analysis was conducted. Of the 24 patient materials assessed, 12 were about treatments, 11 were about screening and 1 was about prevention. The median score for patient materials using IPDAS criteria was 10/16 (range: 8-11) for screening topics and 6/12 (range: 6-9) for prevention and treatment topics. Commonly missed criteria were stating the decision (21/24 did not), providing balanced information on option benefits/harms (24/24 did not), citing evidence (24/24 did not) and updating policy (24/24 did not). Out of 24 patient materials, only 2 met the 6 IPDAS criteria to qualify as patient decision aids, and neither of these 2 met the 6 certifying criteria. Patient materials developed by Choosing Wisely Canada do not meet the IPDAS minimal qualifying or certifying criteria for patient decision aids. Modifications to the Choosing Wisely Canada patient materials would help to ensure that they qualify as patient decision aids and thus as more effective shared decision-making tools.

  1. Do choosing wisely tools meet criteria for patient decision aids? A descriptive analysis of patient materials

    PubMed Central

    Légaré, France; Hébert, Jessica; Goh, Larissa; Lewis, Krystina B; Leiva Portocarrero, Maria Ester; Robitaille, Hubert; Stacey, Dawn

    2016-01-01

    Objectives Choosing Wisely is a remarkable physician-led campaign to reduce unnecessary or harmful health services. Some of the literature identifies Choosing Wisely as a shared decision-making approach. We evaluated the patient materials developed by Choosing Wisely Canada to determine whether they meet the criteria for shared decision-making tools known as patient decision aids. Design Descriptive analysis of all Choosing Wisely Canada patient materials. Data source In May 2015, we selected all Choosing Wisely Canada patient materials from its official website. Main outcomes and measures Four team members independently extracted characteristics of the English materials using the International Patient Decision Aid Standards (IPDAS) modified 16-item minimum criteria for qualifying and certifying patient decision aids. The research team discussed discrepancies between data extractors and reached a consensus. Descriptive analysis was conducted. Results Of the 24 patient materials assessed, 12 were about treatments, 11 were about screening and 1 was about prevention. The median score for patient materials using IPDAS criteria was 10/16 (range: 8–11) for screening topics and 6/12 (range: 6–9) for prevention and treatment topics. Commonly missed criteria were stating the decision (21/24 did not), providing balanced information on option benefits/harms (24/24 did not), citing evidence (24/24 did not) and updating policy (24/24 did not). Out of 24 patient materials, only 2 met the 6 IPDAS criteria to qualify as patient decision aids, and neither of these 2 met the 6 certifying criteria. Conclusions Patient materials developed by Choosing Wisely Canada do not meet the IPDAS minimal qualifying or certifying criteria for patient decision aids. Modifications to the Choosing Wisely Canada patient materials would help to ensure that they qualify as patient decision aids and thus as more effective shared decision-making tools. PMID:27566638

  2. Image analysis in medical imaging: recent advances in selected examples

    PubMed Central

    Dougherty, G

    2010-01-01

    Medical imaging has developed into one of the most important fields within scientific imaging due to the rapid and continuing progress in computerised medical image visualisation and advances in analysis methods and computer-aided diagnosis. Several research applications are selected to illustrate the advances in image analysis algorithms and visualisation. Recent results, including previously unpublished data, are presented to illustrate the challenges and ongoing developments. PMID:21611048

  3. [Evaluation of dental plaque by quantitative digital image analysis system].

    PubMed

    Huang, Z; Luan, Q X

    2016-04-18

    To analyze plaque staining images using image analysis software, to verify the maneuverability, practicability and repeatability of this technique, and to evaluate the influence of different plaque stains. In the study, 30 volunteers were enrolled from the new dental students of Peking University Health Science Center in accordance with the inclusion criteria. Digital images of the anterior teeth were acquired after plaque staining according to a standardized photographic protocol. Image analysis was performed using Image Pro Plus 7.0, and the Quigley-Hein plaque indexes of the anterior teeth were evaluated. The plaque stain area percentage and the corresponding dental plaque index were highly correlated, with a Spearman correlation coefficient of 0.776 (P<0.01). Intraclass correlation coefficients of the tooth area and plaque area calculated by two researchers using the software were 0.956 and 0.930 (P<0.01). The Bland-Altman plot showed only a few points outside the 95% limits of agreement. Analysis of images obtained with different plaque stains showed that the difference in tooth area measurements was not significant, while the difference in plaque area measurements was significant (P<0.01). This method is easy to operate and control, is highly correlated with the traditional plaque index through the calculated percentage of plaque area, and has good reproducibility. The plaque staining method has little effect on image segmentation results; the stain most sensitive for image analysis is recommended.

  4. Uncooled thermal imaging and image analysis

    NASA Astrophysics Data System (ADS)

    Wang, Shiyun; Chang, Benkang; Yu, Chunyu; Zhang, Junju; Sun, Lianjun

    2006-09-01

    A thermal imager converts differences in temperature into differences in electrical signal level, so it can be applied in medicine, for example to estimate blood flow speed and vessel location [1], to assess pain [2], and so on. As uncooled focal plane array (UFPA) technology matures, some simple medical functions can be performed with an uncooled thermal imager, for example rapid fever screening during the SARS outbreak. This requires stable imaging performance and sufficiently high spatial and temperature resolution. Among the performance parameters, noise equivalent temperature difference (NETD) is often used as the overall figure of merit. The 320 x 240 α-Si microbolometer UFPA is currently in wide use because of its stable performance and sensitive response. In this paper, the NETD of a UFPA and the relation between NETD and temperature are studied; several key parameters that affect NETD are listed and a general formula is presented. Finally, images from this kind of thermal imager are analyzed with the aim of detecting persons with fever, and an applied thermal image intensification method is introduced.

  5. Quantitative multi-image analysis for biomedical Raman spectroscopic imaging.

    PubMed

    Hedegaard, Martin A B; Bergholt, Mads S; Stevens, Molly M

    2016-05-01

    Imaging by Raman spectroscopy enables unparalleled label-free insights into cell and tissue composition at the molecular level. With established approaches limited to single image analysis, there are currently no general guidelines or consensus on how to quantify biochemical components across multiple Raman images. Here, we describe a broadly applicable methodology for the combination of multiple Raman images into a single image for analysis. This is achieved by removing image specific background interference, unfolding the series of Raman images into a single dataset, and normalisation of each Raman spectrum to render comparable Raman images. Multivariate image analysis is finally applied to derive the contributing 'pure' biochemical spectra for relative quantification. We present our methodology using four independently measured Raman images of control cells and four images of cells treated with strontium ions from substituted bioactive glass. We show that the relative biochemical distribution per area of the cells can be quantified. In addition, using k-means clustering, we are able to discriminate between the two cell types over multiple Raman images. This study shows a streamlined quantitative multi-image analysis tool for improving cell/tissue characterisation and opens new avenues in biomedical Raman spectroscopic imaging. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
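
    A minimal sketch of the unfold-normalise-cluster idea described above, with random arrays standing in for real Raman images; the crude per-spectrum offset subtraction below is only a stand-in for the image-specific background correction the authors describe:

        import numpy as np
        from sklearn.cluster import KMeans

        images = [np.random.rand(64, 64, 500) for _ in range(4)]  # (rows, cols, wavenumbers)

        unfolded = []
        for img in images:
            spectra = img.reshape(-1, img.shape[-1])                  # unfold to (pixels, wavenumbers)
            spectra = spectra - spectra.min(axis=1, keepdims=True)    # crude background/offset removal
            spectra = spectra / np.linalg.norm(spectra, axis=1, keepdims=True)  # per-spectrum normalisation
            unfolded.append(spectra)

        data = np.vstack(unfolded)        # single dataset combining all Raman images
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(data)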

  6. The use of multi-criteria decision analysis to tackle waste management problems: a literature review.

    PubMed

    Achillas, Charisios; Moussiopoulos, Nicolas; Karagiannidis, Avraam; Banias, Georgias; Perkoulidis, George

    2013-02-01

    Problems in waste management have become more and more complex during recent decades. The increasing volumes of waste produced and growing social and environmental consciousness are prominent drivers pushing environmental managers towards a sustainable waste management scheme. In practice, however, many factors, influences and criteria, often mutually conflicting, are involved in finding solutions for real-life applications. This paper presents a review of the literature on multi-criteria decision aiding in waste management problems for all reported waste streams. Despite limitations, which are clearly stated, most of the work published in this field is reviewed. The present review aims to provide environmental managers and decision-makers with a thorough list of practical applications of multi-criteria decision analysis techniques used to solve real-life waste management problems, as well as the criteria most often employed in such applications according to the nature of the problem under study. Moreover, the paper explores the advantages and disadvantages of using multi-criteria decision analysis techniques in waste management problems in comparison to other available alternatives.

  7. Multi-criteria analysis of site selection for groundwater recharge with treated municipal wastewater.

    PubMed

    Ahmadi, Mohammad Mehdi; Mahdavirad, Hadi; Bakhtiari, Bahram

    2017-08-01

    Geographic information systems (GIS) and remote sensing techniques are used as a decision support system to identify potential soil aquifer treatment (SAT) sites for groundwater recharge of the Kerman aquifer, located in the southeast of Iran. These sites are identified using a single-objective multi-criteria analysis. To ensure technical feasibility, environmental sustainability, social acceptability and economic viability, a number of criteria are considered for the site selection. The criteria selected for the different variables and their acceptable ranges are based on standards published in national and international guidelines and technical documents. Multi-criteria evaluation was performed by combining all produced thematic maps with the weighted index overlay method in order to select sites meeting all the criteria. The resulting map shows that potential sites are located in the north, southwest and southeast of the study area. Considering field observations, a potential site located in the southwest of the study area is proposed as the most suitable site for SAT. The result indicates that the study area has sufficient suitable space for groundwater recharge with treated wastewater.
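
    A minimal sketch of a weighted index overlay of the kind used above, with random reclassified rasters and illustrative weights; the criteria names and weights are assumptions, not those of the study:

        import numpy as np

        # Hypothetical reclassified suitability rasters (scores 1-5) for three criteria
        depth = np.random.randint(1, 6, size=(100, 100))   # e.g. depth to groundwater
        perm = np.random.randint(1, 6, size=(100, 100))    # e.g. soil permeability
        dist = np.random.randint(1, 6, size=(100, 100))    # e.g. distance to treatment plant

        weights = {"depth": 0.5, "perm": 0.3, "dist": 0.2}  # must sum to 1
        suitability = (weights["depth"] * depth
                       + weights["perm"] * perm
                       + weights["dist"] * dist)

        # exclude cells that violate any hard constraint (e.g. protected areas)
        exclusion = np.zeros_like(depth, dtype=bool)
        suitability = np.where(exclusion, np.nan, suitability)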

  8. Imaging analysis of LDEF craters

    NASA Technical Reports Server (NTRS)

    Radicatidibrozolo, F.; Harris, D. W.; Chakel, J. A.; Fleming, R. H.; Bunch, T. E.

    1991-01-01

    Two small craters in Al from the Long Duration Exposure Facility (LDEF) experiment tray A11E00F (no. 74, 119 micron diameter and no. 31, 158 micron diameter) were analyzed using Auger electron spectroscopy (AES), time-of-flight secondary ion mass spectroscopy (TOF-SIMS), low voltage scanning electron microscopy (LVSEM), and SEM energy dispersive spectroscopy (EDS). High resolution images and sensitive elemental and molecular analysis were obtained with this combined approach. The results of these analyses are presented.

  9. Validity of Criteria-Based Content Analysis (CBCA) at Trial in Free-Narrative Interviews

    ERIC Educational Resources Information Center

    Roma, Paolo; San Martini, Pietro; Sabatello, Ugo; Tatarelli, Roberto; Ferracuti, Stefano

    2011-01-01

    Objective: The reliability of child witness testimony in sexual abuse cases is often controversial, and few tools are available. Criteria-Based Content Analysis (CBCA) is a widely used instrument for evaluating psychological credibility in cases of suspected child sexual abuse. Only few studies have evaluated CBCA scores in children suspected of…

  10. Fuels planning: science synthesis and integration; economic uses fact sheet 06: selection criteria analysis

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    Confidence in decisionmaking can often come from knowing if others in similar circumstances would choose the same management strategy. Researchers at the USDA FS Pacific Northwest Research Station and the University of Saskatchewan have developed a Selection Criteria Analysis for answering this very question. This fact sheet discusses factors affecting the choice of...

  11. Using multi-criteria analysis of simulation models to understand complex biological systems

    Treesearch

    Maureen C. Kennedy; E. David. Ford

    2011-01-01

    Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
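
    A minimal sketch of Pareto (non-dominated) filtering over multiple output criteria, as one might apply when comparing simulation runs; the error matrix here is synthetic:

        import numpy as np

        def pareto_front(errors):
            """Indices of non-dominated rows; errors[i, k] is the error of run i
            on criterion k (lower is better)."""
            n = errors.shape[0]
            keep = np.ones(n, dtype=bool)
            for i in range(n):
                others = np.delete(np.arange(n), i)
                dominates_i = (np.all(errors[others] <= errors[i], axis=1)
                               & np.any(errors[others] < errors[i], axis=1))
                keep[i] = not dominates_i.any()
            return np.where(keep)[0]

        runs = np.random.rand(50, 3)          # 50 simulation runs, 3 output criteria
        front = pareto_front(runs)            # runs worth keeping for further analysis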

  12. Logical Criteria Applied in Writing and in Editing by Text Analysis.

    ERIC Educational Resources Information Center

    Mandersloot, Wim G. B.

    1996-01-01

    Argues that technical communication editing is most effective if it deals with structure first, and that structure deficiencies can be detected by applying a range of logical analysis criteria to each text part. Concludes that lists, headings, classifications, and organograms must comply with the laws of categorization and relevant logical…

  13. Planning applications in image analysis

    NASA Technical Reports Server (NTRS)

    Boddy, Mark; White, Jim; Goldman, Robert; Short, Nick, Jr.

    1994-01-01

    We describe two interim results from an ongoing effort to automate the acquisition, analysis, archiving, and distribution of satellite earth science data. Both results are applications of Artificial Intelligence planning research to the automatic generation of processing steps for image analysis tasks. First, we have constructed a linear conditional planner (CPed), used to generate conditional processing plans. Second, we have extended an existing hierarchical planning system to make use of durations, resources, and deadlines, thus supporting the automatic generation of processing steps in time and resource-constrained environments.

  14. Grid computing in image analysis.

    PubMed

    Kayser, Klaus; Görtler, Jürgen; Borkenfeld, Stephan; Kayser, Gian

    2011-01-01

    Diagnostic surgical pathology or tissue-based diagnosis still remains the most reliable and specific diagnostic medical procedure. The development of whole slide scanners permits the creation of virtual slides and work on so-called virtual microscopes. In addition to interactive work on virtual slides, approaches have been reported that introduce automated virtual microscopy, which is composed of several tools focusing on quite different tasks. These include evaluation of image quality and image standardization, analysis of potentially useful thresholds for object detection and identification (segmentation), dynamic segmentation procedures, adjustable magnification to optimize feature extraction, and texture analysis including image transformation and evaluation of elementary primitives. Grid technology seems to possess all features to efficiently target and control the specific tasks of image information and detection in order to obtain a detailed and accurate diagnosis. Grid technology is based upon so-called nodes that are linked together and share certain communication rules in using open standards. Their number and functionality can vary according to the needs of a specific user at a given point in time. When implementing automated virtual microscopy with Grid technology, all of the five different Grid functions have to be taken into account, namely 1) computation services, 2) data services, 3) application services, 4) information services, and 5) knowledge services. Although all mandatory tools of automated virtual microscopy can be implemented in a closed or standardized open system, Grid technology offers a new dimension to acquire, detect, classify, and distribute medical image information, and to assure quality in tissue-based diagnosis.

  15. Criteria for the addition of prone imaging to myocardial perfusion single-photon emission computed tomography for inferior wall.

    PubMed

    Nakaya, Koji; Onoguchi, Masahisa; Nishimura, Yoshihiro; Kiso, Keisuke; Otsuka, Hideki; Nouno, Yoshifumi; Shibutani, Takayuki; Yasuda, Eisuke

    2017-09-01

    Myocardial perfusion single-photon emission computed tomography (SPECT) occasionally generates inferior-wall images in which it is unclear whether decreased uptake represents ischemia or infarction [right coronary artery (RCA) disease] or an attenuation artifact caused by the diaphragm; we often encounter this situation. Prone imaging is advantageous in differentiating RCA disease from attenuation artifacts. If decreased radioisotope accumulation is observed at a site where either RCA disease or an attenuation artifact could appear, a criterion for adding prone imaging should be implemented. We therefore evaluated the sites where RCA disease and attenuation artifacts are likely to appear and investigated the threshold of decreased accumulation at which prone imaging should be added. The patients in this study were divided into two groups: group A (20 patients), suspected of having attenuation artifacts caused by the diaphragm, and group B (14 patients), with RCA disease. Additional prone imaging was performed in all patients. We used a 20-segment quantitative perfusion SPECT polar map in the supine and prone positions to compare the percentage increase in thallium chloride (Tl) uptake in both groups, and investigated the percent uptake (%uptake) threshold of decreased accumulation in the inferior wall for the addition of prone imaging. After prone imaging, the highest %uptake in group A was found in segments 3, 4, 5, and 10. Attenuation artifacts from the diaphragm were easy to detect in segments 3, 4, 5, and 10, and we set the supine %uptake thresholds for adding prone imaging at 62, 61, 71, and 76%, respectively. A decreased %uptake in segments 3, 4, 5, and 10 on supine imaging is presumed to result from an attenuation artifact or RCA disease. We established criteria for adding prone imaging in patients with decreased accumulation in the inferior wall on supine imaging.

  16. The economics of project analysis: Optimal investment criteria and methods of study

    NASA Technical Reports Server (NTRS)

    Scriven, M. C.

    1979-01-01

    Insight is provided toward the development of an optimal program for investment analysis of project proposals offering commercial potential, and of its components. This involves a critique of economic investment criteria viewed in relation to the requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to the analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to the investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.

  17. Validity of criteria-based content analysis (CBCA) at trial in free-narrative interviews.

    PubMed

    Roma, Paolo; Martini, Pietro San; Sabatello, Ugo; Tatarelli, Roberto; Ferracuti, Stefano

    2011-08-01

    The reliability of child witness testimony in sexual abuse cases is often controversial, and few tools are available. Criteria-Based Content Analysis (CBCA) is a widely used instrument for evaluating psychological credibility in cases of suspected child sexual abuse. Only a few studies have evaluated CBCA scores in children suspected of being sexually abused. We designed this study to investigate the reliability of CBCA in discriminating allegations of child sexual abuse during court hearings, by comparing CBCA results with the court's final, unappealable sentence. We then investigated whether CBCA scores correlated with age, and whether some criteria were better than others in distinguishing cases of confirmed and unconfirmed abuse. From a pool of 487 child sexual abuse cases, confirmed and unconfirmed cases were selected using various criteria including child IQ ≥ 70, agreement between the final trial outcome and the opinion of 3 experts, presence of at least 1 independent validating informative component in cases of confirmed abuse, and absence of suggestive questions during the child's testimonies. This screening yielded a study sample of 60 confirmed and 49 unconfirmed cases. The 14-item version of CBCA was applied to child witness testimony by 2 expert raters. Of the 14 criteria tested, 12 achieved satisfactory inter-rater agreement (Maxwell's Random Error). Analyses of covariance, with case group (confirmed vs. unconfirmed) and gender as independent variables and age as a covariate, showed no main effect of gender. Analyses of the interaction showed that the simple effects of abuse were significant in both sexes. Nine CBCA criteria were satisfied more often among confirmed than unconfirmed cases; seven criteria increased with age. CBCA scores distinguish between confirmed and unconfirmed cases. The criteria that distinguish best between the 2 groups are Quantity of Details, Interactions, and Subjective Experience. CBCA scores correlate positively with age, and

  18. Determining optimal medical image compression: psychometric and image distortion analysis

    PubMed Central

    2012-01-01

    Background Storage issues and bandwidth over networks have led to a need to optimally compress medical imaging files while leaving clinical image quality uncompromised. Methods To determine the range of clinically acceptable medical image compression across multiple modalities (CT, MR, and XR), we performed psychometric analysis of image distortion thresholds using physician readers and also performed subtraction analysis of medical image distortion by varying degrees of compression. Results When physician readers were asked to determine the threshold of compression beyond which images were clinically compromised, the mean image distortion threshold was a JPEG Q value of 23.1 ± 7.0. In Receiver-Operator Characteristics (ROC) plot analysis, compressed images could not be reliably distinguished from original images at any compression level between Q = 50 and Q = 95. Below this range, some readers were able to discriminate the compressed and original images, but high sensitivity and specificity for this discrimination was only encountered at the lowest JPEG Q value tested (Q = 5). Analysis of directly measured magnitude of image distortion from subtracted image pairs showed that the relationship between JPEG Q value and degree of image distortion underwent an upward inflection in the region of the two thresholds determined psychometrically (approximately Q = 25 to Q = 50), with 75 % of the image distortion occurring between Q = 50 and Q = 1. Conclusion It is possible to apply lossy JPEG compression to medical images without compromise of clinical image quality. Modest degrees of compression, with a JPEG Q value of 50 or higher (corresponding approximately to a compression ratio of 15:1 or less), can be applied to medical images while leaving the images indistinguishable from the original. PMID:22849336

  19. Determining optimal medical image compression: psychometric and image distortion analysis.

    PubMed

    Flint, Alexander C

    2012-07-31

    Storage issues and bandwidth over networks have led to a need to optimally compress medical imaging files while leaving clinical image quality uncompromised. To determine the range of clinically acceptable medical image compression across multiple modalities (CT, MR, and XR), we performed psychometric analysis of image distortion thresholds using physician readers and also performed subtraction analysis of medical image distortion by varying degrees of compression. When physician readers were asked to determine the threshold of compression beyond which images were clinically compromised, the mean image distortion threshold was a JPEG Q value of 23.1 ± 7.0. In Receiver-Operator Characteristics (ROC) plot analysis, compressed images could not be reliably distinguished from original images at any compression level between Q = 50 and Q = 95. Below this range, some readers were able to discriminate the compressed and original images, but high sensitivity and specificity for this discrimination was only encountered at the lowest JPEG Q value tested (Q = 5). Analysis of directly measured magnitude of image distortion from subtracted image pairs showed that the relationship between JPEG Q value and degree of image distortion underwent an upward inflection in the region of the two thresholds determined psychometrically (approximately Q = 25 to Q = 50), with 75 % of the image distortion occurring between Q = 50 and Q = 1. It is possible to apply lossy JPEG compression to medical images without compromise of clinical image quality. Modest degrees of compression, with a JPEG Q value of 50 or higher (corresponding approximately to a compression ratio of 15:1 or less), can be applied to medical images while leaving the images indistinguishable from the original.
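
    A minimal sketch of the compress-and-subtract measurement described above, using Pillow; the file name in the usage comment is a placeholder:

        import io
        import numpy as np
        from PIL import Image

        def distortion_at_quality(img: Image.Image, q: int) -> float:
            """Mean absolute pixel difference between an image and its JPEG copy at quality q."""
            buf = io.BytesIO()
            img.convert("L").save(buf, format="JPEG", quality=q)
            buf.seek(0)
            compressed = Image.open(buf)
            a = np.asarray(img.convert("L"), dtype=float)
            b = np.asarray(compressed, dtype=float)
            return float(np.abs(a - b).mean())

        # Hypothetical usage: sweep JPEG quality and record the distortion curve
        # img = Image.open("slice.png")
        # curve = {q: distortion_at_quality(img, q) for q in range(5, 100, 5)}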

  20. Can medical criteria settle priority-setting debates? The need for ethical analysis.

    PubMed

    Dickenson, D L

    1999-01-01

    Medical criteria rooted in evidence-based medicine are often seen as a value-neutral 'trump card' which puts paid to any further debate about setting priorities for treatment. On this argument, doctors should stop providing treatment at the point when it becomes medically futile, and that is also the threshold at which the health purchaser should stop purchasing. This paper offers three kinds of ethical criteria as a counterweight to analysis based solely on medical criteria. The first set of arguments concerns futility, probability and utility; the second, justice and fairness; the third, consent and competence. The argument is illustrated by two recent case studies about futility and priority-setting: the U.S. example of 'Baby Ryan' and the U.K. case of 'Child B'.

  1. Automated image analysis of uterine cervical images

    NASA Astrophysics Data System (ADS)

    Li, Wenjing; Gu, Jia; Ferris, Daron; Poirson, Allen

    2007-03-01

    Cervical cancer is the second most common cancer among women worldwide and the leading cause of cancer mortality of women in developing countries. If detected early and treated adequately, cervical cancer can be virtually prevented. Cervical precursor lesions and invasive cancer exhibit certain morphologic features that can be identified during a visual inspection exam. Digital imaging technologies allow us to assist the physician with a Computer-Aided Diagnosis (CAD) system. In colposcopy, epithelium that turns white after application of acetic acid is called acetowhite epithelium. Acetowhite epithelium is one of the major diagnostic features observed in detecting cancer and pre-cancerous regions. Automatic extraction of acetowhite regions from cervical images has been a challenging task due to specular reflection, various illumination conditions, and most importantly, large intra-patient variation. This paper presents a multi-step acetowhite region detection system to analyze the acetowhite lesions in cervical images automatically. First, the system calibrates the color of the cervical images to be independent of screening devices. Second, the anatomy of the uterine cervix is analyzed in terms of cervix region, external os region, columnar region, and squamous region. Third, the squamous region is further analyzed and subregions based on three levels of acetowhite are identified. The extracted acetowhite regions are accompanied by color scores to indicate the different levels of acetowhite. The system has been evaluated on data from 40 human subjects and demonstrates high correlation with experts' annotations.

  2. An analysis of the criteria used to diagnose children with Nonverbal Learning Disability (NLD).

    PubMed

    Mammarella, Irene C; Cornoldi, Cesare

    2014-01-01

    Based on a review of the literature, the diagnostic criteria used for children with nonverbal learning disabilities (NLD) were identified as follows: (a) low visuospatial intelligence; (b) discrepancy between verbal and visuospatial intelligence; (c) visuoconstructive and fine-motor coordination skills; (d) visuospatial memory tasks; (e) reading better than mathematical achievement; and (f) socioemotional skills. An analysis of effect sizes was used to investigate the strength of criteria for diagnosing NLD, considering 35 empirical studies published from January 1980 to February 2011. Overall, our results showed that the most important criteria for distinguishing children with NLD from controls were as follows: low visuospatial intelligence with relatively good verbal intelligence, visuoconstructive and fine-motor coordination impairments, and good reading decoding together with low math performance. Deficits in visuospatial memory and social skills were also present. A preliminary set of criteria for diagnosing NLD was developed on these grounds. It was concluded, however, that, although some consensus is emerging, further research is needed to definitively establish shared diagnostic criteria for children with NLD.

  3. Validating retinal fundus image analysis algorithms: issues and a proposal.

    PubMed

    Trucco, Emanuele; Ruggeri, Alfredo; Karnowski, Thomas; Giancardo, Luca; Chaum, Edward; Hubschman, Jean Pierre; Al-Diri, Bashir; Cheung, Carol Y; Wong, Damon; Abràmoff, Michael; Lim, Gilbert; Kumar, Dinesh; Burlina, Philippe; Bressler, Neil M; Jelinek, Herbert F; Meriaudeau, Fabrice; Quellec, Gwénolé; Macgillivray, Tom; Dhillon, Bal

    2013-05-01

    This paper concerns the validation of automatic retinal image analysis (ARIA) algorithms. For reasons of space and consistency, we concentrate on the validation of algorithms processing color fundus camera images, currently the largest section of the ARIA literature. We sketch the context (imaging instruments and target tasks) of ARIA validation, summarizing the main image analysis and validation techniques. We then present a list of recommendations focusing on the creation of large repositories of test data created by international consortia, easily accessible via moderated Web sites, including multicenter annotations by multiple experts, specific to clinical tasks, and capable of running submitted software automatically on the data stored, with clear and widely agreed-on performance criteria, to provide a fair comparison.

  4. Image analysis of dye stained patterns in soils

    NASA Astrophysics Data System (ADS)

    Bogner, Christina; Trancón y Widemann, Baltasar; Lange, Holger

    2013-04-01

    Quality of surface water and groundwater is directly affected by flow processes in the unsaturated zone. In general, it is difficult to measure or model water flow; indeed, parametrization of hydrological models is problematic and often no unique solution exists. To visualise flow patterns in soils directly, dye tracer studies can be done. These experiments provide images of stained soil profiles, and their evaluation demands knowledge in hydrology as well as in image analysis and statistics. First, these photographs are converted to binary images, classifying the pixels into dye-stained and non-stained ones. Then, some feature extraction is necessary to discern relevant hydrological information. In our study we propose to use several index functions to extract different (ideally complementary) features. We associate each image row with a feature vector (i.e. a certain number of image function values) and use these features to cluster the image rows and identify similar image areas. Because images of stained profiles might have different reasonable clusterings, we calculate multiple consensus clusterings. An expert can explore these different solutions and base his/her interpretation of predominant flow mechanisms on quantitative (objective) criteria. The complete workflow, from reading in binary images to the final clusterings, has been implemented in the free R system, a language and environment for statistical computing. The calculation of image indices is part of our own package Indigo; manipulation of binary images, clustering and visualization of results are done using either built-in facilities in R, additional R packages or the LaTeX system.
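
    A minimal sketch, in Python rather than the R workflow the authors describe, of computing a few per-row index functions on a binary stained-profile image and clustering the rows; the index functions and cluster count are illustrative choices:

        import numpy as np
        from sklearn.cluster import KMeans

        # binary stained/non-stained image: rows = depth, columns = width, 1 = stained
        binary = (np.random.rand(400, 300) > 0.6).astype(int)

        def row_features(row):
            coverage = row.mean()                                  # dye coverage of the row
            edges = np.diff(np.concatenate(([0], row, [0])))
            n_segments = int((edges == 1).sum())                   # number of stained segments
            mean_width = row.sum() / n_segments if n_segments else 0.0
            return [coverage, n_segments, mean_width]

        features = np.array([row_features(r) for r in binary])
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)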

  5. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis.

    PubMed

    Karlsson, Caroline S J; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W

    2017-08-18

    Inadequate infrastructure networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Determining natural-hazard-susceptible areas and incorporating them in the initial planning process may therefore reduce infrastructure damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in aggregate. The resulting estimates of susceptible areas were compared with those from weighted linear combination using equal weights and from the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning, despite its limitations regarding the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigation is needed from a natural hazard point of view, and to identify areas thought to have higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.

  6. Whole slide image with image analysis of atypical bile duct brushing: Quantitative features predictive of malignancy.

    PubMed

    Collins, Brian T; Weimholt, R Cody

    2015-01-01

    Whole slide images (WSIs) involve digitally capturing glass slides for microscopic computer-based viewing and these are amenable to quantitative image analysis. Bile duct (BD) brushing can show morphologic features that are categorized as indeterminate for malignancy. The study aims to evaluate quantitative morphologic features of atypical categories of BD brushing by WSI analysis for the identification of criteria predictive of malignancy. Over a 3-year period, BD brush specimens with indeterminate diagnostic categorization (atypical to suspicious) were subjected to WSI analysis. Ten well-visualized groups with morphologic atypical features were selected per case and had the quantitative analysis performed for group area, individual nuclear area, the number of nuclei per group, N:C ratio and nuclear size differential. There were 28 cases identified with 17 atypical and 11 suspicious. The average nuclear area was 63.7 µm² for atypical and 80.1 µm² for suspicious (+difference 16.4 µm²; P = 0.002). The nuclear size differential was 69.7 µm² for atypical and 88.4 µm² for suspicious (+difference 18.8 µm²; P = 0.009). An average nuclear area >70 µm² had a 3.2 risk ratio for suspicious categorization. The quantitative criteria findings as measured by image analysis on WSI showed that cases categorized as suspicious had more nuclear size pleomorphism (+18.8 µm²) and larger nuclei (+16.4 µm²) than those categorized as atypical. WSI with morphologic image analysis can demonstrate quantitative statistically significant differences between atypical and suspicious BD brushings and provide objective criteria that support the diagnosis of carcinoma.

  7. A water quality monitoring network design using fuzzy theory and multiple criteria analysis.

    PubMed

    Chang, Chia-Ling; Lin, You-Tze

    2014-10-01

    A proper water quality monitoring design is required in a watershed, particularly in a water resource protected area. As numerous factors can influence the water quality monitoring design, this study applies multiple criteria analysis to evaluate the suitability of the water quality monitoring design in the Taipei Water Resource Domain (TWRD) in northern Taiwan. Seven criteria, which comprise percentage of farmland area, percentage of built-up area, amount of non-point source pollution, green cover ratio, landslide area ratio, ratio of over-utilization on hillsides, and density of water quality monitoring stations, are selected in the multiple criteria analysis. The criteria are normalized and weighted. The weighted method is applied to score the subbasins. The density of water quality stations needs to be increased in priority in the subbasins with a higher score. The fuzzy theory is utilized to prioritize the need for a higher density of water quality monitoring stations. The results show that the need for more water quality stations in subbasin 2 in the Bei-Shih Creek Basin is much higher than those in the other subbasins. Furthermore, the existing water quality station in subbasin 2 requires maintenance. It is recommended that new water quality stations be built in subbasin 2.

  8. Statistical analysis of dynamic sequences for functional imaging

    NASA Astrophysics Data System (ADS)

    Kao, Chien-Min; Chen, Chin-Tu; Wernick, Miles N.

    2000-04-01

    Factor analysis of medical image sequences (FAMIS), which concerns the simultaneous identification, by statistical analysis, of homogeneous regions (factor images) and of the characteristic temporal variations (factors) inside these regions from a temporal sequence of images, is one of the major challenges in medical imaging. In this research, we contribute to this important area by proposing a two-step approach. First, we study the use of the noise-adjusted principal component (NAPC) analysis developed by Lee et al. for identifying the characteristic temporal variations in dynamic scans acquired by PET and MRI. NAPC allows us to effectively reject data noise and substantially reduce data dimension based on signal-to-noise ratio considerations. Subsequently, a simple spatial analysis based on the criteria of minimum spatial overlap and non-negativity of the factor images is applied to extract the factors and factor images. In our simulation study, our preliminary results indicate that the proposed approach can accurately identify the factor images. However, the factors are not completely separated.
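
    A minimal sketch of a noise-adjusted principal component step of the kind referred to above: the noise covariance is estimated from differences of neighbouring pixels (one common heuristic, assumed here), the frame dimension is whitened with respect to it, and an ordinary PCA follows. All arrays are synthetic:

        import numpy as np

        data = np.random.rand(4096, 30)        # (pixels, frames) of an unfolded dynamic scan

        # 1) estimate the noise covariance between frames from adjacent-pixel differences
        diffs = np.diff(data, axis=0)
        noise_cov = np.cov(diffs, rowvar=False) / 2.0   # /2: a difference doubles the noise variance

        # 2) whiten the frame dimension with respect to the noise, then do an ordinary PCA
        evals, evecs = np.linalg.eigh(noise_cov)
        whitener = evecs @ np.diag(1.0 / np.sqrt(np.maximum(evals, 1e-12))) @ evecs.T
        whitened = (data - data.mean(axis=0)) @ whitener

        w_evals, w_evecs = np.linalg.eigh(np.cov(whitened, rowvar=False))
        order = np.argsort(w_evals)[::-1]
        components = whitened @ w_evecs[:, order[:3]]   # leading noise-adjusted components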

  9. Multispectral Image Analysis of Hurricane Gilbert

    DTIC Science & Technology

    1989-05-19

    [Report documentation page fields garbled in extraction.] Multispectral Image Analysis of Hurricane Gilbert (unclassified); author: Kleespies, Thomas J. (GL/LYS). Recoverable fragments of the abstract refer to cloud top height, to assigning one component of the image to the red channel and similarly to the green and blue channels, to what multispectral image analysis can achieve, and to the scarcity of references to the human range of vision in selecting scenes for multispectral image analysis.

  10. Automated Microarray Image Analysis Toolbox for MATLAB

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Willse, Alan R.; Protic, Miroslava; Chandler, Darrell P.

    2005-09-01

    The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize the analysis of sets of microarray images. This tool provides several methods of identifying spots and quantifying spot statistics, as well as extensive diagnostic statistics and images to identify poor data quality or processing. The open nature of this software allows researchers to understand the algorithms used to provide intensity estimates and to modify them easily if desired.

  11. Statistical analysis of biophoton image

    NASA Astrophysics Data System (ADS)

    Wang, Susheng

    1998-08-01

    A photon counting imaging system has been developed to obtain ultra-weak bioluminescence images. Photon images of plants, animals and a human hand have been acquired. A biophoton image differs from a conventional image; in this paper, three characteristics of biophoton images are analyzed. On the basis of these characteristics, the detection probability and detection limit of the photon counting imaging system and the detection limit of the biophoton image are discussed. This work provides a scientific basis for experiment design and photon image processing.

  12. Using multi-criteria decision analysis to assess the vulnerability of drinking water utilities.

    PubMed

    Joerin, Florent; Cool, Geneviève; Rodriguez, Manuel J; Gignac, Marc; Bouchard, Christian

    2010-07-01

    Outbreaks of microbiological waterborne disease have increased governmental concern regarding the importance of drinking water safety. Considering the multi-barrier approach to safe drinking water may improve management decisions to reduce contamination risks. However, the application of this approach must consider numerous and diverse kinds of information simultaneously, which makes it difficult for authorities to apply the approach to decision making. For this reason, multi-criteria decision analysis can be helpful in applying the multi-barrier approach to vulnerability assessment. The goal of this study is to propose an approach based on a multi-criteria analysis method in order to rank drinking water utilities (DWUs) based on their vulnerability to microbiological contamination. This approach is illustrated with an application carried out on 28 DWUs supplied by groundwater in the Province of Québec, Canada. The multi-criteria analysis method chosen is the Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) methodology, which allows the assessment of a microbiological vulnerability indicator (MVI) for each DWU. Results are presented on a scale ranking DWUs from least to most vulnerable to contamination. MVI results are tested using a sensitivity analysis on barrier weights and are also compared with historical data on contamination at the utilities. The investigation demonstrates that MVI provides a good representation of the vulnerability of DWUs to microbiological contamination.

  13. Designing a Software for Flood Risk Assessment Based on Multi Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters, not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul flooded; the insurance sector received around 1,200 claim notices during that period and insurance companies had to pay a total of $40 million for claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million in claims. To address these kinds of problems, modern tools such as GIS and remote sensing should be utilized. In this study, software was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology and land use, all extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 meter spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing object-oriented nearest-neighbor classification through image segmentation of a SPOT 5 image dated 2010. All produced data were used as input to the Multi Criteria Decision Analysis (MCDA) part of the software. Criteria and their sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. Daily flood data were also collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service Curve Number (SCS-CN) method and used as input to the InfoDif part of the software. Obtained results were verified using ground truth data and it has been clearly

  14. Multi-criteria decision analysis in conservation planning: Designing conservation area networks in San Diego County

    NASA Astrophysics Data System (ADS)

    MacDonald, Garrick Richard

    applicable to the research project, however, it did exhibit a few limitations. Both the advantages and disadvantages of ConsNet should be considered before using ConsNet for future conservation planning projects. The research project is an example of a large data scenario undertaken with a multiple criteria decision analysis (MCDA) approach.

  15. Surgical treatment of acromegaly according to the 2010 remission criteria: systematic review and meta-analysis.

    PubMed

    Starnoni, Daniele; Daniel, Roy Thomas; Marino, Laura; Pitteloud, Nelly; Levivier, Marc; Messerer, Mahmoud

    2016-11-01

    In 2010, the Acromegaly Consensus Group revised the criteria for cure of acromegaly, and rates of surgical remission therefore need to be revised in light of these new thresholds. Two subgroups, consisting of patients with discordant GH and IGF-1 levels and patients in remission according to the 2000 criteria but not the 2010 criteria, have been reported after adenomectomy; for these subgroups the precise incidence and management have not been established. The objective of the study was to update rates of surgical remission and complications and to evaluate the incidence, management, and long-term outcome of the two previously described subgroups of patients. Systematic review and meta-analysis of surgical series that defined remission according to the 2010 biochemical criteria. We included 13 studies (1105 patients). The pooled rate of overall surgical remission was 54.8% (95% CI 44.4-65.2%), compared with 72.2% under the previous criteria. Remission was achieved in 77.9% (95% CI 68.1-87.6%) of microadenomas, 52.7% (95% CI 41-64.4%) of macroadenomas, 29% (95% CI 20.1-37.8%) of invasive and 68.8% (95% CI 60-77.6%) of non-invasive adenomas. Complication rates were 1.2% (95% CI 0.6-1.9%) for CSF leak, 1.3% (95% CI 0.6-2.1%) for permanent diabetes insipidus, 8.7% (95% CI 4.8-12.5%) for new anterior pituitary dysfunction and 0.6% (95% CI 0.1-1.1%) for severe intraoperative hemorrhage. We identified an intermediate group of patients, defined by: (1) remission according to one, but not the other, biochemical criterion (GH or IGF-1) of the 2010 criteria (14.3% and 47.1% of cases); (2) remission according to the 2000, but not the 2010, criteria (13.2-58.8% of cases). Two studies reported remission rates of 56.5% and 100%, respectively, in the two subgroups at long-term follow-up without adjuvant therapy. Overall remission with transsphenoidal surgery is achieved in ∼55% of patients. For the intermediate group of patients

  16. Ultrasonic image analysis and image-guided interventions

    PubMed Central

    Noble, J. Alison; Navab, Nassir; Becher, H.

    2011-01-01

    The fields of medical image analysis and computer-aided interventions deal with reducing the large volume of digital images (X-ray, computed tomography, magnetic resonance imaging (MRI), positron emission tomography and ultrasound (US)) to more meaningful clinical information using software algorithms. US is a core imaging modality employed in these areas, both in its own right and used in conjunction with the other imaging modalities. It is receiving increased interest owing to the recent introduction of three-dimensional US, significant improvements in US image quality, and better understanding of how to design algorithms which exploit the unique strengths and properties of this real-time imaging modality. This article reviews the current state of the art in US image analysis and its application in image-guided interventions. The article concludes by giving a perspective from clinical cardiology, one of the most advanced areas of clinical application of US image analysis, and by describing some probable future trends in this important area of ultrasonic imaging research. PMID:22866237

  17. Comparative analysis of buckling criteria for engineering structures. Single-degree-of-freedom systems

    NASA Astrophysics Data System (ADS)

    Stupishin, L.

    2017-05-01

    Buckling criteria for engineering structures are discussed in terms of systems with lumped parameters. Examples of formulating buckling problems, and of their solutions using the Timoshenko and Bryan criteria and the criterion of critical energy levels, are given. An analysis of the compared approaches, with their strengths and shortcomings for single-degree-of-freedom systems, is presented.
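
    As a point of reference, and purely as an assumed representative system (the paper's own examples are not reproduced here), the classic single-degree-of-freedom buckling problem of a rigid weightless bar of length L, pinned at its base with a rotational spring of stiffness k and loaded axially by a force P, gives:

```latex
% Small-rotation moment equilibrium about the pin:
%   overturning moment  P L \theta   vs.  restoring moment  k \theta
\[
  P\,L\,\theta \;=\; k\,\theta
  \quad\Longrightarrow\quad
  P_{\mathrm{cr}} \;=\; \frac{k}{L},
\]
% the first load at which a non-trivial (buckled) equilibrium exists.
```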

  18. Broader economic evaluation of disease management programs using multi-criteria decision analysis.

    PubMed

    Tsiachristas, Apostolos; Cramm, Jane Murray; Nieboer, Anna; Rutten-van Mölken, Maureen

    2013-07-01

    The aim of this paper is to develop a methodological framework to facilitate the application of Multi-Criteria Decision Analysis (MCDA) for a comprehensive economic evaluation of disease management programs (DMPs). We studied previously developed frameworks for the evaluation of DMPs and different MCDA methods, and we drew on practical field experience in the economic evaluation of DMPs and on personal discussions with stakeholders in chronic care. The framework includes different objectives and criteria that are relevant for the evaluation of DMPs, and indicators that can be used to measure how DMPs perform on these criteria, and it distinguishes between the development and implementation phases of DMPs. The objectives of DMPs are categorised into a) changes in the process of care delivery, b) changes in patient lifestyle and self-management behaviour, c) changes in biomedical, physiological and clinical health outcomes, d) changes in health-related quality of life, and e) changes in final health outcomes. All relevant costs of DMPs are also included in the framework. Based on this framework we conducted an MCDA of a hypothetical DMP versus usual care. We call for a comprehensive economic evaluation of DMPs that is not based on a single criterion but takes into account multiple relevant criteria simultaneously. The framework presented here is a step towards standardising such an evaluation.

  19. Enclosure fire hazard analysis using relative energy release criteria. [burning rate and combustion control

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1978-01-01

    A method for predicting the probable course of fire development in an enclosure is presented. This fire modeling approach uses a graphic plot of five fire development constraints, the relative energy release criteria (RERC), to bound the heat release rates in an enclosure as a function of time. The five RERC are flame spread rate, fuel surface area, ventilation, enclosure volume, and total fuel load. They may be calculated versus time based on the specified or empirical conditions describing the specific enclosure, the fuel type and load, and the ventilation. The calculation of these five criteria, using the common basis of energy release rates versus time, provides a unifying framework for the utilization of available experimental data from all phases of fire development. The plot of these criteria reveals the probable fire development envelope and indicates which fire constraint will be controlling during a criteria time period. Examples of RERC application to fire characterization and control and to hazard analysis are presented along with recommendations for the further development of the concept.

  20. Hepatopulmonary syndrome: which blood gas analysis criteria and position should we use for diagnosis?

    PubMed

    Grilo, Israel; Pascasio, Juan Manuel; López-Pardo, Francisco-Jesús; Ortega-Ruiz, Francisco; Tirado, Juan Luis; Sousa, José Manuel; Rodríguez-Puras, María José; Ferrer, María Teresa; Gómez-Bravo, Miguel Ángel; Grilo, Antonio

    2017-10-03

    Different blood gas criteria have been used in the diagnosis of hepatopulmonary syndrome (HPS). Arterial blood gases were prospectively evaluated in 194 cirrhotic candidates for liver transplantation (LT) in the supine and seated positions. Three blood gas criteria were analyzed: classic (partial pressure of oxygen [PaO2] < 70 mmHg and/or alveolar-arterial gradient of oxygen [A-a PO2] ≥ 20 mmHg), modern (A-a PO2 ≥ 15 mmHg, or ≥ 20 mmHg in patients over 64), and age-adjusted (A-a PO2 above a threshold value adjusted for age). The prevalence of HPS in the supine and seated positions was 27.8% and 23.2% (classic), 34% and 25.3% (modern) and 22.2% and 19% (age-adjusted), respectively. The proportion of severe and very severe cases increased in the seated position (11/49 [22.4%] vs 5/66 [7.6%], p = 0.02). No difference was observed in pre-LT, post-LT or overall mortality in patients with HPS, regardless of the criteria used. Obtaining blood gas measurements in the supine position and using the modern criteria are more sensitive for the diagnosis of HPS. Blood gas analysis with the patient seated detects a greater number of severe and very severe cases. The presence of HPS was not associated with an increase in mortality regardless of the blood gas criterion used.
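
    As a small illustration of the gradient used in these criteria, a sketch of the standard alveolar gas equation is given below; the constants (sea-level barometric pressure, water vapour pressure, respiratory quotient of 0.8) and the example values are assumptions, and the study's exact calculation is not reproduced here.

```python
# A minimal sketch of the alveolar-arterial oxygen gradient referenced in the
# criteria above (standard alveolar gas equation; default constants are
# assumptions for sea level and a respiratory quotient of 0.8).
def a_a_gradient(pao2_mmhg, paco2_mmhg, fio2=0.21,
                 patm_mmhg=760.0, ph2o_mmhg=47.0, rq=0.8):
    """Return the alveolar-arterial O2 gradient in mmHg."""
    pAo2 = fio2 * (patm_mmhg - ph2o_mmhg) - paco2_mmhg / rq  # alveolar PO2
    return pAo2 - pao2_mmhg

# Example: PaO2 = 75 mmHg, PaCO2 = 32 mmHg on room air.
grad = a_a_gradient(75, 32)
print(f"A-a PO2 = {grad:.1f} mmHg -> meets the A-a >= 20 mmHg criterion: {grad >= 20}")
```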

  1. The minimum risk principle that underlies the criteria of bounded component analysis.

    PubMed

    Cruces, Sergio; Durán-Díaz, Iván

    2015-05-01

    This paper studies the problem of the blind extraction of a subset of bounded component signals from the observations of a linear mixture. In the first part of this paper, we analyze the geometric assumptions of the observations that characterize the problem, and their implications on the mixing matrix and latent sources. In the second part, we solve the problem by adopting the principle of minimizing the risk, which refers to the encoding complexity of the observations in the worst admissible situation. This principle provides an underlying justification of several bounded component analysis (BCA) criteria, including the minimum normalized volume criterion of the estimated sources or the maximum negentropy-likelihood criterion with a uniform reference model for the estimated sources. This unifying framework can explain the differences between the criteria in accordance with their considered hypotheses for the model of the observations. This paper is first presented for the case of the extraction of a complex and multidimensional source, and later is particularized for the case of the extraction of subsets of 1-D complex sources. The results also hold true in the case of real signals, where the obtained criteria for the extraction of a set of 1-D sources usually coincide with the existing BCA criteria.
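
    A toy illustration (not the authors' algorithm) of a bounded-component-style contrast for a single real 1-D source is sketched below: among unit-norm extraction vectors, it prefers the one that minimizes the peak-to-peak range of the output normalized by its standard deviation, a crude 1-D analogue of the minimum normalized volume criterion; the mixing setup and grid search are assumptions made purely for illustration.

```python
# Toy sketch: extract the bounded component of a 2-D mixture by minimizing the
# normalized peak-to-peak range of the projected signal.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
s = np.vstack([rng.uniform(-1, 1, n),          # bounded source
               rng.standard_normal(n)])        # unbounded nuisance source
A = rng.standard_normal((2, 2))
x = A @ s                                      # observed mixture

def normalized_range(w, x):
    w = w / np.linalg.norm(w)
    y = w @ x
    return (y.max() - y.min()) / y.std()

# Crude search over extraction directions on the unit circle.
angles = np.linspace(0, np.pi, 720)
scores = [normalized_range(np.array([np.cos(a), np.sin(a)]), x) for a in angles]
best = angles[int(np.argmin(scores))]
y = np.array([np.cos(best), np.sin(best)]) @ x
corr = np.corrcoef(y, s[0])[0, 1]
print(f"correlation with the bounded source: {abs(corr):.3f}")
```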

  2. Item response theory analysis of DSM-IV criteria for inhalant-use disorders in adolescents.

    PubMed

    Perron, Brian E; Vaughn, Michael G; Howard, Matthew O; Bohnert, Amy; Guerrero, Erick

    2010-07-01

    Inhalants are a serious public health concern and a dangerous form of substance use. An important unresolved issue in the inhalant literature concerns the validity of inhalant-use diagnoses and the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, distinction between inhalant abuse and inhalant dependence. To address these limitations and provide the foundation for helping build stronger diagnostic and assessment tools related to inhalant problems, this study examined the dimensionality of the criteria set and the abuse-dependence distinction using item response theory (IRT) analysis. This study used data from a survey of the population of Missouri Division of Youth Services' residents of the residential treatment system. The current study focused on adolescents and young adults who reported a lifetime history of inhalant use (N = 279). The results from the IRT analysis showed no consistent hierarchical ordering of abuse and dependence criteria, providing strong evidence against the abuse-dependence distinction. The abuse criterion of legal problems associated with use represented the item with the highest level of inhalant severity. The dependence criterion that was related to giving up important social, occupational, or recreational activities provided the most accurate discrimination between individuals at different levels of severity. Inhalant-use disorders are best represented using a dimensional versus a categorical approach. IRT analysis provides guidance for selecting criteria that can be useful for brief assessments of inhalant-use problems.
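
    For reference, the two-parameter logistic model commonly used in such IRT analyses gives the probability that a respondent with latent severity θ endorses criterion j as shown below; that this exact parameterization was used in the study is an assumption.

```latex
\[
  P(x_{j}=1 \mid \theta) \;=\; \frac{1}{1 + \exp\!\bigl[-a_{j}\,(\theta - b_{j})\bigr]},
\]
```

    where a_j is the discrimination of criterion j (how sharply it separates respondents at different severity levels) and b_j is its severity (location) parameter; in the terms of the abstract, the legal-problems criterion corresponds to a large b_j and the giving-up-activities criterion to a large a_j.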

  3. Item Response Theory Analysis of DSM-IV Criteria for Inhalant-Use Disorders in Adolescents*

    PubMed Central

    Perron, Brian E.; Vaughn, Michael G.; Howard, Matthew O.; Bohnert, Amy; Guerrero, Erick

    2010-01-01

    Objective: Inhalants are a serious public health concern and a dangerous form of substance use. An important unresolved issue in the inhalant literature concerns the validity of inhalant-use diagnoses and the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, distinction between inhalant abuse and inhalant dependence. To address these limitations and provide the foundation for helping build stronger diagnostic and assessment tools related to inhalant problems, this study examined the dimensionality of the criteria set and the abuse—dependence distinction using item response theory (IRT) analysis. Method: This study used data from a survey of the population of Missouri Division of Youth Services' residents of the residential treatment system. The current study focused on adolescents and young adults who reported a lifetime history of inhalant use (N = 279). Results: The results from the IRT analysis showed no consistent hierarchical ordering of abuse and dependence criteria, providing strong evidence against the abuse—dependence distinction. The abuse criterion of legal problems associated with use represented the item with the highest level of inhalant severity. The dependence criterion that was related to giving up important social, occupational, or recreational activities provided the most accurate discrimination between individuals at different levels of severity. Conclusions: Inhalant-use disorders are best represented using a dimensional versus a categorical approach. IRT analysis provides guidance for selecting criteria that can be useful for brief assessments of inhalant-use problems. PMID:20553671

  4. Stroke lesion volumes and outcome are not different in hemispheric stroke side treated with intravenous thrombolysis based on magnetic resonance imaging criteria.

    PubMed

    Golsari, Amir; Cheng, Bastian; Sobesky, Jan; Schellinger, Peter D; Fiehler, Jens; Gerloff, Christian; Thomalla, Götz

    2015-04-01

    Patients with right hemispheric stroke (RHS) have been reported to have fewer good outcomes after thrombolysis. We aimed at evaluating outcome after stroke thrombolysis with regard to the affected hemisphere, controlling for stroke lesion volume as a potential confounder. We retrospectively analyzed data from a prospective study of patients with acute stroke treated with intravenous tissue-type plasminogen activator, based on magnetic resonance imaging criteria, within 6 hours of symptom onset. Neurological deficit was assessed by the National Institutes of Health Stroke Scale. Lesion volume on acute perfusion imaging, diffusion-weighted imaging (DWI), and the perfusion imaging/DWI mismatch were measured. Clinical outcome was assessed after 90 days using the modified Rankin Scale, and its relation to the affected hemisphere was studied by multivariate analysis. Of 173 patients, 55 (32%) presented with RHS, whereas 118 (68%) had left HS. Baseline National Institutes of Health Stroke Scale score was lower in RHS (11.7 versus 13.6; P=0.031). There were no differences in DWI lesion volume (11.0 versus 17.8 mL; P=0.519), perfusion imaging lesion volume (98.9 versus 118.3 mL; P=0.395), or perfusion imaging/DWI mismatch (60 versus 85.05 mL; P=0.283). Clinical outcome was also comparable for both groups (modified Rankin Scale, 0-1; P=0.327). In multivariate analysis, DWI lesion volume (P<0.001) and age were associated with the modified Rankin Scale at day 90, whereas the affected hemisphere was not. We did not find differences between RHS and left HS with regard to stroke lesion volumes or outcome after thrombolysis. Previously reported hemisphere-related differences in stroke outcome may partly result from imbalances in stroke lesion volume between RHS and left HS. © 2015 American Heart Association, Inc.

  5. Quantitative analysis of in vivo confocal microscopy images: a review.

    PubMed

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definitions. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  6. Use of multi-criteria decision analysis in regulatory alternatives analysis: a case study of lead free solder.

    PubMed

    Malloy, Timothy F; Sinsheimer, Peter J; Blake, Ann; Linkov, Igor

    2013-10-01

    Regulators are implementing new programs that require manufacturers of products containing certain chemicals of concern to identify, evaluate, and adopt viable, safer alternatives. Such programs raise the difficult question for policymakers and regulated businesses of which alternatives are "viable" and "safer." To address that question, these programs use "alternatives analysis," an emerging methodology that integrates issues of human health and environmental effects with technical feasibility and economic impact. Despite the central role that alternatives analysis plays in these programs, the methodology itself is neither well-developed nor tailored to application in regulatory settings. This study uses the case of Pb-based bar solder and its non-Pb-based alternatives to examine the application of 2 multi-criteria decision analysis (MCDA) methods to alternatives analysis: multi-attribute utility analysis and outranking. The article develops and evaluates an alternatives analysis methodology and supporting decision-analysis software for use in a regulatory context, using weighting of the relevant decision criteria generated from a stakeholder elicitation process. The analysis produced complete rankings of the alternatives, including identification of the relative contribution to the ranking of each of the highest level decision criteria such as human health impacts, technical feasibility, and economic feasibility. It also examined the effect of variation in data conventions, weighting, and decision frameworks on the outcome. The results indicate that MCDA can play a critical role in emerging prevention-based regulatory programs. Multi-criteria decision analysis methods offer a means for transparent, objective, and rigorous analysis of products and processes, providing regulators and stakeholders with a common baseline understanding of the relative performance of alternatives and the trade-offs they present. © 2013 SETAC.

  7. Adaptation and Evaluation of a Multi-Criteria Decision Analysis Model for Lyme Disease Prevention

    PubMed Central

    Aenishaenslin, Cécile; Gern, Lise; Michel, Pascal; Ravel, André; Hongoh, Valérie; Waaub, Jean-Philippe; Milord, François; Bélanger, Denise

    2015-01-01

    Designing preventive programs relevant to vector-borne diseases such as Lyme disease (LD) can be complex given the need to include multiple issues and perspectives into prioritizing public health actions. A multi-criteria decision aid (MCDA) model was previously used to rank interventions for LD prevention in Quebec, Canada, where the disease is emerging. The aim of the current study was to adapt and evaluate the decision model constructed in Quebec under a different epidemiological context, in Switzerland, where LD has been endemic for the last thirty years. The model adaptation was undertaken with a group of Swiss stakeholders using a participatory approach. The PROMETHEE method was used for multi-criteria analysis. Key elements and results of the MCDA model are described and contrasted with the Quebec model. All criteria and most interventions of the MCDA model developed for LD prevention in Quebec were directly transferable to the Swiss context. Four new decision criteria were added, and the list of proposed interventions was modified. Based on the overall group ranking, interventions targeting human populations were prioritized in the Swiss model, with the top ranked action being the implementation of a large communication campaign. The addition of criteria did not significantly alter the intervention rankings, but increased the capacity of the model to discriminate between highest and lowest ranked interventions. The current study suggests that beyond the specificity of the MCDA models developed for Quebec and Switzerland, their general structure captures the fundamental and common issues that characterize the complexity of vector-borne disease prevention. These results should encourage public health organizations to adapt, use and share MCDA models as an effective and functional approach to enable the integration of multiple perspectives and considerations in the prevention and control of complex public health issues such as Lyme disease or other vector

  8. Adaptation and Evaluation of a Multi-Criteria Decision Analysis Model for Lyme Disease Prevention.

    PubMed

    Aenishaenslin, Cécile; Gern, Lise; Michel, Pascal; Ravel, André; Hongoh, Valérie; Waaub, Jean-Philippe; Milord, François; Bélanger, Denise

    2015-01-01

    Designing preventive programs relevant to vector-borne diseases such as Lyme disease (LD) can be complex given the need to include multiple issues and perspectives into prioritizing public health actions. A multi-criteria decision aid (MCDA) model was previously used to rank interventions for LD prevention in Quebec, Canada, where the disease is emerging. The aim of the current study was to adapt and evaluate the decision model constructed in Quebec under a different epidemiological context, in Switzerland, where LD has been endemic for the last thirty years. The model adaptation was undertaken with a group of Swiss stakeholders using a participatory approach. The PROMETHEE method was used for multi-criteria analysis. Key elements and results of the MCDA model are described and contrasted with the Quebec model. All criteria and most interventions of the MCDA model developed for LD prevention in Quebec were directly transferable to the Swiss context. Four new decision criteria were added, and the list of proposed interventions was modified. Based on the overall group ranking, interventions targeting human populations were prioritized in the Swiss model, with the top ranked action being the implementation of a large communication campaign. The addition of criteria did not significantly alter the intervention rankings, but increased the capacity of the model to discriminate between highest and lowest ranked interventions. The current study suggests that beyond the specificity of the MCDA models developed for Quebec and Switzerland, their general structure captures the fundamental and common issues that characterize the complexity of vector-borne disease prevention. These results should encourage public health organizations to adapt, use and share MCDA models as an effective and functional approach to enable the integration of multiple perspectives and considerations in the prevention and control of complex public health issues such as Lyme disease or other vector

  9. Latent Class Analysis of DSM-5 Alcohol Use Disorder Criteria among Heavy-Drinking College Students

    PubMed Central

    Neighbors, Clayton

    2015-01-01

    The DSM-5 has created significant changes in the definition of alcohol use disorders (AUD). Limited work has considered the impact of these changes in specific populations, such as heavy-drinking college students. Latent class analysis (LCA) is a person-centered approach that divides a population into mutually exclusive and exhaustive latent classes, based on observable indicator variables. The present research was designed to examine whether there were distinct classes of heavy-drinking college students who met DSM-5 criteria for an AUD and whether gender, perceived social norms, use of protective behavioral strategies (PBS), drinking refusal self-efficacy (DRSE), self-perceptions of drinking identity, psychological distress, and membership in a fraternity/sorority would be associated with class membership. Three-hundred and ninety-four college students who met DSM-5 criteria for an AUD were recruited from three different universities. Two distinct classes emerged: Less Severe (86%), the majority of whom endorsed both drinking more than intended and tolerance, as well as met criteria for a mild AUD; and More Severe (14%), the majority of whom endorsed at least half of the DSM-5 AUD criteria and met criteria for a severe AUD. Relative to the Less Severe class, membership in the More Severe class was negatively associated with DRSE and positively associated with self-identification as a drinker. There is a distinct class of heavy-drinking college students with a more severe AUD and for whom intervention content needs to be more focused and tailored. Clinical implications are discussed. PMID:26051027
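
    For reference, the standard latent class model underlying such an analysis writes the probability of a response pattern over the J binary AUD criteria as shown below; whether the study used exactly this specification is an assumption.

```latex
\[
  P(\mathbf{y}) \;=\; \sum_{c=1}^{C} \pi_{c} \prod_{j=1}^{J}
      \rho_{jc}^{\,y_{j}} \bigl(1-\rho_{jc}\bigr)^{1-y_{j}},
\]
```

    where π_c is the prevalence of latent class c (here C = 2: Less Severe and More Severe) and ρ_jc is the probability that a member of class c endorses criterion j.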

  10. Latent Class Analysis of DSM-5 Alcohol Use Disorder Criteria Among Heavy-Drinking College Students.

    PubMed

    Rinker, Dipali Venkataraman; Neighbors, Clayton

    2015-10-01

    The DSM-5 has created significant changes in the definition of alcohol use disorders (AUDs). Limited work has considered the impact of these changes in specific populations, such as heavy-drinking college students. Latent class analysis (LCA) is a person-centered approach that divides a population into mutually exclusive and exhaustive latent classes, based on observable indicator variables. The present research was designed to examine whether there were distinct classes of heavy-drinking college students who met DSM-5 criteria for an AUD and whether gender, perceived social norms, use of protective behavioral strategies (PBS), drinking refusal self-efficacy (DRSE), self-perceptions of drinking identity, psychological distress, and membership in a fraternity/sorority would be associated with class membership. Three-hundred and ninety-four college students who met DSM-5 criteria for an AUD were recruited from three different universities. Two distinct classes emerged: Less Severe (86%), the majority of whom endorsed both drinking more than intended and tolerance, as well as met criteria for a mild AUD; and More Severe (14%), the majority of whom endorsed at least half of the DSM-5 AUD criteria and met criteria for a severe AUD. Relative to the Less Severe class, membership in the More Severe class was negatively associated with DRSE and positively associated with self-identification as a drinker. There is a distinct class of heavy-drinking college students with a more severe AUD and for whom intervention content needs to be more focused and tailored. Clinical implications are discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Principles and clinical applications of image analysis.

    PubMed

    Kisner, H J

    1988-12-01

    Image processing has traveled to the lunar surface and back, finding its way into the clinical laboratory. Advances in digital computers have improved the technology of image analysis, resulting in a wide variety of medical applications. Offering improvements in turnaround time, standardized systems, increased precision, and walkaway automation, digital image analysis has likely found a permanent home as a diagnostic aid in the interpretation of microscopic as well as macroscopic laboratory images.

  12. FFDM image quality assessment using computerized image texture analysis

    NASA Astrophysics Data System (ADS)

    Berger, Rachelle; Carton, Ann-Katherine; Maidment, Andrew D. A.; Kontos, Despina

    2010-04-01

    Quantitative measures of image quality (IQ) are routinely obtained during the evaluation of imaging systems. These measures, however, do not necessarily correlate with the IQ of the actual clinical images, which can also be affected by factors such as patient positioning. No quantitative method currently exists to evaluate clinical IQ. Therefore, we investigated the potential of using computerized image texture analysis to quantitatively assess IQ. Our hypothesis is that image texture features can be used to assess IQ as a measure of the image signal-to-noise ratio (SNR). To test feasibility, the "Rachel" anthropomorphic breast phantom (Model 169, Gammex RMI) was imaged with a Senographe 2000D FFDM system (GE Healthcare) using 220 unique exposure settings (target/filter, kV, and mAs combinations). The mAs were varied from 10% to 300% of that required for an average glandular dose (AGD) of 1.8 mGy. A 2.5 cm² retroareolar region of interest (ROI) was segmented from each image. The SNR was computed from the ROIs segmented from images linear with dose (i.e., raw images) after flat-field and offset correction. Image texture features of skewness, coarseness, contrast, energy, homogeneity, and fractal dimension were computed from the Premium View™ post-processed image ROIs. Multiple linear regression demonstrated a strong association between the computed image texture features and SNR (R² = 0.92, p ≤ 0.001). When including kV, target and filter as additional predictor variables, a stronger association with SNR was observed (R² = 0.95, p ≤ 0.001). The strong associations indicate that computerized image texture analysis can be used to measure image SNR and potentially aid in automating IQ assessment as a component of the clinical workflow. Further work is underway to validate our findings in larger clinical datasets.
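
    A minimal sketch of computing a few of the texture features named above (contrast, energy, homogeneity) from a grey-level co-occurrence matrix is given below; it uses scikit-image (whose functions are named greycomatrix/greycoprops in versions before 0.19), and the synthetic ROI, distances, and angles are assumptions rather than the study's settings.

```python
# GLCM texture features from an 8-bit ROI, in the spirit of the texture
# analysis described above (scikit-image API assumed).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(roi_uint8, distances=(1,), angles=(0, np.pi / 2)):
    """Return a dict of averaged GLCM features for an 8-bit ROI."""
    glcm = graycomatrix(roi_uint8, distances=distances, angles=angles,
                        levels=256, symmetric=True, normed=True)
    return {prop: float(graycoprops(glcm, prop).mean())
            for prop in ("contrast", "energy", "homogeneity")}

# Example with a synthetic ROI (a real study would use the segmented
# retroareolar region from each mammogram).
rng = np.random.default_rng(1)
roi = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
print(texture_features(roi))
```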

  13. Image analysis: a consumer's guide.

    PubMed

    Meyer, F

    1983-01-01

    The last years have seen an explosion of systems in image analysis. It is hard for the pathologist or the cytologist to make the right choice of equipment. All machines are stupid, and the only valuable thing is the human work put into them. So take advantage of the work other people have done for you. Choose a method that is widely used on many systems and has proved fertile in many domains, and not only for today's specific application: Mathematical Morphology, to which the linear convolutions present on all machines should be added, is a strong candidate for becoming such a method. The paper illustrates a working day of an ideal system: research- and diagnostic-directed work during the working hours, automatic screening of cervical (or other) smears during the night.
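
    Since the author recommends Mathematical Morphology plus linear convolutions as the common core, a minimal sketch of basic morphological operations on a binary image is shown below; the synthetic image, threshold, and structuring element are assumptions.

```python
# Basic mathematical-morphology operations (erosion, dilation, opening)
# applied to a binary image, using SciPy.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
image = rng.random((64, 64)) > 0.7          # synthetic binary image
structure = np.ones((3, 3), dtype=bool)     # 3x3 structuring element

eroded  = ndimage.binary_erosion(image, structure=structure)
dilated = ndimage.binary_dilation(image, structure=structure)
opened  = ndimage.binary_opening(image, structure=structure)  # removes specks

print("objects before opening:", ndimage.label(image)[1])
print("objects after opening: ", ndimage.label(opened)[1])
```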

  14. Multiple criteria analysis of remotely piloted aircraft systems for monitoring the crops vegetation status

    NASA Astrophysics Data System (ADS)

    Cristea, L.; Luculescu, M. C.; Zamfira, S. C.; Boer, A. L.; Pop, S.

    2016-08-01

    The paper presents an analysis of Remotely Piloted Aircraft Systems (RPAS) used for monitoring the crops vegetation status. The study focuses on two types of RPAS, namely the flying wing and the multi-copter. The following criteria were taken into account: technical characteristics, power consumption, flight autonomy, flight conditions, costs, data acquisition systems used for monitoring, crops area and so on. Based on this analysis, advantages and disadvantages are emphasized offering a useful tool for choosing the proper solution according to the specific application conditions.

  15. Spreadsheet-like image analysis

    NASA Astrophysics Data System (ADS)

    Wilson, Paul

    1992-08-01

    This report describes the design of a new software system being built by the Army to support and augment automated nondestructive inspection (NDI) on-line equipment implemented by the Army for detection of defective manufactured items. The new system recalls and post-processes (off-line) the NDI data sets archived by the on-line equipment for the purpose of verifying the correctness of the inspection analysis paradigms, of developing better analysis paradigms, and of gathering statistics on the defects of the items inspected. The design of the system is similar to that of a spreadsheet, i.e., an array of cells which may be programmed to contain functions whose arguments are data from other cells and whose resultant is the output of that cell's function. Unlike a spreadsheet, the arguments and the resultants of a cell may be a matrix, such as a two-dimensional matrix of picture elements (pixels). Functions include matrix mathematics, neural networks and image processing as well as those ordinarily found in spreadsheets. The system employs all of the common environmental supports of the Macintosh computer, which is the hardware platform. The system allows the resultant of a cell to be displayed in any of multiple formats such as a matrix of numbers, text, an image, or a chart. Each cell is a window onto the resultant. Like a spreadsheet, if the input value of any cell is changed, its effect cascades into the resultants of all cells whose functions use that value directly or indirectly. The system encourages the user to play what-if games, as ordinary spreadsheets do.
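
    The recalculation model described here can be illustrated with a few lines of code; the sketch below is a toy dependency-propagating cell graph (not the Army system) in which resultants may be matrices such as images.

```python
# Toy spreadsheet-like cells: each cell holds a function of other cells, and
# changing one cell's value cascades through every dependent cell.
import numpy as np

class Cell:
    def __init__(self, value=None, func=None, deps=()):
        self.value, self.func, self.deps = value, func, list(deps)
        self.dependents = []
        for d in self.deps:
            d.dependents.append(self)

    def set(self, value):
        self.value = value
        self._cascade()

    def _cascade(self):
        for cell in self.dependents:
            cell.value = cell.func(*[d.value for d in cell.deps])
            cell._cascade()

# A raw "image" cell, a processing cell, and a statistics cell.
raw      = Cell(value=np.zeros((4, 4)))
scaled   = Cell(func=lambda img: img * 0.5, deps=[raw])
stats    = Cell(func=lambda img: img.mean(), deps=[scaled])
raw.set(np.ones((4, 4)))     # the change cascades to both dependent cells
print(stats.value)           # -> 0.5
```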

  16. Micro-CT imaging: Developing criteria for examining fetal skeletons in regulatory developmental toxicology studies - A workshop report.

    PubMed

    Solomon, Howard M; Makris, Susan L; Alsaid, Hasan; Bermudez, Oscar; Beyer, Bruce K; Chen, Antong; Chen, Connie L; Chen, Zhou; Chmielewski, Gary; DeLise, Anthony M; de Schaepdrijver, Luc; Dogdas, Belma; French, Julian; Harrouk, Wafa; Helfgott, Jonathan; Henkelman, R Mark; Hesterman, Jacob; Hew, Kok-Wah; Hoberman, Alan; Lo, Cecilia W; McDougal, Andrew; Minck, Daniel R; Scott, Lelia; Stewart, Jane; Sutherland, Vicki; Tatiparthi, Arun K; Winkelmann, Christopher T; Wise, L David; Wood, Sandra L; Ying, Xiaoyou

    2016-06-01

    During the past two decades the use and refinements of imaging modalities have markedly increased making it possible to image embryos and fetuses used in pivotal nonclinical studies submitted to regulatory agencies. Implementing these technologies into the Good Laboratory Practice environment requires rigorous testing, validation, and documentation to ensure the reproducibility of data. A workshop on current practices and regulatory requirements was held with the goal of defining minimal criteria for the proper implementation of these technologies and subsequent submission to regulatory agencies. Micro-computed tomography (micro-CT) is especially well suited for high-throughput evaluations, and is gaining popularity to evaluate fetal skeletons to assess the potential developmental toxicity of test agents. This workshop was convened to help scientists in the developmental toxicology field understand and apply micro-CT technology to nonclinical toxicology studies and facilitate the regulatory acceptance of imaging data. Presentations and workshop discussions covered: (1) principles of micro-CT fetal imaging; (2) concordance of findings with conventional skeletal evaluations; and (3) regulatory requirements for validating the system. Establishing these requirements for micro-CT examination can provide a path forward for laboratories considering implementing this technology and provide regulatory agencies with a basis to consider the acceptability of data generated via this technology.

  17. Photoneutron reaction cross sections from various experiments - analysis and evaluation using physical criteria of data reliability

    NASA Astrophysics Data System (ADS)

    Varlamov, Vladimir; Ishkhanov, Boris; Orlin, Vadim; Peskov, Nikolai; Stepanov, Mikhail

    2017-09-01

    The majority of the photonuclear reaction cross sections important for many fields of science and technology, and of the various data files (EXFOR, RIPL, ENDF, etc.) supported by the IAEA, were obtained in experiments using quasimonoenergetic annihilation photons. There are well-known systematic discrepancies between data for the partial photoneutron reactions (γ, 1n), (γ, 2n), and (γ, 3n). Objective physical criteria were proposed for analysing the reliability of these data. It was found that the experimental data for many nuclei are not reliable because of large systematic uncertainties of the neutron multiplicity sorting method used. An experimental-theoretical method was proposed for evaluating reaction cross section data that satisfy the reliability criteria. The partial and total reaction cross sections were evaluated for many nuclei. In many cases the evaluated data differ noticeably from both the experimental data and the data evaluated previously for the IAEA Photonuclear Data Library. It therefore became evident that the IAEA Library needs to be revised and updated.

  18. [THE COMPARATIVE ANALYSIS OF INFORMATION VALUE OF MAIN CLINICAL CRITERIA USED TO DIAGNOSE OF BACTERIAL VAGINOSIS].

    PubMed

    Tsvetkova, A V; Murtazina, Z A; Markusheva, T V; Mavzutov, A R

    2015-05-01

    Bacterial vaginosis is one of the most frequent reasons for women to visit a gynecologist. The diagnosis of bacterial vaginosis is predominantly based on the Amsel criteria (1983). Nowadays, the objectivity of these criteria is increasingly disputed. Discharge from the mucous membranes of the posterolateral vaginal fornix was analyzed in 640 women with a clinical diagnosis of bacterial vaginosis. Light microscopy of mounts of the discharge provided laboratory confirmation of the diagnosis of bacterial vaginosis in 100 (15.63%) women. Complaints of burning and unpleasant odor, and the Amsel criterion of detection of "clue cells" against a background of pH > 4.5, were established as statistically significant for bacterial vaginosis. According to the study data, the presence of discharge alone is not statistically reliable for differentiating bacterial vaginosis from other inflammatory pathological conditions of the female reproductive tract. At the same time, detection of "clue cells" in the mount reliably correlated with bacterial vaginosis.

  19. Using modified visual-inspection criteria to interpret functional analysis outcomes.

    PubMed

    Roane, Henry S; Fisher, Wayne W; Kelley, Michael E; Mevers, Joanna L; Bouxsein, Kelly J

    2013-01-01

    The development of functional analysis (FA) methodologies allows the identification of the reinforcers that maintain problem behavior and improved intervention efficacy in the form of function-based treatments. Despite the profound impact of FA on clinical practice and research, questions still remain about the methods by which clinicians and researchers interpret FA graphs. In the current study, 141 FA data sets were evaluated using the structured visual-inspection criteria developed by Hagopian et al. (1997). However, the criteria were modified for FAs of varying lengths. Interobserver agreement assessments revealed high agreement coefficients across expert judges, postdoctoral reviewers, master's-level reviewers, and postbaccalaureate reviewers. Once the validity of the modified visual-inspection procedures was established, the utility of those procedures was examined by using them to categorize the maintaining reinforcement contingency related to problem behavior for all 141 data sets and for the 101 participants who contributed to the 141 data sets.

  20. A multi-criteria decision analysis assessment of waste paper management options.

    PubMed

    Hanan, Deirdre; Burnley, Stephen; Cooke, David

    2013-03-01

    The use of Multi-criteria Decision Analysis (MCDA) was investigated in an exercise using a panel of local residents and stakeholders to assess the options for managing waste paper on the Isle of Wight. Seven recycling, recovery and disposal options were considered by the panel who evaluated each option against seven environmental, financial and social criteria. The panel preferred options where the waste was managed on the island with gasification and recycling achieving the highest scores. Exporting the waste to the English mainland for incineration or landfill proved to be the least preferred options. This research has demonstrated that MCDA is an effective way of involving community groups in waste management decision making. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Clock Scan Protocol for Image Analysis: ImageJ Plugins.

    PubMed

    Dobretsov, Maxim; Petkau, Georg; Hayar, Abdallah; Petkau, Eugen

    2017-06-19

    The clock scan protocol for image analysis is an efficient tool to quantify the average pixel intensity within, at the border of, and outside (background) a closed or segmented convex-shaped region of interest, leading to the generation of an averaged integral radial pixel-intensity profile. This protocol was originally developed in 2006 as a Visual Basic 6 script, but as such it had limited distribution. To address this problem, and to join similar recent efforts by others, we converted the original clock scan protocol code into two Java-based plugins compatible with NIH-sponsored and freely available image analysis programs like ImageJ or Fiji ImageJ. Furthermore, these plugins have several new functions, further expanding the range of capabilities of the original protocol, such as analysis of multiple regions of interest and image stacks. The latter feature of the program is especially useful in applications in which it is important to determine changes related to time and location. Thus, the clock scan analysis of stacks of biological images may potentially be applied to spreading of Na(+) or Ca(++) within a single cell, as well as to the analysis of spreading activity (e.g., Ca(++) waves) in populations of synaptically-connected or gap junction-coupled cells. Here, we describe these new clock scan plugins and show some examples of their applications in image analysis.
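
    A minimal sketch of a clock-scan-style measurement is given below: pixel intensity is averaged along rays cast from the ROI centre, with radial position expressed as a fraction of the border radius. It assumes a circular ROI of known radius for simplicity (a convex ROI would use the per-angle border distance) and is not the authors' plugin code.

```python
# Clock-scan-style radial intensity profile around a circular ROI.
import numpy as np

def clock_scan(image, cx, cy, border_radius, n_rays=360, n_samples=120,
               overshoot=1.2):
    """Return radial positions (0..overshoot of the border) and the averaged profile."""
    radii = np.linspace(0.0, overshoot, n_samples)       # 1.0 = ROI border
    profile = np.zeros(n_samples)
    h, w = image.shape
    for k in range(n_rays):
        theta = 2 * np.pi * k / n_rays
        xs = np.clip((cx + radii * border_radius * np.cos(theta)).astype(int), 0, w - 1)
        ys = np.clip((cy + radii * border_radius * np.sin(theta)).astype(int), 0, h - 1)
        profile += image[ys, xs]
    return radii, profile / n_rays

# Example on a synthetic bright disc (radius 30) centred in a dark background.
yy, xx = np.mgrid[0:200, 0:200]
img = ((xx - 100) ** 2 + (yy - 100) ** 2 <= 30 ** 2).astype(float)
r, prof = clock_scan(img, cx=100, cy=100, border_radius=30)
print(prof[r < 0.9].mean(), prof[r > 1.1].mean())   # ~1.0 inside, ~0.0 outside
```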

  2. Effects of using different criteria for caries removal: a systematic review and network meta-analysis.

    PubMed

    Schwendicke, Falk; Paris, Sebastian; Tu, Yu-Kang

    2015-01-01

    Conventionally, caries excavation is performed until only hard dentine remains, while more selective and reliable criteria might be available. We aimed at systematically comparing the effects of using different excavation criteria via network meta-analysis. Electronic databases were searched for randomised or non-randomised clinical trials (RCTs/NRCTs) evaluating excavation of cavitated lesions. Criteria were divided into six groups: excavation until pulpo-proximal dentine on the cavity floor was (1) either hard on probing, (2) slightly softened on probing, (3) not stainable by caries-detector dye, or until (4) self-limiting polymer burs, (5) fluorescence-assisted devices or (6) chemo-mechanical gels indicated termination of the excavation. The risk of complications, risk of pain/discomfort, excavation time, and number of remaining bacteria were then evaluated using Bayesian network meta-analysis. 28 studies (19 RCTs, 9 NRCTs) with 1782 patients (2555 lesions), most of them investigating primary teeth, were included. The risk of complications was highest when excavating until only non-stainable dentine remained, and lowest when not attempting to remove all softened dentine. The risk of pain decreased significantly if self-limiting chemo-mechanical excavation or fluorescence-assisted lasers were used instead of excavating until all dentine was hard. When not attempting to remove all softened dentine, the time required for excavation was shortest, whilst the greatest number of bacteria remained. Not attempting to remove all softened or stainable dentine might reduce the risk of complications. Data regarding self-limiting excavation are insufficient for definitive conclusions. Excavation criteria should be validated against clinically relevant outcomes. Given current evidence, dentists might not need to attempt excavation until only hard dentine remains in proximity to the pulp. Instead, their choice of excavation criterion or method should be guided by clinical

  3. MetaQC: objective quality control and inclusion/exclusion criteria for genomic meta-analysis.

    PubMed

    Kang, Dongwan D; Sibille, Etienne; Kaminski, Naftali; Tseng, George C

    2012-01-01

    Genomic meta-analysis to combine relevant and homogeneous studies has been widely applied, but quality control (QC) and objective inclusion/exclusion criteria have been largely overlooked. Currently, the inclusion/exclusion criteria mostly depend on ad hoc expert opinion or a naïve threshold on sample size or platform. There is a pressing need to develop a systematic QC methodology, as the decision on study inclusion greatly impacts the final meta-analysis outcome. In this article, we propose six quantitative quality control measures, covering internal homogeneity of coexpression structure among studies, external consistency of coexpression pattern with a pathway database, and accuracy and consistency of differentially expressed gene detection or enriched pathway identification. Each quality control index is defined as the minus log-transformed P value from formal hypothesis testing. Principal component analysis biplots and a standardized mean rank are applied to assist visualization and decision making. We applied the proposed method to four large-scale examples, combining 7 brain cancer, 9 prostate cancer, 8 idiopathic pulmonary fibrosis and 17 major depressive disorder studies, respectively. The identified problematic studies were further scrutinized for potential technical or biological causes of their lower quality to determine their exclusion from meta-analysis. The application and simulation results support a systematic quality assessment framework for genomic meta-analysis.
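
    A minimal sketch of the summarisation step described above (minus-log-transformed QC p-values combined into a standardized mean rank per study) is given below with random data; the exclusion threshold is an arbitrary illustration, not the MetaQC rule.

```python
# Combining study-level quality-control indices in the spirit of MetaQC:
# each QC measure is a minus-log-transformed p-value, and studies are
# summarised by a standardized mean rank across measures.
import numpy as np

rng = np.random.default_rng(0)
n_studies, n_measures = 9, 6
pvals = rng.uniform(1e-6, 1.0, size=(n_studies, n_measures))  # hypothetical QC tests

qc = -np.log10(pvals)                      # larger = better quality

# Rank each QC measure across studies (rank 1 = best), then average and
# standardize so the summary is comparable across data sets.
ranks = np.argsort(np.argsort(-qc, axis=0), axis=0) + 1
mean_rank = ranks.mean(axis=1)
smr = (mean_rank - mean_rank.mean()) / mean_rank.std()

for i, s in enumerate(smr):
    flag = "  <- candidate for exclusion" if s > 1.0 else ""
    print(f"study {i + 1}: standardized mean rank = {s:+.2f}{flag}")
```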

  4. Design criteria for a multiple input land use system. [digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.; Bryant, N. A.

    1975-01-01

    A design is presented that proposes the use of digital image processing techniques to interface existing geocoded data sets and information management systems with thematic maps and remote sensed imagery. The basic premise is that geocoded data sets can be referenced to a raster scan that is equivalent to a grid cell data set, and that images taken of thematic maps or from remote sensing platforms can be converted to a raster scan. A major advantage of the raster format is that x, y coordinates are implicitly recognized by their position in the scan, and z values can be treated as Boolean layers in a three-dimensional data space. Such a system permits the rapid incorporation of data sets, rapid comparison of data sets, and adaptation to variable scales by resampling the raster scans.
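
    A small sketch of the raster "Boolean layer" idea is given below; the layers (flood plain, urban zoning), grid size, and resampling rule are hypothetical and only illustrate the overlay and resampling operations the design describes.

```python
# Boolean raster layers on a common grid: element-wise overlay and
# resampling to a coarser cell size.
import numpy as np

rows, cols = 120, 160
rng = np.random.default_rng(42)

flood_plain = rng.random((rows, cols)) < 0.20     # Boolean layer 1
zoned_urban = rng.random((rows, cols)) < 0.35     # Boolean layer 2

# Element-wise overlay: cells that are both in the flood plain and zoned urban.
conflict = flood_plain & zoned_urban
print("conflict cells:", int(conflict.sum()))

# Resample to a coarser grid (4x4 cell blocks) by majority vote.
coarse = conflict.reshape(rows // 4, 4, cols // 4, 4).mean(axis=(1, 3)) > 0.5
print("coarse grid shape:", coarse.shape)
```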

  5. Design criteria for a multiple input land use system. [digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.; Bryant, N. A.

    1975-01-01

    A design is presented that proposes the use of digital image processing techniques to interface existing geocoded data sets and information management systems with thematic maps and remote sensed imagery. The basic premise is that geocoded data sets can be referenced to a raster scan that is equivalent to a grid cell data set, and that images taken of thematic maps or from remote sensing platforms can be converted to a raster scan. A major advantage of the raster format is that x, y coordinates are implicitly recognized by their position in the scan, and z values can be treated as Boolean layers in a three-dimensional data space. Such a system permits the rapid incorporation of data sets, rapid comparison of data sets, and adaptation to variable scales by resampling the raster scans.

  6. Naval Signal and Image Analysis Conference Report

    DTIC Science & Technology

    1998-02-26

    Arlington Hilton Hotel in Arlington, Virginia. The meeting was by invitation only and consisted of investigators in the ONR Signal and Image Analysis Program ... in signal and image analysis. The conference provided an opportunity for technical interaction between academic researchers and Naval scientists and ... plan future directions for the ONR Signal and Image Analysis Program as well as informal recommendations to the Program Officer.

  7. Image analysis applications for grain science

    NASA Astrophysics Data System (ADS)

    Zayas, Inna Y.; Steele, James L.

    1991-02-01

    Morphometric features of single grain kernels or particles were used to discriminate between two visibly similar wheat varieties, foreign material in wheat, hard/soft and spring/winter wheat classes, and whole versus broken corn kernels. Milled fractions of hard and soft wheat were evaluated using textural image analysis. Color image analysis of sound and mold-damaged corn kernels yielded high recognition rates. The studies collectively demonstrate the potential for automated classification and assessment of grain quality using image analysis.

  8. Satellite image analysis using neural networks

    NASA Technical Reports Server (NTRS)

    Sheldon, Roger A.

    1990-01-01

    The tremendous backlog of unanalyzed satellite data necessitates the development of improved methods for data cataloging and analysis. Ford Aerospace has developed an image analysis system, SIANN (Satellite Image Analysis using Neural Networks) that integrates the technologies necessary to satisfy NASA's science data analysis requirements for the next generation of satellites. SIANN will enable scientists to train a neural network to recognize image data containing scenes of interest and then rapidly search data archives for all such images. The approach combines conventional image processing technology with recent advances in neural networks to provide improved classification capabilities. SIANN allows users to proceed through a four step process of image classification: filtering and enhancement, creation of neural network training data via application of feature extraction algorithms, configuring and training a neural network model, and classification of images by application of the trained neural network. A prototype experimentation testbed was completed and applied to climatological data.
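
    A minimal sketch of the four-step flow described above (enhancement, feature extraction, network training, classification) is given below, using scikit-learn's MLPClassifier as a stand-in for the original neural network and entirely synthetic scenes; the feature choices are assumptions.

```python
# Train a small neural network on features extracted from enhanced images,
# then classify a new scene, mirroring the four-step SIANN workflow.
import numpy as np
from scipy import ndimage
from sklearn.neural_network import MLPClassifier

def extract_features(img):
    """Steps 1-2: smooth the image, then compute simple scene features."""
    smoothed = ndimage.gaussian_filter(img, sigma=1.0)
    return [smoothed.mean(), smoothed.std(),
            np.abs(np.gradient(smoothed)).mean()]   # crude edge content

rng = np.random.default_rng(0)
# Hypothetical training scenes: label 1 = "scene of interest" (brighter scenes).
images = [rng.random((32, 32)) + (0.5 if i % 2 else 0.0) for i in range(200)]
labels = [i % 2 for i in range(200)]
X = np.array([extract_features(im) for im in images])

# Step 3: configure and train the network.  Step 4: classify new imagery.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, labels)
new_scene = rng.random((32, 32)) + 0.5
print("scene of interest?", bool(clf.predict([extract_features(new_scene)])[0]))
```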

  9. Microscopy image segmentation tool: robust image data analysis.

    PubMed

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.
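
    As an illustration of the kind of ROI segmentation workflow described (not the MIST algorithm itself), the sketch below thresholds a synthetic micrograph, labels connected components, and reports per-ROI statistics using scikit-image.

```python
# Segment many small ROIs: threshold, label connected components, and report
# per-ROI statistics.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

# Synthetic "micrograph": dark background with bright circular pores.
rng = np.random.default_rng(3)
img = rng.normal(0.1, 0.02, size=(256, 256))
yy, xx = np.mgrid[0:256, 0:256]
for cx, cy in rng.integers(20, 236, size=(40, 2)):
    img[(xx - cx) ** 2 + (yy - cy) ** 2 <= 5 ** 2] = 0.9

mask = img > threshold_otsu(img)          # global threshold
rois = regionprops(label(mask))           # connected-component ROIs
areas = [r.area for r in rois]
print(f"{len(rois)} ROIs, mean area = {np.mean(areas):.1f} px")
```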

  10. Microscopy image segmentation tool: Robust image data analysis

    SciTech Connect

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  11. Automated Dermoscopy Image Analysis of Pigmented Skin Lesions

    PubMed Central

    Baldi, Alfonso; Quartulli, Marco; Murace, Raffaele; Dragonetti, Emanuele; Manganaro, Mario; Guerra, Oscar; Bizzi, Stefano

    2010-01-01

    Dermoscopy (dermatoscopy, epiluminescence microscopy) is a non-invasive diagnostic technique for the in vivo observation of pigmented skin lesions (PSLs), allowing a better visualization of surface and subsurface structures (from the epidermis to the papillary dermis). This diagnostic tool permits the recognition of morphologic structures not visible by the naked eye, thus opening a new dimension in the analysis of the clinical morphologic features of PSLs. In order to reduce the learning-curve of non-expert clinicians and to mitigate problems inherent in the reliability and reproducibility of the diagnostic criteria used in pattern analysis, several indicative methods based on diagnostic algorithms have been introduced in the last few years. Recently, numerous systems designed to provide computer-aided analysis of digital images obtained by dermoscopy have been reported in the literature. The goal of this article is to review these systems, focusing on the most recent approaches based on content-based image retrieval systems (CBIR). PMID:24281070

  12. Using Multi Criteria Decision Making in Analysis of Alternatives for Selection of Enabling Technology

    NASA Astrophysics Data System (ADS)

    Georgiadis, Daniel

    Prior to Milestone A, the Department of Defense (DoD) requires that service sponsors conduct an Analysis of Alternatives (AoA), an analytical comparison of multiple alternatives, to be completed prior to committing and investing costly resources to one project or decision. Despite this requirement, sponsors will circumvent or dilute the process in an effort to save money or schedule, and specific requirements are proposed that can effectively eliminate all but the preselected alternatives. This research focuses on identifying decision-aiding methods which can lead to the selection of specific criteria that are key performance drivers, thus enabling an informed selection of the enabling technology. This work defines the enabling technology as the sub-system which presents the most risk within the system design. After a thorough literature review of available Multi Criteria Decision Making methods, a case study example is presented demonstrating the selection of the enabling technology of a Light Detection and Ranging (LIDAR) system. Using subjective criteria in the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) is shown to successfully account for the tacit knowledge of expert practitioners.
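
    Since the abstract names TOPSIS as the ranking technique, a minimal sketch of the standard TOPSIS steps is given below; the alternatives, criterion weights, scores, and benefit/cost directions are all hypothetical and are not taken from the case study.

```python
# Standard TOPSIS applied to a hypothetical decision matrix: three candidate
# LIDAR sub-systems scored on four criteria.
import numpy as np

alternatives = ["sensor A", "sensor B", "sensor C"]
# columns: range, resolution, mass, cost (illustrative values)
scores  = np.array([[120., 0.10, 2.0, 50.],
                    [100., 0.05, 1.5, 80.],
                    [140., 0.20, 3.0, 40.]])
weights = np.array([0.4, 0.3, 0.2, 0.1])
benefit = np.array([True, False, False, False])   # higher range is better; the rest are lower-is-better

# 1) vector-normalize and weight the decision matrix
V = weights * scores / np.linalg.norm(scores, axis=0)
# 2) ideal and anti-ideal points per criterion
ideal      = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti_ideal = np.where(benefit, V.min(axis=0), V.max(axis=0))
# 3) relative closeness to the ideal solution
d_plus  = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)

for name, c in sorted(zip(alternatives, closeness), key=lambda t: -t[1]):
    print(f"{name}: {c:.3f}")
```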

  13. Regulatory analysis on criteria for the release of patients administered radioactive material. Final report

    SciTech Connect

    Schneider, S.; McGuire, S.A.

    1997-02-01

    This regulatory analysis was developed to respond to three petitions for rulemaking to amend 10 CFR parts 20 and 35 regarding release of patients administered radioactive material. The petitions requested revision of these regulations to remove the ambiguity that existed between the 1-millisievert (0.1-rem) total effective dose equivalent (TEDE) public dose limit in Part 20, adopted in 1991, and the activity-based release limit in 10 CFR 35.75 that, in some instances, would permit release of individuals in excess of the current public dose limit. Three alternatives for resolution of the petitions were evaluated. Under Alternative 1, NRC would amend its patient release criteria in 10 CFR 35.75 to match the annual public dose limit in Part 20 of 1 millisievert (0.1 rem) TEDE. Alternative 2 would maintain the status quo of using the activity-based release criteria currently found in 10 CFR 35.75. Under Alternative 3, the NRC would revise the release criteria in 10 CFR 35.75 to specify a dose limit of 5 millisieverts (0.5 rem) TEDE.

  14. SU-E-J-27: Appropriateness Criteria for Deformable Image Registration and Dose Propagation

    SciTech Connect

    Papanikolaou, P; Tuohy, Rachel; Mavroidis, P; Eng, T; Gutierrez, A; Stathakis, S

    2014-06-01

    Purpose: Several commercial software packages have recently been released that allow the user to apply deformable registration algorithms (DRA) for image fusion and dose propagation. Although the idea of anatomically tracking the daily patient dose in the context of adaptive radiotherapy, or merely adding the dose from prior treatment to the current one, is very intuitive, the accuracy and applicability of such algorithms need to be investigated, as their evaluation remains somewhat subjective. In our study, we used true anatomical data in which we introduced changes in the density, volume and location of segmented structures to test the DRA for its sensitivity and accuracy. Methods: The CT scan of a prostate patient was selected for this study. The CT images were first segmented to define structures such as the PTV, bladder, rectum, intestines and pelvic bone anatomy. To perform our study, we introduced anatomical changes in the reference patient image set in three different ways: (i) we kept the segmented volumes constant and changed the density of the rectum and bladder in increments of 5%; (ii) we changed the volume of the rectum and bladder in increments of 5%; and (iii) we kept the segmented volumes constant but changed their location by moving their COM in increments of 3 mm. Using the Velocity software, we evaluated the accuracy of the DRA for each incremental change in all three scenarios. Results: The DRA performs reasonably well when the differential density difference against the background is more than 5%. For the volume change study, the DRA results became unreliable for relative volume changes greater than 10%. Finally, for the location study, the DRA performance was acceptable for shifts below 9 mm. Conclusion: Site-specific and patient-specific QA for DRA is an important step in evaluating such algorithms prior to their use for dose propagation.

  15. Image registration with uncertainty analysis

    DOEpatents

    Simonson, Katherine M [Cedar Crest, NM

    2011-03-22

    In an image registration method, edges are detected in a first image and a second image. A percentage of edge pixels in a subset of the second image that are also edges in the first image shifted by a translation is calculated. A best registration point is calculated based on a maximum percentage of edges matched. In a predefined search region, all registration points other than the best registration point are identified that are not significantly worse than the best registration point according to a predetermined statistical criterion.
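    A minimal NumPy sketch of the edge-matching idea described above; edge detection and the "not significantly worse" statistical test are omitted, and the exhaustive integer-shift search is an assumption made for illustration, not the patented procedure itself.

      import numpy as np

      def edge_match_fraction(edges_ref, edges_tgt, dy, dx):
          """Fraction of target edge pixels that are also edges in the reference
          image after shifting the reference by (dy, dx).
          Both inputs are boolean edge masks of the same shape."""
          shifted = np.roll(edges_ref, shift=(dy, dx), axis=(0, 1))
          n_edges = edges_tgt.sum()
          return (shifted & edges_tgt).sum() / n_edges if n_edges else 0.0

      def best_registration(edges_ref, edges_tgt, search=5):
          """Search integer translations in a small window and return the shift
          with the maximum percentage of matched edges."""
          best, best_shift = -1.0, (0, 0)
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  frac = edge_match_fraction(edges_ref, edges_tgt, dy, dx)
                  if frac > best:
                      best, best_shift = frac, (dy, dx)
          return best_shift, best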

  16. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of the physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended, but the analysis can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
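    As an illustration of the generic grain-statistics step (not a reproduction of the USGS FoveaPro/Photoshop workflow), the following sketch labels connected grains in a binary segmentation of a thin-section image and derives simple size statistics; the segmentation input is assumed to exist already.

      import numpy as np
      from scipy import ndimage

      def grain_statistics(binary_grains):
          """Label connected grains in a binary thin-section segmentation and
          return per-grain pixel areas and equivalent circular diameters."""
          labels, n = ndimage.label(binary_grains)
          areas = ndimage.sum(np.ones_like(labels, dtype=float), labels,
                              index=range(1, n + 1))        # pixel count per grain
          equivalent_diameters = 2.0 * np.sqrt(np.asarray(areas) / np.pi)
          return areas, equivalent_diameters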

  17. Image processing software for imaging spectrometry data analysis

    NASA Technical Reports Server (NTRS)

    Mazer, Alan; Martin, Miki; Lee, Meemong; Solomon, Jerry E.

    1988-01-01

    Imaging spectrometers simultaneously collect image data in hundreds of spectral channels, from the near-UV to the IR, and can thereby provide direct surface materials identification by means resembling laboratory reflectance spectroscopy. Attention is presently given to a software system, the Spectral Analysis Manager (SPAM) for the analysis of imaging spectrometer data. SPAM requires only modest computational resources and is composed of one main routine and a set of subroutine libraries. Additions and modifications are relatively easy, and special-purpose algorithms have been incorporated that are tailored to geological applications.

  18. Image processing software for imaging spectrometry data analysis

    NASA Astrophysics Data System (ADS)

    Mazer, Alan; Martin, Miki; Lee, Meemong; Solomon, Jerry E.

    1988-02-01

    Imaging spectrometers simultaneously collect image data in hundreds of spectral channels, from the near-UV to the IR, and can thereby provide direct surface materials identification by means resembling laboratory reflectance spectroscopy. Attention is presently given to a software system, the Spectral Analysis Manager (SPAM) for the analysis of imaging spectrometer data. SPAM requires only modest computational resources and is composed of one main routine and a set of subroutine libraries. Additions and modifications are relatively easy, and special-purpose algorithms have been incorporated that are tailored to geological applications.

  19. Regulatory analysis on criteria for the release of patients administered radioactive material

    SciTech Connect

    Schneider, S.; McGuire, S.A.; Behling, U.H.; Behling, K.; Goldin, D.

    1994-05-01

    The Nuclear Regulatory Commission (NRC) has received two petitions to amend its regulations in 10 CFR Parts 20 and 35 as they apply to doses received by members of the public exposed to patients released from a hospital after they have been administered radioactive material. While the two petitions are not identical they both request that the NRC establish a dose limit of 5 millisieverts (0.5 rem) per year for individuals exposed to patients who have been administered radioactive materials. This Regulatory Analysis evaluates three alternatives. Alternative 1 is for the NRC to amend its patient release criteria in 10 CFR 35.75 to use the more stringent dose limit of 1 millisievert per year in 10 CFR 20.1301(a) for its patient release criteria. Alternative 2 is for the NRC to continue using the existing patient release criteria in 10 CFR 35.75 of 1,110 megabecquerels of activity or a dose rate at one meter from the patient of 0.05 millisievert per hour. Alternative 3 is for the NRC to amend the patient release criteria in 10 CFR 35.75 to specify a dose limit of 5 millisieverts for patient release. The evaluation indicates that Alternative 1 would cause a prohibitively large increase in the national health care cost from retaining patients in a hospital longer and would cause significant personal and psychological costs to patients and their families. The choice of Alternatives 2 or 3 would affect only thyroid cancer patients treated with iodine-131. For those patients, Alternative 3 would result in less hospitalization than Alternative 2. Alternative 3 has a potential decrease in national health care cost of $30,000,000 per year but would increase the potential collective dose from released therapy patients by about 2,700 person-rem per year, mainly to family members.

  20. Imaging-based enrichment criteria using deep learning algorithms for efficient clinical trials in mild cognitive impairment.

    PubMed

    Ithapu, Vamsi K; Singh, Vikas; Okonkwo, Ozioma C; Chappell, Richard J; Dowling, N Maritza; Johnson, Sterling C

    2015-12-01

    The mild cognitive impairment (MCI) stage of Alzheimer's disease (AD) may be optimal for clinical trials to test potential treatments for preventing or delaying decline to dementia. However, MCI is heterogeneous in that not all cases progress to dementia within the time frame of a trial and some may not have underlying AD pathology. Identifying the MCI cases most likely to decline during a trial, and thus most likely to benefit from treatment, will improve trial efficiency and power to detect treatment effects. To this end, using multimodal, imaging-derived inclusion criteria may be especially beneficial. Here, we present a novel multimodal imaging marker that predicts future cognitive and neural decline from [F-18]fluorodeoxyglucose positron emission tomography (PET), amyloid florbetapir PET, and structural magnetic resonance imaging, based on a new deep learning algorithm (randomized denoising autoencoder marker, rDAm). Using ADNI2 MCI data, we show that using rDAm as a trial enrichment criterion reduces the required sample-size estimates by at least five times compared with the no-enrichment regime and leads to smaller trials with high statistical power, compared with existing methods.

  1. Information granules in image histogram analysis.

    PubMed

    Wieclawek, Wojciech

    2017-05-10

    A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this concept in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially medical images acquired by Computed Tomography (CT). Like the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of pixel intensities and is controlled by two parameters. Performance is tested on anonymized clinical CT series.
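    The granular method itself is not specified in enough detail to reproduce here, but the baseline it extends, histogram equalization restricted to a selected intensity range and controlled by two parameters, can be sketched as follows; the range bounds lo and hi are illustrative stand-ins for the two control parameters.

      import numpy as np

      def range_limited_equalization(img, lo, hi):
          """Equalize only the pixels whose intensity lies in [lo, hi];
          pixels outside the selected range are left unchanged."""
          out = img.copy()
          mask = (img >= lo) & (img <= hi)
          values = img[mask].astype(np.int64)
          if values.size == 0:
              return out
          hist = np.bincount(values - lo, minlength=hi - lo + 1)
          cdf = hist.cumsum() / values.size                 # CDF of in-range pixels only
          out[mask] = (lo + cdf[values - lo] * (hi - lo)).astype(img.dtype)
          return out

      # Example on a synthetic 8-bit "CT slice": enhance a single intensity band.
      img = np.random.randint(0, 256, size=(128, 128), dtype=np.uint8)
      enhanced = range_limited_equalization(img, lo=80, hi=160)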

  2. Optimal management of adults with pharyngitis – a multi-criteria decision analysis

    PubMed Central

    Singh, Sonal; Dolan, James G; Centor, Robert M

    2006-01-01

    Background Current practice guidelines offer different management recommendations for adults presenting with a sore throat. The key issue is the extent to which the clinical likelihood of a Group A streptococcal infection should affect patient management decisions. To help resolve this issue, we conducted a multi-criteria decision analysis using the Analytic Hierarchy Process. Methods We defined optimal patient management using four criteria: 1) reduce symptom duration; 2) prevent infectious complications, local and systemic; 3) minimize antibiotic side effects, minor and anaphylaxis; and 4) achieve prudent use of antibiotics, avoiding both over-use and under-use. In our baseline analysis we assumed that all criteria and sub-criteria were equally important except minimizing anaphylactic side effects, which was judged very strongly more important than minimizing minor side effects. Management strategies included: a) No test, No treatment; b) Perform a rapid strep test and treat if positive; c) Perform a throat culture and treat if positive; d) Perform a rapid strep test and treat if positive; if negative obtain a throat culture and treat if positive; and e) treat without further tests. We defined four scenarios based on the likelihood of group A streptococcal infection using the Centor score, a well-validated clinical index. Published data were used to estimate the likelihoods of clinical outcomes and the test operating characteristics of the rapid strep test and throat culture for identifying group A streptococcal infections. Results Using the baseline assumptions, no testing and no treatment is preferred for patients with Centor scores of 1; two strategies – culture and treat if positive and rapid strep with culture of negative results – are equally preferable for patients with Centor scores of 2; and rapid strep with culture of negative results is the best management strategy for patients with Centor scores 3 or 4. These results are sensitive to the
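    For reference, AHP criterion weights are typically derived from a reciprocal pairwise-comparison matrix via its principal eigenvector. The sketch below uses a hypothetical comparison matrix in which one criterion strongly dominates the others, loosely mirroring the baseline assumption above; the numbers are illustrative, not the study's judgments.

      import numpy as np

      def ahp_weights(pairwise):
          """Criterion weights from a reciprocal pairwise-comparison matrix,
          using the principal right eigenvector (Saaty's eigenvector method)."""
          eigvals, eigvecs = np.linalg.eig(pairwise)
          principal = eigvecs[:, eigvals.real.argmax()].real
          return principal / principal.sum()

      # Hypothetical 1-9 scale comparisons for four criteria: symptom duration,
      # complications, (anaphylactic) side effects, prudent antibiotic use.
      A = np.array([[1, 1, 1/5, 1],
                    [1, 1, 1/5, 1],
                    [5, 5, 1,   5],
                    [1, 1, 1/5, 1]], dtype=float)
      print(ahp_weights(A).round(3))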

  3. Multi-criteria analysis on how to select solar radiation hydrogen production system

    NASA Astrophysics Data System (ADS)

    Badea, G.; Naghiu, G. S.; Felseghi, R.-A.; Rǎboacǎ, S.; Aşchilean, I.; Giurca, I.

    2015-12-01

    The purpose of this article is to present a method of selecting hydrogen-production systems that use the electric power obtained from photovoltaic systems; as the selection method, we suggest the use of Advanced Multi-Criteria Analysis based on the FRISCO formula. According to the case study on selecting the solar radiation hydrogen production system, the most convenient option is alternative A4, namely the technical solution involving a hydrogen production system based on the electrolysis of water vapor obtained with concentrated solar thermal systems and electrical power obtained using concentrating photovoltaic systems.

  4. Multi-criteria analysis on how to select solar radiation hydrogen production system

    SciTech Connect

    Badea, G.; Naghiu, G. S. Felseghi, R.-A.; Giurca, I.; Răboacă, S.; Aşchilean, I.

    2015-12-23

    The purpose of this article is to present a method of selecting hydrogen-production systems that use the electric power obtained from photovoltaic systems; as the selection method, we suggest the use of Advanced Multi-Criteria Analysis based on the FRISCO formula. According to the case study on selecting the solar radiation hydrogen production system, the most convenient option is alternative A4, namely the technical solution involving a hydrogen production system based on the electrolysis of water vapor obtained with concentrated solar thermal systems and electrical power obtained using concentrating photovoltaic systems.

  5. Multiple Criteria and Multiple Periods Performance Analysis: The Comparison of North African Railways

    NASA Astrophysics Data System (ADS)

    Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.

    2008-10-01

    Multi-period differences in technical and financial performance are analysed by comparing five North African railways over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index for measuring total factor productivity change, decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the software ARGOS. These methods provide complementary, detailed information: the Malmquist index discriminates between technological and management progress, while PROMETHEE distinguishes two often conflicting dimensions of performance, namely service to the community and enterprise performance.
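    In the standard notation (not necessarily the authors' exact formulation), the output-oriented Malmquist total factor productivity index between periods t and t+1, built from distance functions D, decomposes into efficiency change (EC) and technological change (TC):

      M_{t,t+1} = \left[ \frac{D_t(x_{t+1},y_{t+1})}{D_t(x_t,y_t)} \cdot
                         \frac{D_{t+1}(x_{t+1},y_{t+1})}{D_{t+1}(x_t,y_t)} \right]^{1/2}
                = \underbrace{\frac{D_{t+1}(x_{t+1},y_{t+1})}{D_t(x_t,y_t)}}_{\mathrm{EC}}
                  \cdot
                  \underbrace{\left[ \frac{D_t(x_{t+1},y_{t+1})}{D_{t+1}(x_{t+1},y_{t+1})} \cdot
                                     \frac{D_t(x_t,y_t)}{D_{t+1}(x_t,y_t)} \right]^{1/2}}_{\mathrm{TC}}

    In PROMETHEE II, each alternative a is ranked by its net outranking flow \phi(a) = \phi^{+}(a) - \phi^{-}(a), the difference between how strongly a outranks the other alternatives and how strongly it is outranked by them.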

  6. Application of multi-criteria decision analysis in selecting of sustainable investments

    NASA Astrophysics Data System (ADS)

    Kozik, Renata

    2017-07-01

    Investors in construction projects, especially those financed with public money, are slow to adopt environmentally friendly solutions, e.g. passive buildings. Practice shows that the use of green public procurement among public investors is negligible. Energy-saving technologies and equipment are expensive at the construction phase, and investors take little or no account of future operating costs. The aim of this article is to apply the multi-criteria analysis method ELECTRE to select the best investment in terms of implementation cost, operating cost, and environmental impact.

  7. ALARA Analysis of Radiological Control Criteria Associated with Alternatives for Disposal of Hazardous Wastes

    SciTech Connect

    Aaberg, Rosanne L.; Bilyard, Gordon R.; Branch, Kristi M.; Lavender, Jay C.; Miller, Peter L.

    2002-05-15

    This ALARA analysis of Radiological Control Criteria (RCC) considers alternatives to continued storage of certain DOE mixed wastes. It also considers the option of treating hazardous wastes generated by DOE facilities, which have a very low concentration of radionuclide contaminants, as purely hazardous waste. Alternative allowable contaminant levels examined correspond to doses to an individual ranging from 0.01 mrem/yr to 10 to 20 mrem/yr. Generic waste inventory data and radionuclide source terms are used in the assessment. Economic issues, potential health and safety issues, and qualitative factors relating to the use of RCCs are considered.

  8. Multi-attribute criteria applied to electric generation energy system analysis LDRD.

    SciTech Connect

    Kuswa, Glenn W.; Tsao, Jeffrey Yeenien; Drennen, Thomas E.; Zuffranieri, Jason V.; Paananen, Orman Henrie; Jones, Scott A.; Ortner, Juergen G.; Brewer, Jeffrey D.; Valdez, Maximo M.

    2005-10-01

    This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories' multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, of their relative merits, and of the efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that allows comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.

  9. Multi-criteria decision analysis for the optimal management of nitrate contamination of aquifers.

    PubMed

    Almasri, Mohammad N; Kaluarachchi, Jagath J

    2005-03-01

    We present an integrated methodology for the optimal management of nitrate contamination of ground water, combining environmental assessment and economic cost evaluation through multi-criteria decision analysis. The proposed methodology incorporates an integrated physical modeling framework accounting for on-ground nitrogen loading and losses, soil nitrogen dynamics, and fate and transport of nitrate in ground water, to compute the sustainable on-ground nitrogen loading such that the maximum contaminant level is not violated. A number of protection alternatives for meeting the predicted sustainable on-ground nitrogen loading are evaluated using a decision analysis that employs the importance-order-of-criteria approach for ranking and selecting the protection alternatives. The methodology was successfully demonstrated for the Sumas-Blaine aquifer in Washington State. The results showed the importance of using this integrated approach, which predicts the sustainable on-ground nitrogen loadings and provides insight into the economic consequences of satisfying the environmental constraints. The results also show that the proposed decision analysis framework, within certain limitations, is effective when selecting alternatives with competing demands.

  10. Can imaging modalities be used as follow-up criteria after brucellar sacroiliitis treatment?

    PubMed

    Bilgeturk, Aybars; Gul, Hanefi Cem; Karakas, Ahmet; Mert, Gurkan; Artuk, Cumhur; Eyigun, Can Polat

    2017-02-28

    This study aimed to identify a follow-up modality that can be used to evaluate therapeutic responses in patients receiving treatment for brucellar sacroiliitis and to determine whether antibiotherapy can be stopped. A total of 32 patients with sacroiliac joint involvement demonstrated via magnetic resonance imaging (MRI) or bone scintigraphy were followed up and treated. Patients received 200 mg/day of doxycycline and 600-900 mg/day of rifampicin for 3-21 months, and 1 g/day of streptomycin for 21 days. The mean age of the 32 patients was 21.81±4.09 years. In total, 10/32 patients did not complete therapy, and the remaining 22 patients received combination antibiotic treatment for a mean of 8.95±4.34 months. Of the 22 patients, 15 underwent MRI, and 7 did not consent to MRI. Similarly, 17 patients were followed up by bone scintigraphy, and 5 patients did not have scintigraphy results. In 9/17 patients followed up with bone scintigraphy, sacroiliitis findings were found to diminish after a mean of 7.44±3.71 months, whereas in 12/15 patients on whom MRI was performed, there were no active sacroiliitis findings after a mean of 6.95±2.83 months. While active involvement persisted for a longer period on bone scintigraphy images, active sacroiliitis findings disappeared in a relatively shorter period of time with MRI. Therefore, we have demonstrated that high-resolution MRI is a very sensitive technique compared to scintigraphy.

  11. Multiscale Analysis of Solar Image Data

    NASA Astrophysics Data System (ADS)

    Young, C. A.; Myers, D. C.

    2001-12-01

    It is often said that the blessing and curse of solar physics is that there is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also cursed us with a larger amount of higher-complexity data than previous missions. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or a sequence of byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer. There is little quantitative and objective analysis done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods are possibly suited for solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales, so these techniques could be used to quantify the image processing done by the observer's eyes and brain. In this work we present a preliminary analysis of multiscale techniques applied to solar image data. Specifically, we explore the use of the 2-d wavelet transform and related transforms with EIT, LASCO and TRACE images. This work was supported by NASA contract NAS5-00220.
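    As one concrete example of the kind of multiscale quantification discussed above, the following sketch computes per-scale detail energy of an image with the 2-D discrete wavelet transform. It assumes the PyWavelets package is available; the choice of wavelet and the energy summary are illustrative, not the authors' pipeline.

      import numpy as np
      import pywt  # PyWavelets

      def multiscale_energy(image, wavelet="haar", levels=4):
          """Decompose an image with the 2-D discrete wavelet transform and
          return the total detail-coefficient energy at each scale."""
          coeffs = pywt.wavedec2(image, wavelet, level=levels)
          energies = []
          for detail in coeffs[1:]:                 # (horizontal, vertical, diagonal) bands
              energies.append(sum(float((band ** 2).sum()) for band in detail))
          return energies

      image = np.random.rand(512, 512)              # stand-in for an EIT/TRACE frame
      print(multiscale_energy(image))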

  12. A Mathematical Framework for Image Analysis

    DTIC Science & Technology

    1991-08-01

    The results reported here were derived from the research project 'A Mathematical Framework for Image Analysis' supported by the Office of Naval Research, contract N00014-88-K-0289 to Brown University. A common theme for the work reported is the use of probabilistic methods for problems in image analysis and image reconstruction. Five areas of research are described: rigid body recognition using a decision tree/combinatorial approach; nonrigid

  13. Image Reconstruction Using Analysis Model Prior

    PubMed Central

    Han, Yu; Du, Huiqian; Lam, Fan; Mei, Wenbo; Fang, Liping

    2016-01-01

    The analysis model has been previously exploited as an alternative to the classical sparse synthesis model for designing image reconstruction methods. Applying a suitable analysis operator on the image of interest yields a cosparse outcome which enables us to reconstruct the image from undersampled data. In this work, we introduce additional prior in the analysis context and theoretically study the uniqueness issues in terms of analysis operators in general position and the specific 2D finite difference operator. We establish bounds on the minimum measurement numbers which are lower than those in cases without using analysis model prior. Based on the idea of iterative cosupport detection (ICD), we develop a novel image reconstruction model and an effective algorithm, achieving significantly better reconstruction performance. Simulation results on synthetic and practical magnetic resonance (MR) images are also shown to illustrate our theoretical claims. PMID:27379171
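    In its generic form (without the additional prior or the iterative cosupport detection introduced in the paper), analysis-model reconstruction from undersampled measurements y = Mx solves

      \hat{x} = \arg\min_{x} \; \lVert \Omega x \rVert_{1}
                \quad \text{subject to} \quad \lVert M x - y \rVert_{2} \le \varepsilon,

    where \Omega is the analysis operator (for instance the 2-D finite-difference operator mentioned above), so that \Omega x is cosparse, i.e. has many zero entries.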

  14. Coastal flooding as a parameter in multi-criteria analysis for industrial site selection

    NASA Astrophysics Data System (ADS)

    Christina, C.; Memos, C.; Diakoulaki, D.

    2014-12-01

    Natural hazards can trigger major industrial accidents, which apart from affecting industrial installations may cause a series of accidents with serious impacts on human health and the environment far beyond the site boundary. Such accidents, also called Na-Tech (natural-technical) accidents, deserve particular attention since they can cause releases of hazardous substances possibly resulting in severe environmental pollution, explosions and/or fires. Industrial accidents have been caused by many kinds of natural events, such as landslides, hurricanes, high winds, tsunamis, lightning, extreme cold or heat, floods and heavy rains. The scope of this paper is to examine coastal flooding as a parameter capable of causing an industrial accident, such as the nuclear disaster in Fukushima, Japan, and the critical role of this parameter in industrial site selection. Land use planning is a complex procedure that requires multi-criteria decision analysis involving economic, environmental and social parameters. In this context, the parameter of natural hazard occurrence, such as coastal flooding, should be set by the decision makers for industrial site selection. In this paper we evaluate how the parameter of accident risk triggered by coastal flooding influences the outcome of a multi-criteria decision analysis for industrial spatial planning. Coastal flooding is analyzed in the context of both sea-induced and inland-induced flooding.

  15. An Item Response Theory Analysis of DSM–IV Personality Disorder Criteria Across Younger and Older Age Groups

    PubMed Central

    Balsis, Steve; Gleason, Marci E. J.; Woods, Carol M.; Oltmanns, Thomas F.

    2015-01-01

    Many of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM–IV; American Psychiatric Association, 1994) personality disorder (PD) diagnostic criteria focus on a younger social and occupational context. The absence of age-appropriate criteria for older adults forces researchers and clinicians to draw conclusions based on existing criteria, which are likely inadequate. To explore which DSM–IV PD criteria contain age group measurement bias, the authors report 2 analyses of data on nearly 37,000 participants, ages 18–98 years, taken from a public data set that includes 7 of the 10 PDs (antisocial, avoidant, dependent, histrionic, obsessive–compulsive, paranoid, and schizoid). The 1st analysis revealed that older age groups tend to endorse fewer PD criteria than younger age groups. The 2nd analysis revealed that 29% of the criteria contain measurement bias. Although the latent variable structure for each PD was quite similar across younger and older age groups, some individual criteria were differentially endorsed by younger and older adults with equivalent PD pathology. The presence of measurement bias for these criteria raises questions concerning the assessment of PDs in older adults and the interpretation of existing data. PMID:17385993
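    In the usual two-parameter logistic IRT formulation (the standard form for this kind of differential item functioning analysis, not necessarily the authors' exact parameterization), the probability that respondent i endorses PD criterion j is

      P(y_{ij} = 1 \mid \theta_i) = \frac{1}{1 + \exp\{-a_j(\theta_i - b_j)\}},

    where \theta_i is latent PD pathology, a_j the criterion's discrimination and b_j its severity threshold; age-group measurement bias is indicated when a_j or b_j differ between younger and older groups at equal \theta.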

  16. Image and flow cytometry: companion techniques for adherent and non-adherent cell analysis and sorting.

    PubMed

    Métézeau, P

    1993-01-01

    Flow cytometry (FCM) is an analytical and preparative technique, whereas image analysis has traditionally been applied only to cell analysis. Recently, image analysis has been adapted as a preparative method using a new technique: image cytometry for analysis and sorting (ICAS). FCM and ICAS are complementary. Flow cytometry allows rapid, quantitative and precise study of fluorescence and light scattering in a large number of cells in suspension, while ICAS analyses fewer cells (adherent cells or tissue) on the basis of fluorescence, morphology and size. ICAS can use these criteria to destroy unwanted cells and hence sort selected cells. ICAS can also be used for confocal microscopy and laser surgery.

  17. Merging Panchromatic and Multispectral Images for Enhanced Image Analysis

    DTIC Science & Technology

    1990-08-01

    [Only fragments of this record's text survive in the source: a thesis permission statement by Curtis K. Munechika to the Wallace Memorial Library of the Rochester Institute of Technology, and a confusion-matrix excerpt reporting an overall classification accuracy of 87.5%.]

  18. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill sitting in Kurdistan Province, western Iran.

    PubMed

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-10-01

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing of a significant amount of spatial information, which makes GIS an important tool for land use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. The purpose of GIS was to perform an initial screening process to eliminate unsuitable land, followed by utilization of a multi-criteria decision analysis (MCDA) to identify the most suitable sites using the information provided by the regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. In order to evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria derived from field observation. Using this suitability index, 15 different sites were visited and, based on the numerical evaluation provided by the MCDA, the most suitable sites were determined.
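    The weighted-overlay step can be sketched as follows; stacking the raster layers and taking a simple weighted sum is an illustrative reading of the general workflow, not the authors' exact suitability index.

      import numpy as np

      def suitability_index(layers, weights, exclusion_mask):
          """Weighted-overlay suitability: layers is a (k, rows, cols) stack of
          suitability scores for k non-exclusionary criteria, weights their
          relative importance, and exclusion_mask marks cells removed by the
          exclusionary screening."""
          weights = np.asarray(weights, dtype=float)
          weights = weights / weights.sum()
          index = np.tensordot(weights, layers, axes=1)   # weighted sum per cell
          index[exclusion_mask] = np.nan                  # drop excluded land
          return index

      # Tiny synthetic example: 3 criteria over a 4x4 raster.
      layers = np.random.rand(3, 4, 4)
      mask = np.zeros((4, 4), dtype=bool)
      mask[0, :] = True
      print(suitability_index(layers, [0.5, 0.3, 0.2], mask).round(2))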

  19. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill sitting in Kurdistan Province, western Iran

    SciTech Connect

    Sharifi, Mozafar Hadidi, Mosslem Vessali, Elahe Mosstafakhani, Parasto Taheri, Kamal Shahoie, Saber Khodamoradpour, Mehran

    2009-10-15

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing of a significant amount of spatial information, which makes GIS an important tool for land use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. The purpose of GIS was to perform an initial screening process to eliminate unsuitable land, followed by utilization of a multi-criteria decision analysis (MCDA) to identify the most suitable sites using the information provided by the regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. In order to evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria derived from field observation. Using this suitability index, 15 different sites were visited and, based on the numerical evaluation provided by the MCDA, the most suitable sites were determined.

  20. Program for Analysis and Enhancement of Images

    NASA Technical Reports Server (NTRS)

    Lu, Yun-Chi

    1987-01-01

    Land Analysis System (LAS) is a collection of image-analysis computer programs designed to manipulate and analyze multispectral image data. Provides the user with functions for ingesting various sensor data, radiometric and geometric corrections, image registration, training site selection, supervised and unsupervised classification, Fourier-domain filtering, and image enhancement. Sufficiently modular, and includes an extensive library of subroutines to permit inclusion of new algorithmic programs. The commercial package International Mathematical & Statistical Library (IMSL) is required for full implementation of LAS. Written in VAX FORTRAN 77, C, and Macro assembler for DEC VAX operating under VMS 4.0.

  1. Preferences for colorectal cancer screening techniques and intention to attend: a multi-criteria decision analysis.

    PubMed

    Hummel, J Marjan; Steuten, Lotte G M; Groothuis-Oudshoorn, C J M; Mulder, Nick; Ijzerman, Maarten J

    2013-10-01

    Despite the expected health benefits of colorectal cancer screening programs, participation rates remain low in countries that have implemented such a screening program. The perceived benefits and risks of the colorectal cancer screening technique are likely to influence the decision to attend the screening program. Besides the diagnostic accuracy and the risks of the screening technique, which can affect the health of the participants, additional factors, such as the burden of the test, may influence individuals' decisions to participate. To maximise the participation rate of a new colorectal cancer screening program in the Netherlands, it is important to know the preferences of the screening population for alternative screening techniques. The aim of this study was to explore the impact of preferences for particular attributes of the screening tests on the intention to attend a colorectal cancer screening program. We used a web-based questionnaire to elicit the preferences of the target population for a selection of colon-screening techniques. The target population consisted of Dutch men and women aged 55-75 years. The analytic hierarchy process (AHP), a technique for multi-criteria analysis, was used to estimate the colorectal cancer screening preferences. Respondents weighted the relevance of five criteria, i.e. the attributes of the screening techniques: sensitivity, specificity, safety, inconvenience, and frequency of the test. With regard to these criteria, preferences were estimated for four alternative screening techniques, namely immunochemical fecal occult blood test (iFOBT), colonoscopy, sigmoidoscopy, and computerized tomographic (CT) colonography. A five-point ordinal scale was used to estimate the respondents' intention to attend the screening. We conducted a correlation analysis on the preferences for the screening techniques and the intention to attend. We included 167 respondents who were consistent in their judgments of the

  2. On the predictive information criteria for model determination in seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data which are not independent given parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, where the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above-mentioned criteria are global summary measures of model performance, but a more detailed analysis may be required to discover the reasons for poor global performance. In this latter case, a retrospective predictive analysis is performed on each individual observation. In this study we performed a Bayesian analysis of Italian data sets using four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). We then illustrate their performance as evaluated by the Bayes factor, predictive information criteria and retrospective predictive analysis.
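    For reference, the widely applicable information criterion (WAIC) alluded to above can be computed from S posterior draws \theta^{(s)} as (one standard computational form, assuming the pointwise likelihood factorization is available):

      \mathrm{WAIC} = -2\left(\mathrm{lppd} - p_{\mathrm{WAIC}}\right), \qquad
      \mathrm{lppd} = \sum_{i=1}^{n} \log\!\left(\frac{1}{S}\sum_{s=1}^{S} p\big(y_i \mid \theta^{(s)}\big)\right), \qquad
      p_{\mathrm{WAIC}} = \sum_{i=1}^{n} \operatorname{Var}_{s}\!\left[\log p\big(y_i \mid \theta^{(s)}\big)\right].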

  3. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    PubMed

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as being "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method for applying formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
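    In its additive form (the most common MAUT aggregation; the specific form used by the framework is not stated in the abstract), each candidate data stream d is scored as

      U(d) = \sum_{i=1}^{k} w_i \, u_i\!\big(x_i(d)\big), \qquad \sum_{i=1}^{k} w_i = 1,

    where x_i(d) is the data stream's performance on criterion i, u_i a single-attribute utility function scaled to [0, 1], and w_i the criterion weight elicited from stakeholders; data streams are then ranked by U.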

  4. Assessing Interventions to Manage West Nile Virus Using Multi-Criteria Decision Analysis with Risk Scenarios.

    PubMed

    Hongoh, Valerie; Campagna, Céline; Panic, Mirna; Samuel, Onil; Gosselin, Pierre; Waaub, Jean-Philippe; Ravel, André; Samoura, Karim; Michel, Pascal

    2016-01-01

    The recent emergence of West Nile virus (WNV) in North America highlights vulnerability to climate sensitive diseases and stresses the importance of preventive efforts to reduce their public health impact. Effective prevention involves reducing environmental risk of exposure and increasing adoption of preventive behaviours, both of which depend on knowledge and acceptance of such measures. When making operational decisions about disease prevention and control, public health must take into account a wide range of operational, environmental, social and economic considerations in addition to intervention effectiveness. The current study aimed to identify, assess and rank possible risk reduction measures taking into account a broad set of criteria and perspectives applicable to the management of WNV in Quebec under increasing transmission risk scenarios, some of which may be related to ongoing warming in higher-latitude regions. A participatory approach was used to collect information on categories of concern to relevant stakeholders with respect to WNV prevention and control. Multi-criteria decision analysis was applied to examine stakeholder perspectives and their effect on strategy rankings under increasing transmission risk scenarios. Twenty-three preventive interventions were retained for evaluation using eighteen criteria identified by stakeholders. Combined evaluations revealed that, at an individual-level, inspecting window screen integrity, wearing light colored, long clothing, eliminating peridomestic larval sites and reducing outdoor activities at peak times were top interventions under six WNV transmission scenarios. At a regional-level, the use of larvicides was a preferred strategy in five out of six scenarios, while use of adulticides and dissemination of sterile male mosquitoes were found to be among the least favoured interventions in almost all scenarios. Our findings suggest that continued public health efforts aimed at reinforcing individual

  5. Behavior analysis of container ship in maritime accident in order to redefine the operating criteria

    NASA Astrophysics Data System (ADS)

    Ancuţa, C.; Stanca, C.; Andrei, C.; Acomi, N.

    2017-08-01

    In order to enhance the efficiency of maritime transport, container ship operators have proceeded to increase the size of their ships. The latest generation of ships in operation has a capacity of 19,000 TEU, and 21,000 TEU is in prospect within the next few years. The increasing size of container ships involves risks of maritime accidents. Nowadays, the general rules on operational security tend to be adjusted as a result of the evaluation of each vessel. To create the premises for making an informed decision, the captain has to be aware of the ship's behavior in such situations. No less important is ensuring a permanent review of the procedures for operating the ship, including the specific procedures for special areas, confined waters or traffic separation schemes. This paper aims at analysing the behavior of the vessel and the response of the structure of a container ship in a maritime accident, in order to redefine the operating criteria. The method selected by the authors for carrying out the research is computer simulation: the computer program provides the responses of the container ship model in various situations. The simulations therefore allow acquisition of a large range of data with the aim of improving the prevention of accidents or mitigating their effects as much as possible. Simulations and assessments of certain situations that the ship might experience will be carried out to redefine the operating criteria. The envisaged scenarios are: introducing a maneuvering speed for specific areas with a high risk of collision or grounding, introducing flooding scenarios for some compartments in loading programs, and conducting complex simulations of various situations for each vessel type. The main results of this work are documented proposals for operating criteria, intended to improve safety in the case of marine accidents, collisions and groundings. Introducing such measures requires a complex cost-benefit analysis that should not neglect the extreme economic impact

  6. Selecting Essential Information for Biosurveillance—A Multi-Criteria Decision Analysis

    PubMed Central

    Generous, Nicholas; Margevicius, Kristen J.; Taylor-McCabe, Kirsten J.; Brown, Mac; Daniel, W. Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as “the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels.” However, the strategy does not specify how “essential information” is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as being “essential”. The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method for applying formal scientific decision theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of “essential information” for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system. PMID:24489748

  7. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases

    PubMed Central

    2011-01-01

    The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular

  8. Assessing Interventions to Manage West Nile Virus Using Multi-Criteria Decision Analysis with Risk Scenarios

    PubMed Central

    Hongoh, Valerie; Campagna, Céline; Panic, Mirna; Samuel, Onil; Gosselin, Pierre; Waaub, Jean-Philippe; Ravel, André; Samoura, Karim; Michel, Pascal

    2016-01-01

    The recent emergence of West Nile virus (WNV) in North America highlights vulnerability to climate sensitive diseases and stresses the importance of preventive efforts to reduce their public health impact. Effective prevention involves reducing environmental risk of exposure and increasing adoption of preventive behaviours, both of which depend on knowledge and acceptance of such measures. When making operational decisions about disease prevention and control, public health must take into account a wide range of operational, environmental, social and economic considerations in addition to intervention effectiveness. The current study aimed to identify, assess and rank possible risk reduction measures taking into account a broad set of criteria and perspectives applicable to the management of WNV in Quebec under increasing transmission risk scenarios, some of which may be related to ongoing warming in higher-latitude regions. A participatory approach was used to collect information on categories of concern to relevant stakeholders with respect to WNV prevention and control. Multi-criteria decision analysis was applied to examine stakeholder perspectives and their effect on strategy rankings under increasing transmission risk scenarios. Twenty-three preventive interventions were retained for evaluation using eighteen criteria identified by stakeholders. Combined evaluations revealed that, at an individual-level, inspecting window screen integrity, wearing light colored, long clothing, eliminating peridomestic larval sites and reducing outdoor activities at peak times were top interventions under six WNV transmission scenarios. At a regional-level, the use of larvicides was a preferred strategy in five out of six scenarios, while use of adulticides and dissemination of sterile male mosquitoes were found to be among the least favoured interventions in almost all scenarios. Our findings suggest that continued public health efforts aimed at reinforcing individual

  9. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases.

    PubMed

    Hongoh, Valerie; Hoen, Anne Gatewood; Aenishaenslin, Cécile; Waaub, Jean-Philippe; Bélanger, Denise; Michel, Pascal

    2011-12-29

    The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular.

  10. Optical Analysis of Microscope Images

    NASA Astrophysics Data System (ADS)

    Biles, Jonathan R.

    Microscope images were analyzed with coherent and incoherent light using analog optical techniques. These techniques were found to be useful for analyzing large numbers of nonsymbolic, statistical microscope images. In the first part, phase-coherent transparencies containing 20-100 human multiple myeloma nuclei were simultaneously photographed at 100x magnification using high-resolution holographic film developed to high contrast. An optical transform was obtained by focussing the laser onto each nuclear image and allowing the diffracted light to propagate onto a one-dimensional photosensor array. This method reduced the data to the positions of the first two intensity minima and the intensities of successive maxima. These values were utilized to estimate the four most important cancer detection clues of nuclear size, shape, darkness, and chromatin texture. In the second part, the geometric and holographic methods of phase-incoherent optical processing were investigated for pattern recognition of real-time, diffuse microscope images. The theory and implementation of these processors were discussed in view of their mutual problems of dimness, image bias, and detector resolution. The dimness problem was solved by either using a holographic correlator or a speckle-free laser microscope. The latter was built using a spinning tilted mirror, which caused the speckle to change so quickly that it averaged out during the exposure. To solve the bias problem, low-image-bias templates were generated by four techniques: microphotography of samples, creation of typical shapes with a computer graphics editor, transmission holography of photoplates of samples, and spatially coherent color image bias removal. The first of these templates was used to perform correlations with bacteria images. The aperture bias was successfully removed from the correlation with a video frame subtractor. To overcome the limited detector resolution it is necessary to discover some analog nonlinear intensity

  11. Validation and optimization of criteria for manual smear review following automated blood cell analysis in a large university hospital.

    PubMed

    Pratumvinit, Busadee; Wongkrajang, Preechaya; Reesukumal, Kanit; Klinbua, Cherdsak; Niamjoy, Patama

    2013-03-01

    Each laboratory should have criteria for manual smear review that limit workload without affecting patient care. The International Consensus Group for Hematology Review established guidelines for action after automated blood cell analysis in 2005. To compare the consensus group criteria with our laboratory criteria and optimize them for better efficiency. A total of 2114 first-time samples were collected consecutively from daily workload and were used to compare 2 criteria as well as establish the optimized criteria. Another set of 891 samples was used to validate the optimized criteria. All samples were run on either Sysmex XE-5000 or Coulter LH750 hematology analyzers and were investigated by manual smear review. The efficiency of each set of criteria was compared and optimized to obtain better efficiency, an acceptable review rate, and a low false-negative rate. From 2114 samples, 368 (17.40%) had positive smear results. Compared with that of our laboratory criteria, the efficiency of the consensus group criteria was higher (83.63% versus 78.86%, P < .001), the review rate was higher (29.33% versus 22.37%, P < .001), and the false-negative rate was lower (2.22% versus 8.09%, P < .001). After optimizing the rules, we obtained an efficiency of 87.13%, a review rate of 24.22%, and a false-negative rate of 2.98%. We validated the optimized criteria with another set of samples, and the efficiency, review rate, and false-negative rate were 87.32%, 25.25%, and 1.12%, respectively. Each laboratory should verify the criteria for smear review, based on the International Consensus Group for Hematology Review, and optimize them to maximize efficiency.
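    Efficiency, review rate and false-negative rate can be computed from a 2x2 cross-tabulation of the criteria outcome (flag for review or not) against the manual smear result (positive or not). The convention below (all rates expressed against the total sample count) and the example counts are assumptions for illustration, not the paper's data.

      def smear_review_metrics(tp, fp, fn, tn):
          """tp: flagged and smear-positive; fp: flagged but smear-negative;
          fn: not flagged although smear-positive; tn: not flagged, smear-negative."""
          n = tp + fp + fn + tn
          efficiency = (tp + tn) / n          # correctly triaged samples
          review_rate = (tp + fp) / n         # fraction sent for manual review
          false_negative_rate = fn / n        # positive smears missed by the criteria
          return efficiency, review_rate, false_negative_rate

      # Hypothetical counts for a 2114-sample evaluation set.
      print(smear_review_metrics(tp=320, fp=200, fn=48, tn=1546))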

  12. Item Response Theory Analysis of DSM-IV Cannabis Abuse and Dependence Criteria in Adolescents

    ERIC Educational Resources Information Center

    Hartman, Christie A.; Gelhorn, Heather; Crowley, Thomas J.; Sakai, Joseph T.; Stallings, Michael; Young, Susan E.; Rhee, Soo Hyun; Corley, Robin; Hewitt, John K.; Hopfer, Christian J.

    2008-01-01

    A study examining the DSM-IV criteria for cannabis abuse and dependence among adolescents is reported. The results indicate that the abuse and dependence criteria were not found to reflect different levels of severity of cannabis use.

  13. Item Response Theory Analysis of DSM-IV Cannabis Abuse and Dependence Criteria in Adolescents

    ERIC Educational Resources Information Center

    Hartman, Christie A.; Gelhorn, Heather; Crowley, Thomas J.; Sakai, Joseph T.; Stallings, Michael; Young, Susan E.; Rhee, Soo Hyun; Corley, Robin; Hewitt, John K.; Hopfer, Christian J.

    2008-01-01

    A study examining the DSM-IV criteria for cannabis abuse and dependence among adolescents is reported. The results indicate that the abuse and dependence criteria were not found to reflect different levels of severity of cannabis use.

  14. Interpretations of legal criteria for involuntary psychiatric admission: a qualitative analysis.

    PubMed

    Feiring, Eli; Ugstad, Kristian N

    2014-10-25

    The use of involuntary admission in psychiatry may be necessary to enable treatment and prevent harm, yet it remains controversial. Mental health laws in high-income countries typically permit coercive treatment of persons with mental disorders to restore health or prevent future harm. Criteria intended to regulate practice leave scope for discretion. The values and beliefs of staff may therefore become a determining factor in decisions. Previous research has only to a limited degree addressed how legal criteria for involuntary psychiatric admission are interpreted by clinical decision-makers. We examined clinicians' interpretations of criteria for involuntary admission under the Norwegian Mental Health Care Act. This act applies a status approach, whereby involuntary admission can be used in the presence of a mental disorder and a need for treatment or a perceived risk to the patient or others. Further, best-interest assessments carry a large justificatory burden and open the way for a range of extra-legislative factors to be considered. Deductive thematic analysis was used. Three ideal types of attitudes to coercion were developed, denoted paternalistic, deliberative and interpretive. Semi-structured, in-depth interviews with 10 Norwegian clinicians with experience of admissions to psychiatric care were carried out. Data were fitted into the preconceived analytical frame. We hypothesised that the data would mirror the recent shift from paternalism towards a more human-rights-focused approach in modern mental health care. The paternalistic perspective was, however, clearly expressed in the data. Involuntary admission was considered to be in the patient's best interest, and patients suffering from serious mental disorder were assumed to lack decision-making capacity. In addition to assessments of need, outcome effectiveness and risk of harm, extra-legislative factors such as patients' functioning, experience, resistance, networks, and follow-up options were said to influence decisions.

  15. Scale-Specific Multifractal Medical Image Analysis

    PubMed Central

    Braverman, Boris

    2013-01-01

    Fractal geometry has been applied widely in the analysis of medical images to characterize the irregular complex tissue structures that do not lend themselves to straightforward analysis with traditional Euclidean geometry. In this study, we treat the nonfractal behaviour of medical images over large-scale ranges by considering their box-counting fractal dimension as a scale-dependent parameter rather than a single number. We describe this approach in the context of the more generalized Rényi entropy, in which we can also compute the information and correlation dimensions of images. In addition, we describe and validate a computational improvement to box-counting fractal analysis. This improvement is based on integral images, which allows the speedup of any box-counting or similar fractal analysis algorithm, including estimation of scale-dependent dimensions. Finally, we applied our technique to images of invasive breast cancer tissue from 157 patients to show a relationship between the fractal analysis of these images over certain scale ranges and pathologic tumour grade (a standard prognosticator for breast cancer). Our approach is general and can be applied to any medical imaging application in which the complexity of pathological image structures may have clinical value. PMID:24023588
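
    As a rough illustration of the scale-dependent view of the box-counting dimension described above, the sketch below counts occupied boxes at several box sizes on a toy binary image and reports both the global slope and the local (between-scale) slopes. It is a plain implementation for illustration; the integral-image speedup mentioned in the abstract is not reproduced here.

```python
import numpy as np

def box_counts(mask, sizes):
    """Count boxes of each size that contain at least one foreground pixel."""
    counts = []
    h, w = mask.shape
    for s in sizes:
        # pad so the image tiles evenly into s x s boxes
        H, W = int(np.ceil(h / s)) * s, int(np.ceil(w / s)) * s
        padded = np.zeros((H, W), dtype=bool)
        padded[:h, :w] = mask
        boxes = padded.reshape(H // s, s, W // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    return np.array(counts)

# Toy binary image: sparse noise plus a diagonal line.
rng = np.random.default_rng(0)
img = rng.random((256, 256)) < 0.02
np.fill_diagonal(img, True)

sizes = np.array([2, 4, 8, 16, 32, 64])
counts = box_counts(img, sizes)

# Global box-counting dimension: slope of log N(s) against log(1/s).
slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
print("box-counting dimension estimate:", slope)

# Scale-dependent view: local slope between successive box sizes.
local = np.diff(np.log(counts)) / np.diff(np.log(1.0 / sizes))
print("scale-dependent estimates:", local)
```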

  16. Identification of foot and mouth disease risk areas using a multi-criteria analysis approach

    PubMed Central

    Silva, Gustavo Sousa e; Weber, Eliseu José; Hasenack, Heinrich; Groff, Fernando Henrique Sautter; Todeschini, Bernardo; Borba, Mauro Riegert; Medeiros, Antonio Augusto Rosa; Leotti, Vanessa Bielefeldt; Canal, Cláudio Wageck; Corbellini, Luis Gustavo

    2017-01-01

    Foot and mouth disease (FMD) is a highly infectious disease that affects cloven-hoofed livestock and wildlife. FMD has been a problem for decades, which has led to various measures to control, eradicate and prevent FMD by National Veterinary Services worldwide. Currently, the identification of areas that are at risk of FMD virus incursion and spread is a priority for FMD targeted surveillance after FMD is eradicated from a given country or region. In our study, a knowledge-driven spatial model was built to identify risk areas for FMD occurrence and to evaluate FMD surveillance performance in Rio Grande do Sul state, Brazil. For this purpose, multi-criteria decision analysis was used as a tool to weigh multiple and conflicting criteria to determine a preferred course of action. Thirteen South American experts analyzed 18 variables associated with FMD introduction and dissemination pathways in Rio Grande do Sul. As a result, higher-risk FMD areas were identified at international borders and in the central region of the state. The final model was expressed as a raster surface. The predictive ability of the model was assessed by comparing, for each cell of the raster surface, the computed model risk scores with a binary variable representing the presence or absence of an FMD outbreak in that cell during the period 1985 to 2015. Current FMD surveillance performance was assessed, and recommendations were made to improve surveillance activities in critical areas. PMID:28552973
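
    The knowledge-driven raster model described above is essentially a weighted overlay of expert-scored criteria. The sketch below shows such an overlay with made-up criterion rasters and weights; the variable names and weights are assumptions for illustration, not values from the study.

```python
import numpy as np

def normalise(raster):
    """Rescale a criterion raster to 0-1 so criteria are comparable."""
    r = raster.astype(float)
    return (r - r.min()) / (r.max() - r.min())

def weighted_overlay(criteria, weights):
    """Knowledge-driven risk surface: weighted sum of normalised criterion rasters."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()            # expert-elicited weights, rescaled to sum to 1
    stack = np.stack([normalise(c) for c in criteria])
    return np.tensordot(weights, stack, axes=1)  # cell-by-cell weighted sum

# Hypothetical 100x100 criterion rasters (e.g. pig density, border proximity, animal movements).
rng = np.random.default_rng(1)
pig_density = rng.gamma(2.0, 1.0, (100, 100))
border_proximity = rng.random((100, 100))
movements = rng.poisson(3, (100, 100))

risk = weighted_overlay([pig_density, border_proximity, movements], weights=[0.5, 0.3, 0.2])
print(risk.shape, risk.min(), risk.max())
```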

  17. Rasch analysis of alcohol abuse and dependence diagnostic criteria in persons with spinal cord injury.

    PubMed

    Reslan, S; Kalpakjian, C Z; Hanks, R A; Millis, S R; Bombardier, C H

    2017-05-01

    Cross-sectional. The objective of the study is to examine whether alcohol use disorders should be conceptualized categorically as abuse and dependence as in the 'Diagnostic and Statistical Manual of Mental Disorders' 4th edition or on a single continuum with mild to severe category ratings as in the 'Diagnostic and Statistical Manual of Mental Disorders' 5th edition in people with spinal cord injury (SCI). United States of America. Data from 379 individuals who sustained SCI either traumatically or non-traumatically after the age of 18 and were at least 1 year post injury. Rasch analyses used the alcohol abuse and dependence modules of the Structured Clinical Interview for DSM-IV-TR Axis I Disorders Non-patient Edition (SCID-I/NP). Fifty-seven percent (n=166) of the entire sample endorsed criteria for alcohol abuse, and 25% (n=65) endorsed criteria for alcohol dependence. Fit values were generally acceptable except for one item (for example, alcohol abuse criterion 2), suggesting that the items fit the expectation of unidimensionality. Examination of the principal components analysis did not provide support for unidimensionality. The item-person map illustrates poor targeting of items. Alcohol abuse and dependence criteria appear to reflect a unidimensional construct, a finding that supports a single latent construct or factor consistent with the DSM-5 diagnostic model.

  18. Multi-criteria decision analysis for waste management in Saharawi refugee camps.

    PubMed

    Garfì, M; Tondelli, S; Bonoli, A

    2009-10-01

    The aim of this paper is to compare different waste management solutions in Saharawi refugee camps (Algeria) and to test the feasibility of a decision-making method developed to be applied in particular conditions in which environmental and social aspects must be considered. It is based on multi criteria analysis, and in particular on the analytic hierarchy process (AHP), a mathematical technique for multi-criteria decision making (Saaty, T.L., 1980. The Analytic Hierarchy Process. McGraw-Hill, New York, USA; Saaty, T.L., 1990. How to Make a Decision: The Analytic Hierarchy Process. European Journal of Operational Research; Saaty, T.L., 1994. Decision Making for Leaders: The Analytic Hierarchy Process in a Complex World. RWS Publications, Pittsburgh, PA), and on participatory approach, focusing on local community's concerns. The research compares four different waste collection and management alternatives: waste collection by using three tipper trucks, disposal and burning in an open area; waste collection by using seven dumpers and disposal in a landfill; waste collection by using seven dumpers and three tipper trucks and disposal in a landfill; waste collection by using three tipper trucks and disposal in a landfill. The results show that the second and the third solutions provide better scenarios for waste management. Furthermore, the discussion of the results points out the multidisciplinarity of the approach, and the equilibrium between social, environmental and technical impacts. This is a very important aspect in a humanitarian and environmental project, confirming the appropriateness of the chosen method.
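
    The analytic hierarchy process mentioned above derives criterion weights from a pairwise comparison matrix. The sketch below computes the priority vector from the principal eigenvector together with a consistency ratio, following the standard AHP procedure; the example judgement matrix is invented and not taken from the paper.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority weights and consistency ratio for an AHP pairwise comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                              # normalised priority vector
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                 # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n] # Saaty's random index (selected sizes)
    return w, ci / ri                            # weights and consistency ratio (CR < 0.1 is acceptable)

# Illustrative judgements for 3 criteria: environmental, social, technical.
pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
weights, cr = ahp_priorities(pairwise)
print("weights:", weights.round(3), "CR:", round(cr, 3))
```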

  19. Analysis of court criteria for awarding disability benefits to patients with Crohn's disease.

    PubMed

    Calvet, Xavier; Motos, Jaime; Montserrat, Antònia; Gallardo, Olga; Vergara, Mercedes

    2009-12-01

    Chronic disability and its consequences for social life and employment are important but often neglected aspects of Crohn's disease. No specific scores have been developed to evaluate chronic disability in patients with Crohn's disease; the medical criteria used by government authorities to award disability benefits have not been analyzed. We aimed to determine the courts' criteria for awarding disability benefits to patients with Crohn's disease in Spain. We systematically searched case law databases in Spain's regional Supreme Courts to identify sentences regarding awards of disability benefits to patients with Crohn's disease. Selected decisions were reviewed to extract variables related to the awarding of benefits. Univariate and multivariate analyses were performed to determine which variables predicted the awarding of benefits. Two hundred eighty sentences were reviewed. The rate of judicial decisions in favor of the claimants varied considerably between the various tribunals. Multivariate analysis showed that adequate description of the disease (odds ratio, 8.6), fecal incontinence (odds ratio, 8.9), the number of associated diseases (odds ratio, 2.3), and the presence of an ostomy (odds ratio, not estimable) were independent predictors of the awarding of Social Security benefits. The amount of Social Security benefits awarded to patients with Crohn's disease varied depending on the tribunal. The most important predictors of a court's disability award were the adequate description of the patient's disease, fecal incontinence, associated diseases, and presence of an ostomy.

  20. A multi-criteria decision analysis assessment of waste paper management options

    SciTech Connect

    Hanan, Deirdre; Burnley, Stephen; Cooke, David

    2013-03-15

    Highlights: Isolated communities have particular problems in terms of waste management. An MCDA tool allowed a group of non-experts to evaluate waste management options. The group preferred local waste management solutions to export to the mainland. Gasification of paper was the preferred option, followed by recycling. The group concluded that they could be involved in the decision making process. Abstract: The use of Multi-criteria Decision Analysis (MCDA) was investigated in an exercise using a panel of local residents and stakeholders to assess the options for managing waste paper on the Isle of Wight. Seven recycling, recovery and disposal options were considered by the panel, who evaluated each option against seven environmental, financial and social criteria. The panel preferred options where the waste was managed on the island, with gasification and recycling achieving the highest scores. Exporting the waste to the English mainland for incineration or landfill proved to be the least preferred options. This research has demonstrated that MCDA is an effective way of involving community groups in waste management decision making.

  1. Multi-criteria decision analysis and environmental risk assessment for nanomaterials

    NASA Astrophysics Data System (ADS)

    Linkov, Igor; Satterstrom, F. Kyle; Steevens, Jeffery; Ferguson, Elizabeth; Pleus, Richard C.

    2007-08-01

    Nanotechnology is a broad and complex discipline that holds great promise for innovations that can benefit mankind. Yet, one must not overlook the wide array of factors involved in managing nanomaterial development, ranging from the technical specifications of the material to possible adverse effects in humans. Other opportunities to evaluate benefits and risks are inherent in environmental health and safety (EHS) issues related to nanotechnology. However, there is currently no structured approach for making justifiable and transparent decisions with explicit trade-offs between the many factors that need to be taken into account. While many possible decision-making approaches exist, we believe that multi-criteria decision analysis (MCDA) is a powerful and scientifically sound decision analytical framework for nanomaterial risk assessment and management. This paper combines state-of-the-art research in MCDA methods applicable to nanotechnology with a hypothetical case study for nanomaterial management. The example shows how MCDA application can balance societal benefits against unintended side effects and risks, and how it can also bring together multiple lines of evidence to estimate the likely toxicity and risk of nanomaterials given limited information on physical and chemical properties. The essential contribution of MCDA is to link this performance information with decision criteria and weightings elicited from scientists and managers, allowing visualization and quantification of the trade-offs involved in the decision-making process.

  2. Net clinical benefit of oral anticoagulants: a multiple criteria decision analysis.

    PubMed

    Hsu, Jason C; Hsieh, Cheng-Yang; Yang, Yea-Huei Kao; Lu, Christine Y

    2015-01-01

    This study quantitatively evaluated the comparative efficacy and safety of new oral anticoagulants (dabigatran, rivaroxaban, and apixaban) and warfarin for treatment of nonvalvular atrial fibrillation. We also compared these agents under different scenarios, including populations at high risk of stroke and primary vs. secondary stroke prevention. We used multiple criteria decision analysis (MCDA) to assess the benefit-risk of these medications. Our MCDA models contained criteria for benefits (prevention of ischemic stroke and systemic embolism) and risks (intracranial and extracranial bleeding). We calculated a performance score for each drug accounting for benefits and risks in comparison to treatment alternatives. Overall, new agents had higher performance scores than warfarin; in order of performance scores: dabigatran 150 mg (0.529), rivaroxaban (0.462), apixaban (0.426), and warfarin (0.191). For patients at a higher risk of stroke (CHADS2 score≥3), apixaban had the highest performance score (0.686); performance scores for other drugs were 0.462 for dabigatran 150 mg, 0.392 for dabigatran 110 mg, 0.271 for rivaroxaban, and 0.116 for warfarin. Dabigatran 150 mg had the highest performance score for primary stroke prevention, while dabigatran 110 mg had the highest performance score for secondary prevention. Our results suggest that new oral anticoagulants might be preferred over warfarin. Selecting appropriate medicines according to the patient's condition based on information from an integrated benefit-risk assessment of treatment options is crucial to achieve optimal clinical outcomes.
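
    A minimal sketch of the weighted-sum idea behind such MCDA performance scores is given below. The event rates, weights, and scaling are illustrative assumptions only; the study's actual model structure and elicited weights are not reproduced.

```python
import numpy as np

def performance_scores(values, weights, higher_is_better):
    """Weighted-sum MCDA score after linearly rescaling each criterion across alternatives."""
    V = np.asarray(values, dtype=float)          # rows: alternatives, cols: criteria
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    scaled = np.empty_like(V)
    for j in range(V.shape[1]):
        col = V[:, j]
        s = (col - col.min()) / (col.max() - col.min())
        scaled[:, j] = s if higher_is_better[j] else 1.0 - s
    return scaled @ w

# Hypothetical event rates per 100 patient-years: stroke/embolism,
# intracranial bleeding, extracranial bleeding (all "lower is better").
drugs = ["warfarin", "dabigatran 150 mg", "rivaroxaban", "apixaban"]
rates = [[1.7, 0.7, 3.4],
         [1.1, 0.3, 3.1],
         [1.7, 0.5, 3.6],
         [1.3, 0.3, 2.1]]
scores = performance_scores(rates, weights=[0.5, 0.3, 0.2],
                            higher_is_better=[False, False, False])
for d, s in sorted(zip(drugs, scores), key=lambda t: -t[1]):
    print(f"{d}: {s:.3f}")
```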

  3. Multi-criteria decision analysis for waste management in Saharawi refugee camps

    SciTech Connect

    Garfi, M. Tondelli, S.; Bonoli, A.

    2009-10-15

    The aim of this paper is to compare different waste management solutions in Saharawi refugee camps (Algeria) and to test the feasibility of a decision-making method developed to be applied in particular conditions in which environmental and social aspects must be considered. It is based on multi criteria analysis, and in particular on the analytic hierarchy process (AHP), a mathematical technique for multi-criteria decision making (Saaty, T.L., 1980. The Analytic Hierarchy Process. McGraw-Hill, New York, USA; Saaty, T.L., 1990. How to Make a Decision: The Analytic Hierarchy Process. European Journal of Operational Research; Saaty, T.L., 1994. Decision Making for Leaders: The Analytic Hierarchy Process in a Complex World. RWS Publications, Pittsburgh, PA), and on participatory approach, focusing on local community's concerns. The research compares four different waste collection and management alternatives: waste collection by using three tipper trucks, disposal and burning in an open area; waste collection by using seven dumpers and disposal in a landfill; waste collection by using seven dumpers and three tipper trucks and disposal in a landfill; waste collection by using three tipper trucks and disposal in a landfill. The results show that the second and the third solutions provide better scenarios for waste management. Furthermore, the discussion of the results points out the multidisciplinarity of the approach, and the equilibrium between social, environmental and technical impacts. This is a very important aspect in a humanitarian and environmental project, confirming the appropriateness of the chosen method.

  4. Comparative Analysis of Thermoeconomic Evaluation Criteria for an Actual Heat Engine

    NASA Astrophysics Data System (ADS)

    Özel, Gülcan; Açıkkalp, Emin; Savaş, Ahmet Fevzi; Yamık, Hasan

    2016-07-01

    In the present study, an actual heat engine is investigated using different thermoeconomic evaluation criteria from the literature. A criterion that has not previously been investigated in detail is considered; it is called the ecologico-economic criterion (F_{EC}) and is defined as the difference between the power cost and the exergy destruction rate cost of the system. All four criteria are applied to an irreversible Carnot heat engine; results are presented numerically, and some suggestions are made.
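
    The abstract defines F_{EC} only in words. Assuming \dot{W} denotes the power output, \dot{E}_D the exergy destruction rate, and c_W, c_D their respective unit costs, the verbal definition corresponds to the expression sketched below (notation assumed, not taken from the paper).

```latex
% Assumed notation: \dot{W} = power output, \dot{E}_D = exergy destruction rate,
% c_W, c_D = corresponding unit costs. The paper defines F_{EC} only verbally.
F_{EC} \;=\; c_W\,\dot{W} \;-\; c_D\,\dot{E}_D
```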

  5. Spatial multi-criteria decision analysis to predict suitability for African swine fever endemicity in Africa

    PubMed Central

    2014-01-01

    Background African swine fever (ASF) is endemic in several countries of Africa and may pose a risk to all pig producing areas on the continent. Official ASF reporting is often rare and there remains limited awareness of the continent-wide distribution of the disease. In the absence of accurate ASF outbreak data and few quantitative studies on the epidemiology of the disease in Africa, we used spatial multi-criteria decision analysis (MCDA) to derive predictions of the continental distribution of suitability for ASF persistence in domestic pig populations as part of sylvatic or domestic transmission cycles. In order to incorporate the uncertainty in the relative importance of different criteria in defining suitability, we modelled decisions within the MCDA framework using a stochastic approach. The predictive performance of suitability estimates was assessed via a partial ROC analysis using ASF outbreak data reported to the OIE since 2005. Results Outputs from the spatial MCDA indicate that large areas of sub-Saharan Africa may be suitable for ASF persistence as part of either domestic or sylvatic transmission cycles. Areas with high suitability for pig to pig transmission (‘domestic cycles’) were estimated to occur throughout sub-Saharan Africa, whilst areas with high suitability for introduction from wildlife reservoirs (‘sylvatic cycles’) were found predominantly in East, Central and Southern Africa. Based on average AUC ratios from the partial ROC analysis, the predictive ability of suitability estimates for domestic cycles alone was considerably higher than suitability estimates for sylvatic cycles alone, or domestic and sylvatic cycles in combination. Conclusions This study provides the first standardised estimates of the distribution of suitability for ASF transmission associated with domestic and sylvatic cycles in Africa. We provide further evidence for the utility of knowledge-driven risk mapping in animal health, particularly in data

  6. A fully multiple-criteria implementation of the Sobol‧ method for parameter sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Rosolem, Rafael; Gupta, Hoshin V.; Shuttleworth, W. James; Zeng, Xubin; de Gonçalves, Luis Gustavo Gonçalves

    2012-04-01

    We present a novel rank-based fully multiple-criteria implementation of the Sobol' variance-based sensitivity analysis approach that implements an objective strategy to evaluate parameter sensitivity when model evaluation involves several metrics of performance. The method is superior to single-criterion approaches while avoiding the subjectivity observed in "pseudo" multiple-criteria methods. Further, it contributes to our understanding of structural characteristics of a model and simplifies parameter estimation by identifying insensitive parameters that can be fixed to default values during model calibration studies. We illustrate the approach by applying it to the problem of identifying the most influential parameters in the Simple Biosphere 3 (SiB3) model using a network of flux towers in Brazil. We find 27-31 (out of 42) parameters to be influential, most (˜78%) of which are primarily associated with physiology, soil, and carbon properties, and that uncertainties in the physiological properties of the model contribute most to total model uncertainty in regard to energy and carbon fluxes. We also find that the second most important model component contributing to the total output uncertainty varies according to the flux analyzed; whereas morphological properties play an important role in sensible heat flux, soil properties are important for latent heat flux, and carbon properties (mainly associated with the soil respiration submodel) are important for carbon flux (as expected). These distinct sensitivities emphasize the need to account for the multioutput nature of land surface models during sensitivity analysis and parameter estimation. Applied to other similar models, our approach can help to establish which soil-plant-atmosphere processes matter most in land surface models of Amazonia and thereby aid in the design of field campaigns to characterize and measure the associated parameters. The approach can also be used with other sensitivity analysis
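
    To make the variance-based idea concrete, the sketch below implements a pick-freeze (Saltelli-type) estimator of first-order Sobol' indices for a toy model with two output metrics and then flags parameters that are influential under either metric. This is a simplified stand-in for the paper's rank-based multiple-criteria scheme; the toy model, sample size, and 0.05 cut-off are assumptions.

```python
import numpy as np

def first_order_sobol(model, n_params, n_samples=4096, seed=0):
    """Pick-freeze estimate of first-order Sobol indices for one scalar output."""
    rng = np.random.default_rng(seed)
    A = rng.random((n_samples, n_params))
    B = rng.random((n_samples, n_params))
    fA, fB = model(A), model(B)
    total_var = np.var(np.concatenate([fA, fB]))
    S = np.empty(n_params)
    for i in range(n_params):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                      # replace column i of A with B's column i
        S[i] = np.mean(fB * (model(ABi) - fA)) / total_var
    return S

# Toy model with two output "criteria" (stand-ins for, e.g., energy and carbon flux metrics).
def energy_metric(x):  return 4 * x[:, 0] + 2 * x[:, 1] + 0.5 * x[:, 2] * x[:, 3]
def carbon_metric(x):  return x[:, 0] + 3 * x[:, 2] + x[:, 3] ** 2

S_energy = first_order_sobol(energy_metric, n_params=4)
S_carbon = first_order_sobol(carbon_metric, n_params=4)

# Multiple-criteria view: keep any parameter influential for at least one criterion.
print("S (energy):", S_energy.round(3))
print("S (carbon):", S_carbon.round(3))
print("influential under any criterion:", np.where((S_energy > 0.05) | (S_carbon > 0.05))[0])
```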

  7. Imaging flow cytometry for phytoplankton analysis.

    PubMed

    Dashkova, Veronika; Malashenkov, Dmitry; Poulton, Nicole; Vorobjev, Ivan; Barteneva, Natasha S

    2017-01-01

    This review highlights the concepts and instrumentation of imaging flow cytometry technology and in particular its use for phytoplankton analysis. Imaging flow cytometry, a hybrid technology combining the speed and statistical capabilities of flow cytometry with the imaging features of microscopy, is rapidly advancing as a cell imaging platform that overcomes many of the limitations of current techniques and has contributed significantly to the advancement of phytoplankton analysis in recent years. This review presents the various instrumentation relevant to the field and currently used for assessment of complex phytoplankton communities' composition and abundance, size structure determination, biovolume estimation, detection of harmful algal bloom species, evaluation of viability and metabolic activity, and other applications. We also present our data on the viability and metabolic assessment of Aphanizomenon sp. cyanobacteria using the ImageStream X Mark II imaging cytometer. Herein, we highlight the immense potential of imaging flow cytometry for microalgal research, but also discuss limitations and future developments.

  8. Digital Image Analysis for DETECHIP® Code Determination

    PubMed Central

    Lyon, Marcus; Wilson, Mark V.; Rouhier, Kerry A.; Symonsbergen, David J.; Bastola, Kiran; Thapa, Ishwor; Holmes, Andrea E.

    2013-01-01

    DETECHIP® is a molecular sensing array used for identification of a large variety of substances. Previous methodology for the analysis of DETECHIP® used human vision to distinguish color changes induced by the presence of the analyte of interest. This paper describes several analysis techniques using digital images of DETECHIP®. Both a digital camera and a flatbed desktop photo scanner were used to obtain JPEG images. Color information within these digital images was obtained through the measurement of red-green-blue (RGB) values using software such as GIMP, Photoshop and ImageJ. Several different techniques were used to evaluate these color changes. It was determined that the flatbed scanner produced the clearest and most reproducible images. Furthermore, codes obtained using a macro written for use within ImageJ showed improved consistency versus previous methods. PMID:25267940
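
    The RGB measurements described above can be reproduced in a few lines with generic imaging libraries. The sketch below uses Pillow and NumPy as a stand-in for GIMP/Photoshop/ImageJ; the file name and well coordinates are hypothetical.

```python
import numpy as np
from PIL import Image

def mean_rgb(image, box):
    """Average R, G, B values over a rectangular region (left, upper, right, lower)."""
    region = np.asarray(image.crop(box).convert("RGB"), dtype=float)
    return region.reshape(-1, 3).mean(axis=0)

# Hypothetical scanned array image and well coordinates.
scan = Image.open("detechip_scan.jpg")
wells = {"A1": (40, 40, 90, 90), "A2": (120, 40, 170, 90)}

readings = {well: mean_rgb(scan, box) for well, box in wells.items()}
# A simple "code" could then be derived by comparing each well's RGB values
# against a control scan and thresholding the differences.
for well, rgb in readings.items():
    print(well, rgb.round(1))
```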

  9. Materials characterization through quantitative digital image analysis

    SciTech Connect

    J. Philliber; B. Antoun; B. Somerday; N. Yang

    2000-07-01

    A digital image analysis system has been developed to allow advanced quantitative measurement of microstructural features. This capability is maintained as part of the microscopy facility at Sandia, Livermore. The system records images digitally, eliminating the use of film. Images obtained from other sources may also be imported into the system. Subsequent digital image processing enhances image appearance through contrast and brightness adjustments. The system measures a variety of user-defined microstructural features, including area fraction, particle size and spatial distributions, grain sizes and orientations of elongated particles. These measurements are made in a semi-automatic mode through the use of macro programs and a computer controlled translation stage. A routine has been developed to create large montages of 50+ separate images. Individual image frames are matched to the nearest pixel to create seamless montages. Results from three different studies are presented to illustrate the capabilities of the system.
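
    A minimal sketch of the kind of measurement listed above (area fraction and per-particle sizes) is shown below using SciPy's labelling routines on a synthetic micrograph; the threshold and the image are illustrative, not the facility's actual workflow.

```python
import numpy as np
from scipy import ndimage

def particle_statistics(gray, threshold):
    """Area fraction and per-particle areas from a grayscale micrograph."""
    mask = gray > threshold                      # simple global threshold (illustrative)
    labels, n = ndimage.label(mask)              # connected-component labelling
    areas = np.bincount(labels.ravel())[1:]      # pixel counts per particle (skip background)
    return mask.mean(), areas, n

# Synthetic "micrograph": a few bright blobs on a dark background.
rng = np.random.default_rng(2)
img = rng.normal(50, 5, (200, 200))
for cy, cx in [(50, 60), (120, 150), (160, 40)]:
    y, x = np.ogrid[:200, :200]
    img[(y - cy) ** 2 + (x - cx) ** 2 < 15 ** 2] += 100

frac, areas, n = particle_statistics(img, threshold=100)
print(f"area fraction: {frac:.3f}, particles: {n}, mean area: {areas.mean():.1f} px")
```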

  10. Theory of Image Analysis and Recognition.

    DTIC Science & Technology

    1983-01-24

    [Extraction-damaged record; only fragments are recoverable. The fragments list investigators and topics, including Narendra Ahuja and Ramalingam Chellappa (image models), Matti Pietikainen (texture analysis), David G. Morgenthaler (3D digital geometry), and Angela Y. Wu, together with technical report citations such as TR-965 (October 1980) on restoration parameter choice, Pietikainen's "On the Use of Hierarchically Computed 'Mexican Hat'...", and Pietikainen and Rosenfeld, "Image Segmentation by Texture Using Pyramid Node Linking," TR-1008, February 1981.]

  11. Analysis of dynamic brain imaging data.

    PubMed Central

    Mitra, P P; Pesaran, B

    1999-01-01

    Modern imaging techniques for probing brain function, including functional magnetic resonance imaging, intrinsic and extrinsic contrast optical imaging, and magnetoencephalography, generate large data sets with complex content. In this paper we develop appropriate techniques for analysis and visualization of such imaging data to separate the signal from the noise and characterize the signal. The techniques developed fall into the general category of multivariate time series analysis, and in particular we extensively use the multitaper framework of spectral analysis. We develop specific protocols for the analysis of fMRI, optical imaging, and MEG data, and illustrate the techniques by applications to real data sets generated by these imaging modalities. In general, the analysis protocols involve two distinct stages: "noise" characterization and suppression, and "signal" characterization and visualization. An important general conclusion of our study is the utility of a frequency-based representation, with short, moving analysis windows to account for nonstationarity in the data. Of particular note are 1) the development of a decomposition technique (space-frequency singular value decomposition) that is shown to be a useful means of characterizing the image data, and 2) the development of an algorithm, based on multitaper methods, for the removal of approximately periodic physiological artifacts arising from cardiac and respiratory sources. PMID:9929474
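
    As a small illustration of the multitaper framework mentioned above, the sketch below averages DPSS-tapered periodograms for a toy time series using SciPy's Slepian windows. It omits the space-frequency SVD and the periodic-artifact removal described in the paper; the NW value and the normalisation are assumptions.

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, fs, nw=4.0):
    """Average of tapered periodograms using 2*NW-1 Slepian (DPSS) tapers."""
    n = len(x)
    k = int(2 * nw) - 1                          # number of tapers with good leakage properties
    tapers = dpss(n, nw, Kmax=k)                 # shape (k, n)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    psd = spectra.mean(axis=0) / fs
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd

# Toy "imaging time series": a 10 Hz oscillation plus noise, sampled at 100 Hz.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(3).normal(size=t.size)

freqs, psd = multitaper_psd(x, fs)
print("peak frequency:", freqs[np.argmax(psd)])
```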

  12. Secure thin client architecture for DICOM image analysis

    NASA Astrophysics Data System (ADS)

    Mogatala, Harsha V. R.; Gallet, Jacqueline

    2005-04-01

    This paper presents a concept of Secure Thin Client (STC) Architecture for Digital Imaging and Communications in Medicine (DICOM) image analysis over the Internet. STC Architecture provides in-depth analysis and design of customized reports for DICOM images using drag-and-drop and data warehouse technology. Using a personal computer and a common set of browsing software, STC can be used for analyzing and reporting detailed patient information, type of examinations, date, Computed Tomography (CT) dose index, and other relevant information stored within the image header files as well as in the hospital databases. STC Architecture is a three-tier architecture. The First-Tier consists of a drag-and-drop web-based interface and a web server, which provides customized analysis and reporting ability to the users. The Second-Tier consists of an online analytical processing (OLAP) server and database system, which serves fast, real-time, aggregated multi-dimensional data using OLAP technology. The Third-Tier consists of a smart algorithm-based software program that extracts DICOM tags from CT images in this particular application, irrespective of CT vendor, and transfers these tags into a secure database system. This architecture provides Winnipeg Regional Health Authorities (WRHA) with quality indicators for CT examinations in the hospitals. It also provides health care professionals with an analytical tool to optimize radiation dose and image quality parameters. The information is provided to the user by way of a secure socket layer (SSL) and role-based security criteria over the Internet. Although this particular application has been developed for WRHA, this paper also discusses the effort to extend the Architecture to other hospitals in the region. Any DICOM tag from any imaging modality could be tracked with this software.
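
    Header-only DICOM tag extraction of the kind described above can be sketched with pydicom as below. The file path is hypothetical, and tags such as CTDIvol are optional and vendor dependent, so they may be absent from a given file; this is not the authors' software.

```python
import pydicom

def extract_tags(path):
    """Pull a few header tags of interest from a CT DICOM file (header only, no pixel data)."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return {
        "PatientID": ds.get("PatientID", "unknown"),
        "StudyDate": ds.get("StudyDate", "unknown"),
        "Modality": ds.get("Modality", "unknown"),
        "Manufacturer": ds.get("Manufacturer", "unknown"),
        # CTDIvol (0018,9345) is optional and vendor dependent; it may be absent.
        "CTDIvol": ds.get("CTDIvol", None),
    }

# Hypothetical file path; in practice these rows would be inserted into a secure database.
print(extract_tags("ct_slice_0001.dcm"))
```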

  13. Digital image processing in cephalometric analysis.

    PubMed

    Jäger, A; Döler, W; Schormann, T

    1989-01-01

    Digital image processing methods were applied to improve the practicability of cephalometric analysis. The individual X-ray film was digitized by the aid of a high resolution microscope-photometer. Digital processing was done using a VAX 8600 computer system. An improvement of the image quality was achieved by means of various digital enhancement and filtering techniques.

  14. Launch commit criteria performance trending analysis, phase 1, revision A. SRM and QA mission services

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An assessment is made of quantitative methods and measures for trending launch commit criteria (LCC) performance measurements. A statistical performance trending analysis pilot study was carried out and compared to STS-26 mission data. This study used four selected shuttle measurement types (solid rocket booster, external tank, space shuttle main engine, and range safety switch safe and arm device) from the five missions prior to mission 51-L. After obtaining raw data coordinates, each set of measurements was processed to obtain statistical confidence bounds and mean data profiles for each of the selected measurement types. STS-26 measurements were compared to the statistical database profiles to verify the statistical capability of assessing occurrences of data trend anomalies and abnormal time-varying operational conditions associated with data amplitude and phase shifts.

  15. [Studies of performance evaluation and criteria for trans-fatty acids analysis using GC-FID].

    PubMed

    Watanabe, Takahiro; Ishikawa, Tomoko; Matsuda, Rieko

    2013-01-01

    Performance evaluation methods and criteria for trans-fatty acids analysis using GC-FID were examined. The measurement method constructed in this study was based on the American Oil Chemists' Society (AOCS) official standard methods Ce1h-05. The method for fat extraction from general foods was based on the methods for nutrition labeling notified by the Ministry of Health, Labour and Welfare of Japan and AOAC 996.06. To estimate trueness and precision, fortified samples were analyzed following the established experimental design. Five molecular species of trans-fatty acids that are rarely contained in foods were used for preparing the fortified samples. To estimate precision, more than four degrees of freedom of variance are required. Based on the results, within-laboratory trueness and reproducibility will be set at 90-110% and 10% (RSD%), respectively.
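
    The trueness and precision figures referred to above are typically computed as mean spike recovery and relative standard deviation over replicate fortified samples. A minimal sketch follows; the replicate values and the spike level are illustrative only.

```python
import numpy as np

def trueness_and_rsd(measured, spiked):
    """Recovery (%) and relative standard deviation (%) for a fortified sample series."""
    measured = np.asarray(measured, dtype=float)
    recovery = 100.0 * measured.mean() / spiked       # trueness as mean recovery of the spike
    rsd = 100.0 * measured.std(ddof=1) / measured.mean()
    return recovery, rsd

# Illustrative replicate results (g/100 g) for one trans-fatty acid spiked at 0.50 g/100 g.
measured = [0.47, 0.49, 0.51, 0.48, 0.50]
rec, rsd = trueness_and_rsd(measured, spiked=0.50)
print(f"trueness: {rec:.1f}% (target 90-110%), RSD: {rsd:.1f}% (target <= 10%)")
```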

  16. An Imaging And Graphics Workstation For Image Sequence Analysis

    NASA Astrophysics Data System (ADS)

    Mostafavi, Hassan

    1990-01-01

    This paper describes an application-specific engineering workstation designed and developed to analyze imagery sequences from a variety of sources. The system combines the software and hardware environment of the modern graphic-oriented workstations with the digital image acquisition, processing and display techniques. The objective is to achieve automation and high throughput for many data reduction tasks involving metric studies of image sequences. The applications of such an automated data reduction tool include analysis of the trajectory and attitude of aircraft, missile, stores and other flying objects in various flight regimes including launch and separation as well as regular flight maneuvers. The workstation can also be used in an on-line or off-line mode to study three-dimensional motion of aircraft models in simulated flight conditions such as wind tunnels. The system's key features are: 1) Acquisition and storage of image sequences by digitizing real-time video or frames from a film strip; 2) computer-controlled movie loop playback, slow motion and freeze frame display combined with digital image sharpening, noise reduction, contrast enhancement and interactive image magnification; 3) multiple leading edge tracking in addition to object centroids at up to 60 fields per second from both live input video or a stored image sequence; 4) automatic and manual field-of-view and spatial calibration; 5) image sequence data base generation and management, including the measurement data products; 6) off-line analysis software for trajectory plotting and statistical analysis; 7) model-based estimation and tracking of object attitude angles; and 8) interface to a variety of video players and film transport sub-systems.

  17. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
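
    One part of the workflow above, propagating weight uncertainty through a weighted overlay by Monte Carlo simulation, can be sketched as below. The criterion rasters, base weights, and noise level are assumptions for illustration; the AHP/OWA comparison and the Dempster-Shafer validation are not reproduced.

```python
import numpy as np

def weight_sensitivity(criteria, base_weights, n_draws=500, noise=0.1, seed=4):
    """Monte Carlo spread of a weighted-overlay susceptibility score under weight uncertainty."""
    rng = np.random.default_rng(seed)
    stack = np.stack(criteria)                      # (n_criteria, rows, cols), pre-normalised to 0-1
    base = np.asarray(base_weights, dtype=float)
    scores = []
    for _ in range(n_draws):
        w = np.clip(base + rng.normal(0, noise, base.size), 1e-6, None)
        w = w / w.sum()
        scores.append(np.tensordot(w, stack, axes=1))
    scores = np.stack(scores)
    return scores.mean(axis=0), scores.std(axis=0)  # per-cell mean score and uncertainty

# Hypothetical normalised criterion rasters (e.g. slope, lithology score, rainfall index).
rng = np.random.default_rng(5)
criteria = [rng.random((50, 50)) for _ in range(3)]
mean_map, uncertainty_map = weight_sensitivity(criteria, base_weights=[0.5, 0.3, 0.2])
print("mean susceptibility range:", mean_map.min().round(3), "-", mean_map.max().round(3))
print("highest per-cell std:", uncertainty_map.max().round(3))
```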

  18. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  19. Harnessing Ecosystem Models and Multi-Criteria Decision Analysis for the Support of Forest Management

    NASA Astrophysics Data System (ADS)

    Wolfslehner, Bernhard; Seidl, Rupert

    2010-12-01

    The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.

  20. Machine learning applications in cell image analysis.

    PubMed

    Kan, Andrey

    2017-04-04

    Machine learning (ML) refers to a set of automatic pattern recognition methods that have been successfully applied across various problem domains, including biomedical image analysis. This review focuses on ML applications for image analysis in light microscopy experiments with typical tasks of segmenting and tracking individual cells, and modelling of reconstructed lineage trees. After describing a typical image analysis pipeline and highlighting challenges of automatic analysis (for example, variability in cell morphology, tracking in the presence of clutter), this review gives a brief historical outlook of ML, followed by basic concepts and definitions required for understanding examples. This article then presents several example applications at various image processing stages, including the use of supervised learning methods for improving cell segmentation, and the application of active learning for tracking. The review concludes with remarks on parameter setting and future directions. Immunology and Cell Biology advance online publication, 4 April 2017; doi:10.1038/icb.2017.16.

  1. A Robust Actin Filaments Image Analysis Framework

    PubMed Central

    Alioscha-Perez, Mitchel; Benadiba, Carine; Goossens, Katty; Kasas, Sandor; Dietler, Giovanni; Willaert, Ronnie; Sahli, Hichem

    2016-01-01

    The cytoskeleton is a highly dynamical protein network that plays a central role in numerous cellular physiological processes, and is traditionally divided into three components according to its chemical composition, i.e. actin, tubulin and intermediate filament cytoskeletons. Understanding the cytoskeleton dynamics is of prime importance to unveil mechanisms involved in cell adaptation to any stress type. Fluorescence imaging of cytoskeleton structures allows analyzing the impact of mechanical stimulation in the cytoskeleton, but it also imposes additional challenges in the image processing stage, such as the presence of imaging-related artifacts and heavy blurring introduced by (high-throughput) automated scans. However, although there exists a considerable number of image-based analytical tools to address the image processing and analysis, most of them are unfit to cope with the aforementioned challenges. Filamentous structures in images can be considered as a piecewise composition of quasi-straight segments (at least in some finer or coarser scale). Based on this observation, we propose a three-step actin filament extraction methodology: (i) first the input image is decomposed into a ‘cartoon’ part corresponding to the filament structures in the image, and a noise/texture part, (ii) on the ‘cartoon’ image, we apply a multi-scale line detector coupled with a (iii) quasi-straight filament merging algorithm for fiber extraction. The proposed robust actin filament image analysis framework allows extracting individual filaments in the presence of noise, artifacts and heavy blurring. Moreover, it provides numerous parameters such as filament orientation, position and length, useful for further analysis. Cell image decomposition is relatively under-exploited in biological image processing, and our study shows the benefits it provides when addressing such tasks. Experimental validation was conducted using publicly available datasets, and in osteoblasts

  2. A Robust Actin Filaments Image Analysis Framework.

    PubMed

    Alioscha-Perez, Mitchel; Benadiba, Carine; Goossens, Katty; Kasas, Sandor; Dietler, Giovanni; Willaert, Ronnie; Sahli, Hichem

    2016-08-01

    The cytoskeleton is a highly dynamical protein network that plays a central role in numerous cellular physiological processes, and is traditionally divided into three components according to its chemical composition, i.e. actin, tubulin and intermediate filament cytoskeletons. Understanding the cytoskeleton dynamics is of prime importance to unveil mechanisms involved in cell adaptation to any stress type. Fluorescence imaging of cytoskeleton structures allows analyzing the impact of mechanical stimulation in the cytoskeleton, but it also imposes additional challenges in the image processing stage, such as the presence of imaging-related artifacts and heavy blurring introduced by (high-throughput) automated scans. However, although there exists a considerable number of image-based analytical tools to address the image processing and analysis, most of them are unfit to cope with the aforementioned challenges. Filamentous structures in images can be considered as a piecewise composition of quasi-straight segments (at least in some finer or coarser scale). Based on this observation, we propose a three-step actin filament extraction methodology: (i) first the input image is decomposed into a 'cartoon' part corresponding to the filament structures in the image, and a noise/texture part, (ii) on the 'cartoon' image, we apply a multi-scale line detector coupled with a (iii) quasi-straight filament merging algorithm for fiber extraction. The proposed robust actin filament image analysis framework allows extracting individual filaments in the presence of noise, artifacts and heavy blurring. Moreover, it provides numerous parameters such as filament orientation, position and length, useful for further analysis. Cell image decomposition is relatively under-exploited in biological image processing, and our study shows the benefits it provides when addressing such tasks. Experimental validation was conducted using publicly available datasets, and in osteoblasts grown in

  3. A Latent Class Analysis of DSM-IV Alcohol Use Disorder Criteria and Binge Drinking in Undergraduates

    PubMed Central

    Beseler, Cheryl L.; Taylor, Laura A.; Kraemer, Deborah Tebes; Leeman, Robert F.

    2011-01-01

    Background Adolescent and adult samples have shown that DSM-IV abuse and dependence criteria lie on a continuum of alcohol problem severity, but information on criteria functioning in college students is lacking. Prior factor analyses in a college sample (Beseler et al., 2010) indicated that a two-factor solution fit the data better than a single-factor solution after a binge drinking criterion was included. The second dimension may indicate a clustering of criteria related to excessive alcohol use in this college sample. Methods The present study was an analysis of data from an anonymous, online survey of undergraduates (N = 361) that included items pertaining to the DSM-IV alcohol use disorder (AUD) diagnostic criteria and binge drinking. Latent class analysis (LCA) was used to determine whether the criteria best fit a categorical model, with and without a binge drinking criterion. Results In a LCA including the AUD criteria only, a 3-class solution was the best fit. Binge drinking worsened the fit of the models. The largest class (class 1, n = 217) primarily endorsed tolerance (18.4%); none were alcohol dependent. The middle class (class 2, n = 114) endorsed primarily tolerance (81.6%) and drinking more than intended (74.6%); 34.2% met criteria for dependence. The smallest class (class 3, n = 30) endorsed all criteria with high probabilities (30% to 100%); all met criteria for dependence. Alcohol consumption patterns did not differ significantly between classes 2 and 3. Class 3 was characterized by higher levels on several variables thought to predict risk of alcohol-related problems (e.g., enhancement motives for drinking, impulsivity and aggression). Conclusions Two classes of heavy drinking college students were identified, one of which appeared to be at higher risk than the other. The highest risk group may be less likely to “mature out” of high-risk drinking after college. PMID:22004067

  4. A latent class analysis of DSM-IV alcohol use disorder criteria and binge drinking in undergraduates.

    PubMed

    Beseler, Cheryl L; Taylor, Laura A; Kraemer, Deborah T; Leeman, Robert F

    2012-01-01

    Adolescent and adult samples have shown that the Diagnostic and Statistical Manual of Mental Disorders-IV (DSM-IV) abuse and dependence criteria lie on a continuum of alcohol problem severity, but information on criteria functioning in college students is lacking. Prior factor analyses in a college sample (Beseler et al., 2010) indicated that a 2-factor solution fit the data better than a single-factor solution after a binge drinking criterion was included. The second dimension may indicate a clustering of criteria related to excessive alcohol use in this college sample. The present study was an analysis of data from an anonymous, online survey of undergraduates (N = 361) that included items pertaining to the DSM-IV alcohol use disorder (AUD) diagnostic criteria and binge drinking. Latent class analysis (LCA) was used to determine whether the criteria best fit a categorical model, with and without a binge drinking criterion. In an LCA including the AUD criteria only, a 3-class solution was the best fit. Binge drinking worsened the fit of the models. The largest class (class 1, n = 217) primarily endorsed tolerance (18.4%); none were alcohol dependent. The middle class (class 2, n = 114) endorsed primarily tolerance (81.6%) and drinking more than intended (74.6%); 34.2% met criteria for dependence. The smallest class (class 3, n = 30) endorsed all criteria with high probabilities (30 to 100%); all met criteria for dependence. Alcohol consumption patterns did not differ significantly between classes 2 and 3. Class 3 was characterized by higher levels on several variables thought to predict risk of alcohol-related problems (e.g., enhancement motives for drinking, impulsivity, and aggression). Two classes of heavy-drinking college students were identified, one of which appeared to be at higher risk than the other. The highest risk group may be less likely to "mature out" of high-risk drinking after college. Copyright © 2011 by the Research Society on Alcoholism.

  5. On image analysis in fractography (Methodological Notes)

    NASA Astrophysics Data System (ADS)

    Shtremel', M. A.

    2015-10-01

    Like other spheres of image analysis, fractography has no universal method for information convolution. An effective characteristic of an image is found by analyzing the essence and origin of every class of objects. As follows from the geometric definition of a fractal curve, its projection onto any straight line covers a certain segment many times; therefore, neither a time series (one-valued function of time) nor an image (one-valued function of the plane) can be a fractal. For applications, multidimensional multiscale characteristics of an image are necessary. "Full" wavelet series break the law of conservation of information.

  6. Retinal image analysis: concepts, applications and potential.

    PubMed

    Patton, Niall; Aslam, Tariq M; MacGillivray, Thomas; Deary, Ian J; Dhillon, Baljean; Eikelboom, Robert H; Yogesan, Kanagasingam; Constable, Ian J

    2006-01-01

    As digital imaging and computing power increasingly develop, so too does the potential to use these technologies in ophthalmology. Image processing, analysis and computer vision techniques are increasing in prominence in all fields of medical science, and are especially pertinent to modern ophthalmology, as it is heavily dependent on visually oriented signs. The retinal microvasculature is unique in that it is the only part of the human circulation that can be directly visualised non-invasively in vivo, readily photographed and subject to digital image analysis. Exciting developments in image processing relevant to ophthalmology over the past 15 years include the progress being made towards developing automated diagnostic systems for conditions such as diabetic retinopathy, age-related macular degeneration and retinopathy of prematurity. These diagnostic systems offer the potential to be used in large-scale screening programs, with the potential for significant resource savings, as well as being free from observer bias and fatigue. In addition, quantitative measurements of retinal vascular topography using digital image analysis from retinal photography have been used as research tools to better understand the relationship between the retinal microvasculature and cardiovascular disease. Furthermore, advances in electronic media transmission increase the relevance of using image processing in 'teleophthalmology' as an aid in clinical decision-making, with particular relevance to large rural-based communities. In this review, we outline the principles upon which retinal digital image analysis is based. We discuss current techniques used to automatically detect landmark features of the fundus, such as the optic disc, fovea and blood vessels. We review the use of image analysis in the automated diagnosis of pathology (with particular reference to diabetic retinopathy). We also review its role in defining and performing quantitative measurements of vascular topography

  7. Multiresolution morphological analysis of document images

    NASA Astrophysics Data System (ADS)

    Bloomberg, Dan S.

    1992-11-01

    An image-based approach to document image analysis is presented, that uses shape and textural properties interchangeably at multiple scales. Image-based techniques permit a relatively small number of simple and fast operations to be used for a wide variety of analysis problems with document images. The primary binary image operations are morphological and multiresolution. The generalized opening, a morphological operation, allows extraction of image features that have both shape and textural properties, and that are not limited by properties related to image connectivity. Reduction operations are necessary due to the large number of pixels at scanning resolution, and threshold reduction is used for efficient and controllable shape and texture transformations between resolution levels. Aspects of these techniques, which include sequences of threshold reductions, are illustrated by problems such as text/halftone segmentation and word-level extraction. Both the generalized opening and these multiresolution operations are then used to identify italic and bold words in text. These operations are performed without any attempt at identification of individual characters. Their robustness derives from the aggregation of statistical properties over entire words. However, the analysis of the statistical properties is performed implicitly, in large part through nonlinear image processing operations. The approximate computational cost of the basic operations is given, and the importance of operating at the lowest feasible resolution is demonstrated.

  8. Estimation of failure criteria in multivariate sensory shelf life testing using survival analysis.

    PubMed

    Giménez, Ana; Gagliardi, Andrés; Ares, Gastón

    2017-09-01

    For most food products, shelf life is determined by changes in their sensory characteristics. A predetermined increase or decrease in the intensity of a sensory characteristic has frequently been used to signal that a product has reached the end of its shelf life. Considering all attributes change simultaneously, the concept of multivariate shelf life allows a single measurement of deterioration that takes into account all these sensory changes at a certain storage time. The aim of the present work was to apply survival analysis to estimate failure criteria in multivariate sensory shelf life testing using two case studies, hamburger buns and orange juice, by modelling the relationship between consumers' rejection of the product and the deterioration index estimated using PCA. In both studies, a panel of 13 trained assessors evaluated the samples using descriptive analysis whereas a panel of 100 consumers answered a "yes" or "no" question regarding intention to buy or consume the product. PC1 explained the great majority of the variance, indicating all sensory characteristics evolved similarly with storage time. Thus, PC1 could be regarded as index of sensory deterioration and a single failure criterion could be estimated through survival analysis for 25 and 50% consumers' rejection. The proposed approach based on multivariate shelf life testing may increase the accuracy of shelf life estimations. Copyright © 2017 Elsevier Ltd. All rights reserved.
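
    A simplified sketch of the deterioration-index idea is given below: PC1 scores are computed from standardised descriptive data, and the index values at 25% and 50% consumer rejection are located by interpolation. The data are invented, and the interpolation step is a rough stand-in for the survival-analysis estimation actually used in the paper.

```python
import numpy as np

# Illustrative descriptive-analysis data: rows = storage times (days), columns = sensory attributes.
times = np.array([0, 7, 14, 21, 28])
attributes = np.array([[1.0, 0.5, 8.5],
                       [2.1, 1.2, 7.8],
                       [3.4, 2.6, 6.9],
                       [5.0, 4.1, 5.5],
                       [6.8, 5.9, 4.0]])      # e.g. off-flavour, staleness, freshness

# PC1 score as a single deterioration index (standardise, then SVD).
Z = (attributes - attributes.mean(axis=0)) / attributes.std(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
deterioration = Z @ Vt[0]                      # PC1 scores, one per storage time
deterioration *= np.sign(deterioration[-1])    # orient so deterioration increases with time

# Illustrative consumer rejection proportions at the same storage times.
rejection = np.array([0.02, 0.08, 0.22, 0.45, 0.70])

# Failure criteria: deterioration index at 25% and 50% rejection (linear interpolation;
# the paper estimates these with survival analysis, which handles censoring properly).
for target in (0.25, 0.50):
    print(f"{int(target * 100)}% rejection at deterioration index "
          f"{np.interp(target, rejection, deterioration):.2f}")
```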

  9. Use of power analysis to develop detectable significance criteria for sea urchin toxicity tests

    USGS Publications Warehouse

    Carr, R.S.; Biedenbach, J.M.

    1999-01-01

    When sufficient data are available, the statistical power of a test can be determined using power analysis procedures. The term “detectable significance” has been coined to refer to this criterion based on power analysis and past performance of a test. This power analysis procedure has been performed with sea urchin (Arbacia punctulata) fertilization and embryological development data from sediment porewater toxicity tests. Data from 3100 and 2295 tests for the fertilization and embryological development tests, respectively, were used to calculate the criteria and regression equations describing the power curves. Using Dunnett's test, a minimum significant difference (MSD) (β = 0.05) of 15.5% and 19% for the fertilization test, and 16.4% and 20.6% for the embryological development test, for α ≤ 0.05 and α ≤ 0.01, respectively, were determined. The use of this second criterion reduces type I (false positive) errors and helps to establish a critical level of difference based on the past performance of the test.
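
    The detectable-difference idea can be sketched as below using a Bonferroni-adjusted t critical value as a conservative stand-in for Dunnett's tables (which the study actually used); the MSE, replicate number, and treatment count are illustrative, so the output is not comparable to the published MSD values.

```python
import numpy as np
from scipy import stats

def minimum_significant_difference(mse, n_per_group, n_treatments, df_error, alpha=0.05):
    """Approximate MSD for treatment-vs-control comparisons.

    Uses a Bonferroni-adjusted two-sided t critical value as a conservative
    stand-in for Dunnett's tables.
    """
    t_crit = stats.t.ppf(1 - alpha / (2 * n_treatments), df_error)
    return t_crit * np.sqrt(2 * mse / n_per_group)

# Illustrative values for a fertilization test with percentage data:
# 5 replicates per treatment, 10 treatments plus a control.
mse, n, k = 85.0, 5, 10
df_error = (k + 1) * (n - 1)
msd = minimum_significant_difference(mse, n, k, df_error)
print(f"detectable difference (alpha=0.05): about {msd:.1f} percentage points")
```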

  10. Scenario and multiple criteria decision analysis for energy and environmental security of military and industrial installations.

    PubMed

    Karvetski, Christopher W; Lambert, James H; Linkov, Igor

    2011-04-01

    Military and industrial facilities need secure and reliable power generation. Grid outages can result in cascading infrastructure failures as well as security breaches and should be avoided. Adding redundancy and increasing reliability can require additional environmental, financial, logistical, and other considerations and resources. Uncertain scenarios consisting of emergent environmental conditions, regulatory changes, growth of regional energy demands, and other concerns result in further complications. Decisions on selecting energy alternatives are made on an ad hoc basis. The present work integrates scenario analysis and multiple criteria decision analysis (MCDA) to identify combinations of impactful emergent conditions and to perform a preliminary benefits analysis of energy and environmental security investments for industrial and military installations. Application of a traditional MCDA approach would require significant stakeholder elicitations under multiple uncertain scenarios. The approach proposed in this study develops and iteratively adjusts a scoring function for investment alternatives to find the scenarios with the most significant impacts on installation security. A robust prioritization of investment alternatives can be achieved by integrating stakeholder preferences and focusing modeling and decision-analytical tools on a few key emergent conditions and scenarios. The approach is described and demonstrated for a campus of several dozen interconnected industrial buildings within a major installation. Copyright © 2010 SETAC.

  11. Malware analysis using visualized image matrices.

    PubMed

    Han, KyoungSoo; Kang, BooJoong; Im, Eul Gyu

    2014-01-01

    This paper proposes a novel malware visual analysis method that contains not only a visualization method to convert binary files into images, but also a similarity calculation method between these images. The proposed method generates RGB-colored pixels on image matrices using the opcode sequences extracted from malware samples and calculates the similarities between the image matrices. In particular, the proposed methods are applicable to packed malware samples by applying them to the execution traces extracted through dynamic analysis. When the images are generated, we can reduce the overheads by extracting the opcode sequences only from the blocks that include the instructions related to staple behaviors such as function and application programming interface (API) calls. In addition, we propose a technique that generates a representative image for each malware family in order to reduce the number of comparisons needed to classify unknown samples; the colored pixel information in the image matrices is used to calculate the similarities between the images. Our experimental results show that the image matrices of malware can effectively be used to classify malware families both statically and dynamically, with accuracies of 0.9896 and 0.9732, respectively.
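
    To make the opcode-to-image idea concrete, here is a minimal Python sketch that hashes consecutive opcode pairs to pixel coordinates and RGB colours and compares two image matrices with a simple normalized distance. The hashing scheme, the 32x32 matrix size, and the similarity measure are illustrative assumptions, not the authors' exact method.

```python
# Minimal sketch: map an opcode sequence to RGB pixels on a fixed-size image
# matrix and compute a crude similarity between two such matrices.
import hashlib
import numpy as np

def opcodes_to_image(opcodes, size=32):
    img = np.zeros((size, size, 3), dtype=np.uint8)
    for i in range(len(opcodes) - 1):
        # Hash each opcode pair to a pixel coordinate and an RGB colour.
        digest = hashlib.md5((opcodes[i] + opcodes[i + 1]).encode()).digest()
        row, col = digest[0] % size, digest[1] % size
        img[row, col] = list(digest[2:5])        # colour assigned to this pair
    return img

def image_similarity(a, b):
    # Normalized similarity between two image matrices (illustrative choice).
    diff = np.abs(a.astype(int) - b.astype(int)).sum()
    return 1.0 - diff / (255 * a.size)

sample1 = ["push", "mov", "call", "add", "jmp", "mov", "call"]
sample2 = ["push", "mov", "call", "sub", "jmp", "mov", "call"]
print(image_similarity(opcodes_to_image(sample1), opcodes_to_image(sample2)))
```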

  12. Edge enhanced morphology for infrared image analysis

    NASA Astrophysics Data System (ADS)

    Bai, Xiangzhi; Liu, Haonan

    2017-01-01

    Edge information is critical for infrared images. Morphological operators have been widely used for infrared image analysis. However, the edge information in infrared images is weak, and conventional morphological operators do not make good use of it. To strengthen the edge information in morphological operators, edge enhanced morphology is proposed in this paper. Firstly, the edge enhanced dilation and erosion operators are given and analyzed. Secondly, the pseudo operators derived from the edge enhanced dilation and erosion operators are defined. Finally, applications to infrared image analysis are shown to verify the effectiveness of the proposed edge enhanced morphological operators. The proposed operators are useful for applications that rely on edge features and could be extended to a wide range of applications.
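
    The abstract does not spell out the operator definitions, so the following Python sketch shows one plausible form of an "edge enhanced" dilation: boost the image with a weighted morphological gradient before dilating. The weighting scheme and parameters are assumptions for illustration only and may differ from the paper's operators.

```python
# Minimal sketch of a hypothetical edge-enhanced dilation: add a weighted
# morphological gradient (edge strength) to the image before dilating.
import numpy as np
from scipy import ndimage

def edge_enhanced_dilation(img, size=3, weight=0.5):
    dil = ndimage.grey_dilation(img, size=(size, size))
    ero = ndimage.grey_erosion(img, size=(size, size))
    edges = dil - ero                    # morphological gradient as edge map
    enhanced = img + weight * edges      # strengthen edges before the operator
    return ndimage.grey_dilation(enhanced, size=(size, size))

ir = np.random.rand(64, 64).astype(np.float32)   # stand-in infrared frame
out = edge_enhanced_dilation(ir)
print(out.shape, float(out.max()))
```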

  13. Image Analysis of the Tumor Microenvironment.

    PubMed

    Lloyd, Mark C; Johnson, Joseph O; Kasprzak, Agnieszka; Bui, Marilyn M

    2016-01-01

    In the field of pathology it is clear that molecular genomics and digital imaging represent two promising future directions, and both are as relevant to the tumor microenvironment as they are to the tumor itself (Beck AH et al. Sci Transl Med 3(108):108ra113, 2011). Digital imaging, or whole slide imaging (WSI), of glass histology slides facilitates a number of value-added competencies that were not previously possible with the traditional analog review of these slides under a microscope by a pathologist. As an important tool for investigational research, digital pathology can leverage the quantification and reproducibility offered by image analysis to add value to the pathology field. This chapter focuses on the application of image analysis to investigate the tumor microenvironment and on how quantitative investigation can provide deeper insight into our understanding of the tumor to tumor-microenvironment relationship.

  14. ASCI 2010 appropriateness criteria for cardiac computed tomography: a report of the Asian Society of Cardiovascular Imaging Cardiac Computed Tomography and Cardiac Magnetic Resonance Imaging Guideline Working Group.

    PubMed

    Tsai, I-Chen; Choi, Byoung Wook; Chan, Carmen; Jinzaki, Masahiro; Kitagawa, Kakuya; Yong, Hwan Seok; Yu, Wei

    2010-02-01

    In Asia, the healthcare systems, populations and patterns of disease differ from those of Western countries. The current reports on criteria for cardiac CT, provided by Western professional societies, are not appropriate for Asia. The Asian Society of Cardiovascular Imaging, the only society dedicated to cardiovascular imaging in Asia, formed a Working Group and invited 23 Technical Panel members representing a variety of Asian countries to rate 51 indications for cardiac CT in clinical practice in Asia. The indications were rated as 'appropriate' (7-9), 'uncertain' (4-6), or 'inappropriate' (1-3) on a scale of 1-9. The median score was used for the final result if there was no disagreement. The final ratings were 33 appropriate, 14 uncertain and 4 inappropriate indications, of which 20 showed high agreement (19 appropriate and 1 inappropriate). Specifically, the Asian representatives considered cardiac CT an appropriate modality for Kawasaki disease and congenital heart disease, both in follow-up and in symptomatic patients. In addition, except for some specified conditions, cardiac CT was considered an appropriate modality for one-stop-shop ischemic heart disease evaluation because of its general appropriateness for coronary, structural and functional evaluation. This report is expected to have a significant impact on clinical practice, research and reimbursement policy in Asia.

  15. Appropriate use criteria for amyloid PET: a report of the Amyloid Imaging Task Force, the Society of Nuclear Medicine and Molecular Imaging, and the Alzheimer's Association.

    PubMed

    Johnson, Keith A; Minoshima, Satoshi; Bohnen, Nicolaas I; Donohoe, Kevin J; Foster, Norman L; Herscovitch, Peter; Karlawish, Jason H; Rowe, Christopher C; Carrillo, Maria C; Hartley, Dean M; Hedrick, Saima; Pappas, Virginia; Thies, William H

    2013-01-01

    Positron emission tomography (PET) of brain amyloid β is a technology that is becoming more available, but its clinical utility in medical practice requires careful definition. To provide guidance to dementia care practitioners, patients, and caregivers, the Alzheimer's Association and the Society of Nuclear Medicine and Molecular Imaging convened the Amyloid Imaging Taskforce (AIT). The AIT considered a broad range of specific clinical scenarios in which amyloid PET could potentially be used appropriately. Peer-reviewed, published literature was searched to ascertain available evidence relevant to these scenarios, and the AIT developed a consensus of expert opinion. Although empirical evidence of impact on clinical outcomes is not yet available, a set of specific appropriate use criteria (AUC) were agreed on that define the types of patients and clinical circumstances in which amyloid PET could be used. Both appropriate and inappropriate uses were considered and formulated, and are reported and discussed here. Because both dementia care and amyloid PET technology are in active development, these AUC will require periodic reassessment. Future research directions are also outlined, including diagnostic utility and patient-centered outcomes.

  16. Appropriate use criteria for amyloid PET: a report of the Amyloid Imaging Task Force, the Society of Nuclear Medicine and Molecular Imaging, and the Alzheimer's Association.

    PubMed

    Johnson, Keith A; Minoshima, Satoshi; Bohnen, Nicolaas I; Donohoe, Kevin J; Foster, Norman L; Herscovitch, Peter; Karlawish, Jason H; Rowe, Christopher C; Carrillo, Maria C; Hartley, Dean M; Hedrick, Saima; Pappas, Virginia; Thies, William H

    2013-03-01

    Positron emission tomography (PET) of brain amyloid β is a technology that is becoming more available, but its clinical utility in medical practice requires careful definition. To provide guidance to dementia care practitioners, patients, and caregivers, the Alzheimer's Association and the Society of Nuclear Medicine and Molecular Imaging convened the Amyloid Imaging Taskforce (AIT). The AIT considered a broad range of specific clinical scenarios in which amyloid PET could potentially be used appropriately. Peer-reviewed, published literature was searched to ascertain available evidence relevant to these scenarios, and the AIT developed a consensus of expert opinion. Although empirical evidence of impact on clinical outcomes is not yet available, a set of specific appropriate use criteria (AUC) were agreed on that define the types of patients and clinical circumstances in which amyloid PET could be used. Both appropriate and inappropriate uses were considered and formulated, and are reported and discussed here. Because both dementia care and amyloid PET technology are in active development, these AUC will require periodic reassessment. Future research directions are also outlined, including diagnostic utility and patient-centered outcomes.

  17. Topological image texture analysis for quality assessment

    NASA Astrophysics Data System (ADS)

    Asaad, Aras T.; Rashid, Rasber Dh.; Jassim, Sabah A.

    2017-05-01

    Image quality is a major factor influencing pattern recognition accuracy and helps detect image tampering in forensics. We are concerned with investigating topological image texture analysis techniques to assess different types of degradation. We use the Local Binary Pattern (LBP) as a texture feature descriptor. For any image, we construct simplicial complexes for selected groups of uniform LBP bins and calculate persistent homology invariants (e.g. the number of connected components). We investigated the image quality discriminating characteristics of these simplicial complexes by computing these models for a large dataset of face images affected by the presence of shadows as a result of variation in illumination conditions. Our tests demonstrate that, for specific uniform LBP patterns, the number of connected components not only distinguishes between different levels of shadow effects but also helps detect the affected regions.
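
    A hedged Python sketch of the first steps described above: compute uniform LBP codes, select a group of bins, and count connected components of the resulting mask (the 0-dimensional topological invariant). The bin group, LBP parameters and the synthetic image are assumptions for illustration; the paper's full simplicial-complex construction is not reproduced.

```python
# Minimal sketch: uniform LBP codes -> binary mask for selected bins ->
# number of connected components as a crude topological descriptor.
import numpy as np
from skimage.feature import local_binary_pattern
from scipy import ndimage

def connected_components_for_bins(gray, bins, P=8, R=1.0):
    codes = local_binary_pattern(gray, P, R, method="uniform")
    mask = np.isin(codes, bins)            # pixels whose LBP code is in the group
    _, n_components = ndimage.label(mask)  # count connected components
    return n_components

gray = (np.random.rand(128, 128) * 255).astype(np.uint8)  # stand-in face image
print(connected_components_for_bins(gray, bins=[0, 1, 2]))
```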

  18. Image texture analysis of crushed wheat kernels

    NASA Astrophysics Data System (ADS)

    Zayas, Inna Y.; Martin, C. R.; Steele, James L.; Dempster, Richard E.

    1992-03-01

    The development of new approaches for wheat hardness assessment may impact the grain industry in marketing, milling, and breeding. This study used image texture features for wheat hardness evaluation. Application of digital imaging to grain for grading purposes is principally based on morphometrical (shape and size) characteristics of the kernels. A composite sample of 320 kernels from 17 wheat varieties was collected after testing and crushing with a single-kernel hardness characterization meter. Six wheat classes were represented: HRW, HRS, SRW, SWW, Durum, and Club. In this study, parameters which characterize the texture, or spatial distribution of gray levels, of an image were determined and used to classify images of crushed wheat kernels. The texture parameters of crushed wheat kernel images differed depending on the class, hardness and variety of the wheat. Image texture analysis of crushed wheat kernels showed promise for use in class, hardness, milling quality, and variety discrimination.
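
    The abstract does not name the specific texture parameters, so the sketch below simply shows one standard way to compute gray-level texture features (GLCM contrast, homogeneity, energy, correlation) with scikit-image; the synthetic image and feature choice are assumptions, not the study's actual descriptors.

```python
# Minimal sketch: gray-level co-occurrence matrix (GLCM) texture features
# of the kind often used to characterize spatial gray-level distribution.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

kernel_img = (np.random.rand(64, 64) * 255).astype(np.uint8)   # stand-in image
glcm = graycomatrix(kernel_img, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, float(graycoprops(glcm, prop)[0, 0]))
```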

  19. Multi-level multi-criteria analysis of alternative fuels for waste collection vehicles in the United States.

    PubMed

    Maimoun, Mousa; Madani, Kaveh; Reinhart, Debra

    2016-04-15

    Historically, the U.S. waste collection fleet was dominated by diesel-fueled waste collection vehicles (WCVs); the growing need for sustainable waste collection has urged decision makers to incorporate economically efficient alternative fuels while mitigating environmental impacts. The pros and cons of alternative fuels complicate the decision-making process, calling for a comprehensive study that assesses the multiple factors involved. Multi-criteria decision analysis (MCDA) methods allow decision makers to select the best alternatives with respect to selection criteria. In this study, two MCDA methods, Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) and Simple Additive Weighting (SAW), were used to rank fuel alternatives for the U.S. waste collection industry with respect to a multi-level environmental and financial decision matrix. The environmental criteria consisted of life-cycle emissions, tail-pipe emissions, water footprint (WFP), and power density, while the financial criteria comprised vehicle cost, fuel price, fuel price stability, and fueling station availability. The overall analysis showed that conventional diesel is still the best option, followed by hydraulic-hybrid WCVs, landfill gas (LFG) sourced natural gas, fossil natural gas, and biodiesel. The elimination of the WFP and power density criteria from the environmental criteria ranked biodiesel 100 (BD100) as an environmentally better alternative compared to other fossil fuels (diesel and natural gas). This result showed that considering the WFP and power density as environmental criteria can make a difference in the decision process. The elimination of the fueling station and fuel price stability criteria from the decision matrix ranked fossil natural gas second after LFG-sourced natural gas. This scenario was found to represent the status quo of the waste collection industry. A sensitivity analysis for the status quo scenario showed the overall ranking of diesel and
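
    To illustrate the TOPSIS mechanics named above, here is a minimal Python sketch that ranks a few fuel alternatives against a small decision matrix. The alternatives, criteria values and weights are invented for illustration; only the normalization, ideal-solution and closeness-coefficient steps follow the standard TOPSIS method.

```python
# Minimal TOPSIS sketch for a fuel-ranking decision matrix (illustrative data).
import numpy as np

alternatives = ["diesel", "natural gas", "biodiesel"]
# Columns: life-cycle emissions, fuel price, station availability.
X = np.array([
    [0.70, 0.80, 1.00],
    [0.50, 0.60, 0.40],
    [0.40, 0.30, 0.30],
])
weights = np.array([0.4, 0.3, 0.3])
benefit = np.array([False, False, True])   # which criteria are "more is better"

# Vector-normalize, weight, and locate ideal / anti-ideal solutions.
V = weights * X / np.linalg.norm(X, axis=0)
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)

for name, c in sorted(zip(alternatives, closeness), key=lambda t: -t[1]):
    print(f"{name}: {c:.3f}")
```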

  20. Three-dimensional freehand ultrasound: image reconstruction and volume analysis.

    PubMed

    Barry, C D; Allott, C P; John, N W; Mellor, P M; Arundel, P A; Thomson, D S; Waterton, J C

    1997-01-01

    A system is described that rapidly produces a regular 3-dimensional (3-D) data block suitable for processing by conventional image analysis and volume measurement software. The system uses electromagnetic spatial location of 2-dimensional (2-D) freehand-scanned ultrasound B-mode images, custom-built signal-conditioning hardware, UNIX-based computer processing and an efficient 3-D reconstruction algorithm. Utilisation of images from multiple angles of insonation, "compounding," reduces speckle contrast, improves structure coherence within the reconstructed grey-scale image and enhances the ability to detect structure boundaries and to segment and quantify features. Volume measurements using a series of water-filled latex and cylindrical foam rubber phantoms with volumes down to 0.7 mL show that a high degree of accuracy, precision and reproducibility can be obtained. Extension of the technique to handle in vivo data sets by allowing physiological criteria to be taken into account in selecting the images used for construction is also illustrated.

  1. Single-image molecular analysis for accelerated fluorescence imaging

    NASA Astrophysics Data System (ADS)

    Wang, Yan Mei

    2011-03-01

    We have developed a new single-molecule fluorescence imaging analysis method, SIMA, to improve the temporal resolution of single-molecule localization and tracking studies to millisecond timescales without compromising nanometer-range spatial resolution [1,2]. In this method, the width of the fluorescence intensity profile of a static or mobile molecule, imaged using submillisecond to millisecond exposure times, is used for localization and dynamics analysis. We apply this method to three single-molecule studies: (1) subdiffraction molecular separation measurements, (2) axial localization precision measurements, and (3) protein diffusion coefficient measurements in free solution. Applications of SIMA in flagella IFT particle analysis, localization of UgtP (a cell division regulator protein) in live cells, and diffusion coefficient measurements of LacI in vitro and in vivo will be discussed.

  2. Net Clinical Benefit of Oral Anticoagulants: A Multiple Criteria Decision Analysis

    PubMed Central

    Yang, Yea-Huei Kao; Lu, Christine Y.

    2015-01-01

    Background This study quantitatively evaluated the comparative efficacy and safety of new oral anticoagulants (dabigatran, rivaroxaban, and apixaban) and warfarin for treatment of nonvalvular atrial fibrillation. We also compared these agents under different scenarios, including populations at high risk of stroke and primary vs. secondary stroke prevention. Methods We used multiple criteria decision analysis (MCDA) to assess the benefit-risk of these medications. Our MCDA models contained criteria for benefits (prevention of ischemic stroke and systemic embolism) and risks (intracranial and extracranial bleeding). We calculated a performance score for each drug accounting for benefits and risks in comparison to treatment alternatives. Results Overall, new agents had higher performance scores than warfarin; in order of performance scores: dabigatran 150 mg (0.529), rivaroxaban (0.462), apixaban (0.426), and warfarin (0.191). For patients at a higher risk of stroke (CHADS2 score ≥ 3), apixaban had the highest performance score (0.686); performance scores for the other drugs were 0.462 for dabigatran 150 mg, 0.392 for dabigatran 110 mg, 0.271 for rivaroxaban, and 0.116 for warfarin. Dabigatran 150 mg had the highest performance score for primary stroke prevention, while dabigatran 110 mg had the highest performance score for secondary prevention. Conclusions Our results suggest that new oral anticoagulants might be preferred over warfarin. Selecting appropriate medicines according to the patient's condition, based on information from an integrated benefit-risk assessment of treatment options, is crucial to achieving optimal clinical outcomes. PMID:25897861

  3. TESTING MULTI-CRITERIA DECISION ANALYSIS FOR MORE TRANSPARENT RESOURCE-ALLOCATION DECISION MAKING IN COLOMBIA.

    PubMed

    Castro Jaramillo, Hector Eduardo; Goetghebeur, Mireille; Moreno-Mattar, Ornella

    2016-01-01

    In 2012, Colombia experienced an important institutional transformation after the establishment of the Health Technology Assessment Institute (IETS), the disbandment of the Regulatory Commission for Health, and the reassignment of reimbursement decision-making powers to the Ministry of Health and Social Protection (MoHSP). These dynamic changes provided the opportunity to test multi-criteria decision analysis (MCDA) for systematic and more transparent resource-allocation decision making. During 2012 and 2013, the MCDA framework Evidence and Value: Impact on Decision Making (EVIDEM) was tested in Colombia. This consisted of a preparatory stage in which the investigators conducted literature searches and produced HTA reports for four interventions of interest, followed by a panel session with decision makers. This method was contrasted with a current approach used in Colombia for updating the publicly financed benefits package (POS), where narrative health technology assessment (HTA) reports are presented alongside comprehensive budget impact analyses (BIAs). Disease severity, size of population, and efficacy ranked at the top among fifteen preselected relevant criteria. MCDA estimates of the technologies of interest ranged from 71 to 90 percent of the maximum value. The ranking of technologies was sensitive to the methods used. Participants considered that a two-step approach, including an MCDA template complemented by a detailed BIA, would be the best approach to assist decision making in this context. Participants agreed that systematic priority setting should take place in Colombia. This work may serve as a basis for the MoHSP in its interest in setting up a systematic and more transparent process for resource-allocation decision making.

  4. Hybrid Expert Systems In Image Analysis

    NASA Astrophysics Data System (ADS)

    Dixon, Mark J.; Gregory, Paul J.

    1987-04-01

    Vision systems capable of inspecting industrial components and assemblies have a large potential market if they can be easily programmed and produced quickly. Currently, vision application software written in conventional high-level languages such as C or Pascal is produced by experts in program design, image analysis, and process control. Applications written this way are difficult to maintain and modify. Unless other similar inspection problems can be found, the final program is essentially one-off redundant code. A general-purpose vision system targeted at the Visual Machines Ltd. C-VAS 3000 image processing workstation is described which will make writing image analysis software accessible to the non-expert in both programming computers and image analysis. A significant reduction in the effort required to produce vision systems will be gained through a graphically-driven interactive application generator. Finally, an Expert System will be layered on top to guide the naive user through the process of generating an application.

  5. Image analysis in comparative genomic hybridization

    SciTech Connect

    Lundsteen, C.; Maahr, J.; Christensen, B.

    1995-01-01

    Comparative genomic hybridization (CGH) is a new technique by which genomic imbalances can be detected by combining in situ suppression hybridization of whole genomic DNA and image analysis. We have developed software for rapid, quantitative CGH image analysis by a modification and extension of the standard software used for routine karyotyping of G-banded metaphase spreads in the Magiscan chromosome analysis system. The DAPI-counterstained metaphase spread is karyotyped interactively. Corrections for image shifts between the DAPI, FITC, and TRITC images are done manually by moving the three images relative to each other. The fluorescence background is subtracted. A mean filter is applied to smooth the FITC and TRITC images before the fluorescence ratio between the individual FITC- and TRITC-stained chromosomes is computed pixel by pixel inside the area of the chromosomes determined by the DAPI boundaries. Fluorescence intensity ratio profiles are generated, and peaks and valleys indicating possible gains and losses of test DNA are marked if the ratio falls below 0.75 or exceeds 1.25. By combining the analysis of several metaphase spreads, consistent findings of gains and losses in all or almost all spreads indicate chromosomal imbalance. Chromosomal imbalances are detected either by visual inspection of fluorescence ratio (FR) profiles or by a statistical approach that compares FR measurements of the individual case with measurements of normal chromosomes. The complete analysis of one metaphase can be carried out in approximately 10 minutes. 8 refs., 7 figs., 1 tab.
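
    A minimal Python sketch of the pixel-wise ratio step described above: smooth the FITC and TRITC images, compute the ratio inside a chromosome mask, and flag pixels outside the 0.75-1.25 band. The image data and mask here are synthetic stand-ins, not CGH data.

```python
# Minimal sketch: mean-filter two fluorescence channels, take their ratio
# inside a mask, and count pixels suggesting gain or loss of test DNA.
import numpy as np
from scipy import ndimage

fitc = np.random.rand(64, 64) + 0.5
tritc = np.random.rand(64, 64) + 0.5
mask = np.zeros((64, 64), dtype=bool)
mask[20:40, 10:50] = True                       # stand-in DAPI chromosome boundary

fitc_s = ndimage.uniform_filter(fitc, size=3)   # mean filter, as in the abstract
tritc_s = ndimage.uniform_filter(tritc, size=3)

ratio = np.where(mask, fitc_s / tritc_s, np.nan)
gain = int(np.sum(ratio > 1.25))
loss = int(np.sum(ratio < 0.75))
print(f"pixels suggesting gain: {gain}, loss: {loss}")
```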

  6. Multi-Criteria Analysis for Biomass Utilization Applying Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Morimoto, Hidetsugu; Hoshino, Satoshi; Kuki, Yasuaki

    This paper considered material recycling, global warming prevention, and economic efficiency for 195 existing and planned Biomass Towns by applying DEA (Data Envelopment Analysis), which can evaluate the operational efficiency of entities such as private companies or projects. The results showed that although Biomass Towns can recycle material efficiently, global warming prevention and business profitability were largely neglected in the Biomass Town designs. Moreover, from the point of view of operational efficiency, we used DEA to suggest improvements in Biomass Town scale to further enhance efficiency. We found that applying DEA captured more potential improvements and indicators than cost-benefit analysis or cost-effectiveness analysis.

  7. Multiscale data reduction with flexible saliency criterion for biological image analysis.

    PubMed

    Bosl, William J

    2009-01-01

    Analysis of biomedical images requires attention to image features that represent a small fraction of the total image size. A rapid method for eliminating unnecessary detail, analogous to pre-attentive processing in biological vision, allows computational resources to be applied where most needed for higher-level analysis. In this report we describe a method for bottom up merging of pixels into larger units based on flexible saliency criteria using a method similar to structured adaptive grid methods used for solving differential equations on physical domains. While creating a multiscale quadtree representation of the image, a saliency test is applied to prune the tree to eliminate unneeded details, resulting in an image with adaptive resolution. This method may be used as a first step for image segmentation and analysis and is inherently parallel, enabling implementation on programmable hardware or distributed memory clusters.
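
    A hedged Python sketch of the adaptive-resolution idea: build a quadtree over the image and stop subdividing wherever a saliency test (here, local intensity variance) indicates no further detail is needed. This top-down formulation, the variance criterion and the thresholds are illustrative assumptions rather than the paper's exact algorithm.

```python
# Minimal sketch: quadtree decomposition pruned by a flexible saliency test.
import numpy as np

def quadtree(img, x, y, size, var_threshold, min_size=4):
    """Return leaf blocks (x, y, size); low-variance blocks are not subdivided."""
    block = img[y:y + size, x:x + size]
    if size <= min_size or block.var() < var_threshold:
        return [(x, y, size)]               # no salient detail here: prune
    half = size // 2
    leaves = []
    for dx, dy in [(0, 0), (half, 0), (0, half), (half, half)]:
        leaves += quadtree(img, x + dx, y + dy, half, var_threshold, min_size)
    return leaves

img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0                     # one "feature" in a flat background
leaves = quadtree(img, 0, 0, 64, var_threshold=1e-3)
print(f"{len(leaves)} leaf blocks instead of {64 * 64} pixels")
```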

  8. Nicotine replacement therapy decision based on fuzzy multi-criteria analysis

    NASA Astrophysics Data System (ADS)

    Tarmudi, Zamali; Matmali, Norfazillah; Abdullah, Mohd Lazim

    2017-08-01

    It has been observed that Nicotine Replacement Therapy (NRT) is one of the alternatives to control and reduce smoking addiction among smokers. Since the decision to choose the best NRT alternative involves uncertainty, ambiguity and diverse input datasets, this paper proposes a fuzzy multi-criteria analysis (FMA) to overcome these issues. It focuses on how the fuzzy approach can unify the diversity of datasets in the NRT decision-making problem. The analysis employed the cost-benefit criterion to unify the mixed input datasets, and a performance matrix was utilised to derive the performance scores. An empirical example of the NRT decision-making problem is used to illustrate the proposed approach. Based on the calculations, this analytical approach was found to be highly beneficial in terms of usability, and very applicable and efficient in dealing with mixed input datasets. Hence, the decision-making process can easily be used by experts and patients interested in joining the therapy/cessation program.

  9. Sustainability assessment of tertiary wastewater treatment technologies: a multi-criteria analysis.

    PubMed

    Plakas, K V; Georgiadis, A A; Karabelas, A J

    2016-01-01

    The multi-criteria analysis gives the opportunity to researchers, designers and decision-makers to examine decision options in a multi-dimensional fashion. On this basis, four tertiary wastewater treatment (WWT) technologies were assessed regarding their sustainability performance in producing recycled wastewater, considering a 'triple bottom line' approach (i.e. economic, environmental, and social). These are powdered activated carbon adsorption coupled with ultrafiltration membrane separation (PAC-UF), reverse osmosis, ozone/ultraviolet-light oxidation and heterogeneous photo-catalysis coupled with low-pressure membrane separation (photocatalytic membrane reactor, PMR). The participatory method called simple multi-attribute rating technique exploiting ranks was employed for assigning weights to selected sustainability indicators. This sustainability assessment approach resulted in the development of a composite index as a final metric, for each WWT technology evaluated. The PAC-UF technology appears to be the most appropriate technology, attaining the highest composite value regarding the sustainability performance. A scenario analysis confirmed the results of the original scenario in five out of seven cases. In parallel, the PMR was highlighted as the technology with the least variability in its performance. Nevertheless, additional actions and approaches are proposed to strengthen the objectivity of the final results.

  10. A comparative analysis based on different strength criteria for evaluation of risk factor for dental implants.

    PubMed

    Natali, A N; Pavan, P G

    2002-04-01

    A numerical analysis is developed to study the interaction phenomena between endosseous titanium dental implants and the surrounding jawbone tissue. The interest is focused on the most appropriate evaluation of the stress state arising in the tissue because of the implant under physiological loading. The problem is considered with regard to the linear elastic response of the bone and to short-term effects. Different configurations of the bone-implant system are described, using axisymmetric and three-dimensional models, by means of the finite element method. The investigation addresses the stress states induced in bone that approach a limit condition near the effective failure surface. Parameters commonly adopted in the literature, such as the von Mises stress, represent an excessive simplification of the problem formulation, leading to an incorrect evaluation of the real failure risk for the implant, owing to the isotropic and purely deviatoric nature of the adopted stress measure. A more suitable criterion, such as the Tsai-Wu criterion, can be adopted to take into account the anisotropy that characterises the response of bone, as well as the influence of the hydrostatic stress state. The analysis offers a comparison of results obtained using different criteria, leading to an evaluation of the reliability of the procedure to be followed and of a risk factor for the implant investigated.
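
    For reference, the Tsai-Wu criterion combines linear and quadratic stress terms with strength parameters that differ in tension and compression, which is what lets it capture the anisotropic, pressure-sensitive failure behaviour mentioned above. The following Python sketch evaluates the standard plane-stress form of the criterion; the strength values and stress state are placeholders, not bone properties from the paper.

```python
# Minimal plane-stress Tsai-Wu sketch: failure index >= 1 indicates failure.
import math

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """s1, s2, t12: stresses; Xt/Xc, Yt/Yc: tensile/compressive strengths; S: shear strength."""
    F1, F2 = 1 / Xt - 1 / Xc, 1 / Yt - 1 / Yc
    F11, F22, F66 = 1 / (Xt * Xc), 1 / (Yt * Yc), 1 / S**2
    F12 = -0.5 * math.sqrt(F11 * F22)          # common default interaction term
    return (F1 * s1 + F2 * s2 + F11 * s1**2 + F22 * s2**2
            + F66 * t12**2 + 2 * F12 * s1 * s2)

# Placeholder strengths (MPa) and a trial stress state.
print(tsai_wu_index(s1=60.0, s2=-20.0, t12=15.0,
                    Xt=120.0, Xc=150.0, Yt=50.0, Yc=80.0, S=40.0))
```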

  11. Strategic rehabilitation planning of piped water networks using multi-criteria decision analysis.

    PubMed

    Scholten, Lisa; Scheidegger, Andreas; Reichert, Peter; Maurer, Max; Mauer, Max; Lienert, Judit

    2014-02-01

    To overcome the difficulties of strategic asset management of water distribution networks, a pipe failure and a rehabilitation model are combined to predict the long-term performance of rehabilitation strategies. Bayesian parameter estimation is performed to calibrate the failure and replacement model based on a prior distribution inferred from three large water utilities in Switzerland. Multi-criteria decision analysis (MCDA) and scenario planning build the framework for evaluating 18 strategic rehabilitation alternatives under future uncertainty. Outcomes for three fundamental objectives (low costs, high reliability, and high intergenerational equity) are assessed. Exploitation of stochastic dominance concepts helps to identify twelve non-dominated alternatives and local sensitivity analysis of stakeholder preferences is used to rank them under four scenarios. Strategies with annual replacement of 1.5-2% of the network perform reasonably well under all scenarios. In contrast, the commonly used reactive replacement is not recommendable unless cost is the only relevant objective. Exemplified for a small Swiss water utility, this approach can readily be adapted to support strategic asset management for any utility size and based on objectives and preferences that matter to the respective decision makers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Multi-criteria decision analysis in environmental sciences: ten years of applications and trends.

    PubMed

    Huang, Ivy B; Keisler, Jeffrey; Linkov, Igor

    2011-09-01

    Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by differing stakeholder views. Multi-criteria decision analysis (MCDA) emerged as a formal methodology for combining available technical information and stakeholder values to support decisions in many fields, and it can be especially valuable in environmental decision making. This study reviews environmental applications of MCDA. Over 300 papers published between 2000 and 2009 reporting MCDA applications in the environmental field were identified through a series of queries in the Web of Science database. The papers were classified by their environmental application area and by decision or intervention type. In addition, the papers were also classified by the MCDA methods used in the analysis (analytic hierarchy process, multi-attribute utility theory, and outranking). The results suggest that there has been significant growth in environmental applications of MCDA over the last decade across all environmental application areas. Multiple MCDA tools have been used successfully for environmental applications. Even though the use of specific methods and tools varies across application areas and geographic regions, our review of several papers in which multiple methods were applied in parallel to the same problem indicates that the recommended course of action does not vary significantly with the method applied.

  13. Variability analysis of AGN: a review of results using new statistical criteria

    NASA Astrophysics Data System (ADS)

    Zibecchi, L.; Andruchow, I.; Cellone, S. A.; Romero, G. E.; Combi, J. A.

    We present here a re-analysis of the variability results of a sample of active galactic nuclei (AGN), which have been observed over several sessions with the 2.15 m "Jorge Sahade" telescope (CASLEO), San Juan, Argentina, and whose results are published (Romero et al. 1999, 2000, 2002; Cellone et al. 2000). The motivation for this new analysis is the implementation, during the last years, of improvements in the statistical criteria applied, taking quantitatively into account the incidence of the photometric errors (Cellone et al. 2007). This work is framed as a first step in an integral study of the statistical estimators of AGN variability. This study is motivated by the great diversity of statistical tests that have been proposed to analyze the variability of these objects. Since we note that, in some cases, the variability results for an object depend on the test used, we attempt to make a comparative study of the various tests and analyze, under the given conditions, which of them is the most efficient and reliable.

  14. Use of the European preliminary criteria, the Breiman-classification tree and the American-European criteria for diagnosis of primary Sjögren's Syndrome in daily practice: a retrospective analysis.

    PubMed

    Langegger, C; Wenger, M; Duftner, C; Dejaco, C; Baldissera, I; Moncayo, R; Schirmer, M

    2007-06-01

    This study was conducted to assess the use of the European preliminary criteria, the Breiman-classification tree and the American-European criteria for diagnosis of primary Sjögren's Syndrome (pSS) in daily practice. A retrospective analysis of 17 consecutive patients with pSS (European criteria) was performed evaluating the application of the Schirmer test, semiquantitative sialoscintigraphy, immunologic tests, including rheumatoid factor, antinuclear antibodies, Sjögren's syndrome autoantibodies (SS-A, SS-B) and lip biopsy. Out of the 17 patients with pSS according to the European criteria, 15 patients fulfilled the classification tree (=88.2%), and 4 patients fulfilled the American-European criteria (=23.5%, P = 0.001). In the four patients fulfilling the American-European criteria, a positive result of the sialoscintigraphy was not crucial for the diagnosis according to these criteria. In conclusion, the American-European criteria are more stringent than the European preliminary criteria. We assume the role of sialoscintigraphy to be reduced when applying the American-European criteria.

  15. Retinal imaging analysis based on vessel detection.

    PubMed

    Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila

    2017-03-13

    With an increase in the advancement of digital imaging and computing power, computationally intelligent technologies are in high demand for use in ophthalmology care and treatment. In the current research, Retina Image Analysis (RIA) was developed for optometrists at the Eye Care Center of Management and Science University. This research aims to analyze the retina through vessel detection. RIA assists in the analysis of retinal images, and specialists are offered options such as saving, processing and analyzing retinal images through its interface. Additionally, RIA assists in selecting vessel segments, processing these vessels by calculating their diameter, standard deviation and length, and displaying the detected vessels on the retina. The Agile Unified Process was adopted as the methodology in developing this research. To conclude, Retina Image Analysis might help optometrists gain a better understanding of the patient's retina. Finally, the Retina Image Analysis procedure was developed using MATLAB (R2011b). Promising results were attained that are comparable to the state of the art.

  16. MRI Image Processing Based on Fractal Analysis

    PubMed

    Marusina, Mariya Y; Mochalina, Alexandra P; Frolova, Ekaterina P; Satikov, Valentin I; Barchuk, Anton A; Kuznetcov, Vladimir I; Gaidukov, Vadim S; Tarakanov, Segrey A

    2017-01-01

    Background: Cancer is one of the most common causes of human mortality, with about 14 million new cases and 8.2 million deaths reported in 2012. Early diagnosis of cancer through screening allows interventions to reduce mortality. Fractal analysis of medical images may be useful for this purpose. Materials and Methods: In this study, we examined magnetic resonance (MR) images of healthy livers and livers containing metastases from colorectal cancer. The fractal dimension and the Hurst exponent were chosen as diagnostic features for tomographic imaging, using the ImageJ software package for image processing and its FracLac plugin for fractal analysis over a 120x150 pixel area. Calculations of the fractal dimensions of pathological and healthy tissue samples were performed using the box-counting method. Results: In pathological cases (foci formation), the Hurst exponent was less than 0.5 (the region of unstable statistical characteristics). For healthy tissue, the Hurst exponent was greater than 0.5 (the zone of stable characteristics). Conclusions: The study indicated the possibility of employing rapid fractal analysis for the detection of focal lesions of the liver. The Hurst exponent can be used as an important diagnostic characteristic for analysis of medical images.
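
    A minimal Python sketch of the box-counting estimate of fractal dimension used as a feature above; the thresholding, box sizes and synthetic region of interest are illustrative assumptions, not the FracLac settings from the study.

```python
# Minimal sketch: box-counting fractal dimension of a binarized region of interest.
import numpy as np

def box_counting_dimension(binary, sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        h, w = binary.shape
        # Count boxes of side s that contain at least one foreground pixel.
        n = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                if binary[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    # Slope of log(count) vs log(1/size) estimates the fractal dimension.
    coeffs = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return coeffs[0]

roi = np.random.rand(128, 128) > 0.6       # stand-in for a thresholded MR region
print(f"estimated fractal dimension: {box_counting_dimension(roi):.2f}")
```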

  17. Rock fracture image acquisition and analysis

    NASA Astrophysics Data System (ADS)

    Wang, W.; Zongpu, Jia; Chen, Liwan

    2007-12-01

    As a cooperative project between Sweden and China, this paper presents rock fracture image acquisition and analysis. Rock fracture images are acquired using UV illumination and visible optical illumination. To represent the fracture network reasonably, we set up models to characterize the network; based on these models, we used a best-fit Feret method to automatically determine the fracture zone and then skeletonized the fractures to obtain endpoints, junctions, holes, particles, and branches. Based on these new parameters and some common parameters, the fracture network density, porosity, connectivity and complexity can be obtained, and the fracture network is characterized. In the following, we first present basic considerations and parameters for fractures (primary study of the characteristics of rock fractures), then set up a model for fracture network analysis, use the model to analyze fracture networks in different images (two-dimensional fracture network analysis based on slices), and finally give conclusions and suggestions.

  18. GIS, Geoscience, Multi-criteria Analysis and Integrated Management of the Coastal Zone

    NASA Astrophysics Data System (ADS)

    Kacimi, Y.; Barich, A.

    2011-12-01

    In this third millennium, geology can be considered a decision science that intervenes in all domains of society. It has moved beyond its academic dimension to spread into domains that were until now out of reach. Combining different Geoscience sub-disciplines stems from a strong will to demonstrate the contribution of this science and its impact on daily life, especially by making it applicable to various innovative projects. Geophysics, geochemistry and structural geology are complementary disciplines that can be applied in perfect symbiosis in many domains such as construction, mining prospection, impact assessment, environment, etc. This can be demonstrated by using the data collected from these studies and integrating them into Geographic Information Systems (GIS) for multi-criteria analysis, which generally gives very impressive results. From this point, it is easy to set up mining, eco-geotouristic and risk-assessment models in order to establish land-use projects, and also for integrated management of the coastal zone (IMCZ). Tourism projects in Morocco focus on its coast, which extends at least 3500 km; managing this zone for building marinas or tourist infrastructure requires a deep and detailed study of marine currents along the coast, for example by creating surveillance models and a coastal hazards map. This innovative project will include geophysical, geochemical and structural geology studies combined with a multi-criteria analysis. The data will be integrated into a GIS to establish a coastal map that will highlight low-risk erosion zones and thus facilitate the implementation of ports and other construction projects. YES Morocco is a chapter of the International YES Network that aims to promote Geoscience in the service of society and the professional development of Young and Early Career Geoscientists. Our commitment to such a project will be of a qualitative nature within an associative framework that will involve

  19. Quantitative analysis of qualitative images

    NASA Astrophysics Data System (ADS)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  20. The Politics of Determining Merit Aid Eligibility Criteria: An Analysis of the Policy Process

    ERIC Educational Resources Information Center

    Ness, Erik C.

    2010-01-01

    Despite the scholarly attention on the effects of merit aid on college access and choice, particularly on the significant effect that states' varied eligibility criteria play, no studies have examined the policy process through which merit aid criteria are determined. This is surprising given the recent attention to state-level policy dynamics and…

  1. Experiment in multiple-criteria energy policy analysis. [Using HOPE (holistic preference evaluation)

    SciTech Connect

    Ho, J K

    1980-07-01

    An international panel of energy analysts participated in an experiment to use HOPE (holistic preference evaluation): an interactive parametric linear-programming method for multiple-criteria optimization. The criteria of cost, environmental effect, crude oil, and nuclear fuel were considered according to BESOM: an energy model for the US in the year 2000.

  2. The application of integral performance criteria to the analysis of discrete maneuvers in a driving simulator

    NASA Technical Reports Server (NTRS)

    Repa, B. S.; Zucker, R. S.; Wierwille, W. W.

    1977-01-01

    The influence of vehicle transient response characteristics on driver-vehicle performance in discrete maneuvers as measured by integral performance criteria was investigated. A group of eight ordinary drivers was presented with a series of eight vehicle transfer function configurations in a driving simulator. Performance in two discrete maneuvers was analyzed by means of integral performance criteria. Results are presented.

  3. The Politics of Determining Merit Aid Eligibility Criteria: An Analysis of the Policy Process

    ERIC Educational Resources Information Center

    Ness, Erik C.

    2010-01-01

    Despite the scholarly attention on the effects of merit aid on college access and choice, particularly on the significant effect that states' varied eligibility criteria play, no studies have examined the policy process through which merit aid criteria are determined. This is surprising given the recent attention to state-level policy dynamics and…

  4. Deep Learning in Medical Image Analysis

    PubMed Central

    Shen, Dinggang; Wu, Guorong; Suk, Heung-Il

    2016-01-01

    Computer-assisted analysis for better interpretation of images has been a longstanding issue in the medical imaging field. On the image-understanding front, recent advances in machine learning, especially deep learning, have made a big leap toward identifying, classifying, and quantifying patterns in medical images. Specifically, exploiting hierarchical feature representations learned solely from data, instead of handcrafted features designed mostly on the basis of domain-specific knowledge, lies at the core of these advances. In that way, deep learning is rapidly proving to be the state-of-the-art foundation, achieving enhanced performance in various medical applications. In this article, we introduce the fundamentals of deep learning methods and review their successes in image registration, detection of anatomical and cellular structures, tissue segmentation, computer-aided disease diagnosis and prognosis, and so on. We conclude by raising research issues and suggesting future directions for further improvements. PMID:28301734

  5. Single particle raster image analysis of diffusion.

    PubMed

    Longfils, M; Schuster, E; Lorén, N; Särkkä, A; Rudemo, M

    2017-04-01

    As a complement to the standard RICS method of analysing Raster Image Correlation Spectroscopy images by estimating the image correlation function, we introduce SPRIA, Single Particle Raster Image Analysis. Here, we start by identifying individual particles and estimate the diffusion coefficient for each particle by a maximum likelihood method. Averaging over the particles gives a diffusion coefficient estimate for the whole image. In examples with both simulated and experimental data, we show that the new method gives accurate estimates. It also directly provides standard error estimates. The method should be possible to extend to the study of heterogeneous materials and systems of particles with varying diffusion coefficients, as demonstrated in a simple simulation example. A requirement for applying the SPRIA method is that the particle concentration is low enough that the individual particles can be identified. We also describe a bootstrap method for estimating the standard error of standard RICS.
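
    As a toy illustration of per-particle diffusion estimation, the Python sketch below simulates one 2-D Brownian track and applies the closed-form maximum-likelihood estimator for the diffusion coefficient from frame-to-frame displacements. The simulation parameters are arbitrary, and this simplified estimator is not the paper's full intensity-profile-width likelihood.

```python
# Minimal sketch: per-particle MLE of the diffusion coefficient from a track,
# assuming pure isotropic 2-D Brownian motion.
import numpy as np

rng = np.random.default_rng(0)
D_true, dt, n_steps = 0.5, 0.01, 200            # um^2/s, s, frames (assumed)
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n_steps, 2))
track = np.cumsum(steps, axis=0)                # one simulated particle track

disp = np.diff(track, axis=0)
# MLE for D under isotropic Brownian motion: mean squared step / (4 dt).
D_hat = (disp ** 2).sum() / (4 * len(disp) * dt)
print(f"true D = {D_true}, estimated D = {D_hat:.3f}")
```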

  6. Particle Pollution Estimation Based on Image Analysis.

    PubMed

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic information and weather conditions, to predict PM2.5 index. The results demonstrate that the image analysis method provides good prediction of PM2.5 indexes, and different features have different significance levels in the prediction.

  7. Image Processing for Galaxy Ellipticity Analysis

    NASA Astrophysics Data System (ADS)

    Stankus, Paul

    2015-04-01

    Shape analysis of statistically large samples of galaxy images can be used to reveal the imprint of weak gravitational lensing by dark matter distributions. As new, large-scale surveys expand the potential catalog, galaxy shape analysis suffers the (coupled) problems of high noise and uncertainty in the prior morphology. We investigate a new image processing technique to help mitigate these problems, in which repeated auto-correlations and auto-convolutions are employed to push the true shape toward a universal (Gaussian) attractor while relatively suppressing uncorrelated pixel noise. The goal is reliable reconstruction of original image moments, independent of image shape. First test evaluations of the technique on small control samples will be presented, and future applicability discussed. Supported by the US-DOE.

  8. Particle Pollution Estimation Based on Image Analysis

    PubMed Central

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic information and weather conditions, to predict PM2.5 index. The results demonstrate that the image analysis method provides good prediction of PM2.5 indexes, and different features have different significance levels in the prediction. PMID:26828757

  9. Functional data analysis in brain imaging studies.

    PubMed

    Tian, Tian Siva

    2010-01-01

    Functional data analysis (FDA) considers the continuity of the curves or functions, and is a topic of increasing interest in the statistics community. FDA is commonly applied to time-series and spatial-series studies. The development of functional brain imaging techniques in recent years made it possible to study the relationship between brain and mind over time. Consequently, an enormous amount of functional data is collected and needs to be analyzed. Functional techniques designed for these data are in strong demand. This paper discusses three statistically challenging problems utilizing FDA techniques in functional brain imaging analysis. These problems are dimension reduction (or feature extraction), spatial classification in functional magnetic resonance imaging studies, and the inverse problem in magneto-encephalography studies. The application of FDA to these issues is relatively new but has been shown to be considerably effective. Future efforts can further explore the potential of FDA in functional brain imaging studies.

  10. GIS-based multicriteria municipal solid waste landfill suitability analysis: a review of the methodologies performed and criteria implemented.

    PubMed

    Demesouka, O E; Vavatsikos, A P; Anagnostopoulos, K P

    2014-04-01

    Multicriteria spatial decision support systems (MC-SDSS) have emerged as an integration of the geographical information systems (GIS) and multiple criteria decision analysis (MCDA) methods. GIS-based MCDA allows the incorporation of conflicting objectives and decision maker (DM) preferences into spatial decision models. During recent decades, a variety of research articles have been published regarding the implementation of methods and/or tools in a variety of real-world case studies. The article discusses, in detail, the criteria and methods that are implemented in GIS-based landfill siting suitability analysis and especially the exclusionary and non-exclusionary criteria that can be considered when selecting sites for municipal solid waste (MSW) landfills. This paper reviews 36 seminal articles in which the evaluation of candidate landfill sites is conducted using MCDA methods. After a brief description of the main components of a MC-SDSS and the applied decision rules, the review focuses on the criteria incorporated into the decision models. The review provides a comprehensive guide to the landfill siting analysis criteria, providing details regarding the utilization methods, their decision or exclusionary nature and their monotonicity.

  11. Choices, choices: the application of multi-criteria decision analysis to a food safety decision-making problem.

    PubMed

    Fazil, A; Rajic, A; Sanchez, J; McEwen, S

    2008-11-01

    In the food safety arena, the decision-making process can be especially difficult. Decision makers are often faced with social and fiscal pressures when attempting to identify an appropriate balance among several choices. Concurrently, policy and decision makers in microbial food safety are under increasing pressure to demonstrate that their policies and decisions are made using transparent and accountable processes. In this article, we present a multi-criteria decision analysis approach that can be used to address the problem of trying to select a food safety intervention while balancing various criteria. Criteria that are important when selecting an intervention were determined, as a result of an expert consultation, to include effectiveness, cost, weight of evidence, and practicality associated with the interventions. The multi-criteria decision analysis approach we present is able to consider these criteria and arrive at a ranking of interventions. It can also provide a clear justification for the ranking as well as demonstrate to stakeholders, through a scenario analysis approach, how to potentially converge toward common ground. While this article focuses on the problem of selecting food safety interventions, the range of applications in the food safety arena is truly diverse and can be a significant tool in assisting decisions that need to be coherent, transparent, and justifiable. Most importantly, it is a significant contributor when there is a need to strike a fine balance between various potentially competing alternatives and/or stakeholder groups.

  12. Integral-geometry morphological image analysis

    NASA Astrophysics Data System (ADS)

    Michielsen, K.; De Raedt, H.

    2001-07-01

    This paper reviews a general method to characterize the morphology of two- and three-dimensional patterns in terms of geometrical and topological descriptors. Based on concepts of integral geometry, it involves the calculation of the Minkowski functionals of black-and-white images representing the patterns. The result of this approach is an objective, numerical characterization of a given pattern. We briefly review the basic elements of morphological image processing, a technique to transform images to patterns that are amenable to further morphological image analysis. The image processing technique is applied to electron microscope images of nano-ceramic particles and metal-oxide precipitates. The emphasis of this review is on the practical aspects of the integral-geometry-based morphological image analysis but we discuss its mathematical foundations as well. Applications to simple lattice structures, triply periodic minimal surfaces, and the Klein bottle serve to illustrate the basic steps of the approach. More advanced applications include random point sets, percolation and complex structures found in block copolymers.
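
    To make the Minkowski-functional description concrete, here is a brief Python sketch that computes the three 2-D functionals (area, boundary length, Euler characteristic) of a synthetic black-and-white pattern with scikit-image; the pattern and parameter choices are illustrative only.

```python
# Minimal sketch: 2-D Minkowski functionals of a binary pattern.
import numpy as np
from skimage.measure import perimeter, euler_number

pattern = np.zeros((64, 64), dtype=bool)
pattern[10:30, 10:30] = True
pattern[15:25, 15:25] = False                   # a square with a hole

area = int(pattern.sum())                       # first functional: area
perim = perimeter(pattern)                      # second functional: boundary length
euler = euler_number(pattern, connectivity=2)   # third functional: Euler characteristic
print(f"area={area}, perimeter={perim:.1f}, Euler characteristic={euler}")
```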

  13. Data analysis for GOPEX image frames

    NASA Technical Reports Server (NTRS)

    Levine, B. M.; Shaik, K. S.; Yan, T.-Y.

    1993-01-01

    The data analysis based on the image frames received at the Solid State Imaging (SSI) camera of the Galileo Optical Experiment (GOPEX) demonstration conducted between 9-16 Dec. 1992 is described. Laser uplink was successfully established between the ground and the Galileo spacecraft during its second Earth-gravity-assist phase in December 1992. SSI camera frames were acquired which contained images of detected laser pulses transmitted from the Table Mountain Facility (TMF), Wrightwood, California, and the Starfire Optical Range (SOR), Albuquerque, New Mexico. Laser pulse data were processed using standard image-processing techniques at the Multimission Image Processing Laboratory (MIPL) for preliminary pulse identification and to produce public release images. Subsequent image analysis corrected for background noise to measure received pulse intensities. Data were plotted to obtain histograms on a daily basis and were then compared with theoretical results derived from applicable weak-turbulence and strong-turbulence considerations. Processing steps are described and the theories are compared with the experimental results. Quantitative agreement was found in both turbulence regimes, and better agreement would have been found, given more received laser pulses. Future experiments should consider methods to reliably measure low-intensity pulses, and through experimental planning to geometrically locate pulse positions with greater certainty.

  14. Chromatic Image Analysis For Quantitative Thermal Mapping

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  15. Making Good Decisions in Healthcare with Multi-Criteria Decision Analysis: The Use, Current Research and Future Development of MCDA.

    PubMed

    Mühlbacher, Axel C; Kaczynski, Anika

    2016-02-01

    Healthcare decision making is usually characterized by a low degree of transparency. The demand for transparent decision processes can be fulfilled only when assessment, appraisal and decisions about health technologies are performed under a systematic construct of benefit assessment. The benefit of an intervention is often multidimensional and, thus, must be represented by several decision criteria. Complex decision problems require an assessment and appraisal of various criteria; therefore, a decision process that systematically identifies the best available alternative and enables an optimal and transparent decision is needed. For that reason, decision criteria must be weighted and goal achievement must be scored for all alternatives. Methods of multi-criteria decision analysis (MCDA) are available to analyse and appraise multiple clinical endpoints and structure complex decision problems in healthcare decision making. By means of MCDA, value judgments, priorities and preferences of patients, insurees and experts can be integrated systematically and transparently into the decision-making process. This article describes the MCDA framework and identifies potential areas where MCDA can be of use (e.g. approval, guidelines and reimbursement/pricing of health technologies). A literature search was performed to identify current research in healthcare. The results showed that healthcare decision making is addressing the problem of multiple decision criteria and is focusing on the future development and use of techniques to weight and score different decision criteria. This article emphasizes the use and future benefit of MCDA.

  16. Advanced automated char image analysis techniques

    SciTech Connect

    Tao Wu; Edward Lester; Michael Cloke

    2006-05-15

    Char morphology is an important characteristic when attempting to understand coal behavior and coal burnout. In this study, an augmented algorithm has been proposed to identify char types using image analysis. On the basis of a series of image processing steps, a char image is singled out from the whole image, which then allows the important major features of the char particle to be measured, including size, porosity, and wall thickness. The techniques for automated char image analysis have been tested against char images taken from the ICCP Char Atlas as well as actual char particles derived from pyrolyzed char samples. Thirty different chars were prepared in a drop tube furnace operating at 1300 °C, 1% oxygen, and 100 ms from 15 different world coals sieved into two size fractions (53-75 and 106-125 µm). The results from this automated technique are comparable with those from manual analysis, and the additional detail from the automated system has potential use in applications such as combustion modeling systems. Obtaining highly detailed char information with automated methods has traditionally been hampered by the difficulty of automatic recognition of individual char particles. 20 refs., 10 figs., 3 tabs.

  17. Computer assisted analysis of microscopy images

    NASA Astrophysics Data System (ADS)

    Sawicki, M.; Munhutu, P.; DaPonte, J.; Caragianis-Broadbridge, C.; Lehman, A.; Sadowski, T.; Garcia, E.; Heyden, C.; Mirabelle, L.; Benjamin, P.

    2009-01-01

    The use of Transmission Electron Microscopy (TEM) to characterize the microstructure of a material continues to grow in importance as technological advancements become increasingly more dependent on nanotechnology [1]. Since nanoparticle properties such as size (diameter) and size distribution are often important in determining potential applications, a particle analysis is often performed on TEM images. Traditionally done manually, this has the potential to be labor intensive, time consuming, and subjective [2]. To resolve these issues, automated particle analysis routines are becoming more widely accepted within the community [3]. When using such programs, it is important to compare their performance, in terms of functionality and cost. The primary goal of this study was to apply one such software package, ImageJ, to grayscale TEM images of nanoparticles with known size. A secondary goal was to compare this popular open-source, general-purpose image processing program to two commercial software packages. After a brief investigation of performance and price, ImageJ was identified as the software best suited for the particle analysis conducted in the study. While many ImageJ functions were used, the ability to break agglomerations that occur in specimen preparation into separate particles using a watershed algorithm was particularly helpful [4].
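
    The watershed-based splitting of agglomerated particles can be reproduced outside ImageJ. The sketch below uses scikit-image (function names as in version 0.19 or later, an assumption) on its bundled "coins" sample image, which stands in here for a grayscale TEM micrograph.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage import data, filters, measure, segmentation, feature

        img = data.coins()                              # sample image standing in for a TEM micrograph
        binary = img > filters.threshold_otsu(img)      # bright particles on dark background

        # split touching particles with a watershed on the distance transform
        distance = ndi.distance_transform_edt(binary)
        peaks = feature.peak_local_max(distance, labels=measure.label(binary),
                                       footprint=np.ones((15, 15), dtype=bool),
                                       exclude_border=False)
        markers = np.zeros(distance.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        labels = segmentation.watershed(-distance, markers, mask=binary)

        diameters = [r.equivalent_diameter for r in measure.regionprops(labels)]
        print(f"{len(diameters)} particles, mean equivalent diameter {np.mean(diameters):.1f} px")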

  18. VAICo: visual analysis for image comparison.

    PubMed

    Schmidt, Johanna; Gröller, M Eduard; Bruckner, Stefan

    2013-12-01

    Scientists, engineers, and analysts are confronted with ever larger and more complex sets of data, whose analysis poses special challenges. In many situations it is necessary to compare two or more datasets. Hence there is a need for comparative visualization tools to help analyze differences or similarities among datasets. In this paper an approach for comparative visualization for sets of images is presented. Well-established techniques for comparing images frequently place them side-by-side. A major drawback of such approaches is that they do not scale well. Other image comparison methods encode differences in images by abstract parameters like color. In this case information about the underlying image data gets lost. This paper introduces a new method for visualizing differences and similarities in large sets of images which preserves contextual information, but also allows the detailed analysis of subtle variations. Our approach identifies local changes and applies cluster analysis techniques to embed them in a hierarchy. The results of this process are then presented in an interactive web application which allows users to rapidly explore the space of differences and drill-down on particular features. We demonstrate the flexibility of our approach by applying it to multiple distinct domains.

  19. On Two-Dimensional ARMA Models for Image Analysis.

    DTIC Science & Technology

    1980-03-24

    2-D ARMA models for image analysis. Particular emphasis is placed on restoration of noisy images using 2-D ARMA models. Computer results are...is concluded that the models are very effective linear models for image analysis. (Author)

  20. Extreme value distribution based gene selection criteria for discriminant microarray data analysis using logistic regression.

    PubMed

    Li, Wentian; Sun, Fengzhu; Grosse, Ivo

    2004-01-01

    One important issue commonly encountered in the analysis of microarray data is to decide which and how many genes should be selected for further studies. For discriminant microarray data analyses based on statistical models, such as the logistic regression models, gene selection can be accomplished by a comparison of the maximum likelihood of the model given the real data, L(D|M), and the expected maximum likelihood of the model given an ensemble of surrogate data with randomly permuted label, L(D(0)|M). Typically, the computational burden for obtaining L(D(0)|M) is immense, often exceeding the limits of available computing resources by orders of magnitude. Here, we propose an approach that circumvents such heavy computations by mapping the simulation problem to an extreme-value problem. We present the derivation of an asymptotic distribution of the extreme-value as well as its mean, median, and variance. Using this distribution, we propose two gene selection criteria, and we apply them to two microarray datasets and three classification tasks for illustration.
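
    A rough illustration of the idea of replacing massive permutation runs with an extreme-value estimate: characterise the permutation null from a modest number of label shuffles, then approximate the expected maximum of a much larger ensemble from the tail behaviour. The Gaussian-tail approximation, simulated data and selection rule below are illustrative stand-ins, not the asymptotic distribution derived in the paper.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 80
        x = rng.normal(size=n)
        y = (x + rng.normal(size=n) > 0).astype(int)   # simulated informative "gene"

        def loglik(x, y):
            m = LogisticRegression(C=1e6, max_iter=1000).fit(x.reshape(-1, 1), y)
            p = np.clip(m.predict_proba(x.reshape(-1, 1))[:, 1], 1e-12, 1 - 1e-12)
            return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

        L_real = loglik(x, y)

        # moderate number of label permutations to characterise the null likelihood
        L_null = np.array([loglik(x, rng.permutation(y)) for _ in range(200)])

        # extreme-value shortcut: expected maximum of N i.i.d. draws (here N = 10**6),
        # approximated from a Gaussian tail instead of running 10**6 permutations
        N = 10**6
        mu, sigma = L_null.mean(), L_null.std(ddof=1)
        expected_max = mu + sigma * np.sqrt(2 * np.log(N))

        print("select gene" if L_real > expected_max else "reject gene")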

  1. Using soil function evaluation in multi-criteria decision analysis for sustainability appraisal of remediation alternatives.

    PubMed

    Volchko, Yevheniya; Norrman, Jenny; Rosén, Lars; Bergknut, Magnus; Josefsson, Sarah; Söderqvist, Tore; Norberg, Tommy; Wiberg, Karin; Tysklind, Mats

    2014-07-01

    Soil contamination is one of the major threats constraining proper functioning of the soil and thus provision of ecosystem services. Remedial actions typically only address the chemical soil quality by reducing total contaminant concentrations to acceptable levels guided by land use. However, emerging regulatory requirements on soil protection demand a holistic view of soil assessment in remediation projects, thus accounting for a variety of soil functions. Such a view would require not only that the contamination concentrations are assessed and attended to, but also that other aspects are taken into account, thus addressing also physical and biological as well as other chemical soil quality indicators (SQIs). This study outlines how soil function assessment can be a part of a holistic sustainability appraisal of remediation alternatives using multi-criteria decision analysis (MCDA). The paper presents a method for practitioners for evaluating the effects of remediation alternatives on selected ecological soil functions using a suggested minimum data set (MDS) containing physical, biological and chemical SQIs. The measured SQIs are transformed into sub-scores by the use of scoring curves, which allows interpretation and the integration of soil quality data into the MCDA framework. The method is demonstrated at a study site (Marieberg, Sweden) and the results give an example of how soil analyses using the suggested MDS can be used for soil function assessment and subsequent input to the MCDA framework.

  2. Multi-criteria Decision Analysis to Model Ixodes ricinus Habitat Suitability.

    PubMed

    Rousseau, Raphaël; McGrath, Guy; McMahon, Barry J; Vanwambeke, Sophie O

    2017-06-19

    Tick-borne diseases present a major threat to both human and livestock health throughout Europe. The risk of infection is directly related to the presence of the vector. It is therefore important to know its distribution, which is strongly associated with environmental factors: the presence and availability of a suitable habitat, a suitable climate and hosts. The present study models the habitat suitability for Ixodes ricinus in Ireland, where data on tick distribution are scarce. Tick habitat suitability was estimated at a coarse scale (10 km) with a multi-criteria decision analysis (MCDA) method according to four different scenarios (depending on the variables used and on the weights granted to each of them). The western part of Ireland and the Wicklow mountains in the East were estimated to be the most suitable areas for I. ricinus on the island. There was a good level of agreement between results from the MCDA and recorded tick presence. The different scenarios did not affect the spatial outputs substantially. The current study suggests that tick habitat suitability can be mapped accurately at a coarse scale in a data-scarce context using knowledge-based methods. It can serve as a guideline for future countrywide sampling that would help to determine local risk of tick presence and refine knowledge on tick habitat suitability in Ireland.

  3. Rural tourism spatial distribution based on multi-criteria decision analysis and GIS

    NASA Astrophysics Data System (ADS)

    Zhang, Hongxian; Yang, Qingsheng

    2008-10-01

    Studying the spatial distribution of rural tourism can provide a scientific basis for decisions on developing rural economies. Traditional approaches to tourism spatial distribution have limitations in quantifying priority locations for tourism development at the level of small units: they can only indicate the overall distribution of tourism and whether locations are suitable for tourism development, whereas a ranking of development priorities under different decision objectives should also be considered. This paper presents a way to rank locations for rural tourism development spatially by integrating multi-criteria decision analysis (MCDA) and geographic information systems (GIS). The stated objective is that areas with inconvenient transportation, an undeveloped economy and good tourism resources should be the first to develop rural tourism. Based on this objective, a tourism development priority utility is calculated for each town with MCDA and GIS, and towns with higher priority utility can be selected to develop rural tourism first. The method was applied successfully to rank locations for rural tourism in Ningbo City. The results show that MCDA is an effective way to distribute rural tourism spatially according to specific decision objectives, and that rural tourism can promote economic development.

  4. Development of tissue level brain injury criteria by finite element analysis.

    PubMed

    Ueno, K; Melvin, J W; Li, L; Lighthall, J W

    1995-08-01

    A three-dimensional finite element model of the direct cortical impact experiment was built and a preliminary validation against mechanical response was completed. The motion of the impactor was enforced in the model by applying the same acceleration history as that of the experimental impactor. A nonlinear contact surface algorithm was used for impactor-brain interface with the ABAQUS general purpose finite element program. The resulting motion of the impactor and the contacting node in the brain model confirmed that the impactor moved realistically and contacted the brain surface. The pressure generated in the model compared favorably with that measured by a pressure transducer in the experiment. The pattern of high shear deformation generated at the impact site in the model was similar to the pattern of contusion hemorrhage seen in the experiment. The pressure generated at the impact site propagated to the skull-brain boundary, especially at the posterior margin of the cerebellum. Analysis of experimental data using a biomechanically validated finite element model will enable determination of tissue-level injury criteria for application in human brain models to predict head injury potential in contact, noncontact, or side impact situations.

  5. Prediction of Depression in Cancer Patients With Different Classification Criteria, Linear Discriminant Analysis versus Logistic Regression.

    PubMed

    Shayan, Zahra; Mohammad Gholi Mezerji, Naser; Shayan, Leila; Naseri, Parisa

    2015-11-03

    Logistic regression (LR) and linear discriminant analysis (LDA) are two popular statistical models for prediction of group membership. Although they are very similar, LDA makes more assumptions about the data. When categorical and continuous variables are used simultaneously, the optimal choice between the two models is questionable. In most studies, classification error (CE) is used to discriminate between subjects in several groups, but this index is not suitable to predict the accuracy of the outcome. The present study compared LR and LDA models using classification indices. This cross-sectional study selected 243 cancer patients. Sample sets of different sizes (n = 50, 100, 150, 200, 220) were randomly selected and the CE, B, and Q classification indices were calculated by the LR and LDA models. CE revealed no superiority of one model over the other, but the results showed that LR performed better than LDA for the B and Q indices in all situations. No significant effect of sample size on CE was noted for selection of an optimal model. Assessment of the accuracy of prediction of real data indicated that the B and Q indices are appropriate for selection of an optimal model. The results of this study showed that LR performs better in some cases and LDA in others when based on CE. The CE index is not appropriate for classification, although the B and Q indices performed better and offered more efficient criteria for comparison and discrimination between groups.
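
    A minimal comparison of the two classifiers by classification error on simulated mixed continuous/binary predictors, using scikit-learn; the B and Q indices reported in the study are not reproduced here.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n = 243                                            # same order as the study's sample
        X_cont = rng.normal(size=(n, 3))                   # continuous predictors
        X_bin = rng.integers(0, 2, size=(n, 2))            # categorical (binary) predictors
        X = np.hstack([X_cont, X_bin])
        y = (X @ np.array([1.0, -0.5, 0.3, 0.8, -0.6]) + rng.normal(size=n) > 0).astype(int)

        for name, model in [("LR", LogisticRegression(max_iter=1000)),
                            ("LDA", LinearDiscriminantAnalysis())]:
            acc = cross_val_score(model, X, y, cv=5).mean()
            print(f"{name}: classification error = {1 - acc:.3f}")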

  6. Multiple-criteria decision analysis reveals high stakeholder preference to remove pharmaceuticals from hospital wastewater.

    PubMed

    Lienert, Judit; Koller, Mirjam; Konrad, Jonas; McArdell, Christa S; Schuwirth, Nele

    2011-05-01

    Point-source measures have been suggested to decrease pharmaceuticals in water bodies. We analyzed 68 and 50 alternatives, respectively, for a typical Swiss general and psychiatric hospital to decrease pharmaceutical discharge. Technical alternatives included reverse osmosis, ozonation, and activated carbon; organizational alternatives included urine separation. To handle this complex decision, we used Multiple-Criteria Decision Analysis (MCDA) and combined expert predictions (e.g., costs, pharmaceutical mass flows, ecotoxicological risk, pathogen removal) with subjective preference-valuations from 26 stakeholders (authorities, hospital-internal actors, experts). The general hospital contributed ca. 38% to the total pharmaceutical load at the wastewater treatment plant, the psychiatry contributed 5%. For the general hospital, alternatives removing all pharmaceuticals (especially reverse osmosis, or vacuum-toilets and incineration), performed systematically better than the status quo or urine separation, despite higher costs. They now require closer scrutiny. To remove X-ray contrast agents, introducing roadbags is promising. For the psychiatry with a lower pharmaceutical load, costs were more critical. Stakeholder feedback concerning MCDA was very positive, especially because the results were robust across different stakeholder-types. Our MCDA results provide insight into an important water protection issue: implementing measures to decrease pharmaceuticals will likely meet acceptance. Hospital point-sources merit consideration if the trade-off between costs and pharmaceutical removal is reasonable.

  7. Mapping wetland functions using Earth observation data and multi-criteria analysis.

    PubMed

    Rapinel, Sébastien; Hubert-Moy, Laurence; Clément, Bernard; Maltby, Edward

    2016-11-01

    Wetland functional assessment is commonly conducted based on field observations, and thus, is generally limited to small areas. However, there is often a need for wetland managers to obtain information on wetland functional performance over larger areas. For this purpose, we are proposing a new field-based functional assessment procedure in which wetland functions are evaluated and classified into hydrogeomorphic units according to a multi-criteria analysis approach. Wetland-related geographic information system layers derived from Earth observation data (LiDAR, multispectral and radar data) are used in this study for a large-scale functional evaluation. These include maps of hydrogeomorphic units, ditches, vegetation, annual flood duration, biomass, meadow management, and wetland boundaries. To demonstrate the feasibility of this approach, a 132 km² international long-term ecological research site located in the west of France was assessed. Four wetland functions were evaluated: flood peak attenuation, low water attenuation, denitrification, and habitat. A spatial distribution map of the individual wetland functions was generated, and the intensity levels of the functions were highlighted. Antagonisms between functions within individual hydrogeomorphic units were also identified. Mapping of hydrological, biogeochemical, and ecological wetland functions over large areas can provide an efficient tool for policy makers and other stakeholders including water authorities, nature conservation agencies, and farmers. Specifically, this tool has the potential to provide a mapping of ecosystem services, conservation management priorities, and possible improvements in water resources management.

  8. ACR appropriateness criteria jaundice.

    PubMed

    Lalani, Tasneem; Couto, Corey A; Rosen, Max P; Baker, Mark E; Blake, Michael A; Cash, Brooks D; Fidler, Jeff L; Greene, Frederick L; Hindman, Nicole M; Katz, Douglas S; Kaur, Harmeet; Miller, Frank H; Qayyum, Aliya; Small, William C; Sudakoff, Gary S; Yaghmai, Vahid; Yarmish, Gail M; Yee, Judy

    2013-06-01

    A fundamental consideration in the workup of a jaundiced patient is the pretest probability of mechanical obstruction. Ultrasound is the first-line modality to exclude biliary tract obstruction. When mechanical obstruction is present, additional imaging with CT or MRI can clarify etiology, define level of obstruction, stage disease, and guide intervention. When mechanical obstruction is absent, additional imaging can evaluate liver parenchyma for fat and iron deposition and help direct biopsy in cases where underlying parenchymal disease or mass is found. Imaging techniques are reviewed for the following clinical scenarios: (1) the patient with painful jaundice, (2) the patient with painless jaundice, and (3) the patient with a nonmechanical cause for jaundice. The ACR Appropriateness Criteria are evidence-based guidelines for specific clinical conditions that are reviewed every 2 years by a multidisciplinary expert panel. The guideline development and review include an extensive analysis of current medical literature from peer-reviewed journals and the application of a well-established consensus methodology (modified Delphi) to rate the appropriateness of imaging and treatment procedures by the panel. In those instances where evidence is lacking or not definitive, expert opinion may be used to recommend imaging or treatment.

  9. Selecting an image analysis minicomputer system

    NASA Technical Reports Server (NTRS)

    Danielson, R.

    1981-01-01

    Factors to be weighed when selecting a minicomputer system as the basis for an image analysis computer facility vary depending on whether the user organization procures a new computer or selects an existing facility to serve as an image analysis host. Some conditions not directly related to hardware or software should be considered, such as the flexibility of the computer center staff, their encouragement of innovation, and the availability of the host processor to a broad spectrum of potential user organizations. Particular attention must be given to: image analysis software capability; the facilities of a potential host installation; the central processing unit; the operating system and languages; main memory; disk storage; tape drives; hardcopy output; and other peripherals. The operational environment, accessibility, resource limitations, and operational support are also important. Charges made for program execution and data storage must also be examined.

  10. Multi-criteria decision analysis using hydrological indicators for decision support - a conceptual framework.

    NASA Astrophysics Data System (ADS)

    Butchart-Kuhlmann, Daniel; Kralisch, Sven; Meinhardt, Markus; Fleischer, Melanie

    2017-04-01

    Assessing the quantity and quality of water available in water-stressed environments under various potential climate and land-use changes is necessary for good water and environmental resources management and governance. Within the region covered by the Southern African Science Service Centre for Climate Change and Adaptive Land Management (SASSCAL) project, such areas are common. One goal of the SASSCAL project is to develop and provide an integrated decision support system (DSS) with which decision makers (DMs) within a given catchment can obtain objective information regarding potential changes in water flow quantity and timing. The SASSCAL DSS builds upon existing data storage and distribution capability, through the SASSCAL Information System (IS), as well as the J2000 hydrological model. Using output from validated J2000 models, the SASSCAL DSS incorporates the calculation of a range of hydrological indicators based upon Indicators of Hydrological Alteration/Environmental Flow Components (IHA/EFC) calculated for a historic time series (pre-impact) and a set of model simulations based upon a selection of possible climate and land-use change scenarios (post-impact). These indicators, obtained using the IHA software package, are then used as input for a multi-criteria decision analysis (MCDA) undertaken using the open source diviz software package. The results of these analyses will provide DMs with an indication as to how various hydrological indicators within a catchment may be altered under different future scenarios, as well as providing a ranking of the scenarios according to different DM preferences. Scenarios are represented through a combination of model input data and parameter settings in J2000, and preferences are represented through criteria weighting in the MCDA. Here, the methodology is presented and applied to the J2000 Luanginga model results using a set of hypothetical decision maker preference values as input for an MCDA based on

  11. CLINICAL AUDIT OF IMAGE QUALITY IN RADIOLOGY USING VISUAL GRADING CHARACTERISTICS ANALYSIS.

    PubMed

    Tesselaar, Erik; Dahlström, Nils; Sandborg, Michael

    2016-06-01

    The aim of this work was to assess whether an audit of clinical image quality could be efficiently implemented within a limited time frame using visual grading characteristics (VGC) analysis. Lumbar spine radiography, bedside chest radiography and abdominal CT were selected. For each examination, images were acquired or reconstructed in two ways. Twenty images per examination were assessed by 40 radiology residents using visual grading of image criteria. The results were analysed using VGC. Inter-observer reliability was assessed. The results of the visual grading analysis were consistent with expected outcomes. The inter-observer reliability was moderate to good and correlated with perceived image quality (r² = 0.47). The median observation time per image or image series was within 2 min. These results suggest that the use of visual grading of image criteria to assess the quality of radiographs provides a rapid method for performing an image quality audit in a clinical environment. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
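
    VGC analysis is closely related to a rating-based ROC comparison; a minimal sketch is to estimate the area under the VGC curve from the Mann-Whitney statistic on ordinal quality ratings of the two acquisition settings. The ratings below are invented, not the study's data.

        import numpy as np
        from scipy.stats import mannwhitneyu

        ratings_a = np.array([3, 4, 4, 5, 3, 4, 2, 4, 5, 3])   # e.g. standard protocol
        ratings_b = np.array([2, 3, 3, 4, 2, 3, 2, 3, 4, 3])   # e.g. dose-reduced protocol

        # AUC of the VGC curve = Mann-Whitney U normalised by the number of rating pairs
        u, p = mannwhitneyu(ratings_a, ratings_b, alternative="two-sided")
        auc_vgc = u / (len(ratings_a) * len(ratings_b))
        print(f"AUC_VGC = {auc_vgc:.2f} (0.5 = no perceived difference), p = {p:.3f}")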

  12. Image analysis of insulation mineral fibres.

    PubMed

    Talbot, H; Lee, T; Jeulin, D; Hanton, D; Hobbs, L W

    2000-12-01

    We present two methods for measuring the diameter and length of man-made vitreous fibres based on the automated image analysis of scanning electron microscopy images. The fibres we want to measure are used in materials such as glass wool, which in turn are used for thermal and acoustic insulation. The measurement of the diameters and lengths of these fibres is used by the glass wool industry for quality control purposes. To obtain reliable quality estimators, the measurement of several hundred images is necessary. These measurements are usually obtained manually by operators. Manual measurements, although reliable when performed by skilled operators, are slow due to the need for the operators to rest often to retain their ability to spot faint fibres on noisy backgrounds. Moreover, the task of measuring thousands of fibres every day, even with the help of semi-automated image analysis systems, is dull and repetitive. The need for an automated procedure which could replace manual measurements is quite real. For each of the two methods that we propose to accomplish this task, we present the sample preparation, the microscope setting and the image analysis algorithms used for the segmentation of the fibres and for their measurement. We also show how a statistical analysis of the results can alleviate most measurement biases, and how we can estimate the true distribution of fibre lengths by diameter class by measuring only the lengths of the fibres visible in the field of view.

  13. From Image Analysis to Computer Vision: Motives, Methods, and Milestones.

    DTIC Science & Technology

    1998-07-01

    images. Initially, work on digital image analysis dealt with specific classes of images such as text, photomicrographs, nuclear particle tracks, and aerial...photographs; but by the 1960’s, general algorithms and paradigms for image analysis began to be formulated. When the artificial intelligence...scene, but eventually from image sequences obtained by a moving camera; at this stage, image analysis had become scene analysis or computer vision

  14. Automated eXpert Spectral Image Analysis

    SciTech Connect

    Keenan, Michael R.

    2003-11-25

    AXSIA performs automated factor analysis of hyperspectral images. In such images, a complete spectrum is collected at each point in a 1-, 2- or 3-dimensional spatial array. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful information. Multivariate factor analysis techniques have proven effective for extracting the essential information from high dimensional data sets into a limited number of factors that describe the spectral characteristics and spatial distributions of the pure components comprising the sample. AXSIA provides tools to estimate different types of factor models including Singular Value Decomposition (SVD), Principal Component Analysis (PCA), PCA with factor rotation, and Alternating Least Squares-based Multivariate Curve Resolution (MCR-ALS). As part of the analysis process, AXSIA can automatically estimate the number of pure components that comprise the data and can scale the data to account for Poisson noise. The data analysis methods are fundamentally based on eigenanalysis of the data crossproduct matrix coupled with orthogonal eigenvector rotation and constrained alternating least squares refinement. A novel method for automatically determining the number of significant components, which is based on the eigenvalues of the crossproduct matrix, has also been devised and implemented. The data can be compressed spectrally via PCA and spatially through wavelet transforms, and algorithms have been developed that perform factor analysis in the transform domain while retaining full spatial and spectral resolution in the final result. These latter innovations enable the analysis of larger-than-core-memory spectrum-images. AXSIA was designed to perform automated chemical phase analysis of spectrum-images acquired by a variety of chemical imaging techniques. Successful applications include Energy Dispersive X-ray Spectroscopy, X-ray Fluorescence
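
    The basic factor-analysis step that such tools build on can be sketched in a few lines: unfold the spectrum-image into a pixels-by-channels matrix, extract a few components, and reshape the scores back into maps. The synthetic data and fixed component count below are illustrative only; AXSIA itself adds Poisson scaling, automatic rank estimation and MCR-ALS refinement on top of this.

        import numpy as np
        from sklearn.decomposition import PCA

        ny, nx, nch = 64, 64, 128                      # synthetic 2-D spectrum-image
        rng = np.random.default_rng(0)
        cube = rng.poisson(5.0, size=(ny, nx, nch)).astype(float)

        D = cube.reshape(-1, nch)                      # unfold: one spectrum per row
        pca = PCA(n_components=3).fit(D)
        scores = pca.transform(D).reshape(ny, nx, 3)   # spatial score (abundance-like) maps
        loadings = pca.components_                     # spectral signatures (3 x nch)
        print(scores.shape, loadings.shape)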

  15. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

    NASA Astrophysics Data System (ADS)

    Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

    2016-04-01

    Road and railway networks are key factors in a country's economic growth. Inadequate infrastructural networks could be detrimental to a society if the transport between locations is hindered or delayed. Logistical hindrances can often be avoided whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can trigger other natural hazards such as landslides and debris flow. Disruptions caused by landslides are similar to those caused by floods and increase the maintenance cost considerably. The effect of natural disasters on society is likely to increase due to a changing climate with increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce the infrastructural harm. Spatial multi-criteria analysis (SMCA) is a part of decision analysis, which provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The objective of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through a SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and impact on the resulting susceptibility. A least cost path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements as well as historic incidences of flooding and landslides in order to discuss the usefulness of the model in road planning.

  16. Objective facial photograph analysis using imaging software.

    PubMed

    Pham, Annette M; Tollefson, Travis T

    2010-05-01

    Facial analysis is an integral part of the surgical planning process. Clinical photography has long been an invaluable tool in the surgeon's practice not only for accurate facial analysis but also for enhancing communication between the patient and surgeon, for evaluating postoperative results, for medicolegal documentation, and for educational and teaching opportunities. From 35-mm slide film to the digital technology of today, clinical photography has benefited greatly from technological advances. With the development of computer imaging software, objective facial analysis becomes easier to perform and less time consuming. Thus, while the original purpose of facial analysis remains the same, the process becomes much more efficient and allows for some objectivity. Although clinical judgment and artistry of technique is never compromised, the ability to perform objective facial photograph analysis using imaging software may become the standard in facial plastic surgery practices in the future.

  17. Motion Analysis From Television Images

    NASA Astrophysics Data System (ADS)

    Silberberg, George G.; Keller, Patrick N.

    1982-02-01

    The Department of Defense ranges have relied on photographic instrumentation for gathering data on firings for all types of ordnance. A large inventory of cameras that can be used for these tasks is available on the market. A new set of optical instrumentation is beginning to appear which, in many cases, can directly replace photographic cameras for a great deal of the work being performed now. These are television cameras modified so they can stop motion, see in the dark, perform under hostile environments, and provide real time information. This paper discusses techniques for modifying television cameras so they can be used for motion analysis.

  18. Dynamic Chest Image Analysis: Evaluation of Model-Based Pulmonary Perfusion Analysis With Pyramid Images

    DTIC Science & Technology

    2007-11-02

    Dynamic Chest Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique [18,5,17,6]. We have proposed and evaluated a multiresolutional method with an explicit ventilation model based on pyramid images for ventilation analysis. We have further extended the method for ventilation analysis to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for

  19. Automatic quantitative analysis of t-tubule organization in cardiac myocytes using ImageJ.

    PubMed

    Pasqualin, Côme; Gannier, François; Malécot, Claire O; Bredeloux, Pierre; Maupoil, Véronique

    2015-02-01

    The transverse tubule system in mammalian striated muscle is highly organized and contributes to optimal and homogeneous contraction. Diverse pathologies such as heart failure and atrial fibrillation include disorganization of t-tubules and contractile dysfunction. Few tools are available for the quantification of the organization of the t-tubule system. We developed a plugin for the ImageJ/Fiji image analysis platform developed by the National Institutes of Health. This plugin (TTorg) analyzes raw confocal microscopy images. Analysis options include the whole image, specific regions of the image (cropping), and z-axis analysis of the same image. Batch analysis of a series of images with identical criteria is also one of the options. There is no need to either reorientate any specimen to the horizontal or to do a thresholding of the image to perform analysis. TTorg includes a synthetic "myocyte-like" image generator to test the plugin's efficiency in the user's own experimental conditions. This plugin was validated on synthetic images for different simulated cell characteristics and acquisition parameters. TTorg was able to detect significant differences between the organization of the t-tubule systems in experimental data of mouse ventricular myocytes isolated from wild-type and dystrophin-deficient mice. TTorg is freely distributed, and its source code is available. It provides a reliable, easy-to-use, automatic, and unbiased measurement of t-tubule organization in a wide variety of experimental conditions. Copyright © 2015 the American Physiological Society.

  20. Analysis of extensively washed hair from cocaine users and drug chemists to establish new reporting criteria.

    PubMed

    Morris-Kukoski, Cynthia L; Montgomery, Madeline A; Hammer, Rena L

    2014-01-01

    Samples from a self-proclaimed cocaine (COC) user, from 19 drug users (postmortem) and from 27 drug chemists were extensively washed and analyzed for COC, benzoylecgonine, norcocaine (NC), cocaethylene (CE) and aryl hydroxycocaines by liquid chromatography-tandem mass spectrometry. Published wash criteria and cutoffs were applied to the results. Additionally, the data were used to formulate new reporting criteria and interpretation guidelines for forensic casework. Applying the wash and reporting criteria, hair that was externally contaminated with COC was distinguished from hair collected from individuals known to have consumed COC. In addition, CE, NC and hydroxycocaine metabolites were only present in COC users' hair and not in drug chemists' hair. When properly applied, the use of an extended wash, along with the reporting criteria defined here, will exclude false-positive results from environmental contact with COC.

  1. Recognizing systemic sclerosis: comparative analysis of various sets of classification criteria.

    PubMed

    Romanowska-Próchnicka, Katarzyna; Walczyk, Marcela; Olesińska, Marzena

    2016-01-01

    Systemic sclerosis is a complex disease characterized by autoimmunity, vasculopathy and tissue fibrosis. Although most patients present with some degree of skin sclerosis, which is a distinguishing hallmark, the clinical presentation varies greatly, complicating the diagnosis. In this regard, new classification criteria were jointly published in 2013 by the American College of Rheumatology (ACR) and the European League Against Rheumatism (EULAR). A recent major development in the classification criteria is improved sensitivity, particularly for detecting early disease. The new criteria allow more cases to be classified as having systemic sclerosis (SSc), which leads to earlier treatment. Moreover, earlier treatment is clinically beneficial in preventing disease progression with its irreversible fibrosis and organ damage. The aim of this review is to give insight into the new classification criteria and current trends in the diagnosis of systemic sclerosis.

  2. Recognizing systemic sclerosis: comparative analysis of various sets of classification criteria

    PubMed Central

    Romanowska-Próchnicka, Katarzyna; Olesińska, Marzena

    2016-01-01

    Systemic sclerosis is a complex disease characterized by autoimmunity, vasculopathy and tissue fibrosis. Although most patients present with some degree of skin sclerosis, which is a distinguishing hallmark, the clinical presentation varies greatly, complicating the diagnosis. In this regard, new classification criteria were jointly published in 2013 by the American College of Rheumatology (ACR) and the European League Against Rheumatism (EULAR). A recent major development in the classification criteria is improved sensitivity, particularly for detecting early disease. The new criteria allow more cases to be classified as having systemic sclerosis (SSc), which leads to earlier treatment. Moreover, earlier treatment is clinically beneficial in preventing disease progression with its irreversible fibrosis and organ damage. The aim of this review is to give insight into the new classification criteria and current trends in the diagnosis of systemic sclerosis. PMID:28115780

  3. Multi-Criteria Decision Making for a Spatial Decision Support System on the Analysis of Changing Risk

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; van Westen, Cees; Bakker, Wim H.; Aye, Zar Chi; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    Natural hazard risk management requires decision making in several stages. Decision making on alternatives for risk reduction planning starts with an intelligence phase for recognizing the decision problems and identifying the objectives. Developing the alternatives and assigning variables to each alternative by the decision makers constitute the design phase. The final phase evaluates the optimal choice by comparing the alternatives, defining indicators, assigning a weight to each, and ranking them. This process is referred to as Multi-Criteria Decision Making analysis (MCDM), Multi-Criteria Evaluation (MCE) or Multi-Criteria Analysis (MCA). In the framework of the ongoing 7th Framework Program "CHANGES" (2011-2014, Grant Agreement No. 263953) of the European Commission, a Spatial Decision Support System is under development that aims to analyse changes in hydro-meteorological risk and provide support for selecting the best risk reduction alternative. This paper describes the module for Multi-Criteria Decision Making analysis (MCDM) that incorporates monetary and non-monetary criteria in the analysis of the optimal alternative. The MCDM module consists of several components. The first step is to define criteria (or indicators), which are subdivided into disadvantages (criteria that indicate the difficulty of implementing the risk reduction strategy, also referred to as costs) and advantages (criteria that indicate the favorability, also referred to as benefits). In the next step, stakeholders can use the developed web-based tool for prioritizing criteria and building the decision matrix. Public participation plays a role in decision making, and this is also planned through a mobile web version where the general local public can indicate their agreement with the proposed alternatives. The application is being tested through a case study related to risk reduction in a mountainous valley in the Alps affected by flooding. Four alternatives are evaluated in

  4. Discussion paper on applicability of oil and grease analysis for RCRA closure criteria

    SciTech Connect

    1995-02-01

    A site characterization (SC) was performed for the Building 9409-5 Diked Tank Storage Facility. The initial SC indicated areas which had oil and grease levels above the criteria of the currently proposed RCRA closure plan. After further investigation, it was demonstrated that the oil and grease parameter may not be an accurate indication of a release from this facility and should not be included as a contaminant of concern in the closure criteria.

  5. Addressing preference heterogeneity in public health policy by combining Cluster Analysis and Multi-Criteria Decision Analysis: Proof of Method.

    PubMed

    Kaltoft, Mette Kjer; Turner, Robin; Cunich, Michelle; Salkeld, Glenn; Nielsen, Jesper Bo; Dowie, Jack

    2015-01-01

    The use of subgroups based on biological-clinical and socio-demographic variables to deal with population heterogeneity is well-established in public policy. The use of subgroups based on preferences is rare, except when religion-based, and controversial. If it were decided to treat subgroup preferences as valid determinants of public policy, a transparent analytical procedure would be needed. In this proof of method study we show how public preferences could be incorporated into policy decisions in a way that respects both the multi-criterial nature of those decisions, and the heterogeneity of the population in relation to the importance assigned to relevant criteria. It involves combining Cluster Analysis (CA), to generate the subgroup sets of preferences, with Multi-Criteria Decision Analysis (MCDA), to provide the policy framework into which the clustered preferences are entered. We employ three techniques of CA to demonstrate that not only do different techniques produce different clusters, but that choosing among techniques (as well as developing the MCDA structure) is an important task to be undertaken in implementing the approach outlined in any specific policy context. Data for the illustrative, not substantive, application are from a Randomized Controlled Trial of online decision aids for Australian men aged 40-69 years considering Prostate-specific Antigen testing for prostate cancer. We show that such analyses can provide policy-makers with insights into the criterion-specific needs of different subgroups. Implementing CA and MCDA in combination to assist in the development of policies on important health and community issues such as drug coverage, reimbursement, and screening programs, poses major challenges (conceptual, methodological, ethical-political, and practical), but most are exposed by the techniques, not created by them.
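
    A proof-of-method style sketch of the CA-plus-MCDA combination: cluster respondents by their criterion weights, then score policy options with each cluster's mean weights. All weights, options and performance scores below are invented and only one clustering technique is shown.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        # survey: each row = one respondent's weights over 4 criteria (summing to 1)
        w = rng.dirichlet(alpha=[2, 2, 2, 2], size=100)

        clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit(w)

        # performance of 3 hypothetical policy options on the same 4 criteria (0-1 scale)
        options = {"screen all":       [0.9, 0.2, 0.6, 0.5],
                   "screen high-risk": [0.7, 0.6, 0.7, 0.6],
                   "no screening":     [0.1, 0.9, 0.4, 0.8]}

        for k in range(3):
            wk = clusters.cluster_centers_[k]
            best = max(options, key=lambda o: float(np.dot(wk, options[o])))
            print(f"cluster {k}: preferred option = {best}")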

  6. SCORE: a novel multi-criteria decision analysis approach to assessing the sustainability of contaminated land remediation.

    PubMed

    Rosén, Lars; Back, Pär-Erik; Söderqvist, Tore; Norrman, Jenny; Brinkhoff, Petra; Norberg, Tommy; Volchko, Yevheniya; Norin, Malin; Bergknut, Magnus; Döberl, Gernot

    2015-04-01

    The multi-criteria decision analysis (MCDA) method provides for a comprehensive and transparent basis for performing sustainability assessments. Development of a relevant MCDA-method requires consideration of a number of key issues, e.g. (a) definition of assessment boundaries, (b) definition of performance scales, both temporal and spatial, (c) selection of relevant criteria (indicators) that facilitate a comprehensive sustainability assessment while avoiding double-counting of effects, and (d) handling of uncertainties. Adding to the complexity is the typically wide variety of inputs, including quantifications based on existing data, expert judgements, and opinions expressed in interviews. The SCORE (Sustainable Choice Of REmediation) MCDA-method was developed to provide a transparent assessment of the sustainability of possible remediation alternatives for contaminated sites relative to a reference alternative, considering key criteria in the economic, environmental, and social sustainability domains. The criteria were identified based on literature studies, interviews and focus-group meetings. SCORE combines a linear additive model to rank the alternatives with a non-compensatory approach to identify alternatives regarded as non-sustainable. The key strengths of the SCORE method are as follows: a framework that at its core is designed to be flexible and transparent; the possibility to integrate both quantitative and qualitative estimations on criteria; its ability, unlike other sustainability assessment tools used in industry and academia, to allow for the alteration of boundary conditions where necessary; the inclusion of a full uncertainty analysis of the results, using Monte Carlo simulation; and a structure that allows preferences and opinions of involved stakeholders to be openly integrated into the analysis. A major insight from practical application of SCORE is that its most important contribution may be that it initiates a process where criteria
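
    The Monte Carlo uncertainty analysis mentioned above can be sketched by treating the per-criterion scores of an alternative (relative to the reference) as random variables and propagating them to the total weighted score; the criteria, weights and score distributions below are invented and do not reproduce SCORE's actual model.

        import numpy as np

        rng = np.random.default_rng(3)
        weights = np.array([0.4, 0.3, 0.3])        # economic, environmental, social (illustrative)
        n_sim = 10_000

        # per-criterion score of "alternative A minus reference", with uncertainty
        scores = np.column_stack([
            rng.normal(0.5, 0.3, n_sim),    # economic
            rng.normal(0.2, 0.4, n_sim),    # environmental
            rng.normal(-0.1, 0.2, n_sim),   # social
        ])
        total = scores @ weights
        print(f"mean score vs reference: {total.mean():.2f}, "
              f"P(better than reference) = {(total > 0).mean():.2f}")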

  7. Multicriteria decision analysis methods with 1000Minds for developing systemic sclerosis classification criteria.

    PubMed

    Johnson, Sindhu R; Naden, Raymond P; Fransen, Jaap; van den Hoogen, Frank; Pope, Janet E; Baron, Murray; Tyndall, Alan; Matucci-Cerinic, Marco; Denton, Christopher P; Distler, Oliver; Gabrielli, Armando; van Laar, Jacob M; Mayes, Maureen; Steen, Virginia; Seibold, James R; Clements, Phillip; Medsger, Thomas A; Carreira, Patricia E; Riemekasten, Gabriela; Chung, Lorinda; Fessler, Barri J; Merkel, Peter A; Silver, Richard; Varga, John; Allanore, Yannick; Mueller-Ladner, Ulf; Vonk, Madelon C; Walker, Ulrich A; Cappelli, Susanna; Khanna, Dinesh

    2014-06-01

    Classification criteria for systemic sclerosis (SSc) are being developed. The objectives were to develop an instrument for collating case data and evaluate its sensibility; use forced-choice methods to reduce and weight criteria; and explore agreement among experts on the probability that cases were classified as SSc. A standardized instrument was tested for sensibility. The instrument was applied to 20 cases covering a range of probabilities that each had SSc. Experts rank ordered cases from highest to lowest probability; reduced and weighted the criteria using forced-choice methods; and reranked the cases. Consistency in rankings was evaluated using intraclass correlation coefficients (ICCs). Experts endorsed clarity (83%), comprehensibility (100%), face and content validity (100%). Criteria were weighted (points): finger skin thickening (14-22), fingertip lesions (9-21), friction rubs (21), finger flexion contractures (16), pulmonary fibrosis (14), SSc-related antibodies (15), Raynaud phenomenon (13), calcinosis (12), pulmonary hypertension (11), renal crisis (11), telangiectasia (10), abnormal nailfold capillaries (10), esophageal dilation (7), and puffy fingers (5). The ICC across experts was 0.73 [95% confidence interval (CI): 0.58, 0.86] and improved to 0.80 (95% CI: 0.68, 0.90). Using a sensible instrument and forced-choice methods, the number of criteria was reduced by 39% (from 23 to 14) and weighted. Our methods reflect the rigors of measurement science and serve as a template for developing classification criteria. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Medical image analysis with artificial neural networks.

    PubMed

    Jiang, J; Trundle, P; Ren, J

    2010-12-01

    Given that neural networks have been widely reported in the research community of medical imaging, we provide a focused literature survey on recent neural network developments in computer-aided diagnosis, medical image segmentation and edge detection towards visual content analysis, and medical image registration for its pre-processing and post-processing, with the aims of increasing awareness of how neural networks can be applied to these areas and to provide a foundation for further research and practical development. Representative techniques and algorithms are explained in detail to provide inspiring examples illustrating: (i) how a known neural network with fixed structure and training procedure could be applied to resolve a medical imaging problem; (ii) how medical images could be analysed, processed, and characterised by neural networks; and (iii) how neural networks could be expanded further to resolve problems relevant to medical imaging. In the concluding section, a highlight of comparisons among many neural network applications is included to provide a global view on computational intelligence with neural networks in medical imaging. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Tunable filter-based multispectral imaging for detection of blood stains on construction material substrates. Part 1. Developing blood stain discrimination criteria.

    PubMed

    Janchaysang, Suwatwong; Sumriddetchkajorn, Sarun; Buranasiri, Prathan

    2012-10-10

    In this article, we establish blood stain detection criteria that are less substrate dependent for use in a liquid crystal tunable filter-based multispectral-imaging system. Kubelka-Munk (KM) theory is applied to transform the acquired stains' reflectance spectra into the less substrate dependent spectra. Chosen spectral parameters are extracted from the KM absorbance spectra of several stain samples on several substrates. Blood discrimination criteria based upon those spectral parameters are then established from empirical data, tested, and refined. In our newly invented method, instead of introducing conventional contrast enhancement on the blood stain image, blood stain determination is executed mathematically via Boolean logic, resulting in more discriminative blood stain identification. This proposed approach allows for nondestructive, quick, discriminative, and easy-to-improve presumptive blood stain detection. Experimental results confirm that our blood stain discrimination criteria can be used to locate blood stains on several construction materials with high precision. True positive rates (sensitivity) from 0.60 to 0.95 are achieved depending on blood stain faintness and substrate types. Also, true negative rates (specificity) between 0.55 and 0.96 and identification time of 4-5 min are accomplished, respectively. The established blood stain discrimination criteria will be incorporated in a real blood stain detection system in part 2 of this article, where system design and considerations as well as speed issues are discussed.
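
    A minimal sketch of the two ingredients described: the Kubelka-Munk transform K/S = (1 - R)^2 / (2R) applied per pixel at selected wavelengths, followed by a Boolean combination of thresholds. The band choices and threshold values below are placeholders, not the paper's calibrated criteria.

        import numpy as np

        def kubelka_munk(reflectance):
            # K/S absorbance-like quantity from reflectance R in (0, 1]
            r = np.clip(reflectance, 1e-3, 1.0)
            return (1.0 - r) ** 2 / (2.0 * r)

        # cube[y, x, band]: reflectance images at, say, 415, 540 and 580 nm (illustrative bands)
        cube = np.random.default_rng(4).uniform(0.05, 0.95, size=(100, 100, 3))
        km = kubelka_munk(cube)

        # Boolean decision rule on band levels and ratios (placeholder thresholds)
        soret_peak = km[..., 0] > 2.0                        # strong absorption near 415 nm
        band_ratio = km[..., 0] / (km[..., 1] + 1e-6) > 1.5
        blood_mask = soret_peak & band_ratio
        print(f"flagged pixels: {blood_mask.sum()}")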

  10. Can physicians identify inappropriate nuclear stress tests? An examination of inter-rater reliability for the 2009 appropriate use criteria for radionuclide imaging.

    PubMed

    Ye, Siqin; Rabbani, LeRoy E; Kelly, Christopher R; Kelly, Maureen R; Lewis, Matthew; Paz, Yehuda; Peck, Clara L; Rao, Shaline; Bokhari, Sabahat; Weiner, Shepard D; Einstein, Andrew J

    2015-01-01

    We sought to determine inter-rater reliability of the 2009 Appropriate Use Criteria for radionuclide imaging and whether physicians at various levels of training can effectively identify nuclear stress tests with inappropriate indications. Four hundred patients were randomly selected from a consecutive cohort of patients undergoing nuclear stress testing at an academic medical center. Raters with different levels of training (including cardiology attending physicians, cardiology fellows, internal medicine hospitalists, and internal medicine interns) classified individual nuclear stress tests using the 2009 Appropriate Use Criteria. Consensus classification by 2 cardiologists was considered the operational gold standard, and sensitivity and specificity of individual raters for identifying inappropriate tests were calculated. Inter-rater reliability of the Appropriate Use Criteria was assessed using Cohen κ statistics for pairs of different raters. The mean age of patients was 61.5 years; 214 (54%) were female. The cardiologists rated 256 (64%) of 400 nuclear stress tests as appropriate, 68 (18%) as uncertain, 55 (14%) as inappropriate; 21 (5%) tests were unable to be classified. Inter-rater reliability for noncardiologist raters was modest (unweighted Cohen κ, 0.51, 95% confidence interval, 0.45-0.55). Sensitivity of individual raters for identifying inappropriate tests ranged from 47% to 82%, while specificity ranged from 85% to 97%. Inter-rater reliability for the 2009 Appropriate Use Criteria for radionuclide imaging is modest, and there is considerable variation in the ability of raters at different levels of training to identify inappropriate tests. © 2015 American Heart Association, Inc.
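
    The two reported quantities are straightforward to compute; the sketch below calculates Cohen's kappa for a pair of raters and the sensitivity/specificity of one rater against the consensus standard, using invented rating vectors rather than the study's data.

        import numpy as np
        from sklearn.metrics import cohen_kappa_score, confusion_matrix

        gold  = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0])   # consensus: 1 = inappropriate test
        rater = np.array([1, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0])   # one rater's classifications

        kappa = cohen_kappa_score(gold, rater)
        tn, fp, fn, tp = confusion_matrix(gold, rater).ravel()
        print(f"kappa = {kappa:.2f}, sensitivity = {tp/(tp+fn):.2f}, "
              f"specificity = {tn/(tn+fp):.2f}")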

  11. Deep Learning in Medical Image Analysis.

    PubMed

    Shen, Dinggang; Wu, Guorong; Suk, Heung-Il

    2017-03-09

    This review covers computer-assisted analysis of images in the field of medical imaging. Recent advances in machine learning, especially with regard to deep learning, are helping to identify, classify, and quantify patterns in medical images. At the core of these advances is the ability to exploit hierarchical feature representations learned solely from data, instead of features designed by hand according to domain-specific knowledge. Deep learning is rapidly becoming the state of the art, leading to enhanced performance in various medical applications. We introduce the fundamentals of deep learning methods and review their successes in image registration, detection of anatomical and cellular structures, tissue segmentation, computer-aided disease diagnosis and prognosis, and so on. We conclude by discussing research issues and suggesting future directions for further improvement. Expected final online publication date for the Annual Review of Biomedical Engineering Volume 19 is June 4, 2017. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.

  12. Fourier analysis: from cloaking to imaging

    NASA Astrophysics Data System (ADS)

    Wu, Kedi; Cheng, Qiluan; Wang, Guo Ping

    2016-04-01

    Regarding invisibility cloaks as an optical imaging system, we present a Fourier approach that analytically unifies both Pendry cloaks and complementary-media-based invisibility cloaks into one kind of cloak. By synthesizing different transfer functions, we can construct different devices to realize a series of interesting functions such as hiding objects (events), creating illusions, and performing perfect imaging. In this article, we give a brief review of recent work applying the Fourier approach to the analysis of invisibility cloaks and of optical imaging through scattering layers. We show that, to construct devices that conceal an object, no constructive materials with extreme properties are required, making most, if not all, of the above functions realizable by using naturally occurring materials. As examples, we experimentally verify a method of directionally hiding distant objects and creating illusions by using all-dielectric materials, and further demonstrate a non-invasive method of imaging objects completely hidden by scattering layers.

  13. Curvelet Based Offline Analysis of SEM Images

    PubMed Central

    Shirazi, Syed Hamad; Haq, Nuhman ul; Hayat, Khizar; Naz, Saeeda; Haque, Ihsan ul

    2014-01-01

    Manual offline analysis of a scanning electron microscopy (SEM) image is a time-consuming process that requires continuous human intervention and effort. This paper presents an image-processing-based method for automated offline analysis of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step aimed at noise removal in order to avoid false edges. For texture analysis, the proposed method employs a state-of-the-art Curvelet transform followed by segmentation through a combination of entropy filtering, thresholding, and mathematical morphology (MM). Quantification is carried out by applying a box-counting algorithm for fractal dimension (FD) calculation, with the ultimate goal of measuring parameters such as surface area and perimeter. The perimeter is estimated indirectly by counting the boundary boxes of the filled shapes. The proposed method, when applied to a representative set of SEM images, not only showed better results in image segmentation but also exhibited good accuracy in the calculation of surface area and perimeter. The proposed method outperforms the well-known Watershed segmentation algorithm. PMID:25089617
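
    The box-counting step used for quantification counts, at each box size s, how many boxes contain foreground pixels and fits the slope of log N(s) against log(1/s). A minimal sketch (the test image and box sizes are illustrative, not from the paper):

```python
import numpy as np

def box_counting_dimension(binary_img):
    """Estimate the fractal dimension of a binary image by box counting."""
    img = np.asarray(binary_img, dtype=bool)
    sizes, counts = [], []
    s = min(img.shape) // 2
    while s >= 1:
        n = 0                                   # boxes of side s containing foreground
        for i in range(0, img.shape[0], s):
            for j in range(0, img.shape[1], s):
                if img[i:i + s, j:j + s].any():
                    n += 1
        sizes.append(s)
        counts.append(n)
        s //= 2
    # slope of log N(s) versus log(1/s) is the dimension estimate
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a fully filled image should give a dimension of 2
print(round(box_counting_dimension(np.ones((64, 64), dtype=bool)), 2))   # 2.0
```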

  14. Update on appropriate use criteria for amyloid PET imaging: dementia experts, mild cognitive impairment, and education. Amyloid Imaging Task Force of the Alzheimer’s Association and Society for Nuclear Medicine and Molecular Imaging.

    PubMed

    Johnson, Keith A; Minoshima, Satoshi; Bohnen, Nicolaas I; Donohoe, Kevin J; Foster, Norman L; Herscovitch, Peter; Karlawish, Jason H; Rowe, Christopher C; Hedrick, Saima; Pappas, Virginia; Carrillo, Maria C; Hartley, Dean M

    2013-07-01

    Amyloid PET imaging is a novel diagnostic test that can detect in living humans one of the two defining pathologic lesions of Alzheimer disease, amyloid-β deposition in the brain. The Amyloid Imaging Task Force of the Alzheimer's Association and Society for Nuclear Medicine and Molecular Imaging previously published appropriate use criteria for amyloid PET as an important tool for increasing the certainty of a diagnosis of Alzheimer disease in specific patient populations. Here, the task force further clarifies and expands 3 topics discussed in the original paper: first, defining dementia experts and their use of proper documentation to demonstrate the medical necessity of an amyloid PET scan; second, identifying a specific subset of individuals with mild cognitive impairment for whom an amyloid PET scan is appropriate; and finally, developing educational programs to increase awareness of the amyloid PET appropriate use criteria and providing instructions on how this test should be used in the clinical decision-making process. Copyright © 2013 The Alzheimer's Association. All rights reserved.

  15. Multi-criteria decision analysis for bioenergy in the Centre Region of Portugal

    NASA Astrophysics Data System (ADS)

    Esteves, T. C. J.; Cabral, P.; Ferreira, A. J. D.; Teixeira, J. C.

    2012-04-01

    With the consumption of fossil fuels, the resources essential to Man's survival are being rapidly contaminated. A sustainable future may be achieved by the use of renewable energies, allowing countries without non-renewable energy resources to guarantee energy sovereignty. Using bioenergy may mean a steep reduction and/or elimination of external energy dependency, enhancing the countries' capital and potentially reducing the negative effects that result from the use of fossil fuels, such as loss of biodiversity and air, water, and soil pollution. This work's main focus is to increase bioenergy use in the Centre Region of Portugal by allying R&D with tools that facilitate the determination of bioenergy availability and distribution throughout the study area. This analysis is essential, given that this knowledge is still very limited in the study area. Geographic Information Systems (GIS) were the main tool used in this study, owing to their ability to integrate various types of information (alphanumerical, statistical, geographical, and so on) and various sources of biomass (forest, agricultural, husbandry, municipal and industrial residues, shrublands, used vegetable oil, and energy crops) to determine the bioenergy potential of the study area as well as its spatial distribution. By allying GIS with multi-criteria decision analysis, the initial table-like information, which is difficult to comprehend, is transformed into tangible and easy-to-read results: both intermediate and final results of the created models will facilitate the decision-making process. General results show that the major contributors to the bioenergy potential in the Centre Region of Portugal are forest residues, which are mostly located in the inner region of the study area. However, a more detailed analysis should be made to assess the viability of using energy crops. As a main conclusion, we can say that, although this region may not use only this type of energy to be completely

  16. Measuring toothbrush interproximal penetration using image analysis

    NASA Astrophysics Data System (ADS)

    Hayworth, Mark S.; Lyons, Elizabeth K.

    1994-09-01

    An image analysis method of measuring the effectiveness of a toothbrush in reaching the interproximal spaces of teeth is described. Artificial teeth are coated with a stain that approximates real plaque and then brushed with a toothbrush on a brushing machine. The teeth are then removed and turned sideways so that the interproximal surfaces can be imaged. The areas of stain that have been removed within masked regions that define the interproximal regions are measured and reported. These areas correspond to the interproximal areas of the tooth reached by the toothbrush bristles. The image analysis method produces more precise results (10-fold decrease in standard deviation) in a fraction (22%) of the time as compared to our prior visual grading method.

  17. Piecewise flat embeddings for hyperspectral image analysis

    NASA Astrophysics Data System (ADS)

    Hayes, Tyler L.; Meinhold, Renee T.; Hamilton, John F.; Cahill, Nathan D.

    2017-05-01

    Graph-based dimensionality reduction techniques such as Laplacian Eigenmaps (LE), Local Linear Embedding (LLE), Isometric Feature Mapping (ISOMAP), and Kernel Principal Components Analysis (KPCA) have been used in a variety of hyperspectral image analysis applications for generating smooth data embeddings. Recently, Piecewise Flat Embeddings (PFE) were introduced in the computer vision community as a technique for generating piecewise constant embeddings that make data clustering / image segmentation a straightforward process. In this paper, we show how PFE arises by modifying LE, yielding a constrained ℓ1-minimization problem that can be solved iteratively. Using publicly available data, we carry out experiments to illustrate the implications of applying PFE to pixel-based hyperspectral image clustering and classification.
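
    Since PFE is described as a modification of Laplacian Eigenmaps, the sketch below shows only the LE baseline: a k-nearest-neighbour graph with Gaussian weights followed by an eigendecomposition of the normalized graph Laplacian. The ℓ1 objective and iterative solver of PFE are not reproduced; the data and parameters are hypothetical.

```python
import numpy as np

def laplacian_eigenmaps(X, n_components=2, n_neighbors=10, sigma=1.0):
    """Minimal Laplacian Eigenmaps embedding of the rows of X."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)       # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]             # k nearest neighbours (skip self)
        W[i, idx] = np.exp(-d2[i, idx] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)                                     # symmetrize the graph
    deg = W.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    L_sym = np.eye(n) - d_inv_sqrt @ W @ d_inv_sqrt            # normalized Laplacian
    _, vecs = np.linalg.eigh(L_sym)
    return vecs[:, 1:n_components + 1]                         # drop the trivial eigenvector

# Hypothetical "pixel spectra": two well-separated clusters in 5 bands
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (100, 5)), rng.normal(1.0, 0.1, (100, 5))])
print(laplacian_eigenmaps(X).shape)                            # (200, 2)
```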

  18. Unsupervised hyperspectral image analysis using independent component analysis (ICA)

    SciTech Connect

    S. S. Chiang; I. W. Ginsberg

    2000-06-30

    In this paper, an ICA-based approach is proposed for hyperspectral image analysis. It can be viewed as a random version of the commonly used linear spectral mixture analysis, in which the abundance fractions in a linear mixture model are considered to be unknown independent signal sources. It does not require the full rank of the separating matrix or orthogonality as most ICA methods do. More importantly, the learning algorithm is designed based on the independency of the material abundance vector rather than the independency of the separating matrix generally used to constrain the standard ICA. As a result, the designed learning algorithm is able to converge to non-orthogonal independent components. This is particularly useful in hyperspectral image analysis since many materials extracted from a hyperspectral image may have similar spectral signatures and may not be orthogonal. The AVIRIS experiments have demonstrated that the proposed ICA provides an effective unsupervised technique for hyperspectral image classification.
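
    The algorithm described above enforces independence of the abundance vector and allows non-orthogonal components, which off-the-shelf ICA implementations do not; the sketch below therefore only illustrates the generic idea of ICA-based spectral unmixing, using scikit-learn's FastICA on synthetic pixels as a baseline rather than the authors' method.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic hyperspectral data: 3 endmember spectra mixed with random abundances
rng = np.random.default_rng(1)
bands, n_pixels, n_sources = 50, 1000, 3
endmembers = np.abs(rng.normal(size=(n_sources, bands)))         # hypothetical signatures
abundances = rng.dirichlet(np.ones(n_sources), size=n_pixels)    # sum-to-one fractions
pixels = abundances @ endmembers + 0.01 * rng.normal(size=(n_pixels, bands))

# Generic ICA unmixing baseline: treat each pixel as a mixture of independent sources
ica = FastICA(n_components=n_sources, random_state=0)
est_abundances = ica.fit_transform(pixels)    # estimated sources per pixel, up to scale/sign
est_spectra = ica.mixing_                     # columns ~ estimated endmember spectra
print(est_abundances.shape, est_spectra.shape)   # (1000, 3) (50, 3)
```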

  19. Carbon storage, timber production, and biodiversity: comparing ecosystem services with multi-criteria decision analysis

    USGS Publications Warehouse

    Schwenk, W. Scott; Donovan, Therese; Keeton, William S.; Nunery, Jared S.

    2012-01-01

    Increasingly, land managers seek ways to manage forests for multiple ecosystem services and functions, yet considerable challenges exist in comparing disparate services and balancing trade-offs among them. We applied multi-criteria decision analysis (MCDA) and forest simulation models to simultaneously consider three objectives: (1) storing carbon, (2) producing timber and wood products, and (3) sustaining biodiversity. We used the Forest Vegetation Simulator (FVS) applied to 42 northern hardwood sites to simulate forest development over 100 years and to estimate carbon storage and timber production. We estimated biodiversity implications with occupancy models for 51 terrestrial bird species that were linked to FVS outputs. We simulated four alternative management prescriptions that spanned a range of harvesting intensities and forest structure retention. We found that silvicultural approaches emphasizing less frequent harvesting and greater structural retention could be expected to achieve the greatest net carbon storage but also produce less timber. More intensive prescriptions would enhance biodiversity because positive responses of early successional species exceeded negative responses of late successional species within the heavily forested study area. The combinations of weights assigned to objectives had a large influence on which prescriptions were scored as optimal. Overall, we found that a diversity of silvicultural approaches is likely to be preferable to any single approach, emphasizing the need for landscape-scale management to provide a full range of ecosystem goods and services. Our analytical framework that combined MCDA with forest simulation modeling was a powerful tool in understanding trade-offs among management objectives and how they can be simultaneously accommodated.

  20. Environmental condition assessment of US military installations using GIS based spatial multi-criteria decision analysis.

    PubMed

    Singer, Steve; Wang, Guangxing; Howard, Heidi; Anderson, Alan

    2012-08-01

    The environment functions in various respects, including soil and water conservation, biodiversity and habitats, and landscape aesthetics. Comprehensive assessment of environmental condition is thus a great challenge. The issues include how to assess individual environmental components, such as landscape aesthetics, and how to integrate them into an indicator that can comprehensively quantify environmental condition. In this study, a geographic information systems (GIS)-based spatial multi-criteria decision analysis was used to integrate environmental variables and create such an indicator. The approach was applied to the Fort Riley military installation, where land condition and its dynamics due to military training activities were assessed. The indicator was derived by integrating soil erosion, water quality, landscape fragmentation, landscape aesthetics, and noise, using weights obtained from experts who assessed and ranked the environmental variables in terms of their importance. The results showed that the landscape-level indicator quantified the overall environmental condition and its dynamics well, while the indicator at the patch level (a patch being a homogeneous area that differs from its surroundings) detailed the spatiotemporal variability of environmental condition. The environmental condition was determined mostly by soil erosion, followed by landscape fragmentation, water quality, landscape aesthetics, and noise. Overall, environmental condition at both landscape and patch levels varied greatly depending on the degree of ground and canopy disturbance and its spatial pattern due to military training activities, and was related to slope. It was also found that the environment could recover quickly once military training was halted or reduced. Thus, this study provides an effective tool for army land managers to monitor environmental dynamics and plan military training activities. Its limitation lies in the fact that the obtained values of the indicator vary and are

  1. Four Common Simplifications of Multi-Criteria Decision Analysis do not hold for River Rehabilitation

    PubMed Central

    2016-01-01

    River rehabilitation aims at alleviating negative effects of human impacts such as loss of biodiversity and reduction of ecosystem services. Such interventions entail difficult trade-offs between different ecological and often socio-economic objectives. Multi-Criteria Decision Analysis (MCDA) is a very suitable approach that helps in assessing the current ecological state and prioritizing river rehabilitation measures in a standardized way, based on stakeholder or expert preferences. Applications of MCDA in river rehabilitation projects are often simplified, i.e. they use a limited number of objectives and indicators, assume linear value functions, aggregate individual indicator assessments additively, and/or assume risk neutrality of experts. Here, we demonstrate an implementation of MCDA expert preference assessments for river rehabilitation and provide ample material for other applications. To test whether the above simplifications reflect common expert opinion, we carried out very detailed interviews with five river ecologists and a hydraulic engineer. We defined essential objectives and measurable quality indicators (attributes), elicited the experts' preferences for objectives on a standardized scale (value functions) and their risk attitude, and identified suitable aggregation methods. The experts recommended an extensive objectives hierarchy including between 54 and 93 essential objectives and between 37 and 61 essential attributes. For 81% of these, they defined non-linear value functions, and in 76% they recommended multiplicative aggregation. The experts were risk averse or risk prone (but never risk neutral), depending on the current ecological state of the river and on the personal importance they attached to the objectives. We conclude that the four commonly applied simplifications clearly do not reflect the opinion of river rehabilitation experts. The optimal level of model complexity, however, remains highly case-study specific, depending on data and resource
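
    To make the additive-versus-multiplicative distinction concrete, the sketch below scores one hypothetical rehabilitation alternative on three objectives with both aggregation rules; the multiplicative form shown is a weighted geometric mean, one common choice, and may differ from the functions the experts actually recommended.

```python
import numpy as np

def additive(values, weights):
    """Weighted arithmetic mean of single-objective values in [0, 1]."""
    v, w = np.asarray(values, float), np.asarray(weights, float)
    return np.sum(w * v) / np.sum(w)

def multiplicative(values, weights):
    """Weighted geometric mean: a very poor score on any one objective
    drags the overall value down, unlike additive aggregation."""
    v, w = np.asarray(values, float), np.asarray(weights, float)
    return np.prod(v ** (w / np.sum(w)))

# Hypothetical alternative scored on three objectives (e.g. habitat, hydraulics, costs)
values, weights = [0.9, 0.8, 0.05], [0.4, 0.4, 0.2]
print(round(additive(values, weights), 2), round(multiplicative(values, weights), 2))   # 0.69 0.48
```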

  2. What Should Be the Cut Point for Classification Criteria of Studies in Gout? A Conjoint Analysis.

    PubMed

    Fransen, Jaap; Kievit, Wietske; Neogi, Tuhina; Schumacher, Ralph; Jansen, Tim; Dalbeth, Nicola; Taylor, William J

    2016-11-01

    To determine the acceptable level of positive predictive value (PPV) and negative predictive value (NPV) for classification criteria for gout, given the type of study. We conducted an international web-based survey with 91 general practitioners and rheumatologists experienced in gout. Conjoint analysis was used as the framework for designing and analyzing pairs of 2 profiles, each describing a study type, a PPV, and an NPV. There were 5 study types presented: a phase III randomized controlled trial (RCT) of a nonsteroidal antiinflammatory drug versus prednisone for acute gout flares, a phase III RCT of a biologic agent for acute gout flares, a phase II RCT of a novel uricosuric drug of unknown efficacy and limited toxicity data, a case-control, genome-wide association study of gout, and a cohort study examining long-term outcomes of gout. PPV and NPV both had 5 levels ranging from 60-99%. The panelists in majority were male (65%) rheumatologists (93%) with an average of 19 years of practice, seeing 5 to 60 gout patients monthly. PPV was most highly weighted in decision making: the relative importance was 59% for PPV, 29% for NPV, and 13% for study type. The preferred PPV was 90% or 80%, with an accompanying NPV of 70% or 80%, dependent on study type. Preferred PPVs and NPVs range between 70% and 90% and differ by study type. A single cut point can be a reasonable approach for all study types if a PPV of 90% and NPV of 80% is approximated. © 2016, American College of Rheumatology.

  3. Digital image analysis of haematopoietic clusters.

    PubMed

    Benzinou, A; Hojeij, Y; Roudot, A-C

    2005-02-01

    Counting and differentiating cell clusters is a tedious task when performed with a light microscope. Moreover, biased counts and interpretations are difficult to avoid because of the difficulty of evaluating the boundaries between different types of clusters. Presented here is a computer-based application able to solve these problems. The image analysis system is entirely automatic, from stage screening to the statistical analysis of the results of each experimental plate. Good correlations are found with measurements made by a specialised technician.

  4. ImageJ: Image processing and analysis in Java

    NASA Astrophysics Data System (ADS)

    Rasband, W. S.

    2012-06-01

    ImageJ is a public domain Java image processing program inspired by NIH Image. It can display, edit, analyze, process, save and print 8-bit, 16-bit and 32-bit images. It can read many image formats including TIFF, GIF, JPEG, BMP, DICOM, FITS and "raw". It supports "stacks", a series of images that share a single window. It is multithreaded, so time-consuming operations such as image file reading can be performed in parallel with other operations.

  5. Visualization of parameter space for image analysis.

    PubMed

    Pretorius, A Johannes; Bray, Mark-Anthony P; Carpenter, Anne E; Ruddle, Roy A

    2011-12-01

    Image analysis algorithms are often highly parameterized and much human input is needed to optimize parameter settings. This incurs a time cost of up to several days. We analyze and characterize the conventional parameter optimization process for image analysis and formulate user requirements. With this as input, we propose a change in paradigm by optimizing parameters based on parameter sampling and interactive visual exploration. To save time and reduce memory load, users are only involved in the first step--initialization of sampling--and the last step--visual analysis of output. This helps users to more thoroughly explore the parameter space and produce higher quality results. We describe a custom sampling plug-in we developed for CellProfiler--a popular biomedical image analysis framework. Our main focus is the development of an interactive visualization technique that enables users to analyze the relationships between sampled input parameters and corresponding output. We implemented this in a prototype called Paramorama. It provides users with a visual overview of parameters and their sampled values. User-defined areas of interest are presented in a structured way that includes image-based output and a novel layout algorithm. To find optimal parameter settings, users can tag high- and low-quality results to refine their search. We include two case studies to illustrate the utility of this approach.

  6. Visualization of Parameter Space for Image Analysis

    PubMed Central

    Pretorius, A. Johannes; Bray, Mark-Anthony P.; Carpenter, Anne E.; Ruddle, Roy A.

    2013-01-01

    Image analysis algorithms are often highly parameterized and much human input is needed to optimize parameter settings. This incurs a time cost of up to several days. We analyze and characterize the conventional parameter optimization process for image analysis and formulate user requirements. With this as input, we propose a change in paradigm by optimizing parameters based on parameter sampling and interactive visual exploration. To save time and reduce memory load, users are only involved in the first step - initialization of sampling - and the last step - visual analysis of output. This helps users to more thoroughly explore the parameter space and produce higher quality results. We describe a custom sampling plug-in we developed for CellProfiler - a popular biomedical image analysis framework. Our main focus is the development of an interactive visualization technique that enables users to analyze the relationships between sampled input parameters and corresponding output. We implemented this in a prototype called Paramorama. It provides users with a visual overview of parameters and their sampled values. User-defined areas of interest are presented in a structured way that includes image-based output and a novel layout algorithm. To find optimal parameter settings, users can tag high- and low-quality results to refine their search. We include two case studies to illustrate the utility of this approach. PMID:22034361

  7. COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    EPA Science Inventory



    COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    T Martonen1 and J Schroeter2

    1Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, U.S. EPA, Research Triangle Park, NC 27711 USA and 2Curriculum in Toxicology, Unive...

  8. Using Image Analysis to Build Reading Comprehension

    ERIC Educational Resources Information Center

    Brown, Sarah Drake; Swope, John

    2010-01-01

    Content area reading remains a primary concern of history educators. In order to better prepare students for encounters with text, the authors propose the use of two image analysis strategies tied with a historical theme to heighten student interest in historical content and provide a basis for improved reading comprehension.

  9. Scale Free Reduced Rank Image Analysis.

    ERIC Educational Resources Information Center

    Horst, Paul

    In the traditional Guttman-Harris type image analysis, a transformation is applied to the data matrix such that each column of the transformed data matrix is the best least squares estimate of the corresponding column of the data matrix from the remaining columns. The model is scale free. However, it assumes (1) that the correlation matrix is…

  10. Multi-criteria multi-stakeholder decision analysis using a fuzzy-stochastic approach for hydrosystem management

    NASA Astrophysics Data System (ADS)

    Subagadis, Y. H.; Schütze, N.; Grundmann, J.

    2014-09-01

    The conventional methods used to solve multi-criteria multi-stakeholder problems are less strongly formulated, as they normally incorporate only homogeneous information at a time and suggest aggregating objectives of different decision-makers avoiding water-society interactions. In this contribution, Multi-Criteria Group Decision Analysis (MCGDA) using a fuzzy-stochastic approach has been proposed to rank a set of alternatives in water management decisions incorporating heterogeneous information under uncertainty. The decision making framework takes hydrologically, environmentally, and socio-economically motivated conflicting objectives into consideration. The criteria related to the performance of the physical system are optimized using multi-criteria simulation-based optimization, and fuzzy linguistic quantifiers have been used to evaluate subjective criteria and to assess stakeholders' degree of optimism. The proposed methodology is applied to find effective and robust intervention strategies for the management of a coastal hydrosystem affected by saltwater intrusion due to excessive groundwater extraction for irrigated agriculture and municipal use. Preliminary results show that the MCGDA based on a fuzzy-stochastic approach gives useful support for robust decision-making and is sensitive to the decision makers' degree of optimism.

  11. Automatic identification of heart failure diagnostic criteria, using text analysis of clinical notes from electronic health records

    PubMed Central

    Byrd, Roy J.; Steinhubl, Steven R.; Sun, Jimeng; Ebadollahi, Shahram; Stewart, Walter F.

    2017-01-01

    Objective Early detection of Heart Failure (HF) could mitigate the enormous individual and societal burden from this disease. Clinical detection is based, in part, on recognition of the multiple signs and symptoms comprising the Framingham HF diagnostic criteria that are typically documented, but not necessarily synthesized, by primary care physicians well before more specific diagnostic studies are done. We developed a natural language processing (NLP) procedure to identify Framingham HF signs and symptoms among primary care patients, using electronic health record (EHR) clinical notes, as a prelude to pattern analysis and clinical decision support for early detection of HF. Design We developed a hybrid NLP pipeline that performs two levels of analysis: (1) At the criteria mention level, a rule-based NLP system is constructed to annotate all affirmative and negative mentions of Framingham criteria. (2) At the encounter level, we construct a system to label encounters according to whether any Framingham criterion is asserted, denied, or unknown. Measurements Precision, recall, and F-score are used as performance metrics for criteria mention extraction and for encounter labeling. Results Our criteria mention extractions achieve a precision of 0.925, a recall of 0.896, and an F-score of 0.910. Encounter labeling achieves an F-score of 0.932. Conclusion Our system accurately identifies and labels affirmations and denials of Framingham diagnostic criteria in primary care clinical notes and may help in the attempt to improve the early detection of HF. With adaptation and tooling, our development methodology can be repeated in new problem settings. PMID:23317809
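
    The reported performance metrics follow the standard definitions; a short sketch with hypothetical counts (not the study's confusion matrix):

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 score from counts of true positives,
    false positives, and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical counts for extracted criteria mentions
p, r, f = precision_recall_f1(tp=90, fp=10, fn=15)
print(round(p, 3), round(r, 3), round(f, 3))   # 0.9 0.857 0.878
```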

  12. Balancing costs and benefits at different stages of medical innovation: a systematic review of Multi-criteria decision analysis (MCDA).

    PubMed

    Wahlster, Philip; Goetghebeur, Mireille; Kriza, Christine; Niederländer, Charlotte; Kolominsky-Rabas, Peter

    2015-07-09

    The diffusion of health technologies from translational research to reimbursement depends on several factors included the results of health economic analysis. Recent research identified several flaws in health economic concepts. Additionally, the heterogeneous viewpoints of participating stakeholders are rarely systematically addressed in current decision-making. Multi-criteria Decision Analysis (MCDA) provides an opportunity to tackle these issues. The objective of this study was to review applications of MCDA methods in decisions addressing the trade-off between costs and benefits. Using basic steps of the PRISMA guidelines, a systematic review of the healthcare literature was performed to identify original research articles from January 1990 to April 2014. Medline, PubMed, Springer Link and specific journals were searched. Using predefined categories, bibliographic records were systematically extracted regarding the type of policy applications, MCDA methodology, criteria used and their definitions. 22 studies were included in the analysis. 15 studies (68 %) used direct MCDA approaches and seven studies (32 %) used preference elicitation approaches. Four studies (19 %) focused on technologies in the early innovation process. The majority (18 studies - 81 %) examined reimbursement decisions. Decision criteria used in studies were obtained from the literature research and context-specific studies, expert opinions, and group discussions. The number of criteria ranged between three up to 15. The most frequently used criteria were health outcomes (73 %), disease impact (59 %), and implementation of the intervention (40 %). Economic criteria included cost-effectiveness criteria (14 studies, 64 %), and total costs/budget impact of an intervention (eight studies, 36 %). The process of including economic aspects is very different among studies. Some studies directly compare costs with other criteria while some include economic consideration in a second step. In early

  13. Fake fingerprint detection based on image analysis

    NASA Astrophysics Data System (ADS)

    Jin, Sang-il; Bae, You-suk; Maeng, Hyun-ju; Lee, Hyun-suk

    2010-01-01

    Fingerprint recognition systems have become prevalent in various security applications. However, recent studies have shown that it is not difficult to deceive the system with fake fingerprints made of silicon or gelatin. The fake fingerprints have almost the same ridge-valley patterns as ones of genuine fingerprints so that conventional systems are unable to detect fake fingerprints without a particular detection method. Many previous works against fake fingers required extra sensors; thus, they lacked practicality. This paper proposes a practical and effective method that detects fake fingerprints, using only an image sensor. Two criteria are introduced to differentiate genuine and fake fingerprints: the histogram distance and Fourier spectrum distance. In the proposed method, after identifying an input fingerprint of a user, the system computes two distances between the input and the reference that comes from the registered fingerprints of the user. Depending on the two distances, the system classifies the input as a genuine fingerprint or a fake. In the experiment, 2,400 fingerprint images including 1,600 fakes were tested, and the proposed method has shown a high recognition rate of 95%. The fake fingerprints were all accepted by a commercial system; thus, the use of these fake fingerprints qualifies the experiment.
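
    The two criteria can be sketched as distances between an input image and an enrolled reference: one between gray-level histograms and one between log-magnitude Fourier spectra, each compared against an empirically tuned threshold. The distance definitions, thresholds, and images below are assumptions for illustration, not the paper's exact formulas.

```python
import numpy as np

def histogram_distance(img_a, img_b, bins=64):
    """Euclidean distance between normalized gray-level histograms."""
    h_a, _ = np.histogram(img_a, bins=bins, range=(0, 1), density=True)
    h_b, _ = np.histogram(img_b, bins=bins, range=(0, 1), density=True)
    return np.linalg.norm(h_a - h_b)

def fourier_spectrum_distance(img_a, img_b):
    """Euclidean distance between log-magnitude 2D Fourier spectra."""
    s_a = np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(img_a))))
    s_b = np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(img_b))))
    return np.linalg.norm(s_a - s_b)

def looks_genuine(probe, reference, t_hist, t_fourier):
    """Accept only if both distances to the enrolled reference stay below
    their (hypothetical, empirically tuned) thresholds."""
    return (histogram_distance(probe, reference) < t_hist and
            fourier_spectrum_distance(probe, reference) < t_fourier)

# Toy 64x64 "fingerprint" images with values in [0, 1]
rng = np.random.default_rng(2)
reference = rng.random((64, 64))
probe = np.clip(reference + 0.02 * rng.normal(size=(64, 64)), 0, 1)
print(looks_genuine(probe, reference, t_hist=2.0, t_fourier=100.0))   # True
```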

  14. The Bradford Hill criteria and zinc-induced anosmia: a causality analysis.

    PubMed

    Davidson, Terence M; Smith, Wendy M

    2010-07-01

    To apply the Bradford Hill criteria, which are widely used to establish causality between an environmental agent and disease, to evaluate the relationship between over-the-counter intranasal zinc gluconate therapy and anosmia. Patient and literature review applying the Bradford Hill criteria on causation. University of California, San Diego, Nasal Dysfunction Clinic. The study included 25 patients who presented to the University of California, San Diego, Nasal Dysfunction Clinic complaining of acute-onset anosmia after intranasal application of homeopathic zinc gluconate gel. Each of the 9 Bradford Hill criteria--strength of association, consistency, specificity, temporality, biological gradient (dose-response), biological plausibility, biological coherence, experimental evidence, and analogy--was applied to intranasal zinc gluconate therapy and olfactory dysfunction using published, peer-reviewed medical literature and reported clinical experiences. Clinical, biological, and experimental data support the Bradford Hill criteria to demonstrate that intranasal zinc gluconate therapy causes hyposmia and anosmia. The Bradford Hill criteria represent an important tool for scientifically determining cause between environmental exposure and disease. Increased Food and Drug Administration oversight of homeopathic medications is needed to monitor the safety of these popular remedies.

  15. [Comparative analysis of Light's criteria and other biochemical parameters to distinguish exudates from transudates].

    PubMed

    Jiménez Castro, D; Díaz Nuevo, G; Pérez-Rodríguez, E

    2002-01-01

    Light's criteria have classically been used to differentiate exudates from transudates. Nevertheless, a number of studies have attempted to identify more efficient parameters. The objective of our study was to determine the usefulness of biochemical parameters to differentiate transudates from exudates, and to compare them with the best studied criteria so far: Light's criteria. We prospectively analysed 850 non-selected cases of pleural effusion, with a closed final diagnosis after confirmation, therapeutic response, and follow-up, collected consecutively at the Pleura Unit of our hospital. The parameters evaluated as potentially discriminatory between transudates and exudates included: glucose, proteins, albumin, lactate dehydrogenase (LDH), cholesterol, triglycerides, bilirubin, alkaline phosphatase, and adenosine deaminase (ADA), both separately and in combination, to obtain the highest yield. The highest diagnostic yield was observed with the combination of pleural cholesterol, pleural LDH, and the pleural fluid/serum protein ratio, but without significant differences from combinations of pleural cholesterol and LDH, pleural LDH and pleural proteins, Light's criteria, or modified Light's criteria. We recommend the use of pleural cholesterol higher than 47 mg/dl and pleural LDH higher than 222 IU/l, which offers the same yield as the combination of three parameters at a lower cost and avoids the need for serum determinations.
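
    The recommended rule can be expressed as a simple classifier on two pleural-fluid measurements; the abstract does not state whether the two thresholds are combined with AND or OR, so OR is assumed below by analogy with how Light's criteria are applied.

```python
def is_exudate(pleural_cholesterol_mg_dl, pleural_ldh_iu_l):
    """Classify a pleural effusion as an exudate using the thresholds given in
    the abstract: pleural cholesterol > 47 mg/dl or pleural LDH > 222 IU/l.
    (The OR combination is an assumption; no serum values are needed.)"""
    return pleural_cholesterol_mg_dl > 47 or pleural_ldh_iu_l > 222

print(is_exudate(60, 150))   # True  -> exudate
print(is_exudate(30, 120))   # False -> transudate
```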

  16. Analysis of diagnostic criteria in adamantiades-behçet disease: a retrospective study.

    PubMed

    di Meo, Nicola; Bergamo, S; Vidimari, P; Bonin, S; Trevisan, G

    2013-07-01

    Adamantiades-Behçet's disease (ABD) is a chronic, relapsing, inflammatory, multi-systemic disease. Any organ or system may be involved: ABD presents a great variety of cutaneous and mucosal lesions, ocular manifestations, central and peripheral nervous system abnormalities, and joint as well as gastrointestinal involvement. Since clear pathognomonic clinical features and laboratory tests are lacking, the diagnosis of ABD relies mainly on the characteristic clinical features. Several sets of diagnostic criteria have been used. The International Study Group for Behçet Disease (ISGBD) in 1990 formulated a set of criteria to warrant uniformity of both diagnosis and classification. Subsequently, in 2006, a new set was proposed by the International Team for the Revision of the International Criteria for Behçet's Disease (ITR-ICBD), not only to unify the previous criteria but also to achieve the best accuracy, along with optimal sensitivity and specificity. The aims of this study are both to analyze the clinical features of ABD patients and to validate the ISGBD and ITR-ICBD criteria for the diagnosis of ABD in our cohort.

  17. Good relationships between computational image analysis and radiological physics

    SciTech Connect

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-30

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  18. Good relationships between computational image analysis and radiological physics

    NASA Astrophysics Data System (ADS)

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-01

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  19. Automated retinal image analysis over the internet.

    PubMed

    Tsai, Chia-Ling; Madore, Benjamin; Leotta, Matthew J; Sofka, Michal; Yang, Gehua; Majerovics, Anna; Tanenbaum, Howard L; Stewart, Charles V; Roysam, Badrinath

    2008-07-01

    Retinal clinicians and researchers make extensive use of images, and the current emphasis is on digital imaging of the retinal fundus. The goal of this paper is to introduce a system, known as retinal image vessel extraction and registration system, which provides the community of retinal clinicians, researchers, and study directors an integrated suite of advanced digital retinal image analysis tools over the Internet. The capabilities include vasculature tracing and morphometry, joint (simultaneous) montaging of multiple retinal fields, cross-modality registration (color/red-free fundus photographs and fluorescein angiograms), and generation of flicker animations for visualization of changes from longitudinal image sequences. Each capability has been carefully validated in our previous research work. The integrated Internet-based system can enable significant advances in retina-related clinical diagnosis, visualization of the complete fundus at full resolution from multiple low-angle views, analysis of longitudinal changes, research on the retinal vasculature, and objective, quantitative computer-assisted scoring of clinical trials imagery. It could pave the way for future screening services from optometry facilities.

  20. Digital imaging analysis to assess scar phenotype.

    PubMed

    Smith, Brian J; Nidey, Nichole; Miller, Steven F; Moreno Uribe, Lina M; Baum, Christian L; Hamilton, Grant S; Wehby, George L; Dunnwald, Martine

    2014-01-01

    In order to understand the link between the genetic background of patients and wound clinical outcomes, it is critical to have a reliable method to assess the phenotypic characteristics of healed wounds. In this study, we present a novel imaging method that provides reproducible, sensitive, and unbiased assessments of postsurgical scarring. We used this approach to investigate the possibility that genetic variants in orofacial clefting genes are associated with suboptimal healing. Red-green-blue digital images of postsurgical scars of 68 patients, following unilateral cleft lip repair, were captured using the 3dMD imaging system. Morphometric and colorimetric data of repaired regions of the philtrum and upper lip were acquired using ImageJ software, and the unaffected contralateral regions were used as patient-specific controls. Repeatability of the method was high with intraclass correlation coefficient score > 0.8. This method detected a very significant difference in all three colors, and for all patients, between the scarred and the contralateral unaffected philtrum (p ranging from 1.20 × 10^-5 to 1.95 × 10^-14). Physicians' clinical outcome ratings from the same images showed high interobserver variability (overall Pearson coefficient = 0.49) as well as low correlation with digital image analysis results. Finally, we identified genetic variants in TGFB3 and ARHGAP29 associated with suboptimal healing outcome.

  1. Digital imaging analysis to assess scar phenotype

    PubMed Central

    Smith, Brian J.; Nidey, Nichole; Miller, Steven F.; Moreno, Lina M.; Baum, Christian L.; Hamilton, Grant S.; Wehby, George L.; Dunnwald, Martine

    2015-01-01

    In order to understand the link between the genetic background of patients and wound clinical outcomes, it is critical to have a reliable method to assess the phenotypic characteristics of healed wounds. In this study, we present a novel imaging method that provides reproducible, sensitive and unbiased assessments of post-surgical scarring. We used this approach to investigate the possibility that genetic variants in orofacial clefting genes are associated with suboptimal healing. Red-green-blue (RGB) digital images of post-surgical scars of 68 patients, following unilateral cleft lip repair, were captured using the 3dMD imaging system. Morphometric and colorimetric data of repaired regions of the philtrum and upper lip were acquired using ImageJ software and the unaffected contralateral regions were used as patient-specific controls. Repeatability of the method was high with intraclass correlation coefficient score > 0.8. This method detected a very significant difference in all three colors, and for all patients, between the scarred and the contralateral unaffected philtrum (P ranging from 1.20 × 10^-5 to 1.95 × 10^-14). Physicians' clinical outcome ratings from the same images showed high inter-observer variability (overall Pearson coefficient = 0.49) as well as low correlation with digital image analysis results. Finally, we identified genetic variants in TGFB3 and ARHGAP29 associated with suboptimal healing outcome. PMID:24635173

  2. Multi-criteria analysis for the detection of the most critical European UNESCO Heritage sites

    NASA Astrophysics Data System (ADS)

    Valagussa, Andrea; Frattini, Paolo; Berta, Nadia; Spizzichino, Daniele; Leoni, Gabriele; Margottini, Claudio; Battista Crosta, Giovanni

    2017-04-01

    A GIS-based multi-criteria analysis has been implemented to identify and rank the most critical UNESCO Heritage sites at the European scale in the context of the PROTHEGO JPI project. Two multi-criteria methods have been tested and applied to more than 300 European UNESCO sites. First, the Analytic Hierarchy Process (AHP) was applied to the data of the UNESCO Periodic Reports, in relation to 13 natural hazards that have affected or can potentially affect the Heritage sites. According to these reports, 22% of sites have no documented hazard and 70% of the sites have at least one hazard affecting the site. The most important hazards at the European scale are fire (wildfire), storm, flooding, earthquake, and erosion. For each UNESCO site, the potential risk was calculated as a weighted sum of the hazards that affect the site. The weights of the 13 hazards were obtained by the AHP procedure, a technique for multi-attribute decision making that decomposes a problem into a hierarchy, based on the opinions of different experts about the dominance of risks. The weights are obtained by rescaling between 0 and 1 the eigenvector corresponding to the maximum eigenvalue of the comparison matrix. The internal coherence of the experts' judgments is assessed through the consistency ratio (Saaty, 1990). The result of the AHP method is a map of the UNESCO sites ranked according to potential risk, in which the site most at risk turns out to be the Geirangerfjord and Nærøyfjord in Norway. However, the quality of these results depends on the reliability of the Periodic Reports, which are produced by different experts with unknown levels of scientific background. To test the reliability of these results, the information in the Periodic Reports was compared with available high-quality datasets (earthquake, volcano, and landslide) at the Italian scale. Sites properly classified by the Periodic Reports range from
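
    A generic sketch of the AHP weighting step described above: weights are taken from the principal eigenvector of a pairwise-comparison matrix, and the consistency ratio follows Saaty's CI/RI formulation. The comparison values below are hypothetical and do not reproduce the PROTHEGO expert judgments.

```python
import numpy as np

def ahp_weights(pairwise):
    """Weights and consistency ratio for an AHP pairwise-comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w = w / w.sum()                               # rescale the principal eigenvector
    ci = (vals[k].real - n) / (n - 1)             # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]  # Saaty's random index
    cr = ci / ri if ri > 0 else 0.0
    return w, cr

# Hypothetical comparison of three hazards (e.g. fire vs flooding vs earthquake)
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
print(np.round(weights, 3), round(cr, 3))   # weights sum to 1; CR < 0.1 is conventionally acceptable
```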

  3. Evaluation of the indications for performing magnetic resonance imaging of the female pelvis at a referral center for cancer, according to the American College of Radiology criteria

    PubMed Central

    Boaventura, Camila Silva; Rodrigues, Daniel Padilha; Silva, Olimpio Antonio Cornehl; Beltrani, Fabrício Henrique; de Melo, Rayssa Araruna Bezerra; Bitencourt, Almir Galvão Vieira; Mendes, Gustavo Gomes; Chojniak, Rubens

    2017-01-01

    Objective To evaluate the indications for performing magnetic resonance imaging of the female pelvis at a referral center for cancer. Materials and Methods This was a retrospective, single-center study, conducted by reviewing medical records and imaging reports. We included 1060 female patients who underwent magnetic resonance imaging of the pelvis at a cancer center between January 2013 and June 2014. The indications for performing the examination were classified according to the American College of Radiology (ACR) criteria. Results The mean age of the patients was 52.6 ± 14.8 years, and 49.8% were perimenopausal or postmenopausal. The majority (63.9%) had a history of cancer, which was gynecologic in 29.5% and nongynecologic in 34.4%. Of the patients evaluated, 44.0% had clinical complaints, the most common being pelvic pain (in 11.5%) and bleeding (in 9.8%), and 34.7% of patients had previously had abnormal findings on ultrasound. Most (76.7%) of the patients met the criteria for undergoing magnetic resonance imaging, according to the ACR guidelines. The main indications were evaluation of tumor recurrence after surgical resection (in 25.9%); detection and staging of gynecologic neoplasms (in 23.3%); and evaluation of pelvic pain or of a mass (in 17.1%). Conclusion In the majority of the cases evaluated, magnetic resonance imaging was clearly indicated according to the ACR criteria. The main indication was local recurrence after surgical treatment of pelvic malignancies, which is consistent with the routine protocols at cancer centers. PMID:28298725

  4. SU-E-T-679: Retrospective Analysis of the Sensitivity of Planar Dose Measurements To Gamma Analysis Criteria

    SciTech Connect

    Elguindi, S; Ezzell, G; Gagneur, J

    2015-06-15

    Purpose: IMRT QA using planar dose measurements is still a widely used method for checking the accuracy of treatment plans. A pass/fail judgment is made using gamma analysis based on a single endpoint. Using more stringent criteria is a way to increase the sensitivity to planning and delivery errors. Before such implementation, it is necessary to understand how the sensitivity to different gamma criteria settings affects gamma passing rates (GPR). Methods: 752 IMRT QA measurements were re-analyzed with varying distance-to-agreement (DTA) and dose difference (DD) percentages using a Matlab program. Other quantifying information, such as the mean dose difference in the treatment target (defined as points that are greater than 80% of maximal dose), was stored in a relational database for retrospective analysis. Results: The average and standard deviation of GPR (%) fell from 99.84 ± 0.43 to 89.61 ± 6.08 when restricting DD from 5% to 1%, compared with a drop from 99.15 ± 1.19 to 95.00 ± 4.43 when restricting the DTA from 5 mm to 1 mm. The mean dose difference (%) in the treatment target between measured and calculated dose was -1.96 ± 0.83, -0.09 ± 0.98, and 1.44 ± 0.86 for each of our institution's three matched linear accelerators (LINAC 1, 2, and 3, respectively). For plans that are approximately 2.7 sigma below the mean GPR, an average of 78.4% of those plans were measured on LINAC 1 or 3, while only 48% of the total plans were run on those machines. Conclusion: The data demonstrate that when restricting gamma criteria, such as the DD, the greatest indicator of reduced GPR in our institution is which matched LINAC the plan was measured on. While small, these differences manifest themselves at levels comparable to other treatment-related differences and may confound the gamma analysis.
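
    For context, a simplified global gamma analysis compares each reference dose point against nearby measured points using combined dose-difference (DD) and distance-to-agreement (DTA) criteria and reports the fraction of points with gamma <= 1. The brute-force sketch below uses hypothetical planar doses; it is not the clinical software used in the study.

```python
import numpy as np

def gamma_passing_rate(ref, meas, dd_percent, dta_mm, pixel_mm=1.0, low_dose_cutoff=0.1):
    """Simplified global 2D gamma analysis returning the passing rate in percent."""
    ref, meas = np.asarray(ref, float), np.asarray(meas, float)
    dd = dd_percent / 100.0 * ref.max()            # global dose-difference criterion
    search = int(np.ceil(2 * dta_mm / pixel_mm))   # half-width of the spatial search window
    ny, nx = ref.shape
    passed = evaluated = 0
    for i in range(ny):
        for j in range(nx):
            if ref[i, j] < low_dose_cutoff * ref.max():
                continue                           # skip the low-dose region
            best = np.inf                          # minimum gamma^2 over the search window
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < ny and 0 <= jj < nx:
                        r2 = ((di * pixel_mm) ** 2 + (dj * pixel_mm) ** 2) / dta_mm ** 2
                        d2 = (meas[ii, jj] - ref[i, j]) ** 2 / dd ** 2
                        best = min(best, r2 + d2)
            evaluated += 1
            passed += best <= 1.0                  # gamma^2 <= 1 iff gamma <= 1
    return 100.0 * passed / evaluated

# Toy planar doses: measurement = plan + 1% noise (hypothetical)
rng = np.random.default_rng(3)
plan = np.outer(np.hanning(40), np.hanning(40)) * 200.0
measurement = plan * (1 + 0.01 * rng.normal(size=plan.shape))
print(round(gamma_passing_rate(plan, measurement, dd_percent=3, dta_mm=3), 1))
```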

  5. ALISA: adaptive learning image and signal analysis

    NASA Astrophysics Data System (ADS)

    Bock, Peter

    1999-01-01

    ALISA (Adaptive Learning Image and Signal Analysis) is an adaptive statistical learning engine that may be used to detect and classify the surfaces and boundaries of objects in images. The engine has been designed, implemented, and tested at both the George Washington University and the Research Institute for Applied Knowledge Processing in Ulm, Germany over the last nine years with major funding from Robert Bosch GmbH and Lockheed-Martin Corporation. The design of ALISA was inspired by the multi-path cortical-column architecture and adaptive functions of the mammalian visual cortex.

  6. Analysis of spatial pseudodepolarizers in imaging systems

    NASA Technical Reports Server (NTRS)

    Mcguire, James P., Jr.; Chipman, Russell A.

    1990-01-01

    The objective of a number of optical instruments is to measure the intensity accurately without bias as to the incident polarization state. One method to overcome polarization bias in optical systems is the insertion of a spatial pseudodepolarizer. Both the degree of depolarization and image degradation (from the polarization aberrations of the pseudodepolarizer) are analyzed for two depolarizer designs: (1) the Cornu pseudodepolarizer, effective for linearly polarized light, and (2) the dual Babinet compensator pseudodepolarizer, effective for all incident polarization states. The image analysis uses a matrix formalism to describe the polarization dependence of the diffraction patterns and optical transfer function.

  7. Characterization of microrod arrays by image analysis

    NASA Astrophysics Data System (ADS)

    Hillebrand, Reinald; Grimm, Silko; Giesa, Reiner; Schmidt, Hans-Werner; Mathwig, Klaus; Gösele, Ulrich; Steinhart, Martin

    2009-04-01

    The uniformity of the properties of array elements was evaluated by statistical analysis of microscopic images of array structures, assuming that the brightness of the array elements correlates quantitatively or qualitatively with a microscopically probed quantity. Derivatives and autocorrelation functions of cumulative frequency distributions of the object brightnesses were used to quantify variations in object properties throughout arrays. Thus, different specimens, the same specimen at different stages of its fabrication or use, and different imaging conditions can be compared systematically. As an example, we analyzed scanning electron micrographs of microrod arrays and calculated the percentage of broken microrods.

  8. Recent Advances in Morphological Cell Image Analysis

    PubMed Central

    Chen, Shengyong; Zhao, Mingzhu; Wu, Guang; Yao, Chunyan; Zhang, Jianwei

    2012-01-01

    This paper summarizes the recent advances in image processing methods for morphological cell analysis. The topic of morphological analysis has received much attention with the increasing demands in both bioinformatics and biomedical applications. Among many factors that affect the diagnosis of a disease, morphological cell analysis and statistics have made great contributions to results and effects for a doctor. Morphological cell analysis determines cellular shape, cellular regularity, classification, statistics, diagnosis, and so forth. In the last 20 years, about 1000 publications have reported the use of morphological cell analysis in biomedical research. Relevant solutions encompass a rather wide application area, such as cell clump segmentation, morphological characteristic extraction, 3D reconstruction, abnormal cell identification, and statistical analysis. These reports are summarized in this paper to enable easy referral to suitable methods for practical solutions. Representative contributions and future research trends are also addressed. PMID:22272215

  9. Autonomous Image Analysis for Future Mars Missions

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Ruzon, M. A.; Bandari, E.; Roush, T. L.

    1999-01-01

    To explore high priority landing sites and to prepare for eventual human exploration, future Mars missions will involve rovers capable of traversing tens of kilometers. However, the current process by which scientists interact with a rover does not scale to such distances. Specifically, numerous command cycles are required to complete even simple tasks, such as, pointing the spectrometer at a variety of nearby rocks. In addition, the time required by scientists to interpret image data before new commands can be given and the limited amount of data that can be downlinked during a given command cycle constrain rover mobility and achievement of science goals. Experience with rover tests on Earth supports these concerns. As a result, traverses to science sites as identified in orbital images would require numerous science command cycles over a period of many weeks, months or even years, perhaps exceeding rover design life and other constraints. Autonomous onboard science analysis can address these problems in two ways. First, it will allow the rover to preferentially transmit "interesting" images, defined as those likely to have higher science content. Second, the rover will be able to anticipate future commands. For example, a rover might autonomously acquire and return spectra of "interesting" rocks along with a high-resolution image of those rocks in addition to returning the context images in which they were detected. Such approaches, coupled with appropriate navigational software, help to address both the data volume and command cycle bottlenecks that limit both rover mobility and science yield. We are developing fast, autonomous algorithms to enable such intelligent on-board decision making by spacecraft. Autonomous algorithms developed to date have the ability to identify rocks and layers in a scene, locate the horizon, and compress multi-spectral image data. We are currently investigating the possibility of reconstructing a 3D surface from a sequence of images

  10. Fast image analysis in polarization SHG microscopy.

    PubMed

    Amat-Roldan, Ivan; Psilodimitrakopoulos, Sotiris; Loza-Alvarez, Pablo; Artigas, David

    2010-08-02

    Pixel resolution polarization-sensitive second harmonic generation (PSHG) imaging has been recently shown as a promising imaging modality, by largely enhancing the capabilities of conventional intensity-based SHG microscopy. PSHG is able to obtain structural information from the elementary SHG active structures, which play an important role in many biological processes. Although the technique is of major interest, acquiring such information requires long offline processing, even with current computers. In this paper, we present an approach based on Fourier analysis of the anisotropy signature that allows processing the PSHG images in less than a second in standard single core computers. This represents a temporal improvement of several orders of magnitude compared to conventional fitting algorithms. This opens up the possibility for fast PSHG information with the subsequent benefit of potential use in medical applications.
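
    The Fourier approach amounts to extracting the low-order harmonics of each pixel's intensity-versus-polarization-angle signature with an FFT instead of nonlinear curve fitting. A minimal sketch with a simulated single-pixel response; the coefficient values and their mapping to structural parameters are illustrative only.

```python
import numpy as np

def pshg_fourier_coefficients(intensities):
    """Return the a0, a2, and a4 Fourier components of a PSHG response I(theta)
    sampled at N equally spaced excitation polarization angles over 180 degrees."""
    I = np.asarray(intensities, float)
    F = np.fft.rfft(I) / len(I)
    return np.abs(F[0]), 2 * np.abs(F[1]), 2 * np.abs(F[2])

# Simulated pixel response at 18 polarization angles (hypothetical coefficients)
theta = np.linspace(0, np.pi, 18, endpoint=False)
signal = 1.0 + 0.5 * np.cos(2 * theta) + 0.2 * np.cos(4 * theta)
a0, a2, a4 = pshg_fourier_coefficients(signal)
print(round(a0, 2), round(a2, 2), round(a4, 2))   # 1.0 0.5 0.2
```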

  11. Adult "termination-of-resuscitation" (TOR)-criteria may not be suitable for children - a retrospective analysis.

    PubMed

    Rotering, Victoria Maria; Trepels-Kottek, Sonja; Heimann, Konrad; Brokmann, Jörg-Christian; Orlikowsky, Thorsten; Schoberer, Mark

    2016-12-07

    Only a small number of patients survive out-of-hospital cardiac arrest (OHCA). The duration of CPR varies considerably, and transportation of patients under CPR is often unsuccessful. Termination-of-resuscitation (TOR) criteria aim to preclude futile resuscitation efforts. Our goal was to find out to what extent existing TOR criteria can be transferred to paediatric OHCA patients, with special regard to their prognostic value. We performed a retrospective analysis of an eleven-year single-centre patient cohort. 43 paediatric patients admitted to our institution after emergency-medical-system (EMS)-confirmed OHCA from 2003 to 2013 were included. Morrison's BLS- and ALS-TOR rules as well as the Trauma-TOR criteria by the American Association of EMS Physicians were evaluated for application in children by calculating sensitivity, specificity, and negative and positive predictive values for the prediction of death and of survival in our cohort. 26 patients achieved ROSC and 14 were discharged alive (n = 7 PCPC 1/2, n = 7 PCPC 5). Sensitivity of the BLS-TOR criteria for predicting death was 48.3%, specificity 92.9%, the PPV 93.3%, and the NPV 46.4%. The ALS-TOR criteria for death had a sensitivity of 10.3%, a specificity of 100%, a PPV of 100%, and an NPV of 35%. Retrospective application of the BLS-TOR rule in our patient cohort identified the resuscitation of one later survivor as futile. The ALS-TOR criteria did not give false predictions of death. The proportion of CPRs that could have been abandoned is 48.2% for the BLS-TOR rule and only 10.3% for the ALS-TOR rule. Both rules therefore appear not to be transferable to a paediatric population.
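
    The reported figures follow the standard 2x2 definitions; in the sketch below the counts are back-calculated from the abstract's BLS-TOR percentages (29 deaths and 14 survivors among 43 patients) purely for illustration and are not taken from the paper's tables.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 confusion table,
    with 'positive' meaning the TOR rule predicts death."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Counts back-calculated from the abstract's BLS-TOR figures (illustrative)
sens, spec, ppv, npv = diagnostic_metrics(tp=14, fp=1, tn=13, fn=15)
print([round(x, 3) for x in (sens, spec, ppv, npv)])   # [0.483, 0.929, 0.933, 0.464]
```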

  12. Assessing the sustainability of forest management: an application of multi-criteria decision analysis to community forests in northern Ethiopia.

    PubMed

    Balana, Bedru Babulo; Mathijs, Erik; Muys, Bart

    2010-06-01

    Continuous deterioration of the natural resource base has become a serious threat to both the ecological systems and economic production in Ethiopia. Many of these problems have been attributed directly or indirectly to the rapid dwindling of the country's forest cover which is associated with unsustainable forest use and management. Closing community woodlands from human and livestock intervention to promote natural regeneration of forests has been one of the environmental restoration strategies pursued in the degraded highland areas of northern Ethiopia. However, local pressure to use reforested community lands for economic benefit has become a major threat to forest sustainability. Using locally identified sets of criteria and indicators for sustainable community forest management, this paper applies a multi-criteria decision analysis tool to evaluate forest management problems in the northern province of Tigray, Ethiopia. Three MCA methods - ranking, pair-wise comparison, and scoring - were used in evaluating the sets of criteria and indicators and alternative forest management scenarios. Results from the study indicate a number of noteworthy points: 1) MCA techniques both for identifying local level sustainability criteria and indicators and evaluating management schemes in a participatory decision environment appear to be effective tools to address local resource management problems; 2) Evaluated against the selected sets of criteria and indicators, the current forest management regime in the study area is not on a sustainable path; 3) Acquainting local people with adequate environmental knowledge and raising local awareness about the long-term consequences of environmental degradation ranked first among the set of sustainability criteria; and 4) In order to harmonize both environmental and economic objectives, the present 'ecological-biased' forest management regime needs to be substituted by an appropriate holistic scheme that takes into account

  13. Automated quantitative image analysis of nanoparticle assembly

    NASA Astrophysics Data System (ADS)

    Murthy, Chaitanya R.; Gao, Bo; Tao, Andrea R.; Arya, Gaurav

    2015-05-01

    The ability to characterize higher-order structures formed by nanoparticle (NP) assembly is critical for predicting and engineering the properties of advanced nanocomposite materials. Here we develop a quantitative image analysis software to characterize key structural properties of NP clusters from experimental images of nanocomposites. This analysis can be carried out on images captured at intermittent times during assembly to monitor the time evolution of NP clusters in a highly automated manner. The software outputs averages and distributions in the size, radius of gyration, fractal dimension, backbone length, end-to-end distance, anisotropic ratio, and aspect ratio of NP clusters as a function of time along with bootstrapped error bounds for all calculated properties. The polydispersity in the NP building blocks and biases in the sampling of NP clusters are accounted for through the use of probabilistic weights. This software, named Particle Image Characterization Tool (PICT), has been made publicly available and could be an invaluable resource for researchers studying NP assembly. To demonstrate its practical utility, we used PICT to analyze scanning electron microscopy images taken during the assembly of surface-functionalized metal NPs of differing shapes and sizes within a polymer matrix. PICT is used to characterize and analyze the morphology of NP clusters, providing quantitative information that can be used to elucidate the physical mechanisms governing NP assembly.
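
    A minimal sketch of the kind of per-cluster shape metrics the abstract mentions (size, radius of gyration, aspect ratio), assuming a binary segmented SEM image as input. PICT's full set of descriptors, its bootstrapping and its probabilistic weighting are not reproduced; the function name and pixel-size parameter are illustrative.

      import numpy as np
      from scipy import ndimage

      def cluster_shape_metrics(binary_image, pixel_size_nm=1.0):
          """Per-cluster size, radius of gyration and aspect ratio from a binary image."""
          labels, n_clusters = ndimage.label(binary_image)
          metrics = []
          for k in range(1, n_clusters + 1):
              ys, xs = np.nonzero(labels == k)
              if len(xs) < 5:          # skip tiny specks with unstable statistics
                  continue
              coords = np.column_stack((xs, ys)) * pixel_size_nm
              center = coords.mean(axis=0)
              rg = np.sqrt(((coords - center) ** 2).sum(axis=1).mean())   # radius of gyration
              cov = np.cov((coords - center).T)
              eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
              aspect = np.sqrt(eigvals[0] / eigvals[1]) if eigvals[1] > 0 else np.inf
              metrics.append({"size_px": len(xs),
                              "radius_of_gyration": rg,
                              "aspect_ratio": aspect})
          return metrics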

  14. Alzheimer's disease - a neurospirochetosis. Analysis of the evidence following Koch's and Hill's criteria

    PubMed Central

    2011-01-01

    It is established that chronic spirochetal infection can cause slowly progressive dementia, brain atrophy and amyloid deposition in late neurosyphilis. Recently it has been suggested that various types of spirochetes, in an analogous way to Treponema pallidum, could cause dementia and may be involved in the pathogenesis of Alzheimer's disease (AD). Here, we review all data available in the literature on the detection of spirochetes in AD and critically analyze the association and causal relationship between spirochetes and AD following established criteria of Koch and Hill. The results show a statistically significant association between spirochetes and AD (P = 1.5 × 10^-17, OR = 20, 95% CI = 8-60, N = 247). When neutral techniques recognizing all types of spirochetes were used, or the highly prevalent periodontal pathogen Treponemas were analyzed, spirochetes were observed in the brain in more than 90% of AD cases. Borrelia burgdorferi was detected in the brain in 25.3% of AD cases analyzed and was 13 times more frequent in AD compared to controls. Periodontal pathogen Treponemas (T. pectinovorum, T. amylovorum, T. lecithinolyticum, T. maltophilum, T. medium, T. socranskii) and Borrelia burgdorferi were detected using species-specific PCR and antibodies. Importantly, co-infection with several spirochetes occurs in AD. The pathological and biological hallmarks of AD were reproduced in vitro by exposure of mammalian cells to spirochetes. The analysis of reviewed data following Koch's and Hill's postulates shows a probable causal relationship between neurospirochetosis and AD. Persisting inflammation and amyloid deposition initiated and sustained by chronic spirochetal infection form, together with the various hypotheses suggested to play a role in the pathogenesis of AD, a comprehensive entity. As suggested by Hill, once the probability of a causal relationship is established, prompt action is needed. Support and attention should be given to this field of AD research.

  15. Estimation Criteria for Rock Brittleness Based on Energy Analysis During the Rupturing Process

    NASA Astrophysics Data System (ADS)

    Ai, Chi; Zhang, Jun; Li, Yu-wei; Zeng, Jia; Yang, Xin-liang; Wang, Ji-gang

    2016-12-01

    Brittleness is one of the most important mechanical properties of rock: it plays a significant role in evaluating the risk of rock bursts and in analysis of borehole-wall stability during shale gas development. Brittleness is also a critical parameter in the design of hydraulic fracturing. However, there is still no widely accepted definition of the concept of brittleness in rock mechanics. Although many criteria have been proposed to characterize rock brittleness, their applicability and reliability have yet to be verified. In this paper, the brittleness of rock under compression is defined as the ability of a rock to accumulate elastic energy during the pre-peak stage and to self-sustain fracture propagation in the post-peak stage. This ability is related to three types of energy: fracture energy, post-peak released energy and pre-peak dissipation energy. New brittleness evaluation indices B1 and B2 are proposed based on the stress-strain curve from the viewpoint of energy. The new indices can describe the entire transition of rock from absolute plasticity to absolute brittleness. In addition, the brittle characteristics reflected by other brittleness indices can be described, and the calculation results of B1 and B2 are continuous and monotonic. Triaxial compression tests on different types of rock were carried out under different confining pressures. Based on B1 and B2, the brittleness of different rocks shows different trends with rising confining pressure. The brittleness of red sandstone decreases with increasing confining pressure, whereas for black shale it initially increases and then decreases in a certain range of confining pressure. Granite displays a constant increasing trend. The brittleness anisotropy of black shale is discussed. The smaller the angle between the loading direction and the bedding plane, the greater the brittleness. The calculation of B1 and B2 requires experimental data, and the values of these two indices represent only
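
    The abstract does not give the formulas for B1 and B2, so the sketch below only illustrates how the energy terms it names (pre-peak elastic energy, pre-peak dissipated energy and post-peak fracture energy) can be estimated from a digitized stress-strain curve; the final ratio is an illustrative stand-in, not the paper's index, and the unloading-modulus estimate of the elastic energy is an assumption.

      import numpy as np

      def energy_partition(strain, stress, unload_modulus):
          """Split a compression stress-strain curve into energy terms.

          strain, stress : 1-D arrays covering the pre- and post-peak stages
          unload_modulus : elastic (unloading) modulus used to estimate the
                           recoverable elastic energy stored at peak stress
          """
          i_peak = int(np.argmax(stress))
          peak_stress = stress[i_peak]
          total_prepeak = np.trapz(stress[:i_peak + 1], strain[:i_peak + 1])
          elastic_prepeak = peak_stress ** 2 / (2.0 * unload_modulus)   # recoverable part
          dissipated_prepeak = max(total_prepeak - elastic_prepeak, 0.0)
          fracture_energy = np.trapz(stress[i_peak:], strain[i_peak:])  # post-peak area
          # Illustrative ratio only: larger when more of the input energy is recoverable
          illustrative_index = elastic_prepeak / (elastic_prepeak + dissipated_prepeak + fracture_energy)
          return dict(elastic=elastic_prepeak, dissipated=dissipated_prepeak,
                      fracture=fracture_energy, illustrative_index=illustrative_index)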

  16. SENSITIVITY ANALYSIS OF THE APPLICATION OF CHEMICAL EXPOSURE CRITERIA FOR COMPARING SITES AND WATERSHEDS

    EPA Science Inventory

    A methodology was developed for deriving quantitative exposure criteria useful for comparing a site or watershed to a reference condition. The prototype method used indicators of exposures to oil contamination and combustion by-products, naphthalene and benzo(a)pyrene metabolites...

  17. Gender Bias in Diagnostic Criteria for Personality Disorders: An Item Response Theory Analysis

    PubMed Central

    Jane, J. Serrita; Oltmanns, Thomas F.; South, Susan C.; Turkheimer, Eric

    2015-01-01

    The authors examined gender bias in the diagnostic criteria for Diagnostic and Statistical Manual of Mental Disorders (4th ed., text revision; American Psychiatric Association, 2000) personality disorders. Participants (N = 599) were selected from 2 large, nonclinical samples on the basis of information from self-report questionnaires and peer nominations that suggested the presence of personality pathology. All were interviewed with the Structured Interview for DSM–IV Personality (B. Pfohl, N. Blum, & M. Zimmerman, 1997). Using item response theory methods, the authors compared data from 315 men and 284 women, searching for evidence of differential item functioning in the diagnostic features of 10 personality disorder categories. Results indicated significant but moderate measurement bias pertaining to gender for 6 specific criteria. In other words, men and women with equivalent levels of pathology endorsed the items at different rates. For 1 paranoid personality disorder criterion and 3 antisocial criteria, men were more likely to endorse the biased items. For 2 schizoid personality disorder criteria, women were more likely to endorse the biased items. PMID:17324027

  18. Gender bias in diagnostic criteria for personality disorders: an item response theory analysis.

    PubMed

    Jane, J Serrita; Oltmanns, Thomas F; South, Susan C; Turkheimer, Eric

    2007-02-01

    The authors examined gender bias in the diagnostic criteria for Diagnostic and Statistical Manual of Mental Disorders (4th ed., text revision; American Psychiatric Association, 2000) personality disorders. Participants (N=599) were selected from 2 large, nonclinical samples on the basis of information from self-report questionnaires and peer nominations that suggested the presence of personality pathology. All were interviewed with the Structured Interview for DSM-IV Personality (B. Pfohl, N. Blum, & M. Zimmerman, 1997). Using item response theory methods, the authors compared data from 315 men and 284 women, searching for evidence of differential item functioning in the diagnostic features of 10 personality disorder categories. Results indicated significant but moderate measurement bias pertaining to gender for 6 specific criteria. In other words, men and women with equivalent levels of pathology endorsed the items at different rates. For 1 paranoid personality disorder criterion and 3 antisocial criteria, men were more likely to endorse the biased items. For 2 schizoid personality disorder criteria, women were more likely to endorse the biased items.

  19. Analysis of Time-Sharing Contract Agreements with Related Suggested Systems Evaluation Criteria.

    ERIC Educational Resources Information Center

    Chanoux, Jo Ann J.

    While avoiding evaluation or specification of individual companies, computer time-sharing commercial contract agreements are analyzed. Price and non-price contract elements are analyzed according to 22 evaluation criteria: confidentiality measures assumed by the vendor; consultation services available; package programs and user routines; languages…

  20. SENSITIVITY ANALYSIS OF THE APPLICATION OF CHEMICAL EXPOSURE CRITERIA FOR COMPARING SITES AND WATERSHEDS

    EPA Science Inventory

    A methodology was developed for deriving quantitative exposure criteria useful for comparing a site or watershed to a reference condition. The prototype method used indicators of exposures to oil contamination and combustion by-products, naphthalene and benzo(a)pyrene metabolites...

  1. Diagnosing Behavior Disorders: An Analysis of State Definitions, Eligibility Criteria and Recommended Procedures.

    ERIC Educational Resources Information Center

    Swartz, Stanley L.; And Others

    Using information collected in a survey of all 50 states and the District of Columbia, the study analyzed state definitions of the "behavior disordered/emotionally disturbed" (BD/ED) category of handicapped children, program entrance and exit criteria, and procedures for referral, evaluation, and program placement. A general lack of…

  2. Endoscopic image analysis in semantic space.

    PubMed

    Kwitt, R; Vasconcelos, N; Rasiwasia, N; Uhl, A; Davis, B; Häfner, M; Wrba, F

    2012-10-01

    A novel approach to the design of a semantic, low-dimensional, encoding for endoscopic imagery is proposed. This encoding is based on recent advances in scene recognition, where semantic modeling of image content has gained considerable attention over the last decade. While the semantics of scenes are mainly comprised of environmental concepts such as vegetation, mountains or sky, the semantics of endoscopic imagery are medically relevant visual elements, such as polyps, special surface patterns, or vascular structures. The proposed semantic encoding differs from the representations commonly used in endoscopic image analysis (for medical decision support) in that it establishes a semantic space, where each coordinate axis has a clear human interpretation. It is also shown to establish a connection to Riemannian geometry, which enables principled solutions to a number of problems that arise in both physician training and clinical practice. This connection is exploited by leveraging results from information geometry to solve problems such as (1) recognition of important semantic concepts, (2) semantically-focused image browsing, and (3) estimation of the average-case semantic encoding for a collection of images that share a medically relevant visual detail. The approach can provide physicians with an easily interpretable, semantic encoding of visual content, upon which further decisions, or operations, can be naturally carried out. This is contrary to the prevalent practice in endoscopic image analysis for medical decision support, where image content is primarily captured by discriminative, high-dimensional, appearance features, which possess discriminative power but lack human interpretability. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Analysis of Handling Qualities Design Criteria for Active Inceptor Force-Feel Characteristics

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos A.; Lusardi, Jeff A.

    2013-01-01

    ratio. While these two studies produced boundaries for acceptable/unacceptable stick dynamics for rotorcraft, they were not able to provide guidance on how variations of the stick dynamics in the acceptable region impact handling qualities. More recently, a ground based simulation study [5] suggested little benefit was to be obtained from variations of the damping ratio for a side-stick controller exhibiting high natural frequencies (greater than 17 rad/s) and damping ratios (greater than 2.0). A flight test campaign was conducted concurrently on the RASCAL JUH-60A in-flight simulator and the ACT/FHS EC-135 in flight simulator [6]. Upon detailed analysis of the pilot evaluations the study identified a clear preference for a high damping ratio and natural frequency of the center stick inceptors. Side stick controllers were found to be less sensitive to the damping. While these studies have compiled a substantial amount of data, in the form of qualitative and quantitative pilot opinion, a fundamental analysis of the effect of the inceptor force-feel system on flight control is found to be lacking. The study of Ref. [6] specifically concluded that a systematic analysis was necessary, since discrepancies with the assigned handling qualities showed that proposed analytical design metrics, or criteria, were not suitable. The overall goal of the present study is to develop a clearer fundamental understanding of the underlying mechanisms associated with the inceptor dynamics that govern the handling qualities using a manageable analytical methodology.

  4. The synthesis and analysis of color images

    NASA Technical Reports Server (NTRS)

    Wandell, B. A.

    1985-01-01

    A method is described for performing the synthesis and analysis of digital color images. The method is based on two principles. First, image data are represented with respect to the separate physical factors, surface reflectance and the spectral power distribution of the ambient light, that give rise to the perceived color of an object. Second, the encoding is made efficient by using a basis expansion for the surface spectral reflectance and spectral power distribution of the ambient light that takes advantage of the high degree of correlation across the visible wavelengths normally found in such functions. Within this framework, the same basic methods can be used to synthesize image data for color display monitors and printed materials, and to analyze image data into estimates of the spectral power distribution and surface spectral reflectances. The method can be applied to a variety of tasks. Examples of applications include the color balancing of color images, and the identification of material surface spectral reflectance when the lighting cannot be completely controlled.

  5. Image analysis for measuring rod network properties

    NASA Astrophysics Data System (ADS)

    Kim, Dongjae; Choi, Jungkyu; Nam, Jaewook

    2015-12-01

    In recent years, metallic nanowires have been attracting significant attention as next-generation flexible transparent conductive films. The performance of films depends on the network structure created by nanowires. Gaining an understanding of their structure, such as connectivity, coverage, and alignment of nanowires, requires the knowledge of individual nanowires inside the microscopic images taken from the film. Although nanowires are flexible up to a certain extent, they are usually depicted as rigid rods in many analysis and computational studies. Herein, we propose a simple and straightforward algorithm based on the filtering in the frequency domain for detecting the rod-shape objects inside binary images. The proposed algorithm uses a specially designed filter in the frequency domain to detect image segments, namely, the connected components aligned in a certain direction. Those components are post-processed to be combined under a given merging rule in a single rod object. In this study, the microscopic properties of the rod networks relevant to the analysis of nanowire networks were measured for investigating the opto-electric performance of transparent conductive films and their alignment distribution, length distribution, and area fraction. To verify and find the optimum parameters for the proposed algorithm, numerical experiments were performed on synthetic images with predefined properties. By selecting proper parameters, the algorithm was used to investigate silver nanowire transparent conductive films fabricated by the dip coating method.
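
    A minimal sketch of orientation-selective filtering in the frequency domain, the core idea named in the abstract; the mask shape, angular tolerance and all names are assumptions, and the paper's merging rules and network-property measurements are not reproduced.

      import numpy as np

      def directional_filter(binary_img, theta_deg, half_width_deg=10.0):
          """Keep spatial frequencies roughly perpendicular to rods at theta_deg.

          Elongated objects concentrate spectral energy perpendicular to their
          long axis, so the filtered image emphasizes segments aligned with
          theta_deg; it can then be thresholded and labelled.
          """
          ny, nx = binary_img.shape
          fy = np.fft.fftfreq(ny)[:, None]
          fx = np.fft.fftfreq(nx)[None, :]
          angle = np.degrees(np.arctan2(fy, fx))        # direction of each frequency sample
          target = (theta_deg + 90.0) % 180.0           # rods at theta -> spectrum at theta + 90
          diff = np.abs(((angle - target) + 90.0) % 180.0 - 90.0)
          mask = diff <= half_width_deg
          mask[0, 0] = True                             # keep the DC component
          spectrum = np.fft.fft2(binary_img.astype(float))
          return np.real(np.fft.ifft2(spectrum * mask))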

  6. Evidential Reasoning in Expert Systems for Image Analysis.

    DTIC Science & Technology

    1985-02-01

    techniques to image analysis (IA). There is growing evidence that these techniques offer significant improvements in image analysis, particularly in the ... (2) to provide a common framework for analysis, (3) to structure the ER process for major expert-system tasks in image analysis, and (4) to identify ... approaches to three important tasks for expert systems in the domain of image analysis. This segment concluded with an assessment of the strengths

  7. Evaluation of the 2010 McDonald multiple sclerosis criteria in children with a clinically isolated syndrome.

    PubMed

    Kornek, Barbara; Schmitl, Beate; Vass, Karl; Zehetmayer, Sonja; Pritsch, Martin; Penzien, Johann; Karenfort, Michael; Blaschek, Astrid; Seidl, Rainer; Prayer, Daniela; Rostasy, Kevin

    2012-12-01

    Magnetic resonance imaging diagnostic criteria for paediatric multiple sclerosis have been established on the basis of brain imaging findings alone. The 2010 McDonald criteria for the diagnosis of multiple sclerosis, however, include spinal cord imaging for detection of lesion dissemination in space. The new criteria have been recommended in paediatric multiple sclerosis. (1) To evaluate the 2010 McDonald multiple sclerosis criteria in children with a clinically isolated syndrome and to compare them with recently proposed magnetic resonance criteria for children; (2) to assess whether the inclusion of spinal cord imaging provided additional value to the 2010 McDonald criteria. We performed a retrospective analysis of brain and spinal cord magnetic resonance imaging scans from 52 children with a clinically isolated syndrome. Sensitivity, specificity and accuracy of the magnetic resonance criteria were assessed. The 2010 McDonald dissemination in space criteria were more sensitive (85% versus 74%) but less specific (80% versus 100%) compared to the 2005 McDonald criteria. The Callen criteria were more accurate (89%) compared to the 2010 McDonald (85%), the 2005 McDonald criteria for dissemination in space (81%), the KIDMUS criteria (46%) and the Canadian Pediatric Demyelinating Disease Network criteria (76%). The 2010 McDonald criteria for dissemination in time were more accurate (93%) than the dissemination in space criteria (85%). Inclusion of the spinal cord did not increase the accuracy of the McDonald criteria.

  8. Pain related inflammation analysis using infrared images

    NASA Astrophysics Data System (ADS)

    Bhowmik, Mrinal Kanti; Bardhan, Shawli; Das, Kakali; Bhattacharjee, Debotosh; Nath, Satyabrata

    2016-05-01

    Medical Infrared Thermography (MIT) offers a potential non-invasive, non-contact and radiation free imaging modality for assessment of abnormal inflammation having pain in the human body. The assessment of inflammation mainly depends on the emission of heat from the skin surface. Arthritis is a disease of joint damage that generates inflammation in one or more anatomical joints of the body. Osteoarthritis (OA) is the most frequent appearing form of arthritis, and rheumatoid arthritis (RA) is the most threatening form of them. In this study, the inflammatory analysis has been performed on the infrared images of patients suffering from RA and OA. For the analysis, a dataset of 30 bilateral knee thermograms has been captured from the patient of RA and OA by following a thermogram acquisition standard. The thermograms are pre-processed, and areas of interest are extracted for further processing. The investigation of the spread of inflammation is performed along with the statistical analysis of the pre-processed thermograms. The objectives of the study include: i) Generation of a novel thermogram acquisition standard for inflammatory pain disease ii) Analysis of the spread of the inflammation related to RA and OA using K-means clustering. iii) First and second order statistical analysis of pre-processed thermograms. The conclusion reflects that, in most of the cases, RA oriented inflammation affects bilateral knees whereas inflammation related to OA present in the unilateral knee. Also due to the spread of inflammation in OA, contralateral asymmetries are detected through the statistical analysis.
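
    As a hedged illustration of the K-means step mentioned in objective (ii), the sketch below clusters the pixel temperatures of a single pre-processed knee thermogram and returns the hottest cluster as a candidate inflamed region; the number of clusters and the function name are assumptions, not the study's settings.

      import numpy as np
      from sklearn.cluster import KMeans

      def inflammation_regions(thermogram, n_clusters=3):
          """Cluster a 2-D array of skin temperatures (deg C) and return the
          hottest cluster as a boolean mask together with its mean temperature."""
          pixels = thermogram.reshape(-1, 1)
          labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(pixels)
          labels = labels.reshape(thermogram.shape)
          means = [thermogram[labels == k].mean() for k in range(n_clusters)]
          hottest = int(np.argmax(means))
          return labels == hottest, means[hottest]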

  9. Vibration signature analysis of AFM images

    SciTech Connect

    Joshi, G.A.; Fu, J.; Pandit, S.M.

    1995-12-31

    Vibration signature analysis has been commonly used for machine condition monitoring and the control of errors. However, it has been rarely employed for the analysis of precision instruments such as an atomic force microscope (AFM). In this work, an AFM was used to collect vibration data from a sample positioning stage under different suspension and support conditions. Certain structural characteristics of the sample positioning stage show up as a result of the vibration signature analysis of the surface height images measured using an AFM. It is important to understand these vibration characteristics in order to reduce vibrational uncertainty, improve the damping and structural design, and to eliminate the imaging imperfections. The choice of method applied for vibration analysis may affect the results. Two methods, the data dependent systems (DDS) analysis and Welch's periodogram averaging method, were investigated for application to this problem. Both techniques provide smooth spectrum plots from the data. Welch's periodogram provides a coarse resolution as limited by the number of samples and requires a choice of window to be decided subjectively by the user. The DDS analysis provides sharper spectral peaks at a much higher resolution and a much lower noise floor. A decomposition of the signal variance in terms of the frequencies is provided as well. The technique is based on an objective model adequacy criterion.
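
    For the Welch periodogram part of the comparison, a minimal sketch using scipy.signal.welch on a synthetic AFM-like height signal is shown below; the sampling rate, window choice and test signal are placeholders, and the DDS method is not reproduced.

      import numpy as np
      from scipy.signal import welch

      fs = 1000.0                         # assumed sampling rate of the height signal, Hz
      t = np.arange(0, 2.0, 1.0 / fs)
      # synthetic height trace: a 120 Hz vibration plus noise
      height = 1e-9 * np.sin(2 * np.pi * 120 * t) + 2e-10 * np.random.randn(t.size)

      freqs, psd = welch(height, fs=fs, window="hann", nperseg=512)
      dominant = freqs[np.argmax(psd)]    # strongest vibration component, Hz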

  10. Weighting of Criteria for Disease Prioritization Using Conjoint Analysis and Based on Health Professional and Student Opinion.

    PubMed

    Stebler, Nadine; Schuepbach-Regula, Gertraud; Braam, Peter; Falzon, Laura Cristina

    2016-01-01

    Disease prioritization exercises have been used by several organizations to inform surveillance and control measures. Though most methodologies for disease prioritization are based on expert opinion, it is becoming more common to include different stakeholders in the prioritization exercise. This study was performed to compare the weighting of disease criteria, and the consequent prioritization of zoonoses, by both health professionals and students in Switzerland using a Conjoint Analysis questionnaire. The health professionals comprised public health and food safety experts, cantonal physicians and cantonal veterinarians, while the student group comprised first-year veterinary and agronomy students. Eight criteria were selected for this prioritization based on expert elicitation and literature review. These criteria, described on a 3-tiered scale, were evaluated through a choice-based Conjoint Analysis questionnaire with 25 choice tasks. Questionnaire results were analyzed to obtain importance scores (for each criterion) and mean utility values (for each criterion level), and the latter were then used to rank 16 zoonoses. While the most important criterion for both groups was "Severity of the disease in humans", the second ranked criteria by the health professionals and students were "Economy" and "Treatment in humans", respectively. Regarding the criterion "Control and Prevention", health professionals tended to prioritize a disease when the control and preventive measures were described to be 95% effective, while students prioritized a disease if there were almost no control and preventive measures available. Bovine Spongiform Encephalopathy was the top-ranked disease by both groups. Health professionals and students agreed on the weighting of certain criteria such as "Severity" and "Treatment of disease in humans", but disagreed on others such as "Economy" or "Control and Prevention". Nonetheless, the overall disease ranking lists were similar, and these may be

  11. Weighting of Criteria for Disease Prioritization Using Conjoint Analysis and Based on Health Professional and Student Opinion

    PubMed Central

    Stebler, Nadine; Schuepbach-Regula, Gertraud; Braam, Peter; Falzon, Laura Cristina

    2016-01-01

    Disease prioritization exercises have been used by several organizations to inform surveillance and control measures. Though most methodologies for disease prioritization are based on expert opinion, it is becoming more common to include different stakeholders in the prioritization exercise. This study was performed to compare the weighting of disease criteria, and the consequent prioritization of zoonoses, by both health professionals and students in Switzerland using a Conjoint Analysis questionnaire. The health professionals comprised public health and food safety experts, cantonal physicians and cantonal veterinarians, while the student group comprised first-year veterinary and agronomy students. Eight criteria were selected for this prioritization based on expert elicitation and literature review. These criteria, described on a 3-tiered scale, were evaluated through a choice-based Conjoint Analysis questionnaire with 25 choice tasks. Questionnaire results were analyzed to obtain importance scores (for each criterion) and mean utility values (for each criterion level), and the latter were then used to rank 16 zoonoses. While the most important criterion for both groups was “Severity of the disease in humans”, the second ranked criteria by the health professionals and students were “Economy” and “Treatment in humans”, respectively. Regarding the criterion “Control and Prevention”, health professionals tended to prioritize a disease when the control and preventive measures were described to be 95% effective, while students prioritized a disease if there were almost no control and preventive measures available. Bovine Spongiform Encephalopathy was the top-ranked disease by both groups. Health professionals and students agreed on the weighting of certain criteria such as “Severity” and “Treatment of disease in humans”, but disagreed on others such as “Economy” or “Control and Prevention”. Nonetheless, the overall disease ranking

  12. Deciding on success criteria for predictability of pharmacokinetic parameters from in vitro studies: an analysis based on in vivo observations.

    PubMed

    Abduljalil, Khaled; Cain, Theresa; Humphries, Helen; Rostami-Hodjegan, Amin

    2014-09-01

    Prediction accuracy of pharmacokinetic parameters is often assessed using prediction fold error, i.e., being within 2-, 3-, or n-fold of observed values. However, published studies disagree on which fold error represents an accurate prediction. In addition, "observed data" from only one clinical study are often used as the gold standard for in vitro to in vivo extrapolation (IVIVE) studies, despite data being subject to significant interstudy variability and subjective selection from various available reports. The current study involved analysis of published systemic clearance (CL) and volume of distribution at steady state (Vss) values taken from over 200 clinical studies. These parameters were obtained for 17 different drugs after intravenous administration. Data were analyzed with emphasis on the appropriateness to use a parameter value from one particular clinical study to judge the performance of IVIVE and the ability of CL and Vss values obtained from one clinical study to "predict" the same values obtained in a different clinical study using the n-fold criteria for prediction accuracy. The twofold criteria method was of interest because it is widely used in IVIVE predictions. The analysis shows that in some cases the twofold criteria method is an unreasonable expectation when the observed data are obtained from studies with small sample size. A more reasonable approach would allow prediction criteria to include clinical study information such as sample size and the variance of the parameter of interest. A method is proposed that allows the "success" criteria to be linked to the measure of variation in the observed value. Copyright © 2014 by The American Society for Pharmacology and Experimental Therapeutics.
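
    A small sketch of the n-fold accuracy criterion discussed above, with placeholder values: it computes the fold error of predicted versus observed parameters and the fraction of predictions falling within two- and three-fold. Linking the acceptance threshold to study sample size and variance, as the paper proposes, is not implemented here.

      import numpy as np

      def fold_errors(predicted, observed):
          """Fold error per prediction and the fraction within 2- and 3-fold."""
          predicted = np.asarray(predicted, float)
          observed = np.asarray(observed, float)
          fe = np.maximum(predicted / observed, observed / predicted)  # always >= 1
          return fe, np.mean(fe <= 2.0), np.mean(fe <= 3.0)

      # e.g. predicted vs observed clearances (L/h) from hypothetical studies
      fe, within_2fold, within_3fold = fold_errors([12.0, 3.5, 40.0], [8.0, 9.0, 35.0])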

  13. Principle component analysis based hyperspectral image fusion in imaging spectropolarimeter

    NASA Astrophysics Data System (ADS)

    Ren, Wenyi; Wu, Dan; Jiang, Jiangang; Yang, Guoan; Zhang, Chunmin

    2017-02-01

    Image fusion is of great importance in object detection. A PCA-based image fusion method was proposed. A pixel-level average method and a wavelet-based method were implemented for a comparison study. Several performance metrics that do not require a reference image were used to evaluate the image fusion algorithms. It was concluded that image fusion using the PCA-based method showed better performance.
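
    A common formulation of PCA-based fusion of two co-registered grayscale images is sketched below for orientation (the abstract does not give the authors' exact implementation): the fusion weights are the components of the leading eigenvector of the 2x2 covariance matrix of the two source images.

      import numpy as np

      def pca_fuse(img_a, img_b):
          """Weighted-sum fusion with weights from the leading principal component."""
          a = img_a.astype(float).ravel()
          b = img_b.astype(float).ravel()
          cov = np.cov(np.vstack([a, b]))
          eigvals, eigvecs = np.linalg.eigh(cov)
          v = np.abs(eigvecs[:, np.argmax(eigvals)])   # leading eigenvector
          w = v / v.sum()                              # normalize weights to sum to 1
          fused = w[0] * img_a.astype(float) + w[1] * img_b.astype(float)
          return fused, w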

  14. BioImage Suite: An integrated medical image analysis suite: An update.

    PubMed

    Papademetris, Xenophon; Jackowski, Marcel P; Rajeevan, Nallakkandi; DiStasio, Marcello; Okuda, Hirohito; Constable, R Todd; Staib, Lawrence H

    2006-01-01

    BioImage Suite is an NIH-supported medical image analysis software suite developed at Yale. It leverages both the Visualization Toolkit (VTK) and the Insight Toolkit (ITK) and it includes many additional algorithms for image analysis especially in the areas of segmentation, registration, diffusion weighted image processing and fMRI analysis. BioImage Suite has a user-friendly user interface developed in the Tcl scripting language. A final beta version is freely available for download.

  15. Computerized image analysis of digitized infrared images of breasts from a scanning infrared imaging system

    NASA Astrophysics Data System (ADS)

    Head, Jonathan F.; Lipari, Charles A.; Elliot, Robert L.

    1998-10-01

    Infrared imaging of the breasts has been shown to be of value in risk assessment, detection, diagnosis and prognosis of breast cancer. However, infrared imaging has not been widely accepted for a variety of reasons, including the lack of standardization of the subjective visual analysis method. The subjective nature of the standard visual analysis makes it difficult to achieve equivalent results with different equipment and different interpreters of the infrared patterns of the breasts. Therefore, this study was undertaken to develop more objective analysis methods for infrared images of the breasts by creating objective semiquantitative and quantitative analyses of mean temperatures of whole breasts and breast quadrants determined by computer-assisted image analysis. When using objective quantitative data on whole breasts (comparing differences in means of left and right breasts), semiquantitative data on quadrants of the breast (determining an index by summation of scores for each quadrant), or summation of quantitative data on quadrants of the breasts, there was a decrease in the number of abnormal patterns (positives) in patients being screened for breast cancer and an increase in the number of abnormal patterns (true positives) in the breast cancer patients. It is hoped that the decrease in positives in women being screened for breast cancer will translate into a decrease in the false positives, but larger numbers of women with longer follow-up will be needed to clarify this. Also, a much larger group of breast cancer patients will need to be studied in order to see if there is a true increase in the percentage of breast cancer patients presenting with abnormal infrared images of the breast with these objective image analysis methods.
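
    As a rough illustration of the objective quadrant analysis described above, the sketch below computes whole-breast and per-quadrant mean temperatures and their left-right differences from two cropped temperature arrays; segmentation, scoring thresholds and the semiquantitative index are not reproduced, and all names are assumptions.

      import numpy as np

      def breast_quadrant_means(left_img, right_img):
          """Mean temperature per breast and per quadrant, plus left-right differences.
          Inputs are 2-D temperature arrays, each cropped to one breast."""
          def quadrants(img):
              h, w = img.shape
              return {"UL": img[:h // 2, :w // 2].mean(), "UR": img[:h // 2, w // 2:].mean(),
                      "LL": img[h // 2:, :w // 2].mean(), "LR": img[h // 2:, w // 2:].mean()}
          left_q, right_q = quadrants(left_img), quadrants(right_img)
          whole_diff = abs(left_img.mean() - right_img.mean())
          quadrant_diff = {k: abs(left_q[k] - right_q[k]) for k in left_q}
          return whole_diff, quadrant_diff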

  16. Machine learning for medical images analysis.

    PubMed

    Criminisi, A

    2016-10-01

    This article discusses the application of machine learning for the analysis of medical images. Specifically: (i) We show how a special type of learning models can be thought of as automatically optimized, hierarchically-structured, rule-based algorithms, and (ii) We discuss how the issue of collecting large labelled datasets applies to both conventional algorithms as well as machine learning techniques. The size of the training database is a function of model complexity rather than a characteristic of machine learning methods.

  17. Global Methods for Image Motion Analysis

    DTIC Science & Technology

    1992-10-01

    ... of images to determine egomotion and to extract information from the scene. Research in motion analysis has been focussed on the problems of

  18. Follow-up of multicentric HCC according to the mRECIST criteria: role of 320-Row CT with semi-automatic 3D analysis software for evaluating the response to systemic therapy

    PubMed Central

    TELEGRAFO, M.; DILORENZO, G.; DI GIOVANNI, G.; CORNACCHIA, I.; STABILE IANORA, A.A.; ANGELELLI, G.; MOSCHETTA, M.

    2016-01-01

    Aim To evaluate the role of 320-detector row computed tomography (MDCT) with 3D analysis software in the follow-up of patients affected by multicentric hepatocellular carcinoma (HCC) treated with systemic therapy by using modified response evaluation criteria in solid tumors (mRECIST). Patients and methods 38 patients affected by multicentric HCC underwent MDCT. All exams were performed before and after iodinated contrast material intravenous injection by using a 320-detector row CT device. CT images were analyzed by two radiologists using multi-planar reconstructions (MPR) in order to assess the response to systemic therapy according to mRECIST criteria: complete response (CR), partial response (PR), progressive disease (PD), stable disease (SD). 30 days later, the same two radiologists evaluated target lesion response to systemic therapy according to mRECIST criteria by using 3D analysis software. The difference between the two systems in assessing HCC response to therapy was assessed by analysis of variance (ANOVA). Interobserver agreement between the two radiologists by using MPR images and 3D analysis software was calculated by using Cohen’s Kappa test. Results PR occurred in 10/38 cases (26%), PD in 6/38 (16%), SD in 22/38 (58%). ANOVA showed no statistically significant difference between the two systems for assessing target lesion response to therapy (p >0.05). Inter-observer agreement (k) was 0.62 for MPR image measurements and 0.86 for 3D analysis measurements. Conclusions 3D analysis software provides a semiautomatic system for assessing target lesion response to therapy according to mRECIST criteria in patients affected by multifocal HCC treated with systemic therapy. The reliability of 3D analysis software makes it useful in clinical practice. PMID:28098056
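
    For reference, inter-observer agreement of categorical mRECIST ratings can be quantified with Cohen's kappa as sketched below; the ratings shown are placeholders, not the study's data.

      from sklearn.metrics import cohen_kappa_score

      # Illustrative agreement between two readers on response categories (CR, PR, SD, PD)
      reader1 = ["PR", "SD", "SD", "PD", "PR", "SD", "SD", "PD", "SD", "PR"]
      reader2 = ["PR", "SD", "PD", "PD", "PR", "SD", "SD", "SD", "SD", "PR"]

      kappa = cohen_kappa_score(reader1, reader2)
      print(f"Cohen's kappa = {kappa:.2f}")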

  19. Delirium diagnosis defined by cluster analysis of symptoms versus diagnosis by DSM and ICD criteria: diagnostic accuracy study.

    PubMed

    Sepulveda, Esteban; Franco, José G; Trzepacz, Paula T; Gaviria, Ana M; Meagher, David J; Palma, José; Viñuelas, Eva; Grau, Imma; Vilella, Elisabet; de Pablo, Joan

    2016-05-26

    Information on validity and reliability of delirium criteria is necessary for clinicians, researchers, and further developments of DSM or ICD. We compare four DSM and ICD delirium diagnostic criteria versions, which were developed by consensus of experts, with a phenomenology-based natural diagnosis delineated using cluster analysis of delirium features in a sample with a high prevalence of dementia. We also measured inter-rater reliability of each system when applied by two evaluators from distinct disciplines. Cross-sectional analysis of 200 consecutive patients admitted to a skilled nursing facility, independently assessed within 24-48 h after admission with the Delirium Rating Scale-Revised-98 (DRS-R98) and for DSM-III-R, DSM-IV, DSM-5, and ICD-10 criteria for delirium. Cluster analysis (CA) delineated natural delirium and nondelirium reference groups using DRS-R98 items, and then each diagnostic system's performance was evaluated against the CA-defined groups using logistic regression and crosstabs for discriminant analysis (sensitivity, specificity, percentage of subjects correctly classified by each diagnostic system and their individual criteria, and performance for each system when excluding each individual criterion are reported). Kappa Index (K) was used to report inter-rater reliability for delirium diagnostic systems and their individual criteria. 117 (58.5 %) patients had preexisting dementia according to the Informant Questionnaire on Cognitive Decline in the Elderly. CA delineated 49 delirium subjects and 151 nondelirium. Against these CA groups, delirium diagnosis accuracy was highest using DSM-III-R (87.5 %) followed closely by DSM-IV (86.0 %), ICD-10 (85.5 %) and DSM-5 (84.5 %). ICD-10 had the highest specificity (96.0 %) but lowest sensitivity (53.1 %). DSM-III-R had the best sensitivity (81.6 %) and the best sensitivity-specificity balance. DSM-5 had the highest inter-rater reliability (K = 0.73) while DSM-III-R criteria were the least

  20. Applying air pollution modelling within a multi-criteria decision analysis framework to evaluate UK air quality policies

    NASA Astrophysics Data System (ADS)

    Chalabi, Zaid; Milojevic, Ai; Doherty, Ruth M.; Stevenson, David S.; MacKenzie, Ian A.; Milner, James; Vieno, Massimo; Williams, Martin; Wilkinson, Paul

    2017-10-01

    A decision support system for evaluating UK air quality policies is presented. It combines the output from a chemistry transport model, a health impact model and other impact models within a multi-criteria decision analysis (MCDA) framework. As a proof-of-concept, the MCDA framework is used to evaluate and compare idealized emission reduction policies in four sectors (combustion in energy and transformation industries, non-industrial combustion plants, road transport and agriculture) and across six outcomes or criteria (mortality, health inequality, greenhouse gas emissions, biodiversity, crop yield and air quality legal compliance). To illustrate a realistic use of the MCDA framework, the relative importance of the criteria were elicited from a number of stakeholders acting as proxy policy makers. In the prototype decision problem, we show that reducing emissions from industrial combustion (followed very closely by road transport and agriculture) is more advantageous than equivalent reductions from the other sectors when all the criteria are taken into account. Extensions of the MCDA framework to support policy makers in practice are discussed.
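
    A minimal weighted-sum sketch of the MCDA aggregation step, using the sector and criterion names from the abstract but entirely hypothetical scores and weights (the study's elicited weights and value functions are not given here).

      import numpy as np

      sectors = ["industrial combustion", "non-industrial combustion", "road transport", "agriculture"]
      criteria = ["mortality", "inequality", "GHG", "biodiversity", "crop yield", "compliance"]

      # hypothetical benefit scores per sector and criterion, higher = better
      scores = np.array([[0.9, 0.7, 0.8, 0.6, 0.5, 0.8],
                         [0.6, 0.5, 0.5, 0.4, 0.4, 0.6],
                         [0.8, 0.8, 0.7, 0.5, 0.5, 0.9],
                         [0.7, 0.6, 0.6, 0.8, 0.7, 0.5]])
      weights = np.array([0.30, 0.15, 0.20, 0.10, 0.10, 0.15])   # stand-in for elicited weights

      overall = scores @ weights                      # additive aggregation
      ranking = [sectors[i] for i in np.argsort(overall)[::-1]]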

  1. Skin age testing criteria: characterization of human skin structures by 500 MHz MRI multiple contrast and image processing.

    PubMed

    Sharma, Rakesh

    2010-07-21

    Ex vivo magnetic resonance microimaging (MRM) image characteristics are reported in human skin samples in different age groups. Human excised skin samples were imaged using a custom coil placed inside a 500 MHz NMR imager for high-resolution microimaging. Skin MRI images were processed for characterization of different skin structures. Contiguous cross-sectional T1-weighted 3D spin echo MRI, T2-weighted 3D spin echo MRI and proton density images were compared with skin histopathology and NMR peaks. In all skin specimens, epidermis and dermis thickening and hair follicle size were measured using MRM. Optimized parameters TE and TR and multicontrast enhancement generated better MRI visibility of different skin components. Within high MR signal regions near to the custom coil, MRI images with short echo time were comparable with digitized histological sections for skin structures of the epidermis, dermis and hair follicles in 6 (67%) of the nine specimens. Skin % tissue composition, measurement of the epidermis, dermis, sebaceous gland and hair follicle size, and skin NMR peaks were signatures of skin type. The image processing determined the dimensionality of skin tissue components and skin typing. The ex vivo MRI images and histopathology of the skin may be used to measure the skin structure, and skin NMR peaks with image processing may be a tool for determining skin typing and skin composition.

  2. Skin age testing criteria: characterization of human skin structures by 500 MHz MRI multiple contrast and image processing

    NASA Astrophysics Data System (ADS)

    Sharma, Rakesh

    2010-07-01

    Ex vivo magnetic resonance microimaging (MRM) image characteristics are reported in human skin samples in different age groups. Human excised skin samples were imaged using a custom coil placed inside a 500 MHz NMR imager for high-resolution microimaging. Skin MRI images were processed for characterization of different skin structures. Contiguous cross-sectional T1-weighted 3D spin echo MRI, T2-weighted 3D spin echo MRI and proton density images were compared with skin histopathology and NMR peaks. In all skin specimens, epidermis and dermis thickening and hair follicle size were measured using MRM. Optimized parameters TE and TR and multicontrast enhancement generated better MRI visibility of different skin components. Within high MR signal regions near to the custom coil, MRI images with short echo time were comparable with digitized histological sections for skin structures of the epidermis, dermis and hair follicles in 6 (67%) of the nine specimens. Skin % tissue composition, measurement of the epidermis, dermis, sebaceous gland and hair follicle size, and skin NMR peaks were signatures of skin type. The image processing determined the dimensionality of skin tissue components and skin typing. The ex vivo MRI images and histopathology of the skin may be used to measure the skin structure, and skin NMR peaks with image processing may be a tool for determining skin typing and skin composition.

  3. Image Correlation: Part 1. Simulation and Analysis

    DTIC Science & Technology

    1976-11-01

    Cover-page fragment only: Rand report R-2057/1-PR (November 1976), Image Correlation: Part I. Simulation and Analysis, H. H. Bailey, F. W. Blackwell; prepared for United States Air Force Project RAND (Deputy Chief of Staff, Research and Development, Hq USAF). Reports of The Rand Corporation do not necessarily reflect the opinions or policies of the sponsors of Rand research.

  4. Applying Multiple Criteria Decision Analysis to Comparative Benefit-Risk Assessment: Choosing among Statins in Primary Prevention.

    PubMed

    Tervonen, Tommi; Naci, Huseyin; van Valkenhoef, Gert; Ades, Anthony E; Angelis, Aris; Hillege, Hans L; Postmus, Douwe

    2015-10-01

    Decision makers in different health care settings need to weigh the benefits and harms of alternative treatment strategies. Such health care decisions include marketing authorization by regulatory agencies, practice guideline formulation by clinical groups, and treatment selection by prescribers and patients in clinical practice. Multiple criteria decision analysis (MCDA) is a family of formal methods that help make explicit the tradeoffs that decision makers accept between the benefit and risk outcomes of different treatment options. Despite the recent interest in MCDA, certain methodological aspects are poorly understood. This paper presents 7 guidelines for applying MCDA in benefit-risk assessment and illustrates their use in the selection of a statin drug for the primary prevention of cardiovascular disease. We provide guidance on the key methodological issues of how to define the decision problem, how to select a set of nonoverlapping evaluation criteria, how to synthesize and summarize the evidence, how to translate relative measures to absolute ones that permit comparisons between the criteria, how to define suitable scale ranges, how to elicit partial preference information from the decision makers, and how to incorporate uncertainty in the analysis. Our example on statins indicates that fluvastatin is likely to be the most preferred drug by our decision maker and that this result is insensitive to the amount of preference information incorporated in the analysis.

  5. Criteria for the use of regression analysis for remote sensing of sediment and pollutants

    NASA Technical Reports Server (NTRS)

    Whitlock, C. H.; Kuo, C. Y.; Lecroy, S. R.

    1982-01-01

    An examination of limitations, requirements, and precision of the linear multiple-regression technique for quantification of marine environmental parameters is conducted. Both environmental and optical physics conditions have been defined for which an exact solution to the signal response equations is of the same form as the multiple regression equation. Various statistical parameters are examined to define a criteria for selection of an unbiased fit when upwelled radiance values contain error and are correlated with each other. Field experimental data are examined to define data smoothing requirements in order to satisfy the criteria of Daniel and Wood (1971). Recommendations are made concerning improved selection of ground-truth locations to maximize variance and to minimize physical errors associated with the remote sensing experiment.
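
    A minimal sketch of the linear multiple-regression step the paper examines, assuming ground-truth concentrations and multi-band upwelled radiance as inputs; the statistical screening criteria after Daniel and Wood (1971) and the ground-truth siting recommendations are not implemented, and all names are illustrative.

      import numpy as np

      def fit_sediment_regression(radiance, concentration):
          """Multiple linear regression of a water-quality parameter on band radiances.

          radiance      : (n_samples, n_bands) array of upwelled radiance values
          concentration : (n_samples,) array of ground-truth values
          Returns the regression coefficients (intercept first) and R^2.
          """
          X = np.column_stack([np.ones(len(concentration)), radiance])   # add intercept
          coef, *_ = np.linalg.lstsq(X, concentration, rcond=None)
          fitted = X @ coef
          ss_res = np.sum((concentration - fitted) ** 2)
          ss_tot = np.sum((concentration - concentration.mean()) ** 2)
          return coef, 1.0 - ss_res / ss_tot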

  6. Tomographic spectral imaging: analysis of localized corrosion.

    SciTech Connect

    Michael, Joseph Richard; Kotula, Paul Gabriel; Keenan, Michael Robert

    2005-02-01

    Microanalysis is typically performed to analyze the near surface of materials. There are many instances where chemical information about the third spatial dimension is essential to the solution of materials analyses. The majority of 3D analyses, however, focus on limited spectral acquisition and/or analysis. For truly comprehensive 3D chemical characterization, 4D spectral images (a complete spectrum from each volume element of a region of a specimen) are needed. Furthermore, a robust statistical method is needed to extract the maximum amount of chemical information from that extremely large amount of data. In this paper, an example of the acquisition and multivariate statistical analysis of 4D (3-spatial and 1-spectral dimension) x-ray spectral images is described. The method of utilizing a single- or dual-beam FIB (without or with an SEM) to get at 3D chemistry has been described by others with respect to secondary-ion mass spectrometry. The basic methodology described in those works has been modified for comprehensive x-ray microanalysis in a dual-beam FIB/SEM (FEI Co. DB-235). In brief, the FIB is used to serially section a site-specific region of a sample and then the electron beam is rastered over the exposed surfaces with x-ray spectral images being acquired at each section. All this is performed without rotating or tilting the specimen between FIB cutting and SEM imaging/x-ray spectral image acquisition. The resultant 4D spectral image is then unfolded (number of volume elements by number of channels) and subjected to the same multivariate curve resolution (MCR) approach that has proven successful for the analysis of lower-dimension x-ray spectral images. The TSI data sets can be in excess of 4 Gbytes. This problem has been overcome (for now) and images up to 6 Gbytes have been analyzed in this work. The method for analyzing such large spectral images will be described in this presentation. A comprehensive 3D chemical analysis was performed on several corrosion specimens.
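
    As an illustration of the unfolding step described above, the sketch below reshapes a 4D spectral image into a voxels-by-channels matrix and factors it into component spectra and abundance maps. Non-negative matrix factorization is used here only as a simple stand-in for the MCR analysis named in the abstract, and all names and parameters are assumptions.

      import numpy as np
      from sklearn.decomposition import NMF

      def unfold_and_factor(spectral_image_4d, n_components=4):
          """Unfold (nz, ny, nx, n_channels) x-ray counts and factor them.

          Counts are non-negative, so NMF can serve as a simple surrogate for MCR;
          returns abundance maps (nz, ny, nx, n_components) and component spectra.
          """
          nz, ny, nx, nch = spectral_image_4d.shape
          data = spectral_image_4d.reshape(nz * ny * nx, nch)
          model = NMF(n_components=n_components, init="nndsvda", max_iter=400)
          abundances = model.fit_transform(data)      # voxels x components
          spectra = model.components_                 # components x channels
          return abundances.reshape(nz, ny, nx, n_components), spectra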

  7. Criteria and Thresholds for U.S. Navy Acoustic and Explosive Effects Analysis

    DTIC Science & Technology

    2012-04-01

    Cheloniidae (loggerhead, green, hawksbill, Kemp's ridley, olive ridley, flatback sea turtle); Family Dermochelyidae (leatherback sea turtle). Criteria ... Southwood, Higgins et al. (2007); Chelonia mydas, green turtle, 8.7, Wood and Wood (1993); Eretmochelys imbricata, hawksbill turtle, 7.4, Okuyama, Shimizu et ... Journal of Experimental Biology 208: 4181-4188. Okuyama, J., T. Shimizu, et al. (2010). "Wild Versus Head-Started Hawksbill Turtles Eretmochelys

  8. F-106 data summary and model results relative to threat criteria and protection design analysis

    NASA Technical Reports Server (NTRS)

    Pitts, F. L.; Finelli, G. B.; Perala, R. A.; Rudolph, T. H.

    1986-01-01

    The NASA F-106 has acquired considerable data on the rates-of-change of EM parameters on the aircraft surface during 690 direct lightning strikes while penetrating thunderstorms at altitudes from 15,000 to 40,000 feet. The data are presently being used in updating previous lightning criteria and standards. The new lightning standards will, therefore, be the first which reflect actual aircraft responses measured at flight altitudes.

  9. Image analysis from root system pictures

    NASA Astrophysics Data System (ADS)

    Casaroli, D.; Jong van Lier, Q.; Metselaar, K.

    2009-04-01

    Root research has been hampered by a lack of good methods and by the amount of time involved in making measurements. In general, root system studies are made with either the monolith or the minirhizotron method, which is used as a quantitative tool but requires comparison with conventional destructive methods. This work aimed to analyze root system images, obtained from a root atlas book, for different crops in order to find the root length and root length density and correlate them with the literature. Images of five crops (Zea mays, Secale cereale, Triticum aestivum, Medicago sativa and Panicum miliaceum) were divided into horizontal and vertical layers. Root length distribution was analyzed for horizontal as well as vertical layers. In order to obtain the root length density, a cuboidal volume was assumed to correspond to each part of the image. The results from regression analyses showed root length distributions according to horizontal or vertical layers. It was possible to find the root length distribution for single horizontal layers as a function of vertical layers, and also for single vertical layers as a function of horizontal layers. Regression analysis showed good fits when the root length distributions were grouped in horizontal layers according to the distance from the root center. When root length distributions were grouped according to soil horizons, the fits worsened. The resulting root length density estimates were lower than those commonly found in the literature, possibly due to (1) the fact that the crop images resulted from single plant situations, while the analyzed field experiments had more than one plant; (2) root overlapping may occur in the field; (3) root experiments, both in the field and image analyses as performed here, are subject to sampling errors; (4) the (hand-drawn) images used in this study may have omitted some of the smallest roots.

  10. Image analysis applied to luminescence microscopy

    NASA Astrophysics Data System (ADS)

    Maire, Eric; Lelievre-Berna, Eddy; Fafeur, Veronique; Vandenbunder, Bernard

    1998-04-01

    We have developed a novel approach to study luminescent light emission during migration of living cells by low-light imaging techniques. The equipment consists of an anti-vibration table with a hole for a direct output under the frame of an inverted microscope. The image is directly captured by an ultra-low-light-level photon-counting camera equipped with an image intensifier coupled by an optical fiber to a CCD sensor. This installation is dedicated to measuring, in a dynamic manner, the effect of SF/HGF (Scatter Factor/Hepatocyte Growth Factor) both on activation of gene promoter elements and on cell motility. Epithelial cells were stably transfected with promoter elements containing Ets transcription factor-binding sites driving a luciferase reporter gene. Luminescent light emitted by individual cells was measured by image analysis. Images of luminescent spots were acquired with a high aperture objective and exposure times of 10-30 min in photon-counting mode. The sensitivity of the camera was adjusted to a high value, which required the use of a segmentation algorithm dedicated to eliminating the background noise. Hence, image segmentation and treatments by mathematical morphology were particularly indicated in these experimental conditions. In order to estimate the orientation of cells during their migration, we used a dedicated skeleton algorithm applied to the oblong spots of variable intensities emitted by the cells. Kinetic changes of luminescent sources, distance and speed of migration were recorded and then correlated with cellular morphological changes for each spot. Our results highlight the usefulness of mathematical morphology to quantify kinetic changes in luminescence microscopy.

  11. Prevalence of restless legs syndrome in Ankara, Turkey: an analysis of diagnostic criteria and awareness.

    PubMed

    Yilmaz, Nesrin Helvaci; Akbostanci, Muhittin Cenk; Oto, Aycan; Aykac, Ozlem

    2013-09-01

    The aim of this study was threefold: (1) to investigate the prevalence of restless legs syndrome (RLS) in Ankara, Turkey; (2) to determine the predictive values of the diagnostic criteria; and (3) to determine the frequency of physician referrals and the frequency of receiving the correct diagnosis. A total of 815 individuals above the age of 15, from randomly selected addresses, were reached using a questionnaire composed of the four diagnostic criteria. Individuals who answered 'yes' to at least one question were interviewed by neurologists for the diagnosis of RLS. The frequency of physician referrals and the frequency of receiving the correct diagnosis were also determined for patients given the final diagnosis of RLS. The prevalence of RLS in Ankara was 5.52%; 41.0% of the individuals diagnosed with RLS had replied 'yes' to only one, two or three of the questions asked by the interviewers, whereas only 21.3% of individuals who replied 'yes' to all four questions received the diagnosis of RLS. Among the patients with a final diagnosis of RLS, 25.7% had consulted a physician for the symptoms and 22.2% had received the correct diagnosis. The RLS prevalence in Ankara lay between that reported for Western and Far Eastern countries, compatible with the geographical location. The diagnostic criteria may not be fully predictive when applied by non-physician pollsters, and physicians' probability of correctly diagnosing RLS is still low.

  12. A multiple criteria analysis for household solid waste management in the urban community of Dakar.

    PubMed

    Kapepula, Ka-Mbayu; Colson, Gerard; Sabri, Karim; Thonart, Philippe

    2007-01-01

    Household solid waste management is a severe problem in big cities of developing countries. Mismanaged solid waste dumpsites produce adverse sanitary, ecological and economic consequences for the whole population, especially for the poorest urban inhabitants. To address this problem, this paper uses field data collected in the urban community of Dakar with a view to ranking nine areas of the city with respect to multiple criteria of nuisance. Nine criteria are built and organized into three families representing three classical viewpoints: the production of wastes, their collection and their treatment. Using the PROMETHEE method and the ARGOS software, we perform a pairwise comparison of the nine areas, which allows their multiple criteria ranking according to each viewpoint and then globally. Our final purpose is to identify the worst and best areas in terms of nuisance so that waste management in the city can be improved in a way that fits the needs of the urban community as closely as possible. Based on field knowledge and on the literature, we suggest applying general and area-specific remedies to the household solid waste problems.
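
    A minimal PROMETHEE II sketch of the pairwise-comparison step is given below; the area scores, criteria and weights are invented for illustration, and the ARGOS software is not reproduced.

    ```python
    # Minimal PROMETHEE II sketch with a "usual" preference function; the areas,
    # criterion values and weights below are made up and do not come from the Dakar study.
    import numpy as np

    scores = np.array([   # rows = areas, columns = criteria (higher = better performance)
        [0.2, 0.5, 0.9],
        [0.7, 0.3, 0.4],
        [0.5, 0.8, 0.1],
    ])
    weights = np.array([0.5, 0.3, 0.2])   # criterion weights, summing to 1

    n = len(scores)
    pref = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Usual preference function: full preference whenever i strictly beats j on a criterion.
            pref[i, j] = np.sum(weights * (scores[i] > scores[j]))

    phi_plus = pref.sum(axis=1) / (n - 1)    # positive outranking flow
    phi_minus = pref.sum(axis=0) / (n - 1)   # negative outranking flow
    net_flow = phi_plus - phi_minus          # PROMETHEE II net flow
    print("ranking (best first):", np.argsort(-net_flow))
    ```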

  13. Criteria and indicators for the assessment of community forestry outcomes: a comparative analysis from Canada.

    PubMed

    Teitelbaum, Sara

    2014-01-01

    In Canada, there are few structured evaluations of community forestry despite more than twenty years of practice. This article presents a criteria and indicator framework, designed to elicit descriptive information about the types of socio-economic results being achieved by community forests in the Canadian context. The criteria and indicators framework draws on themes proposed by other researchers both in the field of community forestry and related areas. The framework is oriented around three concepts described as amongst the underlying objectives of community forestry, namely participatory governance, local economic benefits and multiple forest use. This article also presents the results of a field-based application of the criteria and indicators framework, comparing four case studies in three Canadian provinces. All four are community forests with direct tenure rights to manage and benefit from forestry activities. Results reveal that in terms of governance, the case studies adhere to two different models, which we name 'interest group' vs. 'local government'. Stronger participatory dimensions are evident in two case studies. In the area of local economic benefits, the four case studies perform similarly, with some of the strongest benefits being in employment creation, especially for those case studies that offer non-timber activities such as recreation and education. Two of four cases have clearly adopted a multiple-use approach to management. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. ACR Appropriateness Criteria Myelopathy.

    PubMed

    Roth, Christopher J; Angevine, Peter D; Aulino, Joseph M; Berger, Kevin L; Choudhri, Asim F; Fries, Ian Blair; Holly, Langston T; Kendi, Ayse Tuba Karaqulle; Kessler, Marcus M; Kirsch, Claudia F; Luttrull, Michael D; Mechtler, Laszlo L; O'Toole, John E; Sharma, Aseem; Shetty, Vilaas S; West, O Clark; Cornelius, Rebecca S; Bykowski, Julie

    2016-01-01

    Patients presenting with myelopathic symptoms may have a number of causative intradural and extradural etiologies, including disc degenerative diseases, spinal masses, infectious or inflammatory processes, vascular compromise, and vertebral fracture. Patients may present acutely or insidiously and may progress toward long-term paralysis if not treated promptly and effectively. Noncontrast CT is the most appropriate first examination in acute trauma cases to diagnose vertebral fracture as the cause of acute myelopathy. In most nontraumatic cases, MRI is the modality of choice to evaluate the location, severity, and causative etiology of spinal cord myelopathy, and to predict which patients may benefit from surgery. Myelopathy from spinal stenosis and spinal osteoarthritis is best confirmed with MRI without intravenous contrast, whereas many other myelopathic conditions are more easily visualized after contrast administration. Imaging performed should be limited to the appropriate spinal levels, based on history, physical examination, and clinical judgment. The ACR Appropriateness Criteria are evidence-based guidelines for specific clinical conditions that are reviewed every three years by a multidisciplinary expert panel. The guideline development and review include an extensive analysis of current medical literature from peer-reviewed journals and the application of a well-established consensus methodology (modified Delphi) to rate the appropriateness of imaging and treatment procedures by the panel. In those instances in which evidence is lacking or not definitive, expert opinion may be used to recommend imaging or treatment.

  15. Integrative analysis of cutaneous skin tumours using ultrasonographic criteria. Preliminary results.

    PubMed

    Crişan, Diana; Badea, Alexandru Florin; Crişan, Maria; Rastian, Ioana; Gheuca Solovastru, Laura; Badea, Radu

    2014-12-01

    The aim of this study was to identify the ultrasonographic (US) features of skin tumors, especially morphological and vascular ones, in order to develop an integrative and differentiating imaging model for benign and malignant skin tumors. Twenty-three patients with solid skin tumors were included in the study. The diagnostic procedures were clinical examination, dermoscopy, multimodal ultrasonography (US) using high-frequency and conventional US, contact elastography, and i.v. contrast-enhanced ultrasound (CEUS). The US characteristics of the basal cell carcinomas were: hypoechoic, inhomogeneous masses, with hyperechoic or anechoic areas depending on the histological differentiation, increased rigidity, uneven vascularization at Doppler examination, a central or mixed circulatory model with 1-2 supply vessels, velocity >2 cm/s, intensely inhomogeneous loading of the contrast agent (CA) and a quick wash-out time. The benign tumors were hypoechoic or echoic masses with inhomogeneous structure, Doppler signal present only in dermatofibromas, a peripheral circulation model, velocities <2.00 cm/s, weak and uneven loading of the CA in the vascular bed, and a slow wash-out time. Analysis of the CA dynamics evidenced a significant difference in wash-out time between the malignant tumors (38.2 s +/- 15.15) and the benign ones (54.2 s +/- 8.5). In particular, tumor thickness measured by HFUS provides an ultrasound index that may be considered a statistically significant (p<0.05), highly correlated (r=0.97) predictive factor for the non-invasive assessment of the histological Breslow index. Elastography did not provide differentiation in the cases studied. Ultrasound allows a complex, multimodal approach to skin tumors, which complements clinical and histological examinations, guides therapeutic management and may help assess therapeutic efficacy and tumoral prognosis.

  16. Automatic dirt trail analysis in dermoscopy images.

    PubMed

    Cheng, Beibei; Joe Stanley, R; Stoecker, William V; Osterwise, Christopher T P; Stricklin, Sherea M; Hinton, Kristen A; Moss, Randy H; Oliviero, Margaret; Rabinovitz, Harold S

    2013-02-01

    Basal cell carcinoma (BCC) is the most common cancer in the US. Dermatoscopes are devices used by physicians to facilitate the early detection of these cancers based on the identification of skin lesion structures often specific to BCCs. One new lesion structure, referred to as dirt trails, has the appearance of dark gray, brown or black dots and clods of varying sizes distributed in elongated clusters with indistinct borders, often appearing as curvilinear trails. In this research, we explore a dirt trail detection and analysis algorithm for extracting, measuring, and characterizing dirt trails based on size, distribution, and color in dermoscopic skin lesion images. These dirt trails are then used to automatically discriminate BCC from benign skin lesions. For an experimental data set of 35 BCC images with dirt trails and 79 benign lesion images, a neural network-based classifier achieved an area under the receiver operating characteristic curve of 0.902 using a leave-one-out approach. Results obtained from this study show that automatic detection of dirt trails in dermoscopic images of BCC is feasible. This is important because of the large number of these skin cancers seen every year and the challenge of detecting them earlier with instrumentation. © 2011 John Wiley & Sons A/S.
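
    The following sketch shows a leave-one-out evaluation of a small neural-network classifier summarized by the area under the ROC curve, analogous in spirit to the experiment above; the feature matrix is synthetic, not the dirt-trail features of the paper.

    ```python
    # Illustrative leave-one-out evaluation of a neural-network classifier; the
    # features are synthetic stand-ins, not the paper's dirt-trail measurements.
    import numpy as np
    from sklearn.model_selection import LeaveOneOut
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(114, 6))              # 35 "BCC" + 79 "benign" lesions, 6 features
    y = np.array([1] * 35 + [0] * 79)
    X[y == 1] += 0.8                           # give the classes some separation

    scores = np.zeros(len(y))
    for train, test in LeaveOneOut().split(X):
        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000, random_state=0)
        clf.fit(X[train], y[train])
        scores[test] = clf.predict_proba(X[test])[:, 1]

    print("leave-one-out AUC:", roc_auc_score(y, scores))
    ```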

  17. Hyperspectral imaging technology for pharmaceutical analysis

    NASA Astrophysics Data System (ADS)

    Hamilton, Sara J.; Lodder, Robert A.

    2002-06-01

    The sensitivity and spatial resolution of hyperspectral imaging instruments are tested in this paper using pharmaceutical applications. The first experiment tested the hypothesis that a near-IR tunable diode-based remote sensing system is capable of monitoring degradation of hard gelatin capsules at a relatively long distance; spectra from the capsules were used to differentiate among capsules according to their atmospheric exposure. A second experiment showed that imaging spectrometry of tablets permits the identity and composition of multiple individual tablets to be determined simultaneously. A near-IR camera was used to collect thousands of spectra simultaneously from a field of blister-packaged tablets, and the number of tablets that a typical near-IR camera can currently analyze simultaneously was estimated to be approximately 1300. The bootstrap error-adjusted single-sample technique chemometric-imaging algorithm was used to draw probability-density contour plots that revealed tablet composition. The single-capsule analysis provides an indication of how far apart the sample and instrumentation can be while still maintaining adequate S/N, while the multiple-sample imaging experiment gives an indication of how many samples can be analyzed simultaneously while maintaining adequate S/N and pixel coverage on each sample.

  18. Image analysis of Renaissance copperplate prints

    NASA Astrophysics Data System (ADS)

    Hedges, S. Blair

    2008-02-01

    From the fifteenth to the nineteenth centuries, prints were a common form of visual communication, analogous to photographs. Copperplate prints have many finely engraved black lines which were used to create the illusion of continuous tone. Line densities generally are 100-2000 lines per square centimeter and a print can contain more than a million total engraved lines 20-300 micrometers in width. Because hundreds to thousands of prints were made from a single copperplate over decades, variation among prints can have historical value. The largest variation is plate-related, which is the thinning of lines over successive editions as a result of plate polishing to remove time-accumulated corrosion. Thinning can be quantified with image analysis and used to date undated prints and books containing prints. Print-related variation, such as over-inking of the print, is a smaller but significant source. Image-related variation can introduce bias if images were differentially illuminated or not in focus, but improved imaging technology can limit this variation. The Print Index, the percentage of an area composed of lines, is proposed as a primary measure of variation. Statistical methods also are proposed for comparing and identifying prints in the context of a print database.
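
    A Print Index of the kind proposed above (the percentage of an area composed of engraved lines) could be computed along the lines of the sketch below; the Otsu threshold and the file name are assumptions, not the author's procedure.

    ```python
    # Sketch of a Print Index: the percentage of a selected area occupied by
    # engraved (dark) lines. Otsu thresholding stands in for whatever binarization
    # the original study applied; the file name is hypothetical.
    from skimage import io, color, filters

    def print_index(image_path, region=None):
        img = io.imread(image_path)
        gray = color.rgb2gray(img) if img.ndim == 3 else img
        if region is not None:                 # region = (row0, row1, col0, col1)
            r0, r1, c0, c1 = region
            gray = gray[r0:r1, c0:c1]
        lines = gray < filters.threshold_otsu(gray)   # dark pixels = engraved lines
        return 100.0 * lines.mean()

    # Example (hypothetical file):
    # print_index("copperplate_print.tif", region=(0, 500, 0, 500))
    ```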

  19. Multispectral laser imaging for advanced food analysis

    NASA Astrophysics Data System (ADS)

    Senni, L.; Burrascano, P.; Ricci, M.

    2016-07-01

    A hardware-software apparatus for food inspection capable of realizing multispectral NIR laser imaging at four different wavelengths is herein discussed. The system was designed to operate in a through-transmission configuration to detect the presence of unwanted foreign bodies inside samples, whether packed or unpacked. A modified Lock-In technique was employed to counterbalance the significant signal intensity attenuation due to transmission across the sample and to extract the multispectral information more efficiently. The NIR laser wavelengths used to acquire the multispectral images can be varied to deal with different materials and to focus on specific aspects. In the present work the wavelengths were selected after a preliminary analysis to enhance the image contrast between foreign bodies and food in the sample, thus identifying the location and nature of the defects. Experimental results obtained from several specimens, with and without packaging, are presented and the multispectral image processing as well as the achievable spatial resolution of the system are discussed.
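
    The sketch below shows plain lock-in (synchronous) demodulation, the principle underlying the modified Lock-In technique mentioned above; the sampling rate, modulation frequency and signal are synthetic placeholders.

    ```python
    # Sketch of standard lock-in (synchronous) demodulation of a weak modulated
    # transmission signal; all numbers are synthetic, not the apparatus settings.
    import numpy as np

    fs = 10_000.0                      # sampling rate (Hz)
    f_mod = 133.0                      # laser modulation frequency (Hz)
    t = np.arange(0, 1.0, 1 / fs)

    amplitude = 1e-3                   # weak transmitted intensity
    signal = amplitude * np.sin(2 * np.pi * f_mod * t) + 0.05 * np.random.randn(t.size)

    # Multiply by in-phase and quadrature references, then low-pass by averaging.
    i_comp = np.mean(signal * np.sin(2 * np.pi * f_mod * t))
    q_comp = np.mean(signal * np.cos(2 * np.pi * f_mod * t))
    recovered = 2 * np.hypot(i_comp, q_comp)   # estimate of the modulated amplitude

    print(f"true amplitude {amplitude:.1e}, recovered {recovered:.1e}")
    ```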

  20. Quantitative color analysis for capillaroscopy image segmentation.

    PubMed

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Amorosi, Beatrice; D'Alessio, Tommaso; Palma, Claudio

    2012-06-01

    This communication introduces a novel approach for quantitatively evaluating the role of color space decomposition in digital nailfold capillaroscopy analysis. It is clinically recognized that alterations of the capillary pattern at the periungual skin region are directly related to dermatologic and rheumatic diseases. The proposed algorithm for the segmentation of digital capillaroscopy images is optimized with respect to the choice of the color space and the contrast variation. Since the color space is a critical factor for segmenting low-contrast images, an exhaustive comparison between different color channels is conducted and a novel color channel combination is presented. Results from images of 15 healthy subjects are compared with annotated data, i.e. selected images approved by clinicians. From this comparison, a set of figures of merit is extracted that highlights the algorithm's capability to correctly segment capillaries, their shape and their number. Experimental tests show that the optimized procedure for capillary segmentation, based on a novel color channel combination, achieves values of average accuracy higher than 0.8 and extracts capillaries whose shape and granularity are acceptable. The obtained results are particularly encouraging for future developments on the classification of capillary patterns with respect to dermatologic and rheumatic diseases.

  1. Optimal site selection for sitting a solar park using multi-criteria decision analysis and geographical information systems

    NASA Astrophysics Data System (ADS)

    Georgiou, Andreas; Skarlatos, Dimitrios

    2016-07-01

    Among the renewable power sources, solar power is rapidly becoming popular because it is inexhaustible, clean, and dependable. It has also become more efficient since the power conversion efficiency of photovoltaic solar cells has increased. Following these trends, solar power will become more affordable in years to come and considerable investments are to be expected. Despite the size of solar plants, the siting procedure is a crucial factor for their efficiency and financial viability. Many aspects influence such a decision: legal, environmental, technical, and financial, to name a few. This paper describes a general integrated framework to evaluate land suitability for the optimal placement of photovoltaic solar power plants, based on a combination of a geographic information system (GIS), remote sensing techniques, and multi-criteria decision-making methods. An application of the proposed framework for the Limassol district in Cyprus is further illustrated. The combination of a GIS and multi-criteria methods produces an excellent analysis tool that creates an extensive database of spatial and non-spatial data, which is used to simplify and solve problems as well as to promote the use of multiple criteria. A set of environmental, economic, social, and technical constraints, based on recent Cypriot legislation, European Union policies, and expert advice, identifies the potential sites for solar park installation. The pairwise comparison method in the context of the analytic hierarchy process (AHP) is applied to estimate the criteria weights in order to establish their relative importance in site evaluation. In addition, four different methods to combine the information layers and check their sensitivity were used: the first considered all the criteria as equally important and assigned them equal weights, whereas the others grouped the criteria and graded them according to their perceived importance. The overall suitability of the study
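
    A minimal AHP sketch is given below: criterion weights are derived from a pairwise comparison matrix via the principal eigenvector and checked with a consistency ratio. The 4x4 matrix and the criterion names in the comment are illustrative assumptions, not the weights used in the Limassol study.

    ```python
    # AHP sketch: criterion weights from a pairwise comparison matrix via the
    # principal eigenvector, with a consistency-ratio check. The matrix is invented
    # (e.g., slope, irradiation, distance to grid, land cost).
    import numpy as np

    A = np.array([
        [1,   3,   5,   7],
        [1/3, 1,   3,   5],
        [1/5, 1/3, 1,   3],
        [1/7, 1/5, 1/3, 1],
    ], dtype=float)

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
    print("weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))
    ```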

  2. Multiple Criteria Decision Analysis (MCDA) for evaluating new medicines in Health Technology Assessment and beyond: The Advance Value Framework.

    PubMed

    Angelis, Aris; Kanavos, Panos

    2017-09-01

    Escalating drug prices have catalysed the generation of numerous "value frameworks" with the aim of informing payers, clinicians and patients on the assessment and appraisal process of new medicines for the purpose of coverage and treatment selection decisions. Although this is an important step towards a more inclusive Value Based Assessment (VBA) approach, aspects of these frameworks are based on weak methodologies and could potentially result in misleading recommendations or decisions. In this paper, a Multiple Criteria Decision Analysis (MCDA) methodological process, based on Multi Attribute Value Theory (MAVT), is adopted for building a multi-criteria evaluation model. A five-stage model-building process is followed, using a top-down "value-focused thinking" approach, involving literature reviews and expert consultations. A generic value tree is structured capturing decision-makers' concerns for assessing the value of new medicines in the context of Health Technology Assessment (HTA) and in alignment with decision theory. The resulting value tree (Advance Value Tree) consists of three levels of criteria (top-level criteria clusters, mid-level criteria, bottom-level sub-criteria or attributes) relating to five key domains that can be explicitly measured and assessed: (a) burden of disease, (b) therapeutic impact, (c) safety profile, (d) innovation level and (e) socioeconomic impact. A number of MAVT modelling techniques are introduced for operationalising (i.e. estimating) the model: scoring the alternative treatment options, assigning relative weights of importance to the criteria, and combining scores and weights. Overall, the combination of these MCDA modelling techniques for the elicitation and construction of value preferences across the generic value tree provides a new value framework (Advance Value Framework) enabling the comprehensive measurement of value in a structured and transparent way. Given its flexibility to meet diverse requirements and
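
    A simple additive MAVT scoring sketch follows; the domain names echo the five key domains listed above, but the weights, partial value scores and medicines are invented and do not reflect the Advance Value Framework's actual scales or elicitation process.

    ```python
    # Additive MAVT sketch: overall value = sum of weight x partial value score.
    # Domain names follow the value tree above; all numbers are illustrative.
    criteria = ["burden of disease", "therapeutic impact", "safety profile",
                "innovation level", "socioeconomic impact"]
    weights = [0.25, 0.35, 0.20, 0.10, 0.10]          # swing weights, summing to 1

    # Partial value scores on a 0-100 scale for two hypothetical medicines.
    scores = {
        "medicine A": [70, 85, 60, 40, 55],
        "medicine B": [55, 60, 90, 70, 65],
    }

    def overall_value(partial_scores, weights):
        return sum(w * v for w, v in zip(weights, partial_scores))

    for name, s in scores.items():
        print(name, round(overall_value(s, weights), 1))
    ```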

  3. Chosen aspects of multi-criteria analysis applied to support the choice of materials for building structures

    NASA Astrophysics Data System (ADS)

    Szafranko, E.

    2017-08-01

    When planning a building structure, dilemmas arise as to which construction and material solutions are feasible, and the decisions are not always obvious. A procedure for selecting the variant that will best satisfy the expectations of the investor and future users of a structure must be founded on mathematical methods. The following deserve special attention: the MCE methods, Hierarchical Analysis Methods and Weighting Methods. Another interesting solution, particularly useful when dealing with evaluations that take negative values into account, is the Indicator Method. MCE methods are relatively popular owing to the simplicity of the calculations and the ease of interpreting the results; once the input data have been properly prepared, they enable the user to compare the variants on the same level. In situations where an analysis involves a large number of data, it is more convenient to divide them into groups according to main criteria and subcriteria. This option is provided by hierarchical analysis methods, which are based on ordered sets of criteria evaluated in groups; in some cases this approach yields results that are superior and easier to read. If an analysis encompasses direct and indirect effects, the Indicator Method seems to be a justified choice for selecting the right solution: it is different in character, relies on weights and assessments of effects, and allows the user to evaluate the analyzed variants effectively. This article explains the methodology of conducting a multi-criteria analysis, showing its advantages and disadvantages. An example of calculations contained in the article shows what problems can be encountered when assessing various solutions regarding building materials and structures. For comparison, an analysis based on graphical methods developed by the author is presented.

  4. Simple Low Level Features for Image Analysis

    NASA Astrophysics Data System (ADS)

    Falcoz, Paolo

    As human beings, we perceive the world around us mainly through our eyes, and give what we see the status of “reality”; as such, we have historically tried to create ways of recording this reality so we could augment or extend our memory. From early attempts in photography, like the image produced in 1826 by the French inventor Nicéphore Niépce (Figure 2.1), to the latest high-definition camcorders, the number of recorded pieces of reality has increased exponentially, posing the problem of managing all that information. Most of the raw video material produced today has lost its memory-augmentation function, as it will hardly ever be viewed by any human; pervasive CCTVs are an example. They generate an enormous amount of data each day, but there is not enough “human processing power” to view them. Therefore the need for effective automatic image analysis tools is great, and a lot of effort has been put into them, both by academia and by industry. In this chapter, a review of some of the most important image analysis tools is presented.

  5. Nursing image: an evolutionary concept analysis.

    PubMed

    Rezaei-Adaryani, Morteza; Salsali, Mahvash; Mohammadi, Eesa

    2012-12-01

    A long-term challenge to the nursing profession is the concept of image. In this study, we used Rodgers' evolutionary concept analysis approach to analyze the concept of nursing image (NI). The aim of this concept analysis was to clarify the attributes, antecedents, consequences, and implications associated with the concept. We performed an integrative internet-based literature review to retrieve English-language literature published from 1980 to 2011. Findings showed that NI is a multidimensional, all-inclusive, paradoxical, dynamic, and complex concept. The media, invisibility, clothing style, nurses' behaviors, gender issues, and professional organizations are the most important antecedents of the concept. We found that NI is pivotal in staff recruitment and nursing shortage, resource allocation to nursing, nurses' job performance, workload, burnout and job dissatisfaction, violence against nurses, public trust, and salaries available to nurses. An in-depth understanding of the NI concept would assist nurses in eliminating negative stereotypes and building a more professional image for the nurse and the profession.

  6. Intra voxel analysis in magnetic resonance imaging.

    PubMed

    Ambrosanio, Michele; Baselice, Fabio; Ferraioli, Giampaolo; Lenti, Flavia; Pascazio, Vito

    2017-04-01

    A technique for analyzing the composition of each voxel in the magnetic resonance imaging (MRI) framework is presented. By combining different acquisitions, a novel methodology, called intra voxel analysis (IVA), is proposed for the detection of multiple tissues and the estimation of their spin-spin relaxation times. The methodology exploits the sparse Bayesian learning (SBL) approach in order to solve a highly underdetermined problem by imposing sparsity on the solution. IVA, developed for the spin echo imaging sequence, can be easily extended to any acquisition scheme. For validating the approach, simulated and real data sets are considered. Monte Carlo simulations have been implemented to evaluate the performance of IVA compared to methods existing in the literature, and two clinical datasets acquired with a 3T scanner have been considered for validating the approach. With respect to other approaches presented in the literature, IVA has proved to be more effective in voxel composition analysis, in particular in the case of few acquired images. The results are interesting and very promising: IVA is expected to have a remarkable impact on the research community and on the diagnostic field. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
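
    The sketch below illustrates the underlying intra-voxel mixture idea on a synthetic multi-echo decay: the voxel signal is expressed over a dictionary of candidate T2 values and fitted with non-negative least squares. The paper's method uses sparse Bayesian learning; NNLS, the echo times and the T2 grid here are simplifying assumptions.

    ```python
    # Simplified intra-voxel sketch: a voxel's multi-echo spin-echo decay is modelled
    # as a non-negative mixture over a dictionary of candidate T2 values. Plain NNLS
    # is used only to illustrate the underdetermined mixture model (the paper uses SBL).
    import numpy as np
    from scipy.optimize import nnls

    tes = np.arange(10, 330, 10.0)                               # echo times (ms), assumed protocol
    t2_grid = np.array([20, 40, 60, 80, 100, 150, 200, 300.0])   # candidate T2s (ms)
    D = np.exp(-tes[:, None] / t2_grid[None, :])                 # dictionary of decay curves

    # Synthetic voxel: 60% of a 40 ms tissue + 40% of a 150 ms tissue, plus noise.
    signal = 0.6 * np.exp(-tes / 40) + 0.4 * np.exp(-tes / 150)
    signal += 0.01 * np.random.randn(tes.size)

    fractions, _ = nnls(D, signal)
    for t2, f in zip(t2_grid, fractions):
        if f > 0.05:
            print(f"T2 ~ {t2:.0f} ms, fraction ~ {f:.2f}")
    ```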

  7. Classifying CT/MR findings in patients with suspicion of hepatocellular carcinoma: Comparison of liver imaging reporting and data system and criteria-free Likert scale reporting models.

    PubMed

    Zhang, Yu-Dong; Zhu, Fei-Peng; Xu, Xun; Wang, Qing; Wu, Chen-Jiang; Liu, Xi-Sheng; Shi, Hai-Bin

    2016-02-01

    To compare the Liver Imaging Reporting and Data System (LI-RADS) and a criteria-free Likert scale (LS) reporting model for classifying computed tomography/magnetic resonance imaging (CT/MR) findings of suspected hepatocellular carcinoma (HCC), imaging data of 281 hepatic nodules in 203 patients were retrospectively included. Imaging characteristics including diameter, arterial hyperenhancement, washout, and capsule were reviewed independently by two groups of readers using LI-RADS and LS (score range, 1-5). LS is based primarily on the overall impression of imaging findings without using fixed criteria. Interreader agreement (IRA), intraclass agreement (ICA), and diagnostic performance were determined with Fleiss' kappa, Cohen's kappa (κ), and logistic regression, respectively. There were 167 contrast-enhanced CT (CECT) and 114 MR examinations. Overall IRA was moderate (κ = 0.47, 0.52); IRA was moderate-to-good for arterial hyperenhancement, washout, and capsule (κ = 0.56-0.69) and excellent for diameter and tumor embolus (κ = 0.99). Overall ICA between LI-RADS and LS was moderate (κ = 0.44-0.50); ICA was good for scores 1-2 (κ = 0.71-0.90), moderate for scores 3 and 5 (κ = 0.41-0.52), but very poor for score 4 (κ = 0.11-0.19). LI-RADS produced significantly lower accuracy (78.6% vs. 87.2%) and sensitivity (72.1% vs. 92.8%), and higher specificity (97.3% vs. 71.2%) and positive likelihood ratio (+LR: 26.32 vs. 3.23), in the diagnosis of HCC. CECT produced relatively low IRA, ICA, and diagnostic ability compared with MR. There were substantial variations in liver observations between LI-RADS and LS. Further study is needed to investigate ICA between CECT and MR. © 2015 Wiley Periodicals, Inc.
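
    Inter-reader agreement of the kind reported above can be computed as in the sketch below; the two readers' scores are invented, and the weighted variant is shown only because LI-RADS/LS categories are ordered.

    ```python
    # Cohen's kappa sketch for inter-reader agreement on ordinal scores (1-5);
    # the two readers' ratings below are invented, not the study data.
    from sklearn.metrics import cohen_kappa_score

    reader_a = [1, 2, 3, 3, 4, 5, 5, 2, 3, 4]
    reader_b = [1, 2, 3, 4, 4, 5, 4, 2, 3, 5]

    print("kappa:", round(cohen_kappa_score(reader_a, reader_b), 2))
    # A weighted kappa is often preferred for ordered categories such as these scores:
    print("quadratic-weighted kappa:",
          round(cohen_kappa_score(reader_a, reader_b, weights="quadratic"), 2))
    ```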

  8. System analysis approach to deriving design criteria (loads) for Space Shuttle and its payloads. Volume 1: General statement of approach

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.; Bullock, T.; Holland, W. B.; Kross, D. A.; Kiefling, L. A.

    1981-01-01

    The Space Shuttle, the most complex transportation system designed to date, illustrates the requirement for an analysis approach that considers all major disciplines simultaneously. Its unique cross coupling, high sensitivity to aerodynamic uncertainties, and high performance requirements dictated a less conservative approach than those taken in previous programs. Analyses performed for the Space Shuttle and certain payloads, the Space Telescope and Spacelab, are used as examples. These illustrate the requirements for system analysis approaches and criteria, including dynamic modeling requirements, test requirements, control requirements, and the resulting design verification approaches. A survey of the problem, potential approaches available as solutions, implications for future systems, and projected technology development areas are addressed.

  9. System analysis approach to deriving design criteria (Loads) for Space Shuttle and its payloads. Volume 2: Typical examples

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.; Bullock, T.; Holland, W. B.; Kross, D. A.; Kiefling, L. A.

    1981-01-01

    The achievement of an optimized design from the system standpoint, under the low-cost, high-risk constraints of the present-day environment, was analyzed. The Space Shuttle illustrates the requirement for an analysis approach that considers all major disciplines (coupling between structures, control, propulsion, thermal, aeroelastic, and performance) simultaneously. The Space Shuttle and certain payloads, the Space Telescope and Spacelab, are examined. The requirements for system analysis approaches and criteria, including dynamic modeling requirements, test requirements, control requirements, and the resulting design verification approaches, are illustrated. A survey of the problem, potential approaches available as solutions, implications for future systems, and projected technology development areas are addressed.

  10. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    PubMed

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted to assure the good quality of waters, soils or sediments and to achieve desirable environmental quality objectives. It is therefore of significant importance to evaluate the reliability of available data when analysing their possible use in the aforementioned processes. A thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws, but at the same time of various opportunities for improvement. In this context, a new methodology based on Multi-Criteria Decision Analysis (MCDA) techniques has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects) in a transparent, quantitative way, through the use of expert knowledge, multiple criteria and fuzzy logic. The proposed methodology can be used for the production of weighted Species Sensitivity Weighted Distributions (SSWD), as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but can easily be tailored to other environmental compartments (soil, air, sediments).

  11. Markov Random Fields, Stochastic Quantization and Image Analysis

    DTIC Science & Technology

    1990-01-01

    Markov random fields based on the lattice Z2 have been extensively used in image analysis in a Bayesian framework as a priori models for the ... of Image Analysis can be given some fundamental justification, then there is a remarkable connection between Probabilistic Image Analysis, Statistical Mechanics and Lattice-based Euclidean Quantum Field Theory.
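
    As a generic illustration of a lattice MRF serving as a Bayesian prior in image analysis, the sketch below runs a few Gibbs sweeps of an Ising-style model to denoise a binary image; the coupling and noise parameters are arbitrary, and the example is textbook material, not drawn from the report.

    ```python
    # Minimal illustration of a lattice MRF as a Bayesian prior: Ising-style Gibbs
    # sampling for denoising a binary image. Parameters are arbitrary assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    truth = np.zeros((64, 64), dtype=int)
    truth[16:48, 16:48] = 1                                              # a simple square "scene"
    noisy = np.where(rng.random(truth.shape) < 0.2, 1 - truth, truth)    # 20% label flips

    beta, gamma = 1.5, 1.0        # smoothness (prior) and data-fidelity strengths
    x = noisy.copy()
    for _ in range(10):                              # Gibbs sweeps over the lattice
        for i in range(x.shape[0]):
            for j in range(x.shape[1]):
                nbrs = [x[(i + di) % 64, (j + dj) % 64]
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                # Local conditional energies for labels 0 and 1.
                e = [sum(beta * (lab != nb) for nb in nbrs) + gamma * (lab != noisy[i, j])
                     for lab in (0, 1)]
                p1 = np.exp(-e[1]) / (np.exp(-e[0]) + np.exp(-e[1]))
                x[i, j] = int(rng.random() < p1)

    print("error rate before:", (noisy != truth).mean(), "after:", (x != truth).mean())
    ```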

  12. Covariance of lucky images: performance analysis

    NASA Astrophysics Data System (ADS)

    Cagigal, Manuel P.; Valle, Pedro J.; Cagigas, Miguel A.; Villó-Pérez, Isidro; Colodro-Conde, Carlos; Ginski, C.; Mugrauer, M.; Seeliger, M.

    2017-01-01

    The covariance of ground-based lucky images is a robust and easy-to-use algorithm that allows us to detect faint companions surrounding a host star. In this paper, we analyse the influence of the number of processed frames, the frame quality, the atmospheric conditions and the detection noise on companion detectability. This analysis has been carried out using both experimental and computer-simulated imaging data. Although the technique allows the detection of faint companions, the camera detection noise and the use of a limited number of frames limit the minimum detectable companion intensity to around 1000 times fainter than that of the host star when placed at an angular distance corresponding to the first few Airy rings. The reachable contrast could be even larger when detecting companions with the assistance of an adaptive optics system.
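
    The covariance idea can be sketched as below: over a stack of short exposures, each pixel's intensity is correlated with the host star's fluctuating peak intensity, so a companion that shares those fluctuations stands out above the noise. The frame stack, star positions and flux levels are synthetic assumptions, not the authors' pipeline.

    ```python
    # Sketch of lucky-image covariance for companion detection; data are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    n_frames, size = 500, 64
    frames = rng.normal(0.0, 1.0, (n_frames, size, size))        # detection noise

    peak = rng.gamma(shape=4.0, scale=1.0, size=n_frames)        # fluctuating host-star peak
    frames[:, 32, 32] += 100.0 * peak                            # host star
    frames[:, 32, 40] += 0.5 * peak                              # faint companion, same fluctuations

    host = frames[:, 32, 32]
    # Per-pixel covariance with the host peak across the frame stack.
    cov_map = np.tensordot(host - host.mean(), frames - frames.mean(axis=0), axes=1) / (n_frames - 1)

    print("covariance at companion pixel:", cov_map[32, 40])
    print("typical background covariance:", np.median(np.abs(cov_map)))
    ```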

  13. PAMS photo image retrieval prototype alternatives analysis

    SciTech Connect

    Conner, M.L.

    1996-04-30

    Photography and Audiovisual Services uses a system called the Photography and Audiovisual Management System (PAMS) to perform order entry and billing services. The PAMS system utilizes Revelation Technologies database management software, AREV. Work is currently in progress to link the PAMS AREV system to a Microsoft SQL Server database engine to provide photograph indexing and query capabilities. The link between AREV and SQLServer will use a technique called ``bonding.`` This photograph imaging subsystem will interface to the PAMS system and handle the image capture and retrieval portions of the project. The intent of this alternatives analysis is to examine the software and hardware alternatives available to meet the requirements for this project, and identify a cost-effective solution.

  14. Analysis on enhanced depth of field for integral imaging microscope.

    PubMed

    Lim, Young-Tae; Park, Jae-Hyeung; Kwon, Ki-Chul; Kim, Nam

    2012-10-08

    The depth of field of the integral imaging microscope is studied. In the integral imaging microscope, 3-D information is encoded in the form of elemental images. The distance between the intermediate plane and an object point determines the number of elemental images and the depth of field of the integral imaging microscope. From the analysis, it is found that the depth of field of the depth-plane image reconstructed by computational integral imaging reconstruction is longer than the depth of field of the optical microscope. Based on the analyzed relationship, an experiment using integral imaging microscopy and conventional microscopy was also performed to confirm the enhanced depth of field of integral imaging microscopy.

  15. Machine Learning Interface for Medical Image Analysis.

    PubMed

    Zhang, Yi C; Kagen, Alexander C

    2016-10-11

    TensorFlow is a second-generation open-source machine learning software library with a built-in framework for implementing neural networks in a wide variety of perceptual tasks. Although TensorFlow usage is well established with computer vision datasets, the TensorFlow interface with DICOM formats for medical imaging remains to be established. Our goal is to extend the TensorFlow API to accept raw DICOM images as input; 1513 DaTscan DICOM images were obtained from the Parkinson's Progression Markers Initiative (PPMI) database. DICOM pixel intensities were extracted and shaped into tensors, or n-dimensional arrays, to populate the training, validation, and test input datasets for machine learning. A simple neural network was constructed in TensorFlow to classify images into normal or Parkinson's disease groups. Training was executed over 1000 iterations for each cross-validation set. The gradient descent and Adagrad optimization algorithms were used to minimize cross-entropy between the predicted and ground-truth labels. Cross-validation was performed ten times to produce a mean accuracy of 0.938 ± 0.047 (95% CI 0.908-0.967). The mean sensitivity was 0.974 ± 0.043 (95% CI 0.947-1.00) and the mean specificity was 0.822 ± 0.207 (95% CI 0.694-0.950). We extended the TensorFlow API to enable DICOM compatibility in the context of DaTscan image analysis and implemented a neural network classifier that produces diagnostic accuracies on par with excellent results from previous machine learning models. These results indicate the potential role of TensorFlow as a useful adjunct diagnostic tool in the clinical setting.
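
    A minimal sketch of the DICOM-to-tensor workflow follows, using pydicom to extract pixel arrays and a small tf.keras network as the classifier; the file names, image size and architecture are placeholders and do not reproduce the PPMI/DaTscan pipeline or its reported results.

    ```python
    # Sketch of the DICOM-to-tensor workflow: pydicom extracts pixel arrays, which
    # are shaped into tensors and fed to a small tf.keras classifier. Paths, image
    # size and architecture are hypothetical placeholders.
    import numpy as np
    import pydicom
    import tensorflow as tf

    def load_dicom_images(paths, target_shape=(64, 64)):
        imgs = []
        for p in paths:
            ds = pydicom.dcmread(p)
            img = ds.pixel_array.astype("float32")
            img = (img - img.min()) / (img.max() - img.min() + 1e-8)   # scale to [0, 1]
            imgs.append(tf.image.resize(img[..., None], target_shape).numpy())
        return np.stack(imgs)

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(64, 64, 1)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(2, activation="softmax"),                 # normal vs. Parkinson's
    ])
    model.compile(optimizer="adagrad",
                  loss="sparse_categorical_crossentropy", metrics=["accuracy"])

    # Hypothetical usage:
    # X = load_dicom_images(["scan_0001.dcm", "scan_0002.dcm"])
    # y = np.array([0, 1])
    # model.fit(X, y, epochs=10, validation_split=0.2)
    ```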

  16. A new multi criteria classification approach in a multi agent system applied to SEEG analysis.

    PubMed

    Kinié, A; Ndiaye, M; Montois, J J; Jacquelet, Y

    2007-01-01

    This work focuses on the study of the organization of SEEG signals during epileptic seizures using a multi-agent system approach. This approach is based on cooperative mechanisms of self-organization at the micro level and on the emergence of a global function at the macro level. In order to evaluate this approach, we propose a distributed collaborative approach for the classification of the signals of interest. This new multi-criteria classification method is able to provide a relevant organization of brain area structures and to bring out elements of epileptogenic networks. The method is compared to another classification approach, a fuzzy classification, and gives better results when applied to SEEG signals.

  17. Uses of software in digital image analysis: a forensic report

    NASA Astrophysics Data System (ADS)

    Sharma, Mukesh; Jha, Shailendra

    2010-02-01

    Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. Its applications in forensic science range from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. In this paper the authors explain these tasks, which are described in three categories: Image Compression, Image Enhancement & Restoration, and Measurement Extraction, with the help of examples such as signature comparison, counterfeit currency comparison and foot-wear sole impressions, using the software Canvas and Corel Draw.

  18. GIS analysis of the siting criteria for the Mixed and Low-Level Waste Treatment Facility and the Idaho Waste Processing Facility

    SciTech Connect

    Hoskinson, R.L.

    1994-01-01

    This report summarizes a study conducted using the Arc/Info® geographic information system (GIS) to analyze the criteria used for site selection for the Mixed and Low-Level Waste Treatment Facility (MLLWTF) and the Idaho Waste Processing Facility (IWPF). The purpose of the analyses was to determine, based on predefined criteria, the areas on the INEL that best satisfied the criteria. The coverages used in this study were produced by importing into the GIS the AutoCAD files that produced the maps for a pre-site-selection draft report; the files were then converted to Arc/Info® GIS format. The initial analysis considered all of the criteria as having equal importance in determining the areas of the INEL that would best satisfy the requirements. Another analysis emphasized four of the criteria as ``must`` criteria which had to be satisfied. Additional analyses considered other criteria that were considered for, but not included in, the predefined criteria. This GIS analysis of the siting criteria for the IWPF and MLLWTF provides a logical, repeatable, and defensible approach to the determination of candidate locations for the facilities, and the results of the analyses support the selection of the candidate locations.

  19. Wavelet-based image analysis system for soil texture analysis

    NASA Astrophysics Data System (ADS)

    Sun, Yun; Long, Zhiling; Jang, Ping-Rey; Plodinec, M. John

    2003-05-01

    Soil texture is defined as the relative proportion of clay, silt and sand found in a given soil sample. It is an important physical property of soil that affects such phenomena as plant growth and agricultural fertility. Traditional methods used to determine soil texture are either time consuming (hydrometer), or subjective and experience-demanding (field tactile evaluation). Considering that textural patterns observed at soil surfaces are uniquely associated with soil textures, we propose an innovative approach to soil texture analysis, in which wavelet frames-based features representing texture contents of soil images are extracted and categorized by applying a maximum likelihood criterion. The soil texture analysis system has been tested successfully with an accuracy of 91% in classifying soil samples into one of three general categories of soil textures. In comparison with the common methods, this wavelet-based image analysis approach is convenient, efficient, fast, and objective.
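
    The sketch below pairs wavelet subband-energy texture features with a Gaussian maximum-likelihood classifier, standing in for the wavelet-frame features and the maximum likelihood criterion described above; the soil classes and images are synthetic.

    ```python
    # Sketch: wavelet subband energies as texture features, classified by maximum
    # likelihood under per-class Gaussian models. Synthetic data stand in for soil images.
    import numpy as np
    import pywt
    from scipy.stats import multivariate_normal

    def wavelet_energy_features(img, wavelet="db2", level=2):
        coeffs = pywt.wavedec2(img, wavelet, level=level)
        feats = [np.mean(coeffs[0] ** 2)]
        for (cH, cV, cD) in coeffs[1:]:
            feats += [np.mean(cH ** 2), np.mean(cV ** 2), np.mean(cD ** 2)]
        return np.array(feats)

    def train_ml_classifier(features_by_class):
        # Fit one Gaussian per class; a small ridge keeps the covariance invertible.
        return {c: multivariate_normal(F.mean(axis=0), np.cov(F.T) + 1e-6 * np.eye(F.shape[1]))
                for c, F in features_by_class.items()}

    def classify(models, feat):
        return max(models, key=lambda c: models[c].logpdf(feat))   # maximum likelihood decision

    rng = np.random.default_rng(3)
    train = {c: np.array([wavelet_energy_features(rng.normal(0, s, (64, 64)))
                          for _ in range(20)])
             for c, s in {"sand": 1.0, "silt": 2.0, "clay": 4.0}.items()}
    models = train_ml_classifier(train)
    test_img = rng.normal(0, 2.0, (64, 64))
    print("predicted class:", classify(models, wavelet_energy_features(test_img)))
    ```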

  20. Image analysis software and sample preparation demands

    NASA Astrophysics Data System (ADS)

    Roth, Karl n.; Wenzelides, Knut; Wolf, Guenter; Hufnagl, Peter

    1990-11-01

    Image analysis offers the opportunity to analyse many processes in medicine, biology and engineering in a quantitative manner. Experience shows that it is only through awareness of preparation methods and attention to software design that full benefit can be reaped from a picture processing system in the fields of cytology and histology. Some examples of special stains for automated analysis are given here and the effectiveness of commercially available software packages is investigated. The application of picture processing, and the development of related special hardware and software, has been increasing in recent years. As PC-based picture processing systems can be purchased at reasonable cost, more and more users are confronted with these problems. Experience shows that the quality of commercially available software packages differs and that the sample preparation requirements for successful problem solutions are often underestimated. But as always, sample preparation is still the key to success in automated image analysis of cells and tissues. Hence, a problem solution requires constant interaction between sample preparation methods and algorithm development.

  1. An analysis of the qualification criteria for small radioactive material shipping packages

    SciTech Connect

    McClure, J.D.

    1983-05-01

    The RAM package design certification process has two important elements, testing and acceptance. These terms sound very similar but they have specific meanings. Qualification testing, in the context of this study, is the imposition of simulated accident test conditions upon the candidate package design (normal transportation environments may also be included). Following qualification testing, the acceptance criteria provide the performance levels which, if demonstrated, indicate the ability of the RAM package to sustain the severity of the qualification testing sequence and yet maintain specified levels of package integrity. This study used Severities of Transportation Accidents as a database to examine the regulatory test criteria which are required to be met by small packages containing Type B quantities of radioactive material (RAM). The basic findings indicate that the present regulatory test standards provide significantly higher levels of protection for the surface transportation modes (truck, rail) than for RAM packages shipped by aircraft. It should also be noted that various risk assessment studies have shown that the risk to the public due to severe transport accidents in surface and air transport modes is very low. A key element in this study was the quantification of the severity of the transportation accident environment and of the severity of the present regulatory test standards (called qualification test standards in this document), so that a direct comparison could be made between them to assess the effectiveness of the existing qualification test standards. The manner in which this was accomplished is described.

  2. Quantitative image analysis in the assessment of diffuse large B-cell lymphoma.

    PubMed

    Chabot-Richards, Devon S; Martin, David R; Myers, Orrin B; Czuchlewski, David R; Hunt, Kristin E

    2011-12-01

    Proliferation rates in diffuse large B-cell lymphoma have been associated with conflicting outcomes in the literature, more often with high proliferation associated with poor prognosis. In most studies, the proliferation rate was estimated by a pathologist using an immunohistochemical stain for the monoclonal antibody Ki-67. We hypothesized that a quantitative image analysis algorithm would give a more accurate estimate of the proliferation rate, leading to better associations with survival. In all, 84 cases of diffuse large B-cell lymphoma were selected according to the World Health Organization criteria. The Ki-67 percentage positivity estimated by the pathologist was recorded from the original report. The same slides were then scanned using an Aperio ImageScope, and Ki-67 percentage positivity was calculated using a computer-based quantitative immunohistochemistry nuclear algorithm. In addition, chart review was performed and survival time was recorded. The Ki-67 percentages estimated by the pathologist from the original report and by quantitative image analysis were significantly correlated (P<0.001), but the pathologist percentages were significantly higher than those from quantitative image analysis (P=0.021), with less agreement at lower Ki-67 percentages. Comparison of Ki-67 percentage positivity versus survival did not show a significant association with either the pathologist estimate or quantitative image analysis. However, although not significant, there was a trend of worse survival at higher proliferation rates detected by the pathologist but not by quantitative image analysis. Interestingly, our data suggest that the Ki-67 percentage positivity as assessed by the pathologist may be more closely associated with survival outcome than that identified by quantitative image analysis. This may indicate that pathologists are better at selecting appropriate areas of the slide. More cases are needed to assess whether this finding would be statistically significant. Due to

  3. Accuracy of Computed Tomography Imaging Criteria in the Diagnosis of Adult Open Globe Injuries by Neuroradiology and Ophthalmology.

    PubMed

    Crowell, Eric L; Koduri, Vivek A; Supsupin, Emilio P; Klinglesmith, Robert E; Chuang, Alice Z; Kim, Gene; Baker, Laura A; Feldman, Robert M; Blieden, Lauren S

    2017-09-01

    The objective was to evaluate the sensitivity and specificity of computed tomography (CT) diagnosis of open globes, to determine which imaging factors are most predictive of open globe injuries, and to evaluate the agreement between neuroradiologist and ophthalmologist readers for the diagnosis of open and closed globes. This was a retrospective cohort study. Patients who presented to Memorial Hermann-Texas Medical Center with suspicion of open globes were reviewed. One neuroradiologist and two ophthalmologists, masked to clinical information, reviewed CT images for signs concerning for an open globe, including change in globe contour, anterior chamber deformation, intraocular air, vitreous hemorrhage, subretinal fluid indicating retinal or choroidal detachment, dislocated or absent lens, intraocular foreign body, and orbital fracture. Using the clinically or surgically confirmed globe status as the true globe status, sensitivity, specificity, and agreement (kappa) were calculated and used to investigate which imaging factors are most predictive of open globe injuries. A total of 114 patients were included: 35 patients with open globes and 79 patients with closed globes. Specificity was greater than 97% for each reader, and sensitivity ranged from 51% to 77% among readers. The imaging characteristics most consistently used to predict an open globe injury were change in globe contour and vitreous hemorrhage (sensitivity = 43% to 57%, specificity > 98%). The agreement on the impression of an open globe was good between the neuroradiologist and the ophthalmologists and excellent between the ophthalmologists. Computed tomography imaging is not absolute, and its sensitivity is still inadequate to be fully relied upon. The CT imaging findings most predictive of an open globe injury were change in globe contour and vitreous hemorrhage. Clinical examination or surgical exploration remains the most important component in evaluating a suspected open globe, with CT imaging as an adjunct.
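
    Per-reader sensitivity and specificity against the confirmed globe status follow directly from confusion counts, as in the short sketch below; the counts are invented, not the study's data.

    ```python
    # Worked example of reader sensitivity/specificity against the confirmed globe
    # status; the confusion counts below are hypothetical.
    def sens_spec(tp, fn, tn, fp):
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical reader: 27 of 35 open globes called open, 77 of 79 closed globes called closed.
    sensitivity, specificity = sens_spec(tp=27, fn=8, tn=77, fp=2)
    print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
    ```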

  4. Research on automatic human chromosome image analysis

    NASA Astrophysics Data System (ADS)

    Ming, Delie; Tian, Jinwen; Liu, Jian

    2007-11-01

    Human chromosome karyotyping is one of the essential tasks in cytogenetics, especially in genetic syndrome diagnoses. In this thesis, an automatic procedure is introduced for human chromosome image analysis. According to different status of touching and overlapping chromosomes, several segmentation methods are proposed to a