Science.gov

Sample records for image analysis criteria

  1. Design Criteria For Networked Image Analysis System

    NASA Astrophysics Data System (ADS)

    Reader, Cliff; Nitteberg, Alan

    1982-01-01

Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special-purpose designs. This change is motivated by several factors, notable among which is the opportunity for high performance at low cost offered by advances in semiconductor technology. Another key issue is a maturing understanding of the problems and of the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of these issues, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with an orientation toward the hospital environment. The three main areas are image database management, viewing of image data, and image data processing. This is followed by a survey of the current state of the art, covering image display systems, database techniques, communications networks, and software systems control. The paper concludes with a description of the functional subsystems and architectural framework for networked image analysis in a production environment.

  2. Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD): Development of Image Analysis Criteria and Examiner Reliability for Image Analysis

    PubMed Central

    Ahmad, Mansur; Hollender, Lars; Odont; Anderson, Quentin; Kartha, Krishnan; Ohrbach, Richard K.; Truelove, Edmond L.; John, Mike T.; Schiffman, Eric L.

    2011-01-01

Introduction As part of the multi-site RDC/TMD Validation Project, comprehensive TMJ diagnostic criteria were developed for image analysis using panoramic radiography, magnetic resonance imaging (MRI), and computed tomography (CT). Methods Inter-examiner reliability was estimated using the kappa (k) statistic, and agreement between rater pairs was characterized by overall, positive, and negative percent agreement. CT was the reference standard for assessing the validity of the other imaging modalities for detecting osteoarthritis (OA). Results For the radiological diagnosis of OA, reliability of the three examiners was poor for panoramic radiography (k = 0.16), fair for MRI (k = 0.46), and close to the threshold for excellent for CT (k = 0.71). Using MRI, reliability was excellent for diagnosing disc displacement (DD) with reduction (k = 0.78) and DD without reduction (k = 0.94), and good for effusion (k = 0.64). Overall percent agreement for pair-wise ratings was ≥82% for all conditions. Positive percent agreement for diagnosing OA was 19% for panoramic radiography, 59% for MRI, and 84% for CT. Using MRI, positive percent agreement was 95% for diagnosing any DD and 81% for effusion. Negative percent agreement was ≥88% for all conditions. Compared with CT, panoramic radiography and MRI had poor and marginal sensitivity, respectively, but excellent specificity, in detecting OA. Conclusion Comprehensive image analysis criteria for the RDC/TMD Validation Project were developed; they can reliably be employed for assessing OA using CT, and disc position and effusion using MRI. PMID:19464658
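
The kappa statistic and the positive/negative percent agreement reported above can all be computed from a 2×2 table of paired ratings. A minimal sketch with invented counts (not the study's data):

```python
# Hypothetical sketch: Cohen's kappa and positive/negative percent
# agreement for two raters on a binary diagnosis. The counts are
# illustrative only.

def rater_agreement(a, b, c, d):
    """a: both raters positive, b: only rater 1 positive,
    c: only rater 2 positive, d: both raters negative."""
    n = a + b + c + d
    po = (a + d) / n                      # observed agreement
    p1, p2 = (a + b) / n, (a + c) / n     # each rater's positive rate
    pe = p1 * p2 + (1 - p1) * (1 - p2)    # agreement expected by chance
    kappa = (po - pe) / (1 - pe)          # chance-corrected agreement
    ppa = 2 * a / (2 * a + b + c)         # positive percent agreement
    npa = 2 * d / (2 * d + b + c)         # negative percent agreement
    return kappa, ppa, npa

kappa, ppa, npa = rater_agreement(40, 5, 5, 50)
```

Kappa corrects the observed agreement for the agreement two raters would reach by chance, which is why a high raw percent agreement can still yield a modest kappa.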

  3. Terahertz Wide-Angle Imaging and Analysis on Plane-wave Criteria Based on Inverse Synthetic Aperture Techniques

    NASA Astrophysics Data System (ADS)

    Gao, Jing Kun; Qin, Yu Liang; Deng, Bin; Wang, Hong Qiang; Li, Jin; Li, Xiang

    2016-04-01

This paper presents two parts of work on terahertz imaging applications. The first part aims at solving problems that arise as the rotation angle increases. To compensate for the nonlinearity of terahertz radar systems, a calibration signal acquired from a bright target is always used. Generally, this compensation inserts an extra linear phase term into the intermediate-frequency (IF) echo signal, which is undesirable in large-rotation-angle imaging applications. We carried out a detailed theoretical analysis of this problem, and a minimum-entropy criterion was employed to estimate and compensate for the linear-phase errors. In the second part, the effects of spherical waves on terahertz inverse synthetic aperture imaging are analyzed. Analytic criteria for the plane-wave approximation were derived for different rotation angles. Experimental results on corner reflectors and an aircraft model, based on a 330-GHz linear frequency-modulated continuous-wave (LFMCW) radar system, validated the necessity and effectiveness of the proposed compensation. A comparison of the experimental images obtained under the plane-wave assumption and with spherical-wave correction was also highly consistent with the analytic criteria we derived.

  4. Users' Relevance Criteria in Image Retrieval in American History.

    ERIC Educational Resources Information Center

    Choi, Youngok; Rasmussen, Edie M.

    2002-01-01

    Discussion of the availability of digital images focuses on a study of American history faculty and graduate students that investigated the criteria which image users apply when making judgments about the relevance of an image. Considers topicality and image quality and suggests implications for image retrieval system design. (Contains 63…

  5. Mangrove vulnerability modelling in parts of Western Niger Delta, Nigeria using satellite images, GIS techniques and Spatial Multi-Criteria Analysis (SMCA).

    PubMed

    Omo-Irabor, Omo O; Olobaniyi, Samuel B; Akunna, Joe; Venus, Valentijn; Maina, Joseph M; Paradzayi, Charles

    2011-07-01

Mangroves are known for their global environmental and socioeconomic value. Despite their importance, mangroves, like other ecosystems, are now threatened by natural and human-induced processes that damage them at alarming rates, diminishing the limited extent of existing mangrove vegetation. This study attempts the development of a spatial vulnerability assessment model that takes environmental and socioeconomic criteria into consideration, in spatial and non-spatial formats. The model requires 11 input parameters for modelling mangrove vulnerability. These parameters and their effects on mangrove vulnerability were selected and weighted by experts in the related fields. Criteria identification and selection were based mainly on the effects of environmental and socioeconomic changes on mangrove survival. The results revealed the dominance of socioeconomic criteria such as population pressure and deforestation, with a high vulnerability index of 0.75. The environmental criteria were broadly dispersed in the study area, with vulnerability indices ranging from 0.00 to 0.75. This category reflects the greater influence of pollutant input from oil wells and pipelines and the minimal contribution of climatic factors. The project has produced an integrated spatial management framework for mangrove vulnerability assessment that uses information technology in conjunction with expert knowledge and multi-criteria analysis to aid planners and policy/decision makers in the protection of this very fragile ecosystem. PMID:20857193

  6. Retinal Imaging and Image Analysis

    PubMed Central

    Abràmoff, Michael D.; Garvin, Mona K.; Sonka, Milan

    2011-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of blindness in the industrialized world that includes age-related macular degeneration, diabetic retinopathy, and glaucoma, the review is devoted to retinal imaging and image analysis methods and their clinical implications. Methods for 2-D fundus imaging and techniques for 3-D optical coherence tomography (OCT) imaging are reviewed. Special attention is given to quantitative techniques for analysis of fundus photographs with a focus on clinically relevant assessment of retinal vasculature, identification of retinal lesions, assessment of optic nerve head (ONH) shape, building retinal atlases, and to automated methods for population screening for retinal diseases. A separate section is devoted to 3-D analysis of OCT images, describing methods for segmentation and analysis of retinal layers, retinal vasculature, and 2-D/3-D detection of symptomatic exudate-associated derangements, as well as to OCT-based analysis of ONH morphology and shape. Throughout the paper, aspects of image acquisition, image analysis, and clinical relevance are treated together considering their mutually interlinked relationships. PMID:21743764

  7. Analysis of the impact of safeguards criteria

    SciTech Connect

    Mullen, M.F.; Reardon, P.T.

    1981-01-01

As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts: the first explains the technical approach and methodology, the second contains an example application of the methodology, and the third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; and there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed, and reasonably good approximate prediction equations can be developed.

  8. A new tool for analysis of cleanup criteria decisions.

    PubMed

    Klemic, Gladys A; Bailey, Paul; Elcock, Deborah

    2003-08-01

    Radionuclides and other hazardous materials resulting from processes used in nuclear weapons production contaminate soil, groundwater, and buildings around the United States. Cleanup criteria for environmental contaminants are agreed on prior to remediation and underpin the scope and legacy of the cleanup process. Analysis of cleanup criteria can be relevant for future agreements and may also provide insight into a complex decision making process where science and policy issues converge. An Internet accessible database has been established to summarize cleanup criteria and related factors involved in U.S. Department of Energy remediation decisions. This paper reports on a new user interface for the database that is designed to integrate related information into graphic displays and tables with interactive features that allow exploratory data analysis of cleanup criteria. Analysis of 137Cs in surface soil is presented as an example.

  9. Quantitative criteria for assessment of gamma-ray imager performance

    NASA Astrophysics Data System (ADS)

    Gottesman, Steve; Keller, Kristi; Malik, Hans

    2015-08-01

In recent years gamma ray imagers such as the GammaCam™ and Polaris have demonstrated good imaging performance in the field. Imager performance is often summarized as "resolution", either angular or spatial at some distance from the imager; however, the definition of resolution is not always related to the ability to image an object. It is difficult to quantitatively compare imagers without a common definition of image quality. This paper examines three categories of definition: point source, line source, and area source. It discusses the details of those definitions and which ones are more relevant in different situations. Metrics such as Full Width at Half Maximum (FWHM), variations on the Rayleigh criterion, and some analogous to the National Imagery Interpretability Rating Scale (NIIRS) are discussed. Performance against these metrics is evaluated for a high-resolution coded-aperture imager modeled using Monte Carlo N-Particle (MCNP), and for a medium-resolution imager measured in the lab.
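
As a sketch of the first metric named above, FWHM can be estimated numerically from a sampled point-spread function; the Gaussian PSF below is an assumption for illustration, not a model of either imager discussed:

```python
import numpy as np

# Hedged sketch: estimating FWHM from a sampled 1-D point-spread
# function, interpolating linearly at the two half-maximum crossings.

def fwhm(x, y):
    half = y.max() / 2.0
    above = np.where(y >= half)[0]      # samples at or above half max
    i0, i1 = above[0], above[-1]

    def crossing(i, j):
        # linear interpolation between samples i and j for y == half
        return x[i] + (half - y[i]) * (x[j] - x[i]) / (y[j] - y[i])

    return crossing(i1, i1 + 1) - crossing(i0 - 1, i0)

x = np.linspace(-10.0, 10.0, 2001)
sigma = 2.0
psf = np.exp(-x**2 / (2 * sigma**2))    # assumed Gaussian PSF
width = fwhm(x, psf)                    # analytic: 2*sqrt(2*ln 2)*sigma ≈ 4.71
```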

  10. Alternative Test Criteria in Covariance Structure Analysis: A Unified Approach.

    ERIC Educational Resources Information Center

    Satorra, Albert

    1989-01-01

    Within covariance structural analysis, a unified approach to asymptotic theory of alternative test criteria for testing parametric restrictions is provided. More general statistics for addressing the case where the discrepancy function is not asymptotically optimal, and issues concerning power analysis and the asymptotic theory of testing-related…

  11. GIS Based Multi-Criteria Decision Analysis For Cement Plant Site Selection For Cuddalore District

    NASA Astrophysics Data System (ADS)

    Chhabra, A.

    2015-12-01

India's cement industry is a vital part of its economy, providing employment to more than a million people. On the back of growing demand due to increased construction and infrastructure activity, the cement market in India is expected to grow at a compound annual growth rate (CAGR) of 8.96 percent during 2014-2019. In this study, GIS-based spatial Multi-Criteria Decision Analysis (MCDA) is used to determine optimum and alternative sites for a cement plant. The technique uses a set of evaluation criteria that are quantifiable indicators of the extent to which decision objectives are realized. Combined with available GIS (Geographical Information System) and local ancillary data, the outputs of image analysis serve as input to the multi-criteria decision-making system. The criteria were represented as GIS layers, which then underwent GIS analysis to identify several potential sites. Satellite imagery from LANDSAT 8 and ASTER DEM data were used for the analysis. Cuddalore District in Tamil Nadu was selected as the study site because limestone mining is already carried out in the region, which meets the raw-material criterion for cement production. Other criteria considered were land use/land cover (LULC) classes (built-up area, river, forest cover, wetland, barren land, harvest land, and agricultural land), slope, and proximity to road, railway, and drainage networks.
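
The scoring step at the heart of such an MCDA reduces to a weighted sum of normalized criterion scores per candidate site. A minimal sketch with invented site names, scores, and weights (the study's actual criteria and weights differ):

```python
# Hypothetical weighted-overlay step of a GIS-based MCDA. All names,
# scores, and weights below are invented for illustration.

criteria_weights = {            # expert-assigned weights, summing to 1
    "limestone_proximity": 0.30,
    "road_access":         0.20,
    "slope":               0.15,
    "land_cover":          0.20,
    "drainage_distance":   0.15,
}

# Candidate sites scored 0-1 per criterion (higher = more suitable).
sites = {
    "site_A": {"limestone_proximity": 0.9, "road_access": 0.7,
               "slope": 0.8, "land_cover": 0.6, "drainage_distance": 0.5},
    "site_B": {"limestone_proximity": 0.6, "road_access": 0.9,
               "slope": 0.7, "land_cover": 0.8, "drainage_distance": 0.9},
}

def suitability(scores, weights):
    # weighted sum of criterion scores
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(sites, reverse=True,
                key=lambda s: suitability(sites[s], criteria_weights))
```

In a real GIS workflow the same weighted sum is applied per raster cell rather than per named site, producing a suitability surface from which candidate locations are extracted.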

  12. Evaluation of Dairy Effluent Management Options Using Multiple Criteria Analysis

    NASA Astrophysics Data System (ADS)

    Hajkowicz, Stefan A.; Wheeler, Sarah A.

    2008-04-01

    This article describes how options for managing dairy effluent on the Lower Murray River in South Australia were evaluated using multiple criteria analysis (MCA). Multiple criteria analysis is a framework for combining multiple environmental, social, and economic objectives in policy decisions. At the time of the study, dairy irrigation in the region was based on flood irrigation which involved returning effluent to the river. The returned water contained nutrients, salts, and microbial contaminants leading to environmental, human health, and tourism impacts. In this study MCA was used to evaluate 11 options against 6 criteria for managing dairy effluent problems. Of the 11 options, the MCA model selected partial rehabilitation of dairy paddocks with the conversion of remaining land to other agriculture. Soon after, the South Australian Government adopted this course of action and is now providing incentives for dairy farmers in the region to upgrade irrigation infrastructure and/or enter alternative industries.

  14. Oncological image analysis.

    PubMed

    Brady, Sir Michael; Highnam, Ralph; Irving, Benjamin; Schnabel, Julia A

    2016-10-01

Cancer is one of the world's major healthcare challenges and, as such, an important application of medical image analysis. After a brief introduction to cancer, we summarise some of the major developments in oncological image analysis over the past 20 years, concentrating on those from the authors' laboratories, and then outline opportunities and challenges for the next decade.

  15. 10 CFR 434.607 - Life cycle cost analysis criteria.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... HIGH RISE RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.607 Life cycle cost... subpart A of 10 CFR part 436. When performing optional life cycle cost analyses of energy conservation... 10 Energy 3 2014-01-01 2014-01-01 false Life cycle cost analysis criteria. 434.607 Section...

  16. 10 CFR 434.607 - Life cycle cost analysis criteria.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... HIGH RISE RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.607 Life cycle cost... subpart A of 10 CFR part 436. When performing optional life cycle cost analyses of energy conservation... 10 Energy 3 2013-01-01 2013-01-01 false Life cycle cost analysis criteria. 434.607 Section...

  17. 10 CFR 434.607 - Life cycle cost analysis criteria.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... HIGH RISE RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.607 Life cycle cost... subpart A of 10 CFR part 436. When performing optional life cycle cost analyses of energy conservation... 10 Energy 3 2012-01-01 2012-01-01 false Life cycle cost analysis criteria. 434.607 Section...

  18. Improvement and Extension of Shape Evaluation Criteria in Multi-Scale Image Segmentation

    NASA Astrophysics Data System (ADS)

    Sakamoto, M.; Honda, Y.; Kondo, A.

    2016-06-01

Over the last decade, multi-scale image segmentation has attracted particular interest and is used in practice for object-based image analysis. In this study, we address issues in multi-scale image segmentation, particularly improving the validity of merging and the variety of derived region shapes. First, we introduce constraints on the application of the spectral criterion that suppress excessive merging between dissimilar regions. Second, we extend the evaluation of the smoothness criterion by modifying the definition of the extent of an object, which controls shape diversity. Third, we develop a new shape criterion, the aspect ratio. This criterion improves the reproducibility of object shapes so that they better match the actual objects of interest: it constrains the aspect ratio of an object's bounding box while keeping the properties controlled by conventional shape criteria. These improvements and extensions lead to more accurate, flexible, and diverse segmentation results according to the shape characteristics of the target of interest. Furthermore, we also investigated a technique for quantitative and automatic parameterization in multi-scale image segmentation. It compares the segmentation result with a training area specified in advance, either maximizing the average area of the derived objects or satisfying the F-measure evaluation index. This makes it possible to automate parameterization suited to the objectives, especially with respect to shape reproducibility.
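
The bounding-box aspect-ratio measure described above can be sketched for a region given as a boolean mask; the mask, function name, and example region are illustrative, not the authors' implementation:

```python
import numpy as np

# Hypothetical sketch of a bounding-box aspect-ratio measure for a
# segmented region supplied as a 2-D boolean mask.

def bbox_aspect_ratio(mask):
    rows = np.any(mask, axis=1)            # rows containing the region
    cols = np.any(mask, axis=0)            # columns containing the region
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    height, width = r1 - r0 + 1, c1 - c0 + 1
    return max(height, width) / min(height, width)

mask = np.zeros((20, 20), dtype=bool)
mask[5:8, 2:14] = True                     # a 3 x 12 elongated region
ar = bbox_aspect_ratio(mask)
```

A segmentation criterion of this kind would penalize merges that push a region's aspect ratio outside a range expected for the targets of interest.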

  19. Improving diagnostic criteria for Propionibacterium acnes osteomyelitis: a retrospective analysis.

    PubMed

    Asseray, Nathalie; Papin, Christophe; Touchais, Sophie; Bemer, Pascale; Lambert, Chantal; Boutoille, David; Tequi, Brigitte; Gouin, François; Raffi, François; Passuti, Norbert; Potel, Gilles

    2010-07-01

The identification of Propionibacterium acnes in cultures of bone and joint samples is always difficult to interpret because of the ubiquity of this microorganism. The aim of this study was to propose a diagnostic strategy to distinguish infections from contaminations. This was a retrospective analysis of the charts of all patients with ≥1 deep sample culture-positive for P. acnes. Every criterion was tested for sensitivity, specificity, and positive likelihood ratio, and then the diagnostic probability of combinations of criteria was calculated. Among 65 patients, 52 (80%) were considered truly infected with P. acnes, a diagnosis based on a multidisciplinary process. The most valuable diagnostic criteria were: ≥2 positive deep samples, peri-operative findings (necrosis, hardware loosening, etc.), and ≥2 surgical procedures. However, no single criterion was sufficient to ascertain the diagnosis. The following combinations of criteria had a diagnostic probability of >90%: ≥2 positive cultures + 1 criterion among: peri-operative findings, local signs of infection, ≥2 previous operations, orthopaedic devices; 1 positive culture + 3 criteria among: peri-operative findings, local signs of infection, ≥2 previous surgical operations, orthopaedic devices, inflammatory syndrome. The diagnosis of P. acnes osteomyelitis was greatly improved by combining different criteria, allowing differentiation between infection and contamination. PMID:20141491
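
The per-criterion metrics named above follow directly from a 2×2 table against the reference diagnosis. A hedged sketch with hypothetical counts (not the study's data):

```python
# Hypothetical sketch: sensitivity, specificity, and positive likelihood
# ratio for one diagnostic criterion, from counts against the reference
# diagnosis. The counts are invented for illustration.

def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)               # true positive rate
    specificity = tn / (tn + fp)               # true negative rate
    lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
    return sensitivity, specificity, lr_pos

sens, spec, lr = diagnostic_metrics(tp=40, fp=3, fn=12, tn=10)
```

The positive likelihood ratio is what lets separate criteria be combined: each criterion multiplies the pre-test odds of infection, which is the reasoning behind requiring more weak criteria when fewer strong ones are present.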

  1. Resolution criteria in double-slit microscopic imaging experiments

    NASA Astrophysics Data System (ADS)

    You, Shangting; Kuang, Cuifang; Zhang, Baile

    2016-09-01

Double-slit imaging is widely used to verify the resolution of high-resolution and super-resolution microscopies. However, due to fabrication limits, the slit width is generally non-negligible, which can affect the claimed resolution. In this paper we theoretically calculate the electromagnetic field distribution inside and near a metallic double slit using the waveguide mode expansion method, and obtain the far-field image by vectorial Fourier optics. We find that the slit width has minimal influence when the illuminating light is polarized parallel to the slits. In this case, the claimed resolution should be based on the center-to-center distance of the double slit. PMID:27640808
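
A toy illustration of why the center-to-center distance is the natural resolution measure: model the incoherent image of two slits as the sum of two line-spread functions and test for an intensity dip between them. This is a generic two-line resolution check, not the paper's vectorial calculation; the Gaussian PSF shape, width, and separations are assumptions.

```python
import numpy as np

# Hedged sketch: two lines separated by a center-to-center distance,
# each blurred by an assumed Gaussian line-spread function. The pair is
# deemed "resolved" if the image dips at the midpoint.

def resolved(separation, psf_fwhm, n=4001, span=10.0):
    x = np.linspace(-span, span, n)
    sigma = psf_fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    img = (np.exp(-(x - separation / 2) ** 2 / (2 * sigma**2)) +
           np.exp(-(x + separation / 2) ** 2 / (2 * sigma**2)))
    midpoint = img[n // 2]               # intensity midway between the lines
    return bool(midpoint < img.max())    # dip at the midpoint => resolved

# With a PSF FWHM of 1.0, a center-to-center separation of 1.5 shows a
# dip, while 0.5 does not.
```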

  4. Description, Recognition and Analysis of Biological Images

    SciTech Connect

    Yu Donggang; Jin, Jesse S.; Luo Suhuai; Pham, Tuan D.; Lai Wei

    2010-01-25

The description, recognition, and analysis of biological images play an important role in describing and understanding biological information. Color images are first simplified by color reduction. A new and efficient linearization algorithm is introduced, based on criteria of the difference chain code. A series of critical points is obtained from the linearized lines. The curvature angle, linearity, maximum linearity, convexity, concavity, and bend angle of the linearized lines are calculated from the starting line to the end line along all smoothed contours. The method can be used for shape description and recognition. The analysis, decision, and classification of biological images are based on the description of morphological structures, color information, and prior knowledge, which are associated with each other. The efficiency of the algorithms is demonstrated in two applications: the description, recognition, and analysis of color flower images, and the dynamic description, recognition, and analysis of cell-cycle images.
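
The chain-code idea underlying the linearization step can be sketched as follows; the direction convention, contour points, and use of the modulo-8 difference are illustrative assumptions, not the authors' algorithm:

```python
# Hypothetical sketch: Freeman chain coding of an 8-connected contour
# and its difference chain code. Zero differences mark straight runs,
# which a linearization pass could merge into line segments.

DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(points):
    # direction code for each step between consecutive contour points
    return [DIRS[(x1 - x0, y1 - y0)]
            for (x0, y0), (x1, y1) in zip(points, points[1:])]

def difference_code(codes):
    # modulo-8 differences between consecutive direction codes
    return [(b - a) % 8 for a, b in zip(codes, codes[1:])]

# A small closed square contour, traversed point by point.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2),
          (1, 2), (0, 2), (0, 1), (0, 0)]
cc = chain_code(square)       # direction changes only at the corners
dc = difference_code(cc)      # nonzero entries flag the corners
```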

  5. Development of Advanced Imaging Criteria for the Endoscopic Identification of Inflammatory Polyps

    PubMed Central

    Sussman, Daniel A; Barkin, Jodie A; Martin, Aileen M; Varma, Tanya; Clarke, Jennifer; Quintero, Maria A; Barkin, Heather B; Deshpande, Amar R; Barkin, Jamie S; Abreu, Maria T

    2015-01-01

OBJECTIVES: Inflammatory polyps (IPs) are frequently encountered at colonoscopy in inflammatory bowel disease (IBD) patients and are associated with an increased risk of colon cancer. The aim of this prospective endoscopic image review and analysis was to describe the endoscopic features of IPs in IBD patients at surveillance colonoscopy and to determine the ability to endoscopically discern IPs from other colon polyps using high-definition white light (WL), narrow band imaging with magnification (NBI), and chromoendoscopy (CE). METHODS: Digital images of IPs using WL, NBI, and CE were reviewed by four attending gastroenterologists using a two-round modified Delphi method. The ability to endoscopically discern IPs from other colon polyps was determined among groups of gastroenterology fellows and attendings. IPs were classified by gross appearance, contour, surface pattern, pit pattern, and appearance of the surrounding mucosa, and the accuracy of diagnosis was assessed. RESULTS: Features characteristic of IPs included a fibrinous cap, surface friability and ulceration, an appendage-like appearance, the halo sign with CE, and a clustering of a multiplicity of IPs. The overall diagnostic accuracy for IP identification was 63% for WL, 42% for NBI, and 64% for CE. High degrees of histologic inflammation significantly improved the accuracy of diagnosis of IPs with WL and CE, whereas the use of NBI significantly impaired accuracy. CONCLUSIONS: The overall diagnostic accuracy when applying these criteria to clinical images was modest, with incremental benefit from adding CE to WL. CE showed promise in predicting IP histology in actively inflamed tissue. Institutional Review Board approval was obtained. ClinicalTrials.gov Identifier: NCT01557387. PMID:26583503

  6. Image Retrieval: Theoretical Analysis and Empirical User Studies on Accessing Information in Images.

    ERIC Educational Resources Information Center

    Ornager, Susanne

    1997-01-01

    Discusses indexing and retrieval for effective searches of digitized images. Reports on an empirical study about criteria for analysis and indexing digitized images, and the different types of user queries done in newspaper image archives in Denmark. Concludes that it is necessary that the indexing represent both a factual and an expressional…

  7. Preliminary radiation criteria and nuclear analysis for ETF

    SciTech Connect

    Engholm, B.A.

    1980-09-01

Preliminary biological and materials radiation dose criteria for the Engineering Test Facility are described and tabulated. In keeping with the ETF Mission Statement, a key biological dose criterion is a 24-hour shutdown dose rate of 2 mrem/hr on the surface of the outboard bulk shield. Materials dose criteria, which primarily govern the inboard shield design, include a 10^9 rad exposure limit for epoxy insulation, 3 x 10^-4 dpa damage to the TF coil copper stabilizer, and a total nuclear heating rate of 5 kW in the inboard TF coils. Nuclear analysis performed during FY 80 was directed primarily at the inboard and outboard bulk shielding, and at radiation streaming in the neutral beam drift ducts. Inboard and outboard shield thicknesses to achieve the biological and materials radiation criteria are 75 cm inboard and 125 cm outboard, the configuration consisting of alternating layers of stainless steel and borated water. The outboard shield also includes a 5 cm layer of lead. NBI duct streaming analyses performed by ORNL and LASL will play a key role in the design of the duct and NBI shielding in FY 81. The NBI aluminum cryopanel nuclear heating rate during the heating cycle is about 1 milliwatt/cm^3, which is far less than the permissible limit.

  8. Image Analysis of Foods.

    PubMed

    Russ, John C

    2015-09-01

The structure of foods, both natural and processed, is controlled by many variables ranging from biology to chemistry and mechanical forces. The structure in turn controls many of the properties of the food, including consumer acceptance (taste, mouthfeel, appearance, and so on) and nutrition. Imaging provides an important tool for measuring the structure of foods. This includes 2-dimensional (2D) images of surfaces and sections, for example as viewed in a microscope, as well as 3-dimensional (3D) images of internal structure as may be produced by confocal microscopy, computed tomography, or magnetic resonance imaging. The use of images also guides robotics for harvesting and sorting. Processing of images may be needed to calibrate colors, reduce noise, enhance detail, and delineate structure and dimensions. Measurement of structural information such as volume fraction and internal surface areas, as well as the analysis of object size, location, and shape in both 2- and 3-dimensional images, is illustrated and described, with primary references and examples from a wide range of applications. PMID:26270611
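One of the measurements mentioned above, volume fraction, reduces to voxel counting once a 3D image has been segmented. A minimal sketch on a hypothetical toy volume (not data from the article):

```python
def volume_fraction(volume):
    """Volume fraction of a phase = voxels in the phase / total voxels,
    given a segmented 3D image as nested lists of 0/1 (or bool) values."""
    total = in_phase = 0
    for plane in volume:
        for row in plane:
            for voxel in row:
                total += 1
                in_phase += 1 if voxel else 0
    return in_phase / total

# Toy 20x20x20 binary volume with an 8x8x8 inclusion of the phase of interest.
N = 20
vol = [[[1 if 6 <= i < 14 and 6 <= j < 14 and 6 <= k < 14 else 0
         for k in range(N)] for j in range(N)] for i in range(N)]
print(volume_fraction(vol))  # 512 / 8000 = 0.064
```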

  9. Reliability and Diagnostic Performance of CT Imaging Criteria in the Diagnosis of Tuberculous Meningitis

    PubMed Central

    Botha, Hugo; Ackerman, Christelle; Candy, Sally; Carr, Jonathan A.; Griffith-Richards, Stephanie; Bateman, Kathleen J.

    2012-01-01

Introduction Abnormalities on CT imaging may contribute to the diagnosis of tuberculous meningitis (TBM). Recently, an expert consensus case definition (CCD) and set of imaging criteria for diagnosing basal meningeal enhancement (BME) have been proposed. This study aimed to evaluate the sensitivity, specificity and reliability of these in a prospective cohort of adult meningitis patients. Methods Initial diagnoses were based on the CCD, classifying patients into: ‘Definite TBM’ (microbiological confirmation), ‘Probable TBM’ (diagnostic score ≥10), ‘Possible TBM’ (diagnostic score 6–9), ‘Not TBM’ (confirmation of an alternative diagnosis) or ‘Uncertain’ (diagnostic score of <6). CT images were evaluated independently on two occasions by four experienced reviewers. Intra-rater and inter-rater agreement were calculated using the kappa statistic. Sensitivities and specificities were calculated using both ‘Definite TBM’ and either ‘Definite TBM’ or ‘Probable TBM’ as gold standards. Results CT scan criteria for BME had good intra-rater agreement (κ range 0.35–0.78) and fair to moderate inter-rater agreement (κ range 0.20–0.52). Intra- and inter-rater agreement on the CCD components was good to fair (κ ranges 0.47–0.81 and 0.21–0.63). Using ‘Definite TBM’ as a gold standard, the criteria for BME were very specific (61.5%–100%), but insensitive (5.9%–29.4%). Similarly, the imaging components of the CCD were highly specific (69.2–100%) but lacked sensitivity (0–56.7%). Similar values were found when using ‘Definite TBM’ or ‘Probable TBM’ as a gold standard. Discussion The fair to moderate inter-rater agreement and poor sensitivities of the criteria for BME suggest that little reliance should be placed on these features in isolation. While the presence of the CCD criteria of acute infarction or tuberculoma(s) appears useful as rule-in criteria, their absence is of little help in excluding TBM.
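The kappa statistic used above (and in the RDC/TMD record in the head-note) quantifies rater agreement beyond chance. A minimal sketch of Cohen's kappa for two raters (the presence/absence ratings below are hypothetical, not study data):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters over the same cases:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(r1) == len(r2)
    n = len(r1)
    categories = sorted(set(r1) | set(r2))
    p_observed = sum(a == b for a, b in zip(r1, r2)) / n
    p_chance = sum((r1.count(c) / n) * (r2.count(c) / n) for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical ratings of basal enhancement (1 = present, 0 = absent):
a = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
print(round(cohens_kappa(a, b), 2))  # -> 0.4
```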

  10. Convex half-quadratic criteria and interacting auxiliary variables for image restoration.

    PubMed

    Idier, J

    2001-01-01

This paper deals with convex half-quadratic criteria and associated minimization algorithms for the purpose of image restoration. It brings a number of original elements within a unified mathematical presentation based on convex duality. First, the Geman and Yang and the Geman and Reynolds constructions are revisited, with a view to establishing the convexity properties of the resulting half-quadratic augmented criteria when the original nonquadratic criterion is already convex. Second, a family of convex Gibbsian energies that incorporate interacting auxiliary variables is revealed as a potentially fruitful extension of the Geman and Reynolds construction.

  11. Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties

    SciTech Connect

    Kujawski, Edouard

    2003-02-01

    The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
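The claim above that different MCDA models can rank the same alternatives differently, even under certainty, is easy to demonstrate. A minimal sketch comparing a weighted-sum and a weighted-product model (scores and weights are invented for illustration, not the Olympic data):

```python
# Same normalized scores and weights fed to two common aggregation models.
weights = {"technical": 0.5, "artistic": 0.5}
scores = {
    "skater_A": {"technical": 9.0, "artistic": 1.0},
    "skater_B": {"technical": 5.0, "artistic": 4.0},
}

def weighted_sum(s):
    """Weighted-sum model (WSM): additive aggregation."""
    return sum(weights[c] * s[c] for c in weights)

def weighted_product(s):
    """Weighted-product model (WPM): multiplicative aggregation."""
    p = 1.0
    for c in weights:
        p *= s[c] ** weights[c]
    return p

rank_sum = max(scores, key=lambda k: weighted_sum(scores[k]))
rank_prod = max(scores, key=lambda k: weighted_product(scores[k]))
print(rank_sum, rank_prod)  # the two models disagree on the winner
```

Here WSM prefers skater_A (5.0 vs 4.5) while WPM prefers skater_B (3.0 vs about 4.47): the multiplicative model penalizes the very low artistic score, which is exactly the kind of model-dependent reversal the paper warns about.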

  12. Dynamic criteria: a longitudinal analysis of professional basketball players' outcomes.

    PubMed

    García-Izquierdo, Antonio León; Ramos-Villagrasa, Pedro José; Navarro, José

    2012-11-01

This paper describes the fluctuations of temporal criteria dynamics in the context of professional sport. Specifically, we try to verify the underlying deterministic patterns in the outcomes of professional basketball players. We use a longitudinal approach based on the analysis of the outcomes of 94 basketball players over ten years, covering practically the players' entire career development. Time series were analyzed with techniques derived from nonlinear dynamical systems theory. These techniques analyze the underlying patterns in outcomes without prior assumptions about their shape (linear or nonlinear), and they can detect an intermediate situation between randomness and determinism, called chaos, which makes them very useful for the study of dynamic criteria in organizations. We found that most players (88.30%) have a deterministic pattern in their outcomes, and that most of these cases are chaotic (81.92%). Players with chaotic patterns have higher outcomes than players with linear patterns. Moreover, players in power forward and center positions achieve better results than other players. The high number of chaotic patterns found suggests caution when appraising individual outcomes, when combining players to design a competitive team, and when making other personnel decisions. Management efforts must be made to assume this uncertainty.

  13. Image analysis library software development

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Bryant, J.

    1977-01-01

    The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.

  14. A Speedy Cardiovascular Diseases Classifier Using Multiple Criteria Decision Analysis

    PubMed Central

    Lee, Wah Ching; Hung, Faan Hei; Tsang, Kim Fung; Tung, Hoi Ching; Lau, Wing Hong; Rakocevic, Veselin; Lai, Loi Lei

    2015-01-01

Each year, some 30 percent of global deaths are caused by cardiovascular diseases. This figure is worsening due to both the increasing elderly population and severe shortages of medical personnel. The development of a cardiovascular diseases classifier (CDC) for auto-diagnosis will help address the problem. Former CDCs did not achieve quick evaluation of cardiovascular diseases. In this letter, a new CDC to achieve speedy detection is investigated. This investigation incorporates analytic hierarchy process (AHP)-based multiple criteria decision analysis (MCDA) to develop feature vectors for a support vector machine. The MCDA facilitates the efficient assignment of appropriate weightings to potential patients, thus scaling down the number of features. Since the new CDC adopts only the most meaningful features for discriminating between healthy persons and cardiovascular disease patients, speedy detection of cardiovascular diseases has been successfully implemented. PMID:25587978
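The AHP step mentioned above derives criterion weights from a pairwise-comparison matrix, conventionally as its normalized principal eigenvector. A minimal sketch via power iteration (the 3x3 matrix is a hypothetical example on Saaty's 1-9 scale, not taken from the letter):

```python
def ahp_weights(M, iters=100):
    """Criterion weights as the normalized principal eigenvector of a
    reciprocal pairwise-comparison matrix, found by power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]  # renormalize so weights sum to 1
    return w

# Hypothetical comparison matrix for three features (reciprocal by construction):
M = [[1,     3,     5],
     [1 / 3, 1,     3],
     [1 / 5, 1 / 3, 1]]
print([round(x, 3) for x in ahp_weights(M)])
```

For this textbook-style matrix the weights converge to roughly 0.637, 0.258, and 0.105; features with small weights are the natural candidates to drop when scaling down the feature set.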

  15. Medical Image Analysis Facility

    NASA Technical Reports Server (NTRS)

    1978-01-01

To improve the quality of photos sent to Earth by unmanned spacecraft, NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical Image Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is the study of the effects of stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical Image Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to diagnosis and treatment of human coronary disease. Other uses of the facility's image processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.

  16. Formulation of image quality prediction criteria for the Viking lander camera

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Jobson, D. J.; Taylor, E. J.; Wall, S. D.

    1973-01-01

Image quality criteria are defined and mathematically formulated for the prediction computer program to be developed for the Viking lander imaging experiment. The general objective of broad-band (black and white) imagery, to resolve small spatial details and slopes, is formulated as the detectability of a right-circular cone with the surface properties of the surrounding terrain. The general objective of narrow-band (color and near-infrared) imagery, to observe spectral characteristics, is formulated as the minimum detectable albedo variation. The general goal of encompassing, but not exceeding, the range of the scene radiance distribution within a single, commandable camera dynamic range setting is also considered.

  17. Analysis of eligibility criteria from ClinicalTrials.gov.

    PubMed

    Doods, Justin; Dugas, Martin; Fritz, Fleur

    2014-01-01

Electronic health care records are being used more and more for patient documentation. This electronic data can be used for secondary purposes, for example through systems that support clinical research. Eligibility criteria have to be processable for such systems to work, but criteria published on ClinicalTrials.gov have been shown to be complex, making them challenging to re-use. We analysed the eligibility criteria on ClinicalTrials.gov using automatic methods to determine whether the criteria definition and number changed over time. From 1998 to 2012 the average number of words used to describe eligibility criteria per year increased by 46%, while the average number of lines used per year increased only slightly until 2000 and stabilized afterwards. Whether the increase in words resulted in increased criteria complexity or whether more data elements are used to describe eligibility needs further investigation. PMID:25160308
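The word- and line-count measures described above are straightforward to automate. A minimal sketch on a made-up criteria snippet (not an actual ClinicalTrials.gov record):

```python
# Toy eligibility-criteria text; the study tracked average words and lines
# per record per year across the whole registry.
criteria = """Inclusion Criteria:
- Age >= 18 years
- Histologically confirmed diagnosis
Exclusion Criteria:
- Pregnancy"""

lines = [ln for ln in criteria.splitlines() if ln.strip()]
n_lines = len(lines)
n_words = sum(len(ln.split()) for ln in lines)
print(n_lines, n_words, round(n_words / n_lines, 1))  # -> 5 15 3.0
```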

  19. Engineering design criteria for an image intensifier/image converter camera

    NASA Technical Reports Server (NTRS)

    Sharpsteen, J. T.; Lund, D. L.; Stoap, L. J.; Solheim, C. D.

    1976-01-01

The design, display, and evaluation of an image intensifier/image converter camera which can be utilized for various requirements of space shuttle experiments are described. An image intensifier tube was utilized in combination with two brassboards as a power supply and used to evaluate night photography in the field. Pictures were obtained showing field details that would have been indistinguishable to the naked eye or to an ordinary camera.

  20. Criteria for High Quality Biology Teaching: An Analysis

    ERIC Educational Resources Information Center

    Tasci, Guntay

    2015-01-01

    This study aims to analyze the process under which biology lessons are taught in terms of teaching quality criteria (TQC). Teaching quality is defined as the properties of efficient teaching and is considered to be the criteria used to measure teaching quality both in general and specific to a field. The data were collected through classroom…

  1. Image based performance analysis of thermal imagers

    NASA Astrophysics Data System (ADS)

    Wegner, D.; Repasi, E.

    2016-05-01

Due to advances in technology, modern thermal imagers resemble sophisticated image processing systems in functionality. Advanced signal and image processing tools enclosed in the camera body extend the basic image capturing capability of thermal cameras, in order to enhance the display presentation of the captured scene or of specific scene details. Usually, the implemented methods are proprietary company expertise, distributed without extensive documentation. This makes the comparison of thermal imagers, especially those from different companies, a difficult task (or at least a very time-consuming and expensive one, e.g. requiring the execution of a field trial and/or an observer trial). For example, a thermal camera equipped with turbulence mitigation capability is such a closed system. The Fraunhofer IOSB has started to build up a system for testing thermal imagers by image based methods in the lab environment. This will extend our capability of measuring the classical IR-system parameters (e.g. MTF, MTDP, etc.) in the lab. The system is set up around the IR scene projector, which is necessary for the thermal display (projection) of an image sequence for the IR camera under test. The same set of thermal test sequences can be presented to every unit under test; for turbulence mitigation tests, this could be, e.g., the same turbulence sequence. During system tests, gradual variation of input parameters (e.g. thermal contrast) can be applied. First ideas on test scene selection and on how to assemble an imaging suite (a set of image sequences) for the analysis of thermal imaging systems containing such black boxes in the image forming path are discussed.

  2. Efficiency of model selection criteria in flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Calenda, G.; Volpi, E.

    2009-04-01

The estimation of high flood quantiles requires the extrapolation of the probability distributions far beyond the usual sample length, involving high estimation uncertainties. The choice of the probability law, traditionally based on hypothesis testing, is critical to this point. In this study the efficiency of different model selection criteria, seldom applied in flood frequency analysis, is investigated. The efficiency of each criterion in identifying the probability distribution of the hydrological extremes is evaluated by numerical simulations for different parent distributions, coefficients of variation and skewness, and sample sizes. The compared model selection procedures are the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), the Anderson-Darling Criterion (ADC), recently discussed by Di Baldassarre et al. (2008), and the Sample Quantile Criterion (SQC), recently proposed by the authors (Calenda et al., 2009). The SQC is based on the principle of maximising the probability density of the elements of the sample that are considered relevant to the problem, and takes into account both the accuracy and the uncertainty of the estimate. Since the stress is mainly on extreme events, the SQC involves upper-tail probabilities, where the effect of the model assumption is more critical. The proposed index is equal to the sum of logarithms of the inverse of the sample probability density of the observed quantiles. The definition of this index is based on the principle that the more centred the sample value is with respect to its density distribution (accuracy of the estimate), and the less spread this distribution is (uncertainty of the estimate), the greater is the probability density of the sample quantile. Thus, lower values of the index indicate a better performance of the distribution law. This criterion can operate the selection of the optimum distribution among competing probability models that are estimated using different samples.
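Information criteria such as AIC trade goodness of fit against parameter count: AIC = 2k - 2 ln L, with the lowest value preferred. A minimal sketch comparing two candidate distributions with closed-form maximum-likelihood fits (a normal and an exponential stand in for the Gumbel/GEV families used in flood practice, and the "annual maxima" sample is synthetic):

```python
import math

def aic_normal(x):
    """AIC for a normal fit; mean and (MLE) variance give k = 2 parameters."""
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n
    loglik = -0.5 * n * (math.log(2 * math.pi * var) + 1)
    return 2 * 2 - 2 * loglik

def aic_exponential(x):
    """AIC for an exponential fit; the MLE rate gives k = 1 parameter."""
    n = len(x)
    lam = n / sum(x)
    loglik = n * math.log(lam) - lam * sum(x)
    return 2 * 1 - 2 * loglik

# Synthetic "annual maximum discharge" sample (illustrative numbers only):
x = [312.0, 287.5, 401.2, 356.8, 298.4, 512.9, 334.1, 379.6, 290.3, 445.0]
print("AIC normal:", round(aic_normal(x), 1),
      "| AIC exponential:", round(aic_exponential(x), 1))
```

For this clustered sample the normal fit yields the lower AIC; BIC differs only in replacing the 2k penalty with k ln n.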

  3. Application of Model-Selection Criteria to Some Problems in Multivariate Analysis.

    ERIC Educational Resources Information Center

    Sclove, Stanley L.

    1987-01-01

    A review of model-selection criteria is presented, suggesting their similarities. Some problems treated by hypothesis tests may be more expeditiously treated by the application of model-selection criteria. Multivariate analysis, cluster analysis, and factor analysis are considered. (Author/GDC)

  4. Image quality criteria for wide-field x-ray imaging applications

    NASA Astrophysics Data System (ADS)

    Thompson, Patrick L.; Harvey, James E.

    1999-10-01

For staring, wide-field applications, such as a solar x-ray imager, the severe off-axis aberrations of the classical Wolter Type-I grazing incidence x-ray telescope design drastically limit the 'resolution' near the solar limb. A specification upon on-axis fractional encircled energy is thus not an appropriate image quality criterion for such wide-angle applications. A more meaningful image quality criterion would be a field-weighted-average measure of 'resolution.' Since surface scattering effects from residual optical fabrication errors are always substantial at these very short wavelengths, the field-weighted-average half-power radius is a far more appropriate measure of aerial resolution. If an ideal mosaic detector array is being used in the focal plane, the finite pixel size provides a practical limit to this system performance. Thus, the total number of aerial resolution elements enclosed by the operational field-of-view, expressed as a percentage of the number of ideal detector pixels, is a further improved image quality criterion. In this paper we describe the development of an image quality criterion for wide-field applications of grazing incidence x-ray telescopes which leads to a new class of grazing incidence designs described in a following companion paper.

  5. Reflections on ultrasound image analysis.

    PubMed

    Alison Noble, J

    2016-10-01

    Ultrasound (US) image analysis has advanced considerably in twenty years. Progress in ultrasound image analysis has always been fundamental to the advancement of image-guided interventions research due to the real-time acquisition capability of ultrasound and this has remained true over the two decades. But in quantitative ultrasound image analysis - which takes US images and turns them into more meaningful clinical information - thinking has perhaps more fundamentally changed. From roots as a poor cousin to Computed Tomography (CT) and Magnetic Resonance (MR) image analysis, both of which have richer anatomical definition and thus were better suited to the earlier eras of medical image analysis which were dominated by model-based methods, ultrasound image analysis has now entered an exciting new era, assisted by advances in machine learning and the growing clinical and commercial interest in employing low-cost portable ultrasound devices outside traditional hospital-based clinical settings. This short article provides a perspective on this change, and highlights some challenges ahead and potential opportunities in ultrasound image analysis which may both have high impact on healthcare delivery worldwide in the future but may also, perhaps, take the subject further away from CT and MR image analysis research with time. PMID:27503078

  7. Environmental criteria in industrial facility siting decisions: An analysis

    NASA Astrophysics Data System (ADS)

    Briassoulis, Helen

    1995-03-01

Environmental criteria are increasingly being employed in industrial facility siting, usually in multicriteria decision contexts, together with technical, socioeconomic and other considerations. This paper analyzes the criteria that have appeared in the published literature with the aim of offering guidance for their selection in a particular facility location problem. A number of alternative classification schemes are presented, based on the most prevalent classification dimensions: the economy-environment relationship, purpose of the criterion, complexity, spatial and temporal scale, and level of measurement. The major scheme adopted draws from the economy-environment relationship and assigns environmental criteria to one of seven categories: general characterizations of the environment, characteristics of individual environmental components, measures of the magnitude and intensity of the activity, measures of the nature and volume of wastes which are produced, characteristics of impacts on separate environmental media and receptors, general characterizations of environmental quality, and impacts on humans. Within each of these categories the criteria are analyzed in terms of the other classification dimensions. Common characteristics among the various criteria as well as future trends in their development are identified. This paper also discusses the most important factors conditioning the choice of criteria in a particular facility siting context and outlines a systematic procedure for their selection in real-world applications.

  8. Analysis of proposed criteria for human response to vibration

    NASA Technical Reports Server (NTRS)

    Janeway, R. N.

    1975-01-01

The development of criteria for human vibration response is reviewed, including the evolution of ISO Standard 2631. The document is analyzed to show why its application to vehicle ride evaluation is strongly opposed. Alternative vertical and horizontal limits for comfort are recommended in the ground vehicle ride frequency range above 1 Hz. These values are derived by correlating the absorbed power findings of Pradko and Lee with other established criteria. Special emphasis is placed on working limits in the frequency range of 1 to 10 Hz, since this is the most significant area in ground vehicle ride evaluation.

  9. Image-guided tumor ablation: standardization of terminology and reporting criteria--a 10-year update.

    PubMed

    Ahmed, Muneeb; Solbiati, Luigi; Brace, Christopher L; Breen, David J; Callstrom, Matthew R; Charboneau, J William; Chen, Min-Hua; Choi, Byung Ihn; de Baère, Thierry; Dodd, Gerald D; Dupuy, Damian E; Gervais, Debra A; Gianfelice, David; Gillams, Alice R; Lee, Fred T; Leen, Edward; Lencioni, Riccardo; Littrup, Peter J; Livraghi, Tito; Lu, David S; McGahan, John P; Meloni, Maria Franca; Nikolic, Boris; Pereira, Philippe L; Liang, Ping; Rhim, Hyunchul; Rose, Steven C; Salem, Riad; Sofocleous, Constantinos T; Solomon, Stephen B; Soulen, Michael C; Tanaka, Masatoshi; Vogl, Thomas J; Wood, Bradford J; Goldberg, S Nahum

    2014-11-01

    Image-guided tumor ablation has become a well-established hallmark of local cancer therapy. The breadth of options available in this growing field increases the need for standardization of terminology and reporting criteria to facilitate effective communication of ideas and appropriate comparison among treatments that use different technologies, such as chemical (eg, ethanol or acetic acid) ablation, thermal therapies (eg, radiofrequency, laser, microwave, focused ultrasound, and cryoablation) and newer ablative modalities such as irreversible electroporation. This updated consensus document provides a framework that will facilitate the clearest communication among investigators regarding ablative technologies. An appropriate vehicle is proposed for reporting the various aspects of image-guided ablation therapy including classification of therapies, procedure terms, descriptors of imaging guidance, and terminology for imaging and pathologic findings. Methods are addressed for standardizing reporting of technique, follow-up, complications, and clinical results. As noted in the original document from 2003, adherence to the recommendations will improve the precision of communications in this field, leading to more accurate comparison of technologies and results, and ultimately to improved patient outcomes. PMID:25442132

  10. Spotlight-8 Image Analysis Software

    NASA Technical Reports Server (NTRS)

    Klimek, Robert; Wright, Ted

    2006-01-01

    Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.

  11. Air Pollution Monitoring Site Selection by Multiple Criteria Decision Analysis

    EPA Science Inventory

    Criteria air pollutants (particulate matter, sulfur dioxide, oxides of nitrogen, volatile organic compounds, and carbon monoxide) as well as toxic air pollutants are a global concern. A particular scenario that is receiving increased attention in the research is the exposure to t...

  12. Oncological image analysis: medical and molecular image analysis

    NASA Astrophysics Data System (ADS)

    Brady, Michael

    2007-03-01

    This paper summarises the work we have been doing on joint projects with GE Healthcare on colorectal and liver cancer, and with Siemens Molecular Imaging on dynamic PET. First, we recall the salient facts about cancer and oncological image analysis. Then we introduce some of the work that we have done on analysing clinical MRI images of colorectal and liver cancer, specifically the detection of lymph nodes and segmentation of the circumferential resection margin. In the second part of the paper, we shift attention to the complementary aspect of molecular image analysis, illustrating our approach with some recent work on: tumour acidosis, tumour hypoxia, and multiply drug resistant tumours.

  13. Mapping tropical dry forest succession using multiple criteria spectral mixture analysis

    NASA Astrophysics Data System (ADS)

    Cao, Sen; Yu, Qiuyan; Sanchez-Azofeifa, Arturo; Feng, Jilu; Rivard, Benoit; Gu, Zhujun

    2015-11-01

Tropical dry forests (TDFs) in the Americas are considered the first frontier of economic development, with less than 1% of their total original coverage under protection. Accordingly, accurate estimates of their spatial extent, fragmentation, and degree of regeneration are critical in evaluating the success of current conservation policies. This study focused on a well-protected secondary TDF in the Santa Rosa National Park (SRNP) Environmental Monitoring Super Site, Guanacaste, Costa Rica. We used spectral signature analysis of TDF ecosystem succession (early, intermediate, and late successional stages), and its intrinsic variability, to propose a new multiple criteria spectral mixture analysis (MCSMA) method on the shortwave infrared (SWIR) bands of a HyMap image. Unlike most existing iterative mixture analysis (IMA) techniques, MCSMA tries to extract and make use of representative endmembers with spectral and spatial information. MCSMA then considers three criteria that influence the comparative importance of different endmember combinations (endmember models): root mean square error (RMSE), spatial distance (SD), and fraction consistency (FC), to create an evaluation framework for selecting a best-fit model. The spectral analysis demonstrated that TDFs have a high spectral variability as a result of biomass variability. By adopting two search strategies, the unmixing results showed that our new MCSMA approach had a better performance in root mean square error (early: 0.160/0.159; intermediate: 0.322/0.321; late: 0.239/0.235), mean absolute error (early: 0.132/0.128; intermediate: 0.254/0.251; late: 0.191/0.188), and systematic error (early: 0.045/0.055; intermediate: -0.211/-0.214; late: 0.161/0.160) compared to multiple endmember spectral mixture analysis (MESMA). This study highlights the importance of the SWIR in differentiating successional stages in TDFs. The proposed MCSMA provides a more flexible and generalized means for the best-fit model determination
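The unmixing step underlying methods like MCSMA can be sketched in its simplest two-endmember, sum-to-one form, with RMSE (one of the three selection criteria named above; the full method also weighs spatial distance and fraction consistency) scoring each candidate endmember model. The 4-band spectra below are invented for illustration:

```python
import math

def unmix_two(pixel, e1, e2):
    """Least-squares fraction f of endmember e1 under a two-endmember,
    sum-to-one linear mixture model, plus the model's RMSE."""
    d = [a - b for a, b in zip(e1, e2)]
    num = sum((p - b) * di for p, b, di in zip(pixel, e2, d))
    den = sum(di * di for di in d)
    f = min(1.0, max(0.0, num / den))  # clamp fraction to [0, 1]
    resid = [p - (f * a + (1 - f) * b) for p, a, b in zip(pixel, e1, e2)]
    rmse = math.sqrt(sum(r * r for r in resid) / len(resid))
    return f, rmse

# Toy 4-band reflectance spectra for two hypothetical endmembers:
early = [0.10, 0.20, 0.30, 0.40]
late  = [0.40, 0.30, 0.20, 0.10]
pixel = [0.25, 0.25, 0.25, 0.25]   # an even mixture of the two
f, rmse = unmix_two(pixel, early, late)
print(round(f, 2), round(rmse, 4))  # -> 0.5 0.0
```

Comparing RMSE across candidate endmember pairs, and keeping the lowest-error model, is the basic model-selection move that MCSMA extends with its additional criteria.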

  14. Benign and Suspicious Ovarian Masses-MR Imaging Criteria for Characterization: Pictorial Review.

    PubMed

    Valentini, A L; Gui, B; Miccò, M; Mingote, M C; De Gaetano, A M; Ninivaggi, V; Bonomo, L

    2012-01-01

    Ovarian masses present a special diagnostic challenge when imaging findings cannot be categorized as benign or malignant. Ultrasonography (US), computed tomography (CT), and magnetic resonance imaging (MRI) are currently used to evaluate ovarian tumors. US is the first-line imaging investigation for suspected adnexal masses, and color Doppler US aids diagnosis by identifying vascularized components within the mass. CT is commonly performed in the preoperative evaluation of a suspected ovarian malignancy, but it exposes patients to radiation. When US findings are nondiagnostic or equivocal, MRI can be a valuable problem-solving tool that also provides information for surgical planning. MRI is well known to provide accurate information about hemorrhage, fat, and collagen, and it can identify the different types of tissue contained in pelvic masses, distinguishing benign from malignant ovarian tumors. Knowledge of the clinical syndromes and MRI features of these conditions is crucial in establishing an accurate diagnosis and determining appropriate treatment. The purpose of this paper is to illustrate MRI findings in neoplastic and non-neoplastic ovarian masses, assessed in three groups: cystic, solid, and solid/cystic lesions. MRI criteria for correct diagnosis and characteristics differentiating benign from malignant conditions are shown in this paper.

  15. Radiologist and automated image analysis

    NASA Astrophysics Data System (ADS)

    Krupinski, Elizabeth A.

    1999-07-01

    Significant advances are being made in the area of automated medical image analysis. Part of the progress is due to general advances in the algorithms used to process images and perform various detection and recognition tasks. A more important reason for this growth in medical image analysis, however, may be quite different: the use of computer workstations, digital image acquisition technologies, and CRT monitors for primary diagnostic reading is becoming more prevalent in radiology departments around the world. With the advance in computer-based displays has come the realization that displaying images on a CRT monitor is not the same as displaying film on a viewbox. There are perceptual, cognitive, and ergonomic issues that must be considered if radiologists are to accept this change in technology and display. The bottom line is that radiologists' performance must be evaluated with these new technologies and image analysis techniques in order to verify that diagnostic performance is at least as good as with film-based displays. The goal of this paper is to address some of the perceptual, cognitive, and ergonomic issues associated with reading radiographic images from digital displays.

  16. A Comparison of Alternatives to Conducting Monte Carlo Analyses for Determining Parallel Analysis Criteria.

    ERIC Educational Resources Information Center

    Lautenschlager, Gary J.

    1989-01-01

    Procedures for implementing parallel analysis (PA) criteria in practice were compared, examining regression equation methods that can be used to estimate random data eigenvalues from known values of the sample size and number of variables. More internally accurate methods for determining PA criteria are presented. (SLD)
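
    The regression-equation shortcuts discussed above approximate what a direct Monte Carlo parallel analysis computes. A minimal sketch of the Monte Carlo version, assuming standard-normal random data and the usual Horn retention rule (retain components whose observed eigenvalues exceed the mean random-data eigenvalues); this is the generic technique, not the paper's specific regression equations:

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Horn's parallel analysis: compare eigenvalues of the observed
    correlation matrix with mean eigenvalues of random data having
    the same sample size and number of variables."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand_eig = np.zeros(p)
    for _ in range(n_iter):
        r = rng.standard_normal((n, p))
        rand_eig += np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    rand_eig /= n_iter
    # count eigenvalues exceeding their random-data counterparts
    return int(np.sum(obs_eig > rand_eig)), obs_eig, rand_eig
```

    The regression-equation methods compared in the paper replace the inner Monte Carlo loop with a prediction of `rand_eig` from the sample size and number of variables.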

  17. Histopathological Image Analysis: A Review

    PubMed Central

    Gurcan, Metin N.; Boucheron, Laura; Can, Ali; Madabhushi, Anant; Rajpoot, Nasir; Yener, Bulent

    2010-01-01

    Over the past decade, dramatic increases in computational power and improvements in image analysis algorithms have allowed the development of powerful computer-assisted analytical approaches to radiological data. With the recent advent of whole-slide digital scanners, tissue histopathology slides can now be digitized and stored in digital image form. Consequently, digitized tissue histopathology has become amenable to the application of computerized image analysis and machine learning techniques. Analogous to the role of computer-assisted diagnosis (CAD) algorithms in medical imaging, which complement the opinion of a radiologist, CAD algorithms have begun to be developed for disease detection, diagnosis, and prognosis prediction to complement the opinion of the pathologist. In this paper, we review the recent state of the art in CAD technology for digitized histopathology. This paper also briefly describes the development and application of novel image analysis technology for a few specific histopathology-related problems being pursued in the United States and Europe. PMID:20671804

  18. Analysis of an interferometric Stokes imaging polarimeter

    NASA Astrophysics Data System (ADS)

    Murali, Sukumar

    Estimation of Stokes vector components from an interferometric fringe-encoded image is a novel way of measuring the State Of Polarization (SOP) distribution across a scene. Imaging polarimeters employing interferometric techniques encode SOP information across a scene in a single image in the form of intensity fringes. The lack of moving parts and the use of a single image eliminate the problems of conventional polarimetry: vibration, spurious signal generation due to artifacts, beam wander, and the need for registration routines. However, interferometric polarimeters are limited by narrow-bandpass, short-exposure operation, which decreases the Signal to Noise Ratio (SNR), defined as the ratio of the mean photon count to the standard deviation in the detected image. A simulation environment for designing an Interferometric Stokes Imaging Polarimeter (ISIP) and a detector with noise effects is created and presented. Users of this environment can image an object with a defined SOP through an ISIP onto a detector, producing a digitized image output. The simulation also includes bandpass imaging capabilities, control of detector noise, and object brightness levels. The Stokes images are estimated from a fringe-encoded image of a scene by means of a reconstructor algorithm. A spatial-domain methodology involving the idea of a unit cell and a slide approach is applied to the reconstructor model developed using Mueller calculus. The validation of this methodology and its effectiveness compared to a discrete approach is demonstrated with suitable examples. The pixel size required to sample the fringes and the minimum unit cell size required for reconstruction are investigated using condition numbers. The importance of the PSF of the fore-optics (telescope) used in imaging the object is investigated and analyzed using a point-source imaging example, and a Nyquist criterion is presented. Reconstruction of fringe-modulated images in the presence of noise involves choosing an

  19. Flightspeed Integral Image Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image. This integral frame facilitates a wide range of fast image-processing functions. The toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational restraints. The software provides an order-of-magnitude speed increase over alternative software libraries currently in use by the research community. Commercially, FIIAT can support intelligent video cameras used in surveillance, and it is also useful for object recognition by robots or other autonomous vehicles.
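
    The integral-image (summed-area table) structure that FIIAT exploits can be sketched as follows; this illustrates the generic technique, not FIIAT's C implementation:

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero border:
    ii[r, c] = sum of img[:r, :c]."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] in O(1) via four table lookups."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]
```

    After one O(N) pass to build the table, any rectangular sum costs four lookups, which is what makes fast box filtering and texture descriptors feasible on constrained hardware.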

  20. Psychogenic skin excoriations: diagnostic criteria, semiological analysis and psychiatric profiles.

    PubMed

    Misery, Laurent; Chastaing, Myriam; Touboul, Sylviane; Callot, Valérie; Schollhammer, Martine; Young, Paul; Feton-Danou, Nathalie; Dutray, Sabine

    2012-07-01

    Psychogenic excoriations are also called neurotic excoriations, dermatillomania, or skin picking syndrome. We proposed diagnostic criteria and then performed a study of the psychiatric profiles of outpatients with psychogenic excoriations and the circumstances surrounding the creation of these excoriations. Although the results must be interpreted with caution because the study included only 10 patients, it provides interesting data about the onset of psychogenic excoriations, the picking behaviour, and comorbidity. Common or specific characteristics were identified according to type of case. The majority of patients associated their first excoriations with personal problems. Four patients reported abuse in childhood or adolescence. This study confirms that skin picking is an impulsive reaction and does not belong to the obsessive-compulsive disorders: impulsivity is defined by ineffective or failing control resulting in uninhibited behaviour.

  1. Image Analysis in Surgical Pathology.

    PubMed

    Lloyd, Mark C; Monaco, James P; Bui, Marilyn M

    2016-06-01

    Digitization of glass slides of surgical pathology samples facilitates a number of value-added capabilities beyond what a pathologist could previously do with a microscope. Image analysis is one of the most fundamental opportunities to leverage the advantages that digital pathology provides. The ability to quantify aspects of a digital image is an extraordinary opportunity to collect data with exquisite accuracy and reliability. In this review, we describe the history of image analysis in pathology and the present state of the technology and processes, as well as examples of research and clinical use. PMID:27241112

  2. Pancreatoduodenectomy for chronic pancreatitis: anatomic selection criteria and subsequent long-term outcome analysis.

    PubMed Central

    Traverso, L W; Kozarek, R A

    1997-01-01

    OBJECTIVE: The authors sought to provide a framework through outcome analysis to evaluate operations directed toward the intractable abdominal pain of severe chronic pancreatitis centered in the pancreatic head. Pancreatoduodenectomy (PD) was used as an example. SUMMARY BACKGROUND DATA: Head resection for severe chronic pancreatitis is the treatment of choice for a ductal system in the head obliterated by severe disease when associated with intractable abdominal pain. To evaluate the effectiveness of promising head resection substitutes for PD, a framework is necessary to provide a reference standard (i.e., an outcome analysis) of PD. METHODS: Inclusion criteria were severe chronic pancreatitis centered in the pancreatic head, intractable abdominal pain, and a main pancreatic duct obstruction or stricture resulting in absent drainage into the duodenum from the uncinate process and adjacent pancreatic head areas or the entire gland. Since 1986, 57 consecutive cases with these criteria underwent PD (47 head only and 10 total pancreatectomy). Clinical and anatomic predictor variables were derived from the history, imaging studies, and pathologic examination. These variables then were tested for association with the following outcome events gathered during annual follow-up: pain relief, onset of diabetes, body weight maintenance, and peptic ulceration. RESULTS: Operative mortality was zero. In 57 patients with a mean follow-up of 42 months, the 5-year outcome event for survival was 93% and the onset of diabetes was 32%. All new cases of diabetes occurred more than 1 year after resection. In 43 cases ≥ 1 year postoperative with a mean follow-up of 55 months, all patients indicated significant pain relief and 76% were pain free. Pain relief was more common in patients with diabetes or in those patients with a pancreatic duct disruption. Death was more common in patients with diabetes. Weight maintenance was more common if preoperatively severe ductal changes were not

  3. Image analysis for DNA sequencing

    NASA Astrophysics Data System (ADS)

    Palaniappan, Kannappan; Huang, Thomas S.

    1991-07-01

    There is a great deal of interest in automating the process of DNA (deoxyribonucleic acid) sequencing to support the analysis of genomic DNA, such as in the Human and Mouse Genome projects. In one class of gel-based sequencing protocols, autoradiograph images are generated in the final step and usually require manual interpretation to reconstruct the DNA sequence represented by the image. The need to handle a large volume of sequence information necessitates automation of the manual autoradiograph reading step through image analysis, in order to reduce the time required to obtain sequence data and to reduce transcription errors. Various adaptive image enhancement, segmentation, and alignment methods were applied to autoradiograph images. The methods are adaptive to the local characteristics of the image, such as noise, background signal, or the presence of edges. Once the two-dimensional data is converted to a set of aligned one-dimensional profiles, waveform analysis is used to determine the location of each band, which represents one nucleotide in the sequence. Different classification strategies, including a rule-based approach, are investigated to map the profile signals, augmented with the original two-dimensional image data as necessary, to textual DNA sequence information.

  4. Decerns: A framework for multi-criteria decision analysis

    DOE PAGES

    Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; Sullivan, Terry

    2015-02-27

    A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical risk-management problems is introduced. The Decerns framework contains a library of modules that form the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods as well as original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on the analysis of a multicriteria location problem.

  5. Basic image analysis and manipulation in ImageJ.

    PubMed

    Hartig, Sean M

    2013-01-01

    Image analysis methods have been developed to provide quantitative assessment of microscopy data. In this unit, basic aspects of image analysis are outlined, including software installation, data import, image processing functions, and analytical tools that can be used to extract information from microscopy data using ImageJ. Step-by-step protocols for analyzing objects in a fluorescence image and extracting information from two-color tissue images collected by bright-field microscopy are included.
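
    The kind of object analysis performed in ImageJ (threshold, then count and measure particles) can be sketched outside ImageJ as well; the function below is a hypothetical stand-in using scipy, not an ImageJ macro or the unit's protocol:

```python
import numpy as np
from scipy import ndimage

def count_objects(image, threshold):
    """Threshold a greyscale image and count connected bright objects,
    reporting each object's pixel area (cf. ImageJ's Analyze Particles)."""
    mask = image > threshold
    labels, n = ndimage.label(mask)  # connected-component labelling
    areas = ndimage.sum(mask, labels, index=list(range(1, n + 1)))
    return n, areas
```

    ImageJ performs the equivalent steps interactively via Image > Adjust > Threshold followed by Analyze > Analyze Particles.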

  6. Errors from Image Analysis

    SciTech Connect

    Wood, William Monford

    2015-02-23

    We present a systematic study of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, and make suggestions for improving the methodology of obtaining quantitative information from radiographed objects.

  7. Automatic analysis of macroarrays images.

    PubMed

    Caridade, C R; Marcal, A S; Mendonca, T; Albuquerque, P; Mendes, M V; Tavares, F

    2010-01-01

    The analysis of dot blot (macroarray) images is currently based on the human identification of positive/negative dots, which is a subjective and time consuming process. This paper presents a system for the automatic analysis of dot blot images, using a pre-defined grid of markers, including a number of ON and OFF controls. The geometric deformations of the input image are corrected, and the individual markers detected, both tasks fully automatically. Based on a previous training stage, the probability for each marker to be ON is established. This information is provided together with quality parameters for training, noise and classification, allowing for a fully automatic evaluation of a dot blot image. PMID:21097139

  8. Evaluation of expert criteria for preoperative magnetic resonance imaging of newly diagnosed breast cancer.

    PubMed

    Behrendt, Carolyn E; Tumyan, Lusine; Gonser, Laura; Shaw, Sara L; Vora, Lalit; Paz, I Benjamin; Ellenhorn, Joshua D I; Yim, John H

    2014-08-01

    Despite 2 randomized trials reporting no reduction in operations or local recurrence at 1 year, preoperative magnetic resonance imaging (MRI) is increasingly used in the diagnostic workup of breast cancer. We evaluated 5 utilization criteria recently proposed by experts. Of women (n = 340) newly diagnosed with unilateral breast cancer who underwent bilateral MRI, most (69.4%) met at least 1 criterion before MRI: mammographic density (44.4%), under consideration for partial breast irradiation (PBI) (19.7%), genetic-familial risk (12.9%), invasive lobular carcinoma (11.8%), and multifocal/multicentric disease (10.6%). MRI detected an occult malignant lesion or an extension of the index lesion in 21.2% of index breasts and 3.3% of contralateral breasts. No expert criterion was associated with MRI-detected malignant lesions, which were associated instead with a pre-MRI plan of lumpectomy without PBI (48.2% of subjects): Odds Ratio 3.05, 95% CI 1.57-5.91 (p = 0.007, adjusted for multiple hypothesis testing, index-vs-contralateral breast, and covariates). The expert guidelines were not confirmed by clinical evidence.

  9. Multispectral Imaging Broadens Cellular Analysis

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Amnis Corporation, a Seattle-based biotechnology company, developed ImageStream to produce sensitive fluorescence images of cells in flow. The company responded to an SBIR solicitation from Ames Research Center, and proposed to evaluate several methods of extending the depth of field for its ImageStream system and implement the best as an upgrade to its commercial products. This would allow users to view whole cells at the same time, rather than just one section of each cell. Through Phase I and II SBIR contracts, Ames provided Amnis the funding the company needed to develop this extended functionality. For NASA, the resulting high-speed image flow cytometry process made its way into Medusa, a life-detection instrument built to collect, store, and analyze sample organisms from erupting hydrothermal vents, and has the potential to benefit space flight health monitoring. On the commercial end, Amnis has implemented the process in ImageStream, combining high-resolution microscopy and flow cytometry in a single instrument, giving researchers the power to conduct quantitative analyses of individual cells and cell populations at the same time, in the same experiment. ImageStream is also built for many other applications, including cell signaling and pathway analysis; classification and characterization of peripheral blood mononuclear cell populations; quantitative morphology; apoptosis (cell death) assays; gene expression analysis; analysis of cell conjugates; molecular distribution; and receptor mapping and distribution.

  10. Development on inelastic analysis acceptance criteria for radioactive material transportation packages

    SciTech Connect

    Ammerman, D.J.; Ludwigsen, J.S.

    1995-12-31

    The response of radioactive material transportation packages to mechanical accident loadings can be more accurately characterized by non-linear dynamic analysis than by the "equivalent dynamic" static elastic analysis typically used in the design of these packages. This more accurate characterization of the response can lead to improved package safety and design efficiency. For non-linear dynamic analysis to become the preferred method of package design analysis, an acceptance criterion must be established that achieves a level of safety equivalent to the currently used criterion defined in NRC Regulatory Guide 7.6 (NRC 1978). Sandia National Laboratories has been conducting a study of possible acceptance criteria to meet this requirement. In this paper, non-linear dynamic analysis acceptance criteria based on stress, strain, and strain-energy density are discussed. An example package design is compared under each of the design criteria, including the approach of NRC Regulatory Guide 7.6.

  11. Family-based association analysis of alcohol dependence criteria and severity

    PubMed Central

    Wetherill, Leah; Kapoor, Manav; Agrawal, Arpana; Bucholz, Kathleen; Koller, Daniel; Bertelsen, Sarah E.; Le, Nhung; Wang, Jen-Chyong; Almasy, Laura; Hesselbrock, Victor; Kramer, John; Nurnberger, John I.; Schuckit, Marc; Tischfield, Jay A.; Xuei, Xiaoling; Porjesz, Bernice; Edenberg, Howard J.; Goate, Alison M.; Foroud, Tatiana

    2013-01-01

    Background: Despite the high heritability of alcohol dependence (AD), the genes found to be associated with it account for only a small proportion of its total variability. The goal of this study was to identify and analyze phenotypes based on homogeneous classes of individuals to increase the power to detect genetic risk factors contributing to the risk of AD. Methods: The 7 individual DSM-IV criteria for AD were analyzed using latent class analysis (LCA) to identify classes defined by the pattern of endorsement of the criteria. A genome-wide association study was performed in 118 extended European American families (n = 2,322 individuals) densely affected with AD to identify genes associated with AD, with each of the seven DSM-IV criteria, and with the probability of belonging to two of three latent classes. Results: Heritability for DSM-IV AD was 61%, and ranged from 17% to 60% for the other phenotypes. A SNP in the olfactory receptor OR51L1 was significantly associated (p = 7.3 × 10−8) with the DSM-IV criterion of persistent desire to, or inability to, cut down on drinking. LCA revealed a three-class model: the “low risk” class (50%) rarely endorsed any criteria, and none met criteria for AD; the “moderate risk” class (33%) endorsed primarily 4 DSM-IV criteria, and 48% met criteria for AD; the “high risk” class (17%) manifested high endorsement probabilities for most criteria, and nearly all (99%) met criteria for AD. One single nucleotide polymorphism (SNP) in a sodium leak channel, NALCN, demonstrated genome-wide significance with the high risk class (p = 4.1 × 10−8). Analyses in an independent sample did not replicate these associations. Conclusion: We explored the genetic contribution to several phenotypes derived from the DSM-IV alcohol dependence criteria. The strongest evidence of association was with SNPs in NALCN and OR51L1. PMID:24015780

  12. A Comparative Investigation of Rotation Criteria within Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Sass, Daniel A.; Schmitt, Thomas A.

    2010-01-01

    Exploratory factor analysis (EFA) is a commonly used statistical technique for examining the relationships between variables (e.g., items) and the factors (e.g., latent traits) they depict. There are several decisions that must be made when using EFA, with one of the more important being choice of the rotation criterion. This selection can be…

  13. Data selection criteria in star-based monitoring of GOES imager visible-channel responsivities

    NASA Astrophysics Data System (ADS)

    Chang, I.-Lok; Crosby, David; Dean, Charles; Weinreb, Michael; Baltimore, Perry; Baucom, Jeanette; Han, Dejiang

    2004-10-01

    Monitoring the responsivities of the visible channels of the operational Geostationary Operational Environmental Satellites (GOES) is an on-going effort at NOAA. Various techniques are being used. In this paper we describe the technique based on the analysis of star signals that are used in the GOES Orbit and Attitude Tracking System (OATS) for satellite attitude and orbit determination. Time series of OATS star observations give information on the degradation of the detectors of a visible channel. Investigations of star data from the past three years have led to several modifications of the method we initially used to calculate the exponential degradation coefficient of a star-signal time series. First we observed that different patterns of detector output versus time result when star images drift across the detector array along different trajectories. We found that certain trajectories should be rejected in the data analysis. We found also that some detector-dependent weighting coefficients used in the OATS analysis tend to scatter the star signals measured by different detectors. We present a set of modifications to our star monitoring algorithms for resolving such problems. Other simple enhancements on the algorithms will also be described. With these modifications, the time series of the star signals show less scatter. This allows for more confidence in the estimated degradation rates and a more realistic statistical analysis on the extent of uncertainty in those rates. The resulting time series and estimated degradation rates for the visible channels of GOES-8 and GOES-10 Imagers will be presented.
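
    An exponential degradation coefficient of the kind estimated from a star-signal time series can be obtained by log-linear regression; a minimal sketch, assuming a clean signal model s(t) = s0·exp(-k·t) rather than the OATS processing chain:

```python
import numpy as np

def fit_degradation(t, signal):
    """Fit signal(t) = s0 * exp(-k * t) by linear regression on
    log(signal); returns (s0, k), where k is the degradation rate."""
    slope, intercept = np.polyfit(t, np.log(signal), 1)
    return np.exp(intercept), -slope
```

    The scatter-reduction steps described above (rejecting certain star trajectories, correcting detector weighting) matter precisely because noise in `signal` propagates into the estimated rate k and its confidence interval.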

  14. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary (executable file together with necessary files for the LabVIEW Run-time engine). Operating systems or monitors under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
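
    The histogram statistics listed above (mean, variance, skewness, kurtosis, median, with optional thresholding) can be sketched as follows; conventions such as excess versus raw kurtosis are assumptions here and may differ from the LabVIEW program:

```python
import numpy as np

def histogram_stats(grey, lo=None, hi=None):
    """Brightness statistics of a greyscale image, optionally restricted
    to values within [lo, hi] (e.g. to exclude a constant background)."""
    v = np.asarray(grey, dtype=float).ravel()
    if lo is not None:
        v = v[v >= lo]
    if hi is not None:
        v = v[v <= hi]
    mean = v.mean()
    std = v.std()
    centred = v - mean
    skew = np.mean(centred ** 3) / std ** 3
    kurt = np.mean(centred ** 4) / std ** 4 - 3.0  # excess kurtosis
    return {"mean": mean, "std": std, "var": v.var(),
            "min": v.min(), "max": v.max(),
            "median": np.median(v), "skew": skew, "kurt": kurt}
```

    The lo/hi arguments mimic the program's lower and upper threshold levels: values outside the interval are simply excluded before the statistics are computed.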

  15. Multi-criteria analysis for PM10 planning

    NASA Astrophysics Data System (ADS)

    Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa

    To implement sound air quality policies, regulatory agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations, due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); and (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between air quality benefit and internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, which is often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
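
    The non-dominated (Pareto-efficient) scenario set at the heart of such a multi-objective formulation can be computed with a straightforward dominance filter. A minimal sketch for objectives to be minimised (here, hypothetically, an air-quality indicator and an emission-reduction cost), not the authors' solution method:

```python
import numpy as np

def non_dominated(points):
    """Indices of Pareto-efficient rows when all objective columns
    are to be minimised: a point is kept unless some other point is
    at least as good in every objective and strictly better in one."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(pts) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep
```

    Presenting only this set to the decision maker discards scenarios that are worse on both cost and air quality than some alternative.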

  16. A Study to Determine Through Content Analysis Selected Criteria for Open-End Examinations.

    ERIC Educational Resources Information Center

    McNally, Elaine F.

    Content analysis was used to determine the evaluation criteria of high school and college teachers and college seniors in grading essay tests. Content analysis is defined as a way of asking a fixed set of questions unfalteringly of all of a predetermined body of writings, in such a way as to produce quantitative results. Four responses to a…

  17. A comparative analysis of the D-criteria used to determine genetic links of small bodies

    NASA Astrophysics Data System (ADS)

    Sokolova, M. G.; Kondratyeva, E. D.; Nefedyev, Y. A.

    2013-10-01

    In this article, the D-criteria that can be used to determine the genetic relationships of small bodies with their parent bodies in the solar system are evaluated. The Drummond (1981), Southworth and Hawkins (1963), Jopek (1993), and dynamic (Kalinin and Kulikova, 2007; Holshevnikov and Titov, 2007) D-criteria were analysed. It was found that the Drummond criterion is less sensitive to errors of observation and its upper limit does not exceed 0.2. The Southworth-Hawkins and Jopek D-criteria are more stable and have good convergence. Limiting values, which vary in the range of 0.3-0.6 (except for the Lyrids), were determined on the basis of the analysis of six meteor showers for the Southworth-Hawkins and Jopek criteria.

  18. A distance-based uncertainty analysis approach to multi-criteria decision analysis for water resource decision making.

    PubMed

    Hyde, K M; Maier, H R; Colby, C B

    2005-12-01

    The choice among alternative water supply sources is generally based on the fundamental objective of maximising the ratio of benefits to costs. There is, however, a need to consider sustainability, the environment and social implications in regional water resources planning, in addition to economics. In order to achieve this, multi-criteria decision analysis (MCDA) techniques can be used. Various sources of uncertainty exist in the application of MCDA methods, including the selection of the MCDA method, elicitation of criteria weights and assignment of criteria performance values. The focus of this paper is on the uncertainty in the criteria weights. Sensitivity analysis can be used to analyse the effects of uncertainties associated with the criteria weights. Two existing sensitivity methods are described in this paper and a new distance-based approach is proposed which overcomes limitations of these methods. The benefits of the proposed approach are the concurrent alteration of the criteria weights, the applicability of the method to a range of MCDA techniques and the identification of the most critical criteria weights. The existing and proposed methods are applied to three case studies and the results indicate that simultaneous consideration of the uncertainty in the criteria weights should be an integral part of the decision making process.
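    The kind of weight-sensitivity check described above can be sketched with a simple weighted-sum MCDA. Everything concrete below is an illustrative assumption, not taken from the paper: the three water-supply alternatives, the three criteria, the weights, and the single perturbation direction (scaling the weights toward uniform) used as a crude distance-based check.

```python
import numpy as np

def weighted_scores(perf, weights):
    """Weighted-sum MCDA scores: rows are alternatives, columns criteria."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # normalise weights to sum to 1
    return np.asarray(perf, dtype=float) @ w

# Hypothetical performance matrix: 3 alternatives x 3 criteria
# (cost, environment, social), each already scaled to [0, 1].
perf = np.array([[0.9, 0.4, 0.5],
                 [0.6, 0.8, 0.7],
                 [0.5, 0.6, 0.9]])
weights = [0.5, 0.3, 0.2]

scores = weighted_scores(perf, weights)
best = int(np.argmax(scores))             # top-ranked alternative

def min_distance_to_rank_change(perf, weights, steps=2000):
    """Crude distance-based sensitivity check: move all weights
    concurrently toward the uniform weighting and report the smallest
    Euclidean distance at which the top-ranked alternative changes
    (None if it never changes along this direction)."""
    w0 = np.asarray(weights, dtype=float)
    w0 = w0 / w0.sum()
    uniform = np.full_like(w0, 1.0 / len(w0))
    base_best = int(np.argmax(weighted_scores(perf, w0)))
    for t in np.linspace(0.0, 1.0, steps):
        w = (1 - t) * w0 + t * uniform
        if int(np.argmax(weighted_scores(perf, w))) != base_best:
            return float(np.linalg.norm(w - w0))
    return None
```

    For this made-up matrix the ranking is robust along that direction; a fuller implementation would search all perturbation directions at once, which is the point of the distance-based approach.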

  19. Development of a radiopharmaceutical activity schedule for technetium-99m dimercaptosuccinic acid in children based on image quality criteria.

    PubMed

    Smith, T; Gordon, I; Evans, K; Anderson, P J; Lythgoe, M F

    1997-11-01

    The aim of this study was to determine an activity schedule (amount of administered activity in relation to body weight) for technetium-99m dimercaptosuccinic acid examinations in children, from information present in renal scintigraphic images. Scans from 48 children (5 weeks to 14.8 years old) were graded for image quality according to the clarity of both kidney outline and internal structure. Numerical image data (kidney and background counts, signal-to-noise ratio) were associated with these subjective gradings to formulate three criteria, specifying the required values of the above-measured parameters to yield optimum grades of image quality. When applied to derived functions, a kidney uptake of 20% was required to satisfy the criterion based on the signal-to-noise ratio. Using this value with the other two criteria predicts the form of the weight-dependent activity schedule as a function of imaging time. Examples of schedules for imaging times of 300 and 600 s are compared with a schedule based on surface area.

  20. 75 FR 80544 - NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ... COMMISSION NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the..., ``Confirmatory Thermal-Hydraulic Analysis to Support Specific Success Criteria in the Standardized Plant Analysis... . SUPPLEMENTARY INFORMATION: NUREG-1953, ``Confirmatory Thermal-Hydraulic Analysis to Support Specific...

  1. Minimizing impacts of land use change on ecosystem services using multi-criteria heuristic analysis.

    PubMed

    Keller, Arturo A; Fournier, Eric; Fox, Jessica

    2015-06-01

    Development of natural landscapes to support human activities impacts the capacity of the landscape to provide ecosystem services. Typically, several ecosystem services are impacted at a single development site and various footprint scenarios are possible, thus a multi-criteria analysis is needed. Restoration potential should also be considered for the area surrounding the permanent impact site. The primary objective of this research was to develop a heuristic approach to analyze multiple criteria (e.g. impacts to various ecosystem services) in a spatial configuration with many potential development sites. The approach was to: (1) quantify the magnitude of terrestrial ecosystem service (biodiversity, carbon sequestration, nutrient and sediment retention, and pollination) impacts associated with a suite of land use change scenarios using the InVEST model; (2) normalize results across categories of ecosystem services to allow cross-service comparison; (3) apply the multi-criteria heuristic algorithm to select sites with the least impact to ecosystem services, including a spatial criterion (separation between sites). As a case study, the multi-criteria impact minimization algorithm was applied to InVEST output to select 25 potential development sites out of 204 possible locations (selected by other criteria) within a 24,000 ha property. This study advanced a generally applicable spatial multi-criteria approach for 1) considering many land use footprint scenarios, 2) balancing impact decisions across a suite of ecosystem services, and 3) determining the restoration potential of ecosystem services after impacts.
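    Steps (2) and (3) above can be sketched as follows. The impact values, site coordinates, equal criterion weights, and greedy selection rule are all invented for illustration; they stand in for the study's actual InVEST outputs and heuristic algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: impact of each of 20 candidate sites on 5
# ecosystem services, in service-specific units.
impacts = rng.random((20, 5)) * np.array([100.0, 10.0, 1.0, 50.0, 5.0])
coords = rng.random((20, 2)) * 10_000.0   # site locations in metres

# (2) normalise each service column to [0, 1] for cross-service comparison
norm = (impacts - impacts.min(axis=0)) / (impacts.max(axis=0) - impacts.min(axis=0))
total = norm.sum(axis=1)                  # equal-weight aggregate impact

# (3) greedy heuristic: repeatedly take the lowest-impact site that keeps
# a minimum separation from every site already selected
def select_sites(total, coords, n_sites, min_sep):
    chosen = []
    for i in np.argsort(total):
        if all(np.linalg.norm(coords[i] - coords[j]) >= min_sep for j in chosen):
            chosen.append(int(i))
        if len(chosen) == n_sites:
            break
    return chosen

sites = select_sites(total, coords, n_sites=5, min_sep=500.0)
```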

  2. Comparison of the RECIST and PERCIST criteria in solid tumors: a pooled analysis and review

    PubMed Central

    Min, Seon Jeong; Jang, Hyun Joo; Kim, Jung Han

    2016-01-01

    The PET Response Criteria in Solid Tumors (PERCIST) is a new method for the quantitative assessment of metabolic changes in solid tumors. Assessments of tumor response by RECIST and PERCIST have shown considerable differences in several studies. This pooled study was conducted to compare tumor response according to the two criteria in patients with solid tumors. We searched MEDLINE, EMBASE and PubMed for articles with the terms RECIST or PERCIST published between 2009 and January 2016. Six articles comparing the RECIST and PERCIST criteria were found. A total of 268 patients were recruited; 81 with colorectal cancer, 60 with lung cancer, 48 with esophageal cancer, 28 with breast cancer, 14 with basal cell carcinoma, 12 with stomach cancer, 10 with head and neck cancer, and 16 with other rare cancers. The agreement of tumor response between the RECIST and PERCIST criteria was moderate (k = 0.590). Of the 268 patients, 101 (37.7%) showed discordance in tumor response between the two criteria. When adopting the PERCIST criteria, tumor response was upgraded in 85 patients and downgraded in 16. The estimated overall response rates were significantly different between the two criteria (35.1% by RECIST vs. 54.1% by PERCIST, P < 0.0001). In conclusion, this pooled analysis demonstrates that the concordance of tumor response between the RECIST and PERCIST criteria is not excellent. The PERCIST criteria might be more suitable for assessing tumor response than the RECIST criteria. PMID:27036043

  4. Do choosing wisely tools meet criteria for patient decision aids? A descriptive analysis of patient materials

    PubMed Central

    Légaré, France; Hébert, Jessica; Goh, Larissa; Lewis, Krystina B; Leiva Portocarrero, Maria Ester; Robitaille, Hubert; Stacey, Dawn

    2016-01-01

    Objectives Choosing Wisely is a remarkable physician-led campaign to reduce unnecessary or harmful health services. Some of the literature identifies Choosing Wisely as a shared decision-making approach. We evaluated the patient materials developed by Choosing Wisely Canada to determine whether they meet the criteria for shared decision-making tools known as patient decision aids. Design Descriptive analysis of all Choosing Wisely Canada patient materials. Data source In May 2015, we selected all Choosing Wisely Canada patient materials from its official website. Main outcomes and measures Four team members independently extracted characteristics of the English materials using the International Patient Decision Aid Standards (IPDAS) modified 16-item minimum criteria for qualifying and certifying patient decision aids. The research team discussed discrepancies between data extractors and reached a consensus. Descriptive analysis was conducted. Results Of the 24 patient materials assessed, 12 were about treatments, 11 were about screening and 1 was about prevention. The median score for patient materials using IPDAS criteria was 10/16 (range: 8–11) for screening topics and 6/12 (range: 6–9) for prevention and treatment topics. Commonly missed criteria were stating the decision (21/24 did not), providing balanced information on option benefits/harms (24/24 did not), citing evidence (24/24 did not) and updating policy (24/24 did not). Out of 24 patient materials, only 2 met the 6 IPDAS criteria to qualify as patient decision aids, and neither of these 2 met the 6 certifying criteria. Conclusions Patient materials developed by Choosing Wisely Canada do not meet the IPDAS minimal qualifying or certifying criteria for patient decision aids. Modifications to the Choosing Wisely Canada patient materials would help to ensure that they qualify as patient decision aids and thus as more effective shared decision-making tools. PMID:27566638

  5. Multi-criteria decision analysis with probabilistic risk assessment for the management of contaminated ground water

    SciTech Connect

    Khadam, Ibrahim M.; Kaluarachchi, Jagath J

    2003-10-01

    Traditionally, environmental decision analysis in subsurface contamination scenarios is performed using cost-benefit analysis. In this paper, we discuss some of the limitations associated with cost-benefit analysis, especially its definition of risk, its definition of cost of risk, and its poor ability to communicate risk-related information. This paper presents an integrated approach for management of contaminated ground water resources using health risk assessment and economic analysis through a multi-criteria decision analysis framework. The methodology introduces several important concepts and definitions in decision analysis related to subsurface contamination. These are the trade-off between population risk and individual risk, the trade-off between the residual risk and the cost of risk reduction, and cost-effectiveness as a justification for remediation. The proposed decision analysis framework integrates probabilistic health risk assessment into a comprehensive, yet simple, cost-based multi-criteria decision analysis framework. The methodology focuses on developing decision criteria that provide insight into the common questions of the decision-maker that involve a number of remedial alternatives. The paper then explores three potential approaches for alternative ranking, a structured explicit decision analysis, a heuristic approach of importance of the order of criteria, and a fuzzy logic approach based on fuzzy dominance and similarity analysis. Using formal alternative ranking procedures, the methodology seeks to present a structured decision analysis framework that can be applied consistently across many different and complex remediation settings. A simple numerical example is presented to demonstrate the proposed methodology. The results showed the importance of using an integrated approach for decision-making considering both costs and risks. Future work should focus on the application of the methodology to a variety of complex field conditions to

  6. The use of multi-criteria decision analysis to tackle waste management problems: a literature review.

    PubMed

    Achillas, Charisios; Moussiopoulos, Nicolas; Karagiannidis, Avraam; Banias, Georgias; Perkoulidis, George

    2013-02-01

    Problems in waste management have become more and more complex during recent decades. Increasing volumes of waste and growing social and environmental consciousness are prominent drivers pushing environmental managers towards a sustainable waste management scheme. In practice, however, many factors, and often mutually conflicting criteria, influence the search for solutions in real-life applications. This paper presents a review of the literature on multi-criteria decision aiding in waste management problems for all reported waste streams. Despite limitations, which are clearly stated, most of the work published in this field is reviewed. The review aims to provide environmental managers and decision-makers with a thorough list of practical applications of multi-criteria decision analysis techniques used to solve real-life waste management problems, as well as the criteria most often employed in such applications according to the nature of the problem under study. Moreover, the paper explores the advantages and disadvantages of using multi-criteria decision analysis techniques in waste management problems in comparison to other available alternatives.

  7. Target identification by image analysis.

    PubMed

    Fetz, V; Prochnow, H; Brönstrup, M; Sasse, F

    2016-05-01

    Covering: 1997 to the end of 2015. Each biologically active compound induces phenotypic changes in target cells that are characteristic for its mode of action. These phenotypic alterations can be directly observed under the microscope or made visible by labelling structural elements or selected proteins of the cells with dyes. A comparison of the cellular phenotype induced by a compound of interest with the phenotypes of reference compounds with known cellular targets allows its mode of action to be predicted. While this approach has been successfully applied to the characterization of natural products based on a visual inspection of images, recent studies have used automated microscopy and analysis software to increase speed and reduce subjective interpretation. In this review, we give a general outline of the workflow for manual and automated image analysis, and we highlight natural products whose bacterial and eukaryotic targets could be identified through such approaches. PMID:26777141

  8. Uncooled thermal imaging and image analysis

    NASA Astrophysics Data System (ADS)

    Wang, Shiyun; Chang, Benkang; Yu, Chunyu; Zhang, Junju; Sun, Lianjun

    2006-09-01

    A thermal imager converts differences of temperature into differences of electrical signal level, and can therefore be applied in medicine, for example to estimate blood flow speed and vessel location [1] and to assess pain [2]. As uncooled focal plane array (UFPA) technology matures, simple medical functions can be performed with uncooled thermal imagers, for example rapid screening for fever (as during the SARS outbreak). This requires stable imaging performance and sufficiently high spatial and temperature resolution. Among all performance parameters, noise equivalent temperature difference (NETD) is most often used as the criterion of overall performance. The 320 x 240 α-Si microbolometer UFPA is currently in wide use owing to its stable performance and sensitive responsivity. In this paper, the NETD of a UFPA and the relation between NETD and temperature are investigated; several key parameters that affect NETD are listed and a general formula is presented. Finally, images from this kind of thermal imager are analysed with the aim of detecting persons with fever, and a practical thermal image intensification method is introduced.
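    NETD, the figure of merit discussed above, is the blackbody temperature difference whose signal equals the RMS noise of the detector. A minimal sketch of that definition follows; the voltage figures are made-up examples, not measured values for this imager.

```python
def netd(noise_rms_v, response_slope_v_per_k):
    """Noise-equivalent temperature difference (K): the temperature
    difference producing a signal equal to the RMS noise, given the
    measured detector response slope dV/dT against a blackbody."""
    return noise_rms_v / response_slope_v_per_k

# e.g. 50 uV RMS noise with a 1 mV/K response slope -> about 0.05 K NETD
netd_k = netd(50e-6, 1e-3)
```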

  9. Validity of Criteria-Based Content Analysis (CBCA) at Trial in Free-Narrative Interviews

    ERIC Educational Resources Information Center

    Roma, Paolo; San Martini, Pietro; Sabatello, Ugo; Tatarelli, Roberto; Ferracuti, Stefano

    2011-01-01

    Objective: The reliability of child witness testimony in sexual abuse cases is often controversial, and few assessment tools are available. Criteria-Based Content Analysis (CBCA) is a widely used instrument for evaluating psychological credibility in cases of suspected child sexual abuse. Only a few studies have evaluated CBCA scores in children suspected of…

  10. Planning applications in image analysis

    NASA Technical Reports Server (NTRS)

    Boddy, Mark; White, Jim; Goldman, Robert; Short, Nick, Jr.

    1994-01-01

    We describe two interim results from an ongoing effort to automate the acquisition, analysis, archiving, and distribution of satellite earth science data. Both results are applications of Artificial Intelligence planning research to the automatic generation of processing steps for image analysis tasks. First, we have constructed a linear conditional planner (CPed), used to generate conditional processing plans. Second, we have extended an existing hierarchical planning system to make use of durations, resources, and deadlines, thus supporting the automatic generation of processing steps in time and resource-constrained environments.

  11. Image analysis of dye stained patterns in soils

    NASA Astrophysics Data System (ADS)

    Bogner, Christina; Trancón y Widemann, Baltasar; Lange, Holger

    2013-04-01

    Quality of surface water and groundwater is directly affected by flow processes in the unsaturated zone. In general, it is difficult to measure or model water flow; indeed, the parametrization of hydrological models is problematic and often no unique solution exists. To visualise flow patterns in soils directly, dye tracer studies can be performed. These experiments provide images of stained soil profiles, and their evaluation demands knowledge of hydrology as well as of image analysis and statistics. First, the photographs are converted to binary images, classifying the pixels as dye-stained or non-stained. Then, feature extraction is necessary to discern relevant hydrological information. In our study we propose to use several index functions to extract different (ideally complementary) features. We associate each image row with a feature vector (i.e. a certain number of image function values) and use these features to cluster the image rows, identifying similar image areas. Because images of stained profiles might have several reasonable clusterings, we calculate multiple consensus clusterings. An expert can explore these different solutions and base his or her interpretation of the predominant flow mechanisms on quantitative (objective) criteria. The complete workflow, from reading in binary images to final clusterings, has been implemented in the free R system, a language and environment for statistical computing. The calculation of image indices is part of our own package Indigo; manipulation of binary images, clustering and visualization of results are done using either built-in facilities in R, additional R packages or the LaTeX system.
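    The row-feature-and-cluster workflow can be sketched as follows. The synthetic binary image, the two index functions, and the plain k-means step are all illustrative stand-ins for the study's actual index functions and consensus-clustering procedure (which is done in R).

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for a binary image of a stained soil profile:
# stain probability decreases with depth (rows); 1 = dye-stained pixel.
img = (rng.random((60, 80)) < np.linspace(0.8, 0.1, 60)[:, None]).astype(int)

def row_features(img):
    """Two simple index functions per image row: stained fraction and a
    crude fragmentation measure (number of stained runs per pixel)."""
    frac = img.mean(axis=1)
    runs = (np.diff(img, axis=1) == 1).sum(axis=1) + img[:, 0]
    return np.column_stack([frac, runs / img.shape[1]])

def kmeans_rows(feats, k=2, iters=20):
    """Minimal k-means over row feature vectors, grouping rows with
    similar staining into image areas."""
    centroids = feats[np.linspace(0, len(feats) - 1, k).astype(int)].copy()
    for _ in range(iters):
        d = np.linalg.norm(feats[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for c in range(k):
            members = feats[labels == c]
            if len(members):               # keep old centroid if cluster empties
                centroids[c] = members.mean(axis=0)
    return labels

labels = kmeans_rows(row_features(img))
```

    With the top-to-bottom staining gradient built into the synthetic image, the two clusters roughly separate heavily stained upper rows from sparsely stained lower rows.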

  12. The economics of project analysis: Optimal investment criteria and methods of study

    NASA Technical Reports Server (NTRS)

    Scriven, M. C.

    1979-01-01

    Insight is provided toward the development of an optimal program for investment analysis of project proposals offering commercial potential, and of its components. This involves a critique of economic investment criteria viewed in relation to the requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to the analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to the investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.

  13. Automated image analysis of uterine cervical images

    NASA Astrophysics Data System (ADS)

    Li, Wenjing; Gu, Jia; Ferris, Daron; Poirson, Allen

    2007-03-01

    Cervical Cancer is the second most common cancer among women worldwide and the leading cause of cancer mortality of women in developing countries. If detected early and treated adequately, cervical cancer can be virtually prevented. Cervical precursor lesions and invasive cancer exhibit certain morphologic features that can be identified during a visual inspection exam. Digital imaging technologies allow us to assist the physician with a Computer-Aided Diagnosis (CAD) system. In colposcopy, epithelium that turns white after application of acetic acid is called acetowhite epithelium. Acetowhite epithelium is one of the major diagnostic features observed in detecting cancer and pre-cancerous regions. Automatic extraction of acetowhite regions from cervical images has been a challenging task due to specular reflection, various illumination conditions, and most importantly, large intra-patient variation. This paper presents a multi-step acetowhite region detection system to analyze the acetowhite lesions in cervical images automatically. First, the system calibrates the color of the cervical images to be independent of screening devices. Second, the anatomy of the uterine cervix is analyzed in terms of cervix region, external os region, columnar region, and squamous region. Third, the squamous region is further analyzed and subregions based on three levels of acetowhite are identified. The extracted acetowhite regions are accompanied by color scores to indicate the different levels of acetowhite. The system has been evaluated by 40 human subjects' data and demonstrates high correlation with experts' annotations.

  14. Use of stochastic multi-criteria decision analysis to support sustainable management of contaminated sediments.

    PubMed

    Sparrevik, Magnus; Barton, David N; Bates, Mathew E; Linkov, Igor

    2012-02-01

    Sustainable management of contaminated sediments requires careful prioritization of available resources and efforts to optimize decisions that consider environmental, economic, and societal aspects simultaneously. This may be achieved by combining different analytical approaches such as risk analysis (RA), life cycle analysis (LCA), multi-criteria decision analysis (MCDA), and economic valuation methods. We propose the use of stochastic MCDA based on outranking algorithms to implement integrative sustainability strategies for sediment management. In this paper we use the method to select the best sediment management alternatives for the dibenzo-p-dioxin and -furan (PCDD/F) contaminated Grenland fjord in Norway. In the analysis, the benefits of health risk reductions and the socio-economic benefits of removing seafood health advisories are weighed against the detriments of remedial costs and life cycle environmental impacts. A value-plural weighting of criteria is compared to criteria weights mimicking traditional cost-effectiveness (CEA) and cost-benefit (CBA) analyses. Capping highly contaminated areas in the inner or outer fjord is identified as the most preferable remediation alternative under all weighting schemes, and the results are confirmed by a probabilistic sensitivity analysis. The proposed methodology can serve as a flexible framework for future decision support and is a step toward more sustainable decision making for contaminated sediment management. It may be applicable to the broader field of ecosystem restoration for trade-off analysis between ecosystem services and restoration costs.

  15. Evaluation of diagnostic criteria for night eating syndrome using item response theory analysis.

    PubMed

    Allison, Kelly C; Engel, Scott G; Crosby, Ross D; de Zwaan, Martina; O'Reardon, John P; Wonderlich, Stephen A; Mitchell, James E; West, Delia Smith; Wadden, Thomas A; Stunkard, Albert J

    2008-12-01

    Uniform diagnostic criteria for the night eating syndrome (NES), a disorder characterized by a delay in the circadian pattern of eating, have not been established. Proposed criteria for NES were evaluated using item response theory (IRT) analysis. Six studies yielded 1,481 Night Eating Questionnaires which were coded to reflect the presence/absence of five night eating symptoms. Symptoms were evaluated based on the clinical usefulness of their diagnostic information and on the assumptions of IRT analysis (unidimensionality, monotonicity, local item independence, correct model specification), using a two parameter logistic (2PL) IRT model. Reports of (1) nocturnal eating and/or evening hyperphagia, (2) initial insomnia, and (3) night awakenings showed high precision in discriminating those with night eating problems, while morning anorexia and delayed morning meal provided little additional information. IRT is a useful tool for evaluating the diagnostic criteria of psychiatric disorders and can be used to evaluate potential diagnostic criteria of NES empirically. Behavioral factors were identified as useful discriminators of NES. Future work should also examine psychological factors in conjunction with those identified here.
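    The two-parameter logistic (2PL) model used in the analysis above can be written down directly. The item parameters in the test values below are made up for illustration; in the study they would be estimated from the 1,481 questionnaires.

```python
import math

def p_2pl(theta, a, b):
    """2PL IRT model: probability of endorsing a symptom item given
    latent night-eating severity theta, item discrimination a, and
    item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at theta: highly discriminating
    items (large a) concentrate diagnostic precision near b, which is
    why some symptoms discriminate well and others add little."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)
```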

  17. A water quality monitoring network design using fuzzy theory and multiple criteria analysis.

    PubMed

    Chang, Chia-Ling; Lin, You-Tze

    2014-10-01

    A proper water quality monitoring design is required in a watershed, particularly in a water resource protected area. As numerous factors can influence water quality monitoring design, this study applies multiple criteria analysis to evaluate the suitability of the water quality monitoring design in the Taipei Water Resource Domain (TWRD) in northern Taiwan. Seven criteria are selected in the multiple criteria analysis: percentage of farmland area, percentage of built-up area, amount of non-point source pollution, green cover ratio, landslide area ratio, ratio of over-utilization on hillsides, and density of water quality monitoring stations. The criteria are normalized and weighted, and a weighted scoring method is applied to the subbasins; subbasins with higher scores have higher priority for an increased density of water quality stations. Fuzzy theory is utilized to prioritize the need for a higher density of water quality monitoring stations. The results show that the need for more water quality stations in subbasin 2 in the Bei-Shih Creek Basin is much higher than in the other subbasins. Furthermore, the existing water quality station in subbasin 2 requires maintenance. It is recommended that new water quality stations be built in subbasin 2. PMID:24974234
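    The normalize-weight-score step described above can be sketched as follows. The subbasin values, the weights, and the choice of which criteria are inverted are all invented for illustration and do not reproduce the TWRD data.

```python
def normalise(values, invert=False):
    """Min-max normalise one criterion to [0, 1]; invert when lower raw
    values should score higher (e.g. existing station density lowers
    the need for new stations)."""
    lo, hi = min(values), max(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return [1 - s for s in scaled] if invert else scaled

# Hypothetical raw criteria for three subbasins: farmland %, built-up %,
# and existing monitoring-station density (stations per km^2).
farmland = [40.0, 10.0, 25.0]
builtup = [15.0, 5.0, 30.0]
density = [0.1, 0.5, 0.3]
weights = (0.4, 0.3, 0.3)     # illustrative criterion weights

rows = zip(normalise(farmland), normalise(builtup), normalise(density, invert=True))
scores = [sum(w * c for w, c in zip(weights, row)) for row in rows]
priority = max(range(len(scores)), key=scores.__getitem__)  # highest-need subbasin
```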

  18. Automated Microarray Image Analysis Toolbox for MATLAB

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Willse, Alan R.; Protic, Miroslava; Chandler, Darrell P.

    2005-09-01

    The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize the analysis of sets of microarray images. The tool provides several methods of identifying and quantifying spot statistics, as well as extensive diagnostic statistics and images to identify poor data quality or processing. The open nature of the software allows researchers to understand the algorithms used to provide intensity estimates and to modify them easily if desired.

  19. Statistical analysis of biophoton image

    NASA Astrophysics Data System (ADS)

    Wang, Susheng

    1998-08-01

    A photon count imaging system has been developed to obtain ultra-weak bioluminescence images. Photon images of plants, animals and a human hand have been detected. A biophoton image differs from an ordinary image, and in this paper three characteristics of biophoton images are analyzed. On the basis of these characteristics, the detection probability and detection limit of the photon count imaging system, and the detection limit of biophoton images, are discussed. This research provides a scientific basis for experiment design and photon image processing.

  20. Designing Software for Flood Risk Assessment Based on Multi-Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters, not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that, of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul flooded; the insurance sector received around 1,200 claims notices during that period, and insurance companies had to pay a total of $40 million in claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million in claims. To address such problems, modern tools such as GIS and remote sensing should be utilized. In this study, software was designed for flood risk analysis using the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology and land use, all extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from a SPOT 5 satellite image with 2.5 m spatial resolution, and slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by object-oriented nearest neighbor classification with image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi-Criteria Decision Analysis (MCDA) part of the software. The criteria and their sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. Daily flood data were also collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service Curve Number (SCS-CN) method and used as input for the InfoDif part of the software. Obtained results were verified using ground truth data and it has been clearly

  1. Computer analysis of mammography phantom images (CAMPI)

    NASA Astrophysics Data System (ADS)

    Chakraborty, Dev P.

    1997-05-01

    Computer analysis of mammography phantom images (CAMPI) is a method for objective and precise measurements of phantom image quality in mammography. This investigation applied CAMPI methodology to the Fischer Mammotest Stereotactic Digital Biopsy machine. Images of an American College of Radiology phantom centered on the largest two microcalcification groups were obtained on this machine under a variety of x-ray conditions. Analyses of the images revealed that the precise behavior of the CAMPI measures could be understood from basic imaging physics principles. We conclude that CAMPI is sensitive to subtle image quality changes and can perform accurate evaluations of images, especially of directly acquired digital images.

  2. Using multi-criteria decision analysis to assess the vulnerability of drinking water utilities.

    PubMed

    Joerin, Florent; Cool, Geneviève; Rodriguez, Manuel J; Gignac, Marc; Bouchard, Christian

    2010-07-01

    Outbreaks of microbiological waterborne disease have increased governmental concern regarding the importance of drinking water safety. Considering the multi-barrier approach to safe drinking water may improve management decisions to reduce contamination risks. However, the application of this approach must consider numerous and diverse kinds of information simultaneously. This makes it difficult for authorities to apply the approach to decision making. For this reason, multi-criteria decision analysis can be helpful in applying the multi-barrier approach to vulnerability assessment. The goal of this study is to propose an approach based on a multi-criteria analysis method in order to rank drinking water utilities (DWUs) based on their vulnerability to microbiological contamination. This approach is illustrated with an application carried out on 28 DWUs supplied by groundwater in the Province of Québec, Canada. The multi-criteria analysis method chosen is the Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) methodology, which allows the assessment of a microbiological vulnerability indicator (MVI) for each DWU. Results are presented on a scale ranking DWUs from least vulnerable to most vulnerable to contamination. MVI results are tested using a sensitivity analysis on barrier weights, and they are also compared with historical data on contamination at the utilities. The investigation demonstrates that MVI provides a good representation of the vulnerability of DWUs to microbiological contamination.

  3. Overview of quality in cardiovascular imaging and procedures for clinicians: focus on appropriate-use-criteria guidelines.

    PubMed

    Stainback, Raymond F

    2014-01-01

    Cardiovascular imaging and procedures have experienced exponential growth over the past 20 years in terms of new modalities, procedure volume, technological sophistication, and cost. As a result, related quality improvement tools have become multifaceted works in progress. This article briefly summarizes the evolution of the time-honored American College of Cardiology Foundation/American Heart Association clinical practice guidelines versus the newer American College of Cardiology Foundation appropriate-use-criteria guidelines and how these may interact with emerging performance measures, clinical data registries, and cardiovascular laboratory accreditation initiatives.

  4. Exploratory Factor Analysis of Diagnostic and Statistical Manual, 5th Edition, Criteria for Posttraumatic Stress Disorder.

    PubMed

    McSweeney, Lauren B; Koch, Ellen I; Saules, Karen K; Jefferson, Stephen

    2016-01-01

    One change to the posttraumatic stress disorder (PTSD) nomenclature highlighted in the Diagnostic and Statistical Manual, 5th Edition (DSM-5; American Psychiatric Association, 2013) is the conceptualization of PTSD as a diagnostic category with four distinct symptom clusters. This article presents exploratory factor analysis to test the structural validity of the DSM-5 conceptualization of PTSD via an online survey that included the PTSD Checklist-5. The study utilized a sample of 113 college students from a large Midwestern university and 177 Amazon Mechanical Turk users. Participants were primarily female, Caucasian, single, and heterosexual with an average age of 32 years. Approximately 30% to 35% of participants met diagnostic criteria for PTSD based on two different scoring criteria. Results of the exploratory factor analysis revealed five distinct symptom clusters. The implications for the classification of PTSD are discussed.

  5. A watershed-based cumulative risk impact analysis: environmental vulnerability and impact criteria.

    PubMed

    Osowski, S L; Swick, J D; Carney, G R; Pena, H B; Danielson, J E; Parrish, D A

    2001-01-01

    Swine Concentrated Animal Feeding Operations (CAFOs) have received much attention in recent years. As a result, a watershed-based screening tool, the Cumulative Risk Index Analysis (CRIA), was developed to assess the cumulative impacts of multiple CAFO facilities in a watershed subunit. The CRIA formula calculates an index number based on: 1) the area of one or more facilities compared to the area of the watershed subunit, 2) the average of the environmental vulnerability criteria, and 3) the average of the industry-specific impact criteria. Each vulnerability or impact criterion is ranked on a 1 to 5 scale, with a low rank indicating low environmental vulnerability or impact and a high rank indicating high environmental vulnerability or impact. The individual criterion ranks, as well as the total CRIA score, can be used to focus the environmental analysis and facilitate discussions with industry, public, and other stakeholders in the Agency decision-making process. PMID:11214349
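
    The three-part index described above can be sketched as follows; the additive combination rule and all example values are assumptions, since the abstract gives the components but not the exact formula:

```python
def cria_score(facility_area, watershed_area, vulnerability_ranks, impact_ranks):
    """Hypothetical sketch of the Cumulative Risk Index Analysis (CRIA) score.

    Combines (1) the facility-to-watershed area ratio, (2) the mean of the
    1-5 environmental vulnerability ranks, and (3) the mean of the 1-5
    industry-specific impact ranks. The additive combination is an assumption.
    """
    area_term = facility_area / watershed_area
    vuln_term = sum(vulnerability_ranks) / len(vulnerability_ranks)
    impact_term = sum(impact_ranks) / len(impact_ranks)
    return area_term + vuln_term + impact_term

# Illustrative: 2 km^2 of facilities in a 100 km^2 watershed subunit
print(round(cria_score(2.0, 100.0, [3, 4, 2, 5], [4, 4, 3]), 2))
```

    The individual terms, not just the total, are what the abstract suggests using to focus the environmental analysis.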

  6. Quantitative analysis of in vivo confocal microscopy images: a review.

    PubMed

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definition. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  7. How strict should specimen acceptance or rejection criteria be for diagnostic semen analysis? An opinion.

    PubMed

    Woodward, Bryan J; Tomlinson, Mathew J

    2015-06-01

    Medical laboratory accreditation (previously by Clinical Pathology Accreditation UK Ltd and now by the United Kingdom Accreditation Service) has been integral to improving standards and service quality in the UK. With the recent introduction of the ISO15189 standard, all laboratories offering a clinical diagnostic service are required to demonstrate further improvement, with more emphasis on validation and assessment of the uncertainty levels associated with testing. This applies not only to 'bench testing', but also to the evidence base for all pre-analytical and post-analytical procedures. To reduce the risk of external influences on andrology test results, semen sample rejection criteria were developed, including confirmation of patient identity, a strict time limit from sample production to testing, the use of toxicity-tested containers, a prescribed period of sexual abstinence and a need for complete sample collection. However, such criteria were originally developed by the World Health Organization in order to standardise analysis rather than reject testing outright, and should therefore be implemented with caution. Rejecting samples with normal semen parameters because they fail to meet some of the criteria as outlined above would be a waste of resources and adversely affect user (the person who requested or provided the sample) satisfaction. This document evaluates the evidence base underlying commonly used criteria for specimen rejection and suggests how they may be applied more pragmatically in order to improve efficiency and reduce the waste of resources.

  8. Enclosure fire hazard analysis using relative energy release criteria. [burning rate and combustion control

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1978-01-01

    A method for predicting the probable course of fire development in an enclosure is presented. This fire modeling approach uses a graphic plot of five fire development constraints, the relative energy release criteria (RERC), to bound the heat release rates in an enclosure as a function of time. The five RERC are flame spread rate, fuel surface area, ventilation, enclosure volume, and total fuel load. They may be calculated versus time based on the specified or empirical conditions describing the specific enclosure, the fuel type and load, and the ventilation. The calculation of these five criteria, using the common basis of energy release rates versus time, provides a unifying framework for the utilization of available experimental data from all phases of fire development. The plot of these criteria reveals the probable fire development envelope and indicates which fire constraint will be controlling during a criteria time period. Examples of RERC application to fire characterization and control and to hazard analysis are presented along with recommendations for the further development of the concept.
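
    The envelope construction can be sketched as taking, at each time, the minimum of the five constraint curves; the placeholder curves below are illustrative shapes only, not experimental data:

```python
# Sketch of the RERC envelope: at each time step the controlling constraint
# is the one with the lowest allowable energy release rate. The five curves
# below are illustrative placeholders, not measured fire data.
times = [t * 0.5 for t in range(10)]  # minutes

def flame_spread(t):  return 50.0 * t * t               # growth-limited early on
def fuel_surface(t):  return 400.0                      # fixed exposed fuel area
def ventilation(t):   return 300.0                      # opening-limited burning
def volume(t):        return 600.0                      # enclosure volume limit
def fuel_load(t):     return max(0.0, 900.0 - 60.0 * t) # fuel depletion

constraints = [flame_spread, fuel_surface, ventilation, volume, fuel_load]

envelope = [min(c(t) for c in constraints) for t in times]
controlling = [min(constraints, key=lambda c: c(t)).__name__ for t in times]

for t, q, name in zip(times, envelope, controlling):
    print(f"t={t:4.1f} min  q={q:6.1f} kW  controlling={name}")
```

    The printed table mirrors the graphical plot the abstract describes: early fire growth is flame-spread-limited, after which another constraint (here ventilation) takes over.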

  10. Principles and clinical applications of image analysis.

    PubMed

    Kisner, H J

    1988-12-01

    Image processing has traveled to the lunar surface and back, finding its way into the clinical laboratory. Advances in digital computers have improved the technology of image analysis, resulting in a wide variety of medical applications. Offering improvements in turnaround time, standardized systems, increased precision, and walkaway automation, digital image analysis has likely found a permanent home as a diagnostic aid in the interpretation of microscopic as well as macroscopic laboratory images.

  11. Use of multi-criteria decision analysis in regulatory alternatives analysis: a case study of lead free solder.

    PubMed

    Malloy, Timothy F; Sinsheimer, Peter J; Blake, Ann; Linkov, Igor

    2013-10-01

    Regulators are implementing new programs that require manufacturers of products containing certain chemicals of concern to identify, evaluate, and adopt viable, safer alternatives. Such programs raise the difficult question for policymakers and regulated businesses of which alternatives are "viable" and "safer." To address that question, these programs use "alternatives analysis," an emerging methodology that integrates issues of human health and environmental effects with technical feasibility and economic impact. Despite the central role that alternatives analysis plays in these programs, the methodology itself is neither well-developed nor tailored to application in regulatory settings. This study uses the case of Pb-based bar solder and its non-Pb-based alternatives to examine the application of 2 multi-criteria decision analysis (MCDA) methods to alternatives analysis: multi-attribute utility analysis and outranking. The article develops and evaluates an alternatives analysis methodology and supporting decision-analysis software for use in a regulatory context, using weighting of the relevant decision criteria generated from a stakeholder elicitation process. The analysis produced complete rankings of the alternatives, including identification of the relative contribution to the ranking of each of the highest level decision criteria such as human health impacts, technical feasibility, and economic feasibility. It also examined the effect of variation in data conventions, weighting, and decision frameworks on the outcome. The results indicate that MCDA can play a critical role in emerging prevention-based regulatory programs. Multi-criteria decision analysis methods offer a means for transparent, objective, and rigorous analysis of products and processes, providing regulators and stakeholders with a common baseline understanding of the relative performance of alternatives and the trade-offs they present.
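
    The multi-attribute utility step can be sketched as a weighted sum over normalized criterion scores; the alternatives, scores, and weights below are hypothetical illustrations, not the study's elicited values:

```python
# Minimal multi-attribute utility sketch: criterion scores normalized to
# [0, 1] (1 = best) are combined with stakeholder-elicited weights.
# All scores and weights here are hypothetical.
weights = {"health": 0.5, "technical": 0.3, "economic": 0.2}

alternatives = {
    "Pb-based solder": {"health": 0.2, "technical": 0.9, "economic": 0.8},
    "SAC alloy":       {"health": 0.7, "technical": 0.8, "economic": 0.5},
    "Sn-Cu alloy":     {"health": 0.7, "technical": 0.6, "economic": 0.7},
}

def utility(scores):
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(alternatives, key=lambda a: utility(alternatives[a]), reverse=True)
for name in ranking:
    print(f"{name}: {utility(alternatives[name]):.2f}")
```

    A linear additive utility assumes constant marginal rates of substitution between criteria; outranking methods, the other family examined in the study, relax that assumption.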

  13. IMAGE ANALYSIS ALGORITHMS FOR DUAL MODE IMAGING SYSTEMS

    SciTech Connect

    Robinson, Sean M.; Jarman, Kenneth D.; Miller, Erin A.; Misner, Alex C.; Myjak, Mitchell J.; Pitts, W. Karl; Seifert, Allen; Seifert, Carolyn E.; Woodring, Mitchell L.

    2010-06-11

    The level of detail discernable in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes where information barriers are mandatory. However, if a balance can be struck between sufficient information barriers and feature extraction to verify or identify objects of interest, imaging may significantly advance verification efforts. This paper describes the development of combined active (conventional) radiography and passive (auto) radiography techniques for imaging sensitive items assuming that comparison images cannot be furnished. Three image analysis algorithms are presented, each of which reduces full image information to non-sensitive feature information and ultimately is intended to provide only a yes/no response verifying features present in the image. These algorithms are evaluated on both their technical performance in image analysis and their application with or without an explicitly constructed information barrier. The first algorithm reduces images to non-invertible pixel intensity histograms, retaining only summary information about the image that can be used in template comparisons. This one-way transform is sufficient to discriminate between different image structures (in terms of area and density) without revealing unnecessary specificity. The second algorithm estimates the attenuation cross-section of objects of known shape based on transition characteristics around the edge of the object’s image. The third algorithm compares the radiography image with the passive image to discriminate dense, radioactive material from point sources or inactive dense material. By comparing two images and reporting only a single statistic from the combination thereof, this algorithm can operate entirely behind an information barrier stage. Together with knowledge of the radiography system, the use of these algorithms in combination can be used to improve verification capability to inspection regimes and improve
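
    The first algorithm's one-way reduction can be sketched as follows; the bin count, distance measure, and acceptance threshold are illustrative assumptions, not the published parameters:

```python
# Sketch of the first algorithm: reduce an image to a pixel-intensity
# histogram (spatial layout is discarded, so the transform is not
# invertible) and compare it to a stored template histogram.
def intensity_histogram(image, bins=16, max_val=255):
    counts = [0] * bins
    for row in image:
        for px in row:
            counts[min(px * bins // (max_val + 1), bins - 1)] += 1
    total = sum(counts)
    return [c / total for c in counts]  # normalize to a distribution

def matches_template(image, template_hist, threshold=0.1):
    """Yes/no verification: L1 distance between histograms under threshold."""
    hist = intensity_histogram(image)
    l1 = sum(abs(a - b) for a, b in zip(hist, template_hist))
    return l1 < threshold

reference = [[10, 200], [10, 200]]   # toy 2x2 "template" image
candidate = [[200, 10], [200, 10]]   # same intensities, spatially rearranged
template_hist = intensity_histogram(reference)
print(matches_template(candidate, template_hist))
```

    Because the histogram retains only area/density summary information, a spatial rearrangement of the same intensities is indistinguishable from the template, which is exactly the information-barrier property the abstract describes.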

  14. Adaptation and Evaluation of a Multi-Criteria Decision Analysis Model for Lyme Disease Prevention.

    PubMed

    Aenishaenslin, Cécile; Gern, Lise; Michel, Pascal; Ravel, André; Hongoh, Valérie; Waaub, Jean-Philippe; Milord, François; Bélanger, Denise

    2015-01-01

    Designing preventive programs relevant to vector-borne diseases such as Lyme disease (LD) can be complex given the need to include multiple issues and perspectives into prioritizing public health actions. A multi-criteria decision aid (MCDA) model was previously used to rank interventions for LD prevention in Quebec, Canada, where the disease is emerging. The aim of the current study was to adapt and evaluate the decision model constructed in Quebec under a different epidemiological context, in Switzerland, where LD has been endemic for the last thirty years. The model adaptation was undertaken with a group of Swiss stakeholders using a participatory approach. The PROMETHEE method was used for multi-criteria analysis. Key elements and results of the MCDA model are described and contrasted with the Quebec model. All criteria and most interventions of the MCDA model developed for LD prevention in Quebec were directly transferable to the Swiss context. Four new decision criteria were added, and the list of proposed interventions was modified. Based on the overall group ranking, interventions targeting human populations were prioritized in the Swiss model, with the top ranked action being the implementation of a large communication campaign. The addition of criteria did not significantly alter the intervention rankings, but increased the capacity of the model to discriminate between highest and lowest ranked interventions. The current study suggests that beyond the specificity of the MCDA models developed for Quebec and Switzerland, their general structure captures the fundamental and common issues that characterize the complexity of vector-borne disease prevention. These results should encourage public health organizations to adapt, use and share MCDA models as an effective and functional approach to enable the integration of multiple perspectives and considerations in the prevention and control of complex public health issues such as Lyme disease or other vector-borne diseases.
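
    A minimal PROMETHEE II sketch, using the "usual" preference function, shows how pairwise preferences aggregate into the net outranking flows behind a group ranking; the interventions, criterion scores, and weights here are hypothetical, not the study's elicited values:

```python
# Minimal PROMETHEE II sketch with the "usual" preference function
# (P(d) = 1 if d > 0 else 0). Interventions, criterion scores (higher is
# better), and weights are hypothetical illustrations.
weights = [0.4, 0.3, 0.3]  # e.g. effectiveness, acceptability, cost

scores = {
    "communication campaign": [0.8, 0.9, 0.7],
    "tick habitat control":   [0.6, 0.5, 0.4],
    "vaccination program":    [0.9, 0.4, 0.3],
}

def preference(a, b):
    """Aggregated preference of alternative a over b."""
    return sum(w for w, fa, fb in zip(weights, scores[a], scores[b]) if fa > fb)

names = list(scores)
n = len(names)
net_flow = {  # positive flow minus negative flow, averaged over rivals
    a: sum(preference(a, b) - preference(b, a) for b in names if b != a) / (n - 1)
    for a in names
}
for name in sorted(net_flow, key=net_flow.get, reverse=True):
    print(f"{name}: {net_flow[name]:+.2f}")
```

    Ranking alternatives by net flow gives the complete PROMETHEE II order; adding criteria, as the Swiss adaptation did, changes how strongly the flows separate the top and bottom of that order.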

  15. Latent Class Analysis of DSM-5 Alcohol Use Disorder Criteria Among Heavy-Drinking College Students.

    PubMed

    Rinker, Dipali Venkataraman; Neighbors, Clayton

    2015-10-01

    The DSM-5 has created significant changes in the definition of alcohol use disorders (AUDs). Limited work has considered the impact of these changes in specific populations, such as heavy-drinking college students. Latent class analysis (LCA) is a person-centered approach that divides a population into mutually exclusive and exhaustive latent classes, based on observable indicator variables. The present research was designed to examine whether there were distinct classes of heavy-drinking college students who met DSM-5 criteria for an AUD and whether gender, perceived social norms, use of protective behavioral strategies (PBS), drinking refusal self-efficacy (DRSE), self-perceptions of drinking identity, psychological distress, and membership in a fraternity/sorority would be associated with class membership. Three-hundred and ninety-four college students who met DSM-5 criteria for an AUD were recruited from three different universities. Two distinct classes emerged: Less Severe (86%), the majority of whom endorsed both drinking more than intended and tolerance, as well as met criteria for a mild AUD; and More Severe (14%), the majority of whom endorsed at least half of the DSM-5 AUD criteria and met criteria for a severe AUD. Relative to the Less Severe class, membership in the More Severe class was negatively associated with DRSE and positively associated with self-identification as a drinker. There is a distinct class of heavy-drinking college students with a more severe AUD and for whom intervention content needs to be more focused and tailored. Clinical implications are discussed.
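
    The latent class model behind such an analysis can be sketched as a small EM fit of a two-class model for binary criterion indicators; the synthetic data, class count, and endorsement probabilities below are illustrative assumptions, not the study's data:

```python
import numpy as np

# Two-class latent class model for binary AUD criterion indicators, fit by
# EM. Data are synthetic; real analyses use dedicated LCA software and
# compare fit across different numbers of classes.
rng = np.random.default_rng(0)
N_ITEMS = 11  # DSM-5 lists 11 AUD criteria
X = np.vstack([                              # 1 = criterion endorsed
    rng.random((340, N_ITEMS)) < 0.25,       # "less severe" majority
    rng.random((60, N_ITEMS)) < 0.80,        # "more severe" minority
]).astype(float)

K = 2
pi = np.full(K, 1.0 / K)                          # class proportions
theta = rng.uniform(0.3, 0.7, size=(K, N_ITEMS))  # P(endorse item | class)

for _ in range(200):  # EM iterations
    # E-step: responsibility of each class for each respondent
    like = pi * np.prod(theta ** X[:, None, :] *
                        (1 - theta) ** (1 - X[:, None, :]), axis=2)
    resp = like / like.sum(axis=1, keepdims=True)
    # M-step: update class proportions and conditional endorsement rates
    pi = resp.mean(axis=0)
    theta = resp.T @ X / resp.sum(axis=0)[:, None]

severe = int(np.argmax(theta.mean(axis=1)))  # class with higher endorsement
print("class proportions:", np.round(pi, 2))
print("mean endorsement by class:", np.round(theta.mean(axis=1), 2))
```

    The class with uniformly higher endorsement probabilities plays the role of the "More Severe" class in the study; covariates such as drinking refusal self-efficacy would then be related to class membership in a second step.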

  17. Multiple criteria analysis of remotely piloted aircraft systems for monitoring the crops vegetation status

    NASA Astrophysics Data System (ADS)

    Cristea, L.; Luculescu, M. C.; Zamfira, S. C.; Boer, A. L.; Pop, S.

    2016-08-01

    The paper presents an analysis of Remotely Piloted Aircraft Systems (RPAS) used for monitoring the crops vegetation status. The study focuses on two types of RPAS, namely the flying wing and the multi-copter. The following criteria were taken into account: technical characteristics, power consumption, flight autonomy, flight conditions, costs, data acquisition systems used for monitoring, crops area and so on. Based on this analysis, advantages and disadvantages are emphasized offering a useful tool for choosing the proper solution according to the specific application conditions.

  18. Automated Dermoscopy Image Analysis of Pigmented Skin Lesions

    PubMed Central

    Baldi, Alfonso; Quartulli, Marco; Murace, Raffaele; Dragonetti, Emanuele; Manganaro, Mario; Guerra, Oscar; Bizzi, Stefano

    2010-01-01

    Dermoscopy (dermatoscopy, epiluminescence microscopy) is a non-invasive diagnostic technique for the in vivo observation of pigmented skin lesions (PSLs), allowing a better visualization of surface and subsurface structures (from the epidermis to the papillary dermis). This diagnostic tool permits the recognition of morphologic structures not visible by the naked eye, thus opening a new dimension in the analysis of the clinical morphologic features of PSLs. In order to reduce the learning-curve of non-expert clinicians and to mitigate problems inherent in the reliability and reproducibility of the diagnostic criteria used in pattern analysis, several indicative methods based on diagnostic algorithms have been introduced in the last few years. Recently, numerous systems designed to provide computer-aided analysis of digital images obtained by dermoscopy have been reported in the literature. The goal of this article is to review these systems, focusing on the most recent approaches based on content-based image retrieval systems (CBIR). PMID:24281070

  19. Micro-CT imaging: Developing criteria for examining fetal skeletons in regulatory developmental toxicology studies - A workshop report.

    PubMed

    Solomon, Howard M; Makris, Susan L; Alsaid, Hasan; Bermudez, Oscar; Beyer, Bruce K; Chen, Antong; Chen, Connie L; Chen, Zhou; Chmielewski, Gary; DeLise, Anthony M; de Schaepdrijver, Luc; Dogdas, Belma; French, Julian; Harrouk, Wafa; Helfgott, Jonathan; Henkelman, R Mark; Hesterman, Jacob; Hew, Kok-Wah; Hoberman, Alan; Lo, Cecilia W; McDougal, Andrew; Minck, Daniel R; Scott, Lelia; Stewart, Jane; Sutherland, Vicki; Tatiparthi, Arun K; Winkelmann, Christopher T; Wise, L David; Wood, Sandra L; Ying, Xiaoyou

    2016-06-01

    During the past two decades the use and refinements of imaging modalities have markedly increased making it possible to image embryos and fetuses used in pivotal nonclinical studies submitted to regulatory agencies. Implementing these technologies into the Good Laboratory Practice environment requires rigorous testing, validation, and documentation to ensure the reproducibility of data. A workshop on current practices and regulatory requirements was held with the goal of defining minimal criteria for the proper implementation of these technologies and subsequent submission to regulatory agencies. Micro-computed tomography (micro-CT) is especially well suited for high-throughput evaluations, and is gaining popularity to evaluate fetal skeletons to assess the potential developmental toxicity of test agents. This workshop was convened to help scientists in the developmental toxicology field understand and apply micro-CT technology to nonclinical toxicology studies and facilitate the regulatory acceptance of imaging data. Presentations and workshop discussions covered: (1) principles of micro-CT fetal imaging; (2) concordance of findings with conventional skeletal evaluations; and (3) regulatory requirements for validating the system. Establishing these requirements for micro-CT examination can provide a path forward for laboratories considering implementing this technology and provide regulatory agencies with a basis to consider the acceptability of data generated via this technology. PMID:26930635

  20. [THE COMPARATIVE ANALYSIS OF INFORMATION VALUE OF MAIN CLINICAL CRITERIA USED TO DIAGNOSE BACTERIAL VAGINOSIS].

    PubMed

    Tsvetkova, A V; Murtazina, Z A; Markusheva, T V; Mavzutov, A R

    2015-05-01

    Bacterial vaginosis is one of the most frequent reasons women visit a gynecologist. Its diagnosis is predominantly based on the Amsel criteria (1983), whose objectivity is now disputed increasingly often. Discharge from the mucous membranes of the posterolateral fornix of the vagina was analysed in 640 women with a clinical diagnosis of bacterial vaginosis. Light microscopy of mounts of the discharge provided laboratory confirmation of the diagnosis in 100 (15.63%) women. Complaints of burning and unpleasant odor, and the Amsel criterion of detecting "clue cells" against a background of pH > 4.5, were established as statistically significant for bacterial vaginosis. According to the study data, the presence of discharge alone was not statistically reliable for differentiating bacterial vaginosis from other inflammatory pathological conditions of the female reproductive tract. At the same time, the detection of "clue cells" in mounts correlated reliably with bacterial vaginosis.

  1. Exploratory factor analysis of borderline personality disorder criteria in monolingual Hispanic outpatients with substance use disorders†

    PubMed Central

    Becker, Daniel F.; Añez, Luis Miguel; Paris, Manuel; Grilo, Carlos M.

    2009-01-01

    This study examined the factor structure of the DSM-IV criteria for borderline personality disorder (BPD) in Hispanic patients. Subjects were 130 monolingual Hispanic adults who had been admitted to a specialty outpatient clinic that provides psychiatric and substance abuse services to Spanish-speaking individuals. All were reliably assessed with the Spanish-Language Version of the Diagnostic Interview for DSM-IV Personality Disorders. After evaluating internal consistency of the BPD criterion set, an exploratory factor analysis was performed using principal axis factoring. Results suggested a unidimensional structure, and were consistent with similar studies of the DSM-IV criteria for BPD in non-Hispanic samples. These findings have implications for understanding borderline psychopathology in this population, and for the overall validity of the DSM-IV BPD construct. PMID:20472296

  2. A multi-criteria decision analysis assessment of waste paper management options.

    PubMed

    Hanan, Deirdre; Burnley, Stephen; Cooke, David

    2013-03-01

    The use of Multi-criteria Decision Analysis (MCDA) was investigated in an exercise using a panel of local residents and stakeholders to assess the options for managing waste paper on the Isle of Wight. Seven recycling, recovery and disposal options were considered by the panel who evaluated each option against seven environmental, financial and social criteria. The panel preferred options where the waste was managed on the island with gasification and recycling achieving the highest scores. Exporting the waste to the English mainland for incineration or landfill proved to be the least preferred options. This research has demonstrated that MCDA is an effective way of involving community groups in waste management decision making.
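A weighted-sum aggregation is the simplest form of the kind of MCDA scoring described above. The sketch below is a minimal illustration with invented option names, criterion weights, and scores (the study used seven options and seven criteria with panel-elicited values):

```python
import numpy as np

# Hypothetical options and criteria; names, weights, and scores are
# illustrative only, not the study's panel-elicited values.
options = ["recycling", "gasification", "incineration", "landfill"]
# rows = options; columns = environmental, financial, social criteria
scores = np.array([
    [0.9, 0.6, 0.8],
    [0.8, 0.7, 0.7],
    [0.5, 0.8, 0.4],
    [0.2, 0.9, 0.3],
])
weights = np.array([0.5, 0.3, 0.2])  # criterion weights, summing to 1

totals = scores @ weights                       # weighted-sum aggregation
ranking = [options[i] for i in np.argsort(totals)[::-1]]
```

With these illustrative numbers, the waste-minimising options come out on top and landfill last, mirroring the panel's ordering in the study.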

  3. A multi-criteria decision analysis perspective on the health economic evaluation of medical interventions.

    PubMed

    Postmus, Douwe; Tervonen, Tommi; van Valkenhoef, Gert; Hillege, Hans L; Buskens, Erik

    2014-09-01

    A standard practice in health economic evaluation is to monetize health effects by assuming a certain societal willingness-to-pay per unit of health gain. Although the resulting net monetary benefit (NMB) is easy to compute, the use of a single willingness-to-pay threshold assumes expressibility of the health effects on a single non-monetary scale. To relax this assumption, this article proves that the NMB framework is a special case of the more general stochastic multi-criteria acceptability analysis (SMAA) method. Specifically, as SMAA does not restrict the number of criteria to two and also does not require the marginal rates of substitution to be constant, there are problem instances for which the use of this more general method may result in a better understanding of the trade-offs underlying the reimbursement decision-making problem. This is illustrated by applying both methods in a case study related to infertility treatment.
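The relationship between NMB and a sampled-threshold analysis can be sketched in a few lines. All numbers below are illustrative; the second step simply replaces the single willingness-to-pay threshold with a sampled range, in the spirit (not the full machinery) of the SMAA method the paper describes:

```python
import numpy as np

# Illustrative numbers only.
delta_qaly = 0.35    # incremental health effect (QALYs gained)
delta_cost = 5000.0  # incremental cost

# Classical net monetary benefit with a single willingness-to-pay threshold:
nmb = 20000.0 * delta_qaly - delta_cost

# SMAA-flavoured relaxation: sample the willingness-to-pay from a plausible
# range and report how often the intervention has positive net benefit.
rng = np.random.default_rng(0)
wtp_samples = rng.uniform(5000.0, 50000.0, size=100_000)
acceptability = float(np.mean(wtp_samples * delta_qaly - delta_cost > 0.0))
```

Reporting an acceptability fraction rather than a single NMB value makes explicit how sensitive the reimbursement decision is to the assumed threshold.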

  4. Assessment of patient selection criteria for quantitative imaging with respiratory-gated positron emission tomography.

    PubMed

    Bowen, Stephen R; Pierce, Larry A; Alessio, Adam M; Liu, Chi; Wollenweber, Scott D; Stearns, Charles W; Kinahan, Paul E

    2014-07-01

    The objective of this investigation was to propose techniques for determining which patients are likely to benefit from quantitative respiratory-gated imaging by correlating respiratory patterns to changes in positron emission tomography (PET) metrics. Twenty-six lung and liver cancer patients underwent PET/computed tomography exams with recorded chest/abdominal displacements. Static and adaptive amplitude-gated [Formula: see text] …

  5. Microscopy image segmentation tool: Robust image data analysis

    SciTech Connect

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  6. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
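The core of such grain statistics is thresholding followed by connected-component labeling and per-grain measurement. The sketch below is a minimal numpy-only stand-in for what the Photoshop/ImageJ toolchains described above provide; the threshold and image are synthetic:

```python
import numpy as np
from collections import deque

def grain_areas(gray, threshold):
    """Threshold a grayscale image and return the pixel area of each
    4-connected bright region ("grain") via breadth-first labeling."""
    mask = gray >= threshold
    labels = np.zeros(mask.shape, dtype=int)
    areas, next_label = [], 0
    for i, j in zip(*np.nonzero(mask)):
        if labels[i, j]:
            continue
        next_label += 1
        labels[i, j] = next_label
        area, queue = 0, deque([(i, j)])
        while queue:
            y, x = queue.popleft()
            area += 1
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    queue.append((ny, nx))
        areas.append(area)
    return areas

# Two separated bright "grains" on a dark background.
img = np.zeros((10, 10))
img[1:4, 1:4] = 255.0   # 9-pixel grain
img[6:9, 6:8] = 255.0   # 6-pixel grain
```

From the per-grain areas one can then derive the size and shape statistics the handbook describes.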

  7. Image registration with uncertainty analysis

    DOEpatents

    Simonson, Katherine M.

    2011-03-22

    In an image registration method, edges are detected in a first image and a second image. A percentage of edge pixels in a subset of the second image that are also edges in the first image shifted by a translation is calculated. A best registration point is calculated based on a maximum percentage of edges matched. In a predefined search region, all registration points other than the best registration point are identified that are not significantly worse than the best registration point according to a predetermined statistical criterion.
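The matching step in the patent abstract can be sketched as an exhaustive search over translations that maximizes the fraction of matched edge pixels. This is a minimal illustration; the statistical test for registration points that are "not significantly worse" than the best is omitted:

```python
import numpy as np

def edge_match_fraction(edges_a, edges_b, dy, dx):
    """Fraction of edge pixels in edges_b that are also edge pixels of
    edges_a after shifting edges_a by (dy, dx). np.roll wraps around,
    which is harmless for small shifts of interior features."""
    shifted = np.roll(edges_a, (dy, dx), axis=(0, 1))
    b = edges_b.astype(bool)
    return float((shifted.astype(bool) & b).sum()) / max(int(b.sum()), 1)

def best_registration(edges_a, edges_b, search=2):
    """Exhaustive search over a (2*search+1)^2 grid of translations for
    the translation with the maximum edge-match fraction."""
    best, best_score = (0, 0), -1.0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            s = edge_match_fraction(edges_a, edges_b, dy, dx)
            if s > best_score:
                best, best_score = (dy, dx), s
    return best, best_score

# Toy example: the second edge map is the first shifted down by one pixel.
a = np.zeros((8, 8), dtype=bool)
a[2:5, 3] = True
b = np.roll(a, (1, 0), axis=(0, 1))
shift, score = best_registration(a, b)
```

Here the search recovers the known one-pixel shift with a perfect match score.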

  8. Regulatory analysis on criteria for the release of patients administered radioactive material. Final report

    SciTech Connect

    Schneider, S.; McGuire, S.A.

    1997-02-01

    This regulatory analysis was developed to respond to three petitions for rulemaking to amend 10 CFR parts 20 and 35 regarding release of patients administered radioactive material. The petitions requested revision of these regulations to remove the ambiguity that existed between the 1-millisievert (0.1-rem) total effective dose equivalent (TEDE) public dose limit in Part 20, adopted in 1991, and the activity-based release limit in 10 CFR 35.75 that, in some instances, would permit release of individuals in excess of the current public dose limit. Three alternatives for resolution of the petitions were evaluated. Under Alternative 1, NRC would amend its patient release criteria in 10 CFR 35.75 to match the annual public dose limit in Part 20 of 1 millisievert (0.1 rem) TEDE. Alternative 2 would maintain the status quo of using the activity-based release criteria currently found in 10 CFR 35.75. Under Alternative 3, the NRC would revise the release criteria in 10 CFR 35.75 to specify a dose limit of 5 millisieverts (0.5 rem) TEDE.

  9. Millimeter-wave sensor image analysis

    NASA Technical Reports Server (NTRS)

    Wilson, William J.; Suess, Helmut

    1989-01-01

    Images from an airborne scanning radiometer operating at a frequency of 98 GHz have been analyzed. The mm-wave images were obtained in 1985/1986 using the JPL mm-wave imaging sensor. The goal of this study was to enhance the information content of these images and make their interpretation easier for human analysts. A visual interpretative approach was used for information extraction from the images, including the application of nonlinear transform techniques for noise reduction and for color, contrast and edge enhancement. Results of applying these techniques to selected mm-wave images are presented.
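A power-law (gamma) mapping is one common nonlinear transform for the kind of contrast enhancement mentioned above. The sketch below is illustrative only and is not the specific transform used in the study:

```python
import numpy as np

def gamma_enhance(img, gamma):
    """Power-law (gamma) transform: normalise to [0, 1], apply x**gamma,
    rescale to 8-bit. gamma < 1 brightens mid-tones, gamma > 1 darkens."""
    x = (img - img.min()) / max(np.ptp(img), 1e-12)
    return np.uint8(np.round(255.0 * x ** gamma))

img = np.array([[0.0, 64.0, 128.0, 255.0]])
out = gamma_enhance(img, 0.5)   # mid-tones pushed towards white
```

Extremes stay fixed at 0 and 255 while mid-tones are lifted, which is what makes the transform useful for bringing out faint structure.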

  10. Regulatory analysis on criteria for the release of patients administered radioactive material

    SciTech Connect

    Schneider, S.; McGuire, S.A.; Behling, U.H.; Behling, K.; Goldin, D.

    1994-05-01

    The Nuclear Regulatory Commission (NRC) has received two petitions to amend its regulations in 10 CFR Parts 20 and 35 as they apply to doses received by members of the public exposed to patients released from a hospital after they have been administered radioactive material. While the two petitions are not identical they both request that the NRC establish a dose limit of 5 millisieverts (0.5 rem) per year for individuals exposed to patients who have been administered radioactive materials. This Regulatory Analysis evaluates three alternatives. Alternative 1 is for the NRC to amend its patient release criteria in 10 CFR 35.75 to use the more stringent dose limit of 1 millisievert per year in 10 CFR 20.1301(a) for its patient release criteria. Alternative 2 is for the NRC to continue using the existing patient release criteria in 10 CFR 35.75 of 1,110 megabecquerels of activity or a dose rate at one meter from the patient of 0.05 millisievert per hour. Alternative 3 is for the NRC to amend the patient release criteria in 10 CFR 35.75 to specify a dose limit of 5 millisieverts for patient release. The evaluation indicates that Alternative 1 would cause a prohibitively large increase in the national health care cost from retaining patients in a hospital longer and would cause significant personal and psychological costs to patients and their families. The choice of Alternatives 2 or 3 would affect only thyroid cancer patients treated with iodine-131. For those patients, Alternative 3 would result in less hospitalization than Alternative 2. Alternative 3 has a potential decrease in national health care cost of $30,000,000 per year but would increase the potential collective dose from released therapy patients by about 2,700 person-rem per year, mainly to family members.

  11. Imaging-based enrichment criteria using deep learning algorithms for efficient clinical trials in mild cognitive impairment.

    PubMed

    Ithapu, Vamsi K; Singh, Vikas; Okonkwo, Ozioma C; Chappell, Richard J; Dowling, N Maritza; Johnson, Sterling C

    2015-12-01

    The mild cognitive impairment (MCI) stage of Alzheimer's disease (AD) may be optimal for clinical trials to test potential treatments for preventing or delaying decline to dementia. However, MCI is heterogeneous in that not all cases progress to dementia within the time frame of a trial, and some may not have underlying AD pathology. Identifying those MCI cases most likely to decline during a trial, and thus most likely to benefit from treatment, will improve trial efficiency and power to detect treatment effects. To this end, multimodal imaging-derived inclusion criteria may be especially beneficial. Here, we present a novel multimodal imaging marker that predicts future cognitive and neural decline from [F-18]fluorodeoxyglucose positron emission tomography (PET), amyloid florbetapir PET, and structural magnetic resonance imaging, based on a new deep learning algorithm (randomized denoising autoencoder marker, rDAm). Using ADNI2 MCI data, we show that using rDAm as a trial enrichment criterion reduces required sample-size estimates at least fivefold compared with the no-enrichment regime and leads to smaller trials with high statistical power, compared with existing methods.

  12. A 3D image analysis tool for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.
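Intensity-based thresholding followed by voxel counting, as used in the semi-automated analysis above, can be sketched in a few lines (the image, threshold, and voxel size are synthetic):

```python
import numpy as np

def segment_and_volume(volume, threshold, voxel_ml):
    """Binary intensity thresholding of a 3-D image plus a volume
    estimate from the voxel count."""
    mask = volume >= threshold
    return mask, float(mask.sum()) * voxel_ml

# Synthetic "SPECT" volume: a bright 4x4x4 region on a dark background.
img = np.zeros((16, 16, 16))
img[4:8, 4:8, 4:8] = 100.0
mask, vol_ml = segment_and_volume(img, threshold=50.0, voxel_ml=0.5)
```

The study's fuzzy-connectedness approach replaces the hard threshold with a connectivity-based membership, but the volume computation from the resulting mask is the same.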

  13. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

    This chapter discusses quantitative analysis of digital microscope images and presents several exercises that illustrate the concepts. The basic concepts of quantitative imaging rest on a well-established foundation of signal theory and quantitative data analysis. Several examples develop an understanding of the imaging process as a transformation from sample to image, and of the limits and considerations of quantitative analysis. The chapter introduces the concept of digitally correcting images and focuses on some of the more critical types of data transformation and some frequently encountered issues in quantization. Image processing is a form of data processing, of which fitting data to a theoretical curve is another example; in all these cases, it is critical that care be taken during all steps of transformation, processing, and quantization. PMID:23931513

  15. Multi-criteria analysis on how to select solar radiation hydrogen production system

    SciTech Connect

    Badea, G.; Naghiu, G. S.; Felseghi, R.-A.; Giurca, I.; Răboacă, S.; Aşchilean, I.

    2015-12-23

    The purpose of this article is to present a method for selecting hydrogen-production systems that use electric power obtained from photovoltaic systems; as the selection method, we suggest an advanced multi-criteria analysis based on the FRISCO formula. According to the case study on selecting a solar radiation hydrogen production system, the most convenient alternative is A4, namely the technical solution involving a hydrogen production system based on the electrolysis of water vapor obtained with concentrated solar thermal systems and electrical power obtained using concentrating photovoltaic systems.

  16. Multiple Criteria and Multiple Periods Performance Analysis: The Comparison of North African Railways

    NASA Astrophysics Data System (ADS)

    Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.

    2008-10-01

    Multi-period differences in technical and financial performance are analysed by comparing five North African railways over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index, which measures total factor productivity change decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the ARGOS software. These methods provide complementary, detailed information: Malmquist discriminates technological from managerial progress, while PROMETHEE captures two dimensions of performance that are often in conflict, namely service to the community and enterprise performance.

  17. Multiscale Analysis of Solar Image Data

    NASA Astrophysics Data System (ADS)

    Young, C. A.; Myers, D. C.

    2001-12-01

    It is often said that the blessing and curse of solar physics is that there is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also burdened us with a larger volume of more complex data than previous missions. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or byte-scaled difference images. Features and structures in the images are identified qualitatively by the observer, and little quantitative, objective analysis is done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods are potentially suited to solar image analysis. Multiscale/multiresolution methods are perhaps the most promising: they formalize the human ability to view and comprehend phenomena on different scales, and so could be used to quantify the image processing done by the observer's eyes and brain. In this work we present a preliminary analysis of multiscale techniques applied to solar image data. Specifically, we explore the use of the 2-d wavelet transform and related transforms with EIT, LASCO and TRACE images. This work was supported by NASA contract NAS5-00220.
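A single level of the 2-d Haar wavelet transform, the simplest member of the multiscale family discussed above, can be written directly in numpy:

```python
import numpy as np

def haar2d_level(img):
    """One level of a 2-D Haar wavelet decomposition into an
    approximation band (LL) and three detail bands (LH, HL, HH)."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # vertical averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # vertical differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

# A featureless image puts all of its energy in the approximation band;
# structure at a given scale shows up in the detail bands instead.
flat = np.full((8, 8), 7.0)
ll, lh, hl, hh = haar2d_level(flat)
```

Recursing on the LL band yields the multiresolution pyramid that lets features be quantified scale by scale instead of by eye.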

  18. SU-E-J-27: Appropriateness Criteria for Deformable Image Registration and Dose Propagation

    SciTech Connect

    Papanikolaou, P; Tuohy, Rachel; Mavroidis, P; Eng, T; Gutierrez, A; Stathakis, S

    2014-06-01

    Purpose: Several commercial software packages have recently been released that allow the user to apply deformable registration algorithms (DRA) for image fusion and dose propagation. Although the idea of anatomically tracking the daily patient dose in the context of adaptive radiotherapy, or merely adding the dose from prior treatment to the current one, is very intuitive, the accuracy and applicability of such algorithms need to be investigated, as they remain somewhat subjective. In our study, we used true anatomical data in which we introduced changes in the density, volume and location of segmented structures to test the DRA for sensitivity and accuracy. Methods: The CT scan of a prostate patient was selected for this study. The CT images were first segmented to define structures such as the PTV, bladder, rectum, intestines and pelvic bone anatomy. We introduced anatomical changes in the reference patient image set in three different ways: (i) we kept the segmented volumes constant and changed the density of rectum and bladder in increments of 5%; (ii) we changed the volume of rectum and bladder in increments of 5%; and (iii) we kept the segmented volumes constant but changed their location by moving their COM in increments of 3mm. Using the Velocity software, we evaluated the accuracy of the DRA for each incremental change in all three scenarios. Results: The DRA performs reasonably well when the differential density difference against the background is more than 5%. For the volume change study, the DRA results became unreliable for relative volume changes greater than 10%. Finally, for the location study, the DRA performance was acceptable for shifts below 9mm. Conclusion: Site-specific and patient-specific QA for DRA is an important step in evaluating such algorithms prior to their use for dose propagation.

  19. Multi-attribute criteria applied to electric generation energy system analysis LDRD.

    SciTech Connect

    Kuswa, Glenn W.; Tsao, Jeffrey Yeenien; Drennen, Thomas E.; Zuffranieri, Jason V.; Paananen, Orman Henrie; Jones, Scott A.; Ortner, Juergen G.; Brewer, Jeffrey D.; Valdez, Maximo M.

    2005-10-01

    This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, comparisons of those options, and efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that will allow comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.

  20. Combination of a Stressor-Response Model with a Conditional Probability Analysis Approach for Developing Candidate Criteria from MBSS

    EPA Science Inventory

    I show that a conditional probability analysis using a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data, such as the Maryland Biological Streams Survey (MBSS) data.
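A candidate criterion can be read off a fitted logistic stressor-response curve as the stressor level at which the conditional probability of impairment crosses a chosen risk level. The coefficients below are hypothetical, not fitted to MBSS data:

```python
import math

# Hypothetical logistic stressor-response model:
#   P(impairment | x) = 1 / (1 + exp(-(b0 + b1 * x)))
# b0 and b1 are invented, not fitted to MBSS data.
b0, b1 = -4.0, 0.8

def p_impaired(x):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def candidate_criterion(p):
    """Stressor level at which the conditional probability of impairment
    equals p (the inverse of the logistic curve)."""
    return (math.log(p / (1.0 - p)) - b0) / b1

x50 = candidate_criterion(0.5)   # stressor level at 50% risk
```

Choosing a lower risk level p gives a more protective candidate criterion; the inverse-logistic formula makes that trade-off explicit.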

  1. Image Reconstruction Using Analysis Model Prior.

    PubMed

    Han, Yu; Du, Huiqian; Lam, Fan; Mei, Wenbo; Fang, Liping

    2016-01-01

    The analysis model has been previously exploited as an alternative to the classical sparse synthesis model for designing image reconstruction methods. Applying a suitable analysis operator to the image of interest yields a cosparse outcome which enables us to reconstruct the image from undersampled data. In this work, we introduce an additional prior in the analysis context and theoretically study the uniqueness issues in terms of analysis operators in general position and the specific 2D finite difference operator. We establish bounds on the minimum number of measurements that are lower than those in cases without the analysis model prior. Based on the idea of iterative cosupport detection (ICD), we develop a novel image reconstruction model and an effective algorithm, achieving significantly better reconstruction performance. Simulation results on synthetic and practical magnetic resonance (MR) images are also shown to illustrate our theoretical claims. PMID:27379171
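The cosparsity induced by the 2D finite difference operator mentioned above is easy to demonstrate: applying horizontal and vertical first differences to a piecewise-constant image yields mostly zero analysis coefficients. A minimal sketch:

```python
import numpy as np

def finite_difference_analysis(img):
    """Apply the 2-D finite difference analysis operator: horizontal and
    vertical first differences stacked into one coefficient vector."""
    dh = np.diff(img, axis=1).ravel()
    dv = np.diff(img, axis=0).ravel()
    return np.concatenate([dh, dv])

# A piecewise-constant image with a single vertical edge: almost every
# analysis coefficient is exactly zero (the image is cosparse).
img = np.zeros((8, 8))
img[:, 4:] = 1.0
coeffs = finite_difference_analysis(img)
cosparsity = int(np.sum(coeffs == 0.0))   # number of zero coefficients
```

It is this large set of zero coefficients (the cosupport) that the ICD-style methods described in the abstract iteratively detect and exploit.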

  3. Integrating multi-criteria decision analysis for GIS-based hazardous waste landfill siting in Kurdistan Province, western Iran

    SciTech Connect

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-10-15

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing a significant amount of spatial information, which GIS can handle as an important tool for land-use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. GIS was used to perform an initial screening that eliminated unsuitable land, followed by a multi-criteria decision analysis (MCDA) to identify the most suitable sites using information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. To evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria resulting from field observation. Using this suitability index, 15 different sites were visited, and the most suitable sites were determined based on the numerical evaluation provided by the MCDA.

  4. Quantitative image analysis of synovial tissue.

    PubMed

    van der Hall, Pascal O; Kraan, Maarten C; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Its hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the acquisition, storage, and evaluation of images with dedicated hardware and software. Major advantages of quantitative image analysis over traditional techniques include sophisticated calibration systems, interactivity, speed, and control of inter- and intraobserver variation. This results in a well-controlled environment, which is essential for quality control and reproducibility, and helps to optimize sensitivity and specificity. To achieve this, an optimal quantitative image analysis system combines solid software engineering with easy interactivity for the operator. The system also needs to be as transparent as possible in generating the data, because a "black box" design will deliver uncontrollable results. In addition to these general aspects, for the analysis of synovial tissue specifically, the necessity of interactivity is highlighted by the added value of identifying and quantifying information present in areas such as the intimal lining layer, blood vessels, and lymphocyte aggregates. Speed is another important aspect of digital cytometry. Rapidly increasing numbers of samples, together with the accumulation of a variety of markers and detection techniques, have made traditional analysis techniques such as manual quantification and semi-quantitative analysis impractical. It can be anticipated that the development of even more powerful computer systems with sophisticated software will further facilitate reliable analysis at high speed.

  5. Update on appropriate use criteria for amyloid PET imaging: dementia experts, mild cognitive impairment, and education. Amyloid Imaging Task Force of the Alzheimer’s Association and Society for Nuclear Medicine and Molecular Imaging.

    PubMed

    Johnson, Keith A; Minoshima, Satoshi; Bohnen, Nicolaas I; Donohoe, Kevin J; Foster, Norman L; Herscovitch, Peter; Karlawish, Jason H; Rowe, Christopher C; Hedrick, Saima; Pappas, Virginia; Carrillo, Maria C; Hartley, Dean M

    2013-07-01

    Amyloid PET imaging is a novel diagnostic test that can detect in living humans one of the two defining pathologic lesions of Alzheimer disease, amyloid-β deposition in the brain. The Amyloid Imaging Task Force of the Alzheimer's Association and Society for Nuclear Medicine and Molecular Imaging previously published appropriate use criteria for amyloid PET as an important tool for increasing the certainty of a diagnosis of Alzheimer disease in specific patient populations. Here, the task force further clarifies and expands 3 topics discussed in the original paper: first, defining dementia experts and their use of proper documentation to demonstrate the medical necessity of an amyloid PET scan; second, identifying a specific subset of individuals with mild cognitive impairment for whom an amyloid PET scan is appropriate; and finally, developing educational programs to increase awareness of the amyloid PET appropriate use criteria and providing instructions on how this test should be used in the clinical decision-making process.

  6. Fidelity Analysis of Sampled Imaging Systems

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.; Rahman, Zia-ur

    1999-01-01

    Many modeling, simulation and performance analysis studies of sampled imaging systems are inherently incomplete because they are conditioned on a discrete-input, discrete-output model that only accounts for blurring during image acquisition and additive noise. For those sampled imaging systems where the effects of digital image acquisition, digital filtering and reconstruction are significant, the modeling, simulation and performance analysis should be based on a more comprehensive continuous-input, discrete-processing, continuous-output end-to-end model. This more comprehensive model should properly account for the low-pass filtering effects of image acquisition prior to sampling, the potentially important noiselike effects of the aliasing caused by sampling, additive noise due to device electronics and quantization, the generally high-boost filtering effects of digital processing, and the low-pass filtering effects of image reconstruction. This model should not, however, be so complex as to preclude significant mathematical analysis, particularly the mean-square (fidelity) type of analysis so common in linear system theory. We demonstrate that, although the mathematics of such a model is more complex, the increase in complexity is not so great as to prevent a complete fidelity-metric analysis at both the component level and at the end-to-end system level: that is, computable mean-square-based fidelity metrics are developed by which both component-level and system-level performance can be quantified. In addition, we demonstrate that system performance can be assessed qualitatively by visualizing the output image as the sum of three component images, each of which relates to a corresponding fidelity metric. The cascaded, or filtered, component accounts for the end-to-end system filtering of image acquisition, digital processing, and image reconstruction; the random noise component accounts for additive random noise, modulated by digital processing and image

  7. On the predictive information criteria for model determination in seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data that are not independent given the parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, in which the joint density is decomposed into the product of conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above-mentioned criteria are global summary measures of model performance, but a more detailed analysis may be required to discover the reasons for poor global performance. In that case, a retrospective predictive analysis is performed on each individual observation. In this study we performed a Bayesian analysis of Italian data sets using four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). We then illustrate their performance as evaluated by the Bayes factor, predictive information criteria, and retrospective predictive analysis.
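    The decomposition attributed above to Ando and Tsay can be written schematically (generic notation, a sketch rather than the authors' exact formulation):

```latex
p(y_1,\dots,y_n \mid M) \;=\; \prod_{t=1}^{n} p\!\left(y_t \mid y_{1:t-1}, M\right),
\qquad
\log p(y_{1:n} \mid M) \;=\; \sum_{t=1}^{n} \log p\!\left(y_t \mid y_{1:t-1}, M\right),
```

    so for dependent data the joint predictive score accumulates one-step-ahead conditional densities, each of which can also be inspected individually in a retrospective predictive analysis.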

  9. Assessing Interventions to Manage West Nile Virus Using Multi-Criteria Decision Analysis with Risk Scenarios.

    PubMed

    Hongoh, Valerie; Campagna, Céline; Panic, Mirna; Samuel, Onil; Gosselin, Pierre; Waaub, Jean-Philippe; Ravel, André; Samoura, Karim; Michel, Pascal

    2016-01-01

    The recent emergence of West Nile virus (WNV) in North America highlights vulnerability to climate sensitive diseases and stresses the importance of preventive efforts to reduce their public health impact. Effective prevention involves reducing environmental risk of exposure and increasing adoption of preventive behaviours, both of which depend on knowledge and acceptance of such measures. When making operational decisions about disease prevention and control, public health must take into account a wide range of operational, environmental, social and economic considerations in addition to intervention effectiveness. The current study aimed to identify, assess and rank possible risk reduction measures taking into account a broad set of criteria and perspectives applicable to the management of WNV in Quebec under increasing transmission risk scenarios, some of which may be related to ongoing warming in higher-latitude regions. A participatory approach was used to collect information on categories of concern to relevant stakeholders with respect to WNV prevention and control. Multi-criteria decision analysis was applied to examine stakeholder perspectives and their effect on strategy rankings under increasing transmission risk scenarios. Twenty-three preventive interventions were retained for evaluation using eighteen criteria identified by stakeholders. Combined evaluations revealed that, at an individual-level, inspecting window screen integrity, wearing light colored, long clothing, eliminating peridomestic larval sites and reducing outdoor activities at peak times were top interventions under six WNV transmission scenarios. At a regional-level, the use of larvicides was a preferred strategy in five out of six scenarios, while use of adulticides and dissemination of sterile male mosquitoes were found to be among the least favoured interventions in almost all scenarios. Our findings suggest that continued public health efforts aimed at reinforcing individual

  10. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    PubMed

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
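    The additive form of Multi-Attribute Utility Theory described above can be sketched in a few lines. This is a minimal illustration, not the study's actual framework: the criteria names, weights, and scores below are hypothetical.

```python
# Minimal sketch of a Multi-Attribute Utility Theory (MAUT) evaluation,
# assuming an additive utility function with normalized weights.
# Criteria, weights, and scores are illustrative, not from the study.

def additive_utility(scores, weights):
    """U(a) = sum_i w_i * u_i(a), with weights normalized to sum to 1."""
    total_w = sum(weights.values())
    return sum(weights[c] * scores[c] for c in weights) / total_w

# Hypothetical data streams scored on three criteria (utilities in [0, 1]).
weights = {"timeliness": 0.5, "coverage": 0.3, "cost": 0.2}
candidates = {
    "clinic_reports":  {"timeliness": 0.4, "coverage": 0.9, "cost": 0.8},
    "news_scraping":   {"timeliness": 0.9, "coverage": 0.5, "cost": 0.6},
    "lab_submissions": {"timeliness": 0.3, "coverage": 0.8, "cost": 0.9},
}

# Rank data streams by overall utility, highest first.
ranking = sorted(candidates,
                 key=lambda a: additive_utility(candidates[a], weights),
                 reverse=True)
print(ranking)
```

    The weighted sum makes the tradeoffs explicit: a stream that is slow but comprehensive can outrank a fast but narrow one, depending entirely on the elicited weights.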

  13. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases.

    PubMed

    Hongoh, Valerie; Hoen, Anne Gatewood; Aenishaenslin, Cécile; Waaub, Jean-Philippe; Bélanger, Denise; Michel, Pascal

    2011-12-29

    The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk with other criteria to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria, using both data-driven and qualitative indicators, for evaluating alternative strategies with transparency and stakeholder participation. Here we propose an MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation, and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular.
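    The spatially explicit step of such an approach is, at its core, a weighted overlay: each raster layer scores one criterion per grid cell, and a weighted sum yields a priority surface. The toy layers and weights below are hypothetical, for illustration only.

```python
# Sketch of a spatially explicit MCDA weighted overlay on a tiny 2x2 grid.
# Each layer scores one criterion per cell, scaled to [0, 1]; values and
# weights are illustrative, not from the paper.

risk      = [[0.2, 0.8], [0.6, 0.9]]   # modeled disease risk
accept    = [[0.9, 0.4], [0.7, 0.5]]   # public acceptance of intervention
resources = [[0.5, 0.5], [0.8, 0.3]]   # resource availability

weights = {"risk": 0.5, "accept": 0.3, "resources": 0.2}
layers  = {"risk": risk, "accept": accept, "resources": resources}

rows, cols = 2, 2
# Weighted sum per cell produces the priority surface.
priority = [[sum(weights[k] * layers[k][i][j] for k in weights)
             for j in range(cols)] for i in range(rows)]
print(priority)
```

    The highest-priority cell need not be the highest-risk cell: a moderately risky area with strong acceptance and available resources can rank first, which is exactly the kind of tradeoff a risk map alone cannot express.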

  16. Optical Analysis of Microscope Images

    NASA Astrophysics Data System (ADS)

    Biles, Jonathan R.

    Microscope images were analyzed with coherent and incoherent light using analog optical techniques. These techniques were found to be useful for analyzing large numbers of nonsymbolic, statistical microscope images. In the first part phase coherent transparencies having 20-100 human multiple myeloma nuclei were simultaneously photographed at 100 power magnification using high resolution holographic film developed to high contrast. An optical transform was obtained by focussing the laser onto each nuclear image and allowing the diffracted light to propagate onto a one dimensional photosensor array. This method reduced the data to the position of the first two intensity minima and the intensity of successive maxima. These values were utilized to estimate the four most important cancer detection clues of nuclear size, shape, darkness, and chromatin texture. In the second part, the geometric and holographic methods of phase incoherent optical processing were investigated for pattern recognition of real-time, diffuse microscope images. The theory and implementation of these processors was discussed in view of their mutual problems of dimness, image bias, and detector resolution. The dimness problem was solved by either using a holographic correlator or a speckle free laser microscope. The latter was built using a spinning tilted mirror which caused the speckle to change so quickly that it averaged out during the exposure. To solve the bias problem low image bias templates were generated by four techniques: microphotography of samples, creation of typical shapes by computer graphics editor, transmission holography of photoplates of samples, and by spatially coherent color image bias removal. The first of these templates was used to perform correlations with bacteria images. The aperture bias was successfully removed from the correlation with a video frame subtractor. To overcome the limited detector resolution it is necessary to discover some analog nonlinear intensity

  17. Secure thin client architecture for DICOM image analysis

    NASA Astrophysics Data System (ADS)

    Mogatala, Harsha V. R.; Gallet, Jacqueline

    2005-04-01

    This paper presents a concept of a Secure Thin Client (STC) architecture for Digital Imaging and Communications in Medicine (DICOM) image analysis over the Internet. The STC architecture provides in-depth analysis and design of customized reports for DICOM images using drag-and-drop and data warehouse technology. Using a personal computer and a common set of browsing software, STC can be used for analyzing and reporting detailed patient information, type of examination, date, computed tomography (CT) dose index, and other relevant information stored within the image header files as well as in the hospital databases. The STC architecture has three tiers. The first tier consists of a drag-and-drop, web-based interface and web server, which provides customized analysis and reporting ability to the users. The second tier consists of an online analytical processing (OLAP) server and database system, which serves fast, real-time, aggregated multi-dimensional data using OLAP technology. The third tier consists of a smart-algorithm-based software program that extracts DICOM tags from CT images in this particular application, irrespective of CT vendor, and transfers these tags into a secure database system. This architecture provides the Winnipeg Regional Health Authority (WRHA) with quality indicators for CT examinations in the hospitals. It also provides health care professionals with an analytical tool to optimize radiation dose and image quality parameters. The information is provided to the user by way of a secure sockets layer (SSL) and role-based security criteria over the Internet. Although this particular application has been developed for WRHA, this paper also discusses the effort to extend the architecture to other hospitals in the region. Any DICOM tag from any imaging modality could be tracked with this software.
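    The second- and third-tier interaction described above, extracted header tags loaded into a database and then aggregated into quality indicators, can be sketched with an in-memory database. The table schema, tag values, and scanner names below are hypothetical; a real third tier would parse the DICOM files themselves.

```python
import sqlite3

# Sketch of the tier interaction: extracted DICOM header tags (hypothetical
# values) are stored in a database, and an OLAP-style aggregate query derives
# a quality indicator, here the mean CT dose index (CTDIvol) per scanner.

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE ct_exams (
    patient_id TEXT, scanner TEXT, exam_date TEXT, ctdi_vol REAL)""")
rows = [
    ("P001", "CT-A", "2005-01-10", 12.4),
    ("P002", "CT-A", "2005-01-11", 14.0),
    ("P003", "CT-B", "2005-01-11", 9.6),
]
conn.executemany("INSERT INTO ct_exams VALUES (?, ?, ?, ?)", rows)

# Aggregate per scanner, the kind of roll-up the OLAP tier would serve.
report = conn.execute(
    "SELECT scanner, AVG(ctdi_vol) FROM ct_exams "
    "GROUP BY scanner ORDER BY scanner"
).fetchall()
print(report)
```

    Once tags are in tabular form, any roll-up (per scanner, per protocol, per month) is a single query, which is what makes the warehouse design attractive for dose monitoring.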

  18. Digital Image Analysis for DETECHIP® Code Determination

    PubMed Central

    Lyon, Marcus; Wilson, Mark V.; Rouhier, Kerry A.; Symonsbergen, David J.; Bastola, Kiran; Thapa, Ishwor; Holmes, Andrea E.

    2013-01-01

    DETECHIP® is a molecular sensing array used for identification of a large variety of substances. Previous methodology for the analysis of DETECHIP® relied on human vision to distinguish color changes induced by the presence of the analyte of interest. This paper describes several analysis techniques using digital images of DETECHIP®. Both a digital camera and a flatbed desktop photo scanner were used to obtain JPEG images. Color information within these digital images was obtained through the measurement of red-green-blue (RGB) values using software such as GIMP, Photoshop, and ImageJ. Several different techniques were used to evaluate these color changes. It was determined that the flatbed scanner produced the clearest and most reproducible images. Furthermore, codes obtained using a macro written for use within ImageJ showed improved consistency versus previous methods. PMID:25267940
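    The RGB-comparison idea behind this kind of digital readout can be sketched as follows. This is an illustrative reduction of the approach, not the paper's macro: the pixel values and the detection threshold are invented for the example.

```python
# Sketch of an RGB color-change test: compare mean red-green-blue values of
# a control spot and an analyte spot, flagging a change when any channel
# shifts beyond a threshold. Pixel data and threshold are illustrative.

def mean_rgb(pixels):
    """Mean (R, G, B) over a list of pixel tuples."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def color_change(control, sample, threshold=10):
    """Return 1 if any RGB channel differs by more than `threshold`, else 0."""
    c, s = mean_rgb(control), mean_rgb(sample)
    return int(any(abs(ci - si) > threshold for ci, si in zip(c, s)))

control_spot = [(200, 180, 60), (202, 178, 62), (198, 181, 59)]
analyte_spot = [(160, 182, 61), (158, 180, 63), (161, 179, 60)]

print(color_change(control_spot, analyte_spot))
```

    Collecting one such 0/1 decision per well yields the binary code used to identify the substance, and doing it numerically removes the observer-dependence of visual scoring.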

  19. Item Response Theory Analysis of DSM-IV Cannabis Abuse and Dependence Criteria in Adolescents

    ERIC Educational Resources Information Center

    Hartman, Christie A.; Gelhorn, Heather; Crowley, Thomas J.; Sakai, Joseph T.; Stallings, Michael; Young, Susan E.; Rhee, Soo Hyun; Corley, Robin; Hewitt, John K.; Hopfer, Christian J.

    2008-01-01

    A study examining the DSM-IV criteria for cannabis abuse and dependence among adolescents is presented. The results indicate that the abuse and dependence criteria were not found to reflect different levels of severity of cannabis use.

  20. Comparative Analysis of Thermoeconomic Evaluation Criteria for an Actual Heat Engine

    NASA Astrophysics Data System (ADS)

    Özel, Gülcan; Açıkkalp, Emin; Savaş, Ahmet Fevzi; Yamık, Hasan

    2016-07-01

    In the present study, an actual heat engine is investigated using different thermoeconomic evaluation criteria from the literature. A criterion that has not been investigated in detail is considered; it is called the ecologico-economic criterion (F_{EC}) and is defined as the difference between the power cost and the exergy destruction rate cost of the system. All four criteria are applied to an irreversible Carnot heat engine, results are presented numerically, and some suggestions are made.
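    Written out in generic notation (the cost-rate symbols below are assumptions for illustration, not necessarily the authors'):

```latex
F_{EC} \;=\; \dot{C}_{W} - \dot{C}_{D} \;=\; c_{W}\,\dot{W} \;-\; c_{D}\,\dot{E}x_{\mathrm{dest}},
```

    where \(\dot{W}\) is the power output, \(\dot{E}x_{\mathrm{dest}}\) the exergy destruction rate, and \(c_{W}\), \(c_{D}\) their respective unit costs; maximizing \(F_{EC}\) trades revenue from power against the economic penalty of irreversibility.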

  1. Analysis of dynamic brain imaging data.

    PubMed Central

    Mitra, P P; Pesaran, B

    1999-01-01

    Modern imaging techniques for probing brain function, including functional magnetic resonance imaging, intrinsic and extrinsic contrast optical imaging, and magnetoencephalography, generate large data sets with complex content. In this paper we develop appropriate techniques for analysis and visualization of such imaging data to separate the signal from the noise and characterize the signal. The techniques developed fall into the general category of multivariate time series analysis, and in particular we extensively use the multitaper framework of spectral analysis. We develop specific protocols for the analysis of fMRI, optical imaging, and MEG data, and illustrate the techniques by applications to real data sets generated by these imaging modalities. In general, the analysis protocols involve two distinct stages: "noise" characterization and suppression, and "signal" characterization and visualization. An important general conclusion of our study is the utility of a frequency-based representation, with short, moving analysis windows to account for nonstationarity in the data. Of particular note are 1) the development of a decomposition technique (space-frequency singular value decomposition) that is shown to be a useful means of characterizing the image data, and 2) the development of an algorithm, based on multitaper methods, for the removal of approximately periodic physiological artifacts arising from cardiac and respiratory sources. PMID:9929474
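    The frequency-based representation advocated above can be illustrated in miniature: a windowed discrete Fourier transform locates an approximately periodic (e.g. cardiac-like) component in a time series, the first step toward removing it. The synthetic signal, sampling rate, and frequency below are invented for the example; the paper's actual methods are multitaper estimators, which this sketch does not implement.

```python
import cmath
import math

# Toy sketch: a DFT power spectrum of one analysis window locates a
# periodic physiological-style artifact in a synthetic time series.

def dft_power(x):
    """Power at each nonnegative frequency bin of a real signal."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 for k in range(n // 2 + 1)]

fs = 32                                   # samples per second (assumed)
t = [i / fs for i in range(64)]           # one short analysis window
signal = [math.sin(2 * math.pi * 4.0 * ti) for ti in t]   # 4 Hz "artifact"

power = dft_power(signal)
peak_bin = max(range(1, len(power)), key=lambda k: power[k])
peak_hz = peak_bin * fs / len(signal)
print(peak_hz)
```

    Repeating this over short, moving windows is what accommodates nonstationarity: the artifact frequency is re-estimated locally rather than assumed fixed over the whole run.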

  2. A multi-criteria decision analysis assessment of waste paper management options

    SciTech Connect

    Hanan, Deirdre; Burnley, Stephen; Cooke, David

    2013-03-15

    Highlights: ► Isolated communities have particular problems in terms of waste management. ► An MCDA tool allowed a group of non-experts to evaluate waste management options. ► The group preferred local waste management solutions to export to the mainland. ► Gasification of paper was the preferred option followed by recycling. ► The group concluded that they could be involved in the decision making process. - Abstract: The use of Multi-criteria Decision Analysis (MCDA) was investigated in an exercise using a panel of local residents and stakeholders to assess the options for managing waste paper on the Isle of Wight. Seven recycling, recovery and disposal options were considered by the panel who evaluated each option against seven environmental, financial and social criteria. The panel preferred options where the waste was managed on the island with gasification and recycling achieving the highest scores. Exporting the waste to the English mainland for incineration or landfill proved to be the least preferred options. This research has demonstrated that MCDA is an effective way of involving community groups in waste management decision making.

  3. Multi-criteria decision analysis for waste management in Saharawi refugee camps.

    PubMed

    Garfì, M; Tondelli, S; Bonoli, A

    2009-10-01

    The aim of this paper is to compare different waste management solutions in Saharawi refugee camps (Algeria) and to test the feasibility of a decision-making method developed to be applied in particular conditions in which environmental and social aspects must be considered. It is based on multi-criteria analysis, and in particular on the analytic hierarchy process (AHP), a mathematical technique for multi-criteria decision making (Saaty, T.L., 1980. The Analytic Hierarchy Process. McGraw-Hill, New York, USA; Saaty, T.L., 1990. How to Make a Decision: The Analytic Hierarchy Process. European Journal of Operational Research; Saaty, T.L., 1994. Decision Making for Leaders: The Analytic Hierarchy Process in a Complex World. RWS Publications, Pittsburgh, PA), and on a participatory approach focusing on the local community's concerns. The research compares four different waste collection and management alternatives: waste collection by using three tipper trucks, disposal and burning in an open area; waste collection by using seven dumpers and disposal in a landfill; waste collection by using seven dumpers and three tipper trucks and disposal in a landfill; waste collection by using three tipper trucks and disposal in a landfill. The results show that the second and the third solutions provide better scenarios for waste management. Furthermore, the discussion of the results points out the multidisciplinarity of the approach, and the equilibrium between social, environmental and technical impacts. This is a very important aspect in a humanitarian and environmental project, confirming the appropriateness of the chosen method.
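    The AHP weighting step referenced above can be sketched concisely: a pairwise comparison matrix on Saaty's 1-9 scale is column-normalized and row-averaged, a standard approximation to its principal eigenvector, giving the criteria weights. The matrix and criteria names below are illustrative, not the paper's data.

```python
# Sketch of the AHP priority-vector computation by column normalization.
# The pairwise judgments are hypothetical, on Saaty's 1-9 scale, for three
# example criteria: environmental impact, social acceptance, technical cost.

def ahp_weights(matrix):
    """Approximate the principal eigenvector of a pairwise comparison
    matrix: normalize each column, then average across each row."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    normalized = [[matrix[i][j] / col_sums[j] for j in range(n)]
                  for i in range(n)]
    return [sum(row) / n for row in normalized]

A = [
    [1,     3,     5],      # environmental vs. social, technical
    [1/3,   1,     3],
    [1/5,   1/3,   1],      # reciprocals below the diagonal
]
weights = ahp_weights(A)
print([round(w, 3) for w in weights])
```

    The weights then multiply each alternative's scores, so the ranking of the four collection schemes directly inherits the stakeholders' pairwise judgments.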

  5. NIH Image to ImageJ: 25 years of image analysis.

    PubMed

    Schneider, Caroline A; Rasband, Wayne S; Eliceiri, Kevin W

    2012-07-01

    For the past 25 years NIH Image and ImageJ software have been pioneers as open tools for the analysis of scientific images. We discuss the origins, challenges and solutions of these two programs, and how their history can serve to advise and inform other software projects.

  6. Difference Image Analysis of Galactic Microlensing. II. Microlensing Events

    SciTech Connect

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.

    1999-09-01

    The MACHO collaboration has been carrying out difference image analysis (DIA) since 1996 with the aim of increasing the sensitivity to the detection of gravitational microlensing. This is a preliminary report on the application of DIA to galactic bulge images in one field. We show how the DIA technique significantly increases the number of detected lensing events, by removing the positional dependence of traditional photometry schemes and lowering the microlensing event detection threshold. This technique, unlike PSF photometry, gives the unblended colors and positions of the microlensing source stars. We present a set of criteria for selecting microlensing events from objects discovered with this technique. The 16 pixel and classical microlensing events discovered with the DIA technique are presented. © 1999 The American Astronomical Society.
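    The core of difference image analysis can be shown on a toy grid: subtracting a reference image from a target image cancels every constant source, so only variability remains. The arrays below are invented, and the sketch omits the PSF-matching convolution a real DIA pipeline performs before subtraction.

```python
# Toy difference image: constant stars cancel, a brightened (e.g. lensed)
# source survives. Values are illustrative; real DIA convolves the reference
# to match the target's seeing before subtracting.

reference = [[10, 10, 10],
             [10, 50, 10],
             [10, 10, 10]]        # sky background + one constant star

target    = [[10, 10, 10],
             [10, 80, 10],
             [10, 10, 10]]        # same field, star now magnified

difference = [[target[i][j] - reference[i][j] for j in range(3)]
              for i in range(3)]
print(difference)
```

    Because the subtraction is done pixel by pixel, the variable flux is measured without needing to know which catalogued object it belongs to, which is why DIA removes the positional dependence of traditional photometry.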

  7. Spatial multi-criteria decision analysis to predict suitability for African swine fever endemicity in Africa

    PubMed Central

    2014-01-01

    Background African swine fever (ASF) is endemic in several countries of Africa and may pose a risk to all pig producing areas on the continent. Official ASF reporting is often rare and there remains limited awareness of the continent-wide distribution of the disease. In the absence of accurate ASF outbreak data and few quantitative studies on the epidemiology of the disease in Africa, we used spatial multi-criteria decision analysis (MCDA) to derive predictions of the continental distribution of suitability for ASF persistence in domestic pig populations as part of sylvatic or domestic transmission cycles. In order to incorporate the uncertainty in the relative importance of different criteria in defining suitability, we modelled decisions within the MCDA framework using a stochastic approach. The predictive performance of suitability estimates was assessed via a partial ROC analysis using ASF outbreak data reported to the OIE since 2005. Results Outputs from the spatial MCDA indicate that large areas of sub-Saharan Africa may be suitable for ASF persistence as part of either domestic or sylvatic transmission cycles. Areas with high suitability for pig to pig transmission (‘domestic cycles’) were estimated to occur throughout sub-Saharan Africa, whilst areas with high suitability for introduction from wildlife reservoirs (‘sylvatic cycles’) were found predominantly in East, Central and Southern Africa. Based on average AUC ratios from the partial ROC analysis, the predictive ability of suitability estimates for domestic cycles alone was considerably higher than suitability estimates for sylvatic cycles alone, or domestic and sylvatic cycles in combination. Conclusions This study provides the first standardised estimates of the distribution of suitability for ASF transmission associated with domestic and sylvatic cycles in Africa. We provide further evidence for the utility of knowledge-driven risk mapping in animal health, particularly in data

  8. Factor Analysis of the Image Correlation Matrix.

    ERIC Educational Resources Information Center

    Kaiser, Henry F.; Cerny, Barbara A.

    1979-01-01

    Whether to factor the image correlation matrix or to use a new model with an alpha factor analysis of it is mentioned, with particular reference to the determinacy problem. It is pointed out that the distribution of the images is sensibly multivariate normal, making for "better" factor analyses. (Author/CTM)

  9. Brown Adipose Reporting Criteria in Imaging STudies (BARCIST 1.0): Recommendations for Standardized FDG-PET/CT Experiments in Humans.

    PubMed

    Chen, Kong Y; Cypess, Aaron M; Laughlin, Maren R; Haft, Carol R; Hu, Houchun Harry; Bredella, Miriam A; Enerbäck, Sven; Kinahan, Paul E; Lichtenbelt, Wouter van Marken; Lin, Frank I; Sunderland, John J; Virtanen, Kirsi A; Wahl, Richard L

    2016-08-01

    Human brown adipose tissue (BAT) presence, metabolic activity, and estimated mass are typically measured by imaging [18F]fluorodeoxyglucose (FDG) uptake in response to cold exposure in regions of the body expected to contain BAT, using positron emission tomography combined with X-ray computed tomography (FDG-PET/CT). Efforts to describe the epidemiology and biology of human BAT are hampered by diverse experimental practices, making it difficult to directly compare results among laboratories. An expert panel was assembled by the National Institute of Diabetes and Digestive and Kidney Diseases on November 4, 2014 to discuss minimal requirements for conducting FDG-PET/CT experiments of human BAT, data analysis, and publication of results. This resulted in Brown Adipose Reporting Criteria in Imaging STudies (BARCIST 1.0). Since there are no fully validated best practices at this time, panel recommendations are meant to enhance comparability across experiments, but not to constrain experimental design or the questions that can be asked. PMID:27508870

  10. An Imaging And Graphics Workstation For Image Sequence Analysis

    NASA Astrophysics Data System (ADS)

    Mostafavi, Hassan

    1990-01-01

    This paper describes an application-specific engineering workstation designed and developed to analyze imagery sequences from a variety of sources. The system combines the software and hardware environment of the modern graphic-oriented workstations with the digital image acquisition, processing and display techniques. The objective is to achieve automation and high throughput for many data reduction tasks involving metric studies of image sequences. The applications of such an automated data reduction tool include analysis of the trajectory and attitude of aircraft, missile, stores and other flying objects in various flight regimes including launch and separation as well as regular flight maneuvers. The workstation can also be used in an on-line or off-line mode to study three-dimensional motion of aircraft models in simulated flight conditions such as wind tunnels. The system's key features are: 1) Acquisition and storage of image sequences by digitizing real-time video or frames from a film strip; 2) computer-controlled movie loop playback, slow motion and freeze frame display combined with digital image sharpening, noise reduction, contrast enhancement and interactive image magnification; 3) multiple leading edge tracking in addition to object centroids at up to 60 fields per second from both live input video or a stored image sequence; 4) automatic and manual field-of-view and spatial calibration; 5) image sequence data base generation and management, including the measurement data products; 6) off-line analysis software for trajectory plotting and statistical analysis; 7) model-based estimation and tracking of object attitude angles; and 8) interface to a variety of video players and film transport sub-systems.

  12. A Robust Actin Filaments Image Analysis Framework.

    PubMed

    Alioscha-Perez, Mitchel; Benadiba, Carine; Goossens, Katty; Kasas, Sandor; Dietler, Giovanni; Willaert, Ronnie; Sahli, Hichem

    2016-08-01

    The cytoskeleton is a highly dynamical protein network that plays a central role in numerous cellular physiological processes, and is traditionally divided into three components according to its chemical composition, i.e. the actin, tubulin and intermediate filament cytoskeletons. Understanding cytoskeleton dynamics is of prime importance for unveiling the mechanisms involved in cell adaptation to any type of stress. Fluorescence imaging of cytoskeleton structures allows analyzing the impact of mechanical stimulation on the cytoskeleton, but it also imposes additional challenges in the image processing stage, such as the presence of imaging-related artifacts and heavy blurring introduced by (high-throughput) automated scans. However, although a considerable number of image-based analytical tools exist for image processing and analysis, most of them are unfit to cope with the aforementioned challenges. Filamentous structures in images can be considered as a piecewise composition of quasi-straight segments (at least at some finer or coarser scale). Based on this observation, we propose a three-step actin filament extraction methodology: (i) the input image is decomposed into a 'cartoon' part corresponding to the filament structures in the image and a noise/texture part; (ii) on the 'cartoon' image, we apply a multi-scale line detector coupled with (iii) a quasi-straight filament merging algorithm for fiber extraction. The proposed robust actin filaments image analysis framework allows extracting individual filaments in the presence of noise, artifacts and heavy blurring. Moreover, it provides numerous parameters such as filament orientation, position and length, useful for further analysis. Cell image decomposition is relatively under-exploited in biological image processing, and our study shows the benefits it provides when addressing such tasks. Experimental validation was conducted using publicly available datasets, and in osteoblasts grown in

  13. Launch commit criteria performance trending analysis, phase 1, revision A. SRM and QA mission services

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An assessment of quantitative methods and measures for measuring launch commit criteria (LCC) performance measurement trends is made. A statistical performance trending analysis pilot study was processed and compared to STS-26 mission data. This study used four selected shuttle measurement types (solid rocket booster, external tank, space shuttle main engine, and range safety switch safe and arm device) from the five missions prior to mission 51-L. After obtaining raw data coordinates, each set of measurements was processed to obtain statistical confidence bounds and mean data profiles for each of the selected measurement types. STS-26 measurements were compared to the statistical data base profiles to verify the statistical capability of assessing occurrences of data trend anomalies and abnormal time-varying operational conditions associated with data amplitude and phase shifts.

  14. On image analysis in fractography (Methodological Notes)

    NASA Astrophysics Data System (ADS)

    Shtremel', M. A.

    2015-10-01

    Like other spheres of image analysis, fractography has no universal method for information convolution. An effective characteristic of an image is found by analyzing the essence and origin of every class of objects. As follows from the geometric definition of a fractal curve, its projection onto any straight line covers a certain segment many times; therefore, neither a time series (a one-valued function of time) nor an image (a one-valued function of the plane) can be a fractal. For applications, multidimensional multiscale characteristics of an image are necessary. "Full" wavelet series break the law of conservation of information.

  15. Harnessing ecosystem models and multi-criteria decision analysis for the support of forest management.

    PubMed

    Wolfslehner, Bernhard; Seidl, Rupert

    2010-12-01

    The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.

  17. Malware analysis using visualized image matrices.

    PubMed

    Han, KyoungSoo; Kang, BooJoong; Im, Eul Gyu

    2014-01-01

    This paper proposes a novel malware visual analysis method that contains not only a visualization method to convert binary files into images, but also a similarity calculation method between these images. The proposed method generates RGB-colored pixels on image matrices using the opcode sequences extracted from malware samples and calculates the similarities for the image matrices. In particular, our proposed methods are applicable to packed malware samples by applying them to the execution traces extracted through dynamic analysis. When the images are generated, we can reduce the overheads by extracting the opcode sequences only from the blocks that include the instructions related to staple behaviors such as functions and application programming interface (API) calls. In addition, we propose a technique that generates a representative image for each malware family in order to reduce the number of comparisons for the classification of unknown samples; the colored pixel information in the image matrices is used to calculate the similarities between the images. Our experimental results show that the image matrices of malware can effectively be used to classify malware families both statically and dynamically with accuracies of 0.9896 and 0.9732, respectively. PMID:25133202
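    One plausible reading of the visualization step — opcode sequences mapped to colored pixels, then images compared for similarity — can be sketched as follows. The hashing-based pixel encoding and the exact-match similarity measure are simplifying assumptions for illustration, not the paper's precise scheme:

```python
import hashlib
import numpy as np

def opcode_image(opcodes, size=8):
    """Map an opcode sequence to an RGB matrix: each consecutive
    opcode pair is hashed to a pixel coordinate and an RGB value."""
    img = np.zeros((size, size, 3), dtype=np.uint8)
    for a, b in zip(opcodes, opcodes[1:]):
        h = hashlib.md5(f"{a},{b}".encode()).digest()
        x, y = h[0] % size, h[1] % size
        img[x, y] = h[2], h[3], h[4]
    return img

def similarity(img1, img2):
    """Fraction of pixels that agree exactly in all three channels."""
    return float((img1 == img2).all(axis=2).mean())

# Two identical (hypothetical) opcode traces and one unrelated trace.
a = opcode_image(["push", "mov", "call", "add", "mov", "call"])
b = opcode_image(["push", "mov", "call", "add", "mov", "call"])
c = opcode_image(["xor", "jmp", "pop", "ret", "nop", "inc"])
print(similarity(a, b))   # identical sequences -> 1.0
print(similarity(a, c) < 1.0)
```

Samples from the same family share long opcode subsequences, so their images overlap heavily, which is what makes image-level similarity usable as a family classifier.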

  20. Multi-level multi-criteria analysis of alternative fuels for waste collection vehicles in the United States.

    PubMed

    Maimoun, Mousa; Madani, Kaveh; Reinhart, Debra

    2016-04-15

    Historically, the U.S. waste collection fleet was dominated by diesel-fueled waste collection vehicles (WCVs); the growing need for sustainable waste collection has urged decision makers to incorporate economically efficient alternative fuels while mitigating environmental impacts. The pros and cons of alternative fuels complicate the decision-making process, calling for a comprehensive study that assesses the multiple factors involved. Multi-criteria decision analysis (MCDA) methods allow decision makers to select the best alternatives with respect to selection criteria. In this study, two MCDA methods, Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) and Simple Additive Weighting (SAW), were used to rank fuel alternatives for the U.S. waste collection industry with respect to a multi-level environmental and financial decision matrix. The environmental criteria consisted of life-cycle emissions, tail-pipe emissions, water footprint (WFP), and power density, while the financial criteria comprised vehicle cost, fuel price, fuel price stability, and fueling station availability. The overall analysis showed that conventional diesel is still the best option, followed by hydraulic-hybrid WCVs, landfill gas (LFG) sourced natural gas, fossil natural gas, and biodiesel. The elimination of the WFP and power density criteria from the environmental criteria ranked biodiesel 100 (BD100) as an environmentally better alternative compared to other fossil fuels (diesel and natural gas). This result showed that considering the WFP and power density as environmental criteria can make a difference in the decision process. The elimination of the fueling station and fuel price stability criteria from the decision matrix ranked fossil natural gas second after LFG-sourced natural gas. This scenario was found to represent the status quo of the waste collection industry. A sensitivity analysis for the status quo scenario showed the overall ranking of diesel and
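    The TOPSIS ranking used in the study can be sketched in a few lines: vector-normalize and weight the decision matrix, locate the ideal and anti-ideal points per criterion, and score each alternative by its relative closeness to the ideal. The alternatives, criterion values and weights below are invented for illustration, not the study's data:

```python
import numpy as np

# Hypothetical decision matrix: rows = fuel alternatives, columns = criteria.
alts = ["diesel", "natural gas", "biodiesel"]
X = np.array([
    [3.0, 2.0, 9.0, 8.0],   # diesel
    [2.0, 4.0, 6.0, 5.0],   # natural gas
    [1.0, 3.0, 4.0, 6.0],   # biodiesel
])
weights = np.array([0.3, 0.2, 0.25, 0.25])
benefit = np.array([False, False, True, True])  # cost vs. benefit criteria

def topsis(X, weights, benefit):
    # 1. Vector-normalize each column, then apply the criterion weights.
    V = weights * X / np.linalg.norm(X, axis=0)
    # 2. Ideal (best) and anti-ideal (worst) points per criterion.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # 3. Closeness coefficient: distance to the anti-ideal, relative to both.
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)

scores = topsis(X, weights, benefit)
for name, s in sorted(zip(alts, scores), key=lambda t: -t[1]):
    print(f"{name}: {s:.3f}")
```

SAW differs only in step 3: it sums the weighted normalized values directly instead of measuring distances to ideal points, which is why the two methods can disagree on close alternatives.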

  2. Principal component analysis of scintimammographic images.

    PubMed

    Bonifazzi, Claudio; Cinti, Maria Nerina; Vincentis, Giuseppe De; Finos, Livio; Muzzioli, Valerio; Betti, Margherita; Nico, Lanconelli; Tartari, Agostino; Pani, Roberto

    2006-01-01

    The recent development of new gamma imagers based on scintillation arrays with high spatial resolution has strongly improved the possibility of detecting sub-centimeter cancers in scintimammography. However, Compton scattering contamination remains the main drawback, since it limits the sensitivity of tumor detection. Principal component image analysis (PCA), recently introduced in scintimammographic imaging, is a data reduction technique able to represent the radiation emitted from the chest, healthy breast tissue and damaged tissue as separate images. From these images a scintimammogram can be obtained in which the Compton contamination is "removed". In the present paper we compared the PCA-reconstructed images with the conventional scintimammographic images resulting from the photopeak (Ph) energy window. Data coming from a clinical trial were used. For both kinds of images the tumor presence was quantified by evaluating the t-student statistic for independent samples as a measure of the signal-to-noise ratio (SNR). Owing to the absence of Compton scattering, the PCA-reconstructed images show better noise suppression and allow more reliable diagnostics than the images obtained from the photopeak energy window, reducing the tendency to produce false positives. PMID:17646004
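    The PCA step amounts to treating each acquired image as one observation, centering, and extracting principal-component images via SVD. A toy sketch with synthetic images built from two overlapping spatial patterns (stand-ins for photopeak-like and scatter-like contributions; all numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two fixed spatial patterns on a 16x16 grid: a compact source and a
# diffuse one, loosely mimicking photopeak vs. scatter distributions.
yy, xx = np.mgrid[0:16, 0:16]
pattern_a = np.exp(-((xx - 5) ** 2 + (yy - 5) ** 2) / 8.0)
pattern_b = np.exp(-((xx - 10) ** 2 + (yy - 10) ** 2) / 40.0)

# Each "acquired" image is a different mixture of the two patterns
# plus a little noise (mixing coefficients are arbitrary).
images = np.array([
    c_a * pattern_a + c_b * pattern_b + rng.normal(0, 0.01, (16, 16))
    for c_a, c_b in [(1.0, 0.2), (0.8, 0.5), (0.5, 0.9), (0.2, 1.0)]
])

# PCA: flatten each image to a row vector, center, then SVD.
flat = images.reshape(len(images), -1)
centered = flat - flat.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Principal-component "images" are the right singular vectors.
pc_images = Vt.reshape(-1, 16, 16)
explained = S**2 / (S**2).sum()
print(explained.round(3))   # first component dominates
```

Because the mixtures vary essentially along one direction, the first component captures nearly all the variance; in the scintimammographic setting the separated component images are what allow the Compton contribution to be set aside.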

  3. 75 FR 69140 - NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-10

    ... COMMISSION NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the Standardized Plant Analysis Risk Models--Surry and Peach Bottom; Draft Report for Comment AGENCY: Nuclear... Regulatory Commission has issued for public comment a document entitled: NUREG-1953, ``Confirmatory...

  4. Image analysis in comparative genomic hybridization

    SciTech Connect

    Lundsteen, C.; Maahr, J.; Christensen, B.

    1995-01-01

    Comparative genomic hybridization (CGH) is a new technique by which genomic imbalances can be detected by combining in situ suppression hybridization of whole genomic DNA and image analysis. We have developed software for rapid, quantitative CGH image analysis by a modification and extension of the standard software used for routine karyotyping of G-banded metaphase spreads in the Magiscan chromosome analysis system. The DAPI-counterstained metaphase spread is karyotyped interactively. Corrections for image shifts between the DAPI, FITC, and TRITC images are done manually by moving the three images relative to each other. The fluorescence background is subtracted. A mean filter is applied to smooth the FITC and TRITC images before the fluorescence ratio between the individual FITC- and TRITC-stained chromosomes is computed pixel by pixel inside the area of the chromosomes determined by the DAPI boundaries. Fluorescence intensity ratio profiles are generated, and peaks and valleys indicating possible gains and losses of test DNA are marked if the ratio rises above 1.25 or falls below 0.75. By combining the analysis of several metaphase spreads, consistent findings of gains and losses in all or almost all spreads indicate chromosomal imbalance. Chromosomal imbalances are detected either by visual inspection of fluorescence ratio (FR) profiles or by a statistical approach that compares FR measurements of the individual case with measurements of normal chromosomes. The complete analysis of one metaphase can be carried out in approximately 10 minutes. 8 refs., 7 figs., 1 tab.
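    The profile analysis described — smooth the fluorescence ratio with a mean filter, then flag values outside the 0.75/1.25 thresholds — can be sketched on a synthetic one-dimensional profile (the simulated gain region and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy ratio profile along one chromosome: test/reference fluorescence
# is ~1 for balanced regions, with a simulated gain (~1.5) at 40-60.
n = 100
ratio = np.ones(n) + rng.normal(0, 0.05, n)
ratio[40:60] = 1.5 + rng.normal(0, 0.05, 20)

# Mean filter (moving average) as a stand-in for the smoothing step;
# edge-padding avoids spurious dips at the profile ends.
kernel = np.ones(5) / 5
smooth = np.convolve(np.pad(ratio, 2, mode="edge"), kernel, mode="valid")

# Flag possible gains/losses with the thresholds used in the paper.
gains = np.flatnonzero(smooth > 1.25)
losses = np.flatnonzero(smooth < 0.75)
print(gains.min(), gains.max())   # roughly brackets the 40-60 gain region
print(len(losses))                # -> 0
```

Averaging several metaphase spreads then plays the same role as the smoothing filter at a higher level: only imbalances that recur across spreads survive the thresholding.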

  5. Net Clinical Benefit of Oral Anticoagulants: A Multiple Criteria Decision Analysis

    PubMed Central

    Yang, Yea-Huei Kao; Lu, Christine Y.

    2015-01-01

    Background This study quantitatively evaluated the comparative efficacy and safety of new oral anticoagulants (dabigatran, rivaroxaban, and apixaban) and warfarin for treatment of nonvalvular atrial fibrillation. We also compared these agents under different scenarios, including populations at high risk of stroke and primary vs. secondary stroke prevention. Methods We used multiple criteria decision analysis (MCDA) to assess the benefit-risk of these medications. Our MCDA models contained criteria for benefits (prevention of ischemic stroke and systemic embolism) and risks (intracranial and extracranial bleeding). We calculated a performance score for each drug accounting for benefits and risks in comparison to treatment alternatives. Results Overall, new agents had higher performance scores than warfarin; in order of performance scores: dabigatran 150 mg (0.529), rivaroxaban (0.462), apixaban (0.426), and warfarin (0.191). For patients at a higher risk of stroke (CHADS2 score≥3), apixaban had the highest performance score (0.686); performance scores for other drugs were 0.462 for dabigatran 150 mg, 0.392 for dabigatran 110 mg, 0.271 for rivaroxaban, and 0.116 for warfarin. Dabigatran 150 mg had the highest performance score for primary stroke prevention, while dabigatran 110 mg had the highest performance score for secondary prevention. Conclusions Our results suggest that new oral anticoagulants might be preferred over warfarin. Selecting appropriate medicines according to the patient’s condition based on information from an integrated benefit-risk assessment of treatment options is crucial to achieve optimal clinical outcomes. PMID:25897861

  6. Repeated-Measures Analysis of Image Data

    NASA Technical Reports Server (NTRS)

    Newton, H. J.

    1983-01-01

    It is suggested that applying a modified analysis of variance procedure to data sampled systematically from a rectangular array of image data can provide a measure of the homogeneity of means along individual directions of the array, as well as of how variation in perpendicular directions interacts. The modification of the analysis of variance required to account for spatial correlation is described theoretically and demonstrated numerically on simulated data.

  7. Hybrid µCT-FMT imaging and image analysis

    PubMed Central

    Zafarnia, Sara; Babler, Anne; Jahnen-Dechent, Willi; Lammers, Twan; Lederle, Wiltrud; Kiessling, Fabian

    2015-01-01

    Fluorescence-mediated tomography (FMT) enables longitudinal and quantitative determination of the fluorescence distribution in vivo and can be used to assess the biodistribution of novel probes and to assess disease progression using established molecular probes or reporter genes. The combination with an anatomical modality, e.g., micro computed tomography (µCT), is beneficial for image analysis and for fluorescence reconstruction. We describe a protocol for multimodal µCT-FMT imaging including the image processing steps necessary to extract quantitative measurements. After preparing the mice and performing the imaging, the multimodal data sets are registered. Subsequently, an improved fluorescence reconstruction is performed, which takes into account the shape of the mouse. For quantitative analysis, organ segmentations are generated based on the anatomical data using our interactive segmentation tool. Finally, the biodistribution curves are generated using a batch-processing feature. We show the applicability of the method by assessing the biodistribution of a well-known probe that binds to bones and joints. PMID:26066033

  8. Sustainability assessment of tertiary wastewater treatment technologies: a multi-criteria analysis.

    PubMed

    Plakas, K V; Georgiadis, A A; Karabelas, A J

    2016-01-01

    The multi-criteria analysis gives the opportunity to researchers, designers and decision-makers to examine decision options in a multi-dimensional fashion. On this basis, four tertiary wastewater treatment (WWT) technologies were assessed regarding their sustainability performance in producing recycled wastewater, considering a 'triple bottom line' approach (i.e. economic, environmental, and social). These are powdered activated carbon adsorption coupled with ultrafiltration membrane separation (PAC-UF), reverse osmosis, ozone/ultraviolet-light oxidation and heterogeneous photo-catalysis coupled with low-pressure membrane separation (photocatalytic membrane reactor, PMR). The participatory method called simple multi-attribute rating technique exploiting ranks was employed for assigning weights to selected sustainability indicators. This sustainability assessment approach resulted in the development of a composite index as a final metric, for each WWT technology evaluated. The PAC-UF technology appears to be the most appropriate technology, attaining the highest composite value regarding the sustainability performance. A scenario analysis confirmed the results of the original scenario in five out of seven cases. In parallel, the PMR was highlighted as the technology with the least variability in its performance. Nevertheless, additional actions and approaches are proposed to strengthen the objectivity of the final results.
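
    The "simple multi-attribute rating technique exploiting ranks" (SMARTER) mentioned above derives weights from a simple importance ranking of the indicators via rank order centroid (ROC) weights. A minimal sketch, with hypothetical indicator names and scores:

```python
# Rank order centroid (ROC) weights as used by SMARTER:
# w_i = (1/n) * sum_{k=i}^{n} 1/k for the i-th ranked criterion.

def roc_weights(n):
    return [sum(1.0 / k for k in range(i, n + 1)) / n for i in range(1, n + 1)]

# Hypothetical sustainability indicators, ranked most to least important.
indicators = ["cost", "energy_use", "effluent_quality", "social_acceptance"]
weights = roc_weights(len(indicators))

# Composite index for one technology: weighted sum of normalized scores.
scores = [0.6, 0.8, 0.9, 0.5]  # hypothetical normalized indicator scores
composite = sum(w * s for w, s in zip(weights, scores))
print(dict(zip(indicators, [round(w, 3) for w in weights])))
print(round(composite, 3))
```

    ROC weights always sum to 1 and decrease with rank, so only the ordering of indicators needs to be elicited from stakeholders.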

  9. Strategic rehabilitation planning of piped water networks using multi-criteria decision analysis.

    PubMed

    Scholten, Lisa; Scheidegger, Andreas; Reichert, Peter; Maurer, Max; Lienert, Judit

    2014-02-01

    To overcome the difficulties of strategic asset management of water distribution networks, a pipe failure and a rehabilitation model are combined to predict the long-term performance of rehabilitation strategies. Bayesian parameter estimation is performed to calibrate the failure and replacement model based on a prior distribution inferred from three large water utilities in Switzerland. Multi-criteria decision analysis (MCDA) and scenario planning build the framework for evaluating 18 strategic rehabilitation alternatives under future uncertainty. Outcomes for three fundamental objectives (low costs, high reliability, and high intergenerational equity) are assessed. Exploitation of stochastic dominance concepts helps to identify twelve non-dominated alternatives and local sensitivity analysis of stakeholder preferences is used to rank them under four scenarios. Strategies with annual replacement of 1.5-2% of the network perform reasonably well under all scenarios. In contrast, the commonly used reactive replacement is not recommendable unless cost is the only relevant objective. Exemplified for a small Swiss water utility, this approach can readily be adapted to support strategic asset management for any utility size and based on objectives and preferences that matter to the respective decision makers.
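
    The stochastic dominance screening mentioned above can be illustrated with a first-order dominance check on simulated outcome samples (all data hypothetical): alternative A dominates B if A's empirical CDF lies at or below B's everywhere, i.e. A is at least as likely to reach any outcome level.

```python
# Minimal first-order stochastic dominance check on outcome samples
# (higher outcome = better). Data below are hypothetical.

def dominates(a, b):
    """True if alternative `a` first-order stochastically dominates `b`."""
    grid = sorted(set(a) | set(b))
    def cdf(sample, x):
        return sum(v <= x for v in sample) / len(sample)
    # A's CDF must never exceed B's, and must be strictly below somewhere.
    at_most = all(cdf(a, x) <= cdf(b, x) for x in grid)
    somewhere = any(cdf(a, x) < cdf(b, x) for x in grid)
    return at_most and somewhere

strategy_a = [0.70, 0.75, 0.80, 0.85]  # e.g. reliability outcomes per scenario
strategy_b = [0.60, 0.65, 0.70, 0.75]
print(dominates(strategy_a, strategy_b))  # strategy_a shifted upward
```

    Alternatives that are dominated by some other alternative can be discarded before any preference weights are elicited, which is how the screening step reduces 18 strategies to twelve.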

  10. Multi-criteria decision analysis in environmental sciences: ten years of applications and trends.

    PubMed

    Huang, Ivy B; Keisler, Jeffrey; Linkov, Igor

    2011-09-01

    Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by various stakeholder views. Multi-criteria decision analysis (MCDA) emerged as a formal methodology for integrating available technical information and stakeholder values to support decisions in many fields and can be especially valuable in environmental decision making. This study reviews environmental applications of MCDA. Over 300 papers published between 2000 and 2009 reporting MCDA applications in the environmental field were identified through a series of queries in the Web of Science database. The papers were classified by their environmental application area and by decision or intervention type. In addition, the papers were also classified by the MCDA methods used in the analysis (analytic hierarchy process, multi-attribute utility theory, and outranking). The results suggest that there has been significant growth in environmental applications of MCDA over the last decade across all environmental application areas. Multiple MCDA tools have been successfully used for environmental applications. Even though the use of specific methods and tools varies across application areas and geographic regions, our review of the few papers in which several methods were applied in parallel to the same problem indicates that the recommended course of action does not vary significantly with the method applied.

  11. Sustainability assessment of tertiary wastewater treatment technologies: a multi-criteria analysis.

    PubMed

    Plakas, K V; Georgiadis, A A; Karabelas, A J

    2016-01-01

    The multi-criteria analysis gives the opportunity to researchers, designers and decision-makers to examine decision options in a multi-dimensional fashion. On this basis, four tertiary wastewater treatment (WWT) technologies were assessed regarding their sustainability performance in producing recycled wastewater, considering a 'triple bottom line' approach (i.e. economic, environmental, and social). These are powdered activated carbon adsorption coupled with ultrafiltration membrane separation (PAC-UF), reverse osmosis, ozone/ultraviolet-light oxidation and heterogeneous photo-catalysis coupled with low-pressure membrane separation (photocatalytic membrane reactor, PMR). The participatory method called simple multi-attribute rating technique exploiting ranks was employed for assigning weights to selected sustainability indicators. This sustainability assessment approach resulted in the development of a composite index as a final metric, for each WWT technology evaluated. The PAC-UF technology appears to be the most appropriate technology, attaining the highest composite value regarding the sustainability performance. A scenario analysis confirmed the results of the original scenario in five out of seven cases. In parallel, the PMR was highlighted as the technology with the least variability in its performance. Nevertheless, additional actions and approaches are proposed to strengthen the objectivity of the final results. PMID:27054724

  12. Choices, choices: the application of multi-criteria decision analysis to a food safety decision-making problem.

    PubMed

    Fazil, A; Rajic, A; Sanchez, J; McEwen, S

    2008-11-01

    In the food safety arena, the decision-making process can be especially difficult. Decision makers are often faced with social and fiscal pressures when attempting to identify an appropriate balance among several choices. Concurrently, policy and decision makers in microbial food safety are under increasing pressure to demonstrate that their policies and decisions are made using transparent and accountable processes. In this article, we present a multi-criteria decision analysis approach that can be used to address the problem of trying to select a food safety intervention while balancing various criteria. Criteria that are important when selecting an intervention were determined, as a result of an expert consultation, to include effectiveness, cost, weight of evidence, and practicality associated with the interventions. The multi-criteria decision analysis approach we present is able to consider these criteria and arrive at a ranking of interventions. It can also provide a clear justification for the ranking as well as demonstrate to stakeholders, through a scenario analysis approach, how to potentially converge toward common ground. While this article focuses on the problem of selecting food safety interventions, the range of applications in the food safety arena is truly diverse and can be a significant tool in assisting decisions that need to be coherent, transparent, and justifiable. Most importantly, it is a significant contributor when there is a need to strike a fine balance between various potentially competing alternatives and/or stakeholder groups.

  13. GIS-based multicriteria municipal solid waste landfill suitability analysis: a review of the methodologies performed and criteria implemented.

    PubMed

    Demesouka, O E; Vavatsikos, A P; Anagnostopoulos, K P

    2014-04-01

    Multicriteria spatial decision support systems (MC-SDSS) have emerged as an integration of the geographical information systems (GIS) and multiple criteria decision analysis (MCDA) methods. GIS-based MCDA allows the incorporation of conflicting objectives and decision maker (DM) preferences into spatial decision models. During recent decades, a variety of research articles have been published regarding the implementation of methods and/or tools in a variety of real-world case studies. The article discusses, in detail, the criteria and methods that are implemented in GIS-based landfill siting suitability analysis and especially the exclusionary and non-exclusionary criteria that can be considered when selecting sites for municipal solid waste (MSW) landfills. This paper reviews 36 seminal articles in which the evaluation of candidate landfill sites is conducted using MCDA methods. After a brief description of the main components of a MC-SDSS and the applied decision rules, the review focuses on the criteria incorporated into the decision models. The review provides a comprehensive guide to the landfill siting analysis criteria, providing details regarding the utilization methods, their decision or exclusionary nature and their monotonicity.
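
    The distinction drawn above between exclusionary and non-exclusionary criteria is the core of GIS-based suitability mapping: exclusionary criteria produce a boolean mask that removes cells outright, while non-exclusionary criteria are combined by weighted overlay. A toy raster sketch (all layers, thresholds, and weights hypothetical):

```python
# Toy 3x3 raster suitability analysis. Exclusionary criteria (e.g. inside a
# protected area, too close to a river) mask cells out entirely; remaining
# cells get a weighted sum of normalized non-exclusionary criterion layers.
# All layers and weights are hypothetical.

protected = [[1, 0, 0], [0, 0, 0], [0, 0, 1]]   # 1 = excluded
near_river = [[0, 0, 1], [0, 0, 0], [0, 0, 0]]  # 1 = excluded

dist_to_city = [[0.2, 0.5, 0.9], [0.4, 0.6, 0.8], [0.1, 0.3, 0.7]]  # normalized
slope_score  = [[0.9, 0.8, 0.4], [0.7, 0.9, 0.5], [0.6, 0.8, 0.9]]
weights = (0.6, 0.4)

suitability = [
    [0.0 if protected[r][c] or near_river[r][c]
     else weights[0] * dist_to_city[r][c] + weights[1] * slope_score[r][c]
     for c in range(3)]
    for r in range(3)
]
best = max((suitability[r][c], (r, c)) for r in range(3) for c in range(3))
print(best)  # (score, (row, col)) of the most suitable cell
```

    In a real GIS workflow the same pattern runs over raster layers with millions of cells, and the weights come from an MCDA weighting method rather than being fixed by hand.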

  14. Multi-Criteria Analysis for Biomass Utilization Applying Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Morimoto, Hidetsugu; Hoshino, Satoshi; Kuki, Yasuaki

    This paper evaluated 195 existing and planned Biomass Towns with respect to material recycling, global-warming prevention, and economic efficiency by applying DEA (Data Envelopment Analysis), which can evaluate the operational efficiency of entities such as private companies or projects. The results showed that although the Biomass Towns recycle material efficiently, global-warming prevention and business profitability were largely neglected in Biomass Town design. Moreover, from the viewpoint of operational efficiency, we suggested adjusting the scale of the Biomass Towns to further enhance efficiency, based on the DEA results. We found that applying DEA captured more improvement indicators than cost-benefit analysis or cost-effectiveness analysis.
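
    DEA in general requires solving one linear program per decision-making unit, but in the single-input, single-output case the CCR efficiency reduces to each unit's output/input ratio divided by the best ratio. A minimal sketch with hypothetical Biomass Town figures:

```python
# Single-input, single-output CCR efficiency: ratio to the best performer.
# General DEA needs one LP per unit; this special case avoids that.
# Figures below are hypothetical.

def ccr_efficiency(units):
    """units: {name: (input, output)} -> {name: efficiency in (0, 1]}."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

towns = {  # (operating cost, tonnes of biomass recycled) - hypothetical
    "town_a": (100.0, 80.0),
    "town_b": (150.0, 90.0),
    "town_c": (120.0, 120.0),
}
print(ccr_efficiency(towns))  # town_c defines the efficient frontier here
```

    Units with efficiency below 1 can read off how much they would need to shrink inputs or grow outputs to reach the frontier, which is the kind of improvement indicator the abstract contrasts with cost-benefit analysis.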

  15. Particle Pollution Estimation Based on Image Analysis.

    PubMed

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic information and weather conditions, to predict PM2.5 index. The results demonstrate that the image analysis method provides good prediction of PM2.5 indexes, and different features have different significance levels in the prediction. PMID:26828757
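
    The prediction step described above, mapping image features plus auxiliary data to a PM2.5 index, can be sketched as an ordinary least-squares fit. Synthetic features and a linear model stand in here for the paper's actual haze-related image statistics and learner:

```python
import numpy as np

# Sketch of predicting a PM2.5 index from image features via least squares.
# Features and coefficients are synthetic; the paper's own feature set
# (image statistics, sun position, weather, etc.) differs.
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(size=(n, 6))                    # six image features per photo
true_w = np.array([40.0, 25.0, -10.0, 5.0, 0.0, 15.0])
y = X @ true_w + 8.0 + rng.normal(scale=0.5, size=n)  # noisy PM2.5 index

A = np.column_stack([X, np.ones(n)])            # add intercept column
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ w
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(rmse)  # small residual error on the training data
```

    The fitted coefficients also expose the "different significance levels" of the features that the abstract mentions: features with near-zero weights contribute little to the prediction.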

  16. Particle Pollution Estimation Based on Image Analysis

    PubMed Central

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic information and weather conditions, to predict PM2.5 index. The results demonstrate that the image analysis method provides good prediction of PM2.5 indexes, and different features have different significance levels in the prediction. PMID:26828757

  17. Particle Pollution Estimation Based on Image Analysis.

    PubMed

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic information and weather conditions, to predict PM2.5 index. The results demonstrate that the image analysis method provides good prediction of PM2.5 indexes, and different features have different significance levels in the prediction.

  18. Image Chain Analysis For Digital Image Rectification System

    NASA Astrophysics Data System (ADS)

    Arguello, Roger J.

    1981-07-01

    An image chain analysis, utilizing a comprehensive computer program, has been generated for the key elements of a digital image rectification system. System block diagrams and analyses for three system configurations employing film scanner input have been formulated with a parametric specification of pertinent element modulation transfer functions and input film scene spectra. The major elements of the system for this analysis include a high-resolution, high-speed charge-coupled device film scanner, three candidate digital resampling option algorithms (i.e., nearest neighbor, bilinear interpolation and cubic convolution methods), and two candidate printer reconstructor implementations (solid-state light-emitting diode printer and laser beam recorder). Suitable metrics for the digital rectification system, incorporating the effects of interpolation and resolution error, were established, and the image chain analysis program was used to perform a quantitative comparison of the three resampling options with the two candidate printer reconstructor implementations. The nearest neighbor digital resampling function is found to be a good compromise choice when cascaded with either a light-emitting diode printer or laser beam recorder. The resulting composite intensity point spread functions, including resampling, and both types of reconstruction are bilinear and quadratic, respectively.
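
    Of the three resampling options compared above, bilinear interpolation can be sketched directly; nearest neighbor would simply round the fractional coordinates instead (toy 2x2 image, not the paper's scanner data):

```python
# Bilinear interpolation of an image at a fractional (row, col) position;
# nearest neighbor would instead take img[round(r)][round(c)].

def bilinear(img, r, c):
    r0, c0 = int(r), int(c)
    r1, c1 = min(r0 + 1, len(img) - 1), min(c0 + 1, len(img[0]) - 1)
    dr, dc = r - r0, c - c0
    top = img[r0][c0] * (1 - dc) + img[r0][c1] * dc
    bot = img[r1][c0] * (1 - dc) + img[r1][c1] * dc
    return top * (1 - dr) + bot * dr

img = [[0.0, 10.0],
       [20.0, 30.0]]
print(bilinear(img, 0.5, 0.5))  # midpoint: average of the four pixels
```

    Cubic convolution extends the same idea to a 4x4 neighborhood with a cubic kernel, trading computation for a sharper effective point spread function.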

  19. Making Good Decisions in Healthcare with Multi-Criteria Decision Analysis: The Use, Current Research and Future Development of MCDA.

    PubMed

    Mühlbacher, Axel C; Kaczynski, Anika

    2016-02-01

    Healthcare decision making is usually characterized by a low degree of transparency. The demand for transparent decision processes can be fulfilled only when assessment, appraisal and decisions about health technologies are performed under a systematic construct of benefit assessment. The benefit of an intervention is often multidimensional and, thus, must be represented by several decision criteria. Complex decision problems require an assessment and appraisal of various criteria; therefore, a decision process that systematically identifies the best available alternative and enables an optimal and transparent decision is needed. For that reason, decision criteria must be weighted and goal achievement must be scored for all alternatives. Methods of multi-criteria decision analysis (MCDA) are available to analyse and appraise multiple clinical endpoints and structure complex decision problems in healthcare decision making. By means of MCDA, value judgments, priorities and preferences of patients, insurees and experts can be integrated systematically and transparently into the decision-making process. This article describes the MCDA framework and identifies potential areas where MCDA can be of use (e.g. approval, guidelines and reimbursement/pricing of health technologies). A literature search was performed to identify current research in healthcare. The results showed that healthcare decision making is addressing the problem of multiple decision criteria and is focusing on the future development and use of techniques to weight and score different decision criteria. This article emphasizes the use and future benefit of MCDA.

  20. An Analysis of Criteria for the Evaluation of Educational Web Sites.

    ERIC Educational Resources Information Center

    Bantjes, L.; Cronje, J. C.

    2000-01-01

    Proposes a set of 50 criteria in seven categories for evaluating educational Internet information sources. Compares these indicators against a number of acknowledged Internet evaluation sites and identifies the most used criteria. Finds that currency, graphic design, and browsability are the most highly rated aspects to consider when evaluating…

  1. Environmental Education Research Project, Content Analysis Criteria, Report on First Evaluation Trial.

    ERIC Educational Resources Information Center

    Linke, R. D.

    Ten criteria for use in assessing the emphasis on environmental education in textbooks and similar resource materials were developed and given to 30 members of the Australian Conservation Foundation Education and Training Committees throughout the country. Each rater applied the criteria to three chapters of a biology textbook "The Web of Life,"…

  2. The Politics of Determining Merit Aid Eligibility Criteria: An Analysis of the Policy Process

    ERIC Educational Resources Information Center

    Ness, Erik C.

    2010-01-01

    Despite the scholarly attention on the effects of merit aid on college access and choice, particularly on the significant effect that states' varied eligibility criteria play, no studies have examined the policy process through which merit aid criteria are determined. This is surprising given the recent attention to state-level policy dynamics and…

  3. Audience Evaluations of Ethical Issues in Television Journalism: An Analysis of the Criteria Used for Judgment.

    ERIC Educational Resources Information Center

    Lind, Rebecca Ann; Rarick, David L.

    Television journalism has long been the object of study by scholars of news media ethics. A study examined the reasoning process and the criteria for judgment used by viewers when evaluating possibly problematic television (TV) news content, and analyzed these criteria as they are applied to ethical issues and problems in TV newscasts. Thirty-four…

  4. GIS, Geoscience, Multi-criteria Analysis and Integrated Management of the Coastal Zone

    NASA Astrophysics Data System (ADS)

    Kacimi, Y.; Barich, A.

    2011-12-01

    In this 3rd millennium, geology can be considered a science of decision that intervenes in all domains of society. It has moved beyond its academic dimension to spread into domains that were until now out of reach. Combining different Geoscience sub-disciplines stems from a strong will to demonstrate the contribution of this science and its impact on daily life, especially by applying it to various innovative projects. Geophysics, geochemistry and structural geology are complementary disciplines that can be applied in perfect symbiosis in many domains such as construction, mining prospection, impact assessment, environment, etc. Data collected from these studies can be integrated into Geographic Information Systems (GIS) for a multi-criteria analysis, which generally gives very impressive results. From this point, it is easy to build mining, eco-geotouristic and risk assessment models for land-use projects, but also for integrated management of the coastal zone (IMCZ). Touristic projects in Morocco focus on its coast, which extends at least 3500 km; managing this zone for building marinas or touristic infrastructure requires a deep and detailed study of marine currents along the coast, for example by creating surveillance models and a coastal hazards map. An innovative project will include geophysical, geochemical and structural geology studies associated with a multi-criteria analysis. The data will be integrated into a GIS to establish a coastal map that highlights low-risk erosion zones and thus facilitates implementation of ports and other construction projects. YES Morocco is a chapter of the International YES Network that aims to promote Geoscience in the service of society and the professional development of Young and Early Career Geoscientists. Our commitment to such a project will be of a qualitative aspect within an associative framework that will involve

  5. Data analysis for GOPEX image frames

    NASA Technical Reports Server (NTRS)

    Levine, B. M.; Shaik, K. S.; Yan, T.-Y.

    1993-01-01

    The data analysis based on the image frames received at the Solid State Imaging (SSI) camera of the Galileo Optical Experiment (GOPEX) demonstration conducted between 9-16 Dec. 1992 is described. Laser uplink was successfully established between the ground and the Galileo spacecraft during its second Earth-gravity-assist phase in December 1992. SSI camera frames were acquired which contained images of detected laser pulses transmitted from the Table Mountain Facility (TMF), Wrightwood, California, and the Starfire Optical Range (SOR), Albuquerque, New Mexico. Laser pulse data were processed using standard image-processing techniques at the Multimission Image Processing Laboratory (MIPL) for preliminary pulse identification and to produce public release images. Subsequent image analysis corrected for background noise to measure received pulse intensities. Data were plotted to obtain histograms on a daily basis and were then compared with theoretical results derived from applicable weak-turbulence and strong-turbulence considerations. Processing steps are described and the theories are compared with the experimental results. Quantitative agreement was found in both turbulence regimes, and better agreement would have been found, given more received laser pulses. Future experiments should consider methods to reliably measure low-intensity pulses, and through experimental planning to geometrically locate pulse positions with greater certainty.

  6. Cancer detection by quantitative fluorescence image analysis.

    PubMed

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods used currently by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is, individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent, and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is, the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process, determine the biological potential of tumors

  7. Automatic quantitative analysis of t-tubule organization in cardiac myocytes using ImageJ.

    PubMed

    Pasqualin, Côme; Gannier, François; Malécot, Claire O; Bredeloux, Pierre; Maupoil, Véronique

    2015-02-01

    The transverse tubule system in mammalian striated muscle is highly organized and contributes to optimal and homogeneous contraction. Diverse pathologies such as heart failure and atrial fibrillation include disorganization of t-tubules and contractile dysfunction. Few tools are available for the quantification of the organization of the t-tubule system. We developed a plugin for the ImageJ/Fiji image analysis platform developed by the National Institutes of Health. This plugin (TTorg) analyzes raw confocal microscopy images. Analysis options include the whole image, specific regions of the image (cropping), and z-axis analysis of the same image. Batch analysis of a series of images with identical criteria is also one of the options. There is no need to either reorientate any specimen to the horizontal or to do a thresholding of the image to perform analysis. TTorg includes a synthetic "myocyte-like" image generator to test the plugin's efficiency in the user's own experimental conditions. This plugin was validated on synthetic images for different simulated cell characteristics and acquisition parameters. TTorg was able to detect significant differences between the organization of the t-tubule systems in experimental data of mouse ventricular myocytes isolated from wild-type and dystrophin-deficient mice. TTorg is freely distributed, and its source code is available. It provides a reliable, easy-to-use, automatic, and unbiased measurement of t-tubule organization in a wide variety of experimental conditions.

  8. Advanced automated char image analysis techniques

    SciTech Connect

    Tao Wu; Edward Lester; Michael Cloke

    2006-05-15

    Char morphology is an important characteristic when attempting to understand coal behavior and coal burnout. In this study, an augmented algorithm has been proposed to identify char types using image analysis. On the basis of a series of image processing steps, a char image is singled out from the whole image, which then allows the important major features of the char particle to be measured, including size, porosity, and wall thickness. The techniques for automated char image analysis have been tested against char images taken from the ICCP Char Atlas as well as actual char particles derived from pyrolyzed char samples. Thirty different chars were prepared in a drop tube furnace operating at 1300°C, 1% oxygen, and 100 ms from 15 different world coals sieved into two size fractions (53-75 and 106-125 µm). The results from this automated technique are comparable with those from manual analysis, and the additional detail from the automated system has potential use in applications such as combustion modeling systems. Obtaining highly detailed char information with automated methods has traditionally been hampered by the difficulty of automatic recognition of individual char particles. 20 refs., 10 figs., 3 tabs.

  9. A pairwise image analysis with sparse decomposition

    NASA Astrophysics Data System (ADS)

    Boucher, A.; Cloppet, F.; Vincent, N.

    2013-02-01

    This paper aims to detect the evolution between two images representing the same scene. The evolution detection problem has many practical applications, especially in medical images. Indeed, the concept of a patient "file" implies the joint analysis of different acquisitions taken at different times, and the detection of significant modifications. The research presented in this paper is carried out within the application context of the development of computer-assisted diagnosis (CAD) applied to mammograms. It is performed on already registered pairs of images. As the registration is never perfect, we must develop a comparison method sufficiently adapted to detect real small differences between comparable tissues. In many applications, the assessment of similarity used during the registration step is also used in the interpretation step that prompts suspicious regions. In our case, registration is assumed to match the spatial coordinates of similar anatomical elements. In this paper, in order to process the medical images at tissue level, the image representation is based on elementary patterns, therefore seeking patterns, not pixels. Besides, as the studied images have low entropy, the decomposed signal is expressed in a parsimonious way. Parsimonious representations are known to help extract the significant structures of a signal and generate a compact version of the data. This change of representation should allow us to compare the studied images in a short time, thanks to the low weight of the images thus represented, while maintaining a good representativeness. The good precision of our results shows the efficiency of our approach.

  10. Automated eXpert Spectral Image Analysis

    2003-11-25

    AXSIA performs automated factor analysis of hyperspectral images. In such images, a complete spectrum is collected at each point in a 1-, 2- or 3-dimensional spatial array. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful information. Multivariate factor analysis techniques have proven effective for extracting the essential information from high dimensional data sets into a limited number of factors that describe the spectral characteristics and spatial distributions of the pure components comprising the sample. AXSIA provides tools to estimate different types of factor models including Singular Value Decomposition (SVD), Principal Component Analysis (PCA), PCA with factor rotation, and Alternating Least Squares-based Multivariate Curve Resolution (MCR-ALS). As part of the analysis process, AXSIA can automatically estimate the number of pure components that comprise the data and can scale the data to account for Poisson noise. The data analysis methods are fundamentally based on eigenanalysis of the data crossproduct matrix coupled with orthogonal eigenvector rotation and constrained alternating least squares refinement. A novel method for automatically determining the number of significant components, which is based on the eigenvalues of the crossproduct matrix, has also been devised and implemented. The data can be compressed spectrally via PCA and spatially through wavelet transforms, and algorithms have been developed that perform factor analysis in the transform domain while retaining full spatial and spectral resolution in the final result. These latter innovations enable the analysis of larger-than-core-memory spectrum-images. AXSIA was designed to perform automated chemical phase analysis of spectrum-images acquired by a variety of chemical imaging techniques. Successful applications include Energy Dispersive X-ray Spectroscopy, X
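
    The eigenanalysis at the heart of the factor methods listed above (SVD/PCA) can be sketched on a toy spectrum-image: unfold the spatial axes into rows, then the leading singular vectors capture the pure-component spectra and their spatial abundances. The data here are synthetic, and AXSIA's constrained ALS refinement and automatic component-count estimation are omitted:

```python
import numpy as np

# Toy spectrum-image factor analysis by SVD. Two synthetic "pure component"
# spectra are mixed across a 4x4 spatial grid; SVD recovers a 2-factor model.
rng = np.random.default_rng(1)
spectrum_1 = np.array([1.0, 0.0, 2.0, 0.0, 1.0])
spectrum_2 = np.array([0.0, 3.0, 0.0, 1.0, 0.0])
abund_1 = rng.uniform(size=(4, 4))
abund_2 = rng.uniform(size=(4, 4))

cube = abund_1[..., None] * spectrum_1 + abund_2[..., None] * spectrum_2
D = cube.reshape(-1, 5)                     # unfold: (pixels, channels)

U, s, Vt = np.linalg.svd(D, full_matrices=False)
rank2 = (U[:, :2] * s[:2]) @ Vt[:2]         # 2-factor reconstruction
err = float(np.linalg.norm(D - rank2) / np.linalg.norm(D))
print(err)          # ~0: two factors explain the data
print(s[2] / s[0])  # third singular value is negligible
```

    The sharp drop after the second singular value is the signal AXSIA's eigenvalue-based component-count estimator looks for.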

  11. Using soil function evaluation in multi-criteria decision analysis for sustainability appraisal of remediation alternatives.

    PubMed

    Volchko, Yevheniya; Norrman, Jenny; Rosén, Lars; Bergknut, Magnus; Josefsson, Sarah; Söderqvist, Tore; Norberg, Tommy; Wiberg, Karin; Tysklind, Mats

    2014-07-01

    Soil contamination is one of the major threats constraining proper functioning of the soil and thus provision of ecosystem services. Remedial actions typically only address the chemical soil quality by reducing total contaminant concentrations to acceptable levels guided by land use. However, emerging regulatory requirements on soil protection demand a holistic view on soil assessment in remediation projects, thus accounting for a variety of soil functions. Such a view would require not only that the contamination concentrations are assessed and attended to, but also that other aspects are taken into account, thus addressing physical and biological as well as other chemical soil quality indicators (SQIs). This study outlines how soil function assessment can be a part of a holistic sustainability appraisal of remediation alternatives using multi-criteria decision analysis (MCDA). The paper presents a method for practitioners for evaluating the effects of remediation alternatives on selected ecological soil functions using a suggested minimum data set (MDS) containing physical, biological and chemical SQIs. The measured SQIs are transformed into sub-scores by the use of scoring curves, which allows interpretation and the integration of soil quality data into the MCDA framework. The method is demonstrated at a study site (Marieberg, Sweden) and the results give an example of how soil analyses using the suggested MDS can be used for soil function assessment and subsequent input to the MCDA framework.
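    The scoring-curve step can be sketched as follows. All thresholds, indicator names, and weights below are invented for illustration; the paper's calibrated curves and MDS are not reproduced here:

```python
import numpy as np

# Sketch: transform measured soil quality indicators (SQIs) into
# [0, 1] sub-scores with a "more is better" linear scoring curve,
# then aggregate into one soil-function score for MCDA input.
def score(value, lo, hi):
    """Linear scoring curve: 0 at/below lo, 1 at/above hi."""
    return float(np.clip((value - lo) / (hi - lo), 0.0, 1.0))

# Measured SQIs for one remediation alternative (made-up numbers)
sqis = {"organic_matter_pct": 3.2, "earthworms_per_m2": 120.0, "pH": 6.4}
curves = {"organic_matter_pct": (1.0, 6.0),
          "earthworms_per_m2": (0.0, 200.0),
          "pH": (4.0, 7.0)}
weights = {"organic_matter_pct": 0.4, "earthworms_per_m2": 0.3, "pH": 0.3}

subscores = {k: score(v, *curves[k]) for k, v in sqis.items()}
soil_function_score = sum(weights[k] * subscores[k] for k in sqis)
```

    Each alternative's aggregated score would then enter the MCDA framework alongside the other sustainability criteria.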

  13. Criteria for acceptance to preprofessional dietetics programs vs desired qualities of professionals: an analysis.

    PubMed

    Moore, K K

    1995-01-01

    The objectives of this analysis were to examine the literature and compare and contrast (a) qualities preferred in preprofessional dietetics students by directors of internships and approved preprofessional practice programs (AP4s), (b) characteristics needed to succeed in a scientific field, (c) traits emphasized by dietetics training programs compared with those most valued by employers, (d) skills needed by high-level managerial dietitians and those in business and communications, and (e) qualities dietitians have aspired to develop for increased competitiveness in the marketplace. Even though the revised Standards of Education have been in place since 1988, recent evaluation of criteria for internship and AP4 admission has shown traditional emphasis on academic performance and the importance of work experience. Success in scientific pursuits has been linked with more than innate intelligence; a drive for success and enthusiasm for learning are also involved. Internships foster mostly technical learning, so development of skills in human and conceptual areas is somewhat lacking. These skills, which have been identified as valuable to employers, need greater development or more consistent identification in the selection and training process. Perhaps serious consideration should be given to applicants for preprofessional programs who have shown leadership qualities through extracurricular activities or who have given themselves the opportunity to develop and improve these skills. Such students might hasten the metamorphosis of dietetics practitioners toward improved levels of compensation and professional fulfillment. PMID:7798584

  15. Comorbidity in people with Down's syndrome: a criteria-based analysis.

    PubMed

    van Schrojenstein Lantman-de Valk, H M; Haveman, M J; Crebolder, H F

    1996-10-01

    The aims of this study were to review what is currently known about comorbidity in people with Down's syndrome and to determine if their relative risk for certain disorders was increased. Analysis was carried out on the published literature from 1982 through 1994. In order to be included in this study, articles had to meet predetermined criteria. The strengths and weaknesses of the selected articles were considered in this review. The estimation of relative risks was done by calculating the odds ratio (OR). Odds ratios of > 2 or < 0.5 were found in more than one article for congenital heart defects, hypothyroidism, hearing impairment and hepatitis B. Only one article indicated an OR within this range for all of the following disorders: obesity, epilepsy, degenerative spine disorders and a wide atlanto-axial distance. The results were unclear in the areas of hyperthyroidism, visual disorders, dementia and psychiatric disorders. The concept of comorbidity, i.e. establishing the relationships between the various conditions in one person and understanding the implications for medical care, seems promising, especially for people with intellectual disability. Further work in this area may well improve the quality of care offered to these people.
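    The relative-risk estimate used in the review, the odds ratio from a 2x2 table, can be sketched as follows. The counts are invented for illustration, not the paper's data:

```python
# Sketch: odds ratio (OR) for a disorder, comparing people with
# Down's syndrome against a comparison group.
#
#                 disorder present   disorder absent
#   Down's              a                  b
#   comparison          c                  d
def odds_ratio(a, b, c, d):
    """OR = (a*d) / (b*c) for the 2x2 table [[a, b], [c, d]]."""
    return (a * d) / (b * c)

or_value = odds_ratio(40, 60, 10, 90)      # (40*90)/(60*10) = 6.0
flagged = or_value > 2 or or_value < 0.5   # thresholds used in the review
```

    The review flagged disorders whose OR fell outside the 0.5-2 range in more than one article.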

  16. Applications Of Binary Image Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head remains freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by a robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  17. Microscopical image analysis: problems and approaches.

    PubMed

    Bradbury, S

    1979-03-01

    This article reviews some of the problems which have been encountered in the application of automatic image analysis to problems in biology. Some of the questions involved in the actual formulation of such a problem for this approach are considered, as well as the difficulties in the analysis due to lack of specific contrast in the image and to its complexity. Various practical methods which have been successful in overcoming these problems are outlined, and the question of the desirability of an opto-manual or semi-automatic system as opposed to a fully automatic version is considered.

  18. Comparison of Image Quality Criteria between Digital Storage Phosphor Plate in Mammography and Full-Field Digital Mammography in the Detection of Breast Cancer

    PubMed Central

    Thevi Rajendran, Pushpa; Krishnapillai, Vijayalakshmi; Tamanang, Sulaiman; Kumari Chelliah, Kanaga

    2012-01-01

    Background: Digital mammography is slowly replacing screen film mammography. In digital mammography, 2 methods are available in acquiring images: digital storage phosphor plate and full-field digital mammography. The aim of this study was to compare the image quality acquired from the 2 methods of digital mammography in the detection of breast cancer. Methods: The study took place at the National Cancer Society, Kuala Lumpur, and followed 150 asymptomatic women for the duration of 1 year. Participating women gave informed consent and were exposed to 4 views from each system. Two radiologists independently evaluated the printed images based on the image quality criteria in mammography. McNemar’s test was used to compare the image quality criteria between the systems. Results: The agreement between the radiologists for the digital storage phosphor plate was κ = 0.551 and for full-field digital mammography was κ = 0.523. Full-field digital mammography was significantly better compared with the digital storage phosphor plate in right and left mediolateral oblique views (P < 0.05) in the detection of microcalcifications, which are early signs of breast cancer. However, both systems were comparable in all other aspects of image quality. Conclusion: Digital mammography is a useful screening tool for the detection of early breast cancer and ensures better prognosis and quality of life. PMID:22977375
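    McNemar's test on the paired ratings can be sketched as follows. The discordant counts are invented, and the continuity-corrected chi-square statistic shown is one common form of the test, assumed rather than taken from the paper:

```python
# Sketch: McNemar's test for paired binary ratings (criterion met /
# not met on the same breasts under the two systems). Only the two
# discordant cells of the 2x2 agreement table enter the statistic.
def mcnemar_chi2(b, c):
    """Continuity-corrected chi-square from the discordant counts."""
    return (abs(b - c) - 1) ** 2 / (b + c)

b = 15   # criterion met on full-field digital only (invented count)
c = 4    # criterion met on storage phosphor plate only (invented count)
chi2 = mcnemar_chi2(b, c)
significant = chi2 > 3.841   # chi-square critical value, df=1, alpha=0.05
```

    A significant result indicates that one system meets the image quality criterion more often than the other on the same subjects.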

  19. Criteria for Developing Criteria Sets.

    ERIC Educational Resources Information Center

    Martin, James L.

    Criteria sets are a necessary step in the systematic development of evaluation in education. Evaluation results from the combination of criteria and evidence. There is a need to develop explicit tools for evaluating criteria, similar to those used in evaluating evidence. The formulation of such criteria depends on distinguishing between terms…

  20. Multi-Criteria Decision Making for a Spatial Decision Support System on the Analysis of Changing Risk

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; van Westen, Cees; Bakker, Wim H.; Aye, Zar Chi; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    Natural hazard risk management requires decision making in several stages. Decision making on alternatives for risk reduction planning starts with an intelligence phase for recognition of the decision problems and identification of the objectives. In the design phase, the alternatives are developed and decision makers assign values to each alternative. The final phase evaluates the optimal choice by comparing the alternatives, defining indicators, assigning a weight to each and ranking them. This process is referred to as Multi-Criteria Decision Making analysis (MCDM), Multi-Criteria Evaluation (MCE) or Multi-Criteria Analysis (MCA). In the framework of the ongoing 7th Framework Program "CHANGES" (2011-2014, Grant Agreement No. 263953) of the European Commission, a Spatial Decision Support System is under development that aims to analyse changes in hydro-meteorological risk and provide support in selecting the best risk reduction alternative. This paper describes the module for Multi-Criteria Decision Making analysis (MCDM) that incorporates monetary and non-monetary criteria in the analysis of the optimal alternative. The MCDM module consists of several components. The first step is to define criteria (or indicators), which are subdivided into disadvantages (criteria that indicate the difficulty of implementing the risk reduction strategy, also referred to as costs) and advantages (criteria that indicate the favorability, also referred to as benefits). In the next step the stakeholders can use the developed web-based tool for prioritizing criteria and the decision matrix. Public participation plays a role in decision making, and this is also planned through the use of a mobile web version where the general local public can indicate their agreement with the proposed alternatives. The application is being tested through a case study related to risk reduction of a mountainous valley in the Alps affected by flooding. Four alternatives are evaluated in
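    The cost/benefit criterion structure described above can be sketched as a weighted-sum ranking. The alternatives, criterion values, normalisation scheme, and weights below are all invented for illustration:

```python
import numpy as np

# Sketch: rank risk reduction alternatives from benefit criteria
# (higher is better) and cost criteria (lower is better), combined
# with stakeholder weights.
alts = ["dam", "retention basin", "early warning", "relocation"]
# columns: risk reduction achieved (benefit), implementation cost (cost)
X = np.array([[0.8,  9.0],
              [0.6,  4.0],
              [0.5,  2.0],
              [0.9, 12.0]])
weights = np.array([0.7, 0.3])       # stakeholder priorities
is_benefit = np.array([True, False])

norm = X / X.max(axis=0)                            # max normalisation
norm[:, ~is_benefit] = 1.0 - norm[:, ~is_benefit]   # invert cost criteria
scores = norm @ weights
ranking = [alts[i] for i in np.argsort(scores)[::-1]]
```

    In the tool, the weights would come from the stakeholder prioritisation step and the public-participation input rather than being fixed in code.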

  1. Motion Analysis From Television Images

    NASA Astrophysics Data System (ADS)

    Silberberg, George G.; Keller, Patrick N.

    1982-02-01

    The Department of Defense ranges have relied on photographic instrumentation for gathering data on firings for all types of ordnance. A large inventory of cameras is available on the market that can be used for these tasks. A new set of optical instrumentation is beginning to appear which, in many cases, can directly replace photographic cameras for a great deal of the work being performed now. These are television cameras modified so they can stop motion, see in the dark, perform under hostile environments, and provide real-time information. This paper discusses techniques for modifying television cameras so they can be used for motion analysis.

  2. Alternate analysis criteria for the seismic qualification of the supplementary safety system in Reactor Building 105-L, Savannah River Plant

    SciTech Connect

    Quan, C.N.; Wong, P.W.

    1982-09-01

    This system consists of a series of stainless-steel, safety-related pipelines, 1/2 to 2 in. in diameter, which run from the control room to the reactor tank. The alternate analysis criteria were developed for the seismic qualification of the piping system, according to the requirements of the 1967 Housner criteria. The application of alternate analysis criteria is a widely employed and accepted procedure for the seismic qualification of large lengths of nuclear power plant small-diameter piping (generally 2 in. in diameter and smaller). The objective of this procedure is to eliminate the need for extensive and costly individual mathematical modeling and dynamic analysis of a great number of small-bore piping systems. The procedure used to develop the alternate analysis criteria consisted of applying equivalent static seismic loading to the various pipe sizes to determine maximum support spacings for each pipe size based on the allowable stress or deflection limits. Guidelines were also developed to provide for sufficient thermal growth capacity, which is typically in direct conflict with seismic requirements.

  3. SCORE: a novel multi-criteria decision analysis approach to assessing the sustainability of contaminated land remediation.

    PubMed

    Rosén, Lars; Back, Pär-Erik; Söderqvist, Tore; Norrman, Jenny; Brinkhoff, Petra; Norberg, Tommy; Volchko, Yevheniya; Norin, Malin; Bergknut, Magnus; Döberl, Gernot

    2015-04-01

    The multi-criteria decision analysis (MCDA) method provides for a comprehensive and transparent basis for performing sustainability assessments. Development of a relevant MCDA-method requires consideration of a number of key issues, e.g. (a) definition of assessment boundaries, (b) definition of performance scales, both temporal and spatial, (c) selection of relevant criteria (indicators) that facilitate a comprehensive sustainability assessment while avoiding double-counting of effects, and (d) handling of uncertainties. Adding to the complexity is the typically wide variety of inputs, including quantifications based on existing data, expert judgements, and opinions expressed in interviews. The SCORE (Sustainable Choice Of REmediation) MCDA-method was developed to provide a transparent assessment of the sustainability of possible remediation alternatives for contaminated sites relative to a reference alternative, considering key criteria in the economic, environmental, and social sustainability domains. The criteria were identified based on literature studies, interviews and focus-group meetings. SCORE combines a linear additive model to rank the alternatives with a non-compensatory approach to identify alternatives regarded as non-sustainable. The key strengths of the SCORE method are as follows: a framework that at its core is designed to be flexible and transparent; the possibility to integrate both quantitative and qualitative estimations on criteria; its ability, unlike other sustainability assessment tools used in industry and academia, to allow for the alteration of boundary conditions where necessary; the inclusion of a full uncertainty analysis of the results, using Monte Carlo simulation; and a structure that allows preferences and opinions of involved stakeholders to be openly integrated into the analysis. A major insight from practical application of SCORE is that its most important contribution may be that it initiates a process where criteria
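    The combination described above of a linear additive model with a full Monte Carlo uncertainty analysis can be sketched as follows. The criterion scores, uncertainty level, and weights are invented; SCORE's actual criteria, scales, and non-compensatory screening are not reproduced here:

```python
import numpy as np

# Sketch: linear additive MCDA scores for two remediation alternatives
# relative to a reference, with Monte Carlo propagation of criterion
# score uncertainty.
rng = np.random.default_rng(1)
weights = np.array([0.4, 0.35, 0.25])          # econ, env, social domains
mean_scores = np.array([[+2.0, +1.0, -0.5],    # alternative A vs reference
                        [+0.5, +2.5, +1.0]])   # alternative B vs reference
sd = 0.8                                       # assumed common uncertainty

n = 10_000
draws = rng.normal(mean_scores, sd, size=(n, 2, 3))
totals = draws @ weights                       # (n, 2) total scores per draw
p_b_best = float((totals[:, 1] > totals[:, 0]).mean())
```

    Reporting the probability that each alternative ranks best, rather than a single deterministic score, is what the uncertainty analysis adds to the ranking.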

  5. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

    NASA Astrophysics Data System (ADS)

    Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

    2016-04-01

    Road and railway networks are one of the key factors in a country's economic growth. Inadequate infrastructural networks can be detrimental to a society if transport between locations is hindered or delayed. Logistical hindrances can often be avoided, whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can trigger other natural hazards such as landslides and debris flow. Disruptions caused by landslides are similar to those of floods and increase maintenance costs considerably. The effects of natural disasters on society are likely to increase with a changing climate and increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce the infrastructural harm. Spatial multi-criteria analysis (SMCA) is a part of decision analysis, which provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through a SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and its impact on the resulting susceptibility. A least cost path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements as well as historic incidences of flooding and landslides, in order to discuss the usefulness of the model in road planning.

  6. Medical image analysis with artificial neural networks.

    PubMed

    Jiang, J; Trundle, P; Ren, J

    2010-12-01

    Given that neural networks have been widely reported in the research community of medical imaging, we provide a focused literature survey on recent neural network developments in computer-aided diagnosis, medical image segmentation and edge detection towards visual content analysis, and medical image registration for its pre-processing and post-processing, with the aims of increasing awareness of how neural networks can be applied to these areas and of providing a foundation for further research and practical development. Representative techniques and algorithms are explained in detail to provide inspiring examples illustrating: (i) how a known neural network with fixed structure and training procedure could be applied to resolve a medical imaging problem; (ii) how medical images could be analysed, processed, and characterised by neural networks; and (iii) how neural networks could be expanded further to resolve problems relevant to medical imaging. In the concluding section, a highlight of comparisons among many neural network applications is included to provide a global view on computational intelligence with neural networks in medical imaging.

  7. Image distortion analysis using polynomial series expansion.

    PubMed

    Baggenstoss, Paul M

    2004-11-01

    In this paper, we derive a technique for analysis of local distortions which affect data in real-world applications. In the paper, we focus on image data, specifically handwritten characters. Given a reference image and a distorted copy of it, the method is able to efficiently determine the rotations, translations, scaling, and any other distortions that have been applied. Because the method is robust, it is also able to estimate distortions for two unrelated images, thus determining the distortions that would be required to cause the two images to resemble each other. The approach is based on a polynomial series expansion using matrix powers of linear transformation matrices. The technique has applications in pattern recognition in the presence of distortions. PMID:15521492

  8. Fourier analysis: from cloaking to imaging

    NASA Astrophysics Data System (ADS)

    Wu, Kedi; Cheng, Qiluan; Wang, Guo Ping

    2016-04-01

    Regarding invisibility cloaks as an optical imaging system, we present a Fourier approach to analytically unify both Pendry cloaks and complementary media-based invisibility cloaks into one kind of cloak. By synthesizing different transfer functions, we can construct different devices to realize a series of interesting functions such as hiding objects (events), creating illusions, and performing perfect imaging. In this article, we give a brief review of recent work applying the Fourier approach to the analysis of invisibility cloaks and optical imaging through scattering layers. We show that, to construct devices to conceal an object, no constitutive materials with extreme properties are required, making most, if not all, of the above functions realizable by using naturally occurring materials. As instances, we experimentally verify a method of directionally hiding distant objects and creating illusions by using all-dielectric materials, and further demonstrate a non-invasive method of imaging objects completely hidden by scattering layers.

  9. Principal Components Analysis In Medical Imaging

    NASA Astrophysics Data System (ADS)

    Weaver, J. B.; Huddleston, A. L.

    1986-06-01

    Principal components analysis, PCA, is basically a data reduction technique. PCA has been used in several problems in diagnostic radiology: processing radioisotope brain scans (Ref. 1), automatic alignment of radionuclide images (Ref. 2), processing MRI images (Ref. 3,4), analyzing first-pass cardiac studies (Ref. 5), correcting for attenuation in bone mineral measurements (Ref. 6), and dual-energy x-ray imaging (Ref. 6,7). This paper will progress as follows: a brief introduction to the mathematics of PCA will be followed by two brief examples of how PCA has been used in the literature. Finally, my own experience with PCA in dual-energy x-ray imaging will be given.
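    The data reduction idea can be sketched as follows, on synthetic data standing in for two correlated energy channels (as in dual-energy x-ray imaging); the data and correlation structure are invented for illustration:

```python
import numpy as np

# Sketch: PCA as data reduction. Two strongly correlated measurement
# channels are projected onto the leading principal component.
rng = np.random.default_rng(2)
low = rng.normal(size=500)                   # "low energy" channel
high = 0.9 * low + 0.1 * rng.normal(size=500)  # correlated "high energy"
X = np.column_stack([low, high])

Xc = X - X.mean(axis=0)                      # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)              # sample covariance
evals, evecs = np.linalg.eigh(cov)
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

explained = evals[0] / evals.sum()           # variance captured by PC1
pc1 = Xc @ evecs[:, 0]                       # reduced 1-D representation
```

    When the channels are highly correlated, almost all the variance lands in the first component, which is what makes PCA useful for compressing such image pairs.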

  10. Analysis of extensively washed hair from cocaine users and drug chemists to establish new reporting criteria.

    PubMed

    Morris-Kukoski, Cynthia L; Montgomery, Madeline A; Hammer, Rena L

    2014-01-01

    Samples from a self-proclaimed cocaine (COC) user, from 19 drug users (postmortem) and from 27 drug chemists were extensively washed and analyzed for COC, benzoylecgonine, norcocaine (NC), cocaethylene (CE) and aryl hydroxycocaines by liquid chromatography-tandem mass spectrometry. Published wash criteria and cutoffs were applied to the results. Additionally, the data were used to formulate new reporting criteria and interpretation guidelines for forensic casework. Applying the wash and reporting criteria, hair that was externally contaminated with COC was distinguished from hair collected from individuals known to have consumed COC. In addition, CE, NC and hydroxycocaine metabolites were only present in COC users' hair and not in drug chemists' hair. When properly applied, the use of an extended wash, along with the reporting criteria defined here, will exclude false-positive results from environmental contact with COC.

  12. Addressing preference heterogeneity in public health policy by combining Cluster Analysis and Multi-Criteria Decision Analysis: Proof of Method.

    PubMed

    Kaltoft, Mette Kjer; Turner, Robin; Cunich, Michelle; Salkeld, Glenn; Nielsen, Jesper Bo; Dowie, Jack

    2015-01-01

    The use of subgroups based on biological-clinical and socio-demographic variables to deal with population heterogeneity is well-established in public policy. The use of subgroups based on preferences is rare, except when religion-based, and controversial. If it were decided to treat subgroup preferences as valid determinants of public policy, a transparent analytical procedure is needed. In this proof of method study we show how public preferences could be incorporated into policy decisions in a way that respects both the multi-criterial nature of those decisions, and the heterogeneity of the population in relation to the importance assigned to relevant criteria. It involves combining Cluster Analysis (CA), to generate the subgroup sets of preferences, with Multi-Criteria Decision Analysis (MCDA), to provide the policy framework into which the clustered preferences are entered. We employ three techniques of CA to demonstrate that not only do different techniques produce different clusters, but that choosing among techniques (as well as developing the MCDA structure) is an important task to be undertaken in implementing the approach outlined in any specific policy context. Data for the illustrative, not substantive, application are from a Randomized Controlled Trial of online decision aids for Australian men aged 40-69 years considering Prostate-specific Antigen testing for prostate cancer. We show that such analyses can provide policy-makers with insights into the criterion-specific needs of different subgroups. Implementing CA and MCDA in combination to assist in the development of policies on important health and community issues such as drug coverage, reimbursement, and screening programs, poses major challenges (conceptual, methodological, ethical-political, and practical), but most are exposed by the techniques, not created by them.
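    The CA-then-MCDA combination can be sketched as follows. The preference data, the naive k-means clustering, and the option performance matrix are all invented; the study used real survey data and compared three CA techniques:

```python
import numpy as np

# Sketch: cluster respondents by the importance weights they assign
# to criteria (tiny k-means), then run a weighted-sum MCDA per cluster.
rng = np.random.default_rng(3)
# respondents' weights over 3 criteria (each row sums to 1)
W = np.vstack([rng.dirichlet([8, 1, 1], 20),   # group favouring criterion 1
               rng.dirichlet([1, 1, 8], 20)])  # group favouring criterion 3

k = 2
centers = W[[0, -1]].copy()                    # naive k-means initialisation
for _ in range(20):
    dists = ((W[:, None] - centers[None]) ** 2).sum(-1)
    labels = np.argmin(dists, axis=1)
    centers = np.vstack([W[labels == j].mean(axis=0) for j in range(k)])

# MCDA: two policy options scored on the same 3 criteria (invented)
options = np.array([[0.9, 0.2, 0.1],
                    [0.3, 0.5, 0.8]])
preferred = {j: int(np.argmax(options @ centers[j])) for j in range(k)}
```

    The point of the combination is visible in `preferred`: subgroups with different criterion weights can rationally favour different policy options under the same MCDA structure.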

  13. Measuring toothbrush interproximal penetration using image analysis

    NASA Astrophysics Data System (ADS)

    Hayworth, Mark S.; Lyons, Elizabeth K.

    1994-09-01

    An image analysis method of measuring the effectiveness of a toothbrush in reaching the interproximal spaces of teeth is described. Artificial teeth are coated with a stain that approximates real plaque and then brushed with a toothbrush on a brushing machine. The teeth are then removed and turned sideways so that the interproximal surfaces can be imaged. The areas of stain that have been removed within masked regions that define the interproximal regions are measured and reported. These areas correspond to the interproximal areas of the tooth reached by the toothbrush bristles. The image analysis method produces more precise results (10-fold decrease in standard deviation) in a fraction (22%) of the time as compared to our prior visual grading method.
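
    The core area measurement can be sketched as follows. This is a toy numpy illustration: the binary "stain removed" image and the interproximal mask geometry are invented placeholders for the thresholded camera images the study describes.

```python
import numpy as np

# Toy stand-in for a side-view tooth image: 0 = stain present, 1 = stain removed.
# In the real workflow these values would come from thresholding a captured image.
img = np.zeros((100, 100), dtype=np.uint8)
img[30:70, 40:60] = 1                      # region the bristles reached

# Boolean mask delimiting the interproximal region of interest.
mask = np.zeros_like(img, dtype=bool)
mask[20:80, 35:65] = True

# Area of removed stain inside the masked interproximal region.
removed = np.logical_and(img == 1, mask).sum()
pct = 100.0 * removed / mask.sum()
print(f"stain removed in interproximal region: {pct:.1f}%")
```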

  14. Up-to-seven criteria for hepatocellular carcinoma liver transplantation: A single center analysis

    PubMed Central

    Lei, Jian-Yong; Wang, Wen-Tao; Yan, Lu-Nan

    2013-01-01

    AIM: To determine whether the up-to-seven criteria should be used as inclusion criteria for liver transplantation for hepatocellular carcinoma. METHODS: Between April 2002 and July 2008, 220 patients who were diagnosed with hepatocellular carcinoma (HCC) and underwent liver transplantation (LT) at our liver transplantation center were included. These patients were divided into three groups according to the characteristics of their tumors (tumor diameter, tumor number): the Milan criteria group (Group 1), the within up-to-seven criteria group (Group 2) and the beyond up-to-seven criteria group (Group 3). Then, we compared long-term survival and tumor recurrence among these three groups. RESULTS: The baseline characteristics of transplant recipients were comparable among these three groups, except for the type of liver graft (deceased donor or living donor liver transplantation). There were also no significant differences in the pre-operative α-fetoprotein level. The 1-, 3-, and 5-year overall survival and tumor-free survival rates for the Milan criteria group were 94.8%, 91.4%, and 89.7% and 91.4%, 86.2%, and 86.2%, respectively; in the within up-to-seven criteria group, these rates were 87.8%, 77.8%, and 76.6% and 85.6%, 75.6%, and 75.6%, respectively (P < 0.05). However, for advanced HCC patients (the beyond up-to-seven criteria group), the overall and tumor-free survival rates were much lower, at 75%, 53.3%, and 50% and 65.8%, 42.5%, and 41.7%, respectively (P < 0.01). CONCLUSION: Considering that patients in the within up-to-seven criteria group exhibited a considerable but lower survival rate compared with the Milan criteria group, the up-to-seven criteria should be used carefully and selectively. PMID:24106409

  15. Discussion paper on applicability of oil and grease analysis for RCRA closure criteria

    SciTech Connect

    1995-02-01

    A site characterization (SC) was performed for the Building 9409-5 Diked Tank Storage Facility. The initial SC indicated areas which had oil and grease levels above the criteria of the currently proposed RCRA closure plan. After further investigation, it was demonstrated that the oil and grease parameter may not be an accurate indication of a release from this facility and should not be included as a contaminant of concern in the closure criteria.

  16. Unsupervised hyperspectral image analysis using independent component analysis (ICA)

    SciTech Connect

    S. S. Chiang; I. W. Ginsberg

    2000-06-30

    In this paper, an ICA-based approach is proposed for hyperspectral image analysis. It can be viewed as a random version of the commonly used linear spectral mixture analysis, in which the abundance fractions in a linear mixture model are considered to be unknown independent signal sources. It does not require the full rank of the separating matrix or orthogonality as most ICA methods do. More importantly, the learning algorithm is designed based on the independency of the material abundance vector rather than the independency of the separating matrix generally used to constrain the standard ICA. As a result, the designed learning algorithm is able to converge to non-orthogonal independent components. This is particularly useful in hyperspectral image analysis since many materials extracted from a hyperspectral image may have similar spectral signatures and may not be orthogonal. The AVIRIS experiments have demonstrated that the proposed ICA provides an effective unsupervised technique for hyperspectral image classification.
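
    For comparison with the paper's approach, a standard symmetric FastICA (which, unlike the authors' learning algorithm, does enforce an orthogonal separating matrix after whitening) can be sketched in numpy. The two synthetic sources stand in for material abundance signals; this is a generic ICA demonstration, not the paper's non-orthogonal method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent synthetic sources standing in for material abundances.
t = np.linspace(0, 8, 2000)
S = np.vstack([np.sin(3 * t), np.sign(np.sin(5 * t))])
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # mixing matrix
X = A @ S                                 # observed mixtures

# Whiten the mixtures.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# Symmetric FastICA with tanh nonlinearity.
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    # Fixed-point update: E[g(Wz) z^T] - diag(E[g'(Wz)]) W
    W_new = (G @ Z.T) / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
    # Symmetric decorrelation: W <- (W W^T)^{-1/2} W, via SVD.
    u, s, vt = np.linalg.svd(W_new)
    W = u @ vt

recovered = W @ Z   # each row should match one source up to sign and scale
```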

  17. PIXE analysis and imaging of papyrus documents

    NASA Astrophysics Data System (ADS)

    Lövestam, N. E. Göran; Swietlicki, Erik

    1990-01-01

    The analysis of antique papyrus documents using an external milliprobe is described. Missing characters of text in the documents were made visible by means of PIXE analysis and X-ray imaging of the areas studied. The contrast between the papyrus and the ink was further increased when the information contained in all the elements was taken into account simultaneously using a multivariate technique (partial least-squares regression).

  18. Visualization of Parameter Space for Image Analysis

    PubMed Central

    Pretorius, A. Johannes; Bray, Mark-Anthony P.; Carpenter, Anne E.; Ruddle, Roy A.

    2013-01-01

    Image analysis algorithms are often highly parameterized and much human input is needed to optimize parameter settings. This incurs a time cost of up to several days. We analyze and characterize the conventional parameter optimization process for image analysis and formulate user requirements. With this as input, we propose a change in paradigm by optimizing parameters based on parameter sampling and interactive visual exploration. To save time and reduce memory load, users are only involved in the first step - initialization of sampling - and the last step - visual analysis of output. This helps users to more thoroughly explore the parameter space and produce higher quality results. We describe a custom sampling plug-in we developed for CellProfiler - a popular biomedical image analysis framework. Our main focus is the development of an interactive visualization technique that enables users to analyze the relationships between sampled input parameters and corresponding output. We implemented this in a prototype called Paramorama. It provides users with a visual overview of parameters and their sampled values. User-defined areas of interest are presented in a structured way that includes image-based output and a novel layout algorithm. To find optimal parameter settings, users can tag high- and low-quality results to refine their search. We include two case studies to illustrate the utility of this approach. PMID:22034361
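
    The two-step idea - sample the parameter space up front, then involve the user only when inspecting ranked output - can be sketched like this. The parameter names and quality function are invented placeholders, not CellProfiler's or Paramorama's actual settings.

```python
import itertools

# Toy stand-in for an image-analysis pipeline: a "segmentation" whose
# quality depends on two parameters (names are illustrative only).
def pipeline_quality(threshold, min_size):
    # Pretend quality peaks at threshold=0.5, min_size=20.
    return 1.0 - abs(threshold - 0.5) - abs(min_size - 20) / 100.0

# Step 1: sample the parameter space without the user in the loop.
thresholds = [0.3, 0.4, 0.5, 0.6, 0.7]
min_sizes = [10, 20, 30, 40]
samples = [(t, m, pipeline_quality(t, m))
           for t, m in itertools.product(thresholds, min_sizes)]

# Step 2: the user only inspects and tags the ranked results.
best = max(samples, key=lambda s: s[2])
print(f"best setting: threshold={best[0]}, min_size={best[1]}")
```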

  19. Using Image Analysis to Build Reading Comprehension

    ERIC Educational Resources Information Center

    Brown, Sarah Drake; Swope, John

    2010-01-01

    Content area reading remains a primary concern of history educators. In order to better prepare students for encounters with text, the authors propose the use of two image analysis strategies tied with a historical theme to heighten student interest in historical content and provide a basis for improved reading comprehension.

  20. Scale Free Reduced Rank Image Analysis.

    ERIC Educational Resources Information Center

    Horst, Paul

    In the traditional Guttman-Harris type image analysis, a transformation is applied to the data matrix such that each column of the transformed data matrix is the best least squares estimate of the corresponding column of the data matrix from the remaining columns. The model is scale free. However, it assumes (1) that the correlation matrix is…

  1. COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    EPA Science Inventory

    T Martonen1 and J Schroeter2

    1Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, U.S. EPA, Research Triangle Park, NC 27711 USA and 2Curriculum in Toxicology, Unive...

  2. Multi-criteria multi-stakeholder decision analysis using a fuzzy-stochastic approach for hydrosystem management

    NASA Astrophysics Data System (ADS)

    Subagadis, Y. H.; Schütze, N.; Grundmann, J.

    2014-09-01

    The conventional methods used to solve multi-criteria multi-stakeholder problems are weakly formulated, as they normally incorporate only homogeneous information at a time and aggregate the objectives of different decision-makers while neglecting water-society interactions. In this contribution, Multi-Criteria Group Decision Analysis (MCGDA) using a fuzzy-stochastic approach has been proposed to rank a set of alternatives in water management decisions incorporating heterogeneous information under uncertainty. The decision making framework takes hydrologically, environmentally, and socio-economically motivated conflicting objectives into consideration. The criteria related to the performance of the physical system are optimized using multi-criteria simulation-based optimization, and fuzzy linguistic quantifiers have been used to evaluate subjective criteria and to assess stakeholders' degree of optimism. The proposed methodology is applied to find effective and robust intervention strategies for the management of a coastal hydrosystem affected by saltwater intrusion due to excessive groundwater extraction for irrigated agriculture and municipal use. Preliminary results show that the MCGDA based on a fuzzy-stochastic approach gives useful support for robust decision-making and is sensitive to the decision makers' degree of optimism.

  3. Building a picture: Prioritisation of exotic diseases for the pig industry in Australia using multi-criteria decision analysis.

    PubMed

    Brookes, V J; Hernández-Jover, M; Cowled, B; Holyoake, P K; Ward, M P

    2014-01-01

    Diseases that are exotic to the pig industry in Australia were prioritised using a multi-criteria decision analysis framework that incorporated weights of importance for a range of criteria important to industry stakeholders. Measurements were collected for each disease for nine criteria that described potential disease impacts. A total score was calculated for each disease using a weighted sum value function that aggregated the nine disease criterion measurements and weights of importance for the criteria that were previously elicited from two groups of industry stakeholders. One stakeholder group placed most value on the impacts of disease on livestock, and one group placed more value on the zoonotic impacts of diseases. Prioritisation lists ordered by disease score were produced for both of these groups. Vesicular diseases were found to have the highest priority for the group valuing disease impacts on livestock, followed by acute forms of African and classical swine fever, then highly pathogenic porcine reproductive and respiratory syndrome. The group who valued zoonotic disease impacts prioritised rabies, followed by Japanese encephalitis, Eastern equine encephalitis and Nipah virus, interspersed with vesicular diseases. The multi-criteria framework used in this study systematically prioritised diseases using a multi-attribute theory based technique that provided transparency and repeatability in the process. Flexibility of the framework was demonstrated by aggregating the criterion weights from more than one stakeholder group with the disease measurements for the criteria. This technique allowed industry stakeholders to be active in resource allocation for their industry without the need to be disease experts. We believe it is the first prioritisation of livestock diseases using values provided by industry stakeholders. 
The prioritisation lists will be used by industry stakeholders to identify diseases for further risk analysis and disease spread modelling to
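
    The weighted sum value function described above can be sketched as follows. The disease names, criterion measurements, and stakeholder weights are illustrative inventions, not the study's nine criteria or elicited values; the point is only the aggregation mechanics (total score = Σ wᵢ·xᵢ, once per stakeholder group).

```python
import numpy as np

# Illustrative criterion measurements for three hypothetical diseases
# on three impact criteria, each scaled to 0-1.
measurements = np.array([
    [0.9, 0.8, 0.1],   # disease A: high livestock impact, low zoonotic
    [0.4, 0.3, 0.9],   # disease B: mainly zoonotic
    [0.6, 0.6, 0.5],   # disease C: mixed
])

# Weights elicited separately from two stakeholder groups (rows sum to 1).
w_livestock = np.array([0.5, 0.4, 0.1])
w_zoonotic = np.array([0.1, 0.2, 0.7])

# Weighted sum value function, aggregated per group.
for name, w in [("livestock-focused", w_livestock), ("zoonosis-focused", w_zoonotic)]:
    scores = measurements @ w
    ranking = np.argsort(-scores)          # descending priority order
    print(name, "priority order:", ranking)
```

    As in the study, the same measurements produce different priority lists once different groups' weights are plugged in.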

  5. Good relationships between computational image analysis and radiological physics

    SciTech Connect

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-30

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  6. Multi-criteria decision analysis for bioenergy in the Centre Region of Portugal

    NASA Astrophysics Data System (ADS)

    Esteves, T. C. J.; Cabral, P.; Ferreira, A. J. D.; Teixeira, J. C.

    2012-04-01

    With the consumption of fossil fuels, the resources essential to Man's survival are being rapidly contaminated. A sustainable future may be achieved by the use of renewable energies, allowing countries without non-renewable energy resources to guarantee energetic sovereignty. Using bioenergy may mean a steep reduction and/or elimination of the external dependency, enhancing the countries' capital and potentially reducing the negative effects that result from the use of fossil fuels, such as loss of biodiversity, air, water, and soil pollution, … This work's main focus is to increase bioenergy use in the centre region of Portugal by allying R&D to facilitate determination of bioenergy availability and distribution throughout the study area. This analysis is essential, given that nowadays this knowledge is still very limited in the study area. Geographic Information Systems (GIS) was the main tool used in this study, due to its ability to seamlessly integrate various types of information (such as alphanumerical, statistical, geographical, …) and various sources of biomass (forest, agricultural, husbandry, municipal and industrial residues, shrublands, used vegetable oil and energy crops) to determine the bioenergy potential of the study area, as well as their spatial distribution. By allying GIS with multi-criteria decision analysis, the initial table-like information of difficult comprehension is transformed into tangible and easy-to-read results: both intermediate and final results of the created models will facilitate the decision making process. General results show that the major contributors to the bioenergy potential in the Centre Region of Portugal are forest residues, which are mostly located in the inner region of the study area. However, a more detailed analysis should be made to analyze the viability of using energy crops. As a main conclusion, we can say that, although this region may not use only this type of energy to be completely

  7. Frequency domain analysis of knock images

    NASA Astrophysics Data System (ADS)

    Qi, Yunliang; He, Xin; Wang, Zhi; Wang, Jianxin

    2014-12-01

    High speed imaging-based knock analysis has mainly focused on time domain information, e.g. the spark triggered flame speed, the time when end gas auto-ignition occurs and the end gas flame speed after auto-ignition. This study presents a frequency domain analysis on the knock images recorded using a high speed camera with direct photography in a rapid compression machine (RCM). To clearly visualize the pressure wave oscillation in the combustion chamber, the images were high-pass-filtered to extract the luminosity oscillation. The luminosity spectrum was then obtained by applying fast Fourier transform (FFT) to three basic colour components (red, green and blue) of the high-pass-filtered images. Compared to the pressure spectrum, the luminosity spectra better identify the resonant modes of pressure wave oscillation. More importantly, the resonant mode shapes can be clearly visualized by reconstructing the images based on the amplitudes of luminosity spectra at the corresponding resonant frequencies, which agree well with the analytical solutions for mode shapes of gas vibration in a cylindrical cavity.
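
    The core frequency-domain step - high-pass filtering a luminosity trace, then locating the resonant peak via FFT - can be sketched as follows. One pixel's time series stands in for the image sequence, and the frame rate and 6 kHz knock frequency are illustrative numbers, not the study's RCM data.

```python
import numpy as np

fs = 100_000                         # assumed high-speed camera frame rate, Hz
t = np.arange(2048) / fs
# Mean luminosity level plus a small 6 kHz knock-induced oscillation.
lum = 2.0 + 0.1 * np.sin(2 * np.pi * 6000 * t)

# Crude high-pass filter: remove the slowly varying mean component.
osc = lum - lum.mean()

# Luminosity spectrum via FFT; the peak identifies the resonant mode.
spec = np.abs(np.fft.rfft(osc))
freqs = np.fft.rfftfreq(len(osc), d=1 / fs)
peak = freqs[np.argmax(spec)]
print(f"dominant oscillation: {peak:.0f} Hz")
```

    In the study this is done per colour channel and per pixel, so that reconstructing the image from the spectral amplitude at a resonant frequency visualizes the mode shape.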

  8. Environmental condition assessment of US military installations using GIS based spatial multi-criteria decision analysis.

    PubMed

    Singer, Steve; Wang, Guangxing; Howard, Heidi; Anderson, Alan

    2012-08-01

    The environment functions in various ways, including soil and water conservation, biodiversity and habitats, and landscape aesthetics. Comprehensive assessment of environmental condition is thus a great challenge. The issues include how to assess individual environmental components such as landscape aesthetics and integrate them into an indicator that can comprehensively quantify environmental condition. In this study, a geographic information systems based spatial multi-criteria decision analysis was used to integrate environmental variables and create the indicator. This approach was applied to the Fort Riley military installation, in which land condition and its dynamics due to military training activities were assessed. The indicator was derived by integrating soil erosion, water quality, landscape fragmentation, landscape aesthetics, and noise based on weights from experts who assessed and ranked the environmental variables in terms of their importance. The results showed that the landscape-level indicator well quantified the overall environmental condition and its dynamics, while the indicator at the level of the patch, defined as a homogeneous area that is different from its surroundings, detailed the spatiotemporal variability of environmental condition. The environmental condition was determined mostly by soil erosion, then landscape fragmentation, water quality, landscape aesthetics, and noise. Overall, environmental condition at both landscape and patch levels varied greatly depending on the degree of ground and canopy disturbance and their spatial patterns due to military training activities, and was related to slope. It was also determined that the environment itself could recover quickly once military training was halted or reduced. Thus, this study provided an effective tool for army land managers to monitor environmental dynamics and plan military training activities. Its limitation lies in that the obtained values of the indicator vary and are

  9. Four Common Simplifications of Multi-Criteria Decision Analysis do not hold for River Rehabilitation

    PubMed Central

    Langhans, Simone D; Lienert, Judit

    2016-01-01

    River rehabilitation aims at alleviating negative effects of human impacts such as loss of biodiversity and reduction of ecosystem services. Such interventions entail difficult trade-offs between different ecological and often socio-economic objectives. Multi-Criteria Decision Analysis (MCDA) is a very suitable approach that helps assess the current ecological state and prioritize river rehabilitation measures in a standardized way, based on stakeholder or expert preferences. Applications of MCDA in river rehabilitation projects are often simplified, i.e. using a limited number of objectives and indicators, assuming linear value functions, aggregating individual indicator assessments additively, and/or assuming risk neutrality of experts. Here, we demonstrate an implementation of MCDA expert preference assessments for river rehabilitation and provide ample material for other applications. To test whether the above simplifications reflect common expert opinion, we carried out very detailed interviews with five river ecologists and a hydraulic engineer. We defined essential objectives and measurable quality indicators (attributes), elicited the experts' preferences for objectives on a standardized scale (value functions) and their risk attitude, and identified suitable aggregation methods. The experts recommended an extensive objectives hierarchy including between 54 and 93 essential objectives and between 37 and 61 essential attributes. For 81% of these, they defined non-linear value functions and in 76% recommended multiplicative aggregation. The experts were risk averse or risk prone (but never risk neutral), depending on the current ecological state of the river and the experts' personal importance of objectives. We conclude that the four commonly applied simplifications clearly do not reflect the opinion of river rehabilitation experts. The optimal level of model complexity, however, remains highly case-study specific depending on data and resource
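
    The difference between the commonly assumed additive aggregation and the multiplicative aggregation most experts recommended can be shown with a small sketch. The exponential value function and the attribute values are generic illustrations, not the study's elicited functions.

```python
import numpy as np

# Two attribute values (0 = worst, 1 = best) for a river reach,
# e.g. habitat quality and water quality (illustrative numbers).
x = np.array([0.9, 0.1])
w = np.array([0.5, 0.5])

# A non-linear (exponential) value function, as most experts preferred
# over the usual linear assumption.
def value(x, risk=2.0):
    return (1 - np.exp(-risk * x)) / (1 - np.exp(-risk))

v = value(x)

additive = np.sum(w * v)               # allows full compensation between attributes
multiplicative = np.prod(v ** w)       # penalizes any poorly performing attribute

print(round(additive, 3), round(multiplicative, 3))
```

    With one good and one bad attribute, the multiplicative score is clearly lower than the additive one, which is why the choice of aggregation rule matters for ranking rehabilitation measures.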

  11. Carbon storage, timber production, and biodiversity: comparing ecosystem services with multi-criteria decision analysis.

    PubMed

    Schwenk, W Scott; Donovan, Therese M; Keeton, William S; Nunery, Jared S

    2012-07-01

    Increasingly, land managers seek ways to manage forests for multiple ecosystem services and functions, yet considerable challenges exist in comparing disparate services and balancing trade-offs among them. We applied multi-criteria decision analysis (MCDA) and forest simulation models to simultaneously consider three objectives: (1) storing carbon, (2) producing timber and wood products, and (3) sustaining biodiversity. We used the Forest Vegetation Simulator (FVS) applied to 42 northern hardwood sites to simulate forest development over 100 years and to estimate carbon storage and timber production. We estimated biodiversity implications with occupancy models for 51 terrestrial bird species that were linked to FVS outputs. We simulated four alternative management prescriptions that spanned a range of harvesting intensities and forest structure retention. We found that silvicultural approaches emphasizing less frequent harvesting and greater structural retention could be expected to achieve the greatest net carbon storage but also produce less timber. More intensive prescriptions would enhance biodiversity because positive responses of early successional species exceeded negative responses of late successional species within the heavily forested study area. The combinations of weights assigned to objectives had a large influence on which prescriptions were scored as optimal. Overall, we found that a diversity of silvicultural approaches is likely to be preferable to any single approach, emphasizing the need for landscape-scale management to provide a full range of ecosystem goods and services. Our analytical framework that combined MCDA with forest simulation modeling was a powerful tool in understanding trade-offs among management objectives and how they can be simultaneously accommodated. PMID:22908717

  12. Digital imaging analysis to assess scar phenotype.

    PubMed

    Smith, Brian J; Nidey, Nichole; Miller, Steven F; Moreno Uribe, Lina M; Baum, Christian L; Hamilton, Grant S; Wehby, George L; Dunnwald, Martine

    2014-01-01

    In order to understand the link between the genetic background of patients and wound clinical outcomes, it is critical to have a reliable method to assess the phenotypic characteristics of healed wounds. In this study, we present a novel imaging method that provides reproducible, sensitive, and unbiased assessments of postsurgical scarring. We used this approach to investigate the possibility that genetic variants in orofacial clefting genes are associated with suboptimal healing. Red-green-blue digital images of postsurgical scars of 68 patients, following unilateral cleft lip repair, were captured using the 3dMD imaging system. Morphometric and colorimetric data of repaired regions of the philtrum and upper lip were acquired using ImageJ software, and the unaffected contralateral regions were used as patient-specific controls. Repeatability of the method was high with intraclass correlation coefficient score > 0.8. This method detected a very significant difference in all three colors, and for all patients, between the scarred and the contralateral unaffected philtrum (p ranging from 1.20 × 10⁻⁵ to 1.95 × 10⁻¹⁴). Physicians' clinical outcome ratings from the same images showed high interobserver variability (overall Pearson coefficient = 0.49) as well as low correlation with digital image analysis results. Finally, we identified genetic variants in TGFB3 and ARHGAP29 associated with suboptimal healing outcome.
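
    The patient-specific control design can be sketched as follows: compare each scar region against the same patient's unaffected contralateral region via the paired difference. The intensities are synthetic stand-ins for the ImageJ colorimetric measurements, and a plain paired t statistic stands in for the study's statistics.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic mean red-channel intensities (0-255 scale) for the scarred
# philtrum region and the unaffected contralateral control region of
# 20 hypothetical patients.
scar = rng.normal(170, 10, size=20)
control = rng.normal(150, 10, size=20)

# Patient-specific control: analyze the paired difference per patient.
diff = scar - control
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(len(diff)))
print(f"mean paired difference: {diff.mean():.1f}, paired t = {t_stat:.2f}")
```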

  13. Ultrasonic image analysis for beef tenderness

    NASA Astrophysics Data System (ADS)

    Park, Bosoon; Thane, Brian R.; Whittaker, A. D.

    1993-05-01

    Objective measurement of meat tenderness has been a topic of concern for palatability evaluation. In this study, a real-time ultrasonic B-mode imaging method was used for noninvasively measuring beef palatability attributes such as juiciness, muscle fiber tenderness, connective tissue amount, overall tenderness, flavor intensity, and percent total collagen. A temporal averaging image enhancement method was used for image analysis. Ultrasonic image intensity, fractal dimension, attenuation, and statistical gray-tone spatial-dependence matrix image texture measurements were analyzed. The contrast textural feature was the parameter most correlated with palatability attributes. The longitudinal scanning method was better for juiciness, muscle fiber tenderness, flavor intensity, and percent soluble collagen, whereas the cross-sectional method was better for connective tissue amount and overall tenderness. Multivariate linear regression models were developed as functions of the textural features and image intensity parameters. The coefficients of determination of the regression models were R² = 0.97 for juiciness, R² = 0.88 for percent total collagen, R² = 0.75 for flavor intensity, R² = 0.55 for muscle fiber tenderness, and R² = 0.49 for overall tenderness.
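
    The gray-tone spatial-dependence (co-occurrence) matrix and its contrast feature - the texture measurement the study found most correlated with palatability - can be computed as follows. This uses a toy 4-level image and an unsymmetrized matrix with a one-pixel horizontal offset for brevity.

```python
import numpy as np

# Toy 4-gray-level image.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
levels = 4

# Co-occurrence counts for the offset "one pixel to the right".
glcm = np.zeros((levels, levels))
for i, j in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
    glcm[i, j] += 1
glcm /= glcm.sum()                      # normalize to joint probabilities

# Contrast feature: sum over (i, j) of p(i, j) * (i - j)^2.
i_idx, j_idx = np.indices((levels, levels))
contrast = np.sum(glcm * (i_idx - j_idx) ** 2)
print(f"GLCM contrast: {contrast:.3f}")
```

    Large off-diagonal mass (big gray-level jumps between neighbouring pixels) drives contrast up, which is why it tracks visible texture in the ultrasound images.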

  14. Digital imaging analysis to assess scar phenotype

    PubMed Central

    Smith, Brian J.; Nidey, Nichole; Miller, Steven F.; Moreno, Lina M.; Baum, Christian L.; Hamilton, Grant S.; Wehby, George L.; Dunnwald, Martine

    2015-01-01

    In order to understand the link between the genetic background of patients and wound clinical outcomes, it is critical to have a reliable method to assess the phenotypic characteristics of healed wounds. In this study, we present a novel imaging method that provides reproducible, sensitive and unbiased assessments of post-surgical scarring. We used this approach to investigate the possibility that genetic variants in orofacial clefting genes are associated with suboptimal healing. Red-green-blue (RGB) digital images of post-surgical scars of 68 patients, following unilateral cleft lip repair, were captured using the 3dMD image system. Morphometric and colorimetric data of repaired regions of the philtrum and upper lip were acquired using ImageJ software and the unaffected contralateral regions were used as patient-specific controls. Repeatability of the method was high with intraclass correlation coefficient score > 0.8. This method detected a very significant difference in all three colors, and for all patients, between the scarred and the contralateral unaffected philtrum (P ranging from 1.20 × 10⁻⁵ to 1.95 × 10⁻¹⁴). Physicians’ clinical outcome ratings from the same images showed high inter-observer variability (overall Pearson coefficient = 0.49) as well as low correlation with digital image analysis results. Finally, we identified genetic variants in TGFB3 and ARHGAP29 associated with suboptimal healing outcome. PMID:24635173

  15. A multi-criteria analysis approach for ranking and selection of microorganisms for the production of oils for biodiesel production.

    PubMed

    Ahmad, Farah B; Zhang, Zhanying; Doherty, William O S; O'Hara, Ian M

    2015-08-01

    Oleaginous microorganisms have potential to be used to produce oils as alternative feedstock for biodiesel production. Microalgae (Chlorella protothecoides and Chlorella zofingiensis), yeasts (Cryptococcus albidus and Rhodotorula mucilaginosa), and fungi (Aspergillus oryzae and Mucor plumbeus) were investigated for their ability to produce oil from glucose, xylose and glycerol. Multi-criteria analysis (MCA) using the analytic hierarchy process (AHP) and the preference ranking organization method for enrichment of evaluations (PROMETHEE) with graphical analysis for interactive aid (GAIA) was used to rank and select the preferred microorganisms for oil production for biodiesel application. This was based on a number of criteria, viz. oil concentration, oil content, oil production rate and yield, substrate consumption rate, fatty acid composition, biomass harvesting cost, and nutrient cost. PROMETHEE selected A. oryzae, M. plumbeus and R. mucilaginosa as the most prospective species for oil production. However, further analysis by GAIA Webs identified A. oryzae and M. plumbeus as the best performing microorganisms.
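
    The PROMETHEE-style outranking used above can be sketched as net outranking flows over pairwise comparisons. This is a simplified illustration (PROMETHEE II with the "usual" binary preference function); the alternatives' scores and the equal weights are hypothetical, not the study's data or criteria.

```python
def promethee_net_flows(scores, weights):
    """PROMETHEE II net outranking flows with the usual (binary) preference function.

    scores  : {alternative: [score per criterion]}, higher is better
    weights : criterion weights summing to 1
    """
    names = list(scores)
    n = len(names)

    def pref(a, b):  # aggregated preference index pi(a, b)
        return sum(w for w, sa, sb in zip(weights, scores[a], scores[b]) if sa > sb)

    # Net flow = mean of (how strongly a outranks others) - (how strongly it is outranked)
    return {a: sum(pref(a, b) - pref(b, a) for b in names if b != a) / (n - 1)
            for a in names}

# Hypothetical alternatives scored on two equally weighted criteria
# (e.g. oil yield and substrate consumption rate -- illustrative numbers only).
flows = promethee_net_flows(
    {"A. oryzae": [3, 3], "M. plumbeus": [2, 1], "R. mucilaginosa": [1, 2]},
    weights=[0.5, 0.5])
ranking = sorted(flows, key=flows.get, reverse=True)
print(ranking[0])  # A. oryzae
```

    Real PROMETHEE applications typically use graded preference functions with indifference and preference thresholds per criterion, but the pairwise-flow structure is the same.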

  16. Symmetric subspace learning for image analysis.

    PubMed

    Papachristou, Konstantinos; Tefas, Anastasios; Pitas, Ioannis

    2014-12-01

    Subspace learning (SL) is one of the most useful tools for image analysis and recognition. A large number of such techniques have been proposed utilizing a priori knowledge about the data. In this paper, new subspace learning techniques are presented that use symmetry constraints in their objective functions. The rationale behind this idea is to exploit the a priori knowledge that geometrical symmetry appears in several types of data, such as images, objects, faces, and so on. Experiments on artificial, facial expression recognition, face recognition, and object categorization databases highlight the superiority and the robustness of the proposed techniques, in comparison with standard SL techniques.

  17. Autonomous Image Analysis for Future Mars Missions

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Ruzon, M. A.; Bandari, E.; Roush, T. L.

    1999-01-01

    To explore high priority landing sites and to prepare for eventual human exploration, future Mars missions will involve rovers capable of traversing tens of kilometers. However, the current process by which scientists interact with a rover does not scale to such distances. Specifically, numerous command cycles are required to complete even simple tasks, such as pointing the spectrometer at a variety of nearby rocks. In addition, the time required by scientists to interpret image data before new commands can be given and the limited amount of data that can be downlinked during a given command cycle constrain rover mobility and achievement of science goals. Experience with rover tests on Earth supports these concerns. As a result, traverses to science sites identified in orbital images would require numerous science command cycles over a period of many weeks, months or even years, perhaps exceeding rover design life and other constraints. Autonomous onboard science analysis can address these problems in two ways. First, it will allow the rover to preferentially transmit "interesting" images, defined as those likely to have higher science content. Second, the rover will be able to anticipate future commands. For example, a rover might autonomously acquire and return spectra of "interesting" rocks along with a high-resolution image of those rocks in addition to returning the context images in which they were detected. Such approaches, coupled with appropriate navigational software, help to address both the data volume and command cycle bottlenecks that limit rover mobility and science yield. We are developing fast, autonomous algorithms to enable such intelligent on-board decision making by spacecraft. Autonomous algorithms developed to date have the ability to identify rocks and layers in a scene, locate the horizon, and compress multi-spectral image data.
We are currently investigating the possibility of reconstructing a 3D surface from a sequence of images.
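
    One simple way to rank images for preferential downlink is to score each by the Shannon entropy of its gray-level histogram as a crude proxy for visual complexity. This is only an illustrative heuristic, not the authors' onboard algorithm; the image names and budget are hypothetical.

```python
import numpy as np

def histogram_entropy(image, levels=256):
    """Shannon entropy (bits) of an image's gray-level histogram --
    a crude, illustrative proxy for 'science content'."""
    counts = np.bincount(image.ravel(), minlength=levels)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def prioritize(images, budget):
    """Return the `budget` highest-entropy images, most 'interesting' first."""
    return sorted(images, key=lambda name: histogram_entropy(images[name]),
                  reverse=True)[:budget]

flat = np.zeros((8, 8), dtype=int)        # featureless frame: entropy 0 bits
varied = np.arange(64).reshape(8, 8)      # 64 distinct levels: entropy 6 bits
print(prioritize({"flat": flat, "varied": varied}, budget=1))  # ['varied']
```

    A flight system would combine several detectors (rocks, layers, horizon) rather than a single statistic, but the downlink-budget selection step has this shape.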

  18. Morphological analysis of infrared images for waterjets

    NASA Astrophysics Data System (ADS)

    Gong, Yuxin; Long, Aifang

    2013-03-01

    High-speed waterjets have been widely used in industry and investigated as a model of free shearing turbulence. This paper presents an investigation involving flow visualization of a high-speed waterjet; noise reduction of the raw thermogram using a high-pass morphological filter and a median filter; image enhancement using a white top-hat filter; and image segmentation using a multiple-thresholding method. The image processing results from the designed morphological (top-hat) filters proved ideal for further quantitative and in-depth analysis, and the filters can be used as a new morphological filter bank of general use for analogous work.
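
    The white top-hat transform mentioned above is the signal minus its morphological opening (erosion followed by dilation), which keeps bright features narrower than the structuring element while suppressing the background. A minimal 1-D sketch in plain numpy with a flat width-3 structuring element (the paper's actual filters and element sizes are not reproduced here):

```python
import numpy as np

def erode(x, size=3):
    """Grayscale erosion with a flat structuring element (edge-padded)."""
    pad = size // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([xp[i:i + size].min() for i in range(len(x))])

def dilate(x, size=3):
    """Grayscale dilation with a flat structuring element (edge-padded)."""
    pad = size // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([xp[i:i + size].max() for i in range(len(x))])

def white_top_hat(x, size=3):
    """Signal minus its morphological opening: isolates narrow bright features."""
    return x - dilate(erode(x, size), size)

profile = np.array([1, 1, 5, 1, 1])   # flat background with one bright spike
print(white_top_hat(profile))          # [0 0 4 0 0]
```

    On a 2-D thermogram the same idea applies with a 2-D structuring element; the background level (1 here) is removed and only the narrow bright spike survives.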

  19. Image sequence analysis workstation for multipoint motion analysis

    NASA Astrophysics Data System (ADS)

    Mostafavi, Hassan

    1990-08-01

    This paper describes an application-specific engineering workstation designed and developed to analyze the motion of objects from video sequences. The system combines the software and hardware environment of a modern graphics-oriented workstation with digital image acquisition, processing and display techniques. In addition to automation and increased throughput of data reduction tasks, the objective of the system is to provide less invasive methods of measurement by offering the ability to track objects that are more complex than reflective markers. Grey-level image processing and spatial/temporal adaptation of the processing parameters are used for location and tracking of more complex features of objects under uncontrolled lighting and background conditions. The applications of such an automated and noninvasive measurement tool include analysis of the trajectory and attitude of rigid bodies such as human limbs, robots, and aircraft in flight. The system's key features are: 1) acquisition and storage of image sequences by digitizing and storing real-time video; 2) computer-controlled movie loop playback, freeze-frame display, and digital image enhancement; 3) multiple leading-edge tracking, in addition to object centroids, at up to 60 fields per second from either live input video or a stored image sequence; 4) model-based estimation and tracking of the six degrees of freedom of a rigid body; 5) field-of-view and spatial calibration; 6) image sequence and measurement database management; and 7) offline analysis software for trajectory plotting and statistical analysis.

  20. Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox

    PubMed Central

    Lacerda, Luis Miguel; Ferreira, Hugo Alexandre

    2015-01-01

    Aim. In recent years, connectivity studies using neuroimaging data have increased the understanding of the organization of large-scale structural and functional brain networks. However, data analysis is time consuming as rigorous procedures must be assured, from structuring data and pre-processing to modality-specific data procedures. Until now, no single toolbox was able to perform such investigations on truly multimodal image data from beginning to end, including the combination of different connectivity analyses. Thus, we have developed the Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox with the goal of diminishing time waste in data processing and allowing an innovative and comprehensive approach to brain connectivity. Materials and Methods. The MIBCA toolbox is a fully automated all-in-one connectivity toolbox that offers pre-processing, connectivity and graph theoretical analyses of multimodal image data such as diffusion-weighted imaging, functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). It was developed in the MATLAB environment and pipelines well-known neuroimaging software such as Freesurfer, SPM, FSL, and Diffusion Toolkit. It further implements routines for the construction of structural, functional and effective or combined connectivity matrices, as well as routines for the extraction and calculation of imaging and graph-theory metrics, the latter also using functions from the Brain Connectivity Toolbox. Finally, the toolbox performs group statistical analysis and enables data visualization in the form of matrices, 3D brain graphs and connectograms. In this paper the MIBCA toolbox is presented by illustrating its capabilities using multimodal image data from a group of 35 healthy subjects (19–73 years old) with volumetric T1-weighted, diffusion tensor imaging, and resting state fMRI data, and 10 subjects with 18F-Altanserin PET data. Results. It was observed both a high inter-hemispheric symmetry

  1. Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox.

    PubMed

    Ribeiro, Andre Santos; Lacerda, Luis Miguel; Ferreira, Hugo Alexandre

    2015-01-01

    Aim. In recent years, connectivity studies using neuroimaging data have increased the understanding of the organization of large-scale structural and functional brain networks. However, data analysis is time consuming as rigorous procedures must be assured, from structuring data and pre-processing to modality-specific data procedures. Until now, no single toolbox was able to perform such investigations on truly multimodal image data from beginning to end, including the combination of different connectivity analyses. Thus, we have developed the Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox with the goal of diminishing time waste in data processing and allowing an innovative and comprehensive approach to brain connectivity. Materials and Methods. The MIBCA toolbox is a fully automated all-in-one connectivity toolbox that offers pre-processing, connectivity and graph theoretical analyses of multimodal image data such as diffusion-weighted imaging, functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). It was developed in the MATLAB environment and pipelines well-known neuroimaging software such as Freesurfer, SPM, FSL, and Diffusion Toolkit. It further implements routines for the construction of structural, functional and effective or combined connectivity matrices, as well as routines for the extraction and calculation of imaging and graph-theory metrics, the latter also using functions from the Brain Connectivity Toolbox. Finally, the toolbox performs group statistical analysis and enables data visualization in the form of matrices, 3D brain graphs and connectograms. In this paper the MIBCA toolbox is presented by illustrating its capabilities using multimodal image data from a group of 35 healthy subjects (19-73 years old) with volumetric T1-weighted, diffusion tensor imaging, and resting state fMRI data, and 10 subjects with 18F-Altanserin PET data. Results. 
It was observed both a high inter-hemispheric symmetry and

  2. Scalable histopathological image analysis via active learning.

    PubMed

    Zhu, Yan; Zhang, Shaoting; Liu, Wei; Metaxas, Dimitris N

    2014-01-01

    Training an effective and scalable system for medical image analysis usually requires a large amount of labeled data, which incurs a tremendous annotation burden for pathologists. Recent progress in active learning can alleviate this issue, leading to a great reduction in the labeling cost without sacrificing prediction accuracy too much. However, most existing active learning methods disregard the "structured information" that may exist in medical images (e.g., data from individual patients), and make a simplifying assumption that unlabeled data is independently and identically distributed. Both may not be suitable for real-world medical images. In this paper, we propose a novel batch-mode active learning method which explores and leverages such structured information in annotations of medical images to enforce diversity among the selected data, therefore maximizing the information gain. We formulate the active learning problem as an adaptive submodular function maximization problem subject to a partition matroid constraint, and further present an efficient greedy algorithm to achieve a good solution with a theoretically proven bound. We demonstrate the efficacy of our algorithm on thousands of histopathological images of breast microscopic tissues. PMID:25320821
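
    The partition-matroid idea above can be illustrated with a toy greedy selection: pick the highest-gain unlabeled samples while allowing at most one sample per patient, which forces diversity across patients. The slide names, uncertainty scores and the simple (modular) gain are hypothetical stand-ins for the paper's adaptive submodular objective.

```python
def greedy_partition_select(uncertainty, patient_of, per_patient=1, batch=2):
    """Greedily pick a batch of samples maximizing uncertainty, subject to a
    partition matroid: at most `per_patient` samples from each patient."""
    chosen, used = [], {}
    # Visit candidates in order of marginal gain (here simply their uncertainty).
    for sample in sorted(uncertainty, key=uncertainty.get, reverse=True):
        p = patient_of[sample]
        if used.get(p, 0) < per_patient:       # matroid feasibility check
            chosen.append(sample)
            used[p] = used.get(p, 0) + 1
        if len(chosen) == batch:
            break
    return chosen

# Hypothetical unlabeled slides: two from patient 1, one from patient 2.
u = {"slide_a": 0.9, "slide_b": 0.8, "slide_c": 0.4}
patients = {"slide_a": 1, "slide_b": 1, "slide_c": 2}
print(greedy_partition_select(u, patients))  # ['slide_a', 'slide_c']
```

    Note that the second-most-uncertain slide is skipped because its patient is already represented; for submodular objectives this greedy scheme carries the approximation guarantee the abstract refers to.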

  3. The synthesis and analysis of color images

    NASA Technical Reports Server (NTRS)

    Wandell, B. A.

    1985-01-01

    A method is described for performing the synthesis and analysis of digital color images. The method is based on two principles. First, image data are represented with respect to the separate physical factors, surface reflectance and the spectral power distribution of the ambient light, that give rise to the perceived color of an object. Second, the encoding is made efficient by using a basis expansion for the surface spectral reflectance and spectral power distribution of the ambient light that takes advantage of the high degree of correlation across the visible wavelengths normally found in such functions. Within this framework, the same basic methods can be used to synthesize image data for color display monitors and printed materials, and to analyze image data into estimates of the spectral power distribution and surface spectral reflectances. The method can be applied to a variety of tasks. Examples of applications include the color balancing of color images, and the identification of material surface spectral reflectance when the lighting cannot be completely controlled.

  4. Pain related inflammation analysis using infrared images

    NASA Astrophysics Data System (ADS)

    Bhowmik, Mrinal Kanti; Bardhan, Shawli; Das, Kakali; Bhattacharjee, Debotosh; Nath, Satyabrata

    2016-05-01

    Medical Infrared Thermography (MIT) offers a potential non-invasive, non-contact and radiation-free imaging modality for assessment of abnormal, painful inflammation in the human body. The assessment of inflammation mainly depends on the emission of heat from the skin surface. Arthritis is a disease of joint damage that generates inflammation in one or more anatomical joints of the body. Osteoarthritis (OA) is the most frequently appearing form of arthritis, and rheumatoid arthritis (RA) is the most threatening form. In this study, inflammatory analysis has been performed on infrared images of patients suffering from RA and OA. For the analysis, a dataset of 30 bilateral knee thermograms has been captured from patients with RA and OA by following a thermogram acquisition standard. The thermograms are pre-processed, and areas of interest are extracted for further processing. The investigation of the spread of inflammation is performed along with statistical analysis of the pre-processed thermograms. The objectives of the study include: i) generation of a novel thermogram acquisition standard for inflammatory pain disease; ii) analysis of the spread of the inflammation related to RA and OA using K-means clustering; iii) first- and second-order statistical analysis of pre-processed thermograms. The conclusion reflects that, in most cases, RA-oriented inflammation affects both knees, whereas inflammation related to OA is present in a single knee. Also, due to the spread of inflammation in OA, contralateral asymmetries are detected through the statistical analysis.
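
    K-means clustering of per-pixel temperatures, as used above to separate inflamed from background regions, reduces in one dimension to Lloyd's algorithm on scalar values. A minimal pure-Python sketch with fixed initial centers and hypothetical knee-region temperatures (the study's actual segmentation parameters are not reproduced):

```python
def kmeans_1d(values, centers, iters=20):
    """Lloyd's algorithm on scalar values (e.g. per-pixel temperatures)."""
    centers = list(centers)
    for _ in range(iters):
        # Assignment step: each value joins its nearest center.
        clusters = [[] for _ in centers]
        for v in values:
            idx = min(range(len(centers)), key=lambda k: abs(v - centers[k]))
            clusters[idx].append(v)
        # Update step: each center moves to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[k]
                   for k, c in enumerate(clusters)]
    return centers

# Hypothetical knee temperatures (deg C): cool background vs warm inflamed area.
temps = [30.1, 30.4, 30.2, 36.8, 37.1, 36.9]
print([round(c, 2) for c in kmeans_1d(temps, centers=[30.0, 37.0])])  # [30.23, 36.93]
```

    The warmer cluster's extent and its left-right asymmetry are then what distinguish bilateral (RA-like) from unilateral (OA-like) inflammation patterns.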

  5. Weighting of Criteria for Disease Prioritization Using Conjoint Analysis and Based on Health Professional and Student Opinion

    PubMed Central

    Stebler, Nadine; Schuepbach-Regula, Gertraud; Braam, Peter; Falzon, Laura Cristina

    2016-01-01

    Disease prioritization exercises have been used by several organizations to inform surveillance and control measures. Though most methodologies for disease prioritization are based on expert opinion, it is becoming more common to include different stakeholders in the prioritization exercise. This study was performed to compare the weighting of disease criteria, and the consequent prioritization of zoonoses, by both health professionals and students in Switzerland using a Conjoint Analysis questionnaire. The health professionals comprised public health and food safety experts, cantonal physicians and cantonal veterinarians, while the student group comprised first-year veterinary and agronomy students. Eight criteria were selected for this prioritization based on expert elicitation and literature review. These criteria, described on a 3-tiered scale, were evaluated through a choice-based Conjoint Analysis questionnaire with 25 choice tasks. Questionnaire results were analyzed to obtain importance scores (for each criterion) and mean utility values (for each criterion level), and the latter were then used to rank 16 zoonoses. While the most important criterion for both groups was “Severity of the disease in humans”, the second ranked criteria by the health professionals and students were “Economy” and “Treatment in humans”, respectively. Regarding the criterion “Control and Prevention”, health professionals tended to prioritize a disease when the control and preventive measures were described to be 95% effective, while students prioritized a disease if there were almost no control and preventive measures available. Bovine Spongiform Encephalopathy was the top-ranked disease by both groups. Health professionals and students agreed on the weighting of certain criteria such as “Severity” and “Treatment of disease in humans”, but disagreed on others such as “Economy” or “Control and Prevention”. Nonetheless, the overall disease ranking
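
    In conjoint analysis, once mean part-worth utilities are estimated for each criterion level, a disease's overall score is the sum of the utilities of its levels, and diseases are ranked by that total. The utilities, criteria and disease profiles below are hypothetical illustrations, not the study's estimates:

```python
# Hypothetical part-worth utilities per criterion level (as would come from a
# fitted conjoint model); levels are on a 3-tier scale as in the study.
utilities = {
    "severity_humans":  {"low": -0.8, "medium": 0.1, "high": 0.7},
    "treatment_humans": {"good": -0.3, "partial": 0.0, "none": 0.3},
}

def disease_score(profile):
    """Total utility of a disease = sum of the part-worths of its levels."""
    return sum(utilities[crit][level] for crit, level in profile.items())

diseases = {
    "BSE":           {"severity_humans": "high",   "treatment_humans": "none"},
    "Salmonellosis": {"severity_humans": "medium", "treatment_humans": "good"},
}
ranking = sorted(diseases, key=lambda d: disease_score(diseases[d]), reverse=True)
print(ranking)  # ['BSE', 'Salmonellosis']
```

    Note the sign convention: an unavailable treatment carries positive utility for prioritization, since untreatable diseases warrant more surveillance attention.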

  6. Weighting of Criteria for Disease Prioritization Using Conjoint Analysis and Based on Health Professional and Student Opinion.

    PubMed

    Stebler, Nadine; Schuepbach-Regula, Gertraud; Braam, Peter; Falzon, Laura Cristina

    2016-01-01

    Disease prioritization exercises have been used by several organizations to inform surveillance and control measures. Though most methodologies for disease prioritization are based on expert opinion, it is becoming more common to include different stakeholders in the prioritization exercise. This study was performed to compare the weighting of disease criteria, and the consequent prioritization of zoonoses, by both health professionals and students in Switzerland using a Conjoint Analysis questionnaire. The health professionals comprised public health and food safety experts, cantonal physicians and cantonal veterinarians, while the student group comprised first-year veterinary and agronomy students. Eight criteria were selected for this prioritization based on expert elicitation and literature review. These criteria, described on a 3-tiered scale, were evaluated through a choice-based Conjoint Analysis questionnaire with 25 choice tasks. Questionnaire results were analyzed to obtain importance scores (for each criterion) and mean utility values (for each criterion level), and the latter were then used to rank 16 zoonoses. While the most important criterion for both groups was "Severity of the disease in humans", the second ranked criteria by the health professionals and students were "Economy" and "Treatment in humans", respectively. Regarding the criterion "Control and Prevention", health professionals tended to prioritize a disease when the control and preventive measures were described to be 95% effective, while students prioritized a disease if there were almost no control and preventive measures available. Bovine Spongiform Encephalopathy was the top-ranked disease by both groups. Health professionals and students agreed on the weighting of certain criteria such as "Severity" and "Treatment of disease in humans", but disagreed on others such as "Economy" or "Control and Prevention". 
Nonetheless, the overall disease ranking lists were similar, and these may be

  8. Quantitative image analysis of celiac disease

    PubMed Central

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  9. Quantitative image analysis of celiac disease.

    PubMed

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-03-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients.

  10. Characterisation of mycelial morphology using image analysis.

    PubMed

    Paul, G C; Thomas, C R

    1998-01-01

    Image analysis is now well established in quantifying and characterising microorganisms from fermentation samples. In filamentous fermentations it has become an invaluable tool for characterising complex mycelial morphologies, although it is not yet used extensively in industry. Recent method developments include characterisation of spore germination from the inoculum stage and of the subsequent dispersed and pellet forms. Further methods include characterising vacuolation and simple structural differentiation of mycelia, also from submerged cultures. Image analysis can provide better understanding of the development of mycelial morphology, of the physiological states of the microorganisms in the fermenter, and of their interactions with the fermentation conditions. This understanding should lead to improved design and operation of mycelial fermentations. PMID:9468800

  11. Evaluation of 3D multimodality image registration using receiver operating characteristic (ROC) analysis

    NASA Astrophysics Data System (ADS)

    Holton Tainter, Kerrie S.; Robb, Richard A.; Taneja, Udita; Gray, Joel E.

    1995-04-01

    Receiver operating characteristic (ROC) analysis has evolved as a useful method for evaluating the discriminatory capability and efficacy of visualization. The ability of such analysis to account for the variance in decision criteria of multiple observers, multiple readings, and a wide range of difficulty in detection among case studies makes ROC especially useful for interpreting the results of a viewing experiment. We are currently using ROC analysis to evaluate the effectiveness of using fused multispectral, or complementary multimodality, imaging data in the diagnostic process. The use of multispectral image recordings, gathered from multiple imaging modalities, to provide advanced image visualization and quantization capabilities in evaluating medical images is an important challenge facing medical imaging scientists. Such capabilities would potentially significantly enhance the ability of clinicians to extract scientific and diagnostic information from images. A first step in the effective use of multispectral information is the spatial registration of complementary image datasets so that a point-to-point correspondence exists between them. We are developing a paradigm for measuring the accuracy of existing image registration techniques which includes the ability to relate quantitative measurements, taken from the images themselves, to the decisions made by observers about the state of registration (SOR) of the 3D images. We have used ROC analysis to evaluate the ability of observers to discriminate between correctly registered and incorrectly registered multimodality fused images. We believe this experience is original and represents the first time that ROC analysis has been used to evaluate registered/fused images. We have simulated low-resolution and high-resolution images from real patient MR images of the brain, and fused them with the original MR to produce colorwash superposition images whose exact SOR is known. 
We have also attempted to extend this analysis to
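
    The core ROC summary statistic in such a viewing experiment is the area under the curve, which equals the probability that a truly registered case receives a higher observer rating than a misregistered one (the Mann-Whitney interpretation, with ties counting half). A sketch with hypothetical observer ratings:

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive outranks a random negative
    (ties count as half)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical observer confidence ratings (1-5) that an image pair is
# correctly registered: "pos" = truly registered, "neg" = misregistered.
pos = [5, 4, 4, 3]
neg = [2, 3, 1, 2]
print(auc(pos, neg))  # 0.96875
```

    An AUC near 1.0 means observers reliably distinguish the two states of registration; 0.5 means the fused display gives them no discriminatory power.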

  12. Machine learning for medical images analysis.

    PubMed

    Criminisi, A

    2016-10-01

    This article discusses the application of machine learning for the analysis of medical images. Specifically: (i) We show how a special type of learning models can be thought of as automatically optimized, hierarchically-structured, rule-based algorithms, and (ii) We discuss how the issue of collecting large labelled datasets applies to both conventional algorithms as well as machine learning techniques. The size of the training database is a function of model complexity rather than a characteristic of machine learning methods.

  13. Analysis of Handling Qualities Design Criteria for Active Inceptor Force-Feel Characteristics

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos A.; Lusardi, Jeff A.

    2013-01-01

    ratio. While these two studies produced boundaries for acceptable/unacceptable stick dynamics for rotorcraft, they were not able to provide guidance on how variations of the stick dynamics in the acceptable region impact handling qualities. More recently, a ground-based simulation study [5] suggested little benefit was to be obtained from variations of the damping ratio for a side-stick controller exhibiting high natural frequencies (greater than 17 rad/s) and damping ratios (greater than 2.0). A flight test campaign was conducted concurrently on the RASCAL JUH-60A in-flight simulator and the ACT/FHS EC-135 in-flight simulator [6]. Upon detailed analysis of the pilot evaluations, the study identified a clear preference for a high damping ratio and natural frequency of the center-stick inceptors. Side-stick controllers were found to be less sensitive to the damping. While these studies have compiled a substantial amount of data, in the form of qualitative and quantitative pilot opinion, a fundamental analysis of the effect of the inceptor force-feel system on flight control is found to be lacking. The study of Ref. [6] specifically concluded that a systematic analysis was necessary, since discrepancies with the assigned handling qualities showed that proposed analytical design metrics, or criteria, were not suitable. The overall goal of the present study is to develop a clearer fundamental understanding of the underlying mechanisms associated with the inceptor dynamics that govern the handling qualities using a manageable analytical methodology.

  14. Automated image analysis method to detect and quantify macrovesicular steatosis in human liver hematoxylin and eosin-stained histology images

    PubMed Central

    Nativ, Nir I.; Chen, Alvin I.; Yarmush, Gabriel; Henry, Scot D.; Lefkowitch, Jay H.; Klein, Kenneth M.; Maguire, Timothy J.; Schloss, Rene; Guarrera, James V.; Berthiaume, Francois; Yarmush, Martin L.

    2014-01-01

    Large-droplet macrovesicular steatosis (ld-MaS) in over 30% of the liver graft hepatocytes is a major risk factor in liver transplantation. An accurate assessment of ld-MaS percentage is crucial to determine liver graft transplantability, which is currently based on pathologists’ evaluations of hematoxylin and eosin (H&E) stained liver histology specimens, with the predominant criteria being the lipid droplets’ (LDs) relative size and their propensity to displace the hepatocyte’s nucleus to the cell periphery. Automated image analysis systems aimed at objectively and reproducibly quantifying ld-MaS do not accurately differentiate large LDs from small-droplet macrovesicular steatosis (sd-MaS) and do not take into account LD-mediated nuclear displacement, leading to poor correlation with pathologists’ assessment. Here we present an improved image analysis method that incorporates nuclear displacement as a key image feature to segment and classify ld-MaS from H&E stained liver histology slides. More than 52,000 LDs in 54 digital images from 9 patients were analyzed, and the performance of the proposed method was compared against that of current image analysis methods and the ld-MaS percentage evaluations of two trained pathologists from different centers. We show that combining nuclear displacement and LD size information significantly improves the separation between large and small macrovesicular LDs (specificity=93.7%, sensitivity=99.3%) and the correlation with the pathologists’ ld-MaS percentage assessment (R2=0.97). This performance vastly exceeds that of other automated image analyzers, which typically underestimate or overestimate the pathologists’ ld-MaS score. This work demonstrates the potential of automated ld-MaS analysis in monitoring the steatotic state of livers. The image analysis principles demonstrated here may help standardize ld-MaS scores among centers and ultimately help in the process of determining liver graft transplantability

  15. Image analysis of blood platelets adhesion.

    PubMed

    Krízová, P; Rysavá, J; Vanícková, M; Cieslar, P; Dyr, J E

    2003-01-01

    Adhesion of blood platelets is one of the major events in haemostatic and thrombotic processes. We studied adhesion of blood platelets on fibrinogen and fibrin dimer sorbed on solid support material (glass, polystyrene). Adhesion was carried out under static and dynamic conditions and measured as the percentage of the surface covered with platelets. Within a range of platelet counts in normal and in thrombocytopenic blood, we observed a very significant decrease in platelet adhesion on fibrin dimer with bound active thrombin as the platelet count decreased. Our results show the imperative of using platelet-poor blood preparations as control samples in experiments with thrombocytopenic blood. Experiments carried out on adhesive surfaces sorbed on polystyrene showed lower relative inaccuracy than on glass. The markedly different behaviour of platelets adhered to the same adhesive surface, differing only in support material (glass or polystyrene), suggests that adhesion, and mainly spreading, of platelets depends on the physical quality of the surface. While on polystyrene there were no significant differences between fibrin dimer and fibrinogen, adhesion measured on glass support material differed markedly between fibrin dimer and fibrinogen. We compared two methods of thresholding in image analysis of adhered platelets. Results obtained by image analysis of spread platelets showed higher relative inaccuracy than results obtained by image analysis of platelet centres and aggregates.
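    The coverage metric used above is straightforward once the image is thresholded. A minimal Python sketch, with an invented 6x6 grayscale sample and an arbitrary threshold (both are illustrative assumptions, not the study's data):

    ```python
    # Binarize a grayscale image with a fixed threshold and report the
    # percentage of "covered" (bright) pixels, as in the adhesion measurement.

    def percent_coverage(image, threshold):
        """Fraction of pixels at or above `threshold`, as a percentage."""
        pixels = [p for row in image for p in row]
        covered = sum(1 for p in pixels if p >= threshold)
        return 100.0 * covered / len(pixels)

    # Synthetic image: bright patches stand in for adhered platelets.
    sample = [
        [200, 210,  30,  20,  25, 190],
        [205, 215,  28,  22,  24, 195],
        [ 30,  25,  20,  21,  23,  26],
        [ 28,  24, 180, 185,  22,  25],
        [ 27,  23, 182, 188,  21,  24],
        [ 26,  22,  20,  19,  18,  17],
    ]

    coverage = percent_coverage(sample, threshold=128)
    print(f"{coverage:.1f}% of the surface is covered")
    ```

    The study's finding that results depend on the thresholding method corresponds here to the choice of `threshold`: segmenting whole spread platelets versus only platelet centres amounts to different binarization rules over the same image.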

  16. Reticle defect sizing of optical proximity correction defects using SEM imaging and image analysis techniques

    NASA Astrophysics Data System (ADS)

    Zurbrick, Larry S.; Wang, Lantian; Konicek, Paul; Laird, Ellen R.

    2000-07-01

    Sizing of programmed defects on optical proximity correction (OPC) features is addressed using high-resolution scanning electron microscope (SEM) images and image analysis techniques. A comparison and analysis of different sizing methods is made. This paper addresses the issues of OPC defect definition and discusses the experimental measurement results obtained by SEM in combination with image analysis techniques.

  17. Multispectral laser imaging for advanced food analysis

    NASA Astrophysics Data System (ADS)

    Senni, L.; Burrascano, P.; Ricci, M.

    2016-07-01

    A hardware-software apparatus for food inspection capable of realizing multispectral NIR laser imaging at four different wavelengths is herein discussed. The system was designed to operate in a through-transmission configuration to detect the presence of unwanted foreign bodies inside samples, whether packed or unpacked. A modified Lock-In technique was employed to counterbalance the significant signal intensity attenuation due to transmission across the sample and to extract the multispectral information more efficiently. The NIR laser wavelengths used to acquire the multispectral images can be varied to deal with different materials and to focus on specific aspects. In the present work the wavelengths were selected after a preliminary analysis to enhance the image contrast between foreign bodies and food in the sample, thus identifying the location and nature of the defects. Experimental results obtained from several specimens, with and without packaging, are presented and the multispectral image processing as well as the achievable spatial resolution of the system are discussed.

  18. Criteria for the use of regression analysis for remote sensing of sediment and pollutants

    NASA Technical Reports Server (NTRS)

    Whitlock, C. H.; Kuo, C. Y.; Lecroy, S. R.

    1982-01-01

    An examination of the limitations, requirements, and precision of the linear multiple-regression technique for quantification of marine environmental parameters is conducted. Both environmental and optical physics conditions have been defined for which an exact solution to the signal response equations is of the same form as the multiple regression equation. Various statistical parameters are examined to define criteria for the selection of an unbiased fit when upwelled radiance values contain error and are correlated with each other. Field experimental data are examined to define data smoothing requirements in order to satisfy the criteria of Daniel and Wood (1971). Recommendations are made concerning improved selection of ground-truth locations to maximize variance and to minimize physical errors associated with the remote sensing experiment.
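    The multiple-regression step referred to above can be sketched with ordinary least squares via the normal equations. The two-predictor data below are synthetic illustrations, not radiance values from the study:

    ```python
    # Fit y = b0 + b1*x1 + b2*x2 by ordinary least squares, solving the
    # normal equations (X'X) b = X'y with plain Gaussian elimination.

    def solve(A, b):
        """Gaussian elimination with partial pivoting for a small dense system."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[pivot] = M[pivot], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    def ols_fit(X, y):
        """Least-squares coefficients for rows X, with an intercept prepended."""
        rows = [[1.0] + list(r) for r in X]
        k = len(rows[0])
        XtX = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
        Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
        return solve(XtX, Xty)

    # Synthetic data generated from y = 2 + 0.5*x1 - 1.5*x2 (exact fit).
    X = [(1, 2), (2, 1), (3, 4), (4, 3), (5, 5), (6, 2)]
    y = [2 + 0.5 * x1 - 1.5 * x2 for x1, x2 in X]
    b = ols_fit(X, y)
    print("coefficients:", [round(v, 3) for v in b])
    ```

    The criteria discussed in the abstract concern when such a fit is unbiased: correlated, noisy predictors (here, the upwelled radiance channels) make the recovered coefficients unstable even though the algebra above always returns an answer.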

  19. Analysis of different feature selection criteria based on a covariance convergence perspective for a SLAM algorithm.

    PubMed

    Auat Cheein, Fernando A; Carelli, Ricardo

    2011-01-01

    This paper introduces several non-arbitrary feature selection techniques for a Simultaneous Localization and Mapping (SLAM) algorithm. The feature selection criteria are based on the determination of the most significant features from a SLAM convergence perspective. The SLAM algorithm implemented in this work is a sequential EKF (Extended Kalman Filter) SLAM. The feature selection criteria are applied at the correction stage of the SLAM algorithm, restricting the correction to the most significant features. This restriction also decreases the processing time of the SLAM. Several experiments with a mobile robot are shown in this work. The experiments concern map reconstruction and a comparison of the performance of the different proposed techniques. The experiments were carried out in an outdoor environment composed of trees, although the results shown herein are not restricted to a particular type of feature. PMID:22346568

  20. Analysis of Different Feature Selection Criteria Based on a Covariance Convergence Perspective for a SLAM Algorithm

    PubMed Central

    Auat Cheein, Fernando A.; Carelli, Ricardo

    2011-01-01

    This paper introduces several non-arbitrary feature selection techniques for a Simultaneous Localization and Mapping (SLAM) algorithm. The feature selection criteria are based on the determination of the most significant features from a SLAM convergence perspective. The SLAM algorithm implemented in this work is a sequential EKF (Extended Kalman Filter) SLAM. The feature selection criteria are applied at the correction stage of the SLAM algorithm, restricting the correction to the most significant features. This restriction also decreases the processing time of the SLAM. Several experiments with a mobile robot are shown in this work. The experiments concern map reconstruction and a comparison of the performance of the different proposed techniques. The experiments were carried out in an outdoor environment composed of trees, although the results shown herein are not restricted to a particular type of feature. PMID:22346568
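    One simple way to make a convergence-based selection criterion concrete (a scalar simplification, not the paper's exact EKF formulation) is to rank candidate features by how much a scalar Kalman update with each would shrink the state variance. The prior variance and per-feature noise values below are invented:

    ```python
    # Rank features by the variance reduction a scalar Kalman update yields:
    # posterior variance is P*R/(P+R), so the reduction is P^2/(P+R).
    # Low-noise features are the most "significant" for convergence.

    def variance_reduction(P, R):
        """Drop in state variance from a scalar Kalman update with noise R."""
        return P - (P * R) / (P + R)

    P_prior = 4.0                                         # prior state variance
    feature_noise = {"tree_a": 1.0, "tree_b": 9.0, "tree_c": 0.25}

    ranked = sorted(feature_noise,
                    key=lambda f: variance_reduction(P_prior, feature_noise[f]),
                    reverse=True)
    print("most informative first:", ranked)
    ```

    Correcting with only the top-ranked features, as the paper's criteria do at the correction stage, trades a small loss of information for a proportional cut in per-step processing time.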

  1. Optimal site selection for sitting a solar park using multi-criteria decision analysis and geographical information systems

    NASA Astrophysics Data System (ADS)

    Georgiou, Andreas; Skarlatos, Dimitrios

    2016-07-01

    Among the renewable power sources, solar power is rapidly becoming popular because it is inexhaustible, clean, and dependable. It has also become more efficient since the power conversion efficiency of photovoltaic solar cells has increased. Following these trends, solar power will become more affordable in years to come and considerable investments are to be expected. Regardless of the size of a solar plant, the siting procedure is a crucial factor for its efficiency and financial viability. Many aspects influence such a decision: legal, environmental, technical, and financial, to name a few. This paper describes a general integrated framework to evaluate land suitability for the optimal placement of photovoltaic solar power plants, which is based on a combination of a geographic information system (GIS), remote sensing techniques, and multi-criteria decision-making methods. An application of the proposed framework for the Limassol district in Cyprus is further illustrated. The combination of a GIS and multi-criteria methods produces an excellent analysis tool that creates an extensive database of spatial and non-spatial data, which is used to simplify problems as well as to solve them and to promote the use of multiple criteria. A set of environmental, economic, social, and technical constraints, based on recent Cypriot legislation, European Union policies, and expert advice, identifies the potential sites for solar park installation. The pairwise comparison method in the context of the analytic hierarchy process (AHP) is applied to estimate the criteria weights in order to establish their relative importance in site evaluation. In addition, four different methods to combine information layers and check their sensitivity were used. The first considered all the criteria as being equally important and assigned them equal weight, whereas the others grouped the criteria and graded them according to their objective perceived importance. The overall suitability of the study
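    The AHP pairwise-comparison step mentioned above can be sketched with the common geometric-mean approximation of the principal eigenvector. The 3x3 comparison matrix and the criteria it compares (irradiance, slope, road distance) are invented for illustration, on Saaty's 1-9 scale:

    ```python
    import math

    # Derive criteria weights from a pairwise comparison matrix by taking
    # the geometric mean of each row and normalizing (a standard
    # approximation of the principal-eigenvector weights in AHP).

    def ahp_weights(M):
        """Normalized row geometric means of a pairwise comparison matrix."""
        gm = [math.prod(row) ** (1.0 / len(row)) for row in M]
        total = sum(gm)
        return [g / total for g in gm]

    comparison = [          # irradiance vs slope vs road distance (hypothetical)
        [1.0, 3.0, 5.0],    # irradiance moderately/strongly preferred
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ]

    w = ahp_weights(comparison)
    print("criteria weights:", [round(x, 3) for x in w])
    ```

    Each reciprocal entry encodes one expert judgement ("irradiance is 3 times as important as slope"), and the normalized weights then feed the GIS overlay that scores candidate sites.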

  2. F-106 data summary and model results relative to threat criteria and protection design analysis

    NASA Technical Reports Server (NTRS)

    Pitts, F. L.; Finelli, G. B.; Perala, R. A.; Rudolph, T. H.

    1986-01-01

    The NASA F-106 has acquired considerable data on the rates-of-change of EM parameters on the aircraft surface during 690 direct lightning strikes while penetrating thunderstorms at altitudes from 15,000 to 40,000 feet. The data are presently being used in updating previous lightning criteria and standards. The new lightning standards will, therefore, be the first which reflect actual aircraft responses measured at flight altitudes.

  3. Guidelines as rationing tools: a qualitative analysis of psychosocial patient selection criteria for cardiac procedures

    PubMed Central

    Giacomini, Mita K.; Cook, Deborah J.; Streiner, David L.; Anand, Sonia S.

    2001-01-01

    Background Cardiac procedure guidelines often include psychosocial criteria for selecting patients that potentially introduce social value judgements into clinical decisions and decisions about the rationing of care. The aim of this study was to investigate the terms and justifications for and the meanings of psychosocial patient characteristics used in cardiac procedure guidelines. Methods We selected English-language guidelines published since 1990 and chapters in textbooks published since 1989. These guidelines amalgamated multiple sources of evidence and expertise and made recommendations regarding patient selection for specific procedures. A multidisciplinary team of physicians and social scientists extracted passages regarding psychosocial criteria and developed categories and conceptual relationships to describe and interpret their content. Results Sixty-five papers met the criteria for inclusion in the study. Forty-five (69%) mentioned psychosocial criteria as procedure indications or contraindications. These criteria fell into several categories, including behavioural and psychological issues, relationships with significant others, financial resources, social roles and environmental circumstances. Interpretation Psychosocial characteristics are portrayed as having 2 roles in patient selection: as risk factors intrinsic to the candidate or as indicators of need for special intervention. Guidelines typically simply list psychosocial contraindications without clarifying their specific nature or providing any justification for their use. Psychosocial considerations can help in the evaluation of patients for cardiac procedures, but they become ethically controversial when used to restrict access.
The use of psychosocial indications and contraindications could be improved by more precise descriptions of the psychosocial problem at issue, explanations regarding why the criterion matters and justification of the characteristic using a biological rationale or research

  4. Analysis of expanded criteria to select candidates for active surveillance of low-risk prostate cancer

    PubMed Central

    Jo, Jung Ki; Lee, Han Sol; Lee, Young Ik; Lee, Sang Eun; Hong, Sung Kyu

    2015-01-01

    We aimed to analyze the value of each criterion for clinically insignificant prostate cancer (PCa) in the selection of men for active surveillance (AS) of low-risk PCa. We identified 532 men who were treated with radical prostatectomy from 2006 to 2013 who met 4 or all 5 of the criteria for clinically insignificant PCa (clinical stage ≤ T1, prostate specific antigen [PSA] density ≤ 0.15, biopsy Gleason score ≤ 6, number of positive biopsy cores ≤ 2, and no core with > 50% involvement) and analyzed their pathologic and biochemical outcomes. Patients who met all 5 criteria for clinically insignificant PCa were designated as group A (n = 172), and those who met 4 of 5 criteria were designated as group B (n = 360). The association of each criterion with adverse pathologic features was assessed via logistic regression analyses. Comparison of group A and B and also logistic regression analyses showed that PSA density > 0.15 ng ml−1 and high (≥7) biopsy Gleason score were associated with adverse pathologic features. Higher (> T1c) clinical stage was not associated with any adverse pathologic features. Although ≤ 3 positive cores were not associated with any adverse pathology, ≥4 positive cores were associated with higher risk of extracapsular extension. Among potential candidates for AS, PSA density > 0.15 ng ml−1 and biopsy Gleason score > 6 pose significantly higher risks of harboring more aggressive disease. The eligibility criteria for AS may be expanded to include men with clinical stage T2 tumor and 3 positive cores. PMID:25432498

  5. Analysis of expanded criteria to select candidates for active surveillance of low-risk prostate cancer.

    PubMed

    Jo, Jung Ki; Lee, Han Sol; Lee, Young Ik; Lee, Sang Eun; Hong, Sung Kyu

    2015-01-01

    We aimed to analyze the value of each criterion for clinically insignificant prostate cancer (PCa) in the selection of men for active surveillance (AS) of low-risk PCa. We identified 532 men who were treated with radical prostatectomy from 2006 to 2013 who met 4 or all 5 of the criteria for clinically insignificant PCa (clinical stage ≤ T1, prostate specific antigen [PSA] density ≤ 0.15, biopsy Gleason score ≤ 6, number of positive biopsy cores ≤ 2, and no core with > 50% involvement) and analyzed their pathologic and biochemical outcomes. Patients who met all 5 criteria for clinically insignificant PCa were designated as group A (n = 172), and those who met 4 of 5 criteria were designated as group B (n = 360). The association of each criterion with adverse pathologic features was assessed via logistic regression analyses. Comparison of group A and B and also logistic regression analyses showed that PSA density > 0.15 ng ml-1 and high (≥7) biopsy Gleason score were associated with adverse pathologic features. Higher (> T1c) clinical stage was not associated with any adverse pathologic features. Although ≤ 3 positive cores were not associated with any adverse pathology, ≥4 positive cores were associated with higher risk of extracapsular extension. Among potential candidates for AS, PSA density > 0.15 ng ml-1 and biopsy Gleason score > 6 pose significantly higher risks of harboring more aggressive disease. The eligibility criteria for AS may be expanded to include men with clinical stage T2 tumor and 3 positive cores.

  6. A multiple criteria analysis for household solid waste management in the urban community of Dakar.

    PubMed

    Kapepula, Ka-Mbayu; Colson, Gerard; Sabri, Karim; Thonart, Philippe

    2007-01-01

    Household solid waste management is a severe problem in big cities of developing countries. Mismanaged solid waste dumpsites produce bad sanitary, ecological and economic consequences for the whole population, especially for the poorest urban inhabitants. To address this problem, this paper uses field data collected in the urban community of Dakar with a view to ranking nine areas of the city with respect to multiple criteria of nuisance. Nine criteria are built and organized in three families that represent three classical viewpoints: the production of wastes, their collection and their treatment. Using the PROMETHEE method and the ARGOS software, we perform a pair-wise comparison of the nine areas, which allows their multiple-criteria ranking according to each viewpoint and then globally. Finding the worst and best areas in terms of nuisance for a better waste management in the city is our final purpose, fitting as well as possible the needs of the urban community. Based on field knowledge and on the literature, we suggest applying general and area-specific remedies to the household solid waste problems. PMID:17064885
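    The PROMETHEE pair-wise comparison can be sketched with the simplest ("usual") preference function: an alternative earns a criterion's full weight whenever it strictly beats another on that criterion, and net outranking flows then rank the alternatives. The three areas, their nuisance scores (lower is better), and the weights below are invented stand-ins for the field data:

    ```python
    # PROMETHEE-style net flows with the "usual" preference function:
    # phi(a) = leaving flow - entering flow, averaged over the other
    # alternatives; higher phi means less nuisance overall here.

    def net_flows(scores, weights):
        names = list(scores)
        n = len(names)
        phi = {a: 0.0 for a in names}
        for a in names:
            for b in names:
                if a == b:
                    continue
                # Weighted preference of a over b across all criteria.
                pref = sum(w for w, sa, sb in zip(weights, scores[a], scores[b])
                           if sa < sb)          # lower nuisance score preferred
                phi[a] += pref / (n - 1)
                phi[b] -= pref / (n - 1)
        return phi

    scores = {                 # criteria: production, collection, treatment
        "Area1": (0.2, 0.4, 0.3),
        "Area2": (0.5, 0.1, 0.6),
        "Area3": (0.7, 0.8, 0.2),
    }
    weights = (0.5, 0.3, 0.2)

    phi = net_flows(scores, weights)
    ranking = sorted(phi, key=phi.get, reverse=True)
    print("best to worst:", ranking)
    ```

    Ranking within one criteria family (one viewpoint) versus across all nine criteria, as the paper does, corresponds to running the same computation with different slices of `scores` and `weights`.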

  7. Criteria and indicators for the assessment of community forestry outcomes: a comparative analysis from Canada.

    PubMed

    Teitelbaum, Sara

    2014-01-01

    In Canada, there are few structured evaluations of community forestry despite more than twenty years of practice. This article presents a criteria and indicator framework, designed to elicit descriptive information about the types of socio-economic results being achieved by community forests in the Canadian context. The criteria and indicators framework draws on themes proposed by other researchers both in the field of community forestry and related areas. The framework is oriented around three concepts described as amongst the underlying objectives of community forestry, namely participatory governance, local economic benefits and multiple forest use. This article also presents the results of a field-based application of the criteria and indicators framework, comparing four case studies in three Canadian provinces. All four are community forests with direct tenure rights to manage and benefit from forestry activities. Results reveal that in terms of governance, the case studies adhere to two different models, which we name 'interest group' vs. 'local government'. Stronger participatory dimensions are evident in two case studies. In the area of local economic benefits, the four case studies perform similarly, with some of the strongest benefits being in employment creation, especially for those case studies that offer non-timber activities such as recreation and education. Two of four cases have clearly adopted a multiple-use approach to management.

  8. Nursing image: an evolutionary concept analysis.

    PubMed

    Rezaei-Adaryani, Morteza; Salsali, Mahvash; Mohammadi, Eesa

    2012-12-01

    A long-term challenge to the nursing profession is the concept of image. In this study, we used the Rodgers' evolutionary concept analysis approach to analyze the concept of nursing image (NI). The aim of this concept analysis was to clarify the attributes, antecedents, consequences, and implications associated with the concept. We performed an integrative internet-based literature review to retrieve English literature published from 1980-2011. Findings showed that NI is a multidimensional, all-inclusive, paradoxical, dynamic, and complex concept. The media, invisibility, clothing style, nurses' behaviors, gender issues, and professional organizations are the most important antecedents of the concept. We found that NI is pivotal in staff recruitment and nursing shortage, resource allocation to nursing, nurses' job performance, workload, burnout and job dissatisfaction, violence against nurses, public trust, and salaries available to nurses. An in-depth understanding of the NI concept would assist nurses to eliminate negative stereotypes and build a more professional image for the nurse and the profession. PMID:23343236

  9. Simple Low Level Features for Image Analysis

    NASA Astrophysics Data System (ADS)

    Falcoz, Paolo

    As human beings, we perceive the world around us mainly through our eyes, and give what we see the status of “reality”; as such we historically tried to create ways of recording this reality so we could augment or extend our memory. From early attempts in photography like the image produced in 1826 by the French inventor Nicéphore Niépce (Figure 2.1) to the latest high definition camcorders, the number of recorded pieces of reality increased exponentially, posing the problem of managing all that information. Most of the raw video material produced today has lost its memory augmentation function, as it will hardly ever be viewed by any human; pervasive CCTVs are an example. They generate an enormous amount of data each day, but there is not enough “human processing power” to view them. Therefore the need for effective automatic image analysis tools is great, and a lot of effort has been put into it, both from academia and industry. In this chapter, a review of some of the most important image analysis tools is presented.

  10. System analysis approach to deriving design criteria (Loads) for Space Shuttle and its payloads. Volume 2: Typical examples

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.; Bullock, T.; Holland, W. B.; Kross, D. A.; Kiefling, L. A.

    1981-01-01

    The achievement of an optimized design from the system standpoint under the low-cost, high-risk constraints of the present-day environment is analyzed. The Space Shuttle illustrates the requirement for an analysis approach that considers all major disciplines (coupling between structures, control, propulsion, thermal, aeroelastic, and performance) simultaneously. The Space Shuttle and certain payloads, the Space Telescope and Spacelab, are examined. The requirements for system analysis approaches and criteria, including dynamic modeling requirements, test requirements, control requirements, and the resulting design verification approaches, are illustrated. A survey of the problem, potential approaches available as solutions, implications for future systems, and projected technology development areas are addressed.

  11. System analysis approach to deriving design criteria (loads) for Space Shuttle and its payloads. Volume 1: General statement of approach

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.; Bullock, T.; Holland, W. B.; Kross, D. A.; Kiefling, L. A.

    1981-01-01

    The Space Shuttle, the most complex transportation system designed to date, illustrates the requirement for an analysis approach that considers all major disciplines simultaneously. Its unique cross coupling, high sensitivity to aerodynamic uncertainties, and high performance requirements dictated a less conservative approach than those taken in previous programs. Analyses performed for the Space Shuttle and certain payloads, the Space Telescope and Spacelab, are used as examples. These illustrate the requirements for system analysis approaches and criteria, including dynamic modeling requirements, test requirements, control requirements, and the resulting design verification approaches. A survey of the problem, potential approaches available as solutions, implications for future systems, and projected technology development areas are addressed.

  12. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    PubMed

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted to assure the good quality of waters, soils or sediments and to achieve desirable environmental quality objectives. The evaluation of the reliability of available data is therefore of significant importance when analysing their possible use in the aforementioned processes. The thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws but at the same time various opportunities for improvement. In this context, a new methodology, based on Multi-Criteria Decision Analysis (MCDA) techniques, has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects) in a transparent, quantitative way, through the use of expert knowledge, multiple criteria and fuzzy logic. The proposed methodology can be used for the production of weighted Species Sensitivity Weighted Distributions (SSWD), as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework that is used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but it can be easily tailored to other environmental compartments (soil, air, sediments).

  13. Covariance of Lucky Images: Performance analysis

    NASA Astrophysics Data System (ADS)

    Cagigal, Manuel P.; Valle, Pedro J.; Cagigas, Miguel A.; Villó-Pérez, Isidro; Colodro-Conde, Carlos; Ginski, C.; Mugrauer, M.; Seeliger, M.

    2016-09-01

    The covariance of ground-based Lucky Images (COELI) is a robust and easy-to-use algorithm that allows us to detect faint companions surrounding a host star. In this paper we analyze the influence of the number of processed frames, the frame quality, the atmospheric conditions and the detection noise on the companion detectability. This analysis has been carried out using both experimental and computer-simulated imaging data. Although the technique allows the detection of faint companions, the camera detection noise and the use of a limited number of frames limit the minimum detectable companion intensity to around 1000 times fainter than that of the host star when placed at an angular distance corresponding to the first few Airy rings. The reachable contrast could be even larger when detecting companions with the assistance of an adaptive optics system.
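    The covariance principle can be illustrated with a toy simulation (a deliberate simplification, not the published COELI algorithm): under a shared atmosphere a real companion brightens in the same frames as the host star, so its pixel covaries with the host pixel, while a noise-only pixel does not. All frame data below are simulated:

    ```python
    import random

    # Simulate per-frame intensities for three pixels over many short
    # exposures: the host star, a ~500x fainter companion sharing the
    # atmospheric gain, and a pixel containing only detector noise.

    random.seed(7)

    def covariance(xs, ys):
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

    n_frames = 2000
    gain = [random.gauss(1.0, 0.3) for _ in range(n_frames)]   # seeing variation

    host      = [1000 * g + random.gauss(0, 5) for g in gain]
    companion = [   2 * g + random.gauss(0, 5) for g in gain]  # ~500x fainter
    noise_px  = [           random.gauss(0, 5) for _ in range(n_frames)]

    cov_comp  = covariance(host, companion)
    cov_noise = covariance(host, noise_px)
    print(f"companion-pixel covariance: {cov_comp:.1f}, noise-pixel: {cov_noise:.1f}")
    ```

    Reducing `n_frames` or raising the detector noise widens the spread of the noise-pixel covariance, which is exactly the detectability limit the paper quantifies.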

  14. PAMS photo image retrieval prototype alternatives analysis

    SciTech Connect

    Conner, M.L.

    1996-04-30

    Photography and Audiovisual Services uses a system called the Photography and Audiovisual Management System (PAMS) to perform order entry and billing services. The PAMS system utilizes Revelation Technologies database management software, AREV. Work is currently in progress to link the PAMS AREV system to a Microsoft SQL Server database engine to provide photograph indexing and query capabilities. The link between AREV and SQL Server will use a technique called "bonding." This photograph imaging subsystem will interface to the PAMS system and handle the image capture and retrieval portions of the project. The intent of this alternatives analysis is to examine the software and hardware alternatives available to meet the requirements for this project, and identify a cost-effective solution.

  15. [Imaging Mass Spectrometry in Histopathologic Analysis].

    PubMed

    Yamazaki, Fumiyoshi; Seto, Mitsutoshi

    2015-04-01

    Matrix-assisted laser desorption/ionization (MALDI)-imaging mass spectrometry (IMS) enables visualization of the distribution of a range of biomolecules by integrating biochemical information from mass spectrometry with positional information from microscopy. IMS identifies a target molecule. In addition, IMS enables global analysis of biomolecules containing unknown molecules by detecting the ratio of the molecular weight to electric charge without any target, which makes it possible to identify novel molecules. IMS generates data on the distribution of lipids and small molecules in tissues, which is difficult to visualize with either conventional counter-staining or immunohistochemistry. In this review, we firstly introduce the principle of imaging mass spectrometry and recent advances in the sample preparation method. Secondly, we present findings regarding biological samples, especially pathological ones. Finally, we discuss the limitations and problems of the IMS technique and clinical application, such as in drug development. PMID:26536781

  16. Uses of software in digital image analysis: a forensic report

    NASA Astrophysics Data System (ADS)

    Sharma, Mukesh; Jha, Shailendra

    2010-02-01

    Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. It has wide applications in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. In this paper the authors explain these tasks, which fall into three categories: image compression, image enhancement and restoration, and measurement extraction, illustrated with examples such as signature comparison, counterfeit currency comparison and footwear sole impressions using the software Canvas and Corel Draw.

  17. Wavelet-based image analysis system for soil texture analysis

    NASA Astrophysics Data System (ADS)

    Sun, Yun; Long, Zhiling; Jang, Ping-Rey; Plodinec, M. John

    2003-05-01

    Soil texture is defined as the relative proportion of clay, silt and sand found in a given soil sample. It is an important physical property of soil that affects such phenomena as plant growth and agricultural fertility. Traditional methods used to determine soil texture are either time consuming (hydrometer), or subjective and experience-demanding (field tactile evaluation). Considering that textural patterns observed at soil surfaces are uniquely associated with soil textures, we propose an innovative approach to soil texture analysis, in which wavelet frames-based features representing texture contents of soil images are extracted and categorized by applying a maximum likelihood criterion. The soil texture analysis system has been tested successfully with an accuracy of 91% in classifying soil samples into one of three general categories of soil textures. In comparison with the common methods, this wavelet-based image analysis approach is convenient, efficient, fast, and objective.
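    The maximum-likelihood classification step described above can be sketched by modeling each soil class as a Gaussian over a texture feature. The single "wavelet energy" feature and the class means and deviations below are invented, not fitted to real soil images:

    ```python
    import math

    # Assign a feature value to the class under which it is most likely,
    # using per-class Gaussian models (a maximum-likelihood criterion).

    CLASSES = {              # (mean, std) of the texture feature per class
        "sand": (0.80, 0.05),
        "silt": (0.50, 0.08),
        "clay": (0.20, 0.06),
    }

    def log_likelihood(x, mean, std):
        # Gaussian log-density up to an additive constant.
        return -math.log(std) - 0.5 * ((x - mean) / std) ** 2

    def classify(feature):
        return max(CLASSES, key=lambda c: log_likelihood(feature, *CLASSES[c]))

    print(classify(0.78))    # prints "sand": the value sits near that class mean
    ```

    In the actual system the feature would be a vector of wavelet-frame energies per image patch rather than one scalar, but the decision rule, picking the class with the highest likelihood, is the same.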

  18. Bone feature analysis using image processing techniques.

    PubMed

    Liu, Z Q; Austin, T; Thomas, C D; Clement, J G

    1996-01-01

    In order to establish the correlation between bone structure and age, and to obtain information about age-related bone changes, it is necessary to study microstructural features of human bone. Traditionally, in bone biology and forensic science, the analysis of bone cross-sections has been carried out manually. Such a process is known to be slow, inefficient and prone to human error. Consequently, the results obtained so far have been unreliable. In this paper we present a new approach to quantitative analysis of cross-sections of human bones using digital image processing techniques. We demonstrate that such a system is able to extract various bone features consistently and is capable of providing more reliable data and statistics for bones. Consequently, we will be able to correlate features of bone microstructure with age and possibly also with age-related bone diseases such as osteoporosis. The development of knowledge-based computer vision systems for automated bone image analysis can now be considered feasible.

  19. Soil Surface Roughness through Image Analysis

    NASA Astrophysics Data System (ADS)

    Tarquis, A. M.; Saa-Requejo, A.; Valencia, J. L.; Moratiel, R.; Paz-Gonzalez, A.; Agro-Environmental Modeling

    2011-12-01

    Soil erosion is a complex phenomenon involving the detachment and transport of soil particles, storage and runoff of rainwater, and infiltration. The relative magnitude and importance of these processes depends on several factors, one of them being surface micro-topography, usually quantified through soil surface roughness (SSR). SSR greatly affects surface sealing and runoff generation, yet little information is available about the effect of roughness on the spatial distribution of runoff and on flow concentration. The methods commonly used to measure SSR involve measuring point elevation with a pin roughness meter or a laser, both of which are labor intensive and expensive. Lately, a simple and inexpensive technique based on the percentage of shadow in soil surface images has been developed to determine SSR in the field, with the aim of making the measurement suitable for widespread application. One of the first steps in this technique is image de-noising and thresholding to estimate the percentage of black pixels in the studied area. In this work, a series of soil surface images have been analyzed by applying several wavelet-based de-noising and thresholding algorithms to study the variation in the percentage of shadow and the shadow size distribution. Funding provided by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no. AGL2010-21501/AGR and by Xunta de Galicia through project no. INCITE08PXIB1621 is greatly appreciated.
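    The thresholding step can be illustrated with a minimal sketch. The abstract does not specify which algorithm was used; the version below assumes a plain list of grayscale values and an isodata-style iterative threshold (one common choice), then reports the fraction of "shadow" pixels at or below it.

```python
def isodata_threshold(pixels, tol=0.5):
    """Iterative (isodata) threshold: repeatedly set the threshold to the
    midpoint of the means of the two classes it induces."""
    t = sum(pixels) / len(pixels)
    while True:
        lo = [p for p in pixels if p <= t]
        hi = [p for p in pixels if p > t]
        if not lo or not hi:
            return t
        t_new = (sum(lo) / len(lo) + sum(hi) / len(hi)) / 2
        if abs(t_new - t) < tol:
            return t_new
        t = t_new

def shadow_fraction(pixels):
    """Fraction of pixels classified as shadow (below the threshold)."""
    t = isodata_threshold(pixels)
    return sum(p <= t for p in pixels) / len(pixels)
```

On a de-noised soil surface image, this fraction is the shadow percentage from which SSR is estimated.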

  20. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
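    The Monte Carlo phase can be sketched for a single map cell. This is a simplified illustration, not the paper's method: susceptibility is taken as a weighted linear combination of normalised criterion values, the criteria weights are perturbed uniformly and renormalised, and the spread of the resulting scores measures their sensitivity to weight uncertainty. The perturbation range and sample count are arbitrary.

```python
import random

def wlc(weights, criteria):
    """Weighted linear combination score for one map cell."""
    return sum(w * c for w, c in zip(weights, criteria))

def weight_sensitivity(base_weights, criteria, n=5000, spread=0.1, seed=1):
    """Monte Carlo: perturb the weights, renormalise them to sum to one,
    and collect the mean and standard deviation of the cell score."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n):
        w = [max(0.0, b + rng.uniform(-spread, spread)) for b in base_weights]
        s = sum(w)
        w = [wi / s for wi in w]
        scores.append(wlc(w, criteria))
    mean = sum(scores) / n
    sd = (sum((x - mean) ** 2 for x in scores) / n) ** 0.5
    return mean, sd
```

Repeating this over every cell yields an uncertainty surface that can be compared between the AHP- and OWA-derived weight sets.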

  1. Monotonic correlation analysis of image quality measures for image fusion

    NASA Astrophysics Data System (ADS)

    Kaplan, Lance M.; Burks, Stephen D.; Moore, Richard K.; Nguyen, Quang

    2008-04-01

    The next generation of night vision goggles will fuse image intensified and long wave infra-red to create a hybrid image that will enable soldiers to better interpret their surroundings during nighttime missions. Paramount to the development of such goggles is the exploitation of image quality (IQ) measures to automatically determine the best image fusion algorithm for a particular task. This work introduces a novel monotonic correlation coefficient to investigate how well possible IQ features correlate to actual human performance, which is measured by a perception study. The paper will demonstrate how monotonic correlation can identify worthy features that could be overlooked by traditional correlation values.
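    The paper's monotonic correlation coefficient is novel and not specified in the abstract; as a reference point, Spearman's rank correlation is the standard measure of monotonic (not necessarily linear) association between an IQ feature and measured human performance. The sketch below assumes no tied values.

```python
def ranks(xs):
    """0-based ranks of the values in xs (no tie handling)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the ranks.
    Equals +/-1 for any strictly monotonic relationship."""
    rx, ry = ranks(xs), ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

A nonlinear but monotone relation (e.g. performance vs. a squared feature) scores 1.0 here while an ordinary Pearson coefficient would not, which is the property the paper exploits.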

  2. An analysis of the qualification criteria for small radioactive material shipping packages

    SciTech Connect

    McClure, J.D.

    1983-05-01

    The RAM package design certification process has two important elements, testing and acceptance. These terms sound similar, but they have specific meanings. Qualification testing, in the context of this study, is the imposition of simulated accident test conditions upon the candidate package design. (Normal transportation environments may also be included.) Following qualification testing, the acceptance criteria provide the performance levels which, if demonstrated, indicate the ability of the RAM package to sustain the severity of the qualification testing sequence and yet maintain specified levels of package integrity. This study has used Severities of Transportation Accidents as a data base to examine the regulatory test criteria which are required to be met by small packages containing Type B quantities of radioactive material (RAM). The basic findings indicate that the present regulatory test standards provide significantly higher levels of protection for the surface transportation modes (truck, rail) than for RAM packages shipped by aircraft. It should also be noted that various risk assessment studies have shown that the risk to the public due to severe transport accidents by surface and air transport modes is very low. A key element in this study was the quantification of the severity of the transportation accident environment and of the present regulatory test standards (called qualification test standards in this document) so that a direct comparison could be made between them to assess the effectiveness of the existing qualification test standards. The manner in which this was accomplished is described.

  3. Theoretical analysis on the sampling criteria for time-interleaved photonic analog-to-digital converters

    NASA Astrophysics Data System (ADS)

    Chen, Jianping; Su, Feiran; Wu, Guiling

    2014-11-01

    The time-interleaved photonic analog-to-digital converter (TIPADC) is a promising candidate for processing ultra-wideband signals. In a TIPADC, quantization is electrical in order to obtain a large effective number of bits (ENOB). In this paper, we study signal sampling and reconstruction in the TIPADC from a systematic point of view. The sampling output and frequency response of the system are derived using a model that includes the photonic sampling, demultiplexing, photodetection, electronic quantization and digital processing. The signal sampling and reconstruction mechanism of a TIPADC with a uniform system sampling rate and matched channels is illustrated with the spectrum of the signal at each processing step. The effect of the sampling pulse and back-end electronics on the system frequency response is analyzed in detail. The feasible regions of the system for alias-free sampling in terms of system frequency response, and a set of sampling criteria on the bandwidth of the sampling pulse and back-end electronics, are presented for the TIPADC. We find that the analog bandwidth of a TIPADC can be much higher than the bandwidth of the back-end electronics due to the weighted summing introduced by the multichannel time-interleaved photonic sampling. The proposed model and sampling criteria are validated by simulations under different parameter configurations.

  4. Wavelet Analysis of Space Solar Telescope Images

    NASA Astrophysics Data System (ADS)

    Zhu, Xi-An; Jin, Sheng-Zhen; Wang, Jing-Yu; Ning, Shu-Nian

    2003-12-01

    The scientific satellite SST (Space Solar Telescope) is an important research project strongly supported by the Chinese Academy of Sciences. Every day, SST acquires 50 GB of data (after processing), but only 10 GB can be transmitted to the ground because of the limited time of satellite passage and limited channel capacity. Therefore, the data must be compressed before transmission. Wavelet analysis is a technique developed over the last ten years with great potential for application. We start with a brief introduction to the essential principles of wavelet analysis, and then describe the main idea of embedded zerotree wavelet coding, used for compressing the SST images. The results show that this coding is adequate for the job.
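    Full embedded zerotree wavelet (EZW) coding is considerably more involved than the abstract suggests; the underlying idea, however, is simple: transform the signal into wavelet coefficients, then discard the many small detail coefficients. The 1-D Haar sketch below illustrates this (the real coder works on 2-D images with bit-plane ordering of coefficients).

```python
def haar_1d(signal):
    """Full 1-D Haar decomposition (length must be a power of two):
    the output holds one approximation coefficient followed by details."""
    out = list(signal)
    n = len(out)
    while n > 1:
        avg = [(out[2*i] + out[2*i+1]) / 2 for i in range(n // 2)]
        det = [(out[2*i] - out[2*i+1]) / 2 for i in range(n // 2)]
        out[:n] = avg + det
        n //= 2
    return out

def inverse_haar_1d(coeffs):
    """Exact inverse of haar_1d."""
    out = list(coeffs)
    n = 1
    while n < len(out):
        avg, det = out[:n], out[n:2*n]
        out[:2*n] = [v for a, d in zip(avg, det) for v in (a + d, a - d)]
        n *= 2
    return out

def compress(signal, thresh):
    """Zero out small detail coefficients (the approximation coefficient
    at index 0 is always kept); returns coefficients and the kept count."""
    coeffs = haar_1d(signal)
    c = [v if i == 0 or abs(v) >= thresh else 0.0 for i, v in enumerate(coeffs)]
    kept = sum(1 for v in c if v != 0.0)
    return c, kept
```

Transmitting only the kept coefficients (plus their positions) gives the compression; reconstruction uses the inverse transform on the thresholded coefficients.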

  5. Difference Image Analysis of Galactic Microlensing. I. Data Analysis

    SciTech Connect

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.

    1999-08-20

    This is a preliminary report on the application of Difference Image Analysis (DIA) to Galactic bulge images. The aim of this analysis is to increase the sensitivity to the detection of gravitational microlensing. We discuss how the DIA technique simplifies the process of discovering microlensing events by detecting only objects that have variable flux. We illustrate how the DIA technique is not limited to detection of so-called "pixel lensing" events but can also be used to improve photometry for classical microlensing events by removing the effects of blending. We will present a method whereby DIA can be used to reveal the true unblended colors, positions, and light curves of microlensing events. We discuss the need for a technique to obtain the accurate microlensing timescales from blended sources and present a possible solution to this problem using the existing Hubble Space Telescope color-magnitude diagrams of the Galactic bulge and LMC. The use of such a solution with both classical and pixel microlensing searches is discussed. We show that one of the major causes of systematic noise in DIA is differential refraction. A technique for removing this systematic by effectively registering images to a common air mass is presented. Improvements to commonly used image differencing techniques are discussed. (c) 1999 The American Astronomical Society.
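    The core differencing step can be sketched as follows. This is a deliberately reduced illustration: real DIA pipelines also register the frames and match their point-spread functions, which are omitted here. A reference frame is flux-scaled to the target by least squares, subtracted, and pixels deviating by more than k standard deviations of the residual are flagged as variable.

```python
def difference_image(target, reference, k=5.0):
    """Subtract a flux-scaled reference frame from the target frame and
    flag pixels deviating by more than k sigma of the residual."""
    tflat = [v for row in target for v in row]
    rflat = [v for row in reference for v in row]
    # Least-squares flux scale between the two frames.
    scale = sum(t * r for t, r in zip(tflat, rflat)) / sum(r * r for r in rflat)
    diff = [[t - scale * r for t, r in zip(trow, rrow)]
            for trow, rrow in zip(target, reference)]
    dflat = [v for row in diff for v in row]
    mean = sum(dflat) / len(dflat)
    sigma = (sum((v - mean) ** 2 for v in dflat) / len(dflat)) ** 0.5
    variables = [(i, j) for i, row in enumerate(diff)
                 for j, v in enumerate(row) if abs(v - mean) > k * sigma]
    return diff, variables
```

Everything constant between the two epochs cancels, so only variable-flux objects (such as microlensing events) survive in the difference image.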

  6. Application of risk-based multiple criteria decision analysis for selection of the best agricultural scenario for effective watershed management.

    PubMed

    Javidi Sabbaghian, Reza; Zarghami, Mahdi; Nejadhashemi, A Pouyan; Sharifi, Mohammad Bagher; Herman, Matthew R; Daneshvar, Fariborz

    2016-03-01

    Effective watershed management requires the evaluation of agricultural best management practice (BMP) scenarios which carefully consider the relevant environmental, economic, and social criteria involved. In the Multiple Criteria Decision-Making (MCDM) process, scenarios are first evaluated and then ranked to determine the most desirable outcome for the particular watershed. The main challenge of this process is the accurate identification of the best solution for the watershed in question, despite the various risk attitudes presented by the associated decision-makers (DMs). This paper introduces a novel approach for implementation of the MCDM process based on a comparative neutral-risk/risk-based decision analysis, which results in the selection of the most desirable scenario for use in the entire watershed. At the sub-basin level, each scenario includes multiple BMPs with scores that have been calculated using the criteria derived from two cases of neutral-risk and risk-based decision-making. The simple additive weighting (SAW) operator is applied for neutral-risk decision-making, while the ordered weighted averaging (OWA) and induced OWA (IOWA) operators are effective for risk-based decision-making. At the watershed level, the BMP scores of the sub-basins are aggregated to calculate each scenario's combined goodness measurement; the most desirable scenario for the entire watershed is then selected based on the combined goodness measurements. Our final results illustrate how the choice of operator and the DMs' risk attitudes affect the satisfaction of the relevant criteria across the sub-basins and, ultimately, the final ranking of the given scenarios. The methodology proposed here has been successfully applied to the Honeyoey Creek-Pine Creek watershed in Michigan, USA to evaluate various BMP scenarios and determine the best solution for both the stakeholders and the overall stream health.
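    The two aggregation operators named above differ in one essential way, shown in this minimal sketch: SAW attaches weights to criteria, whereas OWA attaches weights to rank positions of the sorted scores, which is how the decision-maker's risk attitude enters. The example weights are illustrative only.

```python
def saw(weights, scores):
    """Simple additive weighting: each weight belongs to a criterion."""
    return sum(w * s for w, s in zip(weights, scores))

def owa(order_weights, scores):
    """Ordered weighted averaging: each weight belongs to a rank position
    of the scores sorted in descending order. Putting weight on the top
    ranks is optimistic (risk-seeking); on the bottom ranks, pessimistic."""
    return sum(w * s for w, s in zip(order_weights, sorted(scores, reverse=True)))
```

With order weights (1, 0, 0) OWA returns the best criterion score (fully optimistic); with (0, 0, 1) the worst (fully pessimistic); equal order weights recover the plain average.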

  7. The Scientific Image in Behavior Analysis.

    PubMed

    Keenan, Mickey

    2016-05-01

    Throughout the history of science, the scientific image has played a significant role in communication. With recent developments in computing technology, there has been an increase in the kinds of opportunities now available for scientists to communicate in more sophisticated ways. Within behavior analysis, though, we are only just beginning to appreciate the importance of going beyond the printing press to elucidate basic principles of behavior. The aim of this manuscript is to stimulate appreciation of both the role of the scientific image and the opportunities provided by a quick response code (QR code) for enhancing the functionality of the printed page. I discuss the limitations of imagery in behavior analysis ("Introduction"), and I show examples of what can be done with animations and multimedia for teaching philosophical issues that arise when teaching about private events ("Private Events 1 and 2"). Animations are also useful for bypassing ethical issues when showing examples of challenging behavior ("Challenging Behavior"). Each of these topics can be accessed only by scanning the QR code provided. This contingency has been arranged to help the reader embrace this new technology. In so doing, I hope to show its potential for going beyond the limitations of the printing press. PMID:27606187

  8. A review and classification of approaches for dealing with uncertainty in multi-criteria decision analysis for healthcare decisions.

    PubMed

    Broekhuizen, Henk; Groothuis-Oudshoorn, Catharina G M; van Til, Janine A; Hummel, J Marjan; IJzerman, Maarten J

    2015-05-01

    Multi-criteria decision analysis (MCDA) is increasingly used to support decisions in healthcare involving multiple and conflicting criteria. Although uncertainty is usually carefully addressed in health economic evaluations, less is known about whether, how, and with what methods the different sources of uncertainty are dealt with in MCDA. The objective of this study is to review how uncertainty can be explicitly taken into account in MCDA and to discuss which approach may be appropriate for healthcare decision makers. A literature review was conducted in the Scopus and PubMed databases. Two reviewers independently categorized studies according to research areas, the type of MCDA used, and the approach used to quantify uncertainty. Selected full-text articles were read for methodological details. The search strategy identified 569 studies. The five approaches most often identified were fuzzy set theory (45% of studies), probabilistic sensitivity analysis (15%), deterministic sensitivity analysis (31%), Bayesian framework (6%), and grey theory (3%). A large number of papers considered the analytic hierarchy process in combination with fuzzy set theory (31%). Only 3% of studies were published in healthcare-related journals. In conclusion, our review identified five different approaches to take uncertainty into account in MCDA. The deterministic approach is most likely sufficient for most healthcare policy decisions because of its low complexity and straightforward implementation. However, more complex approaches may be needed when multiple sources of uncertainty must be considered simultaneously.
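    The deterministic approach the review favours is simple enough to sketch: sweep one criterion weight across its range, renormalise the others, and watch whether the top-ranked alternative changes (a one-way deterministic sensitivity analysis). The alternatives and weights below are hypothetical; the sketch assumes the fixed weights have a nonzero sum.

```python
def rank_top(weights, alternatives):
    """alternatives: dict mapping name -> list of criterion scores.
    Returns the name with the highest weighted-sum score."""
    return max(alternatives,
               key=lambda a: sum(w * s for w, s in zip(weights, alternatives[a])))

def one_way_sensitivity(base_weights, alternatives, index, lo=0.0, hi=1.0, steps=50):
    """Sweep one weight from lo to hi, scaling the remaining weights so the
    total stays 1; report the winning alternative at each step."""
    rest = sum(base_weights) - base_weights[index]  # assumed > 0
    results = []
    for k in range(steps + 1):
        w_i = lo + (hi - lo) * k / steps
        w = [b * (1 - w_i) / rest if j != index else w_i
             for j, b in enumerate(base_weights)]
        results.append((w_i, rank_top(w, alternatives)))
    return results
```

The weight values at which the winner flips are the decision's sensitivity thresholds; if no flip occurs over the plausible range, the ranking is robust to that weight.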

  9. Cost and schedule control systems criteria for contract performance measurement: contractor reporting/data analysis guide

    SciTech Connect

    Not Available

    1980-11-01

    The DOE Cost and Schedule Control Systems Criteria (CSCSC) require that a contractor's management control systems include methods and procedures designed to ensure that they will accomplish, in addition to other requirements, a summarization of data elements to the level of reporting to DOE specified in the contract under separate clause. Reports provided to DOE must relate contract cost, schedule, and technical accomplishment to a baseline plan, within the framework of both the contract Work Breakdown Structure (WBS) and the contractor's organizational structure. This Guide describes the reports available from contractors, with emphasis on the Cost Performance Report (CPR), and provides a framework for using the reported data as a basis for decision making. This Guide was developed to assist DOE Project Managers in assessing contractor performance through proper use of the CPR and supporting reports.

  10. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
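    The velocity measurement in local areas can be sketched with exhaustive block matching between consecutive frames; the study's actual velocity estimation method is not detailed in the abstract, so this is only one plausible realisation. The best integer displacement minimises the sum of squared differences (SSD) over the overlapping region.

```python
def block_shift(prev, curr, max_shift=3):
    """Find the integer (dy, dx) that best aligns curr with prev within
    +/- max_shift pixels, using a mean-SSD criterion over the overlap."""
    h, w = len(prev), len(prev[0])
    best, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ssd, n = 0.0, 0
            for i in range(h):
                for j in range(w):
                    i2, j2 = i + dy, j + dx
                    if 0 <= i2 < h and 0 <= j2 < w:
                        ssd += (curr[i2][j2] - prev[i][j]) ** 2
                        n += 1
            ssd /= n
            if best is None or ssd < best:
                best, best_shift = ssd, (dy, dx)
    return best_shift
```

Applied to small windows tiled over a bone-suppressed image pair, the per-window shifts divided by the frame interval form the velocity vector map described above.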

  11. Deciding on Science: An Analysis of Higher Education Science Student Major Choice Criteria

    NASA Astrophysics Data System (ADS)

    White, Stephen Wilson

    The number of college students choosing to major in science, technology, engineering, and math (STEM) in the United States affects the size and quality of the American workforce (Winters, 2009). The number of graduates in these academic fields has been on the decline in the United States since the 1960s, which, according to Lips and McNeil (2009), has resulted in a diminished ability of the United States to compete in science and engineering on the world stage. The purpose of this research was to learn why students chose a STEM major and determine what decision criteria influenced this decision. According to Ajzen's (1991) theory of planned behavior (TPB), the key components of decision-making can be quantified and used as predictors of behavior. In this study the STEM majors' decision criteria were compared between different institution types (two-year, public four-year, and private four-year), and between demographic groups (age and sex). Career, grade, intrinsic, self-efficacy, and self-determination were reported as motivational factors by a majority of science majors participating in this study. Few students reported being influenced by friends and family when deciding to major in science. Science students overwhelmingly attributed the desire to solve meaningful problems as central to their decision to major in science. A majority of students surveyed credited a teacher for influencing their desire to pursue science as a college major. This new information about the motivational construct of the studied group of science majors can be applied to the previously stated problem of not enough STEM majors in the American higher education system to provide workers required to fill the demand of a globally STEM-competitive United States (National Academy of Sciences, National Academy of Engineering, & Institute of Medicine, 2010).

  12. Decision-theoretic analysis of forensic sampling criteria using bayesian decision networks.

    PubMed

    Biedermann, A; Bozza, S; Garbolino, P; Taroni, F

    2012-11-30

    Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, there are many approaches that allow one to derive probability statements relating to a population proportion, but questions on how a forensic decision maker--typically a client of a forensic examination or a scientist acting on behalf of a client--ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here intends to address methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts such as the (net) value of sample information, the (expected) value of sample information and the (expected) decision loss. All of these aspects directly relate to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the efforts needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples.

  13. Monte Carlo-based interval transformation analysis for multi-criteria decision analysis of groundwater management strategies under uncertain naphthalene concentrations and health risks

    NASA Astrophysics Data System (ADS)

    Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong

    2016-08-01

    A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) mitigating computational time through the introduction of a Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives is considered for each planning duration on the basis of four criteria. Results indicated that the most desirable management strategy was action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management period.
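    The interval-plus-Monte-Carlo idea can be sketched as follows. This is a generic illustration, not the MCITA algorithm itself: each criterion of each alternative is an interval, each Monte Carlo draw samples the intervals uniformly and scores the alternatives by weighted sum, and the output is the fraction of draws in which each alternative ranks first. The sketch assumes all criteria are already normalised so that higher is better.

```python
import random

def interval_mcda(alternatives, weights, n=2000, seed=7):
    """alternatives: dict name -> list of (lo, hi) criterion intervals.
    Returns, for each alternative, the fraction of Monte Carlo draws
    in which it achieves the top weighted-sum score."""
    rng = random.Random(seed)
    wins = {a: 0 for a in alternatives}
    for _ in range(n):
        scores = {a: sum(w * rng.uniform(lo, hi)
                         for w, (lo, hi) in zip(weights, ivs))
                  for a, ivs in alternatives.items()}
        wins[max(scores, key=scores.get)] += 1
    return {a: wins[a] / n for a in wins}
```

An alternative that wins in nearly all draws is robustly most desirable despite the interval uncertainty; a split win rate signals that the data are too inexact to discriminate.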

  14. Analysis of autostereoscopic three-dimensional images using multiview wavelets.

    PubMed

    Saveljev, Vladimir; Palchikova, Irina

    2016-08-10

    We propose that multiview wavelets can be used in processing multiview images. The reference functions for the synthesis/analysis of multiview images are described. The synthesized binary images were observed experimentally as three-dimensional visual images. The symmetric multiview B-spline wavelets are proposed. The locations recognized in the continuous wavelet transform correspond to the layout of the test objects. The proposed wavelets can be applied to the multiview, integral, and plenoptic images. PMID:27534470

  15. Vector processing enhancements for real-time image analysis.

    SciTech Connect

    Shoaf, S.; APS Engineering Support Division

    2008-01-01

    A real-time image analysis system was developed for beam imaging diagnostics. An Apple Power Mac G5 with an Active Silicon LFG frame grabber was used to capture video images that were processed and analyzed. Software routines were created to utilize vector-processing hardware to reduce the time to process images as compared to conventional methods. These improvements allow for more advanced image processing diagnostics to be performed in real time.

  16. Effect of tree nuts on metabolic syndrome criteria: a systematic review and meta-analysis of randomised controlled trials

    PubMed Central

    Blanco Mejia, Sonia; Kendall, Cyril W C; Viguiliouk, Effie; Augustin, Livia S; Ha, Vanessa; Cozma, Adrian I; Mirrahimi, Arash; Maroleanu, Adriana; Chiavaroli, Laura; Leiter, Lawrence A; de Souza, Russell J; Jenkins, David J A; Sievenpiper, John L

    2014-01-01

    Objective To provide a broader evidence summary to inform dietary guidelines of the effect of tree nuts on criteria of the metabolic syndrome (MetS). Design We conducted a systematic review and meta-analysis of the effect of tree nuts on criteria of the MetS. Data sources We searched MEDLINE, EMBASE, CINAHL and the Cochrane Library (through 4 April 2014). Eligibility criteria for selecting studies We included relevant randomised controlled trials (RCTs) of ≥3 weeks reporting at least one criterion of the MetS. Data extraction Two or more independent reviewers extracted all relevant data. Data were pooled using the generic inverse variance method using random effects models and expressed as mean differences (MD) with 95% CIs. Heterogeneity was assessed by the Cochran Q statistic and quantified by the I2 statistic. Study quality and risk of bias were assessed. Results Eligibility criteria were met by 49 RCTs including 2226 participants who were otherwise healthy or had dyslipidaemia, MetS or type 2 diabetes mellitus. Tree nut interventions lowered triglycerides (MD=−0.06 mmol/L (95% CI −0.09 to −0.03 mmol/L)) and fasting blood glucose (MD=−0.08 mmol/L (95% CI −0.16 to −0.01 mmol/L)) compared with control diet interventions. There was no effect on waist circumference, high-density lipoprotein cholesterol or blood pressure with the direction of effect favouring tree nuts for waist circumference. There was evidence of significant unexplained heterogeneity in all analyses (p<0.05). Conclusions Pooled analyses show a MetS benefit of tree nuts through modest decreases in triglycerides and fasting blood glucose with no adverse effects on other criteria across nut types. As our conclusions are limited by the short duration and poor quality of the majority of trials, as well as significant unexplained between-study heterogeneity, there remains a need for larger, longer, high-quality trials. Trial registration number NCT01630980. PMID:25074070
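    The pooling machinery named in the abstract (generic inverse variance, random effects, Cochran Q) is standard and can be sketched directly. The sketch below uses the DerSimonian-Laird estimator of the between-study variance, a common random-effects choice; the authors' software may implement the details differently, and the example inputs are made up.

```python
def random_effects_pool(effects, variances):
    """Generic inverse-variance pooling with a DerSimonian-Laird
    between-study variance (tau^2); returns (pooled MD, SE, tau^2)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran Q
    df = len(effects) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = (1.0 / sum(w_star)) ** 0.5
    return pooled, se, tau2

def i_squared(effects, variances):
    """I^2: percentage of total variability attributable to heterogeneity."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    return max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
```

The pooled mean difference and its standard error give the MD (95% CI) reported for each MetS criterion, and I^2 quantifies the unexplained between-study heterogeneity the authors flag.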

  17. Thermal image analysis for detecting facemask leakage

    NASA Astrophysics Data System (ADS)

    Dowdall, Jonathan B.; Pavlidis, Ioannis T.; Levine, James

    2005-03-01

    Due to the modern advent of near-ubiquitous access to rapid international transportation, the epidemiologic trends of highly communicable diseases can be devastating. With the recent emergence of diseases matching this pattern, such as Severe Acute Respiratory Syndrome (SARS), an area of overt concern has been the transmission of infection through respiratory droplets. Approved facemasks are typically effective physical barriers for preventing the spread of viruses through droplets, but breaches in a mask's integrity can lead to an elevated risk of exposure and subsequent infection. Quality control mechanisms in place during the manufacturing process ensure that masks are defect-free when leaving the factory, but there remains little to detect damage caused by transportation or during usage. A system that could monitor masks in real time while they were in use would facilitate a more secure environment for treatment and screening. To fulfill this necessity, we have devised a touchless method to detect mask breaches in real time by utilizing the emissive properties of the mask in the thermal infrared spectrum. Specifically, we use a specialized thermal imaging system to detect minute air leakage in masks based on the principles of heat transfer and thermodynamics. The advantage of this passive modality is that thermal imaging does not require contact with the subject and can provide instant visualization and analysis. These capabilities can prove invaluable for protecting personnel in scenarios with elevated levels of transmission risk such as hospital clinics, border check points, and airports.

  18. Roentgen stereophotogrammetric analysis using computer-based image-analysis.

    PubMed

    Ostgaard, S E; Gottlieb, L; Toksvig-Larsen, S; Lebech, A; Talbot, A; Lund, B

    1997-09-01

    The two-dimensional position of markers in radiographs for Roentgen Stereophotogrammetric Analysis (RSA) is usually determined using a measuring table. The purpose of this study was to evaluate the reproducibility and accuracy of a new RSA system using digitized radiographs and image-processing algorithms to determine the marker positions in the radiographs. Four double RSA examinations of a phantom and 18 RSA examinations from six patients included in different RSA studies of knee prostheses were used to test the reproducibility and accuracy of the system. The radiographs were scanned at 600 dpi resolution and 256 gray levels. The center of each of the tantalum markers in the radiographs was calculated by the computer program from the contour of the marker with the use of an edge-detection software algorithm after the marker was identified on a PC monitor. The study showed that computer-based image analysis can be used in RSA examinations. The advantages of using image-processing software in RSA are that the marker positions are determined in an objective manner, and that there is no need for a systematic manual identification of all the markers on the radiograph before the actual measurement.
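    The marker-center computation can be illustrated with a common simplification. The study derives the center from the marker contour via edge detection; the sketch below instead uses an intensity-weighted centroid of above-threshold pixels, a standard alternative that likewise yields subpixel marker positions. The threshold and test image are hypothetical.

```python
def marker_centroid(img, thresh):
    """Subpixel marker center: intensity-weighted centroid (center of mass)
    of pixels at or above the threshold. Markers are radio-opaque, so they
    appear as bright blobs in the digitized radiograph.
    Returns (row, col) or None if no pixel exceeds the threshold."""
    total = sx = sy = 0.0
    for i, row in enumerate(img):
        for j, v in enumerate(row):
            if v >= thresh:
                total += v
                sy += v * i
                sx += v * j
    if total == 0:
        return None
    return sy / total, sx / total
```

Because every bright pixel contributes in proportion to its intensity, the result is objective and reproducible, which is the property the study highlights over manual measuring-table readings.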

  19. DIAGNOSTIC IMAGING IN A DIRECT-ACCESS SPORTS PHYSICAL THERAPY CLINIC: A 2-YEAR RETROSPECTIVE PRACTICE ANALYSIS

    PubMed Central

    Dedekam, Erik A.; Johnson, Michael R.; Dembowski, Scott C.; Westrick, Richard B.; Goss, Donald L.

    2016-01-01

    Background While advanced diagnostic imaging is a large contributor to the growth in health care costs, direct access to physical therapy is associated with decreased rates of diagnostic imaging. No study has systematically evaluated with evidence-based criteria the appropriateness of advanced diagnostic imaging, including magnetic resonance imaging (MRI), when ordered by physical therapists. The primary purpose of this study was to describe the appropriateness of magnetic resonance imaging (MRI) or magnetic resonance arthrogram (MRA) exams ordered by physical therapists in a direct-access sports physical therapy clinic. Study Design Retrospective observational study of practice. Hypothesis Greater than 80% of advanced diagnostic imaging orders would have an American College of Radiology (ACR) Appropriateness Criteria rating of greater than 6, indicating an imaging order that is usually appropriate. Methods A 2-year retrospective analysis identified 108 MRI/MRA examination orders from four physical therapists. A board-certified radiologist determined the appropriateness of each order based on ACR appropriateness criteria. The principal investigator and co-investigator radiologist assessed agreement between the clinical diagnosis and MRI/surgical findings. Results Knee (31%) and shoulder (25%) injuries were the most common. Overall, 55% of injuries were acute. The mean ACR rating was 7.7; scores from six to nine have been considered appropriate orders, and higher ratings are better. The percentage of orders complying with ACR appropriateness criteria was 83.2%. The physical therapist's clinical diagnosis was confirmed by MRI/MRA findings in 64.8% of cases and by surgical findings in 90% of cases. Conclusions Physical therapists providing musculoskeletal primary care in a direct-access sports physical therapy clinic appropriately ordered advanced diagnostic imaging in over 80% of cases.

  20. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for areas of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. PMID:26556680

  2. Image analysis by integration of disparate information

    NASA Technical Reports Server (NTRS)

    Lemoigne, Jacqueline

    1993-01-01

    Image analysis often starts with some preliminary segmentation which provides a representation of the scene needed for further interpretation. Segmentation can be performed in several ways, categorized as pixel-based, edge-based, and region-based. Each of these approaches is affected differently by various factors, and the final result may be improved by integrating several or all of these methods, thus taking advantage of their complementary nature. In this paper, we propose an approach that integrates pixel-based and edge-based results by utilizing an iterative relaxation technique. This approach has been implemented on a massively parallel computer and tested on remotely sensed imagery from the Landsat Thematic Mapper (TM) sensor.
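
    The pixel/edge integration idea can be sketched (serially, not on a massively parallel machine) as a small relaxation loop; the function and parameter names below are illustrative, not from the paper:

```python
import numpy as np

def integrate_segmentation(img, n_iter=10, edge_thresh=0.3):
    """Sketch of pixel/edge integration by iterative relaxation.

    Pixel-based cue: soft foreground probability from a global threshold.
    Edge-based cue: gradient magnitude; relaxation is suppressed at edges
    so smoothing does not leak across region boundaries.
    """
    img = img.astype(float)
    prob = 1.0 / (1.0 + np.exp(-(img - img.mean())))  # pixel-based evidence
    gy, gx = np.gradient(img)
    edges = np.hypot(gx, gy) > edge_thresh            # edge-based evidence
    for _ in range(n_iter):
        padded = np.pad(prob, 1, mode="edge")
        neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        # Relax labels toward the neighborhood mean, except across edges.
        prob = np.where(edges, prob, 0.5 * prob + 0.5 * neigh)
    return prob > 0.5

# Usage: noisy two-region image; the right half should label as foreground.
rng = np.random.default_rng(0)
img = np.zeros((20, 20))
img[:, 10:] = 1.0
img += rng.normal(0, 0.1, img.shape)
labels = integrate_segmentation(img)
```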

  3. Multiparametric Image Analysis of Lung Branching Morphogenesis

    PubMed Central

    Schnatwinkel, Carsten; Niswander, Lee

    2013-01-01

    BACKGROUND Lung branching morphogenesis is a fundamental developmental process, yet the cellular dynamics that occur during lung development and the molecular mechanisms underlying recent postulated branching modes are poorly understood. RESULTS Here, we implemented a time-lapse video microscopy method to study the cellular behavior and molecular mechanisms of planar bifurcation and domain branching in lung explant- and organotypic cultures. Our analysis revealed morphologically distinct stages that are shaped at least in part by a combination of localized and orientated cell divisions and by local mechanical forces. We also identified myosin light-chain kinase as an important regulator of bud bifurcation, but not domain branching in lung explants. CONCLUSION This live imaging approach provides a method to study cellular behavior during lung branching morphogenesis and suggests the importance of a mechanism primarily based on oriented cell proliferation and mechanical forces in forming and shaping the developing lung airways. PMID:23483685

  4. Multi-criteria analysis for the determination of the best WEEE management scenario in Cyprus.

    PubMed

    Rousis, K; Moustakas, K; Malamis, S; Papadopoulos, A; Loizidou, M

    2008-01-01

    Waste from electrical and electronic equipment (WEEE) constitutes one of the most complicated solid waste streams in terms of its composition, and, as a result, it is difficult to be effectively managed. In view of the environmental problems derived from WEEE management, many countries have established national legislation to improve the reuse, recycling and other forms of recovery of this waste stream so as to apply suitable management schemes. In this work, alternative systems are examined for the WEEE management in Cyprus. These systems are evaluated by developing and applying the Multi-Criteria Decision Making (MCDM) method PROMETHEE. In particular, through this MCDM method, 12 alternative management systems were compared and ranked according to their performance and efficiency. The obtained results show that the management schemes/systems based on partial disassembly are the most suitable for implementation in Cyprus. More specifically, the optimum scenario/system that can be implemented in Cyprus is that of partial disassembly and forwarding of recyclable materials to the native existing market and disposal of the residues at landfill sites. PMID:18262405
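
    PROMETHEE's core computation can be sketched as follows: a minimal PROMETHEE II with the "usual" preference function and invented toy data, not the 12-system comparison performed in the study:

```python
import numpy as np

def promethee_ii(scores, weights, maximize):
    """Net-flow ranking with the 'usual' (step) preference function.

    scores:   (n_alternatives, n_criteria) performance table
    weights:  criterion weights summing to 1
    maximize: per-criterion flag; False means smaller is better
    """
    s = np.where(maximize, scores, -scores)  # make every criterion 'larger is better'
    n = s.shape[0]
    pi = np.zeros((n, n))                    # aggregated preference of a over b
    for a in range(n):
        for b in range(n):
            if a != b:
                pi[a, b] = np.sum(weights * (s[a] > s[b]))
    phi_plus = pi.sum(axis=1) / (n - 1)      # how strongly a beats the others
    phi_minus = pi.sum(axis=0) / (n - 1)     # how strongly the others beat a
    return phi_plus - phi_minus              # net flow; higher ranks first

# Toy WEEE-style comparison: 3 scenarios, criteria = [cost, recovery rate]
scores = np.array([[4.0, 0.6],
                   [6.0, 0.9],
                   [5.0, 0.5]])
weights = np.array([0.4, 0.6])
net = promethee_ii(scores, weights, maximize=np.array([False, True]))
best = int(np.argmax(net))
```

Net flows always sum to zero, so the ranking only expresses relative preference among the compared alternatives.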

  5. Analysis of Criteria for MRI Diagnosis of TMJ Disc Displacement and Arthralgia

    PubMed Central

    Shaefer, Jeffry R.; Riley, Cara Joy; Caruso, Paul; Keith, David

    2012-01-01

    Aims. To improve diagnostic criteria for TMJ disc displacement (DD). Methods. The standard protocol for MRI diagnosis of DD, using a 12 o'clock reference position, was compared to an alternative protocol. The alternative protocol involves the functional relationship between the condyle and articular eminence, using a line perpendicular to the posterior slope of the eminence as a reference for disc position. The disc location was examined using both protocols, and disc diagnoses were compared in their relationship with joint pain. Statistical analyses included P value, sensitivity, specificity, odds ratio, and kappa statistic. Results. 58 MRIs were interpreted. 36 subjects reported arthralgia; 22 did not. Both protocols demonstrated significance (standard P = 0.004, alternative P < 0.001) for the ability to predict arthralgia. The odds of arthralgia increased in DD patients diagnosed by standard methods 9.71 times and in DD diagnosed by alternative means 37.15 times. The diagnostic sensitivity decreased 30% using the alternative versus the standard protocol (0.6389 versus 0.9444), while specificity increased 60% (0.9545 versus 0.3636). Conclusions. A stronger relationship occurs between DD and arthralgia when using a function-based protocol. The alternative protocol correctly identifies subjects without arthralgia, who by standard methods would be diagnosed with DD, as having nondisplaced discs, providing a more clinically relevant assessment of TMJ disc displacement. PMID:23304143
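
    The reported sensitivity, specificity, and odds ratio follow from a standard 2x2 diagnostic table. A sketch reproducing the standard-protocol figures (the cell counts are inferred from the reported rates and group sizes, so they are an assumption):

```python
def diagnostic_stats(tp, fn, fp, tn):
    """Sensitivity, specificity and odds ratio from a 2x2 table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    odds_ratio = (tp * tn) / (fn * fp)
    return sensitivity, specificity, odds_ratio

# Counts reconstructed from the abstract's standard-protocol numbers
# (36 subjects with arthralgia, 22 without; sens 0.9444, spec 0.3636):
sens, spec, or_ = diagnostic_stats(tp=34, fn=2, fp=14, tn=8)
```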

  6. [Analysis of new criteria for increasing the specificity of commercial Western blots].

    PubMed

    Gershy-Damet, G M; Koffi, K; Soro, B; Sanon, S; Traore, M; Assoa, A; N'Goran, K; N'Gom, A

    1992-01-01

    We examined the frequency of serum cross-reactivity on Western blot for HIV1 and HIV2. 661 patients with tuberculosis in Abidjan and 4,899 asymptomatic persons were tested for HIV1 and HIV2 infections. All specimens positive on ELISA for HIV1 or HIV2 were further characterized by synthetic peptide based tests. Confirmed positive samples were tested against HIV1- and HIV2-specific Western blot criteria. Dual serologic reactivity on synthetic peptide tests was significantly more frequent in HIV-positive patients with tuberculosis than in asymptomatic subjects. Positive HIV1 Western blots were seen in 61%-86% of specimens positive for HIV2 only on synthetic peptide tests. Cross-reactivity to HIV2 Western blots by HIV1-positive specimens was significantly more frequent in patients with tuberculosis than in asymptomatic subjects. Using recently recommended criteria for HIV1 and HIV2 Western blot interpretation (presence of 2 env bands) reduced the overall proportion of HIV1-positive specimens having a positive HIV2 Western blot from 39% to 14%, and of HIV2-positive specimens having a positive HIV1 Western blot from 31% to 8%.

  7. Characterization of Tank 23H Supernate Per Saltstone Waste Acceptance Criteria Analysis Requirements -2005

    SciTech Connect

    Oji, L

    2005-05-05

    Variable depth Tank 23H samples (22-inch sample [HTF-014] and 185-inch sample [HTF-013]) were pulled from Tank 23H in February 2005 for characterization. The characterization of the Tank 23H low activity waste is part of the overall liquid waste processing activities. This characterization examined the species identified in the Saltstone Waste Acceptance Criteria (WAC) for the transfer of waste into the Salt-Feed Tank (SFT). The samples were delivered to the Savannah River National Laboratory (SRNL) and analyzed. Apart from radium-226, with an average measured detection limit of < 2.64E+03 pCi/mL, which is about the same order of magnitude as the WAC limit (< 8.73E+03 pCi/mL), none of the species analyzed was found to approach the limits provided in the Saltstone WAC. The concentrations of most of the species analyzed in the Tank 23H samples were 2-5 orders of magnitude lower than the WAC limits. The achievable detection limits for a number of the analytes were several orders of magnitude lower than the WAC limits, but one or two orders of magnitude higher than the requested detection limits. Analytes which fell into this category included plutonium-241, europium-154/155, antimony-125, tin-126, ruthenium/rhodium-106, selenium-79, nickel-59/63, ammonium ion, copper, total nickel, manganese and total organic carbon.

  9. F-106 data summary and model results relative to threat criteria and protection design analysis

    NASA Technical Reports Server (NTRS)

    Pitts, F. L.; Finelli, G. B.; Perala, R. A.; Rudolph, T. H.

    1986-01-01

    The NASA F-106 has acquired considerable data on the rates-of-change of electromagnetic parameters on the aircraft surface during 690 direct lightning strikes while penetrating thunderstorms at altitudes ranging from 15,000 to 40,000 feet. These in-situ measurements have provided the basis for the first statistical quantification of the lightning electromagnetic threat to aircraft appropriate for determining lightning indirect effects on aircraft. The data are presently being used in updating previous lightning criteria and standards developed over the years from ground-based measurements. The new lightning standards will, therefore, be the first which reflect actual aircraft responses measured at flight altitudes. The modeling technique developed to interpret and understand the direct strike electromagnetic data acquired on the F-106 provides a means to model the interaction of the lightning channel with the F-106. The reasonable results obtained with the model, compared to measured responses, yield confidence that the model may be credibly applied to other aircraft types and used in the prediction of internal coupling effects in the design of lightning protection for new aircraft.

  11. Multi-criteria analysis for the determination of the best WEEE management scenario in Cyprus

    SciTech Connect

    Rousis, K.; Moustakas, K.; Malamis, S. Papadopoulos, A.; Loizidou, M.

    2008-07-01

    Waste from electrical and electronic equipment (WEEE) constitutes one of the most complicated solid waste streams in terms of its composition, and, as a result, it is difficult to be effectively managed. In view of the environmental problems derived from WEEE management, many countries have established national legislation to improve the reuse, recycling and other forms of recovery of this waste stream so as to apply suitable management schemes. In this work, alternative systems are examined for the WEEE management in Cyprus. These systems are evaluated by developing and applying the Multi-Criteria Decision Making (MCDM) method PROMETHEE. In particular, through this MCDM method, 12 alternative management systems were compared and ranked according to their performance and efficiency. The obtained results show that the management schemes/systems based on partial disassembly are the most suitable for implementation in Cyprus. More specifically, the optimum scenario/system that can be implemented in Cyprus is that of partial disassembly and forwarding of recyclable materials to the native existing market and disposal of the residues at landfill sites.

  12. Quantitative Analysis of High-Resolution Microendoscopic Images for Diagnosis of Esophageal Squamous Cell Carcinoma

    PubMed Central

    Shin, Dongsuk; Protano, Marion-Anna; Polydorides, Alexandros D.; Dawsey, Sanford M.; Pierce, Mark C.; Kim, Michelle Kang; Schwarz, Richard A.; Quang, Timothy; Parikh, Neil; Bhutani, Manoop S.; Zhang, Fan; Wang, Guiqi; Xue, Liyan; Wang, Xueshan; Xu, Hong; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca R.

    2014-01-01

    Background & Aims High-resolution microendoscopy is an optical imaging technique with the potential to improve the accuracy of endoscopic screening for esophageal squamous neoplasia. Although these microscopic images can readily be interpreted by trained personnel, quantitative image analysis software could facilitate the use of this technology in low-resource settings. In this study we developed and evaluated quantitative image analysis criteria for the evaluation of neoplastic and non-neoplastic squamous esophageal mucosa. Methods We performed image analysis of 177 patients undergoing standard upper endoscopy for screening or surveillance of esophageal squamous neoplasia, using high-resolution microendoscopy, at 2 hospitals in China and 1 in the United States from May 2010 to October 2012. Biopsies were collected from imaged sites (n=375); a consensus diagnosis was provided by 2 expert gastrointestinal pathologists and used as the standard. Results Quantitative information from the high-resolution images was used to develop an algorithm to identify high-grade squamous dysplasia or invasive squamous cell cancer, based on histopathology findings. Optimal performance was obtained using mean nuclear area as the basis for classification, resulting in sensitivities and specificities of 93% and 92% in the training set, 87% and 97% in the test set, and 84% and 95% in an independent validation set, respectively. Conclusions High-resolution microendoscopy with quantitative image analysis can aid in the identification of esophageal squamous neoplasia. Use of software-based image guides may overcome issues of training and expertise in low-resource settings, allowing for widespread use of these optical biopsy technologies. PMID:25066838
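
    Classification by mean nuclear area, as described, reduces to segmenting nuclei and thresholding the mean connected-component area. A pure-NumPy sketch with a toy mask and an invented cutoff (the study's actual segmentation pipeline and trained cutoff are not reproduced):

```python
import numpy as np
from collections import deque

def label_components(mask):
    """4-connected component labeling via BFS (NumPy + stdlib only)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i, j in zip(*np.nonzero(mask)):
        if labels[i, j] == 0:
            current += 1
            labels[i, j] = current
            queue = deque([(i, j)])
            while queue:
                y, x = queue.popleft()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                            and mask[ny, nx] and labels[ny, nx] == 0):
                        labels[ny, nx] = current
                        queue.append((ny, nx))
    return labels, current

def mean_nuclear_area(mask):
    """Mean area (in pixels) of the segmented nuclei in a binary mask."""
    labels, n = label_components(mask)
    return 0.0 if n == 0 else float(np.count_nonzero(labels)) / n

def classify_frame(mask, area_cutoff):
    """Flag a frame as neoplastic when mean nuclear area exceeds the cutoff."""
    return mean_nuclear_area(mask) > area_cutoff

# Toy mask: two 3x3 'nuclei' (area 9) and one 2x2 'nucleus' (area 4)
mask = np.zeros((12, 12), dtype=bool)
mask[1:4, 1:4] = True
mask[1:4, 6:9] = True
mask[7:9, 2:4] = True
```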

  13. Decerns: A framework for multi-criteria decision analysis

    SciTech Connect

    Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; Sullivan, Terry

    2015-02-27

    A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical problems in risk management is introduced. The Decerns framework contains a library of modules that form the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on the analysis of a multicriteria location problem.

  14. High resolution ultraviolet imaging spectrometer for latent image analysis.

    PubMed

    Lyu, Hang; Liao, Ningfang; Li, Hongsong; Wu, Wenmin

    2016-03-21

    In this work, we present a close-range ultraviolet imaging spectrometer with high spatial resolution and reasonably high spectral resolution. As transmissive optical components cause chromatic aberration in the ultraviolet (UV) spectral range, an all-reflective imaging scheme is introduced to improve image quality. The proposed instrument consists of an oscillating mirror, a Cassegrain objective, a Michelson structure, an Offner relay, and a UV-enhanced CCD. The finished spectrometer has a spatial resolution of 29.30 μm on the target plane; the spectral scope covers both the near and middle UV bands, obtaining approximately 100 wavelength samples over the range of 240-370 nm. The control computer coordinates all the components of the instrument and enables capturing a series of images, which can be reconstructed into an interferogram datacube. The datacube can be converted into a spectrum datacube, which contains the spectral information of each pixel across many wavelength samples. A spectral calibration is carried out using a high-pressure mercury discharge lamp. A test run demonstrated that this interferometric configuration can obtain a high-resolution spectrum datacube. A pattern recognition algorithm is introduced to analyze the datacube and distinguish latent traces from the base materials. This design is particularly well suited to identifying latent traces in forensic imaging applications.
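
    The interferogram-to-spectrum conversion described above is, at its core, a Fourier transform along the optical-path-difference axis of the datacube. A minimal sketch with invented toy data (a real instrument additionally needs apodization, phase correction, and wavelength calibration):

```python
import numpy as np

def interferogram_to_spectrum(cube):
    """(rows, cols, n_opd) interferogram datacube -> spectrum datacube.

    Removes the DC bias per pixel, then takes the magnitude of the
    real FFT along the optical-path-difference axis.
    """
    centered = cube - cube.mean(axis=-1, keepdims=True)
    return np.abs(np.fft.rfft(centered, axis=-1))

# Toy 2x2 'scene': each pixel an interferogram of a single wavenumber
n = 256
opd = np.arange(n)
cube = np.zeros((2, 2, n))
freqs = [[10, 20], [30, 40]]  # cycles per record, per pixel
for i in range(2):
    for j in range(2):
        cube[i, j] = 1 + np.cos(2 * np.pi * freqs[i][j] * opd / n)

spec = interferogram_to_spectrum(cube)  # peak bin recovers each wavenumber
```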

  15. The development of a multi-criteria decision analysis aid to help with contraceptive choices: My Contraception Tool.

    PubMed

    French, Rebecca S; Cowan, Frances M; Wellings, Kaye; Dowie, Jack

    2014-04-01

    My Contraception Tool (MCT) applies the principles of multi-criteria decision analysis to the choice of contraceptive method. Its purpose is to make the decision-making process transparent to the user and to suggest a method to them based on their own preferences. The contraceptive option that emerges as optimal from the analysis takes account of the probability of a range of outcomes and the relative weight ascribed to them by the user. The development of MCT was a collaborative project between London School of Hygiene & Tropical Medicine, Brook, FPA and Maldaba Ltd. MCT is available online via the Brook and FPA websites. In this article we describe MCT's development and how it works. Further work is needed to assess the impact it has on decision quality and contraceptive behaviour.

  16. Imaging biomarkers in multiple Sclerosis: From image analysis to population imaging.

    PubMed

    Barillot, Christian; Edan, Gilles; Commowick, Olivier

    2016-10-01

    The production of imaging data in medicine increases more rapidly than the capacity of computing models to extract information from it. The grand challenges of better understanding the brain, offering better care for neurological disorders, and stimulating new drug design will not be achieved without significant advances in computational neuroscience. The road to success is to develop a new, generic computational methodology and to confront and validate this methodology on relevant diseases with adapted computational infrastructures. This new concept sustains the need to build new research paradigms to better understand the natural history of the pathology at the early phase, and to better aggregate data that will provide the most complete representation of the pathology in order to better correlate imaging with other relevant features such as clinical, biological or genetic data. In this context, one of the major challenges of neuroimaging in clinical neurosciences is to detect quantitative signs of pathological evolution as early as possible, to prevent disease progression, evaluate therapeutic protocols, or better understand and model the natural history of a given neurological pathology. Many diseases involve brain alterations that are often not visible on conventional MRI sequences, especially in normal appearing brain tissues (NABT). MRI often has low specificity for differentiating between possible pathological changes that could help discriminate between different pathological stages or grades. The objective of medical image analysis procedures is to define new quantitative neuroimaging biomarkers to track the evolution of the pathology at different levels. This paper illustrates this issue in one acute neuro-inflammatory pathology: Multiple Sclerosis (MS). It exhibits the current medical image analysis approaches and explains how this field of research will evolve in the next decade to integrate larger scale of information at the temporal, cellular

  17. Quartic Rotation Criteria and Algorithms.

    ERIC Educational Resources Information Center

    Clarkson, Douglas B.; Jennrich, Robert I.

    1988-01-01

    Most of the current analytic rotation criteria for simple structure in factor analysis are summarized and identified as members of a general symmetric family of quartic criteria. A unified development of algorithms for orthogonal and direct oblique rotation using arbitrary criteria from this family is presented. (Author/TJH)

  18. Criteria for the use of regression analysis for remote sensing of sediment and pollutants

    NASA Technical Reports Server (NTRS)

    Whitlock, C. H.; Kuo, C. Y.; Lecroy, S. R. (Principal Investigator)

    1982-01-01

    Data analysis procedures for quantification of water quality parameters that are already identified and are known to exist within the water body are considered. The linear multiple-regression technique was examined as a procedure for defining and calibrating data analysis algorithms for such instruments as spectrometers and multispectral scanners.
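
    The linear multiple-regression calibration discussed here can be sketched with ordinary least squares; the band count, coefficients, and noise level below are invented for illustration, not taken from the report:

```python
import numpy as np

# Calibrate a sediment-concentration estimate against multispectral band
# radiances using multiple linear regression (synthetic data).
rng = np.random.default_rng(1)
n_samples, n_bands = 50, 3
radiance = rng.uniform(0, 1, (n_samples, n_bands))
true_coef = np.array([2.0, -1.0, 0.5])          # assumed 'ground truth'
concentration = 0.3 + radiance @ true_coef + rng.normal(0, 0.01, n_samples)

# Design matrix with an intercept column; solve by least squares.
X = np.column_stack([np.ones(n_samples), radiance])
beta, *_ = np.linalg.lstsq(X, concentration, rcond=None)

predicted = X @ beta
r2 = 1 - np.sum((concentration - predicted) ** 2) / np.sum(
    (concentration - concentration.mean()) ** 2)
```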

  19. A framework for joint image-and-shape analysis

    NASA Astrophysics Data System (ADS)

    Gao, Yi; Tannenbaum, Allen; Bouix, Sylvain

    2014-03-01

    Techniques in medical image analysis are often used for comparison or regression on image intensities. In general, the domain of an image is a given Cartesian grid. Shape analysis, on the other hand, studies the similarities and differences among spatial objects of arbitrary geometry and topology. Usually, there is no function defined on the domain of shapes. Recently, there has been a growing need to define and analyze functions on the shape space, and to couple the analysis of shapes with the functions defined on them. Following this direction, in this work we present a coupled analysis of both images and shapes. As a result, statistically significant discrepancies in both the image intensities and the underlying shapes are detected. The method is applied to brain images from schizophrenia patients and heart images from atrial fibrillation patients.

  20. Geostatistical analysis of groundwater level using Euclidean and non-Euclidean distance metrics and variable variogram fitting criteria

    NASA Astrophysics Data System (ADS)

    Theodoridou, Panagiota G.; Karatzas, George P.; Varouchakis, Emmanouil A.; Corzo Perez, Gerald A.

    2015-04-01

    Groundwater level is an important piece of information in hydrological modelling. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using Kriging techniques, the selection of the optimal variogram model is very important for optimal method performance. This work compares three different criteria (the least squares sum method, the Akaike Information Criterion and Cressie's Indicator) for assessing how well a theoretical variogram fits the experimental one, and investigates the impact on the prediction results. Moreover, five different distance functions (Euclidean, Minkowski, Manhattan, Canberra, and Bray-Curtis) are applied to calculate the distance between observations, which affects both the variogram calculation and the Kriging estimator. Cross-validation analysis in terms of Ordinary Kriging is applied by sequentially using a different distance metric and the above three variogram fitting criteria. The spatial dependence of the observations in the tested dataset is studied by fitting classical variogram models and the Matérn model. The proposed comparison analysis was performed for a data set of two hundred fifty hydraulic head measurements distributed over an alluvial aquifer that covers an area of 210 km2. The study area is located in the Prefecture of Drama, which belongs to the Water District of East Macedonia (Greece). This area was selected in terms of hydro-geological data availability and geological homogeneity. The analysis showed that a combination of the Akaike Information Criterion for the variogram fitting assessment and the Bray-Curtis distance metric provided the most accurate cross-validation results. The power-law variogram model provided the best fit to the experimental data. For this dataset, the aforementioned approach in terms of the Ordinary Kriging method improves the prediction efficiency in comparison to the classical Euclidean distance metric. Therefore, maps of the spatial
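
    The variogram-fitting side of this comparison can be sketched compactly: a power-law model fit by least squares in log space, an AIC helper for comparing fitted models, and the Bray-Curtis distance. This is a toy illustration with invented data, not the study's full cross-validation workflow:

```python
import numpy as np

def aic(rss, n, k):
    """Akaike Information Criterion for a least-squares fit with k parameters."""
    return n * np.log(rss / n) + 2 * k

def fit_power_law(h, gamma):
    """Fit gamma(h) = c * h**alpha by linear regression in log space."""
    A = np.column_stack([np.ones_like(h), np.log(h)])
    (log_c, alpha), *_ = np.linalg.lstsq(A, np.log(gamma), rcond=None)
    pred = np.exp(log_c) * h ** alpha
    return np.exp(log_c), alpha, np.sum((gamma - pred) ** 2)

def bray_curtis(u, v):
    """Bray-Curtis distance between two coordinate vectors."""
    return np.sum(np.abs(u - v)) / np.sum(np.abs(u + v))

# Toy experimental variogram that follows an exact power law
h = np.arange(1.0, 11.0)
gamma = 0.8 * h ** 1.4
c, alpha, rss = fit_power_law(h, gamma)  # recovers c=0.8, alpha=1.4
```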

  1. ST segment/heart rate slope as a predictor of coronary artery disease: comparison with quantitative thallium imaging and conventional ST segment criteria

    SciTech Connect

    Finkelhor, R.S.; Newhouse, K.E.; Vrobel, T.R.; Miron, S.D.; Bahler, R.C.

    1986-08-01

    The ST segment shift relative to exercise-induced increments in heart rate, the ST/heart rate slope (ST/HR slope), has been proposed as a more accurate ECG criterion for diagnosing significant coronary artery disease (CAD). Its clinical utility, with the use of a standard treadmill protocol, was compared with quantitative stress thallium (TI) and standard treadmill criteria in 64 unselected patients who underwent coronary angiography. The overall diagnostic accuracy of the ST/HR slope was an improvement over TI and conventional ST criteria (81%, 67%, and 69%). For patients failing to reach 85% of their age-predicted maximal heart rate, its diagnostic accuracy was comparable with TI (77% and 74%). Its sensitivity in patients without prior myocardial infarctions was equivalent to that of thallium (91% and 95%). The ST/HR slope was directly related to the angiographic severity (Gensini score) of CAD in patients without a prior infarction (r = 0.61, p less than 0.001). The ST/HR slope was an improved ECG criterion for diagnosing CAD and compared favorably with TI imaging.
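
    The ST/HR slope itself is the regression slope of ST depression against heart rate over the exercise test. A sketch with invented test data (the clinical method applies per-lead measurement rules and selects the steepest lead, which is not reproduced here):

```python
import numpy as np

def st_hr_slope(heart_rate, st_depression_mv):
    """Least-squares slope of ST depression vs heart rate.

    Returned in microvolts per beat per minute, the unit conventionally
    used for the ST/HR slope criterion.
    """
    slope, _ = np.polyfit(heart_rate, st_depression_mv, 1)
    return slope * 1000.0  # mV per bpm -> uV per bpm

# Toy exercise test: HR rises 90 -> 150 bpm, ST falls 0 -> -0.15 mV
hr = np.array([90, 100, 110, 120, 130, 140, 150], dtype=float)
st = np.array([0.0, -0.02, -0.05, -0.07, -0.10, -0.12, -0.15])
slope_uv = st_hr_slope(hr, st)  # about -2.5 uV/bpm for these data
```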

  2. Exercise and ankylosing spondylitis with New York modified criteria: a systematic review of controlled trials with meta-analysis.

    PubMed

    Martins, N A; Furtado, Guilherme Eustáquio; Campos, Maria João; Leitão, José Carlos; Filaire, Edith; Ferreira, José Pedro

    2014-01-01

    Ankylosing spondylitis is a systemic rheumatic disease that affects the axial skeleton, causing inflammatory back pain and structural and functional changes that decrease quality of life. Several treatments for ankylosing spondylitis have been proposed, among them exercise. The present study aims to synthesize information from the literature, identify the results of controlled clinical trials on exercise in patients with ankylosing spondylitis meeting the New York modified diagnostic criteria, and assess whether exercise is more effective than usual physical activity in reducing functional impairment. The sources of studies used were: LILACS, Pubmed, EBSCOhost, B-on, personal communication, manual research and lists of references. The criteria used for study selection were: controlled clinical trials, participants meeting the New York modified diagnostic criteria for ankylosing spondylitis, and interventions through exercise. The variables studied were related to primary outcomes such as BASFI (Bath Ankylosing Spondylitis Functional Index) as a functional index, BASDAI (Bath Ankylosing Spondylitis Disease Activity Index) as an index of intensity of disease activity, and BASMI (Bath Ankylosing Spondylitis Metrology Index) as a metrological index assessing the patient's limitation of movement. From the 603 studies identified, 37 articles were selected after screening for eligibility, from which 18 studies were included. Methodological quality was assessed with the PEDro scale to select studies of high methodological quality. A cumulative meta-analysis was subsequently performed to compare exercise versus the usual level of physical activity. Exercise shows statistically significant outcomes for the BASFI, BASDAI and BASMI, higher than those found for the usual level of physical activity.

  3. Wild Fire Risk Map in the Eastern Steppe of Mongolia Using Spatial Multi-Criteria Analysis

    NASA Astrophysics Data System (ADS)

    Nasanbat, Elbegjargal; Lkhamjav, Ochirkhuyag

    2016-06-01

    Grassland fire is a cause of major disturbance to ecosystems and economies throughout the world. This study aimed to identify risk zones of wildfire distribution on the Eastern Steppe of Mongolia. Variables for wildfire risk assessment were selected from a combination of data sources, including socio-economic data, climate data, Geographic Information Systems, remotely sensed imagery, and statistical yearbook information, and the results were evaluated using field validation data. The data were divided into three main groups of factors (environmental, socio-economic, and climate/fire information) comprising eleven input variables, which were classified into five risk-level categories according to criteria importance and rank. All explanatory variables were integrated into a spatial model used to estimate a wildfire risk index. Within the index, five categories were created, based on spatial statistics, to adequately assess fire risk: very high, high, moderate, low, and very low. Approximately 68 percent of the study area was predicted to fall within the very high, high, and moderate risk zones. The percentages of actual fires in each fire risk zone were as follows: very high risk, 42 percent; high risk, 26 percent; moderate risk, 13 percent; low risk, 8 percent; and very low risk, 11 percent. The overall prediction accuracy of the model was 62 percent. The model and results could support spatial decision-making processes and preventative wildfire management strategies, and could help improve ecological and biodiversity conservation management.
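
    The weighted overlay behind such a multi-criteria risk index can be sketched as follows; the layer names, weights, and class cut-points here are assumptions for illustration, not the study's calibrated values:

```python
import numpy as np

def risk_index(layers, weights):
    """Weighted overlay of criterion rasters already scaled to 1..5 ranks."""
    layers = np.asarray(layers, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()         # normalize, just in case
    return np.tensordot(weights, layers, axes=1)

def classify(index):
    """Bin a continuous 1..5 index into the five named risk zones."""
    bins = [1.8, 2.6, 3.4, 4.2]               # equal-width cuts (assumed)
    names = ["very low", "low", "moderate", "high", "very high"]
    return names[int(np.digitize(index, bins))]

# Toy 2x2 area with three criterion layers (ranks 1..5)
fuel = np.array([[5, 4], [2, 1]])
dryness = np.array([[5, 3], [2, 2]])
human = np.array([[4, 4], [3, 1]])
idx = risk_index([fuel, dryness, human], weights=[0.4, 0.4, 0.2])
```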

  4. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning

    NASA Astrophysics Data System (ADS)

    Teichert, K.; Süss, P.; Serna, J. I.; Monz, M.; Küfer, K. H.; Thieke, C.

    2011-06-01

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context, optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified, and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide whether, for a given compromise, the current beam setup is optimal, and to re-adjust the choice accordingly during navigation. The method is applied to an artificial case and two clinical head-and-neck cases. In none of the cases does one configuration dominate its competitor over the whole Pareto set. For example, in one of the head-and-neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than the classical method of comparing single treatment plans.

  5. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning.

    PubMed

    Teichert, K; Süss, P; Serna, J I; Monz, M; Küfer, K H; Thieke, C

    2011-06-21

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context, optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified, and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide whether, for a given compromise, the current beam setup is optimal, and to re-adjust the choice accordingly during navigation. The method is applied to an artificial case and two clinical head-and-neck cases. In none of the cases does one configuration dominate its competitor over the whole Pareto set. For example, in one of the head-and-neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g., photons versus protons) than the classical method of comparing single treatment plans.
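    The core comparison the abstract describes (identifying which points of each Pareto set are dominated by the rival set, and measuring inter-set distance in objective space) can be sketched as follows. The two toy fronts are invented for illustration, not the paper's clinical data:

```python
import math

def dominates(p, q):
    """p dominates q when p is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def nondominated(points, rivals):
    """Points of `points` not dominated by any point of `rivals`."""
    return [p for p in points if not any(dominates(q, p) for q in rivals)]

def distances_to_set(points, rivals):
    """Euclidean distance from each point to the closest rival point."""
    return [min(math.dist(p, q) for q in rivals) for p in points]

# Toy Pareto fronts in a two-objective space for two beam configurations
seven_beam = [(1.0, 4.0), (2.0, 2.5), (4.0, 1.0)]
nine_beam  = [(1.5, 3.5), (2.5, 2.0), (3.5, 1.5)]

# Here neither configuration dominates the other over the whole front
print(nondominated(seven_beam, nine_beam))
print(distances_to_set(seven_beam, nine_beam))
```

Plotting these per-point distances over the navigated front is what lets the planner see, compromise by compromise, which beam setup is preferable.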

  6. Image pattern recognition supporting interactive analysis and graphical visualization

    NASA Technical Reports Server (NTRS)

    Coggins, James M.

    1992-01-01

    Image Pattern Recognition attempts to infer properties of the world from image data. Such capabilities are crucial for making measurements from satellite or telescope images related to Earth and space science problems. Such measurements can be the required product itself, or the measurements can be used as input to a computer graphics system for visualization purposes. At present, the field of image pattern recognition lacks a unified scientific structure for developing and evaluating image pattern recognition applications. The overall goal of this project is to begin developing such a structure. This report summarizes results of a 3-year research effort in image pattern recognition addressing the following three principal aims: (1) to create a software foundation for the research and identify image pattern recognition problems in Earth and space science; (2) to develop image measurement operations based on Artificial Visual Systems; and (3) to develop multiscale image descriptions for use in interactive image analysis.

  7. Prognostic Relevance of Objective Response According to EASL Criteria and mRECIST Criteria in Hepatocellular Carcinoma Patients Treated with Loco-Regional Therapies: A Literature-Based Meta-Analysis

    PubMed Central

    Vincenzi, Bruno; Di Maio, Massimo; Silletta, Marianna; D’Onofrio, Loretta; Spoto, Chiara; Piccirillo, Maria Carmela; Daniele, Gennaro; Comito, Francesca; Maci, Eliana; Bronte, Giuseppe; Russo, Antonio; Santini, Daniele; Perrone, Francesco; Tonini, Giuseppe

    2015-01-01

    Background The European Association for the Study of the Liver (EASL) criteria and the modified Response Evaluation Criteria in Solid Tumors (mRECIST) are currently adopted to evaluate radiological response in patients affected by HCC and treated with loco-regional procedures. Several studies have explored the validity of these measurements in predicting survival, but definitive data are still lacking. Aim To conduct a systematic review of studies exploring the usefulness of the mRECIST and EASL criteria in evaluating radiological response in HCC patients undergoing loco-regional therapies, and their validity in predicting survival. Methods A comprehensive search of the literature was performed in the electronic databases EMBASE, MEDLINE, COCHRANE LIBRARY, ASCO conferences and EASL conferences up to June 10, 2014. Our overall search strategy included terms for HCC, mRECIST, and EASL. Loco-regional procedures included transarterial embolization (TAE), transarterial chemoembolization (TACE) and cryoablation. Inter-method agreement between EASL and mRECIST was assessed using the k coefficient. For each criterion, overall survival was described in responder vs. non-responder patients, considering all target lesion responses. Results Among 18 initially found publications, 7 reports including 1357 patients were considered eligible. All studies were published as full-text articles. The proportion of responders according to the mRECIST and EASL criteria was 62.4% and 61.3%, respectively. In the pooled population, 1286 agreements were observed between the two methods (kappa statistic 0.928, 95% confidence interval 0.912–0.944). The HR for overall survival (responders versus non-responders) according to mRECIST and EASL was 0.39 (95% confidence interval 0.26–0.61, p<0.0001) and 0.38 (95% confidence interval 0.24–0.61, p<0.0001), respectively. Conclusion In this literature-based meta-analysis, the mRECIST and EASL criteria showed very good concordance in HCC patients undergoing loco-regional treatments, and objective response by either criterion was strongly associated with longer overall survival.
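    The inter-method agreement quoted above is the kappa coefficient computed from a 2x2 cross-classification of responders by the two criteria. A minimal sketch of that computation follows; the cell counts are hypothetical, chosen only so that the total (1357 patients) and diagonal (1286 agreements) match the pooled figures, so the resulting kappa only approximates the reported 0.928:

```python
def cohen_kappa(table):
    """Cohen's kappa from a 2x2 contingency table [[a, b], [c, d]],
    rows = method 1 (responder / non-responder), columns = method 2."""
    (a, b), (c, d) = table
    n = a + b + c + d
    p_obs = (a + d) / n  # observed agreement (diagonal cells)
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical split of the 1357 pooled patients with 1286 agreements
table = [[820, 27], [44, 466]]
print(round(cohen_kappa(table), 3))
```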

  8. Three modality image registration of brain SPECT/CT and MR images for quantitative analysis of dopamine transporter imaging

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Yuzuho; Takeda, Yuta; Hara, Takeshi; Zhou, Xiangrong; Matsusako, Masaki; Tanaka, Yuki; Hosoya, Kazuhiko; Nihei, Tsutomu; Katafuchi, Tetsuro; Fujita, Hiroshi

    2016-03-01

    Important features of Parkinson's disease (PD) are the degeneration and loss of dopamine neurons in the corpus striatum. 123I-FP-CIT can visualize the activity of these dopamine neurons. The activity ratio of background to corpus striatum is used for the diagnosis of PD and dementia with Lewy bodies (DLB). The specific activity can be observed in the corpus striatum on SPECT images, but the location and shape of the corpus striatum are often lost on SPECT images alone because of low uptake. In contrast, MR images can visualize the location of the corpus striatum. The purpose of this study was to realize a quantitative image analysis of SPECT images by using an image registration technique with brain MR images that can determine the region of the corpus striatum. In this study, SPECT and MR images were fused via the intervening CT image acquired by SPECT/CT. Mutual information (MI) was used for the registration between the CT and MR images. Six SPECT/CT and four MR scans of phantom materials were acquired at varying orientations. As a result of the image registrations, 16 of the 24 combinations were registered within 1.3 mm. Applying the approach to 32 clinical SPECT/CT and MR cases, all cases were registered within 0.86 mm. In conclusion, our registration method has potential for superimposing MR images on SPECT images.
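    The mutual-information criterion used here for CT-MR registration can be estimated from a joint intensity histogram. The sketch below is a generic illustration on synthetic arrays, not the study's implementation; the bin count is an assumption:

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information of two same-shape images, estimated from the
    joint histogram of their intensities."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of image A
    py = pxy.sum(axis=0, keepdims=True)   # marginal of image B
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(0)
fixed = rng.random((64, 64))
# MI peaks when the moving image is aligned with the fixed image, which is
# why maximizing it can drive registration across modalities.
print(mutual_information(fixed, fixed) > mutual_information(fixed, rng.random((64, 64))))
```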

  9. Analysis of physical processes via imaging vectors

    NASA Astrophysics Data System (ADS)

    Volovodenko, V.; Efremova, N.; Efremov, V.

    2016-06-01

    Practically all modeled processes are in one way or another random. The foremost theoretical foundation is that of Markov processes, which can be represented in different forms. A Markov process is a random process that undergoes transitions from one state to another on a state space, where the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it. In Markov processes, the model of the future does not change in the event of expansion and/or strong information progression relative to preceding time. Modeling physical fields basically involves processes changing in time, i.e. non-stationary processes. In this case, applying the Laplace transformation introduces unjustified complications into the description, whereas a transition to other representations yields explicit simplification. The method of imaging vectors provides constructive mathematical models and the necessary transitions in the modeling process and in the analysis itself. The flexibility of a model built on a polynomial basis allows a rapid transition of the mathematical model and accelerates further analysis. It should be noted that the mathematical description permits an operator representation; conversely, operator representations of the structures, algorithms, and data-processing procedures significantly improve the flexibility of the modeling process.

  10. A virtual laboratory for medical image analysis.

    PubMed

    Olabarriaga, Sílvia D; Glatard, Tristan; de Boer, Piter T

    2010-07-01

    This paper presents the design, implementation, and usage of a virtual laboratory for medical image analysis. It is fully based on the Dutch grid, which is part of the Enabling Grids for E-sciencE (EGEE) production infrastructure and driven by the gLite middleware. The adopted service-oriented architecture enables decoupling the user-friendly clients running on the user's workstation from the complexity of the grid applications and infrastructure. Data are stored on grid resources and can be browsed/viewed interactively by the user with the Virtual Resource Browser (VBrowser). Data analysis pipelines are described as Scufl workflows and enacted on the grid infrastructure transparently using the MOTEUR workflow management system. VBrowser plug-ins allow for easy experiment monitoring and error detection. Because of the strict compliance to the grid authentication model, all operations are performed on behalf of the user, ensuring basic security and facilitating collaboration across organizations. The system has been operational and in daily use for eight months (as of December 2008), with six users, leading to the submission of 9000 jobs/month on average and the production of several terabytes of data.

  11. Medical Image Analysis by Cognitive Information Systems - a Review.

    PubMed

    Ogiela, Lidia; Takizawa, Makoto

    2016-10-01

    This publication presents a review of medical image analysis systems. The paradigms of cognitive information systems are presented through examples of medical image analysis systems, with semantic processes shown as applied to different types of medical images. Cognitive information systems are defined on the basis of methods for the semantic analysis and interpretation of information (here, medical images) applied to the cognitive meaning contained in the analyzed data sets. Semantic analysis is proposed to analyze the meaning of the data; meaning is carried by the information, for example by medical images. Medical image analysis is presented and discussed as applied to various types of medical images presenting selected human organs with different pathologies. These images were analyzed using different classes of cognitive information systems. Cognitive information systems dedicated to medical image analysis are also defined for decision-support tasks. This is very important, for example in diagnostic and therapeutic processes, for the selection of semantic aspects/features from the analyzed data sets. These features enable a new way of analysis. PMID:27526188

  12. Medical Image Analysis by Cognitive Information Systems - a Review.

    PubMed

    Ogiela, Lidia; Takizawa, Makoto

    2016-10-01

    This publication presents a review of medical image analysis systems. The paradigms of cognitive information systems are presented through examples of medical image analysis systems, with semantic processes shown as applied to different types of medical images. Cognitive information systems are defined on the basis of methods for the semantic analysis and interpretation of information (here, medical images) applied to the cognitive meaning contained in the analyzed data sets. Semantic analysis is proposed to analyze the meaning of the data; meaning is carried by the information, for example by medical images. Medical image analysis is presented and discussed as applied to various types of medical images presenting selected human organs with different pathologies. These images were analyzed using different classes of cognitive information systems. Cognitive information systems dedicated to medical image analysis are also defined for decision-support tasks. This is very important, for example in diagnostic and therapeutic processes, for the selection of semantic aspects/features from the analyzed data sets. These features enable a new way of analysis.

  13. Ripening of salami: assessment of colour and aspect evolution using image analysis and multivariate image analysis.

    PubMed

    Fongaro, Lorenzo; Alamprese, Cristina; Casiraghi, Ernestina

    2015-03-01

    During the ripening of salami, colour changes occur due to oxidation phenomena involving myoglobin. Moreover, shrinkage due to dehydration results in modifications of appearance, mainly ascribable to fat aggregation. The aim of this work was the application of image analysis (IA) and multivariate image analysis (MIA) techniques to the study of the colour and appearance changes occurring in salami during ripening. IA results showed that the red, green, blue, and intensity parameters decreased due to the development of a globally darker colour, while heterogeneity increased due to fat aggregation. By applying MIA, different salami slice areas corresponding to fat and to three different degrees of oxidised meat were identified and quantified. It was thus possible to study the trend of these different areas as a function of ripening, making objective an evaluation usually performed by subjective visual inspection.
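    The IA parameters described (mean red, green, blue, overall intensity, and a heterogeneity measure) can be sketched as below. The heterogeneity definition used here (standard deviation of the intensity image) is a simple stand-in for the paper's measure, and the two synthetic "slices" are invented for illustration:

```python
import numpy as np

def slice_features(rgb):
    """Mean red, green, blue, overall intensity, and a simple heterogeneity
    proxy (standard deviation of the intensity image) for one slice image."""
    intensity = rgb.mean(axis=-1)
    return {"R": rgb[..., 0].mean(), "G": rgb[..., 1].mean(),
            "B": rgb[..., 2].mean(), "intensity": intensity.mean(),
            "heterogeneity": intensity.std()}

rng = np.random.default_rng(1)
early = rng.normal(150, 5, (64, 64, 3)).clip(0, 255)   # fresh slice: brighter, uniform
late  = rng.normal(110, 25, (64, 64, 3)).clip(0, 255)  # ripened slice: darker, patchier
print(slice_features(early)["intensity"] > slice_features(late)["intensity"],
      slice_features(late)["heterogeneity"] > slice_features(early)["heterogeneity"])
```

Tracking these features across sampling days is what turns the visual ripening assessment into an objective trend.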

  14. Dynamic chest image analysis: model-based pulmonary perfusion analysis with pyramid images

    NASA Astrophysics Data System (ADS)

    Liang, Jianming; Haapanen, Arto; Jaervi, Timo; Kiuru, Aaro J.; Kormano, Martti; Svedstrom, Erkki; Virkki, Raimo

    1998-07-01

    The aim of the study 'Dynamic Chest Image Analysis' is to develop computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected at different phases of the respiratory/cardiac cycles in a short period of time. We have proposed a framework for ventilation study with an explicit ventilation model based on pyramid images. In this paper, we extend the framework to pulmonary perfusion study. A perfusion model and the truncated pyramid are introduced. The perfusion model aims at extracting accurate, geographic perfusion parameters, and the truncated pyramid helps in understanding perfusion at multiple resolutions and speeding up the convergence process in optimization. Three cases are included to illustrate the experimental results.

  15. A multi-criteria analysis of options for energy recovery from municipal solid waste in India and the UK.

    PubMed

    Yap, H Y; Nixon, J D

    2015-12-01

    Energy recovery from municipal solid waste plays a key role in sustainable waste management and energy security. However, there are numerous technologies that vary in suitability for different economic and social climates. This study sets out to develop and apply a multi-criteria decision making methodology that can be used to evaluate the trade-offs between the benefits, opportunities, costs and risks of alternative energy from waste technologies in both developed and developing countries. The technologies considered are mass burn incineration, refuse derived fuel incineration, gasification, anaerobic digestion and landfill gas recovery. By incorporating qualitative and quantitative assessments, a preference ranking of the alternative technologies is produced. The effect of variations in decision criteria weightings are analysed in a sensitivity analysis. The methodology is applied principally to compare and assess energy recovery from waste options in the UK and India. These two countries have been selected as they could both benefit from further development of their waste-to-energy strategies, but have different technical and socio-economic challenges to consider. It is concluded that gasification is the preferred technology for the UK, whereas anaerobic digestion is the preferred technology for India. We believe that the presented methodology will be of particular value for waste-to-energy decision-makers in both developed and developing countries.

  16. A multi-criteria analysis of options for energy recovery from municipal solid waste in India and the UK.

    PubMed

    Yap, H Y; Nixon, J D

    2015-12-01

    Energy recovery from municipal solid waste plays a key role in sustainable waste management and energy security. However, there are numerous technologies that vary in suitability for different economic and social climates. This study sets out to develop and apply a multi-criteria decision making methodology that can be used to evaluate the trade-offs between the benefits, opportunities, costs and risks of alternative energy from waste technologies in both developed and developing countries. The technologies considered are mass burn incineration, refuse derived fuel incineration, gasification, anaerobic digestion and landfill gas recovery. By incorporating qualitative and quantitative assessments, a preference ranking of the alternative technologies is produced. The effect of variations in decision criteria weightings are analysed in a sensitivity analysis. The methodology is applied principally to compare and assess energy recovery from waste options in the UK and India. These two countries have been selected as they could both benefit from further development of their waste-to-energy strategies, but have different technical and socio-economic challenges to consider. It is concluded that gasification is the preferred technology for the UK, whereas anaerobic digestion is the preferred technology for India. We believe that the presented methodology will be of particular value for waste-to-energy decision-makers in both developed and developing countries. PMID:26275797
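    The weighted trade-off at the heart of such a multi-criteria ranking can be sketched with a simple weighted-sum model. The scores and weights below are invented for illustration (the paper's actual method and values differ), but they show how different country weightings can flip the preferred option:

```python
def mcda_rank(scores, weights):
    """Weighted-sum MCDA: every criterion is pre-scaled to 0-1 with
    higher = better, so the best alternative maximizes the weighted total."""
    totals = {alt: sum(weights[c] * v for c, v in crit.items())
              for alt, crit in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

# Invented 0-1 scores (higher = better, so "cost" means cheapness, "risk" means safety)
scores = {
    "gasification":        {"benefit": 0.9, "cost": 0.4, "risk": 0.6},
    "anaerobic_digestion": {"benefit": 0.6, "cost": 0.9, "risk": 0.8},
    "incineration":        {"benefit": 0.7, "cost": 0.5, "risk": 0.5},
}
uk_weights    = {"benefit": 0.6, "cost": 0.2, "risk": 0.2}  # performance-driven
india_weights = {"benefit": 0.2, "cost": 0.5, "risk": 0.3}  # cost-driven
print(mcda_rank(scores, uk_weights)[0], mcda_rank(scores, india_weights)[0])
```

Re-running the ranking while perturbing the weights is the sensitivity analysis the abstract refers to.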

  17. Wndchrm – an open source utility for biological image analysis

    PubMed Central

    Shamir, Lior; Orlov, Nikita; Eckley, D Mark; Macura, Tomasz; Johnston, Josiah; Goldberg, Ilya G

    2008-01-01

    Background Biological imaging is an emerging field, covering a wide range of applications in biological and clinical research. However, while machinery for automated experimenting and data acquisition has been developing rapidly in the past years, automated image analysis often introduces a bottleneck in high content screening. Methods Wndchrm is an open source utility for biological image analysis. The software works by first extracting image content descriptors from the raw image, image transforms, and compound image transforms. Then, the most informative features are selected, and the feature vector of each image is used for classification and similarity measurement. Results Wndchrm has been tested using several publicly available biological datasets, and provided results which are favorably comparable to the performance of task-specific algorithms developed for these datasets. The simple user interface allows researchers who are not knowledgeable in computer vision methods and have no background in computer programming to apply image analysis to their data. Conclusion We suggest that wndchrm can be effectively used for a wide range of biological image analysis tasks. Using wndchrm can allow scientists to perform automated biological image analysis while avoiding the costly challenge of implementing computer vision and pattern recognition algorithms. PMID:18611266

  18. Wave-Optics Analysis of Pupil Imaging

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.; Bos, Brent J.

    2006-01-01

    Pupil imaging performance is analyzed from the perspective of physical optics. A multi-plane diffraction model is constructed by propagating the scalar electromagnetic field, surface by surface, along the optical path comprising the pupil imaging optical system. Modeling results are compared with pupil images collected in the laboratory. The experimental setup, although generic for pupil imaging systems in general, has application to the James Webb Space Telescope (JWST) optical system characterization where the pupil images are used as a constraint to the wavefront sensing and control process. Practical design considerations follow from the diffraction modeling which are discussed in the context of the JWST Observatory.

  19. Advanced image analysis for the preservation of cultural heritage

    NASA Astrophysics Data System (ADS)

    France, Fenella G.; Christens-Barry, William; Toth, Michael B.; Boydston, Kenneth

    2010-02-01

    The Library of Congress' Preservation Research and Testing Division has established an advanced preservation studies scientific program for research and analysis of the diverse range of cultural heritage objects in its collection. Using this system, the Library is currently developing specialized integrated research methodologies for extending preservation analytical capacities through non-destructive hyperspectral imaging of cultural objects. The research program has revealed key information to support preservation specialists, scholars and other institutions. The approach requires close and ongoing collaboration between a range of scientific and cultural heritage personnel: imaging and preservation scientists, art historians, curators, conservators and technology analysts. A research project on the Pierre L'Enfant Plan of Washington DC (1791) was undertaken to implement and advance the image analysis capabilities of the imaging system. Innovative imaging options and analysis techniques allow greater processing and analysis capacities, establishing the imaging technique as the initial non-invasive analysis and documentation step in all cultural heritage analyses. Mapping spectral responses, organic and inorganic data, and topography, semi-microscopic imaging, and creating full-spectrum images have greatly extended this capacity beyond a simple image-capture technique. Linking hyperspectral data with other non-destructive analyses has further enhanced the research potential of this image analysis technique.

  20. Super-resolution analysis of microwave image using WFIPOCS

    NASA Astrophysics Data System (ADS)

    Wang, Xue; Wu, Jin

    2013-03-01

    Microwave images are always blurred and distorted, so super-resolution analysis is crucial in microwave image processing. In this paper, we propose the WFIPOCS algorithm, which combines wavelet-based fractal interpolation with the improved projection onto convex sets (IPOCS) technique. First, we apply down-sampling and Wiener filtering to a low-resolution (LR) microwave image. Then, wavelet-based fractal interpolation is applied to preprocess the LR image. Finally, the IPOCS technique is applied to correct the artifacts introduced by interpolation and to approach a high-resolution (HR) image. The experimental results indicate that the WFIPOCS algorithm improves the spatial resolution of microwave images.

  1. Multi-criteria evaluation of CMIP5 GCMs for climate change impact analysis

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Rana, Arun; Moradkhani, Hamid; Sharma, Ashish

    2015-12-01

    Climate change is expected to have severe impacts on the global hydrological cycle along with the food-water-energy nexus. Currently, many climate models are used in predicting important climatic variables. Though there have been advances in the field, many problems remain to be resolved related to reliability, uncertainty, and computing needs, among others. In the present work, we have analyzed the performance of 20 different global climate models (GCMs) from the Climate Model Intercomparison Project Phase 5 (CMIP5) dataset over the Columbia River Basin (CRB) in the Pacific Northwest USA. We demonstrate a statistical multicriteria approach, using univariate and multivariate techniques, for selecting suitable GCMs to be used for climate change impact analysis in the region. The univariate methods include the mean, standard deviation, coefficient of variation, relative change (variability), the Mann-Kendall test, and the Kolmogorov-Smirnov test (KS-test); the multivariate methods used were principal component analysis (PCA), singular value decomposition (SVD), canonical correlation analysis (CCA), and cluster analysis. The analysis is performed on raw GCM data, i.e., before bias correction, for the precipitation and temperature variables of all 20 models, to capture the reliability and nature of each model at the regional scale. The analysis is based on spatially averaged datasets of GCMs and observations for the period 1970 to 2000. Each GCM is ranked based on its performance evaluated against gridded observational data on various temporal scales (daily, monthly, and seasonal). The results provide insight into each of the methods and the various statistical properties they address when ranking GCMs. Further evaluation was also performed for raw GCM simulations against different sets of gridded observational data in the area.
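    One of the univariate criteria mentioned, the Kolmogorov-Smirnov test, can already rank models on its own by measuring how far each model's distribution sits from the observed one. The sketch below implements the two-sample KS statistic on synthetic data; the model names and numbers are invented:

```python
import numpy as np

def ks_statistic(sample, reference):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap between
    the two empirical cumulative distribution functions."""
    grid = np.sort(np.concatenate([sample, reference]))
    cdf_s = np.searchsorted(np.sort(sample), grid, side="right") / len(sample)
    cdf_r = np.searchsorted(np.sort(reference), grid, side="right") / len(reference)
    return float(np.abs(cdf_s - cdf_r).max())

rng = np.random.default_rng(42)
observed = rng.normal(5.0, 2.0, 1000)            # stand-in for gridded observations
gcms = {"gcm_a": rng.normal(5.1, 2.1, 1000),     # close to the observed climate
        "gcm_b": rng.normal(8.0, 1.0, 1000)}     # strongly biased model
ranking = sorted(gcms, key=lambda name: ks_statistic(gcms[name], observed))
print(ranking)  # smaller KS distance ranks first
```

In a full multi-criteria ranking, such per-test scores would be combined with the other univariate and multivariate measures.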

  2. Image analysis for dental bone quality assessment using CBCT imaging

    NASA Astrophysics Data System (ADS)

    Suprijanto; Epsilawati, L.; Hajarini, M. S.; Juliastuti, E.; Susanti, H.

    2016-03-01

    Cone beam computerized tomography (CBCT) is one of the X-ray imaging modalities applied in dentistry. This modality can visualize the oral region in 3D and at high resolution. CBCT jaw images carry potential information for the assessment of bone quality, which is often used for pre-operative implant planning. We propose a comparison method based on normalized histograms (NH) of the regions of the inter-dental septum and the premolar teeth. The NH characteristics of normal and abnormal bone conditions are then compared and analyzed. Four test parameters are proposed: the difference between the teeth and bone average intensities (s), the ratio between the bone and teeth average intensities (n), the difference between the teeth and bone peak values of the NH (Δp), and the ratio between the teeth and bone NH ranges (r). The results showed that n, s, and Δp have potential as classification parameters of dental calcium density.
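    The four test parameters (s, n, Δp, r) can be sketched directly from region-of-interest intensities. The synthetic "teeth" and "bone" samples below are invented, and the histogram settings (bin count, 0-255 range) are assumptions, not the paper's values:

```python
import numpy as np

def nh_parameters(teeth, bone, bins=64):
    """Four test parameters from normalized histograms (NH) of the two ROIs:
    s  = teeth mean intensity minus bone mean intensity,
    n  = bone/teeth mean-intensity ratio,
    dp = difference between the NH peak values,
    r  = ratio of the teeth and bone intensity ranges."""
    nh_teeth, _ = np.histogram(teeth, bins=bins, range=(0, 255), density=True)
    nh_bone, _ = np.histogram(bone, bins=bins, range=(0, 255), density=True)
    s = teeth.mean() - bone.mean()
    n = bone.mean() / teeth.mean()
    dp = nh_teeth.max() - nh_bone.max()
    r = np.ptp(teeth) / np.ptp(bone)
    return s, n, dp, r

rng = np.random.default_rng(3)
teeth = rng.normal(200, 10, 5000).clip(0, 255)  # dense teeth: high, narrow intensities
bone  = rng.normal(120, 30, 5000).clip(0, 255)  # trabecular bone: lower, broader
s, n, dp, r = nh_parameters(teeth, bone)
print(s > 0, 0 < n < 1, dp > 0)
```

Lower bone density would pull the bone histogram further from the teeth histogram, which is why s, n, and Δp can separate normal from abnormal bone.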

  3. Analysis of Anechoic Chamber Testing of the Hurricane Imaging Radiometer

    NASA Technical Reports Server (NTRS)

    Fenigstein, David; Ruf, Chris; James, Mark; Simmons, David; Miller, Timothy; Buckley, Courtney

    2010-01-01

    The Hurricane Imaging Radiometer System (HIRAD) is a new airborne passive microwave remote sensor developed to observe hurricanes. HIRAD incorporates synthetic thinned array radiometry technology, which uses Fourier synthesis to reconstruct images from an array of correlated antenna elements. The HIRAD system response to a point emitter has been measured in an anechoic chamber. With these data, a Fourier inversion image reconstruction algorithm has been developed. A performance analysis of the apparatus is presented, along with an overview of the image reconstruction algorithm.

  4. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... criminal penalty; (iii) Control technology applicable to the operation your hazards analysis is evaluating; and (iv) A qualitative evaluation of the possible safety and health effects on employees, and potential impacts to the human and marine environments, which may result if the control technology fails....

  5. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... criminal penalty; (iii) Control technology applicable to the operation your hazards analysis is evaluating; and (iv) A qualitative evaluation of the possible safety and health effects on employees, and potential impacts to the human and marine environments, which may result if the control technology fails....

  6. Chapter 1 Eligibility Factors and Weights: Using Probit Analysis To Determine Eligibility Criteria.

    ERIC Educational Resources Information Center

    Willis, John A.

    Kanawha County (West Virginia) schools use Z-scores to identify elementary students eligible for Chapter 1 services in reading and mathematics. A probit analysis of over 500 previously served students was used to determine the variables and weights in the Z-score equations. Independent variables were chosen from those commonly used to identify…
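    The Z-score eligibility equations referred to above combine standardized measures with weights. A minimal sketch follows; the variables, norms, and weights below are hypothetical, since the study derived its actual weights from a probit analysis of previously served students:

```python
def z_score(value, mean, std):
    """Standard score: how many standard deviations a raw value lies from the mean."""
    return (value - mean) / std

def eligibility_index(measures, norms, weights):
    """Weighted composite of Z-scores; lower composites indicate greater need."""
    return sum(w * z_score(measures[k], *norms[k]) for k, w in weights.items())

# Hypothetical norms (mean, std) and weights for a Chapter 1 reading equation
norms = {"reading_test": (50.0, 10.0), "teacher_rating": (3.0, 1.0)}
weights = {"reading_test": 0.7, "teacher_rating": 0.3}
student = {"reading_test": 35.0, "teacher_rating": 2.0}
print(round(eligibility_index(student, norms, weights), 2))  # -1.35
```

Students whose composite falls below a chosen cutoff would then be flagged as eligible for services.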

  7. K Basin sludge packaging design criteria (PDC) and safety analysis report for packaging (SARP) approval plan

    SciTech Connect

    Brisbin, S.A.

    1996-03-06

    This document delineates the plan for preparation, review, and approval of the Packaging Design Criteria for the K Basin sludge transportation system and the associated on-site Safety Analysis Report for Packaging. The transportation system addressed in the subject documents will be used to transport sludge from the K Basins using bulk packaging.

  8. Slide Set: reproducible image analysis and batch processing with ImageJ

    PubMed Central

    Nanes, Benjamin A.

    2015-01-01

    Most imaging studies in the biological sciences rely on analyses that are relatively simple. However, manual repetition of analysis tasks across multiple regions in many images can complicate even the simplest analysis, making record keeping difficult, increasing the potential for error, and limiting reproducibility. While fully automated solutions are necessary for very large data sets, they are sometimes impractical for the small- and medium-sized data sets that are common in biology. This paper introduces Slide Set, a framework for reproducible image analysis and batch processing with ImageJ. Slide Set organizes data into tables, associating image files with regions of interest and other relevant information. Analysis commands are automatically repeated over each image in the data set, and multiple commands can be chained together for more complex analysis tasks. All analysis parameters are saved, ensuring transparency and reproducibility. Slide Set includes a variety of built-in analysis commands and can be easily extended to automate other ImageJ plugins, reducing the manual repetition of image analysis without the set-up effort or programming expertise required for a fully automated solution. PMID:26554504

  9. Slide Set: Reproducible image analysis and batch processing with ImageJ.

    PubMed

    Nanes, Benjamin A

    2015-11-01

    Most imaging studies in the biological sciences rely on analyses that are relatively simple. However, manual repetition of analysis tasks across multiple regions in many images can complicate even the simplest analysis, making record keeping difficult, increasing the potential for error, and limiting reproducibility. While fully automated solutions are necessary for very large data sets, they are sometimes impractical for the small- and medium-sized data sets common in biology. Here we present the Slide Set plugin for ImageJ, which provides a framework for reproducible image analysis and batch processing. Slide Set organizes data into tables, associating image files with regions of interest and other relevant information. Analysis commands are automatically repeated over each image in the data set, and multiple commands can be chained together for more complex analysis tasks. All analysis parameters are saved, ensuring transparency and reproducibility. Slide Set includes a variety of built-in analysis commands and can be easily extended to automate other ImageJ plugins, reducing the manual repetition of image analysis without the set-up effort or programming expertise required for a fully automated solution.
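
    The batch-processing pattern described above (a table linking images to regions of interest, with analysis commands repeated over every row) can be sketched in a few lines of Python. This is an illustration of the pattern only, not Slide Set's actual API; the `mean_intensity` command and the table layout are hypothetical:

```python
import numpy as np

def mean_intensity(image, roi):
    """Analysis command: mean pixel value inside a rectangular ROI."""
    x, y, w, h = roi
    return float(image[y:y + h, x:x + w].mean())

def batch_process(table, commands):
    """Repeat each named analysis command over every row of the data table,
    recording the ROI and parameters alongside each result for traceability."""
    results = []
    for row in table:
        record = {"image_id": row["image_id"], "roi": row["roi"]}
        for name, cmd in commands:
            record[name] = cmd(row["image"], row["roi"])
        results.append(record)
    return results

# Hypothetical data table: each row links an image to a region of interest.
rng = np.random.default_rng(0)
table = [{"image_id": i, "image": rng.random((64, 64)), "roi": (8, 8, 16, 16)}
         for i in range(3)]
results = batch_process(table, [("mean", mean_intensity)])
print(len(results))
```

    Chaining further commands is just a matter of appending `(name, function)` pairs to the command list, which mirrors the plugin's "multiple commands can be chained together" behavior.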

  10. Multi-criteria decision analysis for health technology assessment in Canada: insights from an expert panel discussion.

    PubMed

    Diaby, Vakaramoko; Goeree, Ron; Hoch, Jeffrey; Siebert, Uwe

    2015-02-01

    Multi-criteria decision analysis (MCDA), a decision-making tool, has received increasing attention in recent years, notably in the healthcare field. For Canada, it is unclear whether and how MCDA should be incorporated into the existing health technology assessment (HTA) decision-making process. To facilitate debate on improving HTA decision-making in Canada, a workshop was held in conjunction with the 8th World Congress on Health Economics of the International Health Economics Association in Toronto, Canada in July 2011. The objective of the workshop was to discuss the potential benefits and challenges related to the use of MCDA for HTA decision-making in Canada. This paper summarizes and discusses the recommendations of an expert panel convened at the workshop to discuss opportunities and concerns with reference to the implementation of MCDA in Canada.

  11. HOW TO DEAL WITH WASTE ACCEPTANCE UNCERTAINTY USING THE WASTE ACCEPTANCE CRITERIA FORECASTING AND ANALYSIS CAPABILITY SYSTEM (WACFACS)

    SciTech Connect

    Redus, K. S.; Hampshire, G. J.; Patterson, J. E.; Perkins, A. B.

    2002-02-25

    The Waste Acceptance Criteria Forecasting and Analysis Capability System (WACFACS) is used to plan for, evaluate, and control the supply of approximately 1.8 million yd³ of low-level radioactive, TSCA, and RCRA hazardous wastes from over 60 environmental restoration projects from FY02 through FY10 to the Oak Ridge Environmental Management Waste Management Facility (EMWMF). WACFACS is a validated decision support tool that propagates uncertainties inherent in site-related contaminant characterization data, disposition volumes during EMWMF operations, and project schedules to quantitatively determine the confidence that risk-based performance standards are met. Trade-offs in schedule, volumes of waste lots, and allowable concentrations of contaminants are performed to optimize project waste disposition, regulatory compliance, and disposal cell management.
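
    The core idea here, propagating input uncertainties into a confidence that a performance standard is met, can be illustrated with a minimal Monte Carlo sketch. All numbers below (the limit, lot count, and distributions) are hypothetical and are not WACFACS data:

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 10_000
limit = 50.0                       # hypothetical risk-based limit (pCi/g)

# Uncertain inputs: per-lot contaminant concentration and lot volume.
conc = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=(n_trials, 25))
vol = rng.uniform(800.0, 1200.0, size=(n_trials, 25))   # yd^3 per lot

# Volume-weighted average concentration in the disposal cell, per trial.
cell_conc = (conc * vol).sum(axis=1) / vol.sum(axis=1)

# Confidence that the performance standard is met.
confidence = (cell_conc <= limit).mean()
print(f"P(cell average <= limit) ~ {confidence:.3f}")
```

    The same pattern extends to schedule and volume trade-offs by re-running the simulation with different lot allocations and comparing the resulting confidence levels.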

  12. Image analysis of neuropsychological test responses

    NASA Astrophysics Data System (ADS)

    Smith, Stephen L.; Hiller, Darren L.

    1996-04-01

    This paper reports recent advances in the development of an automated approach to neuropsychological testing. High-performance image analysis algorithms have been developed as part of a convenient and non-invasive computer-based system to provide an objective assessment of patient responses to figure-copying tests. Tests of this type are important in determining the neurological function of patients following stroke through evaluation of their visuo-spatial performance. Many conventional neuropsychological tests suffer from the serious drawback that subjective judgement on the part of the tester is required in the measurement of the patient's response, which leads to a qualitative neuropsychological assessment that can be both inconsistent and inaccurate. Results for this automated approach are presented for three clinical populations: patients who have suffered right-hemisphere stroke are compared with adults with no known neurological disorder, and a population of normal 11-year-old school children is presented to demonstrate the sensitivity of the technique. As well as providing a more reliable and consistent diagnosis, this technique is sufficiently sensitive to monitor a patient's progress over a period of time and will provide the neuropsychologist with a practical means of evaluating the effectiveness of therapy or medication administered as part of a rehabilitation program.

  13. The Dynairship. [structural design criteria and feasibility analysis of an airplane - airship

    NASA Technical Reports Server (NTRS)

    Miller, W. M., Jr.

    1975-01-01

    A feasibility analysis for the construction and use of a combination airplane-airship named 'Dynairship' is undertaken. Payload capacities, fuel consumption, and the structural design of the craft are discussed and compared to a conventional commercial aircraft (a Boeing 747). Cost estimates for the construction and operation of the craft are also discussed. Various uses of the craft are examined (e.g., police work, materials handling, and ocean surveillance), and aerodynamic configurations and photographs are shown.

  14. An optimal control approach to pilot/vehicle analysis and Neal-Smith criteria

    NASA Technical Reports Server (NTRS)

    Bacon, B. J.; Schmidt, D. K.

    1984-01-01

    The approach of Neal and Smith was merged with advances in pilot modeling by means of optimal control techniques. While confirming the findings of Neal and Smith, the resulting methodology explicitly includes the pilot's objective in attitude tracking. More importantly, the method yields the required system bandwidth along with a better pilot model directly applicable to closed-loop analysis of systems of any order.

  15. Covariance Based Pre-Filters and Screening Criteria for Conjunction Analysis

    NASA Astrophysics Data System (ADS)

    George, E.; Chan, K.

    2012-09-01

    Several relationships are developed relating object size, initial covariance, and range at closest approach to the probability of collision. These relationships address the following questions:
    - Given the objects' initial covariance and combined hard-body size, what is the maximum possible value of the probability of collision (Pc)?
    - Given the objects' initial covariance, what is the maximum combined hard-body radius for which the probability of collision does not exceed the tolerance limit?
    - Given the objects' initial covariance and the combined hard-body radius, what is the minimum miss distance for which the probability of collision does not exceed the tolerance limit?
    - Given the objects' initial covariance and the miss distance, what is the maximum combined hard-body radius for which the probability of collision does not exceed the tolerance limit?
    The first relationship above allows the elimination of object pairs from conjunction analysis (CA) on the basis of the initial covariance and hard-body sizes of the objects. The application of this pre-filter to present-day catalogs with estimated covariance results in the elimination of approximately 35% of object pairs as unable to ever conjunct with a probability of collision exceeding 1×10⁻⁶. Because Pc is directly proportional to object size and inversely proportional to covariance size, this pre-filter will have a significantly larger impact on future catalogs, which are expected to contain a much larger fraction of small debris tracked only by a limited subset of available sensors. This relationship also provides a mathematically rigorous basis for eliminating objects from analysis entirely based on element set age or quality - a practice commonly done by rough rules of thumb today. Further, these relations can be used to determine the required geometric screening radius for all objects. This analysis reveals that the screening volumes for small objects are much larger than needed, while the screening volumes for
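
    The first relationship can be sketched with the standard small hard-body approximation, in which Pc is the combined hard-body area times the 2-D Gaussian density at the miss vector in the encounter plane; the maximum over all miss distances is attained at zero miss. The sketch below assumes a diagonalized covariance and uses invented illustrative values:

```python
import math

def pc_small_body(r_hb, sx, sy, dx=0.0, dy=0.0):
    """Probability of collision, small hard-body approximation:
    Pc ~ (combined hard-body area) x (2-D Gaussian density at the miss vector),
    with sx, sy the principal standard deviations of the combined covariance
    projected into the encounter plane."""
    density = math.exp(-0.5 * ((dx / sx) ** 2 + (dy / sy) ** 2)) / (2 * math.pi * sx * sy)
    return math.pi * r_hb ** 2 * density

def pc_max(r_hb, sx, sy):
    """Maximum possible Pc over all miss distances (attained at zero miss):
    pi * r^2 / (2 * pi * sx * sy) = r^2 / (2 * sx * sy)."""
    return pc_small_body(r_hb, sx, sy)

# Pre-filter: drop a pair whose maximum possible Pc is below the tolerance.
tolerance = 1e-6
r_hb, sx, sy = 5.0, 20_000.0, 4_000.0       # metres; hypothetical values
print(pc_max(r_hb, sx, sy) < tolerance)     # -> True: pair can be screened out
```

    Because `pc_max` depends only on the covariance and hard-body size, it can be evaluated once per object pair before any close-approach search, which is exactly the pre-filter role described above.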

  16. Multimodal digital color imaging system for facial skin lesion analysis

    NASA Astrophysics Data System (ADS)

    Bae, Youngwoo; Lee, Youn-Heum; Jung, Byungjo

    2008-02-01

    In dermatology, various digital imaging modalities have been used as important tools to quantitatively evaluate the treatment effect of skin lesions. Cross-polarization color imaging has been used to evaluate skin chromophore (melanin and hemoglobin) information, and parallel-polarization imaging to evaluate skin texture information. In addition, UV-A-induced fluorescent imaging has been widely used to evaluate various skin conditions such as sebum, keratosis, sun damage, and vitiligo. In order to maximize the evaluation efficacy for various skin lesions, it is necessary to integrate these imaging modalities into a single imaging system. In this study, we propose a multimodal digital color imaging system that provides four different digital color images: a standard color image, parallel- and cross-polarization color images, and a UV-A-induced fluorescent color image. Herein, we describe the imaging system and present examples of image analysis. By analyzing the color information and morphological features of facial skin lesions, we are able to evaluate various skin lesions comparably and simultaneously. In conclusion, we are confident that the multimodal color imaging system can be utilized as an important assistive tool in dermatology.

  17. Processing, analysis, recognition, and automatic understanding of medical images

    NASA Astrophysics Data System (ADS)

    Tadeusiewicz, Ryszard; Ogiela, Marek R.

    2004-07-01

    This paper presents new ideas for the automatic understanding of the semantic content of medical images. The idea under consideration can be seen as the next step along a path that begins with capturing images in digital form as two-dimensional data structures, proceeds through image processing as a tool for enhancing image visibility and readability, applies image analysis algorithms to extract selected features of the images (or of parts of images, e.g. objects), and ends with algorithms devoted to image classification and recognition. In the paper we explain why all of the procedures mentioned above cannot give full satisfaction in many important medical problems, in which we need to understand the semantic sense of an image, not merely describe it in terms of selected features and/or classes. The general idea of automatic image understanding is presented, along with remarks on successful applications of these ideas for increasing the potential capabilities and performance of computer vision systems dedicated to advanced medical image analysis. This is achieved by applying a linguistic description of the merit content of the picture. We then use new AI methods to undertake the task of automatic understanding of image semantics in intelligent medical information systems. Successfully obtaining the crucial semantic content of a medical image may contribute considerably to the creation of new intelligent multimedia cognitive medical systems. Thanks to the idea of cognitive resonance between the stream of data extracted from the image using linguistic methods and expectations taken from the representation of medical knowledge, it is possible to understand the merit content of an image even when its form differs markedly from any known pattern.

  18. Analysis of airborne MAIS imaging spectrometric data for mineral exploration

    SciTech Connect

    Wang Jinnian; Zheng Lanfen; Tong Qingxi

    1996-11-01

    The high spectral resolution of imaging spectrometric systems makes quantitative analysis and mapping of surface composition possible. The key issue is the quantitative approach for analysis of surface parameters from imaging spectrometer data. This paper describes the methods and stages of quantitative analysis: (1) extracting surface reflectance from the imaging spectrometer image; laboratory and in-flight field measurements are conducted to calibrate the imaging spectrometer data, and atmospheric correction is used to obtain ground reflectance by means of the empirical line method and radiative transfer modeling; (2) determining the quantitative relationship between absorption band parameters from the imaging spectrometer data and the chemical composition of minerals; (3) spectral comparison between spectra from a spectral library and spectra derived from the imagery. A wavelet analysis-based spectrum-matching technique for quantitative analysis of imaging spectrometer data has been developed. Airborne MAIS imaging spectrometer data were used for analysis, and the results have been applied to mineral and petroleum exploration in the Tarim Basin area, China. 8 refs., 8 figs.
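
    The empirical line method in step (1) amounts to a per-band linear fit between the at-sensor signal of a few calibration targets and their field-measured reflectance. A minimal sketch with synthetic numbers (the gain, offset, and DN values are invented for illustration):

```python
import numpy as np

# Empirical line method (sketch): for one band, fit reflectance = gain*DN + offset
# from a few ground calibration targets with field-measured reflectance,
# then apply the fit to every pixel of the band.
rng = np.random.default_rng(1)
true_gain, true_offset = 0.0004, -0.02
target_dn = np.array([120.0, 480.0, 900.0, 1600.0])    # at-sensor DN of targets
target_refl = true_gain * target_dn + true_offset       # field reflectance

gain, offset = np.polyfit(target_dn, target_refl, deg=1)

band_dn = rng.integers(100, 1700, size=(5, 5)).astype(float)
band_refl = gain * band_dn + offset                     # calibrated reflectance
print(round(gain, 6), round(offset, 4))
```

    In a real workflow the fit is repeated independently for each spectral band, since atmospheric and sensor effects are wavelength dependent.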

  19. Demonstration of a modelling-based multi-criteria decision analysis procedure for prioritisation of occupational risks from manufactured nanomaterials.

    PubMed

    Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio

    2016-11-01

    Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are not yet readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. In order to study how variations in the input data affect our results, we performed probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables.

  20. Demonstration of a modelling-based multi-criteria decision analysis procedure for prioritisation of occupational risks from manufactured nanomaterials.

    PubMed

    Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio

    2016-11-01

    Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are not yet readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. In order to study how variations in the input data affect our results, we performed probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables. PMID:26853193
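
    A minimal weighted-sum sketch of this kind of MCDA ranking is shown below. The scores, weights, and normalization are invented for illustration; the paper's algorithm is considerably more elaborate:

```python
import numpy as np

# Weighted-sum MCDA sketch (hypothetical scores, not the paper's data):
# a lower margin of exposure (MoE) and a higher emission potential both
# mean higher occupational risk.
materials = ["TiO2", "ZnO", "Ag", "MWCNT"]
moe = np.array([120.0, 15.0, 80.0, 40.0])        # margin of exposure
emission = np.array([0.3, 0.8, 0.5, 0.7])        # relative emission potential

# Normalize each criterion to [0, 1] with 1 = highest risk.
risk_moe = (moe.max() - moe) / (moe.max() - moe.min())
risk_emis = (emission - emission.min()) / (emission.max() - emission.min())

weights = np.array([0.6, 0.4])                   # assumed criterion weights
score = weights[0] * risk_moe + weights[1] * risk_emis

ranking = [materials[i] for i in np.argsort(-score)]
print(ranking)                                   # -> ['ZnO', 'MWCNT', 'Ag', 'TiO2']
```

    Wrapping the scoring step in a Monte Carlo loop over perturbed inputs would reproduce, in miniature, the sensitivity/uncertainty analysis described in the abstract.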

  1. PIZZARO: Forensic analysis and restoration of image and video data.

    PubMed

    Kamenicky, Jan; Bartos, Michal; Flusser, Jan; Mahdian, Babak; Kotera, Jan; Novozamsky, Adam; Saic, Stanislav; Sroubek, Filip; Sorel, Michal; Zita, Ales; Zitova, Barbara; Sima, Zdenek; Svarc, Petr; Horinek, Jan

    2016-07-01

    This paper introduces a set of methods for image and video forensic analysis. They were designed to help assess image and video credibility and origin and to restore and increase image quality by diminishing unwanted blur, noise, and other possible artifacts. The motivation came from the best practices used in criminal investigation utilizing images and/or videos. The determination of the image source, the verification of the image content, and image restoration were identified as the most important issues whose automation can facilitate criminalists' work. Novel theoretical results complemented with existing approaches (LCD re-capture detection and denoising) were implemented in the PIZZARO software tool, which consists of the image processing functionality as well as of reporting and archiving functions to ensure the repeatability of image analysis procedures, and thus fulfills the formal aspects of image/video analysis work. A comparison of the newly proposed methods with state-of-the-art approaches is shown. Real use cases are presented, which illustrate the functionality of the developed methods and demonstrate their applicability in different situations. The use cases as well as the method design were solved in tight cooperation between scientists from the Institute of Criminalistics, the National Drug Headquarters of the Criminal Police and Investigation Service of the Police of the Czech Republic, and image processing experts from the Czech Academy of Sciences.

  2. Dehazing method through polarimetric imaging and multi-scale analysis

    NASA Astrophysics Data System (ADS)

    Cao, Lei; Shao, Xiaopeng; Liu, Fei; Wang, Lin

    2015-05-01

    An approach to haze removal utilizing polarimetric imaging and multi-scale analysis has been developed to address the problem that hazy weather weakens the interpretation of remote sensing imagery because of the poor visibility and short detection distance of haze images. On the one hand, the polarization effects of the airlight and of the object radiance in the imaging procedure are considered. On the other hand, the fact that objects and haze possess different frequency distribution properties is emphasized, so multi-scale analysis through the wavelet transform is employed to allow the low-frequency components, in which haze presents itself, and the high-frequency coefficients, which image details and edges occupy, to be processed separately. Measuring the polarization feature with the Stokes parameters, three linearly polarized images (0°, 45°, and 90°) are taken in hazy weather, from which the best polarized image I_min and the worst one I_max can be synthesized. These two haze-contaminated polarized images are then decomposed into different spatial layers with wavelet analysis; the low-frequency images are processed via a polarization dehazing algorithm, while the high-frequency components are manipulated with a nonlinear transform. The ultimate haze-free image is then reconstructed by inverse wavelet reconstruction. Experimental results verify that the dehazing method proposed in this study can strongly improve image visibility and increase detection distance through haze for imaging warning and remote sensing systems.
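
    The split-band processing idea can be sketched with a one-level 2-D Haar transform. The contrast stretch on the approximation band stands in for the polarization dehazing step, and soft-thresholding stands in for the nonlinear transform; both are illustrative substitutes, not the authors' algorithms:

```python
import numpy as np

def haar2(x):
    """One-level orthonormal 2-D Haar transform (image sides must be even)."""
    a = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 2
    h = (x[0::2, 0::2] - x[0::2, 1::2] + x[1::2, 0::2] - x[1::2, 1::2]) / 2
    v = (x[0::2, 0::2] + x[0::2, 1::2] - x[1::2, 0::2] - x[1::2, 1::2]) / 2
    d = (x[0::2, 0::2] - x[0::2, 1::2] - x[1::2, 0::2] + x[1::2, 1::2]) / 2
    return a, h, v, d

def ihaar2(a, h, v, d):
    """Inverse of haar2: perfect reconstruction from the four sub-bands."""
    x = np.empty((2 * a.shape[0], 2 * a.shape[1]))
    x[0::2, 0::2] = (a + h + v + d) / 2
    x[0::2, 1::2] = (a - h + v - d) / 2
    x[1::2, 0::2] = (a + h - v - d) / 2
    x[1::2, 1::2] = (a - h - v + d) / 2
    return x

rng = np.random.default_rng(3)
hazy = np.clip(0.5 + 0.1 * rng.standard_normal((32, 32)), 0, 1)

a, h, v, d = haar2(hazy)
a = (a - a.mean()) * 1.4 + a.mean()          # low frequency: contrast stretch
t = 0.02                                     # high frequency: soft threshold
h, v, d = [np.sign(c) * np.maximum(np.abs(c) - t, 0.0) for c in (h, v, d)]
dehazed = ihaar2(a, h, v, d)
print(dehazed.shape)
```

    The point of the decomposition is exactly the one made in the abstract: the low- and high-frequency content can be given different treatments before being recombined by the inverse transform.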

  3. PIZZARO: Forensic analysis and restoration of image and video data.

    PubMed

    Kamenicky, Jan; Bartos, Michal; Flusser, Jan; Mahdian, Babak; Kotera, Jan; Novozamsky, Adam; Saic, Stanislav; Sroubek, Filip; Sorel, Michal; Zita, Ales; Zitova, Barbara; Sima, Zdenek; Svarc, Petr; Horinek, Jan

    2016-07-01

    This paper introduces a set of methods for image and video forensic analysis. They were designed to help assess image and video credibility and origin and to restore and increase image quality by diminishing unwanted blur, noise, and other possible artifacts. The motivation came from the best practices used in criminal investigation utilizing images and/or videos. The determination of the image source, the verification of the image content, and image restoration were identified as the most important issues whose automation can facilitate criminalists' work. Novel theoretical results complemented with existing approaches (LCD re-capture detection and denoising) were implemented in the PIZZARO software tool, which consists of the image processing functionality as well as of reporting and archiving functions to ensure the repeatability of image analysis procedures, and thus fulfills the formal aspects of image/video analysis work. A comparison of the newly proposed methods with state-of-the-art approaches is shown. Real use cases are presented, which illustrate the functionality of the developed methods and demonstrate their applicability in different situations. The use cases as well as the method design were solved in tight cooperation between scientists from the Institute of Criminalistics, the National Drug Headquarters of the Criminal Police and Investigation Service of the Police of the Czech Republic, and image processing experts from the Czech Academy of Sciences. PMID:27182830

  4. A linear mixture analysis-based compression for hyperspectral image analysis

    SciTech Connect

    C. I. Chang; I. W. Ginsberg

    2000-06-30

    In this paper, the authors present a fully constrained least squares linear spectral mixture analysis-based compression technique for hyperspectral image analysis, particularly target detection and classification. Unlike most compression techniques that deal directly with image gray levels, the proposed compression approach generates the abundance fractional images of potential targets present in an image scene and then encodes these fractional images so as to achieve data compression. Since the vital information used for image analysis is generally preserved and retained in the abundance fractional images, the loss of information may have very little impact on image analysis. On some occasions, it even improves analysis performance. Airborne visible/infrared imaging spectrometer (AVIRIS) data experiments demonstrate that it can effectively detect and classify targets while achieving very high compression ratios.
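
    The unmixing step that produces abundance fractional images can be illustrated with a sum-to-one constrained least-squares solver. The paper's method is fully constrained (it also enforces nonnegativity), which this brief sketch omits; the endmember spectra below are synthetic:

```python
import numpy as np

def scls(M, x):
    """Abundances a minimizing ||x - M a|| subject to sum(a) = 1,
    via the closed-form Lagrange-multiplier solution.
    M: (bands, endmembers) endmember matrix; x: (bands,) pixel spectrum."""
    G = np.linalg.inv(M.T @ M)
    a_ls = G @ M.T @ x                       # unconstrained least squares
    ones = np.ones(M.shape[1])
    lam = (ones @ a_ls - 1.0) / (ones @ G @ ones)
    return a_ls - G @ ones * lam

# Synthetic check: a pixel mixed from two endmembers in known proportions.
rng = np.random.default_rng(7)
M = rng.random((20, 2))                      # two hypothetical endmember spectra
a_true = np.array([0.3, 0.7])
x = M @ a_true
a_hat = scls(M, x)
print(np.round(a_hat, 3))                    # -> [0.3 0.7]
```

    Applying `scls` to every pixel yields one abundance fractional image per endmember, and it is these fractional images, rather than the raw bands, that the compression scheme described above encodes.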

  5. Principles of image processing in machine vision systems for the color analysis of minerals

    NASA Astrophysics Data System (ADS)

    Petukhova, Daria B.; Gorbunova, Elena V.; Chertov, Aleksandr N.; Korotaev, Valery V.

    2014-09-01

    At the moment, color sorting is one of the promising methods of mineral raw material enrichment. The method is based on registering color differences between images of analyzed objects. As is generally known, the difficulty of delimiting close color tints when sorting low-contrast minerals is one of the main disadvantages of the color sorting method. This can be related to a wrong choice of color model and incomplete image processing in the machine vision system realizing the color sorting algorithm. Another problem is the need to reconfigure the image processing parameters when the type of analyzed mineral changes, because the optical properties of mineral samples vary from one deposit to another. Searching for suitable image processing parameter values is therefore a non-trivial task, and it does not always have an acceptable solution. In addition, there are no uniform guidelines for determining the criteria of mineral sample separation. Ideally, the reconfiguration of image processing parameters would be performed by machine learning, but in practice it is carried out by adjusting operating parameters until they are satisfactory for one specific enrichment task. This approach usually means that the machine vision system is unable to rapidly estimate the concentration of the analyzed mineral ore using the color sorting method. This paper presents the results of research aimed at addressing these shortcomings in the organization of image processing for machine vision systems used for color sorting of mineral samples. The principles of color analysis of low-contrast minerals using machine vision systems are also studied. In addition, a special processing algorithm for color images of mineral samples is developed, which automatically determines the criteria of mineral sample separation based on an analysis of representative samples.
Experimental studies of the proposed algorithm
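
    The idea of deriving separation criteria automatically from representative samples can be illustrated with a nearest-class-mean color classifier. The colors and statistics below are synthetic, and the paper's actual criterion derivation is more elaborate:

```python
import numpy as np

# Deriving a separation criterion from representative samples (sketch):
# class statistics come from labelled sample pixels, and a new pixel is
# assigned to the nearest class mean in the chosen color space.
rng = np.random.default_rng(5)
ore = rng.normal([140.0, 120.0, 90.0], 6.0, size=(500, 3))     # R, G, B
waste = rng.normal([150.0, 128.0, 96.0], 6.0, size=(500, 3))   # close tints

means = np.stack([ore.mean(axis=0), waste.mean(axis=0)])

def classify(pixels):
    """Assign each pixel to the nearest class mean (0 = ore, 1 = waste)."""
    dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    return dists.argmin(axis=1)

acc = (classify(ore) == 0).mean() / 2 + (classify(waste) == 1).mean() / 2
print(f"separation accuracy ~ {acc:.2f}")
```

    With tints this close the accuracy is well below 100%, which illustrates the low-contrast separation problem the abstract describes; richer class statistics or a better-chosen color space are the usual remedies.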

  6. Low-cost image analysis system

    SciTech Connect

    Lassahn, G.D.

    1995-01-01

    The author has developed an Automatic Target Recognition system based on parallel processing using transputers. This approach yields a powerful, fast image processing system at relatively low cost. The system scans multi-sensor (e.g., several infrared bands) image data to find any identifiable target, such as a physical object or a type of vegetation.

  7. A revised analysis of Lawson criteria and its implications for ICF

    SciTech Connect

    Panarella, E.

    1995-12-31

    Recently, a re-examination of the breakeven conditions for D-T plasmas has been presented. The results show that breakeven might not follow the Lawson nτ rule; in particular, the plasma containment time seems to have lost the importance that it previously had. Moreover, a minimum particle density of the order of ~10^15 cm^-3 has been found to be required for breakeven, which indicates that the inertial confinement fusion effort is in the right position to reach the fusion goal. In light of these results, a reassessment of Lawson's analysis has been undertaken. Lawson considered the case of a pulsed system that followed this idealized cycle: the gas is heated instantaneously to a temperature T, which is maintained for a time t, after which the gas is allowed to cool. Conduction loss is neglected entirely, and the energy used to heat the gas and supply the radiation loss is regained as useful heat. To illustrate how Lawson's analysis can be improved, the cycle to which the gas is subjected is divided into three phases: (1) rapid heating of the gas for a time t_1 to bring it from the ambient temperature to the fusion temperature T; (2) continuous injection of energy into the plasma for a time t_2 to maintain the temperature T; (3) no further injection of energy, and cooling of the gas to the ambient temperature in a time t_3.
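
    For context, the classical nτ rule that this abstract revisits follows from a simple power balance. A textbook sketch for an equimolar D-T plasma of ion density n held at temperature T for a time τ (ignoring radiation losses and conversion efficiencies, which Lawson's full analysis includes) is:

```latex
\[
  \underbrace{\frac{n^{2}}{4}\,\langle\sigma v\rangle\,E_{\mathrm{f}}\,\tau}_{\text{fusion energy per volume}}
  \;\ge\;
  \underbrace{3\,n\,T}_{\text{thermal energy per volume}}
  \quad\Longrightarrow\quad
  n\tau \;\ge\; \frac{12\,T}{\langle\sigma v\rangle\,E_{\mathrm{f}}},
\]
```

    where ⟨σv⟩ is the temperature-dependent D-T reactivity and E_f the energy released per fusion. The abstract's three-phase cycle refines exactly this balance by accounting separately for the heating, sustainment, and cooling phases.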

  8. Vector sparse representation of color image using quaternion matrix analysis.

    PubMed

    Xu, Yi; Yu, Licheng; Xu, Hongteng; Zhang, Hao; Nguyen, Truong

    2015-04-01

    Traditional sparse image models treat a color image pixel as a scalar, representing the color channels separately or concatenating them as a monochrome image. In this paper, we propose a vector sparse representation model for color images using quaternion matrix analysis. As a new tool for color image representation, its potential applications in several image-processing tasks are presented, including color image reconstruction, denoising, inpainting, and super-resolution. The proposed model represents the color image as a quaternion matrix, and a quaternion-based dictionary learning algorithm is presented using the K-quaternion singular value decomposition (K-QSVD, generalized K-means clustering for QSVD) method. It conducts the sparse basis selection in quaternion space, which uniformly transforms the channel images to an orthogonal color space. In this new color space, the inherent color structures can be completely preserved during vector reconstruction. Moreover, the proposed sparse model is more efficient than current sparse models for image restoration tasks owing to the lower redundancy between the atoms of different color channels. The experimental results demonstrate that the proposed sparse image model successfully avoids the hue bias issue and shows its potential as a general and powerful tool in the color image analysis and processing domain. PMID:25643407
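
    The basic quaternion machinery behind such models can be sketched briefly: an RGB pixel is encoded as a pure quaternion, and unit-quaternion conjugation acts as a norm-preserving rotation of color space, treating the three channels as one vector entity. This is a generic illustration of quaternion color algebra, not the paper's K-QSVD algorithm:

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw * qw - px * qx - py * qy - pz * qz,
        pw * qx + px * qw + py * qz - pz * qy,
        pw * qy - px * qz + py * qw + pz * qx,
        pw * qz + px * qy - py * qx + pz * qw,
    ])

# An RGB pixel as a pure quaternion: r*i + g*j + b*k.
pixel = np.array([0.0, 0.8, 0.5, 0.2])

# A unit quaternion acting by conjugation rotates the color vector, so the
# pixel's norm is preserved: this is the property that lets quaternion models
# handle the three channels as a single vector entity.
theta = np.pi / 3
axis = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
u = np.concatenate([[np.cos(theta / 2)], np.sin(theta / 2) * axis])
u_conj = u * np.array([1.0, -1.0, -1.0, -1.0])
rotated = qmul(qmul(u, pixel), u_conj)
print(np.isclose(np.linalg.norm(rotated), np.linalg.norm(pixel)))  # -> True
```

    A quaternion matrix is then simply an array of such pixel quaternions, and dictionary atoms act on it through this same channel-coupling product.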

  9. Lymphovascular and perineural invasion as selection criteria for adjuvant therapy in intrahepatic cholangiocarcinoma: a multi-institution analysis

    PubMed Central

    Fisher, Sarah B; Patel, Sameer H; Kooby, David A; Weber, Sharon; Bloomston, Mark; Cho, Clifford; Hatzaras, Ioannis; Schmidt, Carl; Winslow, Emily; Staley III, Charles A; Maithel, Shishir K

    2012-01-01

    Objectives Criteria for the selection of patients for adjuvant chemotherapy in intrahepatic cholangiocarcinoma (IHCC) are lacking. Some authors advocate treating patients with lymph node (LN) involvement; however, nodal assessment is often inadequate or not performed. This study aimed to identify surrogate criteria based on characteristics of the primary tumour. Methods A total of 58 patients who underwent resection for IHCC between January 2000 and January 2010 at any of three institutions were identified. Primary outcome was overall survival (OS). Results Median OS was 23.0 months. Median tumour size was 6.5 cm and the median number of lesions was one. Overall, 16% of patients had positive margins, 38% had perineural invasion (PNI), 40% had lymphovascular invasion (LVI) and 22% had LN involvement. A median of two LNs were removed and a median of zero were positive. Lymph nodes were not sampled in 34% of patients. Lymphovascular and perineural invasion were associated with reduced OS [9.6 months vs. 32.7 months (P= 0.020) and 10.7 months vs. 32.7 months (P= 0.008), respectively]. Lymph node involvement indicated a trend towards reduced OS (10.7 months vs. 30.0 months; P= 0.063). The presence of either LVI or PNI in node-negative patients was associated with a reduction in OS similar to that in node-positive patients (12.1 months vs. 10.7 months; P= 0.541). After accounting for adverse tumour factors, only LVI and PNI remained associated with decreased OS on multivariate analysis (hazard ratio 4.07, 95% confidence interval 1.60–10.40; P= 0.003). Conclusions Lymphovascular and perineural invasion are separately associated with a reduction in OS similar to that in patients with LN-positive disease. As nodal dissection is often not performed and the number of nodes retrieved is frequently inadequate, these tumour-specific factors should be considered as criteria for selection for adjuvant chemotherapy. PMID:22762399

  10. The challenge of obtaining information necessary for multi-criteria decision analysis implementation: the case of physiotherapy services in Canada

    PubMed Central

    2013-01-01

    Background As fiscal constraints dominate health policy discussions across Canada and globally, priority-setting exercises are becoming more common to guide the difficult choices that must be made. In this context, it becomes highly desirable to have accurate estimates of the value of specific health care interventions. Economic evaluation is a well-accepted method to estimate the value of health care interventions. However, economic evaluation has significant limitations, which have led to an increase in the use of Multi-Criteria Decision Analysis (MCDA). One key concern with MCDA is the availability of the information necessary for implementation. In the fall of 2011, the Canadian Physiotherapy Association embarked on a project aimed at providing a valuation of physiotherapy services that is both evidence-based and relevant to resource allocation decisions. The framework selected for this project was MCDA. We report on how we addressed the challenge of obtaining some of the information necessary for MCDA implementation. Methods MCDA criteria were selected and areas of physiotherapy practice were identified. Building up the necessary information base was a three-step process. First, a literature review was conducted for each practice area, on each criterion. The next step was to conduct interviews with experts in each of the practice areas to critique the results of the literature review and to fill in gaps where literature was absent or insufficient. Finally, the results of the individual interviews were validated by a national committee to ensure consistency across all practice areas and that a national-level perspective was applied. Results Despite a lack of research evidence on many of the considerations relevant to the estimation of the value of physiotherapy services (the criteria), sufficient information was obtained to facilitate MCDA implementation at the local level. 
Conclusions The results of this research project serve two purposes: 1) a method to

  11. An image analysis system for near-infrared (NIR) fluorescence lymph imaging

    NASA Astrophysics Data System (ADS)

    Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.

    2011-03-01

    Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of an NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software is underdeveloped, requiring extensive time and effort to analyze an NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a single system, we significantly reduce the amount of required user interaction and improve the reliability of the measurement.

  12. MR brain image analysis in dementia: From quantitative imaging biomarkers to ageing brain models and imaging genetics.

    PubMed

    Niessen, Wiro J

    2016-10-01

    MR brain image analysis has been a highly active research area within medical image analysis over the past two decades. In this article, it is discussed how the field developed from the construction of tools for automatic quantification of brain morphology, function, connectivity and pathology, to the creation of models of the ageing brain in normal ageing and disease, and tools for integrated analysis of imaging and genetic data. The current and future role of the field in improved understanding of the development of neurodegenerative disease is discussed, as is its potential for aiding in early and differential diagnosis and prognosis of different types of dementia. For the latter, the use of reference imaging data and reference models derived from large clinical and population imaging studies, and the application of machine learning techniques on these reference data, are expected to play a key role. PMID:27344937

  14. Multi-criteria decision analysis of concentrated solar power with thermal energy storage and dry cooling.

    PubMed

    Klein, Sharon J W

    2013-12-17

    Decisions about energy backup and cooling options for parabolic trough (PT) concentrated solar power have technical, economic, and environmental implications. Although PT development has increased rapidly in recent years, energy policies do not address backup or cooling option requirements, and very few studies directly compare the diverse implications of these options. This is the first study to compare the annual capacity factor, levelized cost of energy (LCOE), water consumption, land use, and life cycle greenhouse gas (GHG) emissions of PT with different backup options (minimal backup (MB), thermal energy storage (TES), and fossil fuel backup (FF)) and different cooling options (wet (WC) and dry (DC)). Multicriteria decision analysis was used with five preference scenarios to identify the highest-scoring energy backup-cooling combination for each preference scenario. MB-WC had the highest score in the Economic and Climate Change-Economy scenarios, while FF-DC and FF-WC had the highest scores in the Equal and Availability scenarios, respectively. TES-DC had the highest score for the Environmental scenario. DC was ranked 1-3 in all preference scenarios. Direct comparisons between GHG emissions and LCOE and between GHG emissions and land use suggest a preference for TES if backup is required for PT plants to compete with baseload generators. PMID:24245524
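The weighted-sum scoring at the heart of a multi-criteria decision analysis like this can be sketched in a few lines. The option names echo the abstract's backup-cooling combinations, but the criterion scores and the "Equal" scenario weights below are purely illustrative placeholders, not the study's data:

```python
import numpy as np

# Illustrative MCDA sketch: options scored on normalised criteria
# (higher = better), combined by a weighted sum under one preference
# scenario. All numbers here are made up for demonstration.
options = ["MB-WC", "TES-DC", "FF-WC"]
criteria = ["capacity factor", "LCOE", "water", "land", "GHG"]
scores = np.array([
    [0.40, 0.90, 0.30, 0.80, 0.70],   # MB-WC
    [0.80, 0.60, 0.90, 0.60, 0.80],   # TES-DC
    [0.90, 0.70, 0.40, 0.70, 0.30],   # FF-WC
])
weights = np.array([0.2, 0.2, 0.2, 0.2, 0.2])  # "Equal" scenario
totals = scores @ weights                       # one score per option
best = options[int(totals.argmax())]
```

Changing the weight vector is all it takes to express a different preference scenario (e.g. weighting GHG emissions heavily for an Environmental scenario), which is how the study compares scenario outcomes.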

  16. Theoretical Analysis of Radiographic Images by Nonstationary Poisson Processes

    NASA Astrophysics Data System (ADS)

    Tanaka, Kazuo; Yamada, Isao; Uchida, Suguru

    1980-12-01

    This paper deals with the noise analysis of radiographic images obtained in the usual fluorescent screen-film system. The theory of nonstationary Poisson processes is applied to the analysis of the radiographic images containing the object information. The ensemble averages, the autocorrelation functions, and the Wiener spectrum densities of the light-energy distribution at the fluorescent screen and of the film optical-density distribution are obtained. The detection characteristics of the system are evaluated theoretically. Numerical examples of the one-dimensional image are shown and the results are compared with those obtained under the assumption that the object image is related to the background noise by the additive process.
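The nonstationary Poisson model underlying this analysis can be illustrated numerically. The minimal sketch below, assuming a simple object transmission map and a nominal mean quantum count, simulates signal-dependent photon-count noise, as opposed to the additive-background assumption the paper compares against:

```python
import numpy as np

def simulate_quantum_image(exposure_map, mean_quanta, seed=0):
    """Simulate photon-count noise for a nonuniform (nonstationary) exposure.

    exposure_map: 2-D array in [0, 1] describing object transmission;
    mean_quanta:  mean photon count at full exposure.
    Each pixel draws from a Poisson law whose rate follows the object,
    so the noise variance tracks the local signal level.
    """
    rng = np.random.default_rng(seed)
    rate = np.asarray(exposure_map, dtype=float) * mean_quanta
    return rng.poisson(rate)
```

A quick check of the Poisson signature: for a uniform half-transmission object, the per-pixel sample mean and sample variance should both be close to half the full-exposure quantum count.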

  17. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
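The two-color ratio principle such a system relies on can be sketched as follows; taking the ratio of two wavelength-filtered images cancels illumination and coating-thickness variation, and a monotonic calibration curve maps the ratio to temperature. The calibration points below are hypothetical stand-ins for a curve measured beforehand with the actual phosphor coating:

```python
import numpy as np

def temperature_from_ratio(img_band1, img_band2, calib_ratio, calib_temp):
    """Two-colour intensity-ratio thermography sketch.

    img_band1, img_band2: images filtered at two emission wavelengths.
    calib_ratio, calib_temp: increasing calibration curve (ratio -> K),
    assumed measured in advance for the temperature-sensitive coating.
    Returns a per-pixel temperature map via linear interpolation.
    """
    b1 = np.asarray(img_band1, dtype=float)
    b2 = np.asarray(img_band2, dtype=float)
    ratio = b1 / np.maximum(b2, 1e-12)   # guard against division by zero
    return np.interp(ratio, calib_ratio, calib_temp)
```
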

  18. Carotid plaque characterization using CT and MRI scans for synergistic image analysis

    NASA Astrophysics Data System (ADS)

    Getzin, Matthew; Xu, Yiqin; Rao, Arhant; Madi, Saaussan; Bahadur, Ali; Lennartz, Michelle R.; Wang, Ge

    2014-09-01

    Noninvasive determination of plaque vulnerability has been a holy grail of medical imaging. Despite advances in tomographic technologies, there is currently no effective way to identify vulnerable atherosclerotic plaques with high sensitivity and specificity. Computed tomography (CT) and magnetic resonance imaging (MRI) are widely used, but neither alone provides sufficient information on plaque properties. Thus, we are motivated to combine CT and MRI to determine whether the composite information can better reflect the histological determination of plaque vulnerability. Two human endarterectomy specimens (one symptomatic carotid and one stable femoral) were imaged using Scanco Medical Viva CT40 and Bruker Pharmascan 16 cm 7 T horizontal MRI/MRS systems. μCT scans were done at 55 kVp and a tube current of 70 mA. Samples underwent RARE-VTR and MSME pulse sequences to measure T1 and T2 values and proton density. The specimens were processed for histology and scored for vulnerability using the American Heart Association criteria. Single-modality analyses were performed through segmentation of key imaging biomarkers (i.e., calcification and lumen), image registration, measurement of the fibrous capsule, and multi-component T1 and T2 decay modeling. Feature differences were analyzed between the unstable and stable controls, i.e., the symptomatic carotid and the femoral plaque, respectively. By building on the techniques used in this study, synergistic CT+MRI analysis may provide a promising solution for plaque characterization in vivo.

  19. Transfer representation learning for medical image analysis.

    PubMed

    Chuen-Kai Shie; Chung-Hisang Chuang; Chun-Nan Chou; Meng-Hsi Wu; Chang, Edward Y

    2015-08-01

    There are two major challenges to overcome when developing a classifier to perform automatic disease diagnosis. First, the amount of labeled medical data is typically very limited, so a classifier cannot be effectively trained to attain high disease-detection accuracy. Second, medical domain knowledge is required to identify representative features in data for detecting a target disease, and most computer scientists and statisticians lack such domain knowledge. In this work, we show that employing transfer learning can remedy both problems. We use Otitis Media (OM) to conduct our case study. Instead of using domain knowledge to extract features from labeled OM images, we construct features based on a dataset entirely unrelated to OM. More specifically, we first learn a codebook in an unsupervised way from 15 million images collected from ImageNet. The codebook gives us what the encoders consider to be the fundamental elements of those 15 million images. We then encode OM images using the codebook and obtain a weighting vector for each OM image. Using the resulting weighting vectors as the feature vectors of the OM images, we employ a traditional supervised learning algorithm to train an OM classifier. The achieved detection accuracy is 88.5% (89.63% sensitivity and 86.9% specificity), markedly higher than all previous attempts, which relied on domain experts to help extract features.
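The codebook encoding step described above, assigning each local descriptor to its nearest learned codeword and histogramming the assignments, can be sketched as a minimal bag-of-visual-words encoder. The toy codebook and descriptors below stand in for the ones learned from ImageNet, and the nearest-centroid assignment is a generic choice, not necessarily the authors' exact encoder:

```python
import numpy as np

def encode_with_codebook(descriptors, codebook):
    """Bag-of-visual-words encoding.

    descriptors: N x D local features extracted from one image
    codebook:    K x D centroids learned on an unrelated corpus
    Each descriptor is assigned to its nearest codeword; the normalised
    histogram of assignments is the image's feature (weighting) vector
    for a downstream supervised classifier.
    """
    descriptors = np.asarray(descriptors, dtype=float)
    codebook = np.asarray(codebook, dtype=float)
    # squared Euclidean distance from every descriptor to every codeword
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    assignments = d2.argmin(axis=1)
    hist = np.bincount(assignments, minlength=len(codebook)).astype(float)
    return hist / hist.sum()
```
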

  20. Whole-breast irradiation: a subgroup analysis of criteria to stratify for prone position treatment

    SciTech Connect

    Ramella, Sara; Trodella, Lucio; Ippolito, Edy; Fiore, Michele; Cellini, Francesco; Stimato, Gerardina; Gaudino, Diego; Greco, Carlo; Ramponi, Sara; Cammilluzzi, Eugenio; Cesarini, Claudio; Piermattei, Angelo; Cesario, Alfredo; D'Angelillo, Rolando Maria

    2012-07-01

    To select, according to breast volume size, those breast cancer patients who may benefit from 3D conformal radiotherapy applied with the prone-position technique after conservative surgery. Thirty-eight patients with early-stage breast cancer were grouped according to the target volume (TV) measured in the supine position: small (≤400 mL), medium (400-700 mL), and large (≥700 mL). An ad hoc designed and built device was used for prone set-up to displace the contralateral breast away from the tangential field borders. All patients underwent treatment-planning computed tomography in both the supine and prone positions. Dosimetric data to explore dose distribution and the volume of normal tissue irradiated were calculated for each patient in both positions. The homogeneity index, hot spot areas, the maximum dose, and the lung constraints were significantly reduced in the prone position (p < 0.05). The maximum heart distance and the V5Gy did not vary consistently between the two positions (p = 0.06 and p = 0.7, respectively). The number of necessary monitor units was significantly higher in the supine position (312 vs. 232, p < 0.0001). The subgroup analysis pointed out the advantage in lung sparing in all TV groups (small, medium, and large) for all the evaluated dosimetric constraints (central lung distance, maximum lung distance, and V5Gy; p < 0.0001). In the small TV group, a dose reduction in nontarget areas of 22% in the prone position was detected (p = 0.056); in the medium and large TV groups, the difference was about -10% (p = NS). The decrease in hot spot areas in nontarget tissues was 73%, 47%, and 80% for small, medium, and large TVs in the prone position, respectively. Although prone breast radiotherapy is normally proposed in patients with large breasts, this study gives evidence of dosimetric benefit in all patient subgroups irrespective of breast volume size.

  1. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  2. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets.

  4. Rapid analysis and exploration of fluorescence microscopy images.

    PubMed

    Pavie, Benjamin; Rajaram, Satwik; Ouyang, Austin; Altschuler, Jason M; Steininger, Robert J; Wu, Lani F; Altschuler, Steven J

    2014-03-19

    Despite rapid advances in high-throughput microscopy, quantitative image-based assays still pose significant challenges. While a variety of specialized image analysis tools are available, most traditional image-analysis-based workflows have steep learning curves (for fine tuning of analysis parameters) and result in long turnaround times between imaging and analysis. In particular, cell segmentation, the process of identifying individual cells in an image, is a major bottleneck in this regard. Here we present an alternate, cell-segmentation-free workflow based on PhenoRipper, an open-source software platform designed for the rapid analysis and exploration of microscopy images. The pipeline presented here is optimized for immunofluorescence microscopy images of cell cultures and requires minimal user intervention. Within half an hour, PhenoRipper can analyze data from a typical 96-well experiment and generate image profiles. Users can then visually explore their data, perform quality control on their experiment, ensure response to perturbations and check reproducibility of replicates. This facilitates a rapid feedback cycle between analysis and experiment, which is crucial during assay optimization. This protocol is useful not just as a first pass analysis for quality control, but also may be used as an end-to-end solution, especially for screening. The workflow described here scales to large data sets such as those generated by high-throughput screens, and has been shown to group experimental conditions by phenotype accurately over a wide range of biological systems. The PhenoBrowser interface provides an intuitive framework to explore the phenotypic space and relate image properties to biological annotations. Taken together, the protocol described here will lower the barriers to adopting quantitative analysis of image based screens.

  5. Histology image analysis for carcinoma detection and grading

    PubMed Central

    He, Lei; Long, L. Rodney; Antani, Sameer; Thoma, George R.

    2012-01-01

    This paper presents an overview of image analysis techniques in the domain of histopathology, specifically for the objective of automated carcinoma detection and classification. As in other biomedical imaging areas such as radiology, many computer-assisted diagnosis (CAD) systems have been implemented to aid histopathologists and clinicians in cancer diagnosis and research; these systems attempt to significantly reduce the labor and subjectivity of traditional manual interpretation of histology images. The task of automated histology image analysis is usually not simple, due to the unique characteristics of histology imaging, including the variability in image preparation techniques, clinical interpretation protocols, and the complex structures and very large size of the images themselves. In this paper we discuss those characteristics, provide relevant background information about slide preparation and interpretation, and review the application of digital image processing techniques to the field of histology image analysis. In particular, emphasis is given to state-of-the-art image segmentation methods for feature extraction and disease classification. Four major carcinomas, of the cervix, prostate, breast, and lung, are selected to illustrate the functions and capabilities of existing CAD systems. PMID:22436890

  6. Research of second harmonic generation images based on texture analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yao; Li, Yan; Gong, Haiming; Zhu, Xiaoqin; Huang, Zufang; Chen, Guannan

    2014-09-01

    Texture analysis plays a crucial role in identifying objects or regions of interest in an image. It has been applied to a variety of medical image processing tasks, ranging from the detection of disease and the segmentation of specific anatomical structures to differentiation between healthy and pathological tissues. Second harmonic generation (SHG) microscopy, a potential noninvasive tool for imaging biological tissues with reduced phototoxicity and photobleaching, has been widely used in medicine. In this paper, we clarify the principles of texture analysis, including statistical, transform, structural, and model-based methods, and give examples of its applications, reviewing studies of the technique. Moreover, we apply texture analysis to SHG images for the differentiation of human skin scar tissues. A texture analysis method based on local binary patterns (LBP) and the wavelet transform was used to extract texture features of SHG images from collagen in normal and abnormal scars, and the scar SHG images were then classified as normal or abnormal. Compared with other texture analysis methods under receiver operating characteristic analysis, LBP combined with the wavelet transform was demonstrated to achieve higher accuracy. It can provide a new way for clinical diagnosis of scar types. Finally, future developments of texture analysis in SHG imaging are discussed.
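A basic 3 × 3 local binary pattern operator of the kind referenced here can be written directly in NumPy. This is a generic sketch of the standard LBP, not the authors' exact pipeline (which additionally involves a wavelet transform):

```python
import numpy as np

def lbp_3x3(img):
    """8-neighbour local binary pattern codes for a 2-D grayscale image.

    Each interior pixel receives an 8-bit code with one bit per neighbour,
    set when that neighbour is >= the centre pixel.
    """
    img = np.asarray(img, dtype=float)
    c = img[1:-1, 1:-1]                        # centre pixels
    # neighbour offsets, clockwise from top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=int)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy : img.shape[0] - 1 + dy,
                 1 + dx : img.shape[1] - 1 + dx]
        code += (nb >= c) * (1 << bit)
    return code

def lbp_histogram(img):
    """Normalised 256-bin histogram of LBP codes: the texture feature."""
    hist = np.bincount(lbp_3x3(img).ravel(), minlength=256).astype(float)
    return hist / hist.sum()
```

The normalised histogram serves as the texture feature vector fed to a classifier; rotation-invariant and "uniform" LBP variants refine this basic code.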

  7. Identifying radiotherapy target volumes in brain cancer by image analysis

    PubMed Central

    Cheng, Kun; Montgomery, Dean; Feng, Yang; Steel, Robin; Liao, Hanqing; McLaren, Duncan B.; Erridge, Sara C.; McLaughlin, Stephen

    2015-01-01

    To establish the optimal radiotherapy fields for treating brain cancer patients, the tumour volume is often outlined on magnetic resonance (MR) images, where the tumour is clearly visible, and mapped onto computerised tomography images used for radiotherapy planning. This process requires considerable clinical experience and is time consuming, a burden that will continue to increase as more complex image sequences are used. Here, the potential of image analysis techniques for automatically identifying the radiation target volume on MR images, and thereby assisting clinicians with this difficult task, was investigated. A gradient-based level set approach was applied to the MR images of five patients with grade II, III and IV malignant cerebral glioma. The relationship between the target volumes produced by image analysis and those produced by a radiation oncologist was also investigated. The contours produced by image analysis were compared with the contours produced by an oncologist and used for treatment. In 93% of cases, the Dice similarity coefficient was found to be between 60 and 80%. This feasibility study demonstrates that image analysis has the potential for automatic outlining in the management of brain cancer patients; however, more testing and validation on a much larger patient cohort is required. PMID:26609418
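The Dice similarity coefficient used above to compare automatic and clinician contours is straightforward to compute from two binary masks:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks.

    DSC = 2|A ∩ B| / (|A| + |B|), ranging from 0 (no overlap) to 1
    (identical masks).
    """
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0   # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / total
```
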

  8. Using Multi-Criteria Analysis for the Study of Human Impact on Agro-Forestry Ecosystem in the Region of Khenchela (algeria)

    NASA Astrophysics Data System (ADS)

    Bouzekri, A.; Benmessaoud, H.

    2016-06-01

    The objective of this work is to study and analyze the human impact on the agro-forestry-pastoral ecosystem of the Khenchela region by applying multi-criteria analysis methods integrated with geographic information systems. Our methodology is based on a weighted linear combination of four criteria chosen as representative of human pressure: proximity to roads, urban areas, water resources, and agricultural land. The results show the effect of urbanization and socio-economic activity on the degradation of the physical environment; 32% of the total area is highly sensitive to human impact.
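The weighted linear combination of criterion layers described above can be sketched for raster data; the weights and the two small rasters below are illustrative, not the study's values:

```python
import numpy as np

def weighted_linear_combination(criteria, weights):
    """Combine normalised criterion rasters into one sensitivity map.

    criteria: list of 2-D arrays, each scaled to [0, 1] (e.g. proximity
    to roads, urban areas, water, agricultural land);
    weights:  one weight per criterion, summing to 1.
    Returns the per-cell weighted sum.
    """
    weights = np.asarray(weights, dtype=float)
    assert np.isclose(weights.sum(), 1.0), "weights must sum to 1"
    stack = np.stack([np.asarray(c, dtype=float) for c in criteria])
    return np.tensordot(weights, stack, axes=1)   # weighted sum per cell
```

Thresholding the resulting map (e.g. cells above 0.7) would delimit the zones classed as highly sensitive.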

  9. Introducing PLIA: Planetary Laboratory for Image Analysis

    NASA Astrophysics Data System (ADS)

    Peralta, J.; Hueso, R.; Barrado, N.; Sánchez-Lavega, A.

    2005-08-01

    We present a graphical software tool developed under IDL software to navigate, process and analyze planetary images. The software has a complete Graphical User Interface and is cross-platform. It can also run under the IDL Virtual Machine without the need to own an IDL license. The set of tools included allow image navigation (orientation, centring and automatic limb determination), dynamical and photometric atmospheric measurements (winds and cloud albedos), cylindrical and polar projections, as well as image treatment under several procedures. Being written in IDL, it is modular and easy to modify and grow for adding new capabilities. We show several examples of the software capabilities with Galileo-Venus observations: Image navigation, photometrical corrections, wind profiles obtained by cloud tracking, cylindrical projections and cloud photometric measurements. Acknowledgements: This work has been funded by Spanish MCYT PNAYA2003-03216, fondos FEDER and Grupos UPV 15946/2004. R. Hueso acknowledges a post-doc fellowship from Gobierno Vasco.

  10. Digital color image analysis of core

    SciTech Connect

    Digoggio, R.; Burleigh, K. )

    1990-05-01

    Geologists often identify sands, shales, or UV-fluorescent zones by their color in photos of slabbed core or sidewalls. Similarly, they observe porosity as blue-dyed epoxy in thin sections. Of course, it is difficult to accurately quantify the amount of sand, shale, fluorescence, or porosity by eye. With digital images, a computer can quantify the area of an image that is close in shade to a selected color, which is particularly useful for determining net sand or net fluorescence in thinly laminated zones. Digital color photography stores a video image as a large array of numbers (512 × 400 × 3 colors) in a computer file. With 32 intensity levels each for red, green, and blue, one can distinguish 32,768 different colors. A fluorescent streak or a shale has some natural variation in color that corresponds to hundreds of very similar shades. Thus, to process a digital image, one picks representative shades of a selected feature (e.g., fluorescence). The computer then calculates the eigenvalues and eigenvectors of the mean-centered covariance matrix of these representative colors. Based on these calculations, it determines which parts of the image have colors similar enough to the representative colors to be considered part of the selected feature. The results show good agreement with independently measured thin-section porosity and with specially prepared images having known amounts of a given color.
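The procedure described, measuring how far each pixel's color lies from the representative shades in units derived from their mean-centered covariance, amounts to a Mahalanobis-distance test in the covariance eigenbasis. A sketch, with a hypothetical distance threshold:

```python
import numpy as np

def color_similarity_mask(image, samples, threshold=3.0):
    """Flag pixels whose colour is close to a set of representative colours.

    image:   H x W x 3 float array (RGB)
    samples: N x 3 array of representative colours picked by the analyst
    A pixel is kept when its Mahalanobis distance from the sample mean is
    below `threshold` (a made-up default; in practice it would be tuned).
    """
    samples = np.asarray(samples, dtype=float)
    mean = samples.mean(axis=0)
    # eigendecomposition of the mean-centred covariance of the samples
    evals, evecs = np.linalg.eigh(np.cov(samples, rowvar=False))
    evals = np.maximum(evals, 1e-12)      # guard against degenerate axes
    diff = np.asarray(image, dtype=float) - mean       # H x W x 3
    proj = diff @ evecs                                # into the eigenbasis
    d2 = (proj ** 2 / evals).sum(axis=-1)              # Mahalanobis squared
    return d2 < threshold ** 2

# fraction of the image matching the selected feature (e.g. fluorescence):
# coverage = color_similarity_mask(img, picked_colors).mean()
```
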

  11. Radar images analysis for scattering surfaces characterization

    NASA Astrophysics Data System (ADS)

    Piazza, Enrico

    1998-10-01

    According to the different problems and techniques related to the detection and recognition of airplanes and vehicles moving on the airport surface, the present work mainly deals with the processing of images gathered by a high-resolution radar sensor. The radar images used to test the investigated algorithms come from sequences of images obtained in field experiments carried out by the Electronic Engineering Department of the University of Florence. The radar is the Ka-band radar operating at the 'Leonardo da Vinci' Airport in Fiumicino (Rome). The images obtained from the radar scan converter are digitized and expressed in x, y (pixel) coordinates. For a correct matching of the images, these are corrected to true geometrical coordinates (meters) on the basis of fixed points on an airport map. By correlating the airplane 2-D multipoint template with actual radar images, the value of the signal at the points involved in the template can be extracted. Results for many observations show a typical response for the main sections of the fuselage and the wings. For the fuselage, the back-scattered echo is low at the prow, becomes larger near the center of the aircraft, and then decreases again toward the tail. For the wings, the signal grows with a fairly regular slope from the fuselage to the tips, where the signal is the strongest.

  12. Unsupervised analysis of small animal dynamic Cerenkov luminescence imaging

    NASA Astrophysics Data System (ADS)

    Spinelli, Antonello E.; Boschi, Federico

    2011-12-01

    Clustering analysis (CA) and principal component analysis (PCA) were applied to dynamic Cerenkov luminescence imaging (dCLI). In order to investigate the performance of the proposed approaches, two distinct dynamic data sets obtained by injecting mice with 32P-ATP and 18F-FDG were acquired using the IVIS 200 optical imager. The k-means clustering algorithm was applied to dCLI and implemented using Interactive Data Language (IDL) 8.1. We show that cluster analysis yields good agreement between the clusters and the corresponding emission regions, such as the bladder, the liver, and the tumor. We also show a good correspondence between the time-activity curves of the different regions obtained using CA and those obtained by manual region-of-interest analysis on the dCLI and PCA images. We conclude that CA provides an automatic, unsupervised method for the analysis of preclinical dynamic Cerenkov luminescence imaging data.
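As a rough illustration of the clustering step (the authors used IDL 8.1; this is a hypothetical Python sketch, and it uses a deterministic farthest-point initialisation rather than the usual random one), k-means can be applied to the per-pixel time-activity curves of a dynamic image stack:

```python
import numpy as np

def kmeans_tac(frames, k, n_iter=50):
    """Cluster per-pixel time-activity curves; frames: (T, H, W) stack."""
    T, H, W = frames.shape
    X = frames.reshape(T, -1).T.astype(float)   # one curve per pixel
    # Deterministic farthest-point initialisation
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        # Assign each curve to its nearest cluster center
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dist.argmin(axis=1)
        # Update each cluster mean
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels.reshape(H, W)
```

Each resulting label map groups pixels whose uptake kinetics are similar, which is how clusters can line up with emission regions such as bladder, liver, or tumor.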

  13. A grid service-based tool for hyperspectral imaging analysis

    NASA Astrophysics Data System (ADS)

    Carvajal, Carmen L.; Lugo, Wilfredo; Rivera, Wilson; Sanabria, John

    2005-06-01

    This paper outlines the design and implementation of Grid-HSI, a Service Oriented Architecture-based Grid application to enable hyperspectral imaging analysis. Grid-HSI provides users with a transparent interface to access computational resources and remotely perform hyperspectral imaging analysis through a set of Grid services. Grid-HSI is composed of a Portal Grid Interface, a Data Broker and a set of specialized Grid services. Grid-based applications, contrary to other client-server approaches, provide the capabilities of persistence and potentially transient processes on the web. Our experimental results on Grid-HSI show the suitability of the prototype system for performing hyperspectral imaging analysis efficiently.

  14. Repetition, Power Imbalance, and Intentionality: Do These Criteria Conform to Teenagers' Perception of Bullying? A Role-Based Analysis

    ERIC Educational Resources Information Center

    Cuadrado-Gordillo, Isabel

    2012-01-01

    The criteria that researchers use to classify aggressive behaviour as bullying are "repetition", "power imbalance", and "intent to hurt". However, studies that have analyzed adolescents' perceptions of bullying find that most adolescents do not simultaneously consider these three criteria. This paper examines adolescents' perceptions of bullying…

  15. An exploratory analysis of Indiana and Illinois biotic assemblage data in support of state nutrient criteria development

    EPA Science Inventory

    EPA recognizes the importance of nutrient criteria in protecting designated uses from eutrophication effects associated with elevated phosphorus and nitrogen in streams and has worked with states over the past 12 years to assist them in developing nutrient criteria. Towards that ...

  16. An Analysis of the Selection Criteria for the Eighth Grade Algebra I Accelerated Mathematics Program in Harrison County, West Virginia.

    ERIC Educational Resources Information Center

    Schrecongost, Jonette

    This study analyzed the criteria used in Harrison County, WV, to select students to participate in an accelerated mathematics program. The program's main component is an eighth grade Algebra I course that enables the students to complete five years of college preparatory mathematics, ending with calculus. The scores used as selection criteria,…

  17. Geopositioning Precision Analysis of Multiple Image Triangulation Using Lro Nac Lunar Images

    NASA Astrophysics Data System (ADS)

    Di, K.; Xu, B.; Liu, B.; Jia, M.; Liu, Z.

    2016-06-01

    This paper presents an empirical analysis of the geopositioning precision of multiple-image triangulation using Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images at the Chang'e-3 (CE-3) landing site. Nine LROC NAC images are selected for comparative analysis of geopositioning precision. Rigorous sensor models of the images are established based on collinearity equations with interior and exterior orientation elements retrieved from the corresponding SPICE kernels. Rational polynomial coefficients (RPCs) of each image are derived by least-squares fitting using a vast number of virtual control points generated according to the rigorous sensor models. Experiments with different combinations of images are performed for comparison. The results demonstrate that the plane coordinates can achieve a precision of 0.54 m to 2.54 m, with a height precision of 0.71 m to 8.16 m, when only two images are used for three-dimensional triangulation. There is a general trend that the geopositioning precision, especially the height precision, improves as the convergence angle of the two images increases from several degrees to about 50°. However, the image matching precision should also be taken into consideration when choosing image pairs for triangulation. The precisions using all nine images are 0.60 m, 0.50 m, and 1.23 m in the along-track, cross-track, and height directions, which are better than most combinations of two or more images. However, triangulation with fewer, carefully selected images can produce better precision than using all the images.

  18. Multi-Scale Fractal Analysis of Image Texture and Pattern

    NASA Technical Reports Server (NTRS)

    Emerson, Charles W.; Lam, Nina Siu-Ngan; Quattrochi, Dale A.

    1999-01-01

    Analyses of the fractal dimension of Normalized Difference Vegetation Index (NDVI) images of homogeneous land covers near Huntsville, Alabama revealed that the fractal dimension of an image of an agricultural land cover indicates greater complexity as pixel size increases, a forested land cover gradually grows smoother, and an urban image remains roughly self-similar over the range of pixel sizes analyzed (10 to 80 meters). A similar analysis of Landsat Thematic Mapper images of the East Humboldt Range in Nevada taken four months apart shows a more complex relation between pixel size and fractal dimension. The major visible difference between the spring and late summer NDVI images is the absence of high-elevation snow cover in the summer image. This change significantly alters the relation between fractal dimension and pixel size. The slope of the fractal dimension-resolution relation provides indications of how image classification or feature identification will be affected by changes in sensor spatial resolution.
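The abstract does not detail how the fractal dimension is estimated; box counting is one common estimator, sketched here for a binary mask (illustrative only, not necessarily the method the authors used):

```python
import numpy as np

def box_count_dimension(mask, sizes=(1, 2, 4, 8)):
    """Estimate the fractal dimension of a binary mask by box counting."""
    h, w = mask.shape
    counts = []
    for s in sizes:
        # Tile the mask into s-by-s boxes and count boxes containing any pixel
        grid = mask[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(grid.any(axis=(1, 3)).sum())
    # Dimension = slope of log(count) vs log(1/box size)
    coeffs = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return coeffs[0]
```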

  20. Independent component analysis based filtering for penumbral imaging

    SciTech Connect

    Chen Yenwei; Han Xianhua; Nozaki, Shinya

    2004-10-01

    We propose a filtering method based on independent component analysis (ICA) for Poisson noise reduction. In the proposed filtering, the image is first transformed to the ICA domain and then the noise components are removed by soft thresholding (shrinkage). The proposed filter, which is used as a preprocessing step for the reconstruction, has been successfully applied to penumbral imaging. Both simulation results and experimental results show that the reconstructed image is dramatically improved in comparison to that obtained without the noise-removing filter.
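The soft-thresholding (shrinkage) operator applied to the ICA-domain coefficients has the standard form below; the ICA transform itself is omitted here:

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Shrink transform-domain coefficients toward zero by threshold t.

    Coefficients with magnitude below t (assumed noise) are zeroed;
    larger ones are reduced in magnitude by t.
    """
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)
```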

  1. Basic research planning in mathematical pattern recognition and image analysis

    NASA Technical Reports Server (NTRS)

    Bryant, J.; Guseman, L. F., Jr.

    1981-01-01

    Fundamental problems encountered while attempting to develop automated techniques for applications of remote sensing are discussed under the following categories: (1) geometric and radiometric preprocessing; (2) spatial, spectral, temporal, syntactic, and ancillary digital image representation; (3) image partitioning, proportion estimation, and error models in object scene inference; (4) parallel processing and image data structures; and (5) continuing studies in polarization; computer architectures and parallel processing; and the applicability of "expert systems" to interactive analysis.

  2. Visual Pattern Analysis in Histopathology Images Using Bag of Features

    NASA Astrophysics Data System (ADS)

    Cruz-Roa, Angel; Caicedo, Juan C.; González, Fabio A.

    This paper presents a framework to analyse visual patterns in a collection of medical images in a two-stage procedure. First, a set of representative visual patterns from the image collection is obtained by constructing a visual-word dictionary under a bag-of-features approach. Second, an analysis of the relationships between visual patterns and semantic concepts in the image collection is performed. The most important visual patterns for each semantic concept are identified using correlation analysis. A matrix visualization of the structure and organization of the image collection is generated using cluster analysis. The experimental evaluation was conducted on a histopathology image collection, and the results showed clear relationships between visual patterns and semantic concepts that, in addition, are easy to interpret and understand.

  3. A unified approach to image focus and defocus analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yen-Fu

    1998-09-01

    Recovering the three-dimensional (3D) information lost due to the projection of a 3D scene onto a two-dimensional (2D) image plane is an important research area in computer vision. In this thesis we present a new approach to reconstruct a highly accurate 3D shape and focused image of an object from a sequence of noisy defocused images. This new approach, Unified Focus and Defocus Analysis (UFDA), unifies two approaches, Image Focus Analysis (IFA) and Image Defocus Analysis (IDA), which have been treated separately in the research literature so far. UFDA is based on modeling the sensing of defocused images in a camera system. The concept of a "Three-Dimensional Point Spread Function" (3D PSF) in the (x, y, d) space is introduced, where x and y are the image spatial coordinates and d is a parameter representing the level of defocus. The importance of the choice of this parameterization is that it facilitates the derivation of a 3D convolution equation for image formation under certain weak conditions. The problem of 3D shape and focused image reconstruction is formulated as an optimization problem where the difference (mean-square error) between the observed image data and the estimated image data is minimized by an optimization approach. The estimated image data is obtained from the image sensing model and the current best known solutions for the 3D shape and focused image. Depending on the number of images in the sequence, an initial estimate of the solution can be obtained through IFA or IDA methods. Three optimization techniques have been applied to UFDA: a classical gradient descent approach, a local search method, and a regularization technique. Based on these techniques, an efficient computational algorithm has been developed to use a variable number of images. A parallel implementation of UFDA on the Parallel Virtual Machine (PVM) is also investigated. One of the most computationally intensive parts of the UFDA approach is the estimation of image data that

  4. Can Physicians Identify Inappropriate Nuclear Stress Tests? An Examination of Inter-rater Reliability for the 2009 Appropriate Use Criteria for Radionuclide Imaging

    PubMed Central

    Ye, Siqin; Rabbani, LeRoy E.; Kelly, Christopher R.; Kelly, Maureen R.; Lewis, Matthew; Paz, Yehuda; Peck, Clara L.; Rao, Shaline; Bokhari, Sabahat; Weiner, Shepard D.; Einstein, Andrew J.

    2014-01-01

    Background We sought to determine inter-rater reliability of the 2009 Appropriate Use Criteria (AUC) for radionuclide imaging (RNI) and whether physicians at various levels of training can effectively identify nuclear stress tests with inappropriate indications. Methods and Results Four hundred patients were randomly selected from a consecutive cohort of patients undergoing nuclear stress testing at an academic medical center. Raters with different levels of training (including cardiology attending physicians, cardiology fellows, internal medicine hospitalists, and internal medicine interns) classified individual nuclear stress tests using the 2009 AUC. Consensus classification by two cardiologists was considered the operational gold standard, and sensitivity and specificity of individual raters for identifying inappropriate tests was calculated. Inter-rater reliability of the AUC was assessed using Cohen’s kappa statistics for pairs of different raters. The mean age of patients was 61.5 years; 214 (54%) were female. The cardiologists rated 256 (64%) of 400 NSTs as appropriate, 68 (18%) as uncertain, 55 (14%) as inappropriate; 21 (5%) tests were unable to be classified. Inter-rater reliability for non-cardiologist raters was modest (unweighted Cohen’s kappa, 0.51, 95% confidence interval, 0.45 to 0.55). Sensitivity of individual raters for identifying inappropriate tests ranged from 47% to 82%, while specificity ranged from 85% to 97%. Conclusions Inter-rater reliability for the 2009 AUC for RNI is modest, and there is considerable variation in the ability of raters at different levels of training to identify inappropriate tests. PMID:25563660
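Unweighted Cohen's kappa, as reported above, compares observed agreement with the agreement expected by chance from each rater's marginal frequencies; a minimal sketch (function name hypothetical):

```python
def cohens_kappa(rater1, rater2):
    """Unweighted Cohen's kappa for two raters' categorical ratings."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal frequencies
    p_e = sum(
        (rater1.count(c) / n) * (rater2.count(c) / n) for c in categories
    )
    return (p_o - p_e) / (1 - p_e)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is why a value around 0.5, as found here, is described as modest.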

  5. Challenges and opportunities for quantifying roots and rhizosphere interactions through imaging and image analysis.

    PubMed

    Downie, H F; Adu, M O; Schmidt, S; Otten, W; Dupuy, L X; White, P J; Valentine, T A

    2015-07-01

    The morphology of roots and root systems influences the efficiency by which plants acquire nutrients and water, anchor themselves and provide stability to the surrounding soil. Plant genotype and the biotic and abiotic environment significantly influence root morphology, growth and ultimately crop yield. The challenge for researchers interested in phenotyping root systems is, therefore, not just to measure roots and link their phenotype to the plant genotype, but also to understand how the growth of roots is influenced by their environment. This review discusses progress in quantifying root system parameters (e.g. in terms of size, shape and dynamics) using imaging and image analysis technologies, and also discusses their potential for providing a better understanding of root:soil interactions. Significant progress has been made in image acquisition techniques; however, trade-offs exist between sample throughput, sample size, image resolution and information gained. All of these factors impact on downstream image analysis processes. While there have been significant advances in computational power, limitations still exist in the statistical processes involved in image analysis. Utilizing and combining different imaging systems, integrating measurements and image analysis where possible, and amalgamating data will allow researchers to gain a better understanding of root:soil interactions.

  6. A critical assessment of the performance criteria in confirmatory analysis for veterinary drug residue analysis using mass spectrometric detection in selected reaction monitoring mode.

    PubMed

    Berendsen, Bjorn J A; Meijer, Thijs; Wegh, Robin; Mol, Hans G J; Smyth, Wesley G; Armstrong Hewitt, S; van Ginkel, Leen; Nielen, Michel W F

    2016-05-01

    Besides the identification point system to assure adequate set-up of instrumentation, European Commission Decision 2002/657/EC includes performance criteria regarding relative ion abundances in mass spectrometry and chromatographic retention time. In confirmatory analysis, the relative abundance of two product ions acquired in selected reaction monitoring (SRM) mode, the ion ratio, should be within certain ranges for confirmation of the identity of a substance. The acceptable tolerance of the ion ratio varies with the relative abundance of the two product ions, and for retention time, CD 2002/657/EC allows a tolerance of 5%. Because of rapid technical advances in analytical instruments and new approaches applied in the field of contaminant testing in food products (multi-compound and multi-class methods), a critical assessment of these criteria is justified. In this study a large number of representative, though challenging, sample extracts were prepared, including muscle, urine, milk and liver, spiked with 100 registered and banned veterinary drugs at levels ranging from 0.5 to 100 µg/kg. These extracts were analysed in SRM mode using different chromatographic conditions and mass spectrometers from different vendors. In the initial study, robust data were collected using four different instrumental set-ups. Based on a unique and highly relevant data set, consisting of over 39 000 data points, the ion ratio and retention time criteria for applicability in confirmatory analysis were assessed. The outcomes were verified based on a collaborative trial including laboratories from all over the world. It was concluded that the ion ratio deviation is not related to the value of the ion ratio, but rather to the intensity of the lowest product ion. Therefore a fixed ion ratio deviation tolerance of 50% (relative) is proposed, which is also applicable for compounds present at sub-ppb levels or having poor ionisation efficiency. Furthermore, it was observed that retention time

  8. Uncooled LWIR imaging: applications and market analysis

    NASA Astrophysics Data System (ADS)

    Takasawa, Satomi

    2015-05-01

    The evolution of infrared (IR) imaging sensor technology for the defense market has played an important role in developing the commercial market, as dual use of the technology has expanded. In particular, technologies for both reduction in pixel pitch and vacuum packaging have evolved drastically in the area of uncooled long-wave IR (LWIR; 8-14 μm wavelength region) imaging sensors, increasing the opportunity to create new applications. From a macroscopic point of view, the uncooled LWIR imaging market is divided into two areas. One is a high-end market where uncooled LWIR imaging sensors with sensitivity as close as possible to that of cooled sensors are required, while the other is a low-end market driven by miniaturization and price reduction. In the latter case especially, approaches towards the consumer market have recently appeared, such as applications of uncooled LWIR imaging sensors to night vision for automobiles and smart phones. The appearance of such commodities will surely change existing business models. Further technological innovation is necessary for creating a consumer market, and there will be room for other companies, supplying components and materials such as lens materials and getter materials, to enter the consumer market.

  9. Texture Analysis for Classification of Risat-Ii Images

    NASA Astrophysics Data System (ADS)

    Chakraborty, D.; Thakur, S.; Jeyaram, A.; Krishna Murthy, Y. V. N.; Dadhwal, V. K.

    2012-08-01

    RISAT-II, or Radar Imaging Satellite-II, is a microwave imaging satellite launched by ISRO to take images of the earth during day and night as well as in all weather conditions. This satellite enhances ISRO's capability for disaster management applications together with forestry, agricultural, urban and oceanographic applications. Conventional pixel-based classification techniques cannot classify these types of images since they do not take into account the texture information of the image. This paper presents a method to classify high-resolution RISAT-II microwave images based on texture analysis. It suppresses the speckle noise in the microwave image before analysing its texture, since speckle is essentially a form of noise that degrades the quality of an image and makes interpretation (visual or digital) more difficult. A local adaptive median filter is developed that uses local statistics to detect the speckle noise of the microwave image and to replace it with a local median value. A Local Binary Pattern (LBP) operator is proposed to measure the texture around each pixel of the speckle-suppressed microwave image. It considers a series of circles (2D) centered on the pixel with incremental radius values, and the intersected pixels on the perimeter of the circles of radius r (where r = 1, 3 and 5) are used for measuring the LBP of the center pixel. The significance of LBP is that it measures the texture around each pixel of the image and is computationally simple. The ISODATA method is used to cluster the transformed LBP image. The proposed method adequately classifies RISAT-II X-band microwave images without human intervention.
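For reference, the basic radius-1, 8-neighbour LBP code can be computed as below (the abstract's variant samples circles of radius 1, 3 and 5; this simplified sketch uses only the immediate neighbours):

```python
import numpy as np

def lbp8(image):
    """Basic 8-neighbour LBP code for each interior pixel of a 2D image."""
    c = image[1:-1, 1:-1]                       # center pixels
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        # Shifted view of the neighbour at this offset
        nb = image[1 + dy:image.shape[0] - 1 + dy,
                   1 + dx:image.shape[1] - 1 + dx]
        # Set this bit wherever the neighbour is >= the center
        code |= (nb >= c).astype(np.uint8) << bit
    return code
```

Each pixel gets an 8-bit code summarizing its local texture, and a histogram or clustering (ISODATA in the abstract) of these codes characterizes the region.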

  10. Automatic analysis of a skull fracture based on image content

    NASA Astrophysics Data System (ADS)

    Shao, Hong; Zhao, Hong

    2003-09-01

    Automatic analysis based on image content is an active research area in medical image diagnosis technology with a bright future. Analysis of skull fractures can help doctors diagnose. In this paper, a new approach is proposed to automatically detect skull fractures based on CT image content. First, a region-growing method, whose seeds and growing rules are chosen dynamically by k-means clustering, is applied for automatic image segmentation. The segmented region boundary is found by boundary tracing. Then the shape of the boundary is analyzed, and the circularity measure is taken as the description parameter. Finally, the rules for computer-aided automatic diagnosis of skull fractures are derived using an entropy function. This method is used to analyze the images from the layer below the third ventricle to the top layer of the cerebral cortex. Experimental results show that the recognition rate is 100% for the 100 images, which were chosen randomly from a medical image database and were not included in the training examples. This method integrates color and shape features, and is not affected by image size or position. This research achieves a high recognition rate and sets a basis for automatic analysis of brain images.
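The abstract does not state which circularity formula is used; a common choice, assumed here, is 4πA/P², which equals 1 for a perfect circle and decreases for irregular boundaries:

```python
import math

def circularity(area, perimeter):
    """4*pi*A / P**2: 1.0 for a perfect circle, < 1 for irregular shapes."""
    return 4.0 * math.pi * area / (perimeter ** 2)
```

A traced skull boundary that deviates strongly from a smooth oval (e.g. at a fracture) would score noticeably below the value of an intact boundary.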

  11. Ringed impact craters on Venus: An analysis from Magellan images

    NASA Technical Reports Server (NTRS)

    Alexopoulos, Jim S.; Mckinnon, William B.

    1992-01-01

    We have analyzed cycle 1 Magellan images covering approximately 90 percent of the venusian surface and have identified 55 unequivocal peak-ring craters and multiringed impact basins. This comprehensive study (52 peak-ring craters and at least 3 multiringed impact basins) complements our earlier independent analysis of Arecibo and Venera images and initial Magellan data and that of the Magellan team.

  12. Higher Education Institution Image: A Correspondence Analysis Approach.

    ERIC Educational Resources Information Center

    Ivy, Jonathan

    2001-01-01

    Investigated how marketing is used to convey higher education institution type image in the United Kingdom and South Africa. Using correspondence analysis, revealed the unique positionings created by old and new universities and technikons in these countries. Also identified which marketing tools they use in conveying their image. (EV)

  13. Four challenges in medical image analysis from an industrial perspective.

    PubMed

    Weese, Jürgen; Lorenz, Cristian

    2016-10-01

    Today's medical imaging systems produce a huge amount of images containing a wealth of information. However, the information is hidden in the data, and image analysis algorithms are needed to extract it, to make it readily available for medical decisions and to enable an efficient workflow. Advances in medical image analysis over the past 20 years mean there are now many algorithms and ideas available that make it possible to address medical image analysis tasks in commercial solutions with sufficient performance in terms of accuracy, reliability and speed. At the same time new challenges have arisen. Firstly, there is a need for more generic image analysis technologies that can be efficiently adapted for a specific clinical task. Secondly, efficient approaches for ground truth generation are needed to match the increasing demands regarding validation and machine learning. Thirdly, algorithms for analyzing heterogeneous image data are needed. Finally, anatomical and organ models play a crucial role in many applications, and algorithms to construct patient-specific models from medical images with a minimum of user interaction are needed. These challenges are complementary to the ongoing need for more accurate, more reliable and faster algorithms, and dedicated algorithmic solutions for specific applications.

  14. An Online Image Analysis Tool for Science Education

    ERIC Educational Resources Information Center

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  15. Disability in Physical Education Textbooks: An Analysis of Image Content

    ERIC Educational Resources Information Center

    Taboas-Pais, Maria Ines; Rey-Cao, Ana

    2012-01-01

    The aim of this paper is to show how images of disability are portrayed in physical education textbooks for secondary schools in Spain. The sample was composed of 3,316 images published in 36 textbooks by 10 publishing houses. A content analysis was carried out using a coding scheme based on categories employed in other similar studies and adapted…

  16. System Matrix Analysis for Computed Tomography Imaging

    PubMed Central

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
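A Siddon-style traversal computes, for each ray, its intersection length with every pixel it crosses; these lengths are the entries of the system matrix. A simplified 2D sketch over a unit pixel grid (function name hypothetical; Siddon's original algorithm computes the crossing parameters incrementally rather than by enumeration, which is what makes it fast):

```python
import numpy as np

def ray_pixel_lengths(p0, p1, nx, ny):
    """Intersection lengths of segment p0 -> p1 with an nx-by-ny unit grid.

    Returns {(ix, iy): length} for every pixel the ray crosses.
    """
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    # Parametric values alpha in [0, 1] where the ray crosses grid lines
    alphas = {0.0, 1.0}
    for axis, n in ((0, nx), (1, ny)):
        if d[axis] != 0:
            for i in range(n + 1):
                a = (i - p0[axis]) / d[axis]
                if 0 < a < 1:
                    alphas.add(a)
    alphas = sorted(alphas)
    seg_len = np.linalg.norm(d)
    lengths = {}
    for a0, a1 in zip(alphas[:-1], alphas[1:]):
        # Midpoint of the sub-segment identifies the pixel it lies in
        mid = p0 + 0.5 * (a0 + a1) * d
        ix, iy = int(mid[0]), int(mid[1])
        if 0 <= ix < nx and 0 <= iy < ny:
            lengths[(ix, iy)] = lengths.get((ix, iy), 0.0) + (a1 - a0) * seg_len
    return lengths
```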

  19. The ImageJ ecosystem: An open platform for biomedical image analysis.

    PubMed

    Schindelin, Johannes; Rueden, Curtis T; Hiner, Mark C; Eliceiri, Kevin W

    2015-01-01

    Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available-from commercial to academic, special-purpose to Swiss army knife, small to large-but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on the life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts the life sciences, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368

  20. Analysis of PETT images in psychiatric disorders

    SciTech Connect

    Brodie, J.D.; Gomez-Mont, F.; Volkow, N.D.; Corona, J.F.; Wolf, A.P.; Wolkin, A.; Russell, J.A.G.; Christman, D.; Jaeger, J.

    1983-01-01

    A quantitative method is presented for studying the pattern of metabolic activity in a set of Positron Emission Transaxial Tomography (PETT) images. Using complex Fourier coefficients as a feature vector for each image, cluster, principal components, and discriminant function analyses are used to empirically describe metabolic differences between control subjects and patients with DSM III diagnosis for schizophrenia or endogenous depression. We also present data on the effects of neuroleptic treatment on the local cerebral metabolic rate of glucose utilization (LCMRGI) in a group of chronic schizophrenics using the region of interest approach. 15 references, 4 figures, 3 tables.
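The feature-extraction step described (complex Fourier coefficients as a feature vector, followed by principal components analysis) can be sketched as follows. The data are synthetic stand-ins, not PETT scans, and the median split is a deliberately simple stand-in for the cluster and discriminant analyses used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fourier_features(img, k=4):
    """Low-frequency complex Fourier coefficients of an image, flattened
    into a real feature vector (real and imaginary parts)."""
    c = np.fft.fft2(img)[:k, :k]
    return np.concatenate([c.real.ravel(), c.imag.ravel()])

# two hypothetical groups of 8x8 "metabolic maps" with different spatial patterns
yy, xx = np.mgrid[0:8, 0:8]
group_a = [np.sin(xx / 2.0) + 0.1 * rng.standard_normal((8, 8)) for _ in range(10)]
group_b = [np.cos(yy / 2.0) + 0.1 * rng.standard_normal((8, 8)) for _ in range(10)]
X = np.array([fourier_features(im) for im in group_a + group_b])

# principal components via SVD of the centered feature matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                  # project onto the first two components

# the groups separate along the leading component; split at the median
labels = (scores[:, 0] > np.median(scores[:, 0])).astype(int)
```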

  1. SLAR image interpretation keys for geographic analysis

    NASA Technical Reports Server (NTRS)

    Coiner, J. C.

    1972-01-01

    A means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture is suggested by providing interpretation keys as an easily implemented interpretation model. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.

  3. Value-Based Assessment of New Medical Technologies: Towards a Robust Methodological Framework for the Application of Multiple Criteria Decision Analysis in the Context of Health Technology Assessment.

    PubMed

    Angelis, Aris; Kanavos, Panos

    2016-05-01

    In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making. PMID:26739955
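The additive multi-attribute value model at the core of many MAVT applications can be sketched briefly. The criteria, weights, worst/best levels, and alternatives below are hypothetical, not drawn from the paper; the sketch only shows the mechanics of partial value functions and weighted aggregation.

```python
# a minimal additive MAVT sketch with hypothetical criteria and alternatives
def linear_value(x, worst, best):
    """Partial value function: map an attribute level to [0, 1]."""
    return (x - worst) / (best - worst)

criteria = {            # weight, worst level, best level (assumed values)
    "efficacy": (0.5, 0.0, 10.0),
    "safety":   (0.3, 0.0, 10.0),
    "evidence": (0.2, 0.0, 10.0),
}

alternatives = {        # raw performance on each criterion (assumed values)
    "drug_A": {"efficacy": 8.0, "safety": 6.0, "evidence": 9.0},
    "drug_B": {"efficacy": 9.0, "safety": 3.0, "evidence": 5.0},
}

def overall_value(perf):
    """Weighted sum of partial values across all criteria."""
    return sum(w * linear_value(perf[c], lo, hi)
               for c, (w, lo, hi) in criteria.items())

scores = {name: overall_value(perf) for name, perf in alternatives.items()}
```

Eliciting the weights and the shape of each partial value function from stakeholders is where the methodological care the authors call for comes in; the arithmetic itself is this simple.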

  4. Geostationary microwave imagers detection criteria

    NASA Technical Reports Server (NTRS)

    Stacey, J. M.

    1986-01-01

    Geostationary orbit is investigated as a vantage point from which to sense remotely the surface features of the planet and its atmosphere, with microwave sensors. The geometrical relationships associated with geostationary altitude are developed to produce an efficient search pattern for the detection of emitting media and metal objects. Power transfer equations are derived from the roots of first principles and explain the expected values of the signal-to-clutter ratios for the detection of aircraft, ships, and buoys and for the detection of natural features where they are manifested as cold and warm eddies. The transport of microwave power is described for modeled detection where the direction of power flow is explained by the Zeroth and Second Laws of Thermodynamics. Mathematical expressions are derived that elucidate the detectability of natural emitting media and metal objects. Signal-to-clutter ratio comparisons are drawn among detectable objects that show relative detectability with a thermodynamic sensor and with a short-pulse radar.

  5. Electron Microscopy and Image Analysis for Selected Materials

    NASA Technical Reports Server (NTRS)

    Williams, George

    1999-01-01

    This particular project was completed in collaboration with the metallurgical diagnostics facility. The objective of this research had four major components. First, we required training in the operation of the environmental scanning electron microscope (ESEM) for imaging of selected materials including biological specimens. The types of materials range from cyanobacteria and diatoms to cloth, metals, sand, composites and other materials. Second, to obtain training in surface elemental analysis technology using energy dispersive x-ray (EDX) analysis, and in the preparation of x-ray maps of these same materials. Third, to provide training for the staff of the metallurgical diagnostics and failure analysis team in the area of image processing and image analysis technology using NIH Image software. Finally, we were to assist in the sample preparation, observing, imaging, and elemental analysis for Mr. Richard Hoover, one of NASA MSFC's solar physicists and Marshall's principal scientist for the agency-wide virtual Astrobiology Institute. These materials have been collected from various places around the world including the Fox Tunnel in Alaska, Siberia, Antarctica, ice core samples from near Lake Vostok, thermal vents in the ocean floor, hot springs and many others. We were successful in our efforts to obtain high quality, high resolution images of various materials including selected biological ones. Surface analyses (EDX) and x-ray maps were easily prepared with this technology. We also discovered and used some applications for NIH Image software in the metallurgical diagnostics facility.

  6. Spatially Weighted Principal Component Analysis for Imaging Classification

    PubMed Central

    Guo, Ruixin; Ahn, Mihye; Zhu, Hongtu

    2014-01-01

    The aim of this paper is to develop a supervised dimension reduction framework, called Spatially Weighted Principal Component Analysis (SWPCA), for high dimensional imaging classification. Two main challenges in imaging classification are the high dimensionality of the feature space and the complex spatial structure of imaging data. In SWPCA, we introduce two sets of novel weights including global and local spatial weights, which enable a selective treatment of individual features and incorporation of the spatial structure of imaging data and class label information. We develop an efficient two-stage iterative SWPCA algorithm and its penalized version along with the associated weight determination. We use both simulation studies and real data analysis to evaluate the finite-sample performance of our SWPCA. The results show that SWPCA outperforms several competing principal component analysis (PCA) methods, such as supervised PCA (SPCA), and other competing methods, such as sparse discriminant analysis (SDA). PMID:26089629
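The core idea of weighting features by spatial importance before PCA can be sketched minimally. This implements only a global feature-weighting step followed by ordinary SVD-based PCA, not the full SWPCA algorithm with local weights and penalization; the weight values and synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def weighted_pca(X, w, k=2):
    """PCA after scaling each feature (pixel) by a spatial weight --
    a simplified stand-in for the global weighting step of SWPCA."""
    Xw = (X - X.mean(axis=0)) * w          # emphasize informative pixels
    U, s, Vt = np.linalg.svd(Xw, full_matrices=False)
    return Xw @ Vt[:k].T                   # component scores

# 50 samples of a 20-pixel "image": only pixels 5..9 carry class signal
n, p = 50, 20
y = rng.integers(0, 2, n)
X = rng.standard_normal((n, p))
X[:, 5:10] += 3.0 * y[:, None]

w = np.ones(p)
w[5:10] = 5.0                              # up-weight the informative region
scores = weighted_pca(X, w)
```

With the informative pixels up-weighted, the leading component aligns with the class signal rather than with background noise, which is the selective treatment of features the abstract describes.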

  7. Assessing the vulnerability of Brazilian municipalities to the vectorial transmission of Trypanosoma cruzi using multi-criteria decision analysis.

    PubMed

    Vinhaes, Márcio Costa; de Oliveira, Stefan Vilges; Reis, Priscilleyne Ouverney; de Lacerda Sousa, Ana Carolina; Silva, Rafaella Albuquerque E; Obara, Marcos Takashi; Bezerra, Cláudia Mendonça; da Costa, Veruska Maia; Alves, Renato Vieira; Gurgel-Gonçalves, Rodrigo

    2014-09-01

    Despite the dramatic reduction in Trypanosoma cruzi vectorial transmission in Brazil, acute cases of Chagas disease (CD) continue to be recorded. The identification of areas with greater vulnerability to the occurrence of vector-borne CD is essential to prevention, control, and surveillance activities. In the current study, data on the occurrence of domiciliated triatomines in Brazil (non-Amazonian regions) between 2007 and 2011 were analyzed. Municipalities' vulnerability was assessed based on socioeconomic, demographic, entomological, and environmental indicators using multi-criteria decision analysis (MCDA). Overall, 2275 municipalities were positive for at least one of the six triatomine species analyzed (Panstrongylus megistus, Triatoma infestans, Triatoma brasiliensis, Triatoma pseudomaculata, Triatoma rubrovaria, and Triatoma sordida). The municipalities that were most vulnerable to vector-borne CD were mainly in the northeast region and exhibited a higher occurrence of domiciliated triatomines, lower socioeconomic levels, and more extensive anthropized areas. Most of the 39 new vector-borne CD cases confirmed between 2001 and 2012 in non-Amazonian regions occurred within the more vulnerable municipalities. Thus, MCDA can help to identify the states and municipalities that are most vulnerable to the transmission of T. cruzi by domiciliated triatomines, which is critical for directing adequate surveillance, prevention, and control activities. The methodological approach and results presented here can be used to enhance CD surveillance in Brazil.

  8. Assessing the value of healthcare interventions using multi-criteria decision analysis: a review of the literature.

    PubMed

    Marsh, Kevin; Lanitis, Tereza; Neasham, David; Orfanos, Panagiotis; Caro, Jaime

    2014-04-01

    The objective of this study is to support those undertaking a multi-criteria decision analysis (MCDA) by reviewing the approaches adopted in healthcare MCDAs to date, how these varied with the objective of the study, and the lessons learned from this experience. Searches of EMBASE and MEDLINE identified 40 studies that provided 41 examples of MCDA in healthcare. Data were extracted on the objective of the study, methods employed, and decision makers' and study authors' reflections on the advantages and disadvantages of the methods. The recent interest in MCDA in healthcare is mirrored in an increase in the application of MCDA to evaluate healthcare interventions. Of the studies identified, the first was published in 1990, but more than half were published since 2011. They were undertaken in 18 different countries, and were designed to support investment (coverage and reimbursement), authorization, prescription, and research funding allocation decisions. Many intervention types were assessed: pharmaceuticals, public health interventions, screening, surgical interventions, and devices. Most used the value measurement approach and scored performance using predefined scales. Beyond these similarities, a diverse range of approaches was adopted, with only limited correspondence between the approach and the type of decision or product. Decision makers consulted as part of these studies, as well as the authors of the studies, are positive about the potential of MCDA to improve decision making. Further work is required, however, to develop guidance for those undertaking MCDA.

  9. Segmentation and learning in the quantitative analysis of microscopy images

    NASA Astrophysics Data System (ADS)

    Ruggiero, Christy; Ross, Amy; Porter, Reid

    2015-02-01

    In material science and bio-medical domains the quantity and quality of microscopy images is rapidly increasing and there is a great need to automatically detect, delineate and quantify particles, grains, cells, neurons and other functional "objects" within these images. These are challenging problems for image processing because of the variability in object appearance that inevitably arises in real world image acquisition and analysis. One of the most promising (and practical) ways to address these challenges is interactive image segmentation. These algorithms are designed to incorporate input from a human operator to tailor the segmentation method to the image at hand. Interactive image segmentation is now a key tool in a wide range of applications in microscopy and elsewhere. Historically, interactive image segmentation algorithms have tailored segmentation on an image-by-image basis, and information derived from operator input is not transferred between images. But recently there has been increasing interest to use machine learning in segmentation to provide interactive tools that accumulate and learn from the operator input over longer periods of time. These new learning algorithms reduce the need for operator input over time, and can potentially provide a more dynamic balance between customization and automation for different applications. This paper reviews the state of the art in this area, provides a unified view of these algorithms, and compares the segmentation performance of various design choices.

  10. Image analysis: Applications in materials engineering

    SciTech Connect

    Wojnar, L.

    1999-07-01

    This new practical book describes the basic principles of image acquisition, enhancement, measurement, and interpretation in very simple nonmathematical terms. It also provides solution-oriented algorithms and examples and case histories from industry and research, along with quick reference information on various specific problems. Included are numerous tables, graphs, charts, and working examples in detection of grain boundaries, pores, and chain structures.

  11. Image Segmentation Analysis for NASA Earth Science Applications

    NASA Technical Reports Server (NTRS)

    Tilton, James C.

    2010-01-01

    NASA collects large volumes of imagery data from satellite-based Earth remote sensing sensors. Nearly all of the computerized image analysis of this data is performed pixel-by-pixel, in which an algorithm is applied directly to individual image pixels. While this analysis approach is satisfactory in many cases, it is usually not fully effective in extracting the full information content from the high spatial resolution image data that is now becoming increasingly available from these sensors. The field of object-based image analysis (OBIA) has arisen in recent years to address the need to move beyond pixel-based analysis. The Recursive Hierarchical Segmentation (RHSEG) software developed by the author is being used to facilitate moving from pixel-based image analysis to OBIA. The key unique aspect of RHSEG is that it tightly intertwines region growing segmentation, which produces spatially connected region objects, with region object classification, which groups sets of region objects together into region classes. No other practical, operational image segmentation approach has this tight integration of region growing object finding with region classification. This integration is made possible by the recursive, divide-and-conquer implementation utilized by RHSEG, in which the input image data is recursively subdivided until the image data sections are small enough to successfully mitigate the combinatorial explosion caused by the need to compute the dissimilarity between each pair of image pixels. RHSEG's tight integration of region growing object finding and region classification is what enables the high spatial fidelity of the image segmentations produced by RHSEG. This presentation will provide an overview of the RHSEG algorithm and describe how it is currently being used to support OBIA for Earth Science applications such as snow/ice mapping and finding archaeological sites from remotely sensed data.
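The region-growing ingredient of such segmentation schemes can be sketched with a simple flood-fill region grower. RHSEG itself additionally merges spatially disjoint regions into classes and works recursively on image subdivisions, which this toy version does not attempt; the image, seed, and tolerance are illustrative.

```python
import numpy as np
from collections import deque

def grow_region(img, seed, tol):
    """Flood-fill region growing: collect 4-connected pixels whose
    intensity is within `tol` of the running region mean -- one
    ingredient of hierarchical segmentation schemes such as RHSEG."""
    h, w = img.shape
    mask = np.zeros((h, w), bool)
    mask[seed] = True
    total, count = float(img[seed]), 1
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(img[ny, nx] - total / count) <= tol:
                    mask[ny, nx] = True       # accept pixel into the region
                    total += float(img[ny, nx])
                    count += 1
                    queue.append((ny, nx))
    return mask

# bright 3x3 square on a dark background
img = np.zeros((8, 8))
img[2:5, 2:5] = 10.0
mask = grow_region(img, (3, 3), tol=1.0)
```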

  12. Automated Analysis of Mammography Phantom Images

    NASA Astrophysics Data System (ADS)

    Brooks, Kenneth Wesley

    The present work stems from the hypothesis that humans are inconsistent when making subjective analyses of images and that human decisions for moderately complex images may be performed by a computer with complete objectivity, once a human acceptance level has been established. The following goals were established to test the hypothesis: (1) investigate observer variability within the standard mammographic phantom evaluation process; (2) evaluate options for high-resolution image digitization and utilize the most appropriate technology for standard mammographic phantom film digitization; (3) develop a machine-based vision system for evaluating standard mammographic phantom images to eliminate effects of human variabilities; and (4) demonstrate the completed system's performance against human observers for accreditation and for manufacturing quality control of standard mammographic phantom images. The following methods and procedures were followed to achieve the goals of the research: (1) human variabilities in the American College of Radiology accreditation process were simulated by observer studies involving 30 medical physicists and these were compared to the same number of diagnostic radiologists and an untrained control group of observers; (2) current digitization technologies were presented and performance test procedures were developed; three devices were tested which represented commercially available high, intermediate and low-end contrast and spatial resolution capabilities; (3) optimal image processing schemes were applied and tested which performed low, intermediate and high-level computer vision tasks; and (4) the completed system's performance was tested against human observers for accreditation and for manufacturing quality control of standard mammographic phantom images. The results from application of the procedures were as follows: (1) the simulated American College of Radiology mammography accreditation program phantom evaluation process demonstrated

  13. Blind image analysis for the compositional and structural characterization of plant cell walls.

    PubMed

    Perera, Pradeep N; Schmidt, Martin; Schuck, P James; Adams, Paul D

    2011-09-30

    A new image analysis strategy is introduced to determine the composition and the structural characteristics of plant cell walls by combining Raman microspectroscopy and unsupervised data mining methods. The proposed method consists of three main steps: spectral preprocessing, spatial clustering of the image and finally estimation of spectral profiles of pure components and their weights. Point spectra of Raman maps of cell walls were preprocessed to remove noise and fluorescence contributions and compressed with PCA. Processed spectra were then subjected to k-means clustering to identify spatial segregations in the images. Cell wall images were reconstructed with cluster identities and each cluster was represented by the average spectrum of all the pixels in the cluster. Pure component spectra were estimated by a spectral entropy minimization criterion with simulated annealing optimization. Two pure spectral estimates that represent lignin and carbohydrates were recovered and their spatial distributions were calculated. Our approach partitioned the cell walls into many sublayers, based on their composition, thus enabling composition analysis at subcellular levels. It also overcame the well-known problem that native lignin spectra in lignocellulosics have high spectral overlap with contributions from cellulose and hemicelluloses, thus opening up new avenues for microanalyses of monolignol composition of native lignin and carbohydrates without chemical or mechanical extraction of the cell wall materials.
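The middle stage of the pipeline (PCA compression followed by k-means clustering of pixel spectra) can be sketched as follows. The two Gaussian "bands" and all parameters are synthetic stand-ins for the lignin and carbohydrate signatures, and the entropy-minimization step for pure-component recovery is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def kmeans(X, k, iters=50):
    """Lloyd's k-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])    # next center: farthest point so far
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# hypothetical Raman map: 100 pixel spectra, two components with different bands
wn = np.linspace(0, 1, 40)                       # "wavenumber" axis
comp1 = np.exp(-((wn - 0.3) / 0.05) ** 2)        # lignin-like band (assumed)
comp2 = np.exp(-((wn - 0.7) / 0.05) ** 2)        # carbohydrate-like band (assumed)
spectra = np.vstack([comp1 + 0.05 * rng.standard_normal(40) for _ in range(50)]
                    + [comp2 + 0.05 * rng.standard_normal(40) for _ in range(50)])

# compress with PCA (SVD), then cluster pixels in the reduced space
Xc = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
reduced = Xc @ Vt[:3].T
labels = kmeans(reduced, 2)
```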

  14. Independent component analysis applications on THz sensing and imaging

    NASA Astrophysics Data System (ADS)

    Balci, Soner; Maleski, Alexander; Nascimento, Matheus Mello; Philip, Elizabath; Kim, Ju-Hyung; Kung, Patrick; Kim, Seongsin M.

    2016-05-01

    We report Independent Component Analysis (ICA) technique applied to THz spectroscopy and imaging to achieve a blind source separation. A reference water vapor absorption spectrum was extracted via ICA, then ICA was utilized on a THz spectroscopic image in order to clean the absorption of water molecules from each pixel. For this purpose, silica gel was chosen as the material of interest for its strong water absorption. The resulting image clearly showed that ICA effectively removed the water content in the detected signal, allowing us to image the silica gel beads distinctly even though they were totally embedded in water before ICA was applied.
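A blind source separation of this kind can be sketched with a small FastICA implementation in plain NumPy. The mixing matrix and the two sources below are synthetic stand-ins for the sample signal and a water-absorption-like background; this is not the authors' processing chain.

```python
import numpy as np

rng = np.random.default_rng(3)

def fastica(X, n_iter=200):
    """Symmetric FastICA with a tanh nonlinearity on whitened data X
    (components x samples); returns the unmixing matrix W."""
    n = X.shape[0]
    W = rng.standard_normal((n, n))
    for _ in range(n_iter):
        WX = W @ X
        g, g_prime = np.tanh(WX), 1 - np.tanh(WX) ** 2
        W = (g @ X.T) / X.shape[1] - np.diag(g_prime.mean(axis=1)) @ W
        u, s, vt = np.linalg.svd(W)        # symmetric decorrelation:
        W = u @ vt                         # W <- (W W^T)^(-1/2) W
    return W

# two non-Gaussian sources mixed linearly, as in a contaminated measurement
t = np.linspace(0, 1, 2000)
s1 = np.sign(np.sin(7 * 2 * np.pi * t))          # square-wave "sample" signal
s2 = rng.uniform(-1, 1, t.size)                  # uniform "background" source
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])           # assumed mixing matrix
X = A @ S

# whiten the mixtures, then unmix
Xc = X - X.mean(axis=1, keepdims=True)
cov = Xc @ Xc.T / Xc.shape[1]
d, E = np.linalg.eigh(cov)
Xw = (E @ np.diag(d ** -0.5) @ E.T) @ Xc
recovered = fastica(Xw) @ Xw
```

The recovered components match the sources up to permutation and sign, which is the usual ICA ambiguity; identifying which component is the water background requires a reference spectrum, as the authors do.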

  15. Hyperspectral image analysis using artificial color

    NASA Astrophysics Data System (ADS)

    Fu, Jian; Caulfield, H. John; Wu, Dongsheng; Tadesse, Wubishet

    2010-03-01

    By definition, HSC (HyperSpectral Camera) images are much richer in spectral data than, say, a COTS (Commercial-Off-The-Shelf) color camera. But data are not information. If we do the task right, useful information can be derived from the data in HSC images. Nature faced essentially the identical problem. The incident light is so complex spectrally that measuring it with high resolution would provide far more data than animals can handle in real time. Nature's solution was to do irreversible POCS (Projections Onto Convex Sets) to achieve huge reductions in data with minimal reduction in information. Thus we can arrange for our manmade systems to do what nature did - project the HSC image onto two or more broad, overlapping curves. The task we have undertaken in the last few years is to develop this idea that we call Artificial Color. What we report here is the use of the measured HSC image data projected onto two or three convex, overlapping, broad curves in analogy with the sensitivity curves of human cone cells. Testing two quite different HSC images in that manner produced the desired result: good discrimination or segmentation that can be done very simply and hence are likely to be doable in real time with specialized computers. Using POCS on the HSC data to reduce the processing complexity produced excellent discrimination in those two cases. For technical reasons discussed here, the figures of merit for the kind of pattern recognition we use are incommensurate with the figures of merit of conventional pattern recognition. We used some force fitting to make a comparison nevertheless, because it shows what is also obvious qualitatively. In our tasks our method works better.
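The projection at the heart of Artificial Color, mapping each pixel's spectrum onto a few broad, overlapping sensitivity curves, can be sketched directly. The Gaussian curve centers and widths below are illustrative cone-like choices, not the curves used by the authors.

```python
import numpy as np

# three broad, overlapping Gaussian sensitivity curves (loosely analogous to
# human cone responses; centers and widths are assumed, not from the paper)
wavelengths = np.linspace(400, 700, 60)           # nm

def curve(center, width=60.0):
    return np.exp(-((wavelengths - center) / width) ** 2)

S = np.vstack([curve(440), curve(540), curve(570)])   # "cones", shape 3 x 60

def artificial_color(cube):
    """Project an (H, W, bands) hyperspectral cube onto the broad curves,
    reducing each pixel spectrum to a 3-channel vector."""
    return cube @ S.T                              # shape (H, W, 3)

# toy 2-pixel scene: one long-wavelength and one short-wavelength narrowband pixel
cube = np.zeros((1, 2, 60))
cube[0, 0, np.argmin(abs(wavelengths - 650))] = 1.0
cube[0, 1, np.argmin(abs(wavelengths - 450))] = 1.0
features = artificial_color(cube)
```

The reduction from 60 bands to 3 channels is irreversible, exactly as the abstract argues, yet the channel ratios still discriminate the two pixels.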

  16. Analysis of Multipath Pixels in SAR Images

    NASA Astrophysics Data System (ADS)

    Zhao, J. W.; Wu, J. C.; Ding, X. L.; Zhang, L.; Hu, F. M.

    2016-06-01

    As the received radar signal is the sum of the signal contributions overlaid in a single pixel regardless of travel path, the multipath effect must be tackled seriously: multiple-bounce returns add to the direct scatter echoes and produce ghost scatterers. Most existing solutions to the multipath problem attempt to recover the signal propagation path. To simulate signal propagation, many inputs that determine the strength of the radar signal backscattered to the SAR sensor must be specified in advance, such as sensor parameters, the geometry of the objects (shape, location, orientation, mutual position between adjacent buildings) and the physical parameters of the surface (roughness, correlation length, permittivity). However, it is not practical to obtain a highly detailed object model of an unfamiliar area by field survey, as this is laborious and time-consuming. In this paper, SAR imaging simulation based on RaySAR is conducted first, aiming at a basic understanding of multipath effects and at further comparison. Besides the pre-imaging simulation, the after-imaging product, i.e. the radar images, is also taken into consideration. Both Cosmo-SkyMed ascending and descending SAR images of the Lupu Bridge in Shanghai are used for the experiment. As a result, the reflectivity map and signal distribution map of different bounce levels are simulated and validated against a 3D real model. Statistical indexes such as phase stability, mean amplitude, amplitude dispersion, coherence and mean-sigma ratio in the case of layover are analyzed in combination with the RaySAR output.

  17. PML diagnostic criteria

    PubMed Central

    Aksamit, Allen J.; Clifford, David B.; Davis, Larry; Koralnik, Igor J.; Sejvar, James J.; Bartt, Russell; Major, Eugene O.; Nath, Avindra

    2013-01-01

    Objective: To establish criteria for the diagnosis of progressive multifocal leukoencephalopathy (PML). Methods: We reviewed available literature to identify various diagnostic criteria employed. Several search strategies employing the terms “progressive multifocal leukoencephalopathy” with or without “JC virus” were performed with PubMed, SCOPUS, and EMBASE search engines. The articles were reviewed by a committee of individuals with expertise in the disorder in order to determine the most useful applicable criteria. Results: A consensus statement was developed employing clinical, imaging, pathologic, and virologic evidence in support of the diagnosis of PML. Two separate pathways, histopathologic and clinical, for PML diagnosis are proposed. Diagnostic classification includes certain, probable, possible, and not PML. Conclusion: Definitive diagnosis of PML requires neuropathologic demonstration of the typical histopathologic triad (demyelination, bizarre astrocytes, and enlarged oligodendroglial nuclei) coupled with the techniques to show the presence of JC virus. The presence of clinical and imaging manifestations consistent with the diagnosis and not better explained by other disorders coupled with the demonstration of JC virus by PCR in CSF is also considered diagnostic. Algorithms for establishing the diagnosis have been recommended. PMID:23568998

  18. Automated fine structure image analysis method for discrimination of diabetic retinopathy stage using conjunctival microvasculature images

    PubMed Central

    Khansari, Maziyar M; O’Neill, William; Penn, Richard; Chau, Felix; Blair, Norman P; Shahidi, Mahnaz

    2016-01-01

The conjunctiva is a densely vascularized mucous membrane covering the sclera of the eye, with the unique advantage of accessibility for direct visualization and non-invasive imaging. The purpose of this study was to apply an automated quantitative method for discriminating different stages of diabetic retinopathy (DR) using conjunctival microvasculature images. Fine structural analysis of the images was performed by ordinary least squares regression and Fisher linear discriminant analysis, and conjunctival images from groups of non-diabetic and diabetic subjects at different stages of DR were discriminated. The automated method's discrimination rates were higher than those achieved by human observers. The method allows sensitive and rapid discrimination through assessment of conjunctival microvasculature images and can potentially be useful for DR screening and monitoring. PMID:27446692
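The two-class Fisher linear discriminant used for the discrimination step can be sketched as below, on synthetic feature vectors; the actual conjunctival microvasculature features of the study are not reproduced here:

```python
import numpy as np

def fisher_direction(X0, X1):
    """Fisher linear discriminant direction w = Sw^{-1} (m1 - m0)."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S0 = np.cov(X0, rowvar=False, bias=True) * len(X0)  # within-class scatter
    S1 = np.cov(X1, rowvar=False, bias=True) * len(X1)
    return np.linalg.solve(S0 + S1, m1 - m0)

rng = np.random.default_rng(0)
X0 = rng.normal([0, 0], 0.5, size=(50, 2))   # e.g. non-diabetic features
X1 = rng.normal([2, 1], 0.5, size=(50, 2))   # e.g. diabetic features
w = fisher_direction(X0, X1)

# classify by projecting onto w and thresholding at the midpoint
t = (X0.mean(axis=0) @ w + X1.mean(axis=0) @ w) / 2
acc = np.concatenate([X0 @ w < t, X1 @ w > t]).mean()
print(acc > 0.9)
```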

  19. Parameter-Based Performance Analysis of Object-Based Image Analysis Using Aerial and Quikbird-2 Images

    NASA Astrophysics Data System (ADS)

    Kavzoglu, T.; Yildiz, M.

    2014-09-01

Opening new possibilities for research, very high resolution (VHR) imagery acquired by recent commercial satellites and aerial systems requires advanced approaches and techniques that can handle large volumes of data with high local variance. Delineation of land use/cover information from VHR images is a hot research topic in remote sensing. In recent years, object-based image analysis (OBIA) has become a popular solution for image analysis tasks as it considers the shape, texture and context information associated with the image objects. The most important stage of OBIA is the image segmentation process applied prior to classification. Determination of optimal segmentation parameters is of crucial importance for the performance of the selected classifier. In this study, the effectiveness and applicability of the segmentation method in relation to its parameters was analysed using two VHR images, an aerial photo and a Quickbird-2 image. The multi-resolution segmentation technique was employed with optimal scale, shape and compactness parameters defined after an extensive trial process on the data sets. A nearest neighbour classifier was applied to the segmented images, followed by an accuracy assessment. Results show that segmentation parameters have a direct effect on the classification accuracy, and that low values of scale-shape combinations produce the highest classification accuracies. The compactness parameter was found to have minimal effect on the construction of image objects; hence it can be set to a constant value in image classification.
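The accuracy assessment step can be illustrated by computing overall accuracy and the kappa coefficient from a confusion matrix; the matrix below is illustrative, not from the study:

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a classification confusion matrix."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    po = np.trace(cm) / total                           # observed agreement
    pe = (cm.sum(axis=1) @ cm.sum(axis=0)) / total**2   # chance agreement
    return po, (po - pe) / (1 - pe)

# rows = reference class, columns = predicted class
oa, kappa = accuracy_and_kappa([[50, 10],
                                [ 5, 35]])
print(round(oa, 2), round(kappa, 2))  # 0.85 0.69
```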

  20. Scale up of a viscous fungal fermentation: application of scale-up criteria with regime analysis and operating boundary conditions.

    PubMed

    Pollard, D J; Kirschner, T F; Hunt, G R; Tong, I-T; Stieber, R; Salmon, P M

    2007-02-01

The scale up of the novel, pharmaceutically important pneumocandin B0, from the filamentous fungus Glarea lozoyensis, was successfully completed from pilot scale (0.07, 0.8, and 19 m3) to production scale (57 m3). This was accomplished, despite dissimilar reactor geometry, by employing a combination of scale-up criteria, process sensitivity studies, and regime analysis using characteristic time constants for both oxygen mass transfer and bulk mixing. Dissolved oxygen tension (DOT), separated from the influence of agitation by gas blending at the 0.07 m3 scale, had a marked influence on the concentrations of pneumocandin analogs with different levels of hydroxylation, and these concentrations were used as an indicator of bulk mixing upon scale up. The profound impact of DOT (at both low and high levels) on analog formation dictated the use of constant DOT, at 80% air saturation, as a scale-up criterion. As a result, kLa and the oxygen uptake rate (OUR), and hence the OTR, were effectively conserved across the scales, while other criteria such as Pg/VL or mixing time were less effective. Production-scale (57 m3) mixing times were found to be faster than those at 19 m3 due to a difference in the liquid height/tank diameter ratio (HL/DT). Regime analysis at 19 and 57 m3 for bulk mixing (tc) and oxygen transfer (1/kLa) showed that oxygen transfer was the rate-limiting step for this highly shear-thinning fermentation, providing additional support for the choice of scale-up criterion.
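The regime analysis compares characteristic time constants: the step with the larger time constant is rate-limiting. A minimal sketch with illustrative numbers, not the paper's measured values:

```python
def rate_limiting_step(t_mix, kla):
    """Compare the bulk-mixing time t_c with the oxygen-transfer time 1/kLa;
    the slower process (larger time constant) limits the fermentation."""
    t_otr = 1.0 / kla
    return "oxygen transfer" if t_otr > t_mix else "bulk mixing"

# e.g. t_c = 20 s mixing time, kLa = 0.02 1/s  ->  1/kLa = 50 s
print(rate_limiting_step(20.0, 0.02))  # oxygen transfer
```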

  1. Infrared thermal facial image sequence registration analysis and verification

    NASA Astrophysics Data System (ADS)

    Chen, Chieh-Li; Jian, Bo-Lin

    2015-03-01

To study the emotional responses of subjects to the International Affective Picture System (IAPS), the infrared thermal facial image sequence is preprocessed for registration before further analysis, so that the variance caused by minor and irregular subject movements is reduced. Without affecting subjects' comfort and while inducing minimal harm, this study proposes an infrared thermal facial image sequence registration process that reduces the deviations caused by the unconscious head movements of the subjects. A fixed image for registration is produced through localization of the centroid of the eye region together with image translation and rotation. The thermal image sequence is then automatically registered using the proposed two-stage genetic algorithm. The deviation before and after image registration is demonstrated by image quality indices. The results show that the infrared thermal image sequence registration process proposed in this study is effective in localizing facial images accurately, which will be beneficial to the correlation analysis of psychological information related to the facial area.

  2. Segmented infrared image analysis for rotating machinery fault diagnosis

    NASA Astrophysics Data System (ADS)

    Duan, Lixiang; Yao, Mingchao; Wang, Jinjiang; Bai, Tangbo; Zhang, Laibin

    2016-07-01

As a noncontact and non-intrusive technique, infrared image analysis is promising for machinery defect diagnosis. However, the low information content and strong noise in infrared images limit its performance. To address this issue, this paper presents an image segmentation approach to enhance feature extraction in infrared image analysis. A region selection criterion named dispersion degree is also formulated to discriminate fault-representative regions from unrelated background information. Feature extraction and fusion methods are then applied to obtain features from the selected regions for further diagnosis. Experimental studies on a rotor fault simulator demonstrate that the presented segmented feature enhancement approach outperforms analysis of the original image using both a Naïve Bayes classifier and a support vector machine.

  3. Person identification using fractal analysis of retina images

    NASA Astrophysics Data System (ADS)

    Ungureanu, Constantin; Corniencu, Felicia

    2004-10-01

Biometrics is an automated method of recognizing a person based on physiological or behavioral characteristics. Among the features measured are the retina scan, voice, and fingerprint. A retina-based biometric involves the analysis of the blood vessels situated at the back of the eye. In this paper we present a method that uses fractal analysis to characterize retina images. The fractal dimension (FD) of the retinal vessels was measured for 20 images, and a different value of FD was obtained for each image. The algorithm provides good accuracy, is cheap, and is easy to implement.
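A common way to estimate the fractal dimension of a binarized vessel image is box counting; the sketch below is a generic implementation, not necessarily the variant used in the paper. The sanity check uses a filled square, whose dimension is exactly 2:

```python
import numpy as np

def box_counting_dimension(img, sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary image by box counting."""
    counts = []
    h, w = img.shape
    for s in sizes:
        # tile the image into s-by-s boxes and count occupied boxes
        boxes = img[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    # FD is the slope of log N(s) against log (1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

fd = box_counting_dimension(np.ones((64, 64), dtype=bool))
print(round(fd, 2))  # 2.0
```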

  4. (Hyper)-graphical models in biomedical image analysis.

    PubMed

    Paragios, Nikos; Ferrante, Enzo; Glocker, Ben; Komodakis, Nikos; Parisot, Sarah; Zacharaki, Evangelia I

    2016-10-01

Computational vision, visual computing and biomedical image analysis have made tremendous progress over the past two decades. This is mostly due to the development of efficient learning and inference algorithms which allow better and richer modeling of image and visual understanding tasks. Hyper-graph representations are among the most prominent tools to address such perception through the casting of perception as a graph optimization problem. In this paper, we briefly introduce the importance of such representations, discuss their strengths and limitations, provide appropriate strategies for their inference, and present their application to a variety of problems in biomedical image analysis. PMID:27377331

  6. The Land Analysis System (LAS) for multispectral image processing

    USGS Publications Warehouse

    Wharton, S. W.; Lu, Y. C.; Quirk, Bruce K.; Oleson, Lyndon R.; Newcomer, J. A.; Irani, Frederick M.

    1988-01-01

    The Land Analysis System (LAS) is an interactive software system available in the public domain for the analysis, display, and management of multispectral and other digital image data. LAS provides over 240 applications functions and utilities, a flexible user interface, complete online and hard-copy documentation, extensive image-data file management, reformatting, conversion utilities, and high-level device independent access to image display hardware. The authors summarize the capabilities of the current release of LAS (version 4.0) and discuss plans for future development. Particular emphasis is given to the issue of system portability and the importance of removing and/or isolating hardware and software dependencies.

  7. An investigation of image compression on NIIRS rating degradation through automated image analysis

    NASA Astrophysics Data System (ADS)

    Chen, Hua-Mei; Blasch, Erik; Pham, Khanh; Wang, Zhonghai; Chen, Genshe

    2016-05-01

The National Imagery Interpretability Rating Scale (NIIRS) is a subjective quantification of static image quality widely adopted by the Geographic Information System (GIS) community. Efforts have been made to relate NIIRS image quality to sensor parameters using the general image quality equations (GIQE), which make it possible to automatically predict the NIIRS rating of an image through automated image analysis. In this paper, we present an automated procedure to extract the line edge profile, from which the NIIRS rating of a given image can be estimated through the GIQEs if the ground sampling distance (GSD) is known. Steps involved include straight edge detection, edge stripe determination, and edge intensity determination, among others. Next, we show how to employ the GIQEs to estimate NIIRS degradation without knowing the ground truth GSD, and investigate the effects of image compression on the degradation of an image's NIIRS rating. Specifically, we consider the JPEG and JPEG2000 image compression standards. The extensive experimental results demonstrate the effect of image compression on the ground sampling distance and relative edge response, which are the major factors affecting the NIIRS rating.
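The GIQE version 4 form can be sketched as follows, using the published GIQE-4 coefficients; the example inputs are illustrative, not measurements from the paper:

```python
import math

def giqe4(gsd_inches, rer, h=1.0, g_over_snr=0.0):
    """General Image Quality Equation, version 4 (sketch).

    NIIRS = 10.251 - a*log10(GSD) + b*log10(RER) - 0.656*H - 0.344*G/SNR,
    with (a, b) = (3.32, 1.559) when RER >= 0.9, else (3.16, 2.817).
    GSD is in inches; H is the edge overshoot, G the noise gain.
    """
    a, b = (3.32, 1.559) if rer >= 0.9 else (3.16, 2.817)
    return (10.251 - a * math.log10(gsd_inches)
            + b * math.log10(rer) - 0.656 * h - 0.344 * g_over_snr)

# compression that blurs edges lowers the RER, so the predicted NIIRS drops
print(giqe4(12.0, 0.9) > giqe4(12.0, 0.5))  # True
```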

  8. Analysis of imaging quality under the systematic parameters for thermal imaging system

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Jin, Weiqi

    2009-07-01

Integrating a thermal imaging system with a radar system can increase the target identification range and strengthen the accuracy and reliability of detection; such integrated systems are state-of-the-art and mainstream for searching for invasive targets and guarding homeland security. In operation, however, the thermal imaging system has a defect: it can produce degraded images that lead to serious consequences during search and detection. In this paper we study why and how these degraded images occur using the wave theory of light, and then establish a mathematical imaging model that describes the course of ray transmission. In further analysis, we pay special attention to the systematic parameters of the model, analysing in detail each parameter that can affect the imaging process and the role it plays. Through this comprehensive research we obtain detailed information about how these parameters shape the diffraction phenomena. The analytical results are confirmed by comparing experimental images with MATLAB-simulated images, and the simulated images based on the revised parameters agree well with images acquired in reality.

  9. Multispectral/hyperspectral image enhancement for biological cell analysis

    SciTech Connect

    Nuffer, Lisa L.; Medvick, Patricia A.; Foote, Harlan P.; Solinsky, James C.

    2006-08-01

    The paper shows new techniques for analyzing cell images taken with a microscope using multiple filters to form a datacube of spectral image planes. Because of the many neighboring spectral samples, much of the datacube appears as redundant, similar tissue. The analysis is based on the nonGaussian statistics of the image data, allowing for remapping of the data into image components that are dissimilar, and hence isolate subtle, spatial object regions of interest in the tissues. This individual component image set can be recombined into a single RGB color image useful in real-time location of regions of interest. The algorithms are susceptible to parallelization using Field Programmable Gate Array hardware processing.

  10. Simulation and analysis about noisy range images of laser radar

    NASA Astrophysics Data System (ADS)

    Zhao, Mingbo; He, Jun; Fu, Qiang; Xi, Dan

    2011-06-01

A measured range image from an imaging laser radar (ladar) is usually disturbed by dropouts and outliers. Given the difficulty of obtaining measured data and of controlling the noise level of dropouts and outliers, a new simulation method for range images with noise is proposed. Based on the noise formation mechanism of ladar range images, an accurate ladar range imaging model is formulated, including three major influencing factors: speckle, atmospheric turbulence and receiver noise. Noisy range images under different scenarios are obtained using MATLAB. Analysis of the simulation results reveals that: (1) regardless of the detection strategy, speckle, atmospheric turbulence and receiver noise are the major factors causing dropouts and outliers; (2) receiver noise by itself has a limited effect on outliers, but if other factors (speckle, atmospheric turbulence, etc.) are also present, the effect is sharply enhanced; (3) both dropouts and outliers occur in background and target regions.
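A minimal sketch of injecting dropouts and outliers into a clean range image follows. The parameter values are illustrative, and this omits the speckle and atmospheric-turbulence physics modeled in the paper:

```python
import numpy as np

def corrupt_range_image(r, p_drop=0.05, p_out=0.02, r_max=100.0, seed=0):
    """Add dropouts (lost returns, stored as NaN) and outliers
    (spurious uniformly distributed ranges) to a range image."""
    rng = np.random.default_rng(seed)
    r = r.astype(float).copy()
    out = rng.random(r.shape) < p_out
    drop = rng.random(r.shape) < p_drop
    r[out] = rng.uniform(0.0, r_max, out.sum())   # anomalous ranges
    r[drop] = np.nan                              # no return detected
    return r

clean = np.full((200, 200), 50.0)   # flat target at 50 m
noisy = corrupt_range_image(clean)
frac_nan = np.isnan(noisy).mean()
print(noisy.shape, frac_nan)
```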

  11. Method for measuring anterior chamber volume by image analysis

    NASA Astrophysics Data System (ADS)

    Zhai, Gaoshou; Zhang, Junhong; Wang, Ruichang; Wang, Bingsong; Wang, Ningli

    2007-12-01

Anterior chamber volume (ACV) is very important for an oculist making a rational pathological diagnosis in patients with optic diseases such as glaucoma, yet it is difficult to measure accurately. In this paper, a method is devised to measure anterior chamber volumes from JPEG-formatted image files transformed from medical images acquired with an anterior-chamber optical coherence tomographer (AC-OCT) and its image-processing software. The corresponding algorithms for image analysis and ACV calculation are implemented in VC++, and a series of anterior chamber images of typical patients are analyzed; the calculated anterior chamber volumes are verified to be in accord with clinical observation. The results show that the measurement method is effective and feasible and has the potential to improve the accuracy of ACV calculation. Meanwhile, measures should be taken to simplify the manual preprocessing of the images.
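One simple way to turn per-slice segmented areas into a volume is the Cavalieri principle (sum of slice areas times slice spacing); the abstract does not state the exact integration scheme used, so treat this as an illustrative sketch:

```python
def chamber_volume(slice_areas_mm2, slice_spacing_mm):
    """Estimate a volume from serial cross-sections: sum the segmented
    area of each slice and multiply by the inter-slice spacing."""
    return sum(slice_areas_mm2) * slice_spacing_mm

# a stack of 10 slices, each with 12 mm^2 of chamber, spaced 0.5 mm apart
print(chamber_volume([12.0] * 10, 0.5))  # 60.0 mm^3
```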

  12. Neural maps in remote sensing image analysis.

    PubMed

    Villmann, Thomas; Merényi, Erzsébet; Hammer, Barbara

    2003-01-01

    We study the application of self-organizing maps (SOMs) for the analyses of remote sensing spectral images. Advanced airborne and satellite-based imaging spectrometers produce very high-dimensional spectral signatures that provide key information to many scientific investigations about the surface and atmosphere of Earth and other planets. These new, sophisticated data demand new and advanced approaches to cluster detection, visualization, and supervised classification. In this article we concentrate on the issue of faithful topological mapping in order to avoid false interpretations of cluster maps created by an SOM. We describe several new extensions of the standard SOM, developed in the past few years: the growing SOM, magnification control, and generalized relevance learning vector quantization, and demonstrate their effect on both low-dimensional traditional multi-spectral imagery and approximately 200-dimensional hyperspectral imagery.
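The standard SOM that these extensions build on can be sketched with the classic Kohonen update rule. This is a minimal 1-D map trained on synthetic data, not the hyperspectral pipeline of the article:

```python
import numpy as np

def train_som(data, n_units=10, epochs=20, seed=0):
    """Minimal 1-D self-organizing map (Kohonen update rule)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=(n_units, data.shape[1]))       # codebook vectors
    for t in range(epochs):
        lr = 0.5 * (1 - t / epochs)                     # decaying learning rate
        radius = max(1.0, n_units / 2 * (1 - t / epochs))
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))  # best matching unit
            # Gaussian neighborhood pulls the BMU and its neighbors toward x
            h = np.exp(-((np.arange(n_units) - bmu) ** 2) / (2 * radius ** 2))
            w += lr * h[:, None] * (x - w)
    return w

data = np.random.default_rng(1).normal(size=(100, 3))
w = train_som(data)
# quantization error: mean distance from each sample to its nearest unit
qe = np.mean([np.min(np.linalg.norm(w - x, axis=1)) for x in data])
print(w.shape, qe < 2.0)
```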

  13. [Diagnostic criteria in acute neuromyelitis].

    PubMed

    Panea, Cristina; Petrescu, Simona; Monica, Pop; Voinea, Liliana; Dascălu, Ana-Maria; Nicolae, Miruna; Ungureanu, E; Panca, Aida; Grădinaru, Sânziana

    2007-01-01

Neuromyelitis optica, also known as Devic disease, identified in the 19th century, is one of the inflammatory idiopathic demyelinating diseases of the central nervous system and is often mistaken for severe multiple sclerosis. Diagnostic criteria for neuromyelitis optica were proposed in 1999 and revised in 2006 by Dean Wingerchuk. These criteria are 99% sensitive and 90% specific for differentiating neuromyelitis optica from multiple sclerosis presenting with optic neuritis or a myelitis syndrome. In the following article we present the clinical features, spinal and cerebral MR imaging, serology, and cerebrospinal fluid examination findings of neuromyelitis optica, together with the revised criteria established in 2006. The recently identified serum antibody biomarker, neuromyelitis optica immunoglobulin G (NMO-IgG), which targets the aquaporin-4 water channel and distinguishes neuromyelitis optica from multiple sclerosis, is one of the revised criteria. PMID:18543687

  14. Computerized microscopic image analysis of follicular lymphoma

    NASA Astrophysics Data System (ADS)

    Sertel, Olcay; Kong, Jun; Lozanski, Gerard; Catalyurek, Umit; Saltz, Joel H.; Gurcan, Metin N.

    2008-03-01

Follicular Lymphoma (FL) is a cancer arising from the lymphatic system. Originating from follicle center B cells, FL is mainly comprised of centrocytes (usually middle-to-small sized cells) and centroblasts (relatively large malignant cells). According to the World Health Organization's recommendations, there are three histological grades of FL characterized by the number of centroblasts per high-power field (hpf) of area 0.159 mm². In current practice, these cells are manually counted from ten representative fields of follicles after visual examination of hematoxylin and eosin (H&E) stained slides by pathologists. Several studies clearly demonstrate the poor reproducibility of this grading system with very low inter-reader agreement. In this study, we are developing a computerized system to assist pathologists with this process. A hybrid approach that combines information from several slides with different stains has been developed. Thus, follicles are first detected from digitized microscopy images with immunohistochemistry (IHC) stains (i.e., CD10 and CD20). The average sensitivity and specificity of the follicle detection tested on 30 images at 2×, 4× and 8× magnifications are 85.5 ± 9.8% and 92.5 ± 4.0%, respectively. Since the centroblast detection is carried out in the H&E-stained slides, the follicles in the IHC-stained images are mapped to their H&E-stained counterparts. To evaluate the centroblast differentiation capabilities of the system, 11 hpf images were marked by an experienced pathologist who identified 41 centroblast cells and 53 non-centroblast cells. A non-supervised clustering process differentiates the centroblast cells from non-centroblast cells, resulting in 92.68% sensitivity and 90.57% specificity.
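Sensitivity and specificity follow directly from the counts of correctly labelled cells. The counts below are back-calculated to be consistent with the reported 92.68%/90.57% figures (38 of 41 centroblasts and 48 of 53 non-centroblasts correct) and are used here only as an illustration:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity(tp=38, fn=3, tn=48, fp=5)
print(round(100 * sens, 2), round(100 * spec, 2))  # 92.68 90.57
```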

  15. Measurement and analysis of image sensors

    NASA Astrophysics Data System (ADS)

    Vitek, Stanislav

    2005-06-01

High precision in sensing and processing image data is necessary for astronomical applications. At present, large CCD sensors are used for various reasons. To replace CCD sensors with CMOS sensing devices, it is important to know the transfer characteristics of the CCD sensors in use. For special applications such as robotic telescopes (fully automatic, without human interaction), specially designed smart sensors, which integrate more functions and offer more features than CCDs, appear to be a good choice.

  16. Multispectral image analysis of bruise age

    NASA Astrophysics Data System (ADS)

    Sprigle, Stephen; Yi, Dingrong; Caspall, Jayme; Linden, Maureen; Kong, Linghua; Duckworth, Mark

    2007-03-01

The detection and aging of bruises is important in clinical and forensic environments. Traditionally, visual and photographic assessment of bruise color is used to determine age, but this substantially subjective technique has been shown to be inaccurate and unreliable. The purpose of this study was to develop a technique to spectrally age bruises using a reflective multi-spectral imaging system that minimizes the filtering and hardware requirements while achieving acceptable accuracy. This approach will then be incorporated into a handheld, point-of-care technology that is clinically viable and affordable. Sixteen bruises from elderly residents of a long-term care facility were imaged over time. A multi-spectral system collected images through eleven narrow-band (~10 nm FWHM) filters with center wavelengths between 370 and 970 nm corresponding to specific skin and blood chromophores. Normalized bruise reflectance (NBR), defined as the ratio of the optical reflectance coefficient of bruised skin to that of normal skin, was calculated for all bruises at all wavelengths. The smallest mean NBR, regardless of bruise age, was found at wavelengths between 555 and 577 nm, suggesting that bruise contrast arises from hemoglobin and lingers for a long duration. A contrast metric based on the NBR at 460 nm and 650 nm was found to be sensitive to age and requires further investigation. Overall, the study identified four key wavelengths that show promise for characterizing bruise age. However, the high variability across the bruises imaged in this study complicates the development of a handheld detection system until additional data are available.
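The NBR index is a simple reflectance ratio. The sketch below uses made-up reflectance values, and the two-band contrast function is only a hypothetical form, since the abstract does not give the exact formula:

```python
def normalized_bruise_reflectance(r_bruised, r_normal):
    """NBR: reflectance of bruised skin over that of adjacent normal skin
    (values below 1 mean the bruise absorbs more at that wavelength)."""
    return r_bruised / r_normal

def contrast_metric(nbr_460, nbr_650):
    """Hypothetical two-band contrast in the spirit of the abstract:
    difference between the red-band and blue-band NBRs."""
    return nbr_650 - nbr_460

# illustrative reflectances near the hemoglobin absorption band (~555 nm)
nbr_555 = normalized_bruise_reflectance(0.30, 0.55)
print(round(nbr_555, 2))  # 0.55
```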

  17. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

In the context of an explosive increase in the value of the damage caused by natural disasters, an alarming challenge in the third millennium is the rapid growth of urban population in vulnerable areas. Cities are, by definition, very fragile socio-ecological systems with a high level of vulnerability to environmental change, responsible for important transformations of the space and determining dysfunctions shown in the state of the natural variables (Parker and Mitchell, 1995, The OFDA/CRED International Disaster Database). A contributing factor is the demographic dynamic that affects urban areas. The aim of this study is to estimate the overall vulnerability of the urban area of Bucharest in the context of the seismic hazard, by using environmental, socio-economic, and physical measurable variables in the framework of a spatial multi-criteria analysis. For this approach the capital city of Romania was chosen based on its high vulnerability due to the explosive urban development and the advanced state of degradation of the buildings (most of the building stock having been built between 1940 and 1977). Combining these attributes with the seismic hazard induced by the Vrancea source, Bucharest has been ranked as the 10th capital city worldwide in terms of seismic risk. Over 40 years of experience in the natural risk field shows that the only directly accessible way to reduce natural risk is to reduce the vulnerability of the space (Adger et al., 2001; Turner et al., 2003; UN/ISDR, 2004; Dayton-Johnson, 2004; Kasperson et al., 2005; Birkmann, 2006; etc.). In effect, reducing the vulnerability of urban spaces would imply lower costs produced by natural disasters. By applying the SMCA method, the result reveals a circular pattern, signaling as hot spots the Bucharest historic centre (located on a river terrace and with aged building stock) and peripheral areas (isolated from the emergency centers and defined by precarious social and economic
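The core of a spatial multi-criteria analysis overlay is a weighted linear combination of normalized criteria for each spatial unit; a minimal sketch with hypothetical criteria and weights, not those of the study:

```python
def smca_score(criteria, weights):
    """Weighted linear combination of criteria normalized to 0..1
    (1 = most vulnerable). Weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(c * w for c, w in zip(criteria, weights))

# illustrative cell: aged building stock, dense population, good access
# to emergency centers (hypothetical criteria and weights)
building_age, density, isolation = 0.9, 0.8, 0.2
score = smca_score([building_age, density, isolation], [0.5, 0.3, 0.2])
print(round(score, 2))  # 0.73
```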

  18. Seismoelectric beamforming imaging: a sensitivity analysis

    NASA Astrophysics Data System (ADS)

    El Khoury, P.; Revil, A.; Sava, P.

    2015-06-01

    The electrical current density generated by the propagation of a seismic wave at the interface characterized by a drop in electrical, hydraulic or mechanical properties produces an electrical field of electrokinetic nature. This field can be measured remotely with a signal-to-noise ratio depending on the background noise and signal attenuation. The seismoelectric beamforming approach is an emerging imaging technique based on scanning a porous material using appropriately delayed seismic sources. The idea is to focus the hydromechanical energy on a regular spatial grid and measure the converted electric field remotely at each focus time. This method can be used to image heterogeneities with a high definition and to provide structural information to classical geophysical methods. A numerical experiment is performed to investigate the resolution of the seismoelectric beamforming approach with respect to the main wavelength of the seismic waves. The 2-D model consists of a fictitious water-filled bucket in which a cylindrical sandstone core sample is set up vertically. The hydrophones/seismic sources are located on a 50-cm diameter circle in the bucket and the seismic energy is focused on the grid points in order to scan the medium and determine the geometry of the porous plug using the output electric potential image. We observe that the resolution of the method is given by a density of eight scanning points per wavelength. Additional numerical tests were also performed to see the impact of a wrong velocity model upon the seismoelectric map displaying the heterogeneities of the material.
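The focusing step amounts to delaying each seismic source so that all wavefronts arrive at the chosen grid point simultaneously; a minimal geometric sketch (the source layout and velocity below are illustrative, not the experiment's exact configuration):

```python
import math

def firing_delays(sources, focus, v, t_focus):
    """Delay each source so every wavefront reaches `focus` at time t_focus."""
    return [t_focus - math.dist(s, focus) / v for s in sources]

# four sources on a 0.25 m radius circle, acoustic velocity ~1500 m/s in water
sources = [(0.25, 0.0), (0.0, 0.25), (-0.25, 0.0), (0.0, -0.25)]
focus = (0.05, 0.0)                     # a scanning grid point
delays = firing_delays(sources, focus, v=1500.0, t_focus=1e-3)

# verify: delayed wavefronts all arrive at the focus at t_focus
arrivals = [d + math.dist(s, focus) / 1500.0 for s, d in zip(sources, delays)]
print(all(abs(a - 1e-3) < 1e-12 for a in arrivals))  # True
```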

  19. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

An objective and standardized approach to assess the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components, as compared to standard PCA, with sparse loadings, in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
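The one-sample Hotelling T2 statistic used for fault detection can be sketched as below, on synthetic metric vectors; the 88 real metrics and the SPCA step are not reproduced:

```python
import numpy as np

def hotelling_t2(X, mu):
    """One-sample Hotelling T^2: n (xbar - mu)' S^{-1} (xbar - mu)."""
    X = np.asarray(X, dtype=float)
    d = X.mean(axis=0) - np.asarray(mu, dtype=float)
    S = np.cov(X, rowvar=False)               # sample covariance
    return float(len(X) * d @ np.linalg.solve(S, d))

rng = np.random.default_rng(0)
baseline = rng.normal([1.0, 0.5], 0.1, size=(30, 2))   # reference metrics
t2_ok = hotelling_t2(baseline, [1.0, 0.5])             # in spec: small T^2
t2_fault = hotelling_t2(baseline + [0.5, 0.0], [1.0, 0.5])  # shifted: large
print(t2_fault > t2_ok)  # True
```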

  20. Application of image analysis in the myocardial biopsies of patients with dilated cardiomyopathy

    NASA Astrophysics Data System (ADS)

    Agapitos, Emanuel; Kavantzas, Nikolaos; Bakouris, M. G.; Kassis, Kyriakos A.; Nanas, J.; Margari, Z.; Davaris, P.

    1996-04-01

The aim of our study is to investigate whether myocardial fibrosis measured by image analysis may be considered an important and accurate index of dilated cardiomyopathy and its prognosis. The study group consisted of 24 patients with dilated cardiomyopathy diagnosed by echocardiography, radionuclide ventriculography, cardiac catheterization and left ventricular endomyocardial biopsy. The patients' overall disability was conventionally expressed with the criteria for functional capacity. Using image analysis, the percentage of fibrosis in a total of 35 myocardial biopsies was measured accurately. A comparison was then made between the percentage of myocardial fibrosis and the clinical parameters (left ventricular ejection fraction and overall functional capacity) reflecting the degree of each patient's heart failure. A correlation was found among fibrosis, left ventricular ejection fraction and overall functional capacity. Cases with small values of fibrosis (less than 10%) have large ejection fractions and belong to Class I of overall functional capacity. Cases with large values of fibrosis (greater than 10%) belong to Classes III and IV of overall functional capacity and have small ejection fractions. The results of the comparison were presented graphically and were considered significant. Myocardial fibrosis measured by image analysis might be considered an important prognostic index of dilated cardiomyopathy.

  1. A guide to human in vivo microcirculatory flow image analysis.

    PubMed

    Massey, Michael J; Shapiro, Nathan I

    2016-01-01

    Various noninvasive microscopic camera technologies have been used to visualize the sublingual microcirculation in patients. We describe a comprehensive approach to bedside in vivo sublingual microcirculation video image capture and analysis techniques in the human clinical setting. We present a user perspective and guide suitable for clinical researchers and developers interested in the capture and analysis of sublingual microcirculatory flow videos. We review basic differences in the cameras, optics, light sources, operation, and digital image capture. We describe common techniques for image acquisition and discuss aspects of video data management, including data transfer, metadata, and database design and utilization to facilitate the image analysis pipeline. We outline image analysis techniques and reporting including video preprocessing and image quality evaluation. Finally, we propose a framework for future directions in the field of microcirculatory flow videomicroscopy acquisition and analysis. Although automated scoring systems have not been sufficiently robust for widespread clinical or research use to date, we discuss promising innovations that are driving new development. PMID:26861691

  2. Impact of UCSF criteria according to pre- and post-OLT tumor features: analysis of 479 patients listed for HCC with a short waiting time.

    PubMed

    Decaens, Thomas; Roudot-Thoraval, Françoise; Hadni-Bresson, Solange; Meyer, Carole; Gugenheim, Jean; Durand, Francois; Bernard, Pierre-Henri; Boillot, Olivier; Sulpice, Laurent; Calmus, Yvon; Hardwigsen, Jean; Ducerf, Christian; Pageaux, Georges-Philippe; Dharancy, Sebastien; Chazouilleres, Olivier; Cherqui, Daniel; Duvoux, Christophe

    2006-12-01

    Orthotopic liver transplantation (OLT) indication for hepatocellular carcinoma (HCC) is currently based on the Milan criteria. The University of California, San Francisco (UCSF) recently proposed an expansion of the selection criteria according to tumor characteristics on the explanted liver. This study: 1) assessed the validity of these criteria in an independent large series and 2) tested the usefulness of these criteria when applied to pre-OLT tumor evaluation. Between 1985 and 1998, 479 patients were listed for liver transplantation (LT) for HCC and 467 were transplanted. According to pre-OLT (imaging at date of listing) or post-OLT (explanted liver) tumor characteristics, patients were retrospectively classified according to both the Milan and UCSF criteria. The 5-yr survival statistics were assessed by the Kaplan-Meier method and compared by the log-rank test. Pre-OLT UCSF criteria were analyzed according to an intention-to-treat principle. Based on the pre-OLT evaluation, 279 patients were Milan+, 44 patients were UCSF+ but Milan- (subgroup of patients that might benefit from the expansion), and 145 patients were UCSF- and Milan-. With a short median waiting time of 4 months, 5-yr survival was 60.1 +/- 3.0%, 45.6 +/- 7.8%, and 34.7 +/- 4.0%, respectively (P < 0.001). The 5-yr survival was arithmetically lower in UCSF+ Milan- patients compared to Milan+ but this difference was not significant (P = 0.10). Based on pathological features of the explanted liver, 5-yr survival was 70.4 +/- 3.4%, 63.6 +/- 7.8%, and 34.1 +/- 3.1%, in Milan+ patients (n = 184), UCSF+ Milan- patients (n = 39), and UCSF- Milan- patients (n = 238), respectively (P < 0.001). However, the 5-yr survival did not differ between Milan+ and UCSF+ Milan- patients (P = 0.33). In conclusion, these results show that when applied to pre-OLT evaluation, the UCSF criteria are associated with a 5-yr survival below 50%. Their applicability is therefore limited, despite similar survival rates
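The 5-yr survival figures above come from Kaplan-Meier estimation, which handles censored patients (those still alive or lost to follow-up) by removing them from the risk set without counting them as events. A minimal hand-rolled sketch of the estimator, on a four-patient toy cohort rather than the study's data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times: follow-up time per patient; events: 1 = death observed, 0 = censored.
    Returns a list of (time, survival) points at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        n_with_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk   # multiply conditional survival
            curve.append((t, surv))
        n_at_risk -= n_with_t                  # censored and dead both leave the risk set
        i += n_with_t
    return curve

# toy cohort: deaths at t=1 and t=3, censoring at t=2 and t=4
curve = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0])
```

Note how the censoring at t=2 shrinks the risk set before the t=3 death, so the second drop is larger (to 0.375) than it would be without censoring.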

  3. Eliciting and Combining Decision Criteria Using a Limited Palette of Utility Functions and Uncertainty Distributions: Illustrated by Application to Pest Risk Analysis.

    PubMed

    Holt, Johnson; Leach, Adrian W; Schrader, Gritta; Petter, Françoise; MacLeod, Alan; van der Gaag, Dirk Jan; Baker, Richard H A; Mumford, John D

    2014-01-01

    Utility functions in the form of tables or matrices have often been used to combine discretely rated decision-making criteria. Matrix elements are usually specified individually, so no one rule or principle can be easily stated for the utility function as a whole. A series of five matrices are presented that aggregate criteria two at a time using simple rules that express a varying degree of constraint of the lower rating over the higher. A further nine possible matrices were obtained by using a different rule either side of the main axis of the matrix to describe situations where the criteria have a differential influence on the outcome. Uncertainties in the criteria are represented by three alternative frequency distributions from which the assessors select the most appropriate. The output of the utility function is a distribution of rating frequencies that is dependent on the distributions of the input criteria. In pest risk analysis (PRA), seven of these utility functions were required to mimic the logic by which assessors for the European and Mediterranean Plant Protection Organization arrive at an overall rating of pest risk. The framework enables the development of PRAs that are consistent and easy to understand, criticize, compare, and change. When tested in workshops, PRA practitioners thought that the approach accorded with both the logic and the level of resolution that they used in the risk assessments.
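The pairwise aggregation described above can be sketched as a small matrix builder. The three rules below are illustrative stand-ins spanning the constraint spectrum the abstract describes (from no constraint of the lower rating to full constraint); the paper's actual five rules differ and are not reproduced here.

```python
def build_matrix(levels, rule):
    """Utility matrix combining two ordinally rated criteria (1..levels) with one rule."""
    return [[rule(a, b) for b in range(1, levels + 1)] for a in range(1, levels + 1)]

# illustrative rules spanning the constraint spectrum
no_constraint   = max                        # the higher rating dominates
full_constraint = min                        # the lower rating caps the outcome
averaging       = lambda a, b: round((a + b) / 2)   # intermediate influence

m = build_matrix(3, full_constraint)
```

Using a different rule on each side of the main diagonal, as the abstract describes, would simply mean dispatching on `a < b` versus `a > b` inside `rule`.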

  4. Eliciting and Combining Decision Criteria Using a Limited Palette of Utility Functions and Uncertainty Distributions: Illustrated by Application to Pest Risk Analysis.

    PubMed

    Holt, Johnson; Leach, Adrian W; Schrader, Gritta; Petter, Françoise; MacLeod, Alan; van der Gaag, Dirk Jan; Baker, Richard H A; Mumford, John D

    2014-01-01

    Utility functions in the form of tables or matrices have often been used to combine discretely rated decision-making criteria. Matrix elements are usually specified individually, so no one rule or principle can be easily stated for the utility function as a whole. A series of five matrices are presented that aggregate criteria two at a time using simple rules that express a varying degree of constraint of the lower rating over the higher. A further nine possible matrices were obtained by using a different rule either side of the main axis of the matrix to describe situations where the criteria have a differential influence on the outcome. Uncertainties in the criteria are represented by three alternative frequency distributions from which the assessors select the most appropriate. The output of the utility function is a distribution of rating frequencies that is dependent on the distributions of the input criteria. In pest risk analysis (PRA), seven of these utility functions were required to mimic the logic by which assessors for the European and Mediterranean Plant Protection Organization arrive at an overall rating of pest risk. The framework enables the development of PRAs that are consistent and easy to understand, criticize, compare, and change. When tested in workshops, PRA practitioners thought that the approach accorded with both the logic and the level of resolution that they used in the risk assessments. PMID:23834916

  5. Evaluating dot and Western blots using image analysis and pixel quantification of electronic images.

    PubMed

    Vierck, J L; Bryne, K M; Dodson, M V

    2000-01-01

    Inexpensive computer imaging technology was used to assess levels of insulin-like growth factor-I (IGF-I) on dot blots (DB) and alpha-Actinin on Western blots (WB). In the first procedure, known IGF-I samples were dotted on nitrocellulose membranes using a vacuum manifold. After the DB were developed and dried, the images were digitized using an HP Deskscan II flat bed scanner, exported into Image-Pro Plus and analyzed by taking the combined mean of 45 degrees and 135 degrees sample lines drawn through each dot. Dot blots corresponding to a linear concentration range from 10 to 300 ng IGF-I were assessed by this method. In the second procedure, WB were scanned with a ScanJet 3c flat bed scanner and their backgrounds were clarified using Image-Pro Plus. A second image analysis program, Alpha Imager 2000, was then used to define the boundaries of protein bands, assess pixel number and density, and to obtain final numerical data for quantifying alpha-Actinin on the WB. Collectively, the results of these two studies suggest that specific proteins may be evaluated by using relatively inexpensive image analysis software systems via pixel quantification of electronic images. PMID:11549944
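The pixel-quantification step common to both procedures amounts to summing background-subtracted intensities over a region of interest. A minimal sketch with NumPy, on a synthetic 6x6 "scan" rather than an actual blot image; the function name and ROI convention are assumptions for illustration:

```python
import numpy as np

def band_intensity(image, roi, background):
    """Background-subtracted pixel sum inside a rectangular band ROI.
    roi = (row0, row1, col0, col1); background = mean background pixel value."""
    r0, r1, c0, c1 = roi
    patch = np.asarray(image, dtype=float)[r0:r1, c0:c1]
    # clip at zero so background fluctuations cannot subtract signal
    return float(np.clip(patch - background, 0, None).sum())

# toy scan: a 2x2 band of intensity 50 on a uniform background of 10
img = np.full((6, 6), 10.0)
img[2:4, 2:4] = 50.0
signal = band_intensity(img, (2, 4, 2, 4), background=10.0)
```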

  6. Open microscopy environment and findspots: integrating image informatics with quantitative multidimensional image analysis.

    PubMed

    Schiffmann, David A; Dikovskaya, Dina; Appleton, Paul L; Newton, Ian P; Creager, Douglas A; Allan, Chris; Näthke, Inke S; Goldberg, Ilya G

    2006-08-01

    Biomedical research and drug development increasingly involve the extraction of quantitative data from digital microscope images, such as those obtained using fluorescence microscopy. Here, we describe a novel approach for both managing and analyzing such images. The Open Microscopy Environment (OME) is a sophisticated open-source scientific image management database that coordinates the organization, storage, and analysis of the large volumes of image data typically generated by modern imaging methods. We describe FindSpots, a powerful image-analysis package integrated into OME that will be of use to those who wish to identify and measure objects within microscope images or time-lapse movies. The algorithm used in FindSpots is in fact only one of many possible segmentation (object detection) algorithms, and the underlying data model used by OME to capture and store its results can also be used to store results from other segmentation algorithms. In this report, we illustrate how image segmentation can be achieved in OME using one such implementation of a segmentation algorithm, and how this output subsequently can be displayed graphically or processed numerically using a spreadsheet.
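The kind of spot segmentation FindSpots performs can be illustrated, in drastically simplified form, as a threshold followed by connected-component labeling. This sketch is not the FindSpots algorithm itself, just a minimal example of the segmentation idea on a synthetic image:

```python
import numpy as np
from collections import deque

def find_spots(image, threshold):
    """Label 4-connected bright regions above `threshold`; return their pixel counts."""
    mask = np.asarray(image) > threshold
    seen = np.zeros_like(mask, dtype=bool)
    sizes = []
    for r in range(mask.shape[0]):
        for c in range(mask.shape[1]):
            if mask[r, c] and not seen[r, c]:
                q, size = deque([(r, c)]), 0   # breadth-first flood fill
                seen[r, c] = True
                while q:
                    y, x = q.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            q.append((ny, nx))
                sizes.append(size)
    return sizes

# synthetic frame with two spots: a 2x2 blob and a single bright pixel
img = np.zeros((5, 5))
img[1:3, 1:3] = 9
img[4, 4] = 9
sizes = find_spots(img, threshold=5)
```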

  7. A criticism of applications with multi-criteria decision analysis that are used for the site selection for the disposal of municipal solid wastes

    SciTech Connect

    Kemal Korucu, M.; Erdagi, Bora

    2012-12-15

    Highlights: • The existing structure of multi-criteria decision analysis for site selection is criticized. • Fundamental problematic points based on the critique are defined. • Modifications are suggested to resolve these problematic points. • A new structure for the decision-making mechanism is proposed. • The feasibility of the new method is evaluated. - Abstract: The main aim of this study is to criticize the process of selecting the most appropriate site for the disposal of municipal solid wastes, one of the problematic issues of waste management operations. Such problems are symptoms of the existing problematic human-nature relationship associated with the syndrome called the ecological crisis. In this regard, solving the site selection problem, which is just a small part of a larger whole, for the good of ecological rationality and social justice is only possible by founding a new and extensive type of human-nature relationship. In this study, as a problematic point in the discussion of ecological problems, the existing structure of applications using multi-criteria decision analysis with three main criteria in the site selection process is criticized. Based on this critique, fundamental problematic points, for which existing applications are insufficient to find solutions, are defined. Modifications are then suggested to address these points. Finally, the criticism addressed to the three-criteria structure of the method and the feasibility of the new four-criteria method are evaluated. As a result, it is emphasized that the new structure with four main criteria may be effective in solving the fundamental problematic points.

  8. Analysis of pregerminated barley using hyperspectral image analysis.

    PubMed

    Arngren, Morten; Hansen, Per Waaben; Eriksen, Birger; Larsen, Jan; Larsen, Rasmus

    2011-11-01

    Pregermination is one of many serious degradations to barley when used for malting. Under certain conditions a pregerminated barley kernel cannot regerminate and is downgraded to lower-quality animal feed. Identifying pregermination at an early stage is therefore essential in order to segregate the barley kernels into low or high quality. Current standard methods to quantify pregerminated barley include visual approaches, e.g. identifying the root sprout, or an embryo staining method, both of which are time-consuming. We present an approach using a near-infrared (NIR) hyperspectral imaging system in a mathematical modeling framework to identify pregerminated barley at an early stage of approximately 12 h of pregermination. Our model only assigns pregermination as the cause for a single kernel's lack of germination and is unable to identify dormancy, kernel damage etc. The analysis is based on more than 750 Rosalina barley kernels pregerminated for 8 different durations between 0 and 60 h based on the BRF method. Regerminating the kernels reveals a grouping of the pregerminated kernels into three categories: normal, delayed and limited germination. Our model employs a supervised classification framework based on a set of extracted features insensitive to the kernel orientation. An out-of-sample classification error of 32% (CI(95%): 29-35%) is obtained for single kernels when grouped into the three categories, and an error of 3% (CI(95%): 0-15%) is achieved on a bulk kernel level. The model provides class probabilities for each kernel, which can assist in achieving homogeneous germination profiles. This research can further be developed to establish an automated and faster procedure as an alternative to the standard procedures for pregerminated barley.
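The supervised-classification framework above maps per-kernel spectral features to germination categories. As a deliberately simple stand-in for the paper's classifier (whose features and model are not specified here), a nearest-centroid sketch on toy three-band "spectra":

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Per-class mean feature vector: a minimal stand-in classifier."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    return {c: X[y == c].mean(axis=0) for c in sorted(set(y))}

def nearest_centroid_predict(centroids, x):
    """Assign the class whose centroid is closest in Euclidean distance."""
    x = np.asarray(x, dtype=float)
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# toy 3-band spectra for two of the germination categories
X = [[0.1, 0.2, 0.3], [0.2, 0.2, 0.3], [0.8, 0.7, 0.9], [0.9, 0.8, 0.9]]
y = ["normal", "normal", "limited", "limited"]
model = nearest_centroid_fit(X, y)
pred = nearest_centroid_predict(model, [0.85, 0.75, 0.9])
```

A real hyperspectral pipeline would add orientation-insensitive feature extraction and report class probabilities, as the abstract describes.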

  9. Imaging for dismantlement verification: information management and analysis algorithms

    SciTech Connect

    Seifert, Allen; Miller, Erin A.; Myjak, Mitchell J.; Robinson, Sean M.; Jarman, Kenneth D.; Misner, Alex C.; Pitts, W. Karl; Woodring, Mitchell L.

    2010-09-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute. However, this process must be performed with care. Computing the perimeter, area, and intensity of an object, for example, might reveal sensitive information relating to shape, size, and material composition. This paper presents three analysis algorithms that reduce full image information to non-sensitive feature information. Ultimately, the algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We evaluate the algorithms on both their technical performance in image analysis, and their application with and without an explicitly constructed information barrier. The underlying images can be highly detailed, since they are dynamically generated behind the information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography.
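The attribute idea above, reducing a sensitive image to a single yes/no answer behind the information barrier, can be sketched as follows. The specific attribute (object pixel area within a declared band) and all names are illustrative assumptions, not the paper's actual algorithms:

```python
import numpy as np

def attribute_present(image, threshold, min_area, max_area):
    """Reduce an image to one non-sensitive boolean: does the bright object's
    pixel area fall inside the declared band?  Only this yes/no result crosses
    the information barrier; the area itself is never reported or stored."""
    area = int((np.asarray(image) > threshold).sum())
    return min_area <= area <= max_area

# synthetic radiograph: a 3x3 (9-pixel) object on a dark background
img = np.zeros((8, 8))
img[2:5, 2:5] = 100.0
ok = attribute_present(img, threshold=50, min_area=5, max_area=20)
```

Even this simple attribute illustrates the paper's caution: area alone can leak size information, so the band must be chosen so the boolean reveals nothing beyond presence.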

  10. Enhanced bone structural analysis through pQCT image preprocessing.

    PubMed

    Cervinka, T; Hyttinen, J; Sievanen, H

    2010-05-01

    Several factors, including preprocessing of the image, can affect the reliability of pQCT-measured bone traits, such as cortical area and trabecular density. Using repeated scans of four different liquid phantoms and repeated in vivo scans of distal tibiae from 25 subjects, the performance of two novel preprocessing methods, based on the down-sampling of the grayscale intensity histogram and the statistical approximation of image data, was compared to 3 x 3 and 5 x 5 median filtering. According to phantom measurements, the signal-to-noise ratio in the raw pQCT images (XCT 3000) was low (approximately 20 dB), which posed a challenge for preprocessing. Concerning the cortical analysis, the reliability coefficient (R) was 67% for the raw image and increased to 94-97% after preprocessing without apparent preference for any method. Concerning the trabecular density, the R-values were already high (approximately 99%) in the raw images, leaving virtually no room for improvement. However, some coarse structural patterns could be seen in the preprocessed images in contrast to a disperse distribution of density levels in the raw image. In conclusion, preprocessing cannot suppress the high noise level to the extent that the analysis of mean trabecular density is essentially improved, whereas preprocessing can enhance cortical bone analysis and also facilitate coarse structural analyses of the trabecular region.
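The 3 x 3 median filtering used as the comparison baseline replaces each interior pixel with the median of its neighborhood, which removes isolated noise spikes while preserving edges better than mean filtering. A minimal hand-rolled sketch (edge pixels are left unchanged for brevity):

```python
import numpy as np

def median3x3(image):
    """3x3 median filter; border pixels keep their original values."""
    img = np.asarray(image, dtype=float)
    out = img.copy()
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            out[r, c] = np.median(img[r - 1:r + 2, c - 1:c + 2])
    return out

# a single noise spike in a flat region is removed entirely
img = np.full((5, 5), 10.0)
img[2, 2] = 200.0
filtered = median3x3(img)
```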

  11. A TSVD Analysis of Microwave Inverse Scattering for Breast Imaging

    PubMed Central

    Shea, Jacob D.; Van Veen, Barry D.; Hagness, Susan C.

    2013-01-01

    A variety of methods have been applied to the inverse scattering problem for breast imaging at microwave frequencies. While many techniques have been leveraged toward a microwave imaging solution, they are all fundamentally dependent on the quality of the scattering data. Evaluating and optimizing the information contained in the data are, therefore, instrumental in understanding and achieving optimal performance from any particular imaging method. In this paper, a method of analysis is employed for the evaluation of the information contained in simulated scattering data from a known dielectric profile. The method estimates optimal imaging performance by mapping the data through the inverse of the scattering system. The inverse is computed by truncated singular-value decomposition of a system of scattering equations. The equations are made linear by use of the exact total fields in the imaging volume, which are available in the computational domain. The analysis is applied to anatomically realistic numerical breast phantoms. The utility of the method is demonstrated for a given imaging system through the analysis of various considerations in system design and problem formulation. The method offers an avenue for decoupling the problem of data selection from the problem of image formation from that data. PMID:22113770
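The truncated singular-value decomposition at the heart of the analysis inverts the linearized scattering system while discarding the small singular values whose directions are dominated by noise. A minimal sketch with NumPy, on a tiny well-conditioned system rather than an actual scattering matrix:

```python
import numpy as np

def tsvd_solve(A, b, k):
    """Solve A x ~ b keeping only the k largest singular values.
    Directions associated with the discarded singular values are zeroed,
    which regularizes the inversion at the cost of resolution."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)  # s is sorted descending
    s_inv = np.array([1.0 / si if i < k else 0.0 for i, si in enumerate(s)])
    return Vt.T @ (s_inv * (U.T @ b))

# 2x2 diagonal system: full-rank truncation (k = 2) recovers the exact solution
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([4.0, 3.0])
x = tsvd_solve(A, b, k=2)
```

Choosing the truncation index k is the key trade-off the paper exploits: it bounds how much of the data's information content can be mapped back into the image.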

  12. The Spectral Image Processing System (SIPS) - Interactive visualization and analysis of imaging spectrometer data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1993-01-01

    The Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, has developed a prototype interactive software system called the Spectral Image Processing System (SIPS) using IDL (the Interactive Data Language) on UNIX-based workstations. SIPS is designed to take advantage of the combination of high spectral resolution and spatial data presentation unique to imaging spectrometers. It streamlines analysis of these data by allowing scientists to rapidly interact with entire datasets. SIPS provides visualization tools for rapid exploratory analysis and numerical tools for quantitative modeling. The user interface is X-Windows-based, user friendly, and provides 'point and click' operation. SIPS is being used for multidisciplinary research concentrating on use of physically based analysis methods to enhance scientific results from imaging spectrometer data. The objective of this continuing effort is to develop operational techniques for quantitative analysis of imaging spectrometer data and to make them available to the scientific community prior to the launch of imaging spectrometer satellite systems such as the Earth Observing System (EOS) High Resolution Imaging Spectrometer (HIRIS).

  13. Computer-aided breast MR image feature analysis for prediction of tumor response to chemotherapy

    SciTech Connect

    Aghaei, Faranak; Tan, Maxine; Liu, Hong; Zheng, Bin; Hollingsworth, Alan B.; Qian, Wei

    2015-11-15

    Purpose: To identify a new clinical marker based on quantitative kinetic image feature analysis and assess its feasibility to predict tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial and nonresponse (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first one analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second approach tested an attribute selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance with the areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 to classify between CR and NR cases. Using a feature fusion method, the maximum AUC = 0.85 ± 0.05. Using the ANN-based classifier, the AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired prechemotherapy has potential to generate a useful clinical marker in predicting tumor response to chemotherapy.
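The AUC values reported above are equivalent to the Mann-Whitney U statistic: the probability that a randomly chosen responder scores higher than a randomly chosen non-responder. A minimal sketch on toy scores (not the study's data):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    fraction of (responder, non-responder) pairs where the responder
    scores higher; ties count one half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# perfect separation gives AUC 1.0; one tie lowers it to 0.875
a = auc([0.9, 0.8], [0.2, 0.1])
b = auc([0.9, 0.3], [0.3, 0.1])
```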

  14. Computer-aided breast MR image feature analysis for prediction of tumor response to chemotherapy

    PubMed Central

    Aghaei, Faranak; Tan, Maxine; Hollingsworth, Alan B.; Qian, Wei; Liu, Hong; Zheng, Bin

    2015-01-01

    Purpose: To identify a new clinical marker based on quantitative kinetic image feature analysis and assess its feasibility to predict tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial and nonresponse (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first one analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second approach tested an attribute selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance with the areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 to classify between CR and NR cases. Using a feature fusion method, the maximum AUC = 0.85 ± 0.05. Using the ANN-based classifier, the AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired prechemotherapy has potential to generate a useful clinical marker in predicting tumor response to chemotherapy. PMID:26520742

  15. CrystPro: Spatiotemporal Analysis of Protein Crystallization Images

    PubMed Central

    2015-01-01

    Thousands of experiments corresponding to different combinations of conditions are set up to determine the relevant conditions for successful protein crystallization. In recent years, high throughput robotic set-ups have been developed to automate the protein crystallization experiments, and imaging techniques are used to monitor the crystallization progress. Images are collected multiple times during the course of an experiment. The huge number of collected images makes manual review tedious and discouraging. In this paper, utilizing trace fluorescence labeling, we describe an automated system called CrystPro for monitoring protein crystal growth in crystallization trial images by analyzing the time-sequence images. Given the sets of image sequences, the objective is to develop an efficient and reliable system to detect crystal growth changes such as new crystal formation and increase of crystal size. CrystPro consists of three major steps: identification of crystallization trials proper for spatio-temporal analysis, spatio-temporal analysis of identified trials, and crystal growth analysis. We evaluated the performance of our system on 3 crystallization image datasets (PCP-ILopt-11, PCP-ILopt-12, and PCP-ILopt-13) and compared our results with expert scores. Our results indicate a) 98.3% accuracy and 0.896 sensitivity on identification of trials for spatio-temporal analysis, b) 77.4% accuracy and 0.986 sensitivity of identifying crystal pairs with new crystal formation, and c) 85.8% accuracy and 0.667 sensitivity on crystal size increase detection. The results show that our method is reliable and efficient for tracking growth of crystals and determining useful image sequences for further review by the crystallographers. PMID:26640418
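The accuracy and sensitivity figures used to evaluate CrystPro come from comparing automated binary decisions against expert scores. A minimal sketch of those two metrics on a toy set of labeled image pairs (the labels below are illustrative, not from the paper's datasets):

```python
def accuracy_sensitivity(truth, pred):
    """Accuracy and sensitivity for a binary 'growth detected' decision.
    truth/pred: sequences of 0/1 labels (expert score vs. automated call)."""
    tp = sum(1 for t, p in zip(truth, pred) if t and p)            # correctly detected
    tn = sum(1 for t, p in zip(truth, pred) if not t and not p)    # correctly rejected
    acc = (tp + tn) / len(truth)
    sens = tp / sum(truth)   # fraction of true growth events detected
    return acc, sens

truth = [1, 1, 1, 0, 0]   # expert: 3 pairs with new crystals, 2 without
pred  = [1, 1, 0, 0, 1]   # automated calls: one miss, one false alarm
acc, sens = accuracy_sensitivity(truth, pred)
```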

  16. Effect of nutrition survey 'cleaning criteria' on estimates of malnutrition prevalence and disease burden: secondary data analysis.

    PubMed

    Crowe, Sonya; Seal, Andrew; Grijalva-Eternod, Carlos; Kerac, Marko

    2014-01-01

    Tackling childhood malnutrition is a global health priority. A key indicator is the estimated prevalence of malnutrition, measured by nutrition surveys. Most aspects of survey design are standardised, but data 'cleaning criteria' are not. These aim to exclude extreme values which may represent measurement or data-entry errors. The effect of different cleaning criteria on malnutrition prevalence estimates was unknown. We applied five commonly used data cleaning criteria (WHO 2006; EPI-Info; WHO 1995 fixed; WHO 1995 flexible; SMART) to 21 national Demographic and Health Survey datasets. These included a total of 163,228 children, aged 6-59 months. We focused on wasting (low weight-for-height), a key indicator for treatment programmes. Choice of cleaning criteria had a marked effect: SMART were least inclusive, resulting in the lowest reported malnutrition prevalence, while WHO 2006 were most inclusive, resulting in the highest. Across the 21 countries, the proportion of records excluded was 3 to 5 times greater when using SMART compared to WHO 2006 criteria, resulting in differences in the estimated prevalence of total wasting of between 0.5 and 3.8%, and differences in severe wasting of 0.4-3