Science.gov

Sample records for image analysis criteria

  1. Theoretical analysis of multispectral image segmentation criteria.

    PubMed

    Kerfoot, I B; Bresler, Y

    1999-01-01

    Markov random field (MRF) image segmentation algorithms have been extensively studied, and have gained wide acceptance. However, almost all of the work on them has been experimental. This provides a good understanding of the performance of existing algorithms, but not a unified explanation of the significance of each component. To address this issue, we present a theoretical analysis of several MRF image segmentation criteria. Standard methods of signal detection and estimation are used in the theoretical analysis, which quantitatively predicts the performance at realistic noise levels. The analysis is decoupled into the problems of false alarm rate, parameter selection (Neyman-Pearson and receiver operating characteristics), detection threshold, expected a priori boundary roughness, and supervision. Only the performance inherent to a criterion, with perfect global optimization, is considered. The analysis indicates that boundary and region penalties are very useful, while distinct-mean penalties are of questionable merit. Region penalties are far more important for multispectral segmentation than for greyscale. This observation also holds for Gauss-Markov random fields, and for many separable within-class PDFs. To validate the analysis, we present optimization algorithms for several criteria. Theoretical and experimental results agree fairly well. PMID:18267494
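
    As a point of reference for the Neyman-Pearson component of the analysis, the sketch below computes a detection threshold and the resulting detection probability for a toy two-class Gaussian pixel model. All numbers are invented for illustration; this is not the authors' derivation.

      from scipy.stats import norm

      mu0, mu1, sigma = 0.0, 1.0, 0.5   # class means and shared noise std (invented)
      alpha = 0.01                      # target false alarm rate

      # Threshold fixing the false alarm rate under H0 (Neyman-Pearson setting).
      t = mu0 + sigma * norm.ppf(1.0 - alpha)

      # Detection probability under H1 at that threshold: one point on the ROC.
      p_detect = 1.0 - norm.cdf((t - mu1) / sigma)
      print(f"threshold={t:.3f}, Pd={p_detect:.3f} at Pfa={alpha}")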

  2. Design Criteria For Networked Image Analysis System

    NASA Astrophysics Data System (ADS)

    Reader, Cliff; Nitteberg, Alan

    1982-01-01

    Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special purpose designs. This change is motivated by several factors, notable among which is the increased opportunity for high performance with low cost offered by advances in semiconductor technology. Another key issue is a maturing understanding of problems and the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of the above issues, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with orientation toward the hospital environment. The three main areas are image data base management, viewing of image data, and image data processing. This is followed by a survey of the current state of the art, covering image display systems, data base techniques, communications networks, and software systems control. The paper concludes with a description of the functional subsystems and architectural framework for networked image analysis in a production environment.

  3. Difference image analysis: automatic kernel design using information criteria

    NASA Astrophysics Data System (ADS)

    Bramich, D. M.; Horne, Keith; Alsubai, K. A.; Bachelet, E.; Mislis, D.; Parley, N.

    2016-03-01

    We present a selection of methods for automatically constructing an optimal kernel model for difference image analysis which require very few external parameters to control the kernel design. Each method consists of two components; namely, a kernel design algorithm to generate a set of candidate kernel models, and a model selection criterion to select the simplest kernel model from the candidate models that provides a sufficiently good fit to the target image. We restricted our attention to the case of solving for a spatially invariant convolution kernel composed of delta basis functions, and we considered 19 different kernel solution methods including six employing kernel regularization. We tested these kernel solution methods by performing a comprehensive set of image simulations and investigating how their performance in terms of model error, fit quality, and photometric accuracy depends on the properties of the reference and target images. We find that the irregular kernel design algorithm employing unregularized delta basis functions, combined with either the Akaike or Takeuchi information criterion, is the best kernel solution method in terms of photometric accuracy. Our results are validated by tests performed on two independent sets of real data. Finally, we provide some important recommendations for software implementations of difference image analysis.
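
    The core of the approach, a delta-basis kernel solved by least squares and compared across sizes with an information criterion, can be sketched as follows. The toy images, kernel sizes, and the simplified Gaussian AIC are assumptions for illustration; the paper's regularized variants and the Takeuchi criterion are not reproduced.

      import numpy as np

      def fit_kernel(ref, target, half):
          """Least-squares delta-basis kernel of size (2*half+1)^2 minimizing
          ||target - kernel (*) ref||^2 over interior pixels; returns its AIC."""
          size = 2 * half + 1
          cols = []
          for dy in range(-half, half + 1):
              for dx in range(-half, half + 1):
                  shifted = np.roll(ref, (dy, dx), axis=(0, 1))
                  cols.append(shifted[half:-half, half:-half].ravel())
          A = np.stack(cols, axis=1)
          b = target[half:-half, half:-half].ravel()
          coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
          rss = float(np.sum((b - A @ coeffs) ** 2))
          n, k = b.size, coeffs.size
          # Gaussian AIC up to a constant; n varies slightly with the crop,
          # which this sketch ignores.
          return n * np.log(rss / n) + 2 * k

      rng = np.random.default_rng(0)
      ref = rng.normal(100.0, 5.0, (64, 64))
      # Toy target: reference blurred by a two-tap vertical kernel plus noise.
      target = 0.5 * (np.roll(ref, 1, axis=0) + np.roll(ref, -1, axis=0))
      target = target + rng.normal(0.0, 1.0, ref.shape)
      best_half = min((fit_kernel(ref, target, h), h) for h in (1, 2, 3))[1]
      print("kernel half-size selected by AIC:", best_half)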

  4. Classification of functional bowel disorders by objective physiological criteria based on endoluminal image analysis.

    PubMed

    Malagelada, Carolina; Drozdzal, Michal; Seguí, Santi; Mendez, Sara; Vitrià, Jordi; Radeva, Petia; Santos, Javier; Accarino, Anna; Malagelada, Juan-R; Azpiroz, Fernando

    2015-09-15

    We have previously developed an original method to evaluate small bowel motor function based on computer vision analysis of endoluminal images obtained by capsule endoscopy. Our aim was to demonstrate intestinal motor abnormalities in patients with functional bowel disorders by endoluminal vision analysis. Patients with functional bowel disorders (n = 205) and healthy subjects (n = 136) ingested the endoscopic capsule (Pillcam-SB2, Given-Imaging) after an overnight fast, and 45 min after gastric exit of the capsule a liquid meal (300 ml, 1 kcal/ml) was administered. Endoluminal image analysis was performed by computer vision and machine learning techniques to define the normal range and to identify clusters of abnormal function. After training the algorithm, we used 196 patients and 48 healthy subjects, completely naive, as the test set. In the test set, 51 patients (26%) were detected outside the normal range (P < 0.001 vs. 3 healthy subjects) and clustered into hypo- and hyperdynamic subgroups compared with healthy subjects. Patients with hypodynamic behavior (n = 38) exhibited fewer luminal closure sequences (41 ± 2% of the recording time vs. 61 ± 2%; P < 0.001) and more static sequences (38 ± 3 vs. 20 ± 2%; P < 0.001); in contrast, patients with hyperdynamic behavior (n = 13) had an increased proportion of luminal closure sequences (73 ± 4 vs. 61 ± 2%; P = 0.029) and more high-motion sequences (3 ± 1 vs. 0.5 ± 0.1%; P < 0.001). Applying an original methodology, we have developed a novel classification of functional gut disorders based on objective, physiological criteria of small bowel function. PMID:26251472

  5. Terahertz Wide-Angle Imaging and Analysis on Plane-wave Criteria Based on Inverse Synthetic Aperture Techniques

    NASA Astrophysics Data System (ADS)

    Gao, Jing Kun; Qin, Yu Liang; Deng, Bin; Wang, Hong Qiang; Li, Jin; Li, Xiang

    2016-04-01

    This paper presents two parts of work on terahertz imaging applications. The first part aims at solving problems that arise as the rotation angle increases. To compensate for the nonlinearity of terahertz radar systems, a calibration signal acquired from a bright target is always used. Generally, this compensation inserts an extra linear phase term in the intermediate frequency (IF) echo signal, which is not expected in large-rotation-angle imaging applications. We carried out a detailed theoretical analysis of this problem, and a minimum entropy criterion was employed to estimate and compensate for the linear-phase errors. In the second part, the effects of spherical waves on terahertz inverse synthetic aperture imaging are analyzed. Analytic criteria for the plane-wave approximation were derived for different rotation angles. Experimental results of corner reflectors and an aircraft model based on a 330-GHz linear frequency-modulated continuous wave (LFMCW) radar system validated the necessity and effectiveness of the proposed compensation. A comparison of the experimental images obtained under the plane-wave assumption and with spherical-wave correction also showed them to be highly consistent with the analytic criteria we derived.
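
    For intuition, the minimum-entropy idea can be sketched in one dimension: apply a trial linear phase to the measured spectrum, compress it, and keep the slope whose image has the lowest entropy. The signal model, the grid search, and all parameters below are illustrative assumptions, not the authors' processing chain.

      import numpy as np

      def image_entropy(profile):
          p = np.abs(profile) ** 2
          p = p / p.sum()
          return -np.sum(p * np.log(p + 1e-12))

      N = 256
      f = np.arange(N)
      true_slope = 0.31                             # toy residual linear phase
      spectrum = np.exp(2j * np.pi * 30 * f / N)    # a point target at bin 30
      spectrum = spectrum * np.exp(2j * np.pi * true_slope * f / N)  # corruption

      # Search slopes in [0, 0.5); an integer slope is a pure shift and would
      # leave the entropy unchanged, so the search interval is kept small.
      slopes = np.linspace(0.0, 0.5, 201)
      entropies = [image_entropy(np.fft.ifft(spectrum * np.exp(-2j * np.pi * s * f / N)))
                   for s in slopes]
      best = slopes[int(np.argmin(entropies))]
      print(f"estimated slope {best:.3f} (true value {true_slope})")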

  6. Epilepsy Imaging Study Guideline Criteria

    PubMed Central

    Gaillard, William D; Cross, J Helen; Duncan, John S; Stefan, Hermann; Theodore, William H

    2011-01-01

    Recognition of limited economic resources, as well as potential adverse effects of ‘over testing,’ has increased interest in ‘evidence-based’ assessment of new medical technology. This creates a particular problem for evaluation and treatment of epilepsy, increasingly dependent on advanced imaging and electrophysiology, since there is a marked paucity of epilepsy diagnostic and prognostic studies that meet rigorous standards for evidence classification. The lack of high quality data reflects fundamental weaknesses in many imaging studies but also limitations in the assumptions underlying evidence classification schemes as they relate to epilepsy, and to the practicalities of conducting adequately powered studies of rapidly evolving technologies. We review the limitations of current guidelines and propose elements for imaging studies that can contribute meaningfully to the epilepsy literature. PMID:21740417

  7. Users' Relevance Criteria in Image Retrieval in American History.

    ERIC Educational Resources Information Center

    Choi, Youngok; Rasmussen, Edie M.

    2002-01-01

    Discussion of the availability of digital images focuses on a study of American history faculty and graduate students that investigated the criteria which image users apply when making judgments about the relevance of an image. Considers topicality and image quality and suggests implications for image retrieval system design. (Contains 63…

  8. Magnetic Resonance Imaging Criteria for Thrombolysis in Hyperacute Cerebral Infarction

    PubMed Central

    AHMETGJEKAJ, ILIR; KABASHI-MUÇAJ, SERBEZE; LASCU, LUANA CORINA; KABASHI, ANTIGONA; BONDARI, A.; BONDARI, SIMONA; DEDUSHI-HOTI, KRESHNIKE; BIÇAKU, ARDIAN; SHATRI, JETON

    2014-01-01

    Purpose: Selection by MRI of patients with cerebral infarction who are suitable for thrombolytic therapy is an emerging application. Although the efficacy of therapy with i.v. tissue plasminogen activator (tPA) within 3 hours after onset of symptoms has been proven in patients selected with CT, these criteria are now determined by MRI, as the data gathered are fast and accurate in the first hours. Material and methods: MRI screening in patients with acute cerebral infarction before application of thrombolytic therapy was performed at the University Clinical Centre (UCC) Mannheim in Germany. Unlike trials with CT, MRI studies demonstrated the benefits of therapy up to 6 hours after the onset of symptoms. We studied 21 patients hospitalized in the Clinic of Neuroradiology at the University Clinical Centre in Mannheim, Germany. All underwent brain MRI evaluation for stroke. This article also reviews literature on the application of thrombolysis in patients with cerebral infarction based on MRI. Results: We analyzed the MRI criteria for i.v. application of tPA at this university centre. Alongside the personal viewpoints of clinicians, the survey reveals a variety of clinical aspects and MRI features that are open for further exploration: therapeutic effects, the use of MRI angiography, dynamics, and others. Conclusions: MRI is a tested imaging method for rapid evaluation of patients with hyperacute cerebral infarction, replacing the use of CT imaging and clinical features. MRI criteria for thrombolytic therapy are being applied in some cerebral vascular centres. In Kosovo, the application of thrombolytic therapy has not started yet. PMID:25729591

  9. Retinal Imaging and Image Analysis

    PubMed Central

    Abràmoff, Michael D.; Garvin, Mona K.; Sonka, Milan

    2011-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of blindness in the industrialized world that includes age-related macular degeneration, diabetic retinopathy, and glaucoma, the review is devoted to retinal imaging and image analysis methods and their clinical implications. Methods for 2-D fundus imaging and techniques for 3-D optical coherence tomography (OCT) imaging are reviewed. Special attention is given to quantitative techniques for analysis of fundus photographs with a focus on clinically relevant assessment of retinal vasculature, identification of retinal lesions, assessment of optic nerve head (ONH) shape, building retinal atlases, and to automated methods for population screening for retinal diseases. A separate section is devoted to 3-D analysis of OCT images, describing methods for segmentation and analysis of retinal layers, retinal vasculature, and 2-D/3-D detection of symptomatic exudate-associated derangements, as well as to OCT-based analysis of ONH morphology and shape. Throughout the paper, aspects of image acquisition, image analysis, and clinical relevance are treated together considering their mutually interlinked relationships. PMID:21743764

  10. Analysis of the impact of safeguards criteria

    SciTech Connect

    Mullen, M.F.; Reardon, P.T.

    1981-01-01

    As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable; with the methodology used in this study, these interactions can be quantitatively analyzed, and reasonably good approximate prediction equations can be developed.

  11. Statistical-information-based performance criteria for Richardson-Lucy image deblurring.

    PubMed

    Prasad, Sudhakar

    2002-07-01

    Iterative image deconvolution algorithms generally lack objective criteria for deciding when to terminate iterations, often relying on ad hoc metrics for determining optimal performance. A statistical-information-based analysis of the popular Richardson-Lucy iterative deblurring algorithm is presented after clarification of the detailed nature of noise amplification and resolution recovery as the algorithm iterates. Monitoring the information content of the reconstructed image furnishes an alternative criterion for assessing and stopping such an iterative algorithm. It is straightforward to implement prior knowledge and other conditioning tools in this statistical approach. PMID:12095196
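
    The Richardson-Lucy iteration itself is compact; the sketch below pairs it with a crude relative-change stopping rule purely as a placeholder, since the paper's information-content criterion is not reproduced here. The 1-D test signal and PSF are invented.

      import numpy as np

      def richardson_lucy(data, psf, max_iter=200, tol=1e-5):
          psf = psf / psf.sum()
          psf_flip = psf[::-1]                       # mirrored PSF for the correction step
          est = np.full_like(data, data.mean())      # flat initial estimate
          for it in range(max_iter):
              blurred = np.convolve(est, psf, mode="same")
              ratio = data / np.maximum(blurred, 1e-12)
              new = est * np.convolve(ratio, psf_flip, mode="same")
              # Ad hoc stopping rule: relative size of the update.
              if np.linalg.norm(new - est) < tol * np.linalg.norm(est):
                  return new, it + 1
              est = new
          return est, max_iter

      rng = np.random.default_rng(1)
      truth = np.zeros(128); truth[40] = 50.0; truth[40:90] += 5.0
      psf = np.exp(-0.5 * (np.arange(-7, 8) / 2.0) ** 2)
      data = rng.poisson(np.convolve(truth, psf / psf.sum(), mode="same")).astype(float)
      restored, n_iter = richardson_lucy(data, psf)
      print("stopped after", n_iter, "iterations")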

  12. Adaptive lifting scheme with sparse criteria for image coding

    NASA Astrophysics Data System (ADS)

    Kaaniche, Mounir; Pesquet-Popescu, Béatrice; Benazza-Benyahia, Amel; Pesquet, Jean-Christophe

    2012-12-01

    Lifting schemes (LS) were found to be efficient tools for image coding purposes. Since LS-based decompositions depend on the choice of the prediction/update operators, many research efforts have been devoted to the design of adaptive structures. The most commonly used approaches optimize the prediction filters by minimizing the variance of the detail coefficients. In this article, we investigate techniques for optimizing sparsity criteria by focusing on the use of an ℓ1 criterion instead of an ℓ2 one. Since the output of a prediction filter may be used as an input for the other prediction filters, we then propose to optimize such a filter by minimizing a weighted ℓ1 criterion related to the global rate-distortion performance. More specifically, it will be shown that the optimization of the diagonal prediction filter depends on the optimization of the other prediction filters and vice versa. Related to this fact, we propose to jointly optimize the prediction filters by using an algorithm that alternates between the optimization of the filters and the computation of the weights. Experimental results show the benefits which can be drawn from the proposed optimization of the lifting operators.
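
    As a toy illustration of the ℓ1 idea, the sketch below fits a single 1-D prediction filter by iteratively reweighted least squares (IRLS), a standard surrogate for ℓ1 minimization. The filter length, the test signal, and the IRLS surrogate itself are assumptions; the paper's weighted criterion and joint optimization of several filters are not reproduced.

      import numpy as np

      def l1_prediction_filter(x, taps=2, iters=30, eps=1e-8):
          even, odd = x[0::2], x[1::2]
          n = min(len(even) - taps + 1, len(odd))
          # Each row holds the 'taps' even neighbours used to predict one odd sample.
          A = np.stack([even[i:i + taps] for i in range(n)])
          b = odd[:n]
          p = np.linalg.lstsq(A, b, rcond=None)[0]        # l2 initialisation
          for _ in range(iters):
              w = 1.0 / np.maximum(np.abs(b - A @ p), eps)  # IRLS weights for l1
              p = np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (b * w))
          return p, b - A @ p                             # filter, detail signal

      x = np.cumsum(np.random.default_rng(2).normal(size=256))  # toy smooth signal
      p, detail = l1_prediction_filter(x)
      print("filter:", p, " mean |detail|:", np.mean(np.abs(detail)))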

  13. A new tool for analysis of cleanup criteria decisions.

    PubMed

    Klemic, Gladys A; Bailey, Paul; Elcock, Deborah

    2003-08-01

    Radionuclides and other hazardous materials resulting from processes used in nuclear weapons production contaminate soil, groundwater, and buildings around the United States. Cleanup criteria for environmental contaminants are agreed on prior to remediation and underpin the scope and legacy of the cleanup process. Analysis of cleanup criteria can be relevant for future agreements and may also provide insight into a complex decision making process where science and policy issues converge. An Internet accessible database has been established to summarize cleanup criteria and related factors involved in U.S. Department of Energy remediation decisions. This paper reports on a new user interface for the database that is designed to integrate related information into graphic displays and tables with interactive features that allow exploratory data analysis of cleanup criteria. Analysis of 137Cs in surface soil is presented as an example. PMID:12865746

  14. EACVI appropriateness criteria for the use of cardiovascular imaging in heart failure derived from European National Imaging Societies voting.

    PubMed

    Garbi, Madalina; Edvardsen, Thor; Bax, Jeroen; Petersen, Steffen E; McDonagh, Theresa; Filippatos, Gerasimos; Lancellotti, Patrizio

    2016-07-01

    This paper presents the first European appropriateness criteria for the use of cardiovascular imaging in heart failure, derived from voting of the European National Imaging Societies representatives. The paper describes the development process and discusses the results. PMID:27129538

  15. Alternative Test Criteria in Covariance Structure Analysis: A Unified Approach.

    ERIC Educational Resources Information Center

    Satorra, Albert

    1989-01-01

    Within covariance structural analysis, a unified approach to asymptotic theory of alternative test criteria for testing parametric restrictions is provided. More general statistics for addressing the case where the discrepancy function is not asymptotically optimal, and issues concerning power analysis and the asymptotic theory of testing-related…

  16. GIS Based Multi-Criteria Decision Analysis For Cement Plant Site Selection For Cuddalore District

    NASA Astrophysics Data System (ADS)

    Chhabra, A.

    2015-12-01

    India's cement industry is a vital part of its economy, providing employment to more than a million people. On the back of growing demand due to increased construction and infrastructural activity, the cement market in India is expected to grow at a compound annual growth rate (CAGR) of 8.96 percent during the period 2014-2019. In this study, GIS-based spatial Multi Criteria Decision Analysis (MCDA) is used to determine the optimum and alternative sites to set up a cement plant. This technique uses a set of evaluation criteria that are quantifiable indicators of the extent to which decision objectives are realized. Combined with available GIS (Geographical Information System) and local ancillary data, the outputs of image analysis serve as input for the multi-criteria decision-making system. The criteria were represented as GIS layers, which underwent GIS analysis to identify several potential sites. Satellite imagery from LANDSAT 8 and ASTER DEM were used for the analysis. Cuddalore District in Tamil Nadu was selected as the study site, as limestone mining is already being carried out in that region, which meets the criterion of raw material for cement production. Other criteria considered were land use land cover (LULC) classification (built-up area, river, forest cover, wet land, barren land, harvest land and agriculture land), slope, and proximity to road, railway, and drainage networks.
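
    The weighted-overlay core of such an analysis reduces to a few array operations once the criterion layers are normalized. The layer names, weights, and constraint mask below are hypothetical stand-ins for the LULC, slope, and proximity layers described above.

      import numpy as np

      rng = np.random.default_rng(3)
      shape = (100, 100)
      # Criterion layers scaled to [0, 1] (1 = most suitable), e.g. derived from
      # slope, distance to roads, and distance to limestone deposits.
      slope_score = rng.random(shape)
      road_score = rng.random(shape)
      limestone_score = rng.random(shape)
      weights = {"slope": 0.2, "road": 0.3, "limestone": 0.5}

      suitability = (weights["slope"] * slope_score
                     + weights["road"] * road_score
                     + weights["limestone"] * limestone_score)

      # Boolean constraints (e.g. exclude built-up areas and water bodies).
      excluded = rng.random(shape) < 0.1
      suitability[excluded] = 0.0

      # Report the best candidate cell.
      iy, ix = np.unravel_index(np.argmax(suitability), shape)
      print(f"best cell ({iy}, {ix}) with score {suitability[iy, ix]:.2f}")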

  17. Image-analysis library

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The MATHPAC image-analysis library is a collection of general-purpose mathematical and statistical routines and special-purpose data-analysis and pattern-recognition routines for image analysis. The MATHPAC library consists of Linear Algebra, Optimization, Statistical-Summary, Densities and Distribution, Regression, and Statistical-Test packages.

  18. Electronic image analysis

    NASA Astrophysics Data System (ADS)

    Gahm, J.; Grosskopf, R.; Jaeger, H.; Trautwein, F.

    1980-12-01

    An electronic system for image analysis was developed on the basis of low and medium cost integrated circuits. The printed circuit boards were designed, using the principles of modern digital electronics and data processing. The system consists of modules for automatic, semiautomatic and visual image analysis. They can be used for microscopical and macroscopical observations. Photographs can be evaluated, too. The automatic version is controlled by software modules adapted to various applications. The result is a system for image analysis suitable for many different measurement problems. The features contained in large image areas can be measured. For automatic routine analysis controlled by processing calculators the necessary software and hardware modules are available.

  19. Basics of image analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hyperspectral imaging technology has emerged as a powerful tool for quality and safety inspection of food and agricultural products and in precision agriculture over the past decade. Image analysis is a critical step in implementing hyperspectral imaging technology; it is aimed to improve the qualit...

  20. Improving diagnostic criteria for Propionibacterium acnes osteomyelitis: a retrospective analysis.

    PubMed

    Asseray, Nathalie; Papin, Christophe; Touchais, Sophie; Bemer, Pascale; Lambert, Chantal; Boutoille, David; Tequi, Brigitte; Gouin, François; Raffi, François; Passuti, Norbert; Potel, Gilles

    2010-07-01

    The identification of Propionibacterium acnes in cultures of bone and joint samples is always difficult to interpret because of the ubiquity of this microorganism. The aim of this study was to propose a diagnostic strategy to distinguish infections from contaminations. This was a retrospective analysis of all patient charts of those patients with ≥1 deep samples culture-positive for P. acnes. Every criterion was tested for sensitivity, specificity, and positive likelihood ratio, and then the diagnostic probability of combinations of criteria was calculated. Among 65 patients, 52 (80%) were considered truly infected with P. acnes, a diagnosis based on a multidisciplinary process. The most valuable diagnostic criteria were: ≥2 positive deep samples, peri-operative findings (necrosis, hardware loosening, etc.), and ≥2 surgical procedures. However, no single criterion was sufficient to ascertain the diagnosis. The following combinations of criteria had a diagnostic probability of >90%: ≥2 positive cultures + 1 criterion among: peri-operative findings, local signs of infection, ≥2 previous operations, orthopaedic devices; 1 positive culture + 3 criteria among: peri-operative findings, local signs of infection, ≥2 previous surgical operations, orthopaedic devices, inflammatory syndrome. The diagnosis of P. acnes osteomyelitis was greatly improved by combining different criteria, allowing differentiation between infection and contamination. PMID:20141491
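
    The screening statistics named above follow directly from 2x2 counts, as in the sketch below; the counts are invented for illustration (sized to echo the 52 infected and 13 contaminated patients), not taken from the study's per-criterion data.

      def diagnostic_stats(tp, fp, fn, tn):
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          lr_pos = sens / (1.0 - spec) if spec < 1.0 else float("inf")
          return sens, spec, lr_pos

      # e.g. a criterion present in 40/52 infected and 3/13 contaminated patients:
      sens, spec, lr = diagnostic_stats(tp=40, fp=3, fn=12, tn=10)
      print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, LR+={lr:.1f}")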

  1. Canine intracranial gliomas: Relationship between magnetic resonance imaging criteria and tumor type and grade

    PubMed Central

    Bentley, R.T.; Ober, C.P.; Anderson, K.L.; Feeney, D.A.; Naughton, J.F.; Ohlfest, J.R.; O’Sullivan, M.G.; Miller, M.A.; Constable, P.D.; Pluhar, G.E.

    2013-01-01

    Limited information is available to assist in the ante-mortem prediction of tumor type and grade for dogs with primary brain tumors. The objective of the current study was to identify magnetic resonance imaging (MRI) criteria related to the histopathological type and grade of gliomas in dogs. A convenience sample utilizing client-owned dogs (n=31) with gliomas was used. Medical records of dogs with intracranial lesions admitted to two veterinary referral hospitals were reviewed and cases with a complete brain MRI and definitive histopathological diagnosis were retrieved for analysis. Each MRI was independently interpreted by five investigators who were provided with standardized grading instructions and remained blinded to the histopathological diagnosis. Mild to no contrast enhancement, an absence of cystic structures (single or multiple), and a tumor location other than the thalamo-capsular region were independently associated with grade II tumors compared to higher grade tumors. In comparison to oligodendrogliomas, astrocytomas were independently associated with the presence of moderate to extensive peri-tumoral edema, a lack of ventricular distortion, and an isointense or hyperintense T1W-signal. When clinical and MRI features indicate that a glioma is most likely, certain MRI criteria can be used to inform the level of suspicion for low tumor grade, particularly poor contrast enhancement. Information obtained from the MRI of such dogs can also assist in predicting an astrocytoma or an oligodendroglioma, but no single imaging characteristic allows for a particular tumor type to be ruled out. PMID:24051197

  2. Improvement and Extension of Shape Evaluation Criteria in Multi-Scale Image Segmentation

    NASA Astrophysics Data System (ADS)

    Sakamoto, M.; Honda, Y.; Kondo, A.

    2016-06-01

    Over the last decade, multi-scale image segmentation has attracted particular interest and is practically used for object-based image analysis. In this study, we address issues in multi-scale image segmentation, especially improving the validity of merging and the variety of derived region shapes. Firstly, we introduce constraints on the application of the spectral criterion that suppress excessive merging between dissimilar regions. Secondly, we extend the evaluation of the smoothness criterion by modifying the definition of the extent of the object, introduced to control shape diversity. Thirdly, we develop a new shape criterion called aspect ratio, which helps to improve the reproducibility of object shapes matched to the actual objects of interest; it constrains the aspect ratio of the object's bounding box while keeping the properties controlled by conventional shape criteria. These improvements and extensions lead to more accurate, flexible, and diverse segmentation results according to the shape characteristics of the target of interest. Furthermore, we investigate a technique for quantitative and automatic parameterization in multi-scale image segmentation, achieved by comparing the segmentation result with a training area specified in advance, either maximizing the average area of derived objects or satisfying the evaluation index called the F-measure. This makes it possible to automate parameterization suited to the objectives, especially with respect to shape reproducibility.
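
    A bounding-box aspect-ratio criterion of the general kind described can be sketched in a few lines; the exact definition used by the authors is not given here, so this version (longer side over shorter side of the bounding box) is an assumption.

      import numpy as np

      def aspect_ratio(region_mask):
          """Aspect ratio of the region's bounding box (always >= 1)."""
          ys, xs = np.nonzero(region_mask)
          h = ys.max() - ys.min() + 1
          w = xs.max() - xs.min() + 1
          return max(h, w) / min(h, w)

      mask = np.zeros((20, 20), bool)
      mask[5:8, 2:18] = True                        # an elongated region
      print("aspect ratio:", aspect_ratio(mask))    # 16/3, roughly 5.33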

  3. Criteria for recognition of bedding structures utilizing imaging devices

    SciTech Connect

Fett, T.

    1990-09-01

    Wireline logs, especially dipmeter imaging devices, can bridge the gap between the wealth of outcrop and recent information about bedding structures and the immense data base available from mineral industry boreholes. The recognition of bedding structures as observed in ancient and recent exposures is not always straightforward with wireline measurements. A major strength of outcrop studies is their two- and sometimes three-dimensional aspect and the availability of very fine details. A strength of wireline measurements is their great abundance. Wireline logs provide a variety of petrophysical information, including lithology. The dipmeter provides much finer detail (down to 1 cm or less in the case of imaging devices) and the information is three-dimensional. With remarkable detail, the microresistivity imaging devices can delineate and precisely orient bedding surfaces and hence bedding structures. The nature of contacts (abrupt, tangential) and of surfaces (planar, wavy), as well as the size and shape of objects are routinely available. Lithology, texture, and other sedimentary features such as cross-bedding (trough and planar), slumping, bioturbation, graded beds, erosional contacts, lag, etc., are seen. Image workstations are greatly facilitating the recognition of bedding structures from wireline microresistivity images. The synergistic merging of the wealth of information available from field studies with this type of quantitative detail, available from boreholes, offers great opportunity for defining sedimentary structures and depositional environments and hence for delineating reservoirs.

  4. Validating new diagnostic imaging criteria for primary progressive aphasia via anatomical likelihood estimation meta-analyses.

    PubMed

    Bisenius, S; Neumann, J; Schroeter, M L

    2016-04-01

    Recently, diagnostic clinical and imaging criteria for primary progressive aphasia (PPA) have been revised by an international consortium (Gorno-Tempini et al. Neurology 2011;76:1006-14). The aim of this study was to validate the specificity of the new imaging criteria and investigate whether different imaging modalities [magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET)] require different diagnostic subtype-specific imaging criteria. Anatomical likelihood estimation meta-analyses were conducted for PPA subtypes across a large cohort of 396 patients: firstly, across MRI studies for each of the three PPA subtypes followed by conjunction and subtraction analyses to investigate the specificity, and, secondly, by comparing results across MRI vs. FDG-PET studies in semantic dementia and progressive nonfluent aphasia. Semantic dementia showed atrophy in temporal, fusiform, parahippocampal gyri, hippocampus, and amygdala, progressive nonfluent aphasia in left putamen, insula, middle/superior temporal, precentral, and frontal gyri, logopenic progressive aphasia in middle/superior temporal, supramarginal, and dorsal posterior cingulate gyri. Results of the disease-specific meta-analyses across MRI studies were disjunct. Similarly, atrophic and hypometabolic brain networks were regionally dissociated in both semantic dementia and progressive nonfluent aphasia. In conclusion, meta-analyses support the specificity of new diagnostic imaging criteria for PPA and suggest that they should be specified for each imaging modality separately. PMID:26901360

  5. Description, Recognition and Analysis of Biological Images

    SciTech Connect

    Yu Donggang; Jin, Jesse S.; Luo Suhuai; Pham, Tuan D.; Lai Wei

    2010-01-25

    Description, recognition, and analysis of biological images play an important role in describing and understanding the related biological information. The color images are separated by color reduction. A new and efficient linearization algorithm is introduced based on criteria of difference chain codes. A series of critical points is obtained from the linearized lines. The curvature angle, linearity, maximum linearity, convexity, concavity, and bend angle of linearized lines are calculated from the starting line to the end line along all smoothed contours. The method can be used for shape description and recognition. The analysis, decision, and classification of the biological images are based on the description of morphological structures, color information, and prior knowledge, which are associated with each other. The efficiency of the algorithms is demonstrated in two applications: the description, recognition, and analysis of color flower images, and the dynamic description, recognition, and analysis of cell-cycle images.

  6. Release criteria and pathway analysis for radiological remediation

    SciTech Connect

Subbaraman, G.; Tuttle, R.J.; Oliver, B.M. (Rocketdyne Div.); Devgun, J.S.

    1991-01-01

    Site-specific activity concentrations were derived for soils contaminated with mixed fission products (MFP), or uranium-processing residues, using the Department of Energy (DOE) pathway analysis computer code RESRAD at four different sites. The concentrations and other radiological parameters, such as limits on background-subtracted gamma exposure rate were used as the basis to arrive at release criteria for two of the sites. Valid statistical parameters, calculated for the distribution of radiological data obtained from site surveys, were then compared with the criteria to determine releasability or need for further decontamination. For the other two sites, RESRAD has been used as a preremediation planning tool to derive residual material guidelines for uranium. 11 refs., 4 figs., 3 tabs.

  7. Investigation of various criteria for evaluation of aluminum thin foil ''smart sensors'' images

    NASA Astrophysics Data System (ADS)

    Panin, S. V.; Eremin, A. V.; Lyubutin, P. S.; Burkov, M. V.

    2014-10-01

    Various criteria for processing images of aluminum foil "smart sensors" for fatigue evaluation of carbon fiber reinforced polymer (CFRP) were analyzed. Informative parameters are used to assess image quality and surface relief and, accordingly, to characterize the fatigue damage state of the CFRP. The sensitivity of all criteria to distortions, particularly Gaussian noise, blurring, and JPEG compression, was investigated. The main purpose of the research is to identify informative parameters for fatigue evaluation that are least sensitive to these distortions.

  8. Development of Advanced Imaging Criteria for the Endoscopic Identification of Inflammatory Polyps

    PubMed Central

    Sussman, Daniel A; Barkin, Jodie A; Martin, Aileen M; Varma, Tanya; Clarke, Jennifer; Quintero, Maria A; Barkin, Heather B; Deshpande, Amar R; Barkin, Jamie S; Abreu, Maria T

    2015-01-01

    OBJECTIVES: Inflammatory polyps (IPs) are frequently encountered at colonoscopy in inflammatory bowel disease (IBD) patients and are associated with an increased risk of colon cancer. The aim of this prospective endoscopic image review and analysis was to describe endoscopic features of IPs in IBD patients at surveillance colonoscopy and determine the ability to endoscopically discern IPs from other colon polyps using high-definition white light (WL), narrow band imaging with magnification (NBI), and chromoendoscopy (CE). METHODS: Digital images of IPs using WL, NBI, and CE were reviewed by four attending gastroenterologists using a two-round modified Delphi method. The ability to endoscopically discern IPs from other colon polyps was determined among groups of gastroenterology fellows and attendings. IPs were classified by gross appearance, contour, surface pattern, pit pattern, and appearance of surrounding mucosa in IPs, as well as accuracy of diagnosis. RESULTS: Features characteristic of IPs included a fibrinous cap, surface friability and ulceration, an appendage-like appearance, the halo sign with CE, and a clustering of a multiplicity of IPs. The overall diagnostic accuracy for IP identification was 63% for WL, 42% for NBI, and 64% for CE. High degrees of histologic inflammation significantly improved the accuracy of diagnosis of IP with WL and CE, whereas the use of NBI significantly impaired IP accuracy. CONCLUSIONS: The overall diagnostic accuracy when applying these criteria to clinical images was modest, with incremental benefit with addition of CE to WL. CE showed promise predicting IP histology in actively inflamed tissue. Institutional Review Board approval was obtained. ClinicalTrials.gov Identifier: NCT01557387. PMID:26583503

  9. Systemic Sclerosis Classification Criteria: Developing methods for multi-criteria decision analysis with 1000Minds

    PubMed Central

    Johnson, Sindhu R.; Naden, Raymond P.; Fransen, Jaap; van den Hoogen, Frank; Pope, Janet E.; Baron, Murray; Tyndall, Alan; Matucci-Cerinic, Marco; Denton, Christopher P.; Distler, Oliver; Gabrielli, Armando; van Laar, Jacob M.; Mayes, Maureen; Steen, Virginia; Seibold, James R.; Clements, Phillip; Medsger, Thomas A.; Carreira, Patricia E.; Riemekasten, Gabriela; Chung, Lorinda; Fessler, Barri J.; Merkel, Peter A.; Silver, Richard; Varga, John; Allanore, Yannick; Mueller-Ladner, Ulf; Vonk, Madelon C.; Walker, Ulrich A.; Cappelli, Susanna; Khanna, Dinesh

    2014-01-01

    Objective Classification criteria for systemic sclerosis (SSc) are being developed. The objectives were to: develop an instrument for collating case-data and evaluate its sensibility; use forced-choice methods to reduce and weight criteria; and explore agreement between experts on the probability that cases were classified as SSc. Study Design and Setting A standardized instrument was tested for sensibility. The instrument was applied to 20 cases covering a range of probabilities that each had SSc. Experts rank-ordered cases from highest to lowest probability; reduced and weighted the criteria using forced-choice methods; and re-ranked the cases. Consistency in rankings was evaluated using intraclass correlation coefficients (ICC). Results Experts endorsed clarity (83%), comprehensibility (100%), face and content validity (100%). Criteria were weighted (points): finger skin thickening (14–22), finger-tip lesions (9–21), friction rubs (21), finger flexion contractures (16), pulmonary fibrosis (14), SSc-related antibodies (15), Raynaud’s phenomenon (13), calcinosis (12), pulmonary hypertension (11), renal crisis (11), telangiectasia (10), abnormal nailfold capillaries (10), esophageal dilation (7) and puffy fingers (5). The ICC across experts was 0.73 (95%CI 0.58,0.86) and improved to 0.80 (95%CI 0.68,0.90). Conclusions Using a sensible instrument and forced-choice methods, the number of criteria was reduced by 39% (23 to 14) and the criteria were weighted. Our methods reflect the rigors of measurement science and serve as a template for developing classification criteria. PMID:24721558

  10. Image Analysis of Foods.

    PubMed

    Russ, John C

    2015-09-01

    The structure of foods, both natural and processed ones, is controlled by many variables ranging from biology to chemistry and mechanical forces. The structure also controls many of the properties of the food, including consumer acceptance, taste, mouthfeel, appearance, and nutrition. Imaging provides an important tool for measuring the structure of foods. This includes 2-dimensional (2D) images of surfaces and sections, for example, viewed in a microscope, as well as 3-dimensional (3D) images of internal structure as may be produced by confocal microscopy, or computed tomography and magnetic resonance imaging. The use of images also guides robotics for harvesting and sorting. Processing of images may be needed to calibrate colors, reduce noise, enhance detail, and delineate structure and dimensions. Measurement of structural information such as volume fraction and internal surface areas, as well as the analysis of object size, location, and shape in both 2- and 3-dimensional images is illustrated and described, with primary references and examples from a wide range of applications. PMID:26270611

  11. Adherence to Criteria for Transvaginal Ultrasound Imaging and Measurement of Cervical Length

    PubMed Central

    Iams, JD; Grobman, WA; Lozitska, A; Spong, CY; Saade, G; Mercer, BM; Tita, AN; Rouse, DJ; Sorokin, Y; Wapner, RJ; Leveno, KJ; Esplin, MS; Tolosa, JE; Thorp, JM; Caritis, SN; Van Dorsten, JP

    2014-01-01

    Background Adherence to published criteria for transvaginal imaging and measurement of cervical length is uncertain. We sought to assess adherence by evaluating images submitted to certify research sonographers for participation in a clinical trial. Study Design We reviewed qualifying test results of sonographers seeking certification to image and measure cervical length in a clinical trial. Participating sonographers were required to access training materials and submit 15 images, three each from five pregnant women not enrolled in the trial. One of two sonologists reviewed all qualifying images. We recorded the proportion of images that did not meet standard criteria (excess compression, landmarks not seen, improper image size, or full maternal bladder) and the proportion in which the cervical length was measured incorrectly. Failure for a given patient was defined as more than one unacceptable image, or more than two acceptable images with incorrect caliper placement or erroneous choice of the “shortest best” cervical length. Certification required satisfactory images and cervical length measurement from four or more patients. Results 327 sonographers submitted 4905 images. 271 sonographers (83%) were certified on the first, 41 (13%) on the second, and 2 (0.6%) on the third submission. 13 never achieved certification. Of 314 who passed, 196 submitted 15 acceptable images that were appropriately measured for all five women. There were 1277 deficient images: 493 were acceptable but incorrectly measured images from sonographers who passed certification because mis-measurement occurred no more than twice. Of 784 deficient images submitted by sonographers who failed the certification, 471 were rejected because of improper measurement (caliper placement and/or failure to identify the shortest best image), and 313 because of failure to obtain a satisfactory image (excessive compression, required landmarks not visible, incorrect image size, brief examination, and

  12. Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties

    SciTech Connect

    Kujawski, Edouard

    2003-02-01

    The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
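
    The central caution, that different aggregation models can rank the same alternatives differently on identical data, is easy to reproduce. The two alternatives, scores, and weights below are invented; weighted-sum and weighted-product models are used as representative MCDA aggregations.

      import numpy as np

      scores = {"A": np.array([0.90, 0.10]),   # strong on criterion 1, weak on 2
                "B": np.array([0.45, 0.45])}   # balanced
      w = np.array([0.5, 0.5])

      for name, s in scores.items():
          additive = np.dot(w, s)                  # weighted sum
          multiplicative = np.prod(s ** w)         # weighted product
          print(f"{name}: weighted sum={additive:.3f}, product={multiplicative:.3f}")
      # The weighted sum ranks A first (0.500 vs 0.450), while the weighted
      # product ranks B first (0.450 vs 0.300), on the same data and weights.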

  13. Facial motion parameter estimation and error criteria in model-based image coding

    NASA Astrophysics Data System (ADS)

    Liu, Yunhai; Yu, Lu; Yao, Qingdong

    2000-04-01

    Model-based image coding has been given extensive attention due to its high subjective image quality and low bit-rates. But the estimation of object motion parameters is still a difficult problem, and there are no proper error criteria for quality assessment that are consistent with visual properties. This paper presents an algorithm for facial motion parameter estimation based on feature point correspondence and gives motion parameter error criteria. The facial motion model comprises three parts. The first part is the global 3-D rigid motion of the head, the second part is non-rigid translation motion in the jaw area, and the third part consists of local non-rigid expression motion in the eye and mouth areas. The feature points are automatically selected by a function of edges, brightness, and end-nodes outside the blocks of the eyes and mouth. The number of feature points is adjusted adaptively. The jaw translation motion is tracked by the changes of the feature point positions of the jaw. The areas of non-rigid expression motion can be rebuilt by using a block-pasting method. An approach to estimating motion parameter error based on the quality of the reconstructed image is suggested, and an area error function and an error function of contour transition-turn rate are used as quality criteria. The criteria properly reflect the geometric image distortion caused by errors in the estimated motion parameters.

  14. Safety analysis, risk assessment, and risk acceptance criteria

    SciTech Connect

    Jamali, K.; Stack, D.W.; Sullivan, L.H.; Sanzo, D.L.

    1997-08-01

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe.' Use of RACs requires quantitative estimates of consequence frequency and magnitude.

  15. The evaluation and implementation of match criteria for forensic analysis of DNA.

    PubMed

    Laber, T L; Iverson, J T; Liberty, J A; Giese, S A

    1995-11-01

    This study describes a method for establishing match criteria used in forensic DNA typing. The validity of applying different match criteria based upon the molecular weight of a DNA band is discussed. The match criteria presented allow visually matching DNA patterns to be confirmed by computer-assisted image analysis over the entire range of the sizing ladder. Approximately 5000 intragel and 5000 intergel comparisons were made between the restriction fragment length polymorphism (RFLP) DNA band sizes obtained from casework, mock cases, and environmentally insulted samples and the band sizes obtained from their corresponding bloodstain standards (controls). Analyses of these data suggested that fragments located in different molecular weight size regions of an analytical gel required different match criteria for assessing a visual match. The results of these analyses support the use of the following match criteria: intragel 0.5-10 kb = ±1.7%, 10-15 kb = ±3.2%, 15-22.6 kb = ±5.8%; intergel and blind control 0.5-10 kb = ±3.0%, 10-15 kb = ±4.2%, 15-22.6 kb = ±10.0%; and human cell-line K562 and the monomorphic locus D7Z2 = ±2.5%. Each match criterion was also evaluated with respect to the distance in millimeters between matching bands throughout the 0.5-22.6 kb molecular weight size range. Applying these match criteria to different gel regions has been shown to be valid and reliable in comparisons conducted on more than 10,000 validation samples, in over 500 forensic cases, and in more than 200 searches of a criminal sexual offender (CSO) database containing over 5000 individuals. PMID:8522913
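
    Applying such size-region-dependent windows is mechanical, as the sketch below shows for the intragel percentages quoted above. Taking the stricter of the two bands' windows and comparing against the mean band size are assumptions of this illustration, not details from the study.

      def intragel_window(kb):
          if 0.5 <= kb <= 10: return 0.017
          if 10 < kb <= 15:   return 0.032
          if 15 < kb <= 22.6: return 0.058
          raise ValueError("band size outside 0.5-22.6 kb range")

      def bands_match(kb1, kb2):
          tol = min(intragel_window(kb1), intragel_window(kb2))
          return abs(kb1 - kb2) <= tol * ((kb1 + kb2) / 2.0)

      print(bands_match(4.10, 4.15))   # True: within +/-1.7% of ~4.1 kb
      print(bands_match(4.10, 4.30))   # False: ~4.8% apart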

  16. Picosecond Imaging Circuit Analysis

    NASA Astrophysics Data System (ADS)

    Kash, Jeffrey A.

    1998-03-01

    With ever-increasing complexity, probing the internal operation of a silicon IC becomes more challenging. Present methods of internal probing are becoming obsolete. We have discovered that a very weak picosecond pulse of light is emitted by each FET in a CMOS circuit whenever the circuit changes logic state. This pulsed emission can be simultaneously imaged and time resolved, using a technique we have named Picosecond Imaging Circuit Analysis (PICA). With a suitable imaging detector, PICA allows time resolved measurement on thousands of devices simultaneously. Computer videos made from measurements on real IC's will be shown. These videos, along with a more quantitative evaluation of the light emission, permit the complete operation of an IC to be measured in a non-invasive way with picosecond time resolution.

  17. Finite element mesh refinement criteria for stress analysis

    NASA Technical Reports Server (NTRS)

    Kittur, Madan G.; Huston, Ronald L.

    1990-01-01

    This paper discusses procedures for finite-element mesh selection and refinement. The objective is to improve accuracy. The procedures are based on (1) the minimization of the stiffness matrix trace (optimizing node location); (2) the use of h-version refinement (rezoning, element size reduction, and increasing the number of elements); and (3) the use of p-version refinement (increasing the order of polynomial approximation of the elements). A step-by-step procedure of mesh selection, improvement, and refinement is presented. The criteria for 'goodness' of a mesh are based on strain energy, displacement, and stress values at selected critical points of a structure. An analysis of an aircraft lug problem is presented as an example.

  18. Image analysis library software development

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Bryant, J.

    1977-01-01

    The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.

  19. Reliability and Diagnostic Performance of CT Imaging Criteria in the Diagnosis of Tuberculous Meningitis

    PubMed Central

    Botha, Hugo; Ackerman, Christelle; Candy, Sally; Carr, Jonathan A.; Griffith-Richards, Stephanie; Bateman, Kathleen J.

    2012-01-01

    Introduction Abnormalities on CT imaging may contribute to the diagnosis of tuberculous meningitis (TBM). Recently, an expert consensus case definition (CCD) and set of imaging criteria for diagnosing basal meningeal enhancement (BME) have been proposed. This study aimed to evaluate the sensitivity, specificity and reliability of these in a prospective cohort of adult meningitis patients. Methods Initial diagnoses were based on the CCD, classifying patients into: ‘Definite TBM’ (microbiological confirmation), ‘Probable TBM’ (diagnostic score ≥10), ‘Possible TBM’ (diagnostic score 6–9), ‘Not TBM’ (confirmation of an alternative diagnosis) or ‘Uncertain’ (diagnostic score of <6). CT images were evaluated independently on two occasions by four experienced reviewers. Intra-rater and inter-rater agreement were calculated using the kappa statistic. Sensitivities and specificities were calculated using both ‘Definite TBM’ and either ‘Definite TBM’ or ‘Probable TBM’ as gold standards. Results CT scan criteria for BME had good intra-rater agreement (κ range 0.35–0.78) and fair to moderate inter-rater agreement (κ range 0.20–0.52). Intra- and inter-rater agreement on the CCD components were good to fair (κ ranges 0.47–0.81 and 0.21–0.63). Using ‘Definite TBM’ as a gold standard, the criteria for BME were very specific (61.5%–100%), but insensitive (5.9%–29.4%). Similarly, the imaging components of the CCD were highly specific (69.2–100%) but lacked sensitivity (0–56.7%). Similar values were found when using ‘Definite TBM’ or ‘Probable TBM’ as a gold standard. Discussion The fair to moderate inter-rater agreement and poor sensitivities of the criteria for BME suggest that little reliance should be placed in these features in isolation. While the presence of the CCD criteria of acute infarction or tuberculoma(s) appears useful as rule-in criteria, their absence is of little help in excluding TBM. The
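
    The kappa statistic behind the agreement figures is simple to compute from paired ratings, as in the sketch below; the two raters' binary calls (BME present/absent) are invented for illustration.

      import numpy as np

      def cohens_kappa(r1, r2):
          r1, r2 = np.asarray(r1), np.asarray(r2)
          po = np.mean(r1 == r2)                        # observed agreement
          pe = sum(np.mean(r1 == c) * np.mean(r2 == c)  # chance agreement
                   for c in np.union1d(r1, r2))
          return (po - pe) / (1.0 - pe)

      rater1 = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
      rater2 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
      print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")   # 0.40 for these calls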

  20. Medical Image Analysis Facility

    NASA Technical Reports Server (NTRS)

    1978-01-01

    To improve the quality of photos sent to Earth by unmanned spacecraft, NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical Image Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is study of the effects of stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical Image Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to diagnosis and treatment of human coronary disease. Other uses of the facility's image processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.

  1. Digital Image Analysis of Cereals

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Image analysis is the extraction of meaningful information from images, mainly digital images by means of digital processing techniques. The field was established in the 1950s and coincides with the advent of computer technology, as image analysis is profoundly reliant on computer processing. As t...

  2. About contradictions among different criteria for evaluation of image (interference) elements width changes

    NASA Astrophysics Data System (ADS)

    Wolczak, Bohdan K.

    1991-08-01

    It is shown that, under a certain change of the statistical properties of optical signals (images), there are cases in which an image element (interference line) narrows in the width defined by containing 99 percent of total signal power while the same element simultaneously widens in the width defined by containing 90 percent of total signal power. Such and opposite situations are called contradictions (antinomies) of energetic criteria for the evaluation of changes in image elements (spectral bands or interference lines). Many examples of apparent contradictions (antinomies) of energetic criteria for evaluating image element width changes are presented for seven examined, physically obtainable, low-band power-concentration radiant spectra. The range of usefulness of four other often-used criteria for evaluating line width changes was examined. Six coefficients of band width are given, followed by numerous other cases of wrong evaluations of image 'sharpening' by optical or optoelectronic systems.

  3. Analysis of eligibility criteria from ClinicalTrials.gov.

    PubMed

    Doods, Justin; Dugas, Martin; Fritz, Fleur

    2014-01-01

    Electronic health care records are being used more and more for patient documentation. These electronic data can be used for secondary purposes, for example through systems that support clinical research. Eligibility criteria have to be machine-processable for such systems to work, but criteria published on ClinicalTrials.gov have been shown to be complex, making them challenging to re-use. We analysed the eligibility criteria on ClinicalTrials.gov using automatic methods to determine whether the criteria definition and number changed over time. From 1998 to 2012 the average number of words used to describe eligibility criteria per year increased by 46%, while the average number of lines used per year increased only slightly until 2000 and stabilized afterwards. Whether the increase in words resulted in increased criteria complexity, or whether more data elements are used to describe eligibility, needs further investigation. PMID:25160308
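
    As a rough sketch of this kind of trend analysis (not the authors' pipeline; the `(year, text)` record structure is an assumption), the average word count per eligibility section per year could be computed as:

```python
import re
from collections import defaultdict

def avg_words_per_year(records):
    """records: iterable of (year, eligibility_text) pairs parsed
    from ClinicalTrials.gov; returns {year: mean word count}."""
    counts = defaultdict(list)
    for year, text in records:
        counts[year].append(len(re.findall(r"\S+", text)))
    return {y: sum(c) / len(c) for y, c in sorted(counts.items())}
```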

  4. Radar image analysis utilizing junctive image metamorphosis

    NASA Astrophysics Data System (ADS)

    Krueger, Peter G.; Gouge, Sally B.; Gouge, Jim O.

    1998-09-01

    A feasibility study was initiated to investigate the ability of algorithms developed for medical sonogram image analysis to be trained for extraction of cartographic information from synthetic aperture radar imagery. BioComputer Research Inc. has applied proprietary 'junctive image metamorphosis' algorithms to cancer cell recognition and identification in ultrasound prostate images. These algorithms have been shown to support automatic radar image feature detection and identification. Training-set images were used to develop determinants for representative point, line and area features, which were then used on test images to identify and localize the features of interest. The software is computationally conservative, operating on a PC platform in real time. The algorithms are robust, having applicability to be trained for feature recognition on any digital imagery, not just imagery formed from reflected energy, such as sonograms and radar images. Applications include land mass characterization, feature identification, target recognition, and change detection.

  5. Criteria for High Quality Biology Teaching: An Analysis

    ERIC Educational Resources Information Center

    Tasci, Guntay

    2015-01-01

    This study aims to analyze the process under which biology lessons are taught in terms of teaching quality criteria (TQC). Teaching quality is defined as the properties of efficient teaching and is considered to be the criteria used to measure teaching quality both in general and specific to a field. The data were collected through classroom…

  6. Statistical image analysis of longitudinal RAVENS images

    PubMed Central

    Lee, Seonjoo; Zipunnikov, Vadim; Reich, Daniel S.; Pham, Dzung L.

    2015-01-01

    Regional analysis of volumes examined in normalized space (RAVENS) images are transformation images used in the study of brain morphometry. In this paper, RAVENS images are analyzed using a longitudinal variant of voxel-based morphometry (VBM) and longitudinal functional principal component analysis (LFPCA) for high-dimensional images. We demonstrate that the latter overcomes the limitations of standard longitudinal VBM analyses, which do not separate registration errors from other longitudinal changes and baseline patterns. This is especially important in contexts where longitudinal changes are only a small fraction of the overall observed variability, which is typical in normal aging and many chronic diseases. Our simulation study shows that LFPCA effectively separates registration error from baseline and longitudinal signals of interest by decomposing RAVENS images measured at multiple visits into three components: a subject-specific imaging random intercept that quantifies the cross-sectional variability, a subject-specific imaging slope that quantifies the irreversible changes over multiple visits, and a subject-visit-specific imaging deviation. We describe strategies to identify baseline/longitudinal variation and registration errors combined with covariates of interest. Our analysis suggests that specific regional brain atrophy and ventricular enlargement are associated with multiple sclerosis (MS) disease progression. PMID:26539071
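
    The three-component decomposition described above can be written compactly; the following is a sketch in the spirit of the general LFPCA model (notation assumed, not necessarily this paper's exact formulation):

```latex
% Y_{ij}(v): RAVENS value at voxel v for subject i at visit j, time t_{ij}
Y_{ij}(v) = \eta(v) + X_i(v) + t_{ij}\, Z_i(v) + W_{ij}(v)
```

    Here \eta is the fixed mean image, X_i the subject-specific random intercept (cross-sectional variability), Z_i the subject-specific slope (irreversible longitudinal change), and W_ij the subject-visit-specific deviation that absorbs registration error.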

  7. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 2 2014-07-01 2014-07-01 false What hazards analysis criteria must my SEMS... and Environmental Management Systems (SEMS) § 250.1911 What hazards analysis criteria must my SEMS program meet? You must ensure that a hazards analysis (facility level) and a JSA (operations/task...

  8. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 2 2013-07-01 2013-07-01 false What hazards analysis criteria must my SEMS... and Environmental Management Systems (SEMS) § 250.1911 What hazards analysis criteria must my SEMS program meet? You must ensure that a hazards analysis (facility level) and a JSA (operations/task...

  9. Engineering design criteria for an image intensifier/image converter camera

    NASA Technical Reports Server (NTRS)

    Sharpsteen, J. T.; Lund, D. L.; Stoap, L. J.; Solheim, C. D.

    1976-01-01

    The design, display, and evaluation of an image intensifier/image converter camera that can be utilized in various space shuttle experiments are described. An image intensifier tube, used in combination with two brassboards serving as a power supply, was evaluated for night photography in the field. Pictures were obtained showing field details that would have been indistinguishable to the naked eye or to an ordinary camera.

  10. Comparative analysis of current diagnostic criteria for gestational diabetes mellitus

    PubMed Central

    Boyadzhieva, Mariya V; Atanasova, Iliana; Zacharieva, Sabina; Tankova, Tsvetalina; Dimitrova, Violeta

    2012-01-01

    Background To compare current guidelines for diagnosis of gestational diabetes mellitus (GDM) and to identify the ones that are the most relevant for application in the pregnant Bulgarian population. Methods A total of 800 pregnant women at high risk for GDM underwent a 75 g oral glucose tolerance test between 24 and 28 weeks of gestation as antenatal screening. The results were interpreted and classified according to the guidelines of the International Association of Diabetes and Pregnancy Study Groups (IADPSG), American Diabetes Association (ADA), Australasian Diabetes in Pregnancy Society, Canadian Diabetes Association, European Association for the Study of Diabetes, New Zealand Society for the Study of Diabetes and World Health Organization. Results The application of different diagnostic criteria resulted in prevalences of GDM between 10.8% and 31.6%. Comparing any two sets of criteria, the proportion of women classified differently varied between 0.1% and 21.1% (P < 0.001). The IADPSG criteria were the most inclusive criteria and resulted in the highest prevalence of GDM. There was a significant difference in the major metabolic parameters between the GDM and control groups, regardless of which of the diagnostic criteria was applied. GDM diagnosed according to all criteria resulted in an increased proportion of delivery by caesarean section (CS). However, only the ADA and IADPSG criteria identified both increased macrosomia (odds ratios, 2.36 and 2.29) and increased CS rate. Conclusion The need for GDM screening is indisputable. In our view, the new IADPSG guidelines offer a unique opportunity for a unified national and global approach to GDM.

  11. Reflections on ultrasound image analysis.

    PubMed

    Alison Noble, J

    2016-10-01

    Ultrasound (US) image analysis has advanced considerably in twenty years. Progress in ultrasound image analysis has always been fundamental to the advancement of image-guided interventions research due to the real-time acquisition capability of ultrasound and this has remained true over the two decades. But in quantitative ultrasound image analysis - which takes US images and turns them into more meaningful clinical information - thinking has perhaps more fundamentally changed. From roots as a poor cousin to Computed Tomography (CT) and Magnetic Resonance (MR) image analysis, both of which have richer anatomical definition and thus were better suited to the earlier eras of medical image analysis which were dominated by model-based methods, ultrasound image analysis has now entered an exciting new era, assisted by advances in machine learning and the growing clinical and commercial interest in employing low-cost portable ultrasound devices outside traditional hospital-based clinical settings. This short article provides a perspective on this change, and highlights some challenges ahead and potential opportunities in ultrasound image analysis which may both have high impact on healthcare delivery worldwide in the future but may also, perhaps, take the subject further away from CT and MR image analysis research with time. PMID:27503078

  12. Analysis of proposed criteria for human response to vibration

    NASA Technical Reports Server (NTRS)

    Janeway, R. N.

    1975-01-01

    The development of criteria for human vibration response is reviewed, including the evolution of ISO standard 2631. The document is analyzed to show why its application to vehicle ride evaluation is strongly opposed. Alternative vertical and horizontal limits for comfort are recommended in the ground vehicle ride frequency range above 1 Hz. These values are derived by correlating the absorbed-power findings of Pradko and Lee with other established criteria. Special emphasis is placed on working limits in the frequency range of 1 to 10 Hz, since this is the most significant area in ground vehicle ride evaluation.

  13. Paediatric Multiple Sclerosis: Update on Diagnostic Criteria, Imaging, Histopathology and Treatment Choices.

    PubMed

    Chou, I-Jun; Wang, Huei-Shyong; Whitehouse, William P; Constantinescu, Cris S

    2016-07-01

    Paediatric multiple sclerosis (MS) represents less than 5% of the MS population, but patients with paediatric-onset disease reach permanent disability at a younger age than adult-onset patients. Accurate diagnosis at presentation and optimal long-term treatment are vital to mitigate ongoing neuroinflammation and irreversible neurodegeneration. However, it may be difficult to differentiate paediatric MS early from acute disseminated encephalomyelitis (ADEM) and neuromyelitis optica spectrum disorders (NMOSD), as these often have atypical presentations that differ from that of adult-onset MS. The purpose of this review is to summarize the updated views on diagnostic criteria, imaging, histopathology and treatment choices. PMID:27271748

  14. Image quality criteria for wide-field x-ray imaging applications

    NASA Astrophysics Data System (ADS)

    Thompson, Patrick L.; Harvey, James E.

    1999-10-01

    For staring, wide-field applications, such as a solar x-ray imager, the severe off-axis aberrations of the classical Wolter Type-I grazing incidence x-ray telescope design drastically limit the 'resolution' near the solar limb. A specification on on-axis fractional encircled energy is thus not an appropriate image quality criterion for such wide-angle applications. A more meaningful image quality criterion would be a field-weighted-average measure of 'resolution.' Since surface scattering effects from residual optical fabrication errors are always substantial at these very short wavelengths, the field-weighted-average half-power radius is a far more appropriate measure of aerial resolution. If an ideal mosaic detector array is being used in the focal plane, the finite pixel size provides a practical limit to this system performance. Thus, the total number of aerial resolution elements enclosed by the operational field-of-view, expressed as a percentage of the number of ideal detector pixels, is a further improved image quality criterion. In this paper we describe the development of an image quality criterion for wide-field applications of grazing incidence x-ray telescopes which leads to a new class of grazing incidence designs described in a following companion paper.

  15. Air Pollution Monitoring Site Selection by Multiple Criteria Decision Analysis

    EPA Science Inventory

    Criteria air pollutants (particulate matter, sulfur dioxide, oxides of nitrogen, volatile organic compounds, and carbon monoxide) as well as toxic air pollutants are a global concern. A particular scenario that is receiving increased attention in the research is the exposure to t...

  16. Spotlight-8 Image Analysis Software

    NASA Technical Reports Server (NTRS)

    Klimek, Robert; Wright, Ted

    2006-01-01

    Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.

  17. Oncological image analysis: medical and molecular image analysis

    NASA Astrophysics Data System (ADS)

    Brady, Michael

    2007-03-01

    This paper summarises the work we have been doing on joint projects with GE Healthcare on colorectal and liver cancer, and with Siemens Molecular Imaging on dynamic PET. First, we recall the salient facts about cancer and oncological image analysis. Then we introduce some of the work that we have done on analysing clinical MRI images of colorectal and liver cancer, specifically the detection of lymph nodes and segmentation of the circumferential resection margin. In the second part of the paper, we shift attention to the complementary aspect of molecular image analysis, illustrating our approach with some recent work on: tumour acidosis, tumour hypoxia, and multiply drug resistant tumours.

  18. Mapping tropical dry forest succession using multiple criteria spectral mixture analysis

    NASA Astrophysics Data System (ADS)

    Cao, Sen; Yu, Qiuyan; Sanchez-Azofeifa, Arturo; Feng, Jilu; Rivard, Benoit; Gu, Zhujun

    2015-11-01

    Tropical dry forests (TDFs) in the Americas are considered the first frontier of economic development with less than 1% of their total original coverage under protection. Accordingly, accurate estimates of their spatial extent, fragmentation, and degree of regeneration are critical in evaluating the success of current conservation policies. This study focused on a well-protected secondary TDF in Santa Rosa National Park (SRNP) Environmental Monitoring Super Site, Guanacaste, Costa Rica. We used spectral signature analysis of TDF ecosystem succession (early, intermediate, and late successional stages), and its intrinsic variability, to propose a new multiple criteria spectral mixture analysis (MCSMA) method on the shortwave infrared (SWIR) bands of a HyMap image. Unlike most existing iterative mixture analysis (IMA) techniques, MCSMA tries to extract and make use of representative endmembers with spectral and spatial information. MCSMA then considers three criteria that influence the comparative importance of different endmember combinations (endmember models): root mean square error (RMSE); spatial distance (SD); and fraction consistency (FC), to create an evaluation framework to select a best-fit model. The spectral analysis demonstrated that TDFs have a high spectral variability as a result of biomass variability. By adopting two search strategies, the unmixing results showed that our new MCSMA approach had a better performance in root mean square error (early: 0.160/0.159; intermediate: 0.322/0.321; and late: 0.239/0.235); mean absolute error (early: 0.132/0.128; intermediate: 0.254/0.251; and late: 0.191/0.188); and systematic error (early: 0.045/0.055; intermediate: -0.211/-0.214; and late: 0.161/0.160), compared to the multiple endmember spectral mixture analysis (MESMA). This study highlights the importance of SWIR in differentiating successional stages in TDFs. The proposed MCSMA provides a more flexible and generalized means for best-fit model determination.
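
    The RMSE criterion at the heart of MCSMA scores how well a candidate endmember model reconstructs a pixel under linear mixing. A minimal NumPy sketch of that scoring step (unconstrained least squares; production MESMA/MCSMA code adds sum-to-one and non-negativity constraints, and the SD and FC criteria are omitted here):

```python
import numpy as np

def unmix_rmse(pixel, endmembers):
    """Score one endmember model for one pixel.

    pixel: (bands,) reflectance spectrum.
    endmembers: (m, bands) spectra of one candidate model.
    Returns (fractions, rmse)."""
    A = endmembers.T                               # (bands, m)
    frac, *_ = np.linalg.lstsq(A, pixel, rcond=None)
    resid = pixel - A @ frac
    return frac, np.sqrt(np.mean(resid ** 2))
```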

  19. 75 FR 69140 - NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-10

    ... COMMISSION NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the...- Hydraulic Analysis to Support Specific Success Criteria in the Standardized Plant Analysis Risk Models...-Hydraulic Analysis to Support Specific Success Criteria in the Standardized Plant Analysis Risk...

  20. Image-guided tumor ablation: standardization of terminology and reporting criteria--a 10-year update.

    PubMed

    Ahmed, Muneeb; Solbiati, Luigi; Brace, Christopher L; Breen, David J; Callstrom, Matthew R; Charboneau, J William; Chen, Min-Hua; Choi, Byung Ihn; de Baère, Thierry; Dodd, Gerald D; Dupuy, Damian E; Gervais, Debra A; Gianfelice, David; Gillams, Alice R; Lee, Fred T; Leen, Edward; Lencioni, Riccardo; Littrup, Peter J; Livraghi, Tito; Lu, David S; McGahan, John P; Meloni, Maria Franca; Nikolic, Boris; Pereira, Philippe L; Liang, Ping; Rhim, Hyunchul; Rose, Steven C; Salem, Riad; Sofocleous, Constantinos T; Solomon, Stephen B; Soulen, Michael C; Tanaka, Masatoshi; Vogl, Thomas J; Wood, Bradford J; Goldberg, S Nahum

    2014-10-01

    Image-guided tumor ablation has become a well-established hallmark of local cancer therapy. The breadth of options available in this growing field increases the need for standardization of terminology and reporting criteria to facilitate effective communication of ideas and appropriate comparison among treatments that use different technologies, such as chemical (eg, ethanol or acetic acid) ablation, thermal therapies (eg, radiofrequency, laser, microwave, focused ultrasound, and cryoablation) and newer ablative modalities such as irreversible electroporation. This updated consensus document provides a framework that will facilitate the clearest communication among investigators regarding ablative technologies. An appropriate vehicle is proposed for reporting the various aspects of image-guided ablation therapy including classification of therapies, procedure terms, descriptors of imaging guidance, and terminology for imaging and pathologic findings. Methods are addressed for standardizing reporting of technique, follow-up, complications, and clinical results. As noted in the original document from 2003, adherence to the recommendations will improve the precision of communications in this field, leading to more accurate comparison of technologies and results, and ultimately to improved patient outcomes. Online supplemental material is available for this article. PMID:24927329

  1. Image-guided tumor ablation: standardization of terminology and reporting criteria--a 10-year update.

    PubMed

    Ahmed, Muneeb; Solbiati, Luigi; Brace, Christopher L; Breen, David J; Callstrom, Matthew R; Charboneau, J William; Chen, Min-Hua; Choi, Byung Ihn; de Baère, Thierry; Dodd, Gerald D; Dupuy, Damian E; Gervais, Debra A; Gianfelice, David; Gillams, Alice R; Lee, Fred T; Leen, Edward; Lencioni, Riccardo; Littrup, Peter J; Livraghi, Tito; Lu, David S; McGahan, John P; Meloni, Maria Franca; Nikolic, Boris; Pereira, Philippe L; Liang, Ping; Rhim, Hyunchul; Rose, Steven C; Salem, Riad; Sofocleous, Constantinos T; Solomon, Stephen B; Soulen, Michael C; Tanaka, Masatoshi; Vogl, Thomas J; Wood, Bradford J; Goldberg, S Nahum

    2014-11-01

    Image-guided tumor ablation has become a well-established hallmark of local cancer therapy. The breadth of options available in this growing field increases the need for standardization of terminology and reporting criteria to facilitate effective communication of ideas and appropriate comparison among treatments that use different technologies, such as chemical (eg, ethanol or acetic acid) ablation, thermal therapies (eg, radiofrequency, laser, microwave, focused ultrasound, and cryoablation) and newer ablative modalities such as irreversible electroporation. This updated consensus document provides a framework that will facilitate the clearest communication among investigators regarding ablative technologies. An appropriate vehicle is proposed for reporting the various aspects of image-guided ablation therapy including classification of therapies, procedure terms, descriptors of imaging guidance, and terminology for imaging and pathologic findings. Methods are addressed for standardizing reporting of technique, follow-up, complications, and clinical results. As noted in the original document from 2003, adherence to the recommendations will improve the precision of communications in this field, leading to more accurate comparison of technologies and results, and ultimately to improved patient outcomes. PMID:25442132

  2. Hyperspectral image analysis. A tutorial.

    PubMed

    Amigo, José Manuel; Babamoradi, Hamid; Elcoroaristizabal, Saioa

    2015-10-01

    This tutorial aims at providing guidelines and practical tools to assist with the analysis of hyperspectral images. Topics like hyperspectral image acquisition, image pre-processing, multivariate exploratory analysis, hyperspectral image resolution, classification and final digital image processing are presented, and some guidelines are given and discussed. Due to the broad character of current applications and the vast number of multivariate methods available, this paper focuses on an industrial chemical framework to explain, in a step-wise manner, how to develop a classification methodology to differentiate between several types of plastics by using near-infrared hyperspectral imaging and Partial Least Squares - Discriminant Analysis (PLS-DA). Thus, the reader is guided through every single step and oriented in order to adapt those strategies to the user's case. PMID:26481986
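
    A compact sketch of the PLS-DA step the tutorial builds up to, using scikit-learn (array names, shapes and the one-hot label encoding are assumptions; the tutorial itself is software-agnostic):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# X: (n_pixels, n_bands) NIR spectra unfolded from a hyperspectral cube
# Y: (n_pixels, n_classes) one-hot plastic-type labels
def train_plsda(X, Y, n_components=10):
    pls = PLSRegression(n_components=n_components)
    pls.fit(X, Y)
    return pls

def predict_classes(pls, X):
    scores = pls.predict(X)           # continuous per-class scores
    return np.argmax(scores, axis=1)  # hard class per pixel
```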

  3. A Comparison of Alternatives to Conducting Monte Carlo Analyses for Determining Parallel Analysis Criteria.

    ERIC Educational Resources Information Center

    Lautenschlager, Gary J.

    1989-01-01

    Procedures for implementing parallel analysis (PA) criteria in practice were compared, examining regression equation methods that can be used to estimate random data eigenvalues from known values of the sample size and number of variables. More internally accurate methods for determining PA criteria are presented. (SLD)

  4. Histopathological Image Analysis: A Review

    PubMed Central

    Gurcan, Metin N.; Boucheron, Laura; Can, Ali; Madabhushi, Anant; Rajpoot, Nasir; Yener, Bulent

    2010-01-01

    Over the past decade, dramatic increases in computational power and improvements in image analysis algorithms have allowed the development of powerful computer-assisted analytical approaches to radiological data. With the recent advent of whole-slide digital scanners, tissue histopathology slides can now be digitized and stored in digital image form. Consequently, digitized tissue histopathology has now become amenable to the application of computerized image analysis and machine learning techniques. Analogous to the role of computer-assisted diagnosis (CAD) algorithms in medical imaging to complement the opinion of a radiologist, CAD algorithms have begun to be developed for disease detection, diagnosis, and prognosis prediction to complement the opinion of the pathologist. In this paper, we review the recent state-of-the-art CAD technology for digitized histopathology. This paper also briefly describes the development and application of novel image analysis technology for a few specific histopathology-related problems being pursued in the United States and Europe. PMID:20671804

  5. Flightspeed Integral Image Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image. This integral frame facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order-of-magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
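
    The "integral image" structure the toolkit relies on is easy to state in code. A minimal NumPy sketch (illustrative only; FIIAT itself is C with integer arithmetic and static allocation):

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero border: ii[r, c] = sum(img[:r, :c])."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] in O(1) via four table lookups."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]
```

    Once the table is built, any rectangular sum costs four lookups, which is what makes texture and subwindow descriptors fast.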

  6. Analysis of an interferometric Stokes imaging polarimeter

    NASA Astrophysics Data System (ADS)

    Murali, Sukumar

    Estimation of Stokes vector components from an interferometric fringe-encoded image is a novel way of measuring the state of polarization (SOP) distribution across a scene. Imaging polarimeters employing interferometric techniques encode SOP information across a scene in a single image in the form of intensity fringes. The lack of moving parts and the use of a single image eliminate the problems of conventional polarimetry - vibration, spurious signal generation due to artifacts, beam wander, and the need for registration routines. However, interferometric polarimeters are limited by narrow-bandpass and short-exposure-time operation, which decreases the signal-to-noise ratio (SNR), defined as the ratio of the mean photon count to the standard deviation in the detected image. A simulation environment for designing an interferometric Stokes imaging polarimeter (ISIP) and a detector with noise effects is created and presented. Users of this environment are capable of imaging an object with defined SOP through an ISIP onto a detector producing a digitized image output. The simulation also includes bandpass imaging capabilities, control of detector noise, and object brightness levels. The Stokes images are estimated from a fringe-encoded image of a scene by means of a reconstructor algorithm. A spatial-domain methodology involving the idea of a unit cell and slide approach is applied to the reconstructor model developed using Mueller calculus. The validation of this methodology and its effectiveness compared to a discrete approach is demonstrated with suitable examples. The pixel size required to sample the fringes and the minimum unit cell size required for reconstruction are investigated using condition numbers. The importance of the PSF of the fore-optics (telescope) used in imaging the object is investigated and analyzed using a point-source imaging example, and a Nyquist criterion is presented. Reconstruction of fringe modulated images in the presence of noise involves choosing an

  7. Analysis and performance of various classification criteria sets in a Colombian cohort of patients with spondyloarthritis.

    PubMed

    Bautista-Molano, Wilson; Landewé, Robert B M; Londoño, John; Romero-Sanchez, Consuelo; Valle-Oñate, Rafael; van der Heijde, Désirée

    2016-07-01

    The objective of this study was to investigate the performance of classification criteria sets (Assessment of SpondyloArthritis international Society (ASAS), European Spondylarthropathy Study Group (ESSG), and Amor) for spondyloarthritis (SpA) in a clinical practice cohort in Colombia and provide insight into how rheumatologists follow the diagnostic path in patients suspected of SpA. Patients with a rheumatologist's diagnosis of SpA were retrospectively classified according to three criteria sets. Classification rate was defined as the proportion of patients fulfilling a particular criterion. Characteristics of patients fulfilling and not fulfilling each criterion were compared. The ASAS criteria classified 81% of all patients (n = 581) as having either axial SpA (44%) or peripheral SpA (37%), whereas a lower proportion met the ESSG criteria (74%) and the Amor criteria (53%). There was a high degree of overlap among the different criteria, and 42% of the patients met all three criteria. Patients fulfilling all three criteria sets were older (36 vs. 30 years), had more SpA features (3 vs. 1), and more frequently had a current or past history of back pain (77 vs. 43%), inflammatory back pain (47 vs. 13%), enthesitis (67 vs. 26%), and buttock pain (37 vs. 13%) than those not fulfilling any criteria. HLA-B27 testing, radiographs, and MRI-SI were performed in 77%, 59%, and 24% of the patients, respectively. The ASAS criteria classified more patients as having SpA in this Colombian cohort when the rheumatologist's diagnosis was used as the external standard. Although physicians do not perform HLA-B27 testing or imaging in all patients, they do require these tests if the clinical symptoms fall short of confirming SpA and suspicion remains. PMID:26791876

  8. Image Analysis in Surgical Pathology.

    PubMed

    Lloyd, Mark C; Monaco, James P; Bui, Marilyn M

    2016-06-01

    Digitization of glass slides of surgical pathology samples facilitates a number of value-added capabilities beyond what a pathologist could previously do with a microscope. Image analysis is one of the most fundamental opportunities to leverage the advantages that digital pathology provides. The ability to quantify aspects of a digital image is an extraordinary opportunity to collect data with exquisite accuracy and reliability. In this review, we describe the history of image analysis in pathology and the present state of technology processes as well as examples of research and clinical use. PMID:27241112

  9. Decerns: A framework for multi-criteria decision analysis

    DOE PAGESBeta

    Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; Sullivan, Terry

    2015-02-27

    A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical problems in risk management is introduced. The Decerns framework contains a library of modules that are the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on the analysis of a multicriteria location problem.

  10. Distributed multi-criteria model evaluation and spatial association analysis

    NASA Astrophysics Data System (ADS)

    Scherer, Laura; Pfister, Stephan

    2015-04-01

    Model performance, if evaluated at all, is often communicated by a single indicator and at an aggregated level; this, however, does not capture the trade-offs between different indicators or the inherent spatial heterogeneity of model efficiency. In this study, we simulated the water balance of the Mississippi watershed using the Soil and Water Assessment Tool (SWAT). The model was calibrated against monthly river discharge at 131 measurement stations. Its time series were bisected to allow for subsequent validation at the same gauges. Furthermore, the model was validated against evapotranspiration, which was available as a continuous raster based on remote sensing. The model performance was evaluated for each of the 451 sub-watersheds using four different criteria: 1) Nash-Sutcliffe efficiency (NSE), 2) percent bias (PBIAS), 3) root mean square error (RMSE) normalized to standard deviation (RSR), as well as 4) a combined indicator of the squared correlation coefficient and the linear regression slope (bR2). Conditions that might lead to poor model performance include aridity, a very flat or steep relief, snowfall and dams, as indicated by previous research. In an attempt to explain spatial differences in model efficiency, the goodness of the model was spatially compared to these four phenomena by means of a bivariate spatial association measure which combines Pearson's correlation coefficient and Moran's index for spatial autocorrelation. In order to assess the model performance of the Mississippi watershed as a whole, three different averages of the sub-watershed results were computed by 1) applying equal weights, 2) weighting by the mean observed river discharge, 3) weighting by the upstream catchment area and the square root of the time series length. Ratings of model performance differed significantly in space and according to the efficiency criterion. The model performed much better in the humid Eastern region than in the arid Western region, which was confirmed by the
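
    Three of the four efficiency criteria are one-liners; minimal NumPy versions follow (one common formulation — sign conventions for PBIAS vary between papers, and the combined bR2 indicator is omitted):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, <0 is worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias of simulated vs. observed totals."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100 * np.sum(obs - sim) / np.sum(obs)

def rsr(obs, sim):
    """RMSE normalised by the standard deviation of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((obs - sim) ** 2)) / obs.std()
```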

  11. Image analysis for DNA sequencing

    NASA Astrophysics Data System (ADS)

    Palaniappan, Kannappan; Huang, Thomas S.

    1991-07-01

    There is a great deal of interest in automating the process of DNA (deoxyribonucleic acid) sequencing to support the analysis of genomic DNA, such as in the Human and Mouse Genome projects. In one class of gel-based sequencing protocols, autoradiograph images are generated in the final step and usually require manual interpretation to reconstruct the DNA sequence represented by the image. The need to handle a large volume of sequence information necessitates automation of the manual autoradiograph reading step through image analysis, in order to reduce the length of time required to obtain sequence data and reduce transcription errors. Various adaptive image enhancement, segmentation and alignment methods were applied to autoradiograph images. The methods are adaptive to the local characteristics of the image, such as noise, background signal, or presence of edges. Once the two-dimensional data is converted to a set of aligned one-dimensional profiles, waveform analysis is used to determine the location of each band, which represents one nucleotide in the sequence. Different classification strategies, including a rule-based approach, are investigated to map the profile signals, augmented with the original two-dimensional image data as necessary, to textual DNA sequence information.
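
    A minimal sketch of the band-location step (peak picking on a background-subtracted one-dimensional lane profile; the constants are hypothetical stand-ins for the paper's adaptive methods):

```python
import numpy as np
from scipy.signal import find_peaks

def band_positions(profile, min_prominence=0.05):
    """Return indices of bands (one nucleotide each) in a lane profile."""
    profile = np.asarray(profile, dtype=float)
    baseline = np.percentile(profile, 10)        # crude background estimate
    peaks, _ = find_peaks(profile - baseline,
                          prominence=min_prominence * np.ptp(profile))
    return peaks
```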

  12. Anmap: Image and data analysis

    NASA Astrophysics Data System (ADS)

    Alexander, Paul; Waldram, Elizabeth; Titterington, David; Rees, Nick

    2014-11-01

    Anmap analyses and processes images and spectral data. Originally written for use in radio astronomy, much of its functionality is applicable to other disciplines; additional algorithms and analysis procedures allow direct use in, for example, NMR imaging and spectroscopy. Anmap emphasizes the analysis of data to extract quantitative results for comparison with theoretical models and/or other experimental data. To achieve this, Anmap provides a wide range of tools for analysis, fitting and modelling (including standard image and data processing algorithms). It also provides a powerful environment for users to develop their own analysis/processing tools either by combining existing algorithms and facilities with the very powerful command (scripting) language or by writing new routines in FORTRAN that integrate seamlessly with the rest of Anmap.

  13. Image analysis and quantitative morphology.

    PubMed

    Mandarim-de-Lacerda, Carlos Alberto; Fernandes-Santos, Caroline; Aguila, Marcia Barbosa

    2010-01-01

    Quantitative studies are increasingly found in the literature, particularly in the fields of development/evolution, pathology, and neurosciences. Image digitalization converts tissue images into a numeric form by dividing them into very small regions termed picture elements or pixels. Image analysis allows automatic morphometry of digitalized images, and stereology aims to understand the structural inner three-dimensional arrangement based on the analysis of slices showing two-dimensional information. To quantify morphological structures in an unbiased and reproducible manner, appropriate isotropic and uniform random sampling of sections, and updated stereological tools are needed. Through the correct use of stereology, a quantitative study can be performed with little effort; efficiency in stereology means as little counting as possible (little work), low cost (section preparation), but still good accuracy. This short text provides a background guide for non-expert morphologists. PMID:19960334

  14. A Comparative Investigation of Rotation Criteria within Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Sass, Daniel A.; Schmitt, Thomas A.

    2010-01-01

    Exploratory factor analysis (EFA) is a commonly used statistical technique for examining the relationships between variables (e.g., items) and the factors (e.g., latent traits) they depict. There are several decisions that must be made when using EFA, with one of the more important being choice of the rotation criterion. This selection can be…

  15. Multispectral Imaging Broadens Cellular Analysis

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Amnis Corporation, a Seattle-based biotechnology company, developed ImageStream to produce sensitive fluorescence images of cells in flow. The company responded to an SBIR solicitation from Ames Research Center, and proposed to evaluate several methods of extending the depth of field for its ImageStream system and implement the best as an upgrade to its commercial products. This would allow users to view whole cells at the same time, rather than just one section of each cell. Through Phase I and II SBIR contracts, Ames provided Amnis the funding the company needed to develop this extended functionality. For NASA, the resulting high-speed image flow cytometry process made its way into Medusa, a life-detection instrument built to collect, store, and analyze sample organisms from erupting hydrothermal vents, and has the potential to benefit space flight health monitoring. On the commercial end, Amnis has implemented the process in ImageStream, combining high-resolution microscopy and flow cytometry in a single instrument, giving researchers the power to conduct quantitative analyses of individual cells and cell populations at the same time, in the same experiment. ImageStream is also built for many other applications, including cell signaling and pathway analysis; classification and characterization of peripheral blood mononuclear cell populations; quantitative morphology; apoptosis (cell death) assays; gene expression analysis; analysis of cell conjugates; molecular distribution; and receptor mapping and distribution.

  16. Multi-criteria analysis for PM10 planning

    NASA Astrophysics Data System (ADS)

    Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa

    To implement sound air quality policies, regulatory agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations, due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, an area often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
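
    The solution set described above is the Pareto front of the two objectives. A minimal sketch of non-dominated filtering for scenarios scored as (air-quality indicator, cost), both minimized (illustrative only, not the paper's solver):

```python
def nondominated(scenarios):
    """Keep scenarios not weakly dominated by any distinct scenario."""
    return [s for s in scenarios
            if not any(o[0] <= s[0] and o[1] <= s[1] and o != s
                       for o in scenarios)]

# e.g. nondominated([(3, 9), (4, 4), (7, 2), (8, 8)]) drops (8, 8)
```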

  17. A Study to Determine Through Content Analysis Selected Criteria for Open-End Examinations.

    ERIC Educational Resources Information Center

    McNally, Elaine F.

    Content analysis was used to determine the evaluation criteria of high school and college teachers and college seniors in grading essay tests. Content analysis is defined as a way of asking a fixed set of questions unfalteringly of all of a predetermined body of writings, in such a way as to produce quantitative results. Four responses to a…

  18. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for the LabVIEW Run-time engine. Operating systems under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
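
    For readers outside LabVIEW, the conversion-plus-statistics pipeline is straightforward to mirror; a minimal NumPy/SciPy sketch (standard luma weights assumed as the default conversion coefficients; this is not the HAWGC code):

```python
import numpy as np
from scipy.stats import skew, kurtosis

def histogram_stats(rgb, coeffs=(0.299, 0.587, 0.114), lo=0, hi=255):
    """rgb: (h, w, 3) array. Convert to greyscale, apply thresholds,
    and return the descriptors the program reports."""
    grey = np.asarray(rgb, dtype=float) @ np.asarray(coeffs)
    grey = grey[(grey >= lo) & (grey <= hi)]          # threshold window
    hist, _ = np.histogram(grey, bins=256, range=(0, 255))
    return {
        "mean": grey.mean(), "std": grey.std(), "var": grey.var(),
        "min": grey.min(), "max": grey.max(),
        "mode": int(np.argmax(hist)),                 # most frequent bin
        "median": np.median(grey),
        "skewness": skew(grey), "kurtosis": kurtosis(grey),
    }
```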

  19. Positron emission tomography response criteria in solid tumours criteria for quantitative analysis of [18F]-fluorodeoxyglucose positron emission tomography with integrated computed tomography for treatment response assessment in metastasised solid tumours: All that glitters is not gold.

    PubMed

    Willemsen, Annelieke E C A B; Vlenterie, Myrella; van Herpen, Carla M L; van Erp, Nielka P; van der Graaf, Winette T A; de Geus-Oei, Lioe-Fee; Oyen, Wim J G

    2016-03-01

    For solid tumours, quantitative analysis of [(18)F]-fluorodeoxyglucose positron emission tomography with integrated computed tomography can potentially have significant value in early response assessment, and thereby in discrimination between responders and non-responders at an early stage of treatment. Standardised strategies for this analysis have been proposed, and the positron emission tomography response criteria in solid tumours (PERCIST) can be regarded as the current standard for quantitative analysis in a research setting, yet they are not implemented in daily practice. However, several exceptions and limitations restrict the feasibility of the PERCIST criteria. In this article, we point out dilemmas that arise when applying proposed criteria like PERCIST to an expansive set of patients with metastasised solid tumours. Clinicians and scientists should be aware of these limitations to prevent methodological issues from impeding the successful introduction of research data into clinical practice. Therefore, to deliver on the high potential of quantitative imaging, consensus should be reached on a standardised, feasible and clinically useful analysis methodology. This methodology should be applicable in the majority of patients, tumour types and treatments. PMID:26808297

  20. Performance analysis for geometrical attack on digital image watermarking

    NASA Astrophysics Data System (ADS)

    Jayanthi, VE.; Rajamani, V.; Karthikayen, P.

    2011-11-01

    We present an irreversible watermarking technique, robust to affine-transform attacks, for camera, biomedical and satellite images stored as monochrome bitmap images. The approach is based on image normalisation, in which both watermark embedding and extraction are carried out with respect to an image normalised to meet a set of predefined moment criteria. The normalisation procedure is invariant to affine-transform attacks. The resulting watermarking scheme is suitable for public watermarking applications, where the original image is not available for watermark extraction. A direct-sequence code-division multiple-access (DS-CDMA) approach is used to embed multibit text information in the DCT and DWT transform domains. The proposed watermarking schemes are robust against various types of attacks such as Gaussian noise, shearing, scaling, rotation, flipping, affine transform, signal processing and JPEG compression. Performance analysis results are measured using image-processing metrics.
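
    A toy version of the DCT-domain DS-CDMA embedding described above (the moment-based normalisation and DWT path are omitted; the band, key and strength are hypothetical parameters, not the paper's values):

```python
import numpy as np
from scipy.fft import dctn, idctn

BAND = (slice(8, 64), slice(8, 64))   # mid-frequency block, chosen arbitrarily

def _chip(key, shape):
    """Pseudo-noise +/-1 code derived from a shared key."""
    return np.random.default_rng(key).choice([-1.0, 1.0], size=shape)

def embed_bit(img, bit, key=0, strength=5.0):
    coeffs = dctn(np.asarray(img, dtype=float), norm="ortho")
    sign = 1.0 if bit else -1.0
    coeffs[BAND] += strength * sign * _chip(key, coeffs[BAND].shape)
    return idctn(coeffs, norm="ortho")

def detect_bit(img, key=0):
    coeffs = dctn(np.asarray(img, dtype=float), norm="ortho")
    band = coeffs[BAND]
    return float(np.sum(band * _chip(key, band.shape))) > 0.0  # correlator
```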

  1. 75 FR 80544 - NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ... COMMISSION NUREG-1953, Confirmatory Thermal-Hydraulic Analysis To Support Specific Success Criteria in the..., ``Confirmatory Thermal-Hydraulic Analysis to Support Specific Success Criteria in the Standardized Plant Analysis... . SUPPLEMENTARY INFORMATION: NUREG-1953, ``Confirmatory Thermal-Hydraulic Analysis to Support Specific...

  2. Minimizing impacts of land use change on ecosystem services using multi-criteria heuristic analysis.

    PubMed

    Keller, Arturo A; Fournier, Eric; Fox, Jessica

    2015-06-01

    Development of natural landscapes to support human activities impacts the capacity of the landscape to provide ecosystem services. Typically, several ecosystem services are impacted at a single development site and various footprint scenarios are possible; thus, a multi-criteria analysis is needed. Restoration potential should also be considered for the area surrounding the permanent impact site. The primary objective of this research was to develop a heuristic approach to analyze multiple criteria (e.g. impacts to various ecosystem services) in a spatial configuration with many potential development sites. The approach was to: (1) quantify the magnitude of terrestrial ecosystem service (biodiversity, carbon sequestration, nutrient and sediment retention, and pollination) impacts associated with a suite of land use change scenarios using the InVEST model; (2) normalize results across categories of ecosystem services to allow cross-service comparison; (3) apply the multi-criteria heuristic algorithm to select sites with the least impact to ecosystem services, including a spatial criterion (separation between sites). As a case study, the multi-criteria impact minimization algorithm was applied to InVEST output to select 25 potential development sites out of 204 possible locations (selected by other criteria) within a 24,000 ha property. This study advanced a generally applicable spatial multi-criteria approach for 1) considering many land use footprint scenarios, 2) balancing impact decisions across a suite of ecosystem services, and 3) determining the restoration potential of ecosystem services after impacts. PMID:25794964
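
    A minimal sketch in the spirit of that heuristic (not the paper's exact algorithm): normalise impacts per ecosystem service, then greedily pick the lowest-impact sites subject to a minimum separation:

```python
import numpy as np

def select_sites(impacts, coords, k=25, min_sep=1000.0):
    """impacts: (n_sites, n_services); coords: (n_sites, 2) in metres.
    Returns indices of up to k selected sites."""
    impacts = np.asarray(impacts, dtype=float)
    coords = np.asarray(coords, dtype=float)
    norm = (impacts - impacts.min(axis=0)) / (np.ptp(impacts, axis=0) + 1e-12)
    chosen = []
    for i in np.argsort(norm.sum(axis=1)):       # least total impact first
        if all(np.linalg.norm(coords[i] - coords[j]) >= min_sep
               for j in chosen):
            chosen.append(i)
        if len(chosen) == k:
            break
    return chosen
```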

  3. Irreversible Electroporation (IRE): Standardization of Terminology and Reporting Criteria for Analysis and Comparison

    PubMed Central

    Wendler, Johann J.; Fischbach, Katharina; Ricke, Jens; Jürgens, Julian; Fischbach, Frank; Köllermann, Jens; Porsch, Markus; Baumunk, Daniel; Schostak, Martin; Liehr, Uwe-Bernd; Pech, Maciej

    2016-01-01

    Summary Background Irreversible electroporation (IRE) has been introduced as a newer ablation modality, and its clinical niche is under investigation. At present just one IRE system has been approved for clinical use and is currently commercially available (the NanoKnife® system). In 2014, the International Working Group on Image-Guided Tumor Ablation updated its recommendation on standardization of terms and reporting criteria for image-guided tumor ablation. The IRE method is not covered in detail, but the non-thermal IRE method and the NanoKnife system differ fundamentally from established ablation techniques, especially thermal approaches, e.g. radiofrequency ablation (RFA). Material/Methods As numerous publications on IRE with varying terminology exist so far – with numbers continuously increasing – standardized terms and reporting criteria for IRE are needed urgently. The use of standardized terminology may then allow for better inter-study comparison of the methodology applied as well as the results achieved. Results Thus, the main objective of this document is to supplement the updated recommendation for image-guided tumor ablation by outlining a standardized set of terminology for the IRE procedure with the NanoKnife system, as well as to address essential clinical and technical information that should be provided when reporting on IRE tumor ablation. Conclusions We emphasize that usage of all the above recommended reporting criteria and terms can make IRE ablation reports comparable, and can provide the treatment transparency needed to assess the current value of IRE and support its further development. PMID:26966472

  4. Do choosing wisely tools meet criteria for patient decision aids? A descriptive analysis of patient materials

    PubMed Central

    Légaré, France; Hébert, Jessica; Goh, Larissa; Lewis, Krystina B; Leiva Portocarrero, Maria Ester; Robitaille, Hubert; Stacey, Dawn

    2016-01-01

    Objectives Choosing Wisely is a remarkable physician-led campaign to reduce unnecessary or harmful health services. Some of the literature identifies Choosing Wisely as a shared decision-making approach. We evaluated the patient materials developed by Choosing Wisely Canada to determine whether they meet the criteria for shared decision-making tools known as patient decision aids. Design Descriptive analysis of all Choosing Wisely Canada patient materials. Data source In May 2015, we selected all Choosing Wisely Canada patient materials from its official website. Main outcomes and measures Four team members independently extracted characteristics of the English materials using the International Patient Decision Aid Standards (IPDAS) modified 16-item minimum criteria for qualifying and certifying patient decision aids. The research team discussed discrepancies between data extractors and reached a consensus. Descriptive analysis was conducted. Results Of the 24 patient materials assessed, 12 were about treatments, 11 were about screening and 1 was about prevention. The median score for patient materials using IPDAS criteria was 10/16 (range: 8–11) for screening topics and 6/12 (range: 6–9) for prevention and treatment topics. Commonly missed criteria were stating the decision (21/24 did not), providing balanced information on option benefits/harms (24/24 did not), citing evidence (24/24 did not) and updating policy (24/24 did not). Out of 24 patient materials, only 2 met the 6 IPDAS criteria to qualify as patient decision aids, and neither of these 2 met the 6 certifying criteria. Conclusions Patient materials developed by Choosing Wisely Canada do not meet the IPDAS minimal qualifying or certifying criteria for patient decision aids. Modifications to the Choosing Wisely Canada patient materials would help to ensure that they qualify as patient decision aids and thus as more effective shared decision-making tools. PMID:27566638

  5. Multi-criteria decision analysis with probabilistic risk assessment for the management of contaminated ground water

    SciTech Connect

    Khadam, Ibrahim M.; Kaluarachchi, Jagath J

    2003-10-01

    Traditionally, environmental decision analysis in subsurface contamination scenarios is performed using cost-benefit analysis. In this paper, we discuss some of the limitations associated with cost-benefit analysis, especially its definition of risk, its definition of cost of risk, and its poor ability to communicate risk-related information. This paper presents an integrated approach for management of contaminated ground water resources using health risk assessment and economic analysis through a multi-criteria decision analysis framework. The methodology introduces several important concepts and definitions in decision analysis related to subsurface contamination. These are the trade-off between population risk and individual risk, the trade-off between the residual risk and the cost of risk reduction, and cost-effectiveness as a justification for remediation. The proposed decision analysis framework integrates probabilistic health risk assessment into a comprehensive, yet simple, cost-based multi-criteria decision analysis framework. The methodology focuses on developing decision criteria that provide insight into the common questions of the decision-maker that involve a number of remedial alternatives. The paper then explores three potential approaches for alternative ranking, a structured explicit decision analysis, a heuristic approach of importance of the order of criteria, and a fuzzy logic approach based on fuzzy dominance and similarity analysis. Using formal alternative ranking procedures, the methodology seeks to present a structured decision analysis framework that can be applied consistently across many different and complex remediation settings. A simple numerical example is presented to demonstrate the proposed methodology. The results showed the importance of using an integrated approach for decision-making considering both costs and risks. Future work should focus on the application of the methodology to a variety of complex field conditions to
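
    For contrast with the structured, heuristic and fuzzy ranking approaches explored in the paper, the simplest MCDA ranking rule (a weighted sum over normalised criteria) fits in a few lines; this is a generic illustration, not the paper's procedure:

```python
import numpy as np

def weighted_sum_rank(scores, weights):
    """scores: (alternatives, criteria), higher is better per criterion.
    Returns alternative indices ordered from best to worst."""
    s = np.asarray(scores, dtype=float)
    s = (s - s.min(axis=0)) / (np.ptp(s, axis=0) + 1e-12)  # 0-1 rescale
    return np.argsort(-(s @ np.asarray(weights, dtype=float)))
```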

  6. A computational image analysis glossary for biologists.

    PubMed

    Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M

    2012-09-01

    Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies. PMID:22872081

  7. Validity of Criteria-Based Content Analysis (CBCA) at Trial in Free-Narrative Interviews

    ERIC Educational Resources Information Center

    Roma, Paolo; San Martini, Pietro; Sabatello, Ugo; Tatarelli, Roberto; Ferracuti, Stefano

    2011-01-01

    Objective: The reliability of child witness testimony in sexual abuse cases is often controversial, and few assessment tools are available. Criteria-Based Content Analysis (CBCA) is a widely used instrument for evaluating psychological credibility in cases of suspected child sexual abuse. Only a few studies have evaluated CBCA scores in children suspected of…

  8. Target identification by image analysis.

    PubMed

    Fetz, V; Prochnow, H; Brönstrup, M; Sasse, F

    2016-05-01

    Covering: 1997 to the end of 2015. Each biologically active compound induces phenotypic changes in target cells that are characteristic for its mode of action. These phenotypic alterations can be directly observed under the microscope or made visible by labelling structural elements or selected proteins of the cells with dyes. A comparison of the cellular phenotype induced by a compound of interest with the phenotypes of reference compounds with known cellular targets allows prediction of its mode of action. While this approach has been successfully applied to the characterization of natural products based on a visual inspection of images, recent studies used automated microscopy and analysis software to increase speed and to reduce subjective interpretation. In this review, we give a general outline of the workflow for manual and automated image analysis, and we highlight natural products whose bacterial and eukaryotic targets could be identified through such approaches. PMID:26777141
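
    A minimal sketch of the comparison step described above: reduce each compound to an image-derived feature vector and predict the mode of action from the most similar reference profile. All profiles, feature values, and class names below are hypothetical; real pipelines use hundreds of features extracted by automated microscopy.

```python
import numpy as np

# Hypothetical phenotypic profiles: each compound is reduced to a feature
# vector extracted from cell images (e.g. nuclear area, tubulin texture, ...).
reference_profiles = {
    "tubulin inhibitor":   np.array([0.9, 0.1, 0.7, 0.2]),
    "DNA synthesis block": np.array([0.2, 0.8, 0.1, 0.6]),
    "protonophore":        np.array([0.4, 0.4, 0.9, 0.1]),
}
query = np.array([0.85, 0.15, 0.65, 0.25])  # profile of the compound of interest

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Predict the mode of action as that of the most similar reference compound.
best = max(reference_profiles, key=lambda k: cosine(query, reference_profiles[k]))
for name, prof in reference_profiles.items():
    print(f"{name:20s} cosine similarity {cosine(query, prof):.3f}")
print("predicted mode of action:", best)
```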

  9. The economics of project analysis: Optimal investment criteria and methods of study

    NASA Technical Reports Server (NTRS)

    Scriven, M. C.

    1979-01-01

    Insight is provided toward the development of an optimal program for investment analysis of project proposals offering commercial potential, and of its components. This involves a critique of economic investment criteria viewed in relation to the requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to the analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.

  10. Planning applications in image analysis

    NASA Technical Reports Server (NTRS)

    Boddy, Mark; White, Jim; Goldman, Robert; Short, Nick, Jr.

    1994-01-01

    We describe two interim results from an ongoing effort to automate the acquisition, analysis, archiving, and distribution of satellite earth science data. Both results are applications of Artificial Intelligence planning research to the automatic generation of processing steps for image analysis tasks. First, we have constructed a linear conditional planner (CPed), used to generate conditional processing plans. Second, we have extended an existing hierarchical planning system to make use of durations, resources, and deadlines, thus supporting the automatic generation of processing steps in time and resource-constrained environments.

  11. Quantitative multi-image analysis for biomedical Raman spectroscopic imaging.

    PubMed

    Hedegaard, Martin A B; Bergholt, Mads S; Stevens, Molly M

    2016-05-01

    Imaging by Raman spectroscopy enables unparalleled label-free insights into cell and tissue composition at the molecular level. With established approaches limited to single image analysis, there are currently no general guidelines or consensus on how to quantify biochemical components across multiple Raman images. Here, we describe a broadly applicable methodology for the combination of multiple Raman images into a single image for analysis. This is achieved by removing image specific background interference, unfolding the series of Raman images into a single dataset, and normalisation of each Raman spectrum to render comparable Raman images. Multivariate image analysis is finally applied to derive the contributing 'pure' biochemical spectra for relative quantification. We present our methodology using four independently measured Raman images of control cells and four images of cells treated with strontium ions from substituted bioactive glass. We show that the relative biochemical distribution per area of the cells can be quantified. In addition, using k-means clustering, we are able to discriminate between the two cell types over multiple Raman images. This study shows a streamlined quantitative multi-image analysis tool for improving cell/tissue characterisation and opens new avenues in biomedical Raman spectroscopic imaging. PMID:26833935
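
    The unfold-normalise-cluster sequence described above can be sketched as follows on synthetic data. The paper's multivariate step derives the contributing 'pure' component spectra; this sketch stops at the k-means stage mentioned for discriminating cell types across images.

```python
import numpy as np
from sklearn.cluster import KMeans

# Simulated stack: 8 Raman images, each 32x32 pixels with 600 wavenumbers.
rng = np.random.default_rng(0)
images = rng.random((8, 32, 32, 600))

# 1) remove an image-specific background (here: each image's minimum spectrum)
images = images - images.min(axis=(1, 2), keepdims=True)

# 2) unfold the series into one (pixels x wavenumbers) dataset
spectra = images.reshape(-1, images.shape[-1])

# 3) normalise each spectrum (unit area) so the images become comparable
spectra = spectra / spectra.sum(axis=1, keepdims=True)

# 4) cluster all pixels jointly, then fold labels back into per-image maps
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spectra)
label_maps = labels.reshape(8, 32, 32)
print("pixels per cluster:", np.bincount(labels))
```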

  12. Grid computing in image analysis

    PubMed Central

    2011-01-01

    Diagnostic surgical pathology or tissue-based diagnosis still remains the most reliable and specific diagnostic medical procedure. The development of whole slide scanners permits the creation of virtual slides and work on so-called virtual microscopes. In addition to interactive work on virtual slides, approaches have been reported that introduce automated virtual microscopy, which is composed of several tools focusing on quite different tasks. These include evaluation of image quality and image standardization, analysis of potentially useful thresholds for object detection and identification (segmentation), dynamic segmentation procedures, adjustable magnification to optimize feature extraction, and texture analysis including image transformation and evaluation of elementary primitives. Grid technology seems to possess all features to efficiently target and control the specific tasks of image information and detection in order to obtain a detailed and accurate diagnosis. Grid technology is based upon so-called nodes that are linked together and share certain communication rules in using open standards. Their number and functionality can vary according to the needs of a specific user at a given point in time. When implementing automated virtual microscopy with Grid technology, all of the five different Grid functions have to be taken into account, namely 1) computation services, 2) data services, 3) application services, 4) information services, and 5) knowledge services. Although all mandatory tools of automated virtual microscopy can be implemented in a closed or standardized open system, Grid technology offers a new dimension to acquire, detect, classify, and distribute medical image information, and to assure quality in tissue-based diagnosis. PMID:21516880

  13. Validating retinal fundus image analysis algorithms: issues and a proposal.

    PubMed

    Trucco, Emanuele; Ruggeri, Alfredo; Karnowski, Thomas; Giancardo, Luca; Chaum, Edward; Hubschman, Jean Pierre; Al-Diri, Bashir; Cheung, Carol Y; Wong, Damon; Abràmoff, Michael; Lim, Gilbert; Kumar, Dinesh; Burlina, Philippe; Bressler, Neil M; Jelinek, Herbert F; Meriaudeau, Fabrice; Quellec, Gwénolé; Macgillivray, Tom; Dhillon, Bal

    2013-05-01

    This paper concerns the validation of automatic retinal image analysis (ARIA) algorithms. For reasons of space and consistency, we concentrate on the validation of algorithms processing color fundus camera images, currently the largest section of the ARIA literature. We sketch the context (imaging instruments and target tasks) of ARIA validation, summarizing the main image analysis and validation techniques. We then present a list of recommendations focusing on the creation of large repositories of test data created by international consortia, easily accessible via moderated Web sites, including multicenter annotations by multiple experts, specific to clinical tasks, and capable of running submitted software automatically on the data stored, with clear and widely agreed-on performance criteria, to provide a fair comparison. PMID:23794433

  15. Use of stochastic multi-criteria decision analysis to support sustainable management of contaminated sediments.

    PubMed

    Sparrevik, Magnus; Barton, David N; Bates, Mathew E; Linkov, Igor

    2012-02-01

    Sustainable management of contaminated sediments requires careful prioritization of available resources and focuses on efforts to optimize decisions that consider environmental, economic, and societal aspects simultaneously. This may be achieved by combining different analytical approaches such as risk analysis (RA), life cycle analysis (LCA), multicriteria decision analysis (MCDA), and economic valuation methods. We propose the use of stochastic MCDA based on outranking algorithms to implement integrative sustainability strategies for sediment management. In this paper we use the method to select the best sediment management alternatives for the dibenzo-p-dioxin and -furan (PCDD/F) contaminated Grenland fjord in Norway. In the analysis, the benefits of health risk reductions and socio-economic benefits from removing seafood health advisories are evaluated against the detriments of remedial costs and life cycle environmental impacts. A value-plural based weighing of criteria is compared to criteria weights mimicking traditional cost-effectiveness (CEA) and cost-benefit (CBA) analyses. Capping highly contaminated areas in the inner or outer fjord is identified as the most preferable remediation alternative under all criteria schemes and the results are confirmed by a probabilistic sensitivity analysis. The proposed methodology can serve as a flexible framework for future decision support and can be a step toward more sustainable decision making for contaminated sediment management. It may be applicable to the broader field of ecosystem restoration for trade-off analysis between ecosystem services and restoration costs. PMID:22191941
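
    The outranking idea can be illustrated with a deterministic, simplified PROMETHEE-style calculation (net flows with the "usual" preference function). The alternatives and numbers are invented, and the paper's stochastic treatment and value-plural weighting schemes are not reproduced here.

```python
import numpy as np

# Hypothetical evaluation table: rows are remediation alternatives, columns
# are criteria (risk reduction and socio-economic benefit are maximised;
# remedial cost and life-cycle impact are minimised).
A = np.array([
    [0.8, 0.6, 120.0, 0.7],   # cap inner fjord
    [0.9, 0.7, 260.0, 1.1],   # cap inner + outer fjord
    [0.3, 0.2,  40.0, 0.3],   # monitored natural recovery
])
maximise = np.array([True, True, False, False])
weights = np.array([0.35, 0.25, 0.25, 0.15])

n = len(A)
P = np.zeros((n, n))          # P[i, j]: weighted preference of i over j
for i in range(n):
    for j in range(n):
        if i != j:
            diff = np.where(maximise, A[i] - A[j], A[j] - A[i])
            P[i, j] = weights[diff > 0].sum()   # "usual" preference function

phi = (P.sum(axis=1) - P.sum(axis=0)) / (n - 1)  # net outranking flow
names = ["cap inner fjord", "cap inner+outer fjord", "monitored recovery"]
for k in np.argsort(-phi):
    print(f"{names[k]:22s} net flow {phi[k]:+.3f}")
```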

  16. Image analysis of dye stained patterns in soils

    NASA Astrophysics Data System (ADS)

    Bogner, Christina; Trancón y Widemann, Baltasar; Lange, Holger

    2013-04-01

    Quality of surface water and groundwater is directly affected by flow processes in the unsaturated zone. In general, it is difficult to measure or model water flow. Indeed, parametrization of hydrological models is problematic and often no unique solution exists. To visualise flow patterns in soils directly, dye tracer studies can be done. These experiments provide images of stained soil profiles, and their evaluation demands knowledge in hydrology as well as in image analysis and statistics. First, these photographs are converted to binary images, classifying the pixels into dye-stained and non-stained ones. Then, some feature extraction is necessary to discern relevant hydrological information. In our study we propose to use several index functions to extract different (ideally complementary) features. We associate each image row with a feature vector (i.e. a certain number of image function values) and use these features to cluster the image rows to identify similar image areas. Because images of stained profiles might have different reasonable clusterings, we calculate multiple consensus clusterings. An expert can explore these different solutions and base his/her interpretation of predominant flow mechanisms on quantitative (objective) criteria. The complete workflow from reading in binary images to final clusterings has been implemented in the free R system, a language and environment for statistical computing. The calculation of image indices is part of our own package Indigo; manipulation of binary images, clustering, and visualization of results are done using either built-in facilities in R, additional R packages, or the LaTeX system.
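
    A sketch of the per-row feature idea in Python (the original workflow is implemented in R, and its consensus-clustering step is omitted here): each row of a binary stain image is reduced to a few index values, and rows are then clustered into similar image areas. The three index functions below are simple stand-ins for the paper's indices.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Synthetic binary profile image: 1 = dye-stained pixel, 0 = unstained,
# with stain coverage decreasing with depth (row index).
rng = np.random.default_rng(1)
img = (rng.random((120, 80)) < np.linspace(0.8, 0.1, 120)[:, None]).astype(int)

def row_features(row):
    """A few simple (ideally complementary) index functions per image row."""
    edges = np.flatnonzero(np.diff(np.r_[0, row, 0]))   # stained-run boundaries
    starts, ends = edges[::2], edges[1::2]
    coverage = row.mean()                               # stained fraction
    n_runs = len(starts)                                # number of stain patches
    mean_run = (ends - starts).mean() if n_runs else 0.0
    return [coverage, n_runs, mean_run]

features = np.array([row_features(r) for r in img])
# Standardise the features, then cluster rows into similar image areas.
z = (features - features.mean(axis=0)) / (features.std(axis=0) + 1e-12)
labels = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
print("rows per cluster:", np.bincount(labels)[1:])
```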

  17. A water quality monitoring network design using fuzzy theory and multiple criteria analysis.

    PubMed

    Chang, Chia-Ling; Lin, You-Tze

    2014-10-01

    A proper water quality monitoring design is required in a watershed, particularly in a water resource protected area. As numerous factors can influence the water quality monitoring design, this study applies multiple criteria analysis to evaluate the suitability of the water quality monitoring design in the Taipei Water Resource Domain (TWRD) in northern Taiwan. Seven criteria, comprising percentage of farmland area, percentage of built-up area, amount of non-point source pollution, green cover ratio, landslide area ratio, ratio of over-utilization on hillsides, and density of water quality monitoring stations, are selected in the multiple criteria analysis. The criteria are normalized and weighted, and a weighted scoring method is applied to score the subbasins; subbasins with higher scores have higher priority for an increased density of water quality stations. Fuzzy theory is utilized to prioritize the need for a higher density of water quality monitoring stations. The results show that the need for more water quality stations in subbasin 2 in the Bei-Shih Creek Basin is much higher than in the other subbasins. Furthermore, the existing water quality station in subbasin 2 requires maintenance. It is recommended that new water quality stations be built in subbasin 2. PMID:24974234

  18. Evaluation of Diagnostic Criteria for Night Eating Syndrome Using Item Response Theory Analysis

    PubMed Central

    Allison, Kelly C.; Engel, Scott G.; Crosby, Ross D.; de Zwaan, Martina; O’Reardon, John P.; Wonderlich, Stephen A.; Mitchell, James E.; West, Delia Smith; Wadden, Thomas A.; Stunkard, Albert J.

    2008-01-01

    Uniform diagnostic criteria for the night eating syndrome (NES), a disorder characterized by a delay in the circadian pattern of eating, have not been established. Proposed criteria for NES were evaluated using item response theory (IRT) analysis. Six studies yielded 1,481 Night Eating Questionnaires which were coded to reflect the presence/absence of five night eating symptoms. Symptoms were evaluated based on the clinical usefulness of their diagnostic information and on the assumptions of IRT analysis (unidimensionality, monotonicity, local item independence, correct model specification), using a two parameter logistic (2PL) IRT model. Reports of (1) nocturnal eating and/or evening hyperphagia, (2) initial insomnia, and (3) night awakenings showed high precision in discriminating those with night eating problems, while morning anorexia and delayed morning meal provided little additional information. IRT is a useful tool for evaluating the diagnostic criteria of psychiatric disorders and can be used to evaluate potential diagnostic criteria of NES empirically. Behavioral factors were identified as useful discriminators of NES. Future work should also examine psychological factors in conjunction with those identified here. PMID:18928902
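
    The two-parameter logistic (2PL) model underlying the analysis can be written down compactly, as sketched below. The item parameters are invented to contrast a highly informative symptom with a weakly informative one, mirroring the contrast the study reports between nocturnal eating and morning anorexia.

```python
import numpy as np

def p_2pl(theta, a, b):
    """Two-parameter logistic IRT model: probability of endorsing a symptom
    given latent night-eating severity theta, discrimination a, difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of one item under the 2PL model, a^2 * P * (1 - P);
    high values mean the symptom discriminates precisely at that severity."""
    p = p_2pl(theta, a, b)
    return a**2 * p * (1.0 - p)

theta = np.linspace(-3, 3, 7)
# Hypothetical parameters: a well-discriminating symptom vs. a weak one.
for name, a, b in [("nocturnal eating", 2.5, 1.0), ("morning anorexia", 0.7, 0.5)]:
    info = item_information(theta, a, b)
    print(f"{name:18s} information over theta: {np.round(info, 2)}")
```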

  19. Automated image analysis of uterine cervical images

    NASA Astrophysics Data System (ADS)

    Li, Wenjing; Gu, Jia; Ferris, Daron; Poirson, Allen

    2007-03-01

    Cervical Cancer is the second most common cancer among women worldwide and the leading cause of cancer mortality of women in developing countries. If detected early and treated adequately, cervical cancer can be virtually prevented. Cervical precursor lesions and invasive cancer exhibit certain morphologic features that can be identified during a visual inspection exam. Digital imaging technologies allow us to assist the physician with a Computer-Aided Diagnosis (CAD) system. In colposcopy, epithelium that turns white after application of acetic acid is called acetowhite epithelium. Acetowhite epithelium is one of the major diagnostic features observed in detecting cancer and pre-cancerous regions. Automatic extraction of acetowhite regions from cervical images has been a challenging task due to specular reflection, various illumination conditions, and most importantly, large intra-patient variation. This paper presents a multi-step acetowhite region detection system to analyze the acetowhite lesions in cervical images automatically. First, the system calibrates the color of the cervical images to be independent of screening devices. Second, the anatomy of the uterine cervix is analyzed in terms of cervix region, external os region, columnar region, and squamous region. Third, the squamous region is further analyzed and subregions based on three levels of acetowhite are identified. The extracted acetowhite regions are accompanied by color scores to indicate the different levels of acetowhite. The system has been evaluated on data from 40 human subjects and demonstrates high correlation with experts' annotations.

  20. Definition and Yield of Inclusion Criteria for a Meta-Analysis of Patient Education Studies in Clinical Preventive Services.

    ERIC Educational Resources Information Center

    Tabak, Ellen R.; And Others

    1991-01-01

    A framework and concepts for developing inclusion criteria for meta-analysis are presented and illustrated in a meta-analysis of primary studies in patient education for preventive health services. Of 5,451 citations located and abstracts screened, 64 studies eventually met acceptability criteria. (SLD)

  1. Model Criteria for and Content Analysis of Historical Fiction about the Holocaust for Grades Four through Twelve.

    ERIC Educational Resources Information Center

    Muallem, Miriam; Dowd, Frances A.

    1992-01-01

    Establishes components of high-quality books about the Holocaust for children and young adults. Uses these criteria in a content analysis of six historical novels about the Holocaust. Highlights setting, viewpoint, characterization, plot, style, theme, special features, and author. These criteria can be a model for future analysis of historical…

  2. Designing Software for Flood Risk Assessment Based on Multi Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters not only in Turkey but also all over the world, and it causes serious damage and harm. It is estimated that of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul was flooded; the insurance sector received around 1,200 claim notices during that period, and insurance companies had to pay a total of $40 million for claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million for claims. To solve these kinds of problems, modern tools such as GIS and Remote Sensing should be utilized. In this study, software was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology, and land use, all extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 meter spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by object-oriented nearest-neighbour classification with image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi Criteria Decision Analysis (MCDA) part of the software. Criteria and their sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. Daily flood data were also collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service Curve Number (SCS-CN) method and used as input for the InfoDif part of the software. Obtained results were verified using ground truth data.
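
    The AHP weighting step can be sketched as follows; the pairwise comparison matrix below is invented for illustration and is not the judgment matrix used in the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for the five flood criteria
# (slope, aspect, elevation, geology, land use) on Saaty's 1-9 scale;
# entry [i, j] says how much more important criterion i is than j.
C = np.array([
    [1,   3,   2,   5,   1/2],
    [1/3, 1,   1/2, 2,   1/4],
    [1/2, 2,   1,   3,   1/3],
    [1/5, 1/2, 1/3, 1,   1/6],
    [2,   4,   3,   6,   1  ],
], dtype=float)

# AHP weights: principal right eigenvector, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(C)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio (random index RI = 1.12 for n = 5);
# CR < 0.1 is conventionally considered acceptable.
n = C.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print("weights:", np.round(w, 3), " CR:", round(ci / 1.12, 3))
```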

  3. Neural network ultrasound image analysis

    NASA Astrophysics Data System (ADS)

    Schneider, Alexander C.; Brown, David G.; Pastel, Mary S.

    1993-09-01

    Neural network based analysis of ultrasound image data was carried out on liver scans of normal subjects and those diagnosed with diffuse liver disease. In a previous study, ultrasound images from a group of normal volunteers, Gaucher's disease patients, and hepatitis patients were obtained by Garra et al., who used classical statistical methods to distinguish from among these three classes. In the present work, neural network classifiers were employed with the same image features found useful in the previous study for this task. Both standard backpropagation neural networks and a recently developed biologically-inspired network called Dystal were used. Classification performance as measured by the area under a receiver operating characteristic curve was generally excellent for the backpropagation networks and was roughly comparable to that of classical statistical discriminators tested on the same data set and documented in the earlier study. Performance of the Dystal network was significantly inferior; however, this may be due to the choice of network parameters. Potential methods for enhancing network performance were identified.

  4. Exploratory Factor Analysis of Diagnostic and Statistical Manual, 5th Edition, Criteria for Posttraumatic Stress Disorder.

    PubMed

    McSweeney, Lauren B; Koch, Ellen I; Saules, Karen K; Jefferson, Stephen

    2016-01-01

    One change to the posttraumatic stress disorder (PTSD) nomenclature highlighted in the Diagnostic and Statistical Manual, 5th Edition (DSM-5; American Psychiatric Association, 2013) is the conceptualization of PTSD as a diagnostic category with four distinct symptom clusters. This article presents exploratory factor analysis to test the structural validity of the DSM-5 conceptualization of PTSD via an online survey that included the PTSD Checklist-5. The study utilized a sample of 113 college students from a large Midwestern university and 177 Amazon Mechanical Turk users. Participants were primarily female, Caucasian, single, and heterosexual with an average age of 32 years. Approximately 30% to 35% of participants met diagnostic criteria for PTSD based on two different scoring criteria. Results of the exploratory factor analysis revealed five distinct symptom clusters. The implications for the classification of PTSD are discussed. PMID:26669983

  5. A watershed-based cumulative risk impact analysis: environmental vulnerability and impact criteria.

    PubMed

    Osowski, S L; Swick, J D; Carney, G R; Pena, H B; Danielson, J E; Parrish, D A

    2001-01-01

    Swine Concentrated Animal Feeding Operations (CAFOs) have received much attention in recent years. As a result, a watershed-based screening tool, the Cumulative Risk Index Analysis (CRIA), was developed to assess the cumulative impacts of multiple CAFO facilities in a watershed subunit. The CRIA formula calculates an index number based on: 1) the area of one or more facilities compared to the area of the watershed subunit, 2) the average of the environmental vulnerability criteria, and 3) the average of the industry-specific impact criteria. Each vulnerability or impact criterion is ranked on a 1 to 5 scale, with a low rank indicating low environmental vulnerability or impact and a high rank indicating high environmental vulnerability or impact. The individual criterion ranks, as well as the total CRIA score, can be used to focus the environmental analysis and facilitate discussions with industry, public, and other stakeholders in the Agency decision-making process. PMID:11214349
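
    The abstract names the three CRIA components but not the exact formula that combines them, so the multiplicative combination in this sketch is an assumption made purely for illustration.

```python
# The abstract lists the three CRIA components (area ratio, average
# vulnerability rank, average impact rank) but not how they are combined;
# the product used below is an illustrative assumption, not the CRIA formula.
facility_area_km2 = 2.4
watershed_subunit_area_km2 = 310.0

vulnerability_ranks = [4, 3, 5, 2]   # each criterion ranked 1 (low) to 5 (high)
impact_ranks = [3, 4, 4]             # industry-specific impact criteria

area_ratio = facility_area_km2 / watershed_subunit_area_km2
vuln_avg = sum(vulnerability_ranks) / len(vulnerability_ranks)
impact_avg = sum(impact_ranks) / len(impact_ranks)

cria_index = area_ratio * vuln_avg * impact_avg
print(f"area ratio {area_ratio:.4f}, vulnerability {vuln_avg:.2f}, "
      f"impact {impact_avg:.2f}, index {cria_index:.3f}")
```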

  6. Potentially inappropriate drug prescribing in elderly hospitalized patients: an analysis and comparison of explicit criteria.

    PubMed

    Di Giorgio, Concetta; Provenzani, Alessio; Polidori, Piera

    2016-04-01

    Background The management of drug therapy in the elderly is a critical aspect of primary care. The physio-pathological complexity of the elderly involves the prescription of multiple drugs, exposing them to a higher risk of adverse reactions. Objective The aim of this study was to assess medication use and (potentially) inappropriate medications and prescribing omissions in the elderly before and during hospitalization, according to the main tools described in the literature, and their relation to the number of comorbidities. Setting The study was carried out by the Clinical Pharmacists at ISMETT, an Italian research institute. Methods The prescriptions of elderly patients admitted to ISMETT between January and December 2012 were analyzed. Information about the clinical profiles and prescriptions of these patients was obtained from the electronic medical records. The 2012 Beers criteria, the Screening Tool of Older Person's Prescriptions/Screening Tool to Alert doctors to Right Treatment criteria, and the Improving Prescribing in the Elderly criteria were used to evaluate the appropriateness of prescriptions. The correlation between the number of comorbidities and the different tools was analyzed with the Spearman correlation coefficient. The frequency analysis was done with the Pearson chi-square test. Main outcome measure Percentage of potentially inappropriate medications and prescribing omissions before/during hospitalization in the elderly. Results 1027 elderly patients were admitted between January and December 2012. At admission and during hospitalization, according to the Beers criteria 24 and 49% of patients had at least one potentially inappropriate medication, respectively; according to the Screening Tool of Older Person's Prescriptions criteria, 21 and 27%, respectively; according to the Improving Prescribing in the Elderly criteria, 28 and 25%, respectively; and according to the Screening Tool to Alert doctors to Right Treatment criteria, 28 and 33% had at least one potential prescribing omission, respectively.

  7. Enclosure fire hazard analysis using relative energy release criteria. [burning rate and combustion control

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1978-01-01

    A method for predicting the probable course of fire development in an enclosure is presented. This fire modeling approach uses a graphic plot of five fire development constraints, the relative energy release criteria (RERC), to bound the heat release rates in an enclosure as a function of time. The five RERC are flame spread rate, fuel surface area, ventilation, enclosure volume, and total fuel load. They may be calculated versus time based on the specified or empirical conditions describing the specific enclosure, the fuel type and load, and the ventilation. The calculation of these five criteria, using the common basis of energy release rates versus time, provides a unifying framework for the utilization of available experimental data from all phases of fire development. The plot of these criteria reveals the probable fire development envelope and indicates which fire constraint will be controlling during a criteria time period. Examples of RERC application to fire characterization and control and to hazard analysis are presented along with recommendations for the further development of the concept.
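
    The envelope construction can be sketched numerically: evaluate each constraint as a heat-release-rate limit over time and take the pointwise minimum. The curve shapes and values below are invented for illustration, not taken from the report.

```python
import numpy as np

t = np.linspace(0, 600, 61)                      # time, s

# Hypothetical heat-release-rate limits (kW) from each RERC constraint.
flame_spread = 0.5 * t**2 / 60                   # growth-limited early on
fuel_surface = np.full_like(t, 900.0)            # burning-area limit
ventilation = np.full_like(t, 600.0)             # oxygen-supply limit
enclosure_volume = np.full_like(t, 1500.0)       # volumetric energy limit
fuel_load = np.where(t < 480, 2000.0, 50.0)      # total fuel eventually spent

# Probable fire development envelope: the lowest constraint governs.
constraints = np.vstack([flame_spread, fuel_surface, ventilation,
                         enclosure_volume, fuel_load])
envelope = constraints.min(axis=0)
controlling = np.array(["flame spread", "fuel surface", "ventilation",
                        "enclosure volume", "fuel load"])[constraints.argmin(axis=0)]
idx = int(np.argmin(np.abs(t - 300)))
print("controlling constraint at t=300 s:", controlling[idx])   # ventilation
```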

  8. Use of multi-criteria decision analysis in regulatory alternatives analysis: a case study of lead free solder.

    PubMed

    Malloy, Timothy F; Sinsheimer, Peter J; Blake, Ann; Linkov, Igor

    2013-10-01

    Regulators are implementing new programs that require manufacturers of products containing certain chemicals of concern to identify, evaluate, and adopt viable, safer alternatives. Such programs raise the difficult question for policymakers and regulated businesses of which alternatives are "viable" and "safer." To address that question, these programs use "alternatives analysis," an emerging methodology that integrates issues of human health and environmental effects with technical feasibility and economic impact. Despite the central role that alternatives analysis plays in these programs, the methodology itself is neither well-developed nor tailored to application in regulatory settings. This study uses the case of Pb-based bar solder and its non-Pb-based alternatives to examine the application of 2 multi-criteria decision analysis (MCDA) methods to alternatives analysis: multi-attribute utility analysis and outranking. The article develops and evaluates an alternatives analysis methodology and supporting decision-analysis software for use in a regulatory context, using weighting of the relevant decision criteria generated from a stakeholder elicitation process. The analysis produced complete rankings of the alternatives, including identification of the relative contribution to the ranking of each of the highest level decision criteria such as human health impacts, technical feasibility, and economic feasibility. It also examined the effect of variation in data conventions, weighting, and decision frameworks on the outcome. The results indicate that MCDA can play a critical role in emerging prevention-based regulatory programs. Multi-criteria decision analysis methods offer a means for transparent, objective, and rigorous analysis of products and processes, providing regulators and stakeholders with a common baseline understanding of the relative performance of alternatives and the trade-offs they present. PMID:23703936

  9. Adaptation and Evaluation of a Multi-Criteria Decision Analysis Model for Lyme Disease Prevention

    PubMed Central

    Aenishaenslin, Cécile; Gern, Lise; Michel, Pascal; Ravel, André; Hongoh, Valérie; Waaub, Jean-Philippe; Milord, François; Bélanger, Denise

    2015-01-01

    Designing preventive programs relevant to vector-borne diseases such as Lyme disease (LD) can be complex given the need to include multiple issues and perspectives into prioritizing public health actions. A multi-criteria decision aid (MCDA) model was previously used to rank interventions for LD prevention in Quebec, Canada, where the disease is emerging. The aim of the current study was to adapt and evaluate the decision model constructed in Quebec under a different epidemiological context, in Switzerland, where LD has been endemic for the last thirty years. The model adaptation was undertaken with a group of Swiss stakeholders using a participatory approach. The PROMETHEE method was used for multi-criteria analysis. Key elements and results of the MCDA model are described and contrasted with the Quebec model. All criteria and most interventions of the MCDA model developed for LD prevention in Quebec were directly transferable to the Swiss context. Four new decision criteria were added, and the list of proposed interventions was modified. Based on the overall group ranking, interventions targeting human populations were prioritized in the Swiss model, with the top ranked action being the implementation of a large communication campaign. The addition of criteria did not significantly alter the intervention rankings, but increased the capacity of the model to discriminate between highest and lowest ranked interventions. The current study suggests that beyond the specificity of the MCDA models developed for Quebec and Switzerland, their general structure captures the fundamental and common issues that characterize the complexity of vector-borne disease prevention. These results should encourage public health organizations to adapt, use and share MCDA models as an effective and functional approach to enable the integration of multiple perspectives and considerations in the prevention and control of complex public health issues such as Lyme disease or other vector-borne diseases.

  10. Latent Class Analysis of DSM-5 Alcohol Use Disorder Criteria Among Heavy-Drinking College Students.

    PubMed

    Rinker, Dipali Venkataraman; Neighbors, Clayton

    2015-10-01

    The DSM-5 has created significant changes in the definition of alcohol use disorders (AUDs). Limited work has considered the impact of these changes in specific populations, such as heavy-drinking college students. Latent class analysis (LCA) is a person-centered approach that divides a population into mutually exclusive and exhaustive latent classes, based on observable indicator variables. The present research was designed to examine whether there were distinct classes of heavy-drinking college students who met DSM-5 criteria for an AUD and whether gender, perceived social norms, use of protective behavioral strategies (PBS), drinking refusal self-efficacy (DRSE), self-perceptions of drinking identity, psychological distress, and membership in a fraternity/sorority would be associated with class membership. Three-hundred and ninety-four college students who met DSM-5 criteria for an AUD were recruited from three different universities. Two distinct classes emerged: Less Severe (86%), the majority of whom endorsed both drinking more than intended and tolerance, as well as met criteria for a mild AUD; and More Severe (14%), the majority of whom endorsed at least half of the DSM-5 AUD criteria and met criteria for a severe AUD. Relative to the Less Severe class, membership in the More Severe class was negatively associated with DRSE and positively associated with self-identification as a drinker. There is a distinct class of heavy-drinking college students with a more severe AUD and for whom intervention content needs to be more focused and tailored. Clinical implications are discussed. PMID:26051027

  11. New stability criteria and bifurcation analysis for nonlinear discrete-time coupled loops with multiple delays.

    PubMed

    Peng, Mingshu; Yang, Xiaozhong

    2010-03-01

    A detailed analysis of zero distributions in a special polynomial of the form λ^τ(λ−a₁)(λ−a₂)⋯(λ−aₙ)−(c+id) is proposed, where all aᵢ (i=1,2,…,n) have the same sign. As applications, new criteria for the asymptotic behavior of nonlinear delayed coupled systems with different topological structures are established. All possible bifurcations, including codimension-two bifurcations with 1:4/1:3 strong resonance in such a delayed difference system, are discussed. Numerical simulation gives a solid verification of the theoretical analysis. PMID:20370280

  13. Using multiple criteria decision analysis for supporting decisions of solid waste management.

    PubMed

    Cheng, Steven; Chan, Christine W; Huang, Guo H

    2002-01-01

    Design of solid-waste management systems requires consideration of multiple alternative solutions and evaluation criteria because the systems can have complex and conflicting impacts on different stakeholders. Multiple criteria decision analysis (MCDA) has been found to be a fruitful approach to solve this design problem. In this paper, the MCDA approach is applied to solve the landfill selection problem in Regina, Saskatchewan, Canada. The systematic approach of MCDA helps decision makers select the most preferable decision and provides the basis of a decision support system. The techniques that are used in this study include: 1) the Simple Weighted Addition method, 2) the Weighted Product method, 3) TOPSIS, 4) cooperative game theory, and 5) ELECTRE. The results generated with these methods are compared and ranked so that the most preferable solution is identified. PMID:12090287
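
    Of the techniques listed, TOPSIS is compact enough to sketch in full; the candidate sites, criteria, and weights below are hypothetical.

```python
import numpy as np

def topsis(X, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.
    X: (alternatives x criteria) matrix; benefit: True where larger is better."""
    R = X / np.linalg.norm(X, axis=0)            # vector-normalise each criterion
    V = R * weights
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)               # higher = closer to ideal

# Hypothetical landfill candidate sites scored on cost (minimise), distance
# to groundwater (maximise), and capacity (maximise).
X = np.array([[12.0, 40.0, 5.0],
              [ 9.0, 25.0, 3.5],
              [15.0, 60.0, 8.0]])
closeness = topsis(X, np.array([0.4, 0.4, 0.2]), np.array([False, True, True]))
print("closeness coefficients:", np.round(closeness, 3))
print("best site:", int(np.argmax(closeness)) + 1)
```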

  14. MetaQC: objective quality control and inclusion/exclusion criteria for genomic meta-analysis

    PubMed Central

    Kang, Dongwan D.; Sibille, Etienne; Kaminski, Naftali; Tseng, George C.

    2012-01-01

    Genomic meta-analysis to combine relevant and homogeneous studies has been widely applied, but the quality control (QC) and objective inclusion/exclusion criteria have been largely overlooked. Currently, the inclusion/exclusion criteria mostly depend on ad-hoc expert opinion or naïve threshold by sample size or platform. There are pressing needs to develop a systematic QC methodology as the decision of study inclusion greatly impacts the final meta-analysis outcome. In this article, we propose six quantitative quality control measures, covering internal homogeneity of coexpression structure among studies, external consistency of coexpression pattern with pathway database, and accuracy and consistency of differentially expressed gene detection or enriched pathway identification. Each quality control index is defined as the minus log transformed P values from formal hypothesis testing. Principal component analysis biplots and a standardized mean rank are applied to assist visualization and decision. We applied the proposed method to 4 large-scale examples, combining 7 brain cancer, 9 prostate cancer, 8 idiopathic pulmonary fibrosis and 17 major depressive disorder studies, respectively. The identified problematic studies were further scrutinized for potential technical or biological causes of their lower quality to determine their exclusion from meta-analysis. The application and simulation results concluded a systematic quality assessment framework for genomic meta-analysis. PMID:22116060
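
    The flavour of the approach: turn each QC hypothesis test into a minus log transformed P value and summarize studies by a standardized mean rank across the indices. The sketch below is schematic (base-10 logs, simple unadjusted ranks) and does not reproduce MetaQC's exact definitions.

```python
import numpy as np

# Hypothetical hypothesis-test P values for three QC measures across five
# studies (rows = studies); smaller P = better quality on that measure.
pvals = np.array([
    [1e-8, 1e-6, 1e-4],
    [1e-3, 1e-2, 5e-2],
    [1e-9, 1e-7, 1e-5],
    [2e-1, 3e-1, 1e-1],
    [1e-5, 1e-4, 1e-3],
])

qc = -np.log10(pvals)            # each QC index = -log-transformed P value

# Mean rank per study: rank on each index (1 = best quality), then average.
ranks = qc.shape[0] - qc.argsort(axis=0).argsort(axis=0)   # high qc -> rank 1
smr = ranks.mean(axis=1)
for s, r in enumerate(smr, 1):
    print(f"study {s}: standardized mean rank {r:.2f}")
print("candidate for exclusion: study", int(np.argmax(smr)) + 1)
```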

  15. Spreadsheet-like image analysis

    NASA Astrophysics Data System (ADS)

    Wilson, Paul

    1992-08-01

    This report describes the design of a new software system being built by the Army to support and augment automated nondestructive inspection (NDI) on-line equipment implemented by the Army for detection of defective manufactured items. The new system recalls and post-processes (off-line) the NDI data sets archived by the on-line equipment for the purposes of verifying the correctness of the inspection analysis paradigms, developing better analysis paradigms, and gathering statistics on the defects of the items inspected. The design of the system is similar to that of a spreadsheet, i.e., an array of cells which may be programmed to contain functions with arguments being data from other cells and whose resultant is the output of that cell's function. Unlike a spreadsheet, the arguments and the resultants of a cell may be a matrix such as a two-dimensional matrix of picture elements (pixels). Functions include matrix mathematics, neural networks, and image processing as well as those ordinarily found in spreadsheets. The system employs all of the common environmental supports of the Macintosh computer, which is the hardware platform. The system allows the resultant of a cell to be displayed in any of multiple formats such as a matrix of numbers, text, an image, or a chart. Each cell is a window onto the resultant. Like a spreadsheet, if the input value of any cell is changed, its effect is cascaded into the resultants of all cells whose functions use that value directly or indirectly. The system encourages the user to play what-if games, as ordinary spreadsheets do.
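
    The cascading-recompute behaviour described above can be sketched with a toy cell model whose values may be matrices; a production system would follow the dependency graph, whereas this sketch simply iterates to a fixed point.

```python
import numpy as np

class Sheet:
    """Toy spreadsheet: cells hold values or functions of other cells."""

    def __init__(self):
        self.values, self.formulas = {}, {}

    def set_value(self, name, value):
        self.values[name] = value
        self._recompute()                 # cascade into dependent cells

    def set_formula(self, name, func, *args):
        self.formulas[name] = (func, args)
        self._recompute()

    def _recompute(self):
        # Naive fixed-point passes; one pass per formula suffices for
        # acyclic dependencies regardless of insertion order.
        for _ in range(len(self.formulas)):
            for name, (func, args) in self.formulas.items():
                if all(a in self.values for a in args):
                    self.values[name] = func(*(self.values[a] for a in args))

sheet = Sheet()
sheet.set_value("image", np.arange(16).reshape(4, 4))   # a matrix-valued cell
sheet.set_formula("smoothed", lambda im: (im[:-1] + im[1:]) / 2, "image")
sheet.set_formula("mean", np.mean, "smoothed")
print(sheet.values["mean"])                              # 7.5
sheet.set_value("image", np.ones((4, 4)))                # cascade re-evaluates
print(sheet.values["mean"])                              # 1.0
```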

  16. APPROACH TO LEVEL 2 ANALYSIS BASED ON LEVEL 1 RESULTS, MEG CATEGORIES AND COMPOUNDS, AND DECISION CRITERIA

    EPA Science Inventory

    The report describes an approach to the decision criteria needed to proceed from the initial emission screening analysis (Level 1) to the detailed emission characterization (Level 2), and a Level 2 analytical approach. The decision criteria, considering only the available Level 1...

  17. Automated Dermoscopy Image Analysis of Pigmented Skin Lesions

    PubMed Central

    Baldi, Alfonso; Quartulli, Marco; Murace, Raffaele; Dragonetti, Emanuele; Manganaro, Mario; Guerra, Oscar; Bizzi, Stefano

    2010-01-01

    Dermoscopy (dermatoscopy, epiluminescence microscopy) is a non-invasive diagnostic technique for the in vivo observation of pigmented skin lesions (PSLs), allowing a better visualization of surface and subsurface structures (from the epidermis to the papillary dermis). This diagnostic tool permits the recognition of morphologic structures not visible by the naked eye, thus opening a new dimension in the analysis of the clinical morphologic features of PSLs. In order to reduce the learning curve of non-expert clinicians and to mitigate problems inherent in the reliability and reproducibility of the diagnostic criteria used in pattern analysis, several indicative methods based on diagnostic algorithms have been introduced in the last few years. Recently, numerous systems designed to provide computer-aided analysis of digital images obtained by dermoscopy have been reported in the literature. The goal of this article is to review these systems, focusing on the most recent approaches based on content-based image retrieval systems (CBIR). PMID:24281070

  18. IMAGE ANALYSIS ALGORITHMS FOR DUAL MODE IMAGING SYSTEMS

    SciTech Connect

    Robinson, Sean M.; Jarman, Kenneth D.; Miller, Erin A.; Misner, Alex C.; Myjak, Mitchell J.; Pitts, W. Karl; Seifert, Allen; Seifert, Carolyn E.; Woodring, Mitchell L.

    2010-06-11

    The level of detail discernable in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes where information barriers are mandatory. However, if a balance can be struck between sufficient information barriers and feature extraction to verify or identify objects of interest, imaging may significantly advance verification efforts. This paper describes the development of combined active (conventional) radiography and passive (auto) radiography techniques for imaging sensitive items, assuming that comparison images cannot be furnished. Three image analysis algorithms are presented, each of which reduces full image information to non-sensitive feature information and ultimately is intended to provide only a yes/no response verifying features present in the image. These algorithms are evaluated on both their technical performance in image analysis and their application with or without an explicitly constructed information barrier. The first algorithm reduces images to non-invertible pixel intensity histograms, retaining only summary information about the image that can be used in template comparisons. This one-way transform is sufficient to discriminate between different image structures (in terms of area and density) without revealing unnecessary specificity. The second algorithm estimates the attenuation cross-section of objects of known shape based on transition characteristics around the edge of the object's image. The third algorithm compares the radiography image with the passive image to discriminate dense, radioactive material from point sources or inactive dense material. By comparing two images and reporting only a single statistic from the combination thereof, this algorithm can operate entirely behind an information barrier stage. Together with knowledge of the radiography system, these algorithms can be used in combination to improve verification capability in inspection regimes.
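
    The first algorithm's idea, a non-invertible histogram signature compared against a template to yield only a yes/no answer, can be sketched as follows; the images, bin count, and decision threshold are invented stand-ins.

```python
import numpy as np

def histogram_signature(image, bins=32):
    """One-way transform: keep only the pixel-intensity histogram, which
    discards spatial layout and cannot be inverted to the original image."""
    hist, _ = np.histogram(image, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()

def verify(image, template_sig, threshold=0.1):
    """Yes/no verification: compare signatures, reveal only the decision."""
    sig = histogram_signature(image)
    distance = 0.5 * np.abs(sig - template_sig).sum()   # total variation
    return bool(distance < threshold)

rng = np.random.default_rng(7)
reference = rng.beta(2, 5, size=(64, 64))      # stand-in radiograph
inspected = rng.beta(2, 5, size=(64, 64))      # same "object class"
different = rng.beta(5, 1, size=(64, 64))      # denser object

template = histogram_signature(reference)
print("matching item verified:", verify(inspected, template))    # True
print("different item verified:", verify(different, template))   # False
```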

  19. FFDM image quality assessment using computerized image texture analysis

    NASA Astrophysics Data System (ADS)

    Berger, Rachelle; Carton, Ann-Katherine; Maidment, Andrew D. A.; Kontos, Despina

    2010-04-01

    Quantitative measures of image quality (IQ) are routinely obtained during the evaluation of imaging systems. These measures, however, do not necessarily correlate with the IQ of the actual clinical images, which can also be affected by factors such as patient positioning. No quantitative method currently exists to evaluate clinical IQ. Therefore, we investigated the potential of using computerized image texture analysis to quantitatively assess IQ. Our hypothesis is that image texture features can be used to assess IQ as a measure of the image signal-to-noise ratio (SNR). To test feasibility, the "Rachel" anthropomorphic breast phantom (Model 169, Gammex RMI) was imaged with a Senographe 2000D FFDM system (GE Healthcare) using 220 unique exposure settings (target/filter, kV, and mAs combinations). The mAs were varied from 10% to 300% of that required for an average glandular dose (AGD) of 1.8 mGy. A 2.5 cm² retroareolar region of interest (ROI) was segmented from each image. The SNR was computed from the ROIs segmented from images linear with dose (i.e., raw images) after flat-field and offset correction. Image texture features of skewness, coarseness, contrast, energy, homogeneity, and fractal dimension were computed from the Premium View™ postprocessed image ROIs. Multiple linear regression demonstrated a strong association between the computed image texture features and SNR (R² = 0.92, p ≤ 0.001). When including kV, target, and filter as additional predictor variables, a stronger association with SNR was observed (R² = 0.95, p ≤ 0.001). The strong associations indicate that computerized image texture analysis can be used to measure image SNR and potentially aid in automating IQ assessment as a component of the clinical workflow. Further work is underway to validate our findings in larger clinical datasets.

  20. Image analysis applications for grain science

    NASA Astrophysics Data System (ADS)

    Zayas, Inna Y.; Steele, James L.

    1991-02-01

    Morphometrical features of single grain kernels or particles were used to discriminate two visibly similar wheat varieties, foreign material in wheat, hard/soft and spring/winter wheat classes, and whole from broken corn kernels. Milled fractions of hard and soft wheat were evaluated using textural image analysis. Color image analysis of sound and mold-damaged corn kernels yielded high recognition rates. The studies collectively demonstrate the potential for automated classification and assessment of grain quality using image analysis.

  1. Automatic processing, analysis, and recognition of images

    NASA Astrophysics Data System (ADS)

    Abrukov, Victor S.; Smirnov, Evgeniy V.; Ivanov, Dmitriy G.

    2004-11-01

    New approaches and computer codes (A&CC) for automatic processing, analysis and recognition of images are offered. The A&CC are based on the presentation of an object image as a collection of pixels of various colours and the consecutive automatic painting of distinct parts of the image. The A&CC have technical objectives centred on such directions as: 1) image processing, 2) image feature extraction, 3) image analysis, and some others, in any sequence and combination. The A&CC allow various geometrical and statistical parameters of an object image and its parts to be obtained. Additional possibilities of A&CC usage involve artificial neural network technologies. We believe that the A&CC can be used in creating systems of testing and control in various fields of industry and military applications (airborne imaging systems, tracking of moving objects), in medical diagnostics, in creating new software for CCDs, and in industrial vision and decision-making systems. The capabilities of the A&CC were tested in image analysis of model fires and plumes of sprayed fluid and ensembles of particles, in decoding interferometric images, in digitization of paper diagrams of electrical signals, in text recognition, in noise elimination and image filtering, in analysis of astronomical images and aerial photography, and in object detection.
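
    The "consecutive automatic painting" of image parts is essentially connected-component labelling, which can be sketched with a flood fill; the toy image below is invented for the example.

```python
from collections import deque

# Minimal sketch of "consecutive automatic painting": flood-fill each
# connected group of same-coloured pixels with a distinct part label.
image = [
    [1, 1, 0, 0, 2],
    [1, 0, 0, 2, 2],
    [0, 0, 1, 1, 0],
]

h, w = len(image), len(image[0])
labels = [[0] * w for _ in range(h)]
next_label = 0

for sy in range(h):
    for sx in range(w):
        if labels[sy][sx] == 0 and image[sy][sx] != 0:
            next_label += 1
            queue = deque([(sy, sx)])
            labels[sy][sx] = next_label
            while queue:                      # paint one connected part
                y, x = queue.popleft()
                for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                    if (0 <= ny < h and 0 <= nx < w and labels[ny][nx] == 0
                            and image[ny][nx] == image[sy][sx]):
                        labels[ny][nx] = next_label
                        queue.append((ny, nx))

print("distinct parts found:", next_label)    # 3 for the toy image above
for row in labels:
    print(row)
```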

  2. Satellite image analysis using neural networks

    NASA Technical Reports Server (NTRS)

    Sheldon, Roger A.

    1990-01-01

    The tremendous backlog of unanalyzed satellite data necessitates the development of improved methods for data cataloging and analysis. Ford Aerospace has developed an image analysis system, SIANN (Satellite Image Analysis using Neural Networks) that integrates the technologies necessary to satisfy NASA's science data analysis requirements for the next generation of satellites. SIANN will enable scientists to train a neural network to recognize image data containing scenes of interest and then rapidly search data archives for all such images. The approach combines conventional image processing technology with recent advances in neural networks to provide improved classification capabilities. SIANN allows users to proceed through a four step process of image classification: filtering and enhancement, creation of neural network training data via application of feature extraction algorithms, configuring and training a neural network model, and classification of images by application of the trained neural network. A prototype experimentation testbed was completed and applied to climatological data.

  3. Image analysis for discrimination of cervical neoplasia

    NASA Astrophysics Data System (ADS)

    Pogue, Brian W.; Mycek, Mary-Ann; Harper, Diane

    2000-01-01

    Colposcopy involves visual imaging of the cervix for patients who have exhibited some prior indication of abnormality, and the major goals are to visually inspect for any malignancies and to guide biopsy sampling. Currently colposcopy equipment is being upgraded in many health care centers to incorporate digital image acquisition and archiving. These permanent images can be analyzed for characteristic features and color patterns which may enhance the specificity and objectivity of the routine exam. In this study a series of images from patients with biopsy-confirmed cervical intraepithelial neoplasia stage 2/3 are compared with images from patients with biopsy-confirmed immature squamous metaplasia, with the goal of determining optimal criteria for automated discrimination between them. All images were separated into their red, green, and blue channels, and comparisons were made between relative intensity, intensity variation, spatial frequencies, fractal dimension, and Euler number. This study indicates that computer-based processing of cervical images can provide some discrimination of the type of tissue features which are important for clinical evaluation, with the Euler number being the most clinically useful feature to discriminate metaplasia from neoplasia. Also there was a strong indication that morphology observed in the blue channel of the image provided more information about epithelial cell changes. Further research in this field can lead to advances in computer-aided diagnosis as well as the potential for online image enhancement in digital colposcopy.
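
    The Euler number highlighted above is simply the number of connected objects minus the number of holes. A sketch of one way to compute it follows; connectivity conventions vary, and this simplified version uses 4-connectivity throughout.

```python
import numpy as np
from scipy import ndimage

def euler_number(binary):
    """2-D Euler number: connected foreground objects minus holes."""
    _, n_objects = ndimage.label(binary)           # 4-connected objects
    # Holes are background components that do not touch the image border;
    # padding with background leaves exactly one border-touching component.
    _, n_bg = ndimage.label(np.pad(~binary, 1, constant_values=True))
    return n_objects - (n_bg - 1)

ring = np.zeros((9, 9), dtype=bool)
ring[2:7, 2:7] = True
ring[4, 4] = False                                 # one object with one hole
dots = np.zeros((9, 9), dtype=bool)
dots[1, 1] = dots[4, 4] = dots[7, 7] = True        # three objects, no holes

print("ring Euler number:", euler_number(ring))    # 1 - 1 = 0
print("dots Euler number:", euler_number(dots))    # 3 - 0 = 3
```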

  4. Idiopathic environmental intolerance: Part 2: A causation analysis applying Bradford Hill's criteria to the psychogenic theory.

    PubMed

    Staudenmayer, Herman; Binkley, Karen E; Leznoff, Arthur; Phillips, Scott

    2003-01-01

    Toxicogenic and psychogenic theories have been proposed to explain idiopathic environmental intolerance (IEI). Part 2 of this article is an evidence-based causality analysis of the psychogenic theory using an extended version of Bradford Hill's criteria. The psychogenic theory meets all of the criteria directly or indirectly and is characterised by a progressive research programme including double-blind, placebo-controlled provocation challenge studies. We conclude that IEI is a belief characterised by an overvalued idea of toxic attribution of symptoms and disability, fulfilling criteria for a somatoform disorder and a functional somatic syndrome. A neurobiological diathesis similar to anxiety, specifically panic disorder, is a neurobiologically plausible mechanism to explain triggered reactions to ambient doses of environmental agents, real or perceived. In addition, there is a cognitively mediated fear response mechanism characterised by vigilance for perceived exposures and bodily sensations that are subsequently amplified in the process of learned sensitivity. Implications for the assessment and treatment of patients are presented. PMID:15189047

  5. Regulatory analysis on criteria for the release of patients administered radioactive material. Final report

    SciTech Connect

    Schneider, S.; McGuire, S.A.

    1997-02-01

    This regulatory analysis was developed to respond to three petitions for rulemaking to amend 10 CFR parts 20 and 35 regarding release of patients administered radioactive material. The petitions requested revision of these regulations to remove the ambiguity that existed between the 1-millisievert (0.1-rem) total effective dose equivalent (TEDE) public dose limit in Part 20, adopted in 1991, and the activity-based release limit in 10 CFR 35.75 that, in some instances, would permit release of individuals in excess of the current public dose limit. Three alternatives for resolution of the petitions were evaluated. Under Alternative 1, NRC would amend its patient release criteria in 10 CFR 35.75 to match the annual public dose limit in Part 20 of 1 millisievert (0.1 rem) TEDE. Alternative 2 would maintain the status quo of using the activity-based release criteria currently found in 10 CFR 35.75. Under Alternative 3, the NRC would revise the release criteria in 10 CFR 35.75 to specify a dose limit of 5 millisieverts (0.5 rem) TEDE.

  6. Using Multi Criteria Decision Making in Analysis of Alternatives for Selection of Enabling Technology

    NASA Astrophysics Data System (ADS)

    Georgiadis, Daniel

    Prior to Milestone A, the Department of Defense (DoD) requires that service sponsors conduct an Analysis of Alternatives (AoA), an analytical comparison of multiple alternatives, to be completed prior to committing and investing costly resources to one project or decision. Despite this requirement, sponsors will circumvent or dilute the process in an effort to save money or schedule, and specific requirements are proposed that can effectively eliminate all but the preselected alternatives. This research focuses on identifying decision-aiding methods that can lead to the selection of specific criteria that are key performance drivers, thus enabling an informed selection of the enabling technology. This work defines the enabling technology as the sub-system which presents the most risk within the system design. After a thorough literature review of available Multi Criteria Decision Making methods, a case study example is presented demonstrating the selection of the enabling technology of a Light Detection and Ranging (LIDAR) system. Using subjective criteria in the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) is shown to successfully account for tacit knowledge of expert practitioners.

  7. Regulatory analysis on criteria for the release of patients administered radioactive material

    SciTech Connect

    Schneider, S.; McGuire, S.A.; Behling, U.H.; Behling, K.; Goldin, D.

    1994-05-01

    The Nuclear Regulatory Commission (NRC) has received two petitions to amend its regulations in 10 CFR Parts 20 and 35 as they apply to doses received by members of the public exposed to patients released from a hospital after they have been administered radioactive material. While the two petitions are not identical, they both request that the NRC establish a dose limit of 5 millisieverts (0.5 rem) per year for individuals exposed to patients who have been administered radioactive materials. This Regulatory Analysis evaluates three alternatives. Alternative 1 is for the NRC to amend its patient release criteria in 10 CFR 35.75 to use the more stringent dose limit of 1 millisievert per year in 10 CFR 20.1301(a) for its patient release criteria. Alternative 2 is for the NRC to continue using the existing patient release criteria in 10 CFR 35.75 of 1,110 megabecquerels of activity or a dose rate at one meter from the patient of 0.05 millisievert per hour. Alternative 3 is for the NRC to amend the patient release criteria in 10 CFR 35.75 to specify a dose limit of 5 millisieverts for patient release. The evaluation indicates that Alternative 1 would cause a prohibitively large increase in the national health care cost from retaining patients in a hospital longer and would cause significant personal and psychological costs to patients and their families. The choice of Alternatives 2 or 3 would affect only thyroid cancer patients treated with iodine-131. For those patients, Alternative 3 would result in less hospitalization than Alternative 2. Alternative 3 has a potential decrease in national health care cost of $30,000,000 per year but would increase the potential collective dose from released therapy patients by about 2,700 person-rem per year, mainly to family members.

  8. Microscopy image segmentation tool: Robust image data analysis

    SciTech Connect

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for the analysis of microscopy images that contain large collections of small regions of interest (ROIs). Originally developed for the analysis of scanning electron images of porous anodic alumina, MIST's capabilities have been expanded to a large variety of problems, including analysis of biological tissue, inorganic and organic film grain structure, and nano- and mesoscopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized, user-developed analyses. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.
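    MIST's internals are not given in the abstract; for orientation only, a generic threshold-and-label ROI segmentation (not MIST's algorithm — the smoothing, threshold rule, and synthetic image below are assumptions) can be written with scipy.ndimage:

```python
import numpy as np
from scipy import ndimage

def segment_rois(image, sigma=1.0):
    """Threshold-and-label segmentation of bright ROIs on a dark background."""
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma)
    mask = smoothed > smoothed.mean() + 3 * smoothed.std()    # crude global threshold
    labels, n = ndimage.label(mask)                           # connected components
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))  # pixels per ROI
    centers = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    return labels, sizes, centers

rng = np.random.default_rng(0)
img = rng.poisson(5, (256, 256)).astype(float)
img[100:110, 50:60] += 50.0                                   # one synthetic ROI
labels, sizes, centers = segment_rois(img)
print(len(sizes), "ROIs found; largest covers", int(sizes.max()), "pixels")
```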

  9. Sizing and ranging criteria for SAR images of steel and wood specimens

    NASA Astrophysics Data System (ADS)

    Le, Viet; Yu, Tzuyang; Owusu Twumasi, Jones; Tang, Qixiang

    2016-04-01

    The use of microwave and radar sensors in the nondestructive evaluation (NDE) of damaged materials and structures has been proven to be a promising approach. In this paper, a portable imaging radar sensor utilizing a 10 GHz central frequency and stripmap synthetic aperture radar (SAR) imaging was applied to steel and wood specimens for size and range determination. Relationships between range and properties of SAR images (e.g. maximum amplitude and total SAR amplitude) were developed and reported for various specimens including a steel bar (2.5 cm by 2.5 cm by 28.5 cm), a wood bar (2.5 cm by 2.5 cm by 28.5 cm), a steel plate (39.7 cm by 57.9 cm by 1.75 cm), and a wood board (30.5 cm by 30.5 cm by 1.8 cm). Ranges from 30 cm to 100 cm were used for these specimens. In our experiment, attenuation of radar signals collected by the imaging radar system on different material specimens was measured and modeled. Changes in the attenuation of maximum SAR amplitude were observed across materials. It was found that SAR images can be used to distinguish materials of different compositions and sizes.

  10. Micro-CT imaging: Developing criteria for examining fetal skeletons in regulatory developmental toxicology studies - A workshop report.

    PubMed

    Solomon, Howard M; Makris, Susan L; Alsaid, Hasan; Bermudez, Oscar; Beyer, Bruce K; Chen, Antong; Chen, Connie L; Chen, Zhou; Chmielewski, Gary; DeLise, Anthony M; de Schaepdrijver, Luc; Dogdas, Belma; French, Julian; Harrouk, Wafa; Helfgott, Jonathan; Henkelman, R Mark; Hesterman, Jacob; Hew, Kok-Wah; Hoberman, Alan; Lo, Cecilia W; McDougal, Andrew; Minck, Daniel R; Scott, Lelia; Stewart, Jane; Sutherland, Vicki; Tatiparthi, Arun K; Winkelmann, Christopher T; Wise, L David; Wood, Sandra L; Ying, Xiaoyou

    2016-06-01

    During the past two decades the use and refinement of imaging modalities has markedly increased, making it possible to image embryos and fetuses used in pivotal nonclinical studies submitted to regulatory agencies. Implementing these technologies into the Good Laboratory Practice environment requires rigorous testing, validation, and documentation to ensure the reproducibility of data. A workshop on current practices and regulatory requirements was held with the goal of defining minimal criteria for the proper implementation of these technologies and subsequent submission to regulatory agencies. Micro-computed tomography (micro-CT) is especially well suited for high-throughput evaluations, and is gaining popularity for evaluating fetal skeletons to assess the potential developmental toxicity of test agents. This workshop was convened to help scientists in the developmental toxicology field understand and apply micro-CT technology to nonclinical toxicology studies and facilitate the regulatory acceptance of imaging data. Presentations and workshop discussions covered: (1) principles of micro-CT fetal imaging; (2) concordance of findings with conventional skeletal evaluations; and (3) regulatory requirements for validating the system. Establishing these requirements for micro-CT examination can provide a path forward for laboratories considering implementing this technology and provide regulatory agencies with a basis to consider the acceptability of data generated via this technology. PMID:26930635

  11. Image registration with uncertainty analysis

    DOEpatents

    Simonson, Katherine M.

    2011-03-22

    In an image registration method, edges are detected in a first image and a second image. For each candidate translation, the percentage of edge pixels in a subset of the second image that are also edges in the translated first image is calculated. The best registration point is the translation that maximizes the percentage of edges matched. Within a predefined search region, all registration points other than the best registration point that are not significantly worse than it, according to a predetermined statistical criterion, are also identified.
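    A toy sketch of the described matching criterion follows. The Sobel edge detector, the wrap-around shift, and the exhaustive search window are simplifications, and the patent's statistical test for flagging "not significantly worse" points is omitted:

```python
import numpy as np
from scipy import ndimage

def edges(img):
    """Binary edge map from a simple gradient-magnitude threshold."""
    g = ndimage.sobel(img, axis=0)**2 + ndimage.sobel(img, axis=1)**2
    return g > np.percentile(g, 90)

def best_shift(e1, e2, search=5):
    """Translation maximizing the fraction of second-image edges matched."""
    best, best_pct = (0, 0), -1.0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(e1, dy, axis=0), dx, axis=1)  # wraps at borders
            pct = (e2 & shifted).sum() / max(e2.sum(), 1)
            if pct > best_pct:
                best, best_pct = (dy, dx), pct
    return best, best_pct

img = np.zeros((64, 64)); img[20:40, 20:40] = 1.0
e1, e2 = edges(img), edges(np.roll(img, 3, axis=1))
print(best_shift(e1, e2))   # expect ((0, 3), 1.0)
```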

  12. Multi-criteria analysis on how to select solar radiation hydrogen production system

    SciTech Connect

    Badea, G.; Naghiu, G. S. Felseghi, R.-A.; Giurca, I.; Răboacă, S.; Aşchilean, I.

    2015-12-23

    The purpose of this article is to present a method of selecting hydrogen-production systems using the electric power obtained in photovoltaic systems, and as a selecting method, we suggest the use of the Advanced Multi-Criteria Analysis based on the FRISCO formula. According to the case study on how to select the solar radiation hydrogen production system, the most convenient alternative is the alternative A4, namely the technical solution involving a hydrogen production system based on the electrolysis of water vapor obtained with concentrated solar thermal systems and electrical power obtained using concentrating photovoltaic systems.

  13. Multiple Criteria and Multiple Periods Performance Analysis: The Comparison of North African Railways

    NASA Astrophysics Data System (ADS)

    Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.

    2008-10-01

    Multi-period differences in technical and financial performance are analysed by comparing five North African railways over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index for measuring total factor productivity change, decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the ARGOS software. These methods provide complementary detailed information: Malmquist discriminates between technological and management progress, while PROMETHEE captures two dimensions of performance that are often in conflict, namely service to the community and enterprise performance.

  14. Multi-criteria analysis on how to select solar radiation hydrogen production system

    NASA Astrophysics Data System (ADS)

    Badea, G.; Naghiu, G. S.; Felseghi, R.-A.; Rǎboacǎ, S.; Aşchilean, I.; Giurca, I.

    2015-12-01

    The purpose of this article is to present a method of selecting hydrogen-production systems using the electric power obtained in photovoltaic systems, and as a selecting method, we suggest the use of the Advanced Multi-Criteria Analysis based on the FRISCO formula. According to the case study on how to select the solar radiation hydrogen production system, the most convenient alternative is the alternative A4, namely the technical solution involving a hydrogen production system based on the electrolysis of water vapor obtained with concentrated solar thermal systems and electrical power obtained using concentrating photovoltaic systems.

  15. Multi-attribute criteria applied to electric generation energy system analysis LDRD.

    SciTech Connect

    Kuswa, Glenn W.; Tsao, Jeffrey Yeenien; Drennen, Thomas E.; Zuffranieri, Jason V.; Paananen, Orman Henrie; Jones, Scott A.; Ortner, Juergen G.; Brewer, Jeffrey D.; Valdez, Maximo M.

    2005-10-01

    This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, comparisons of those options, and efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that will allow comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.

  16. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.

  17. Millimeter-wave sensor image analysis

    NASA Technical Reports Server (NTRS)

    Wilson, William J.; Suess, Helmut

    1989-01-01

    Images from an airborne scanning radiometer operating at a frequency of 98 GHz have been analyzed. The mm-wave images were obtained in 1985/1986 using the JPL mm-wave imaging sensor. The goal of this study was to enhance the information content of these images and make their interpretation easier for human analysts. A visual interpretative approach was used for information extraction from the images, including application of nonlinear transform techniques for noise reduction and for color, contrast, and edge enhancement. Results of applying these techniques to selected mm-wave images are presented.
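    The paper does not list its exact transforms; a minimal sketch of this style of processing (median filtering, robust contrast stretching, and unsharp masking — all illustrative choices, not the authors' pipeline) could read:

```python
import numpy as np
from scipy import ndimage

def enhance(img):
    """Nonlinear noise reduction, contrast stretch, and unsharp-mask edge boost."""
    den = ndimage.median_filter(img, size=3)        # nonlinear, edge-preserving
    lo, hi = np.percentile(den, (2, 98))            # robust contrast stretch
    stretched = np.clip((den - lo) / (hi - lo), 0, 1)
    blurred = ndimage.gaussian_filter(stretched, 2.0)
    return np.clip(stretched + 0.6 * (stretched - blurred), 0, 1)

frame = np.random.default_rng(0).random((128, 128)) ** 2  # stand-in for a 98 GHz image
out = enhance(frame)
print(out.min(), out.max())
```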

  18. Image processing software for imaging spectrometry data analysis

    NASA Technical Reports Server (NTRS)

    Mazer, Alan; Martin, Miki; Lee, Meemong; Solomon, Jerry E.

    1988-01-01

    Imaging spectrometers simultaneously collect image data in hundreds of spectral channels, from the near-UV to the IR, and can thereby provide direct surface materials identification by means resembling laboratory reflectance spectroscopy. Attention is presently given to a software system, the Spectral Analysis Manager (SPAM) for the analysis of imaging spectrometer data. SPAM requires only modest computational resources and is composed of one main routine and a set of subroutine libraries. Additions and modifications are relatively easy, and special-purpose algorithms have been incorporated that are tailored to geological applications.

  19. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

    This chapter discusses quantitative analysis of digital microscope images and presents several exercises that illustrate the concepts. The basic concepts of quantitative imaging rest on a well-established foundation of signal theory and quantitative data analysis. Several examples develop an understanding of the imaging process as a transformation from sample to image, together with the limits and considerations of quantitative analysis. The chapter introduces the concept of digitally correcting images and focuses on some of the more critical types of data transformation and some of the frequently encountered issues in quantization. Image processing is a form of data processing, comparable to other forms such as fitting data to a theoretical curve, and in all these cases it is critical that care is taken during every step of transformation, processing, and quantization. PMID:23931513

  20. Multiscale Analysis of Solar Image Data

    NASA Astrophysics Data System (ADS)

    Young, C. A.; Myers, D. C.

    2001-12-01

    It is often said that the blessing and curse of solar physics is that there is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also cursed us with far more, and more complex, data than previous missions. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or a sequence of byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer; there is little quantitative, objective analysis done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods are potentially suited to solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to model the human ability to view and comprehend phenomena on different scales, so they could be used to quantify the image processing performed by the observer's eye and brain. In this work we present a preliminary analysis of multiscale techniques applied to solar image data. Specifically, we explore the use of the 2-D wavelet transform and related transforms with EIT, LASCO and TRACE images. This work was supported by NASA contract NAS5-00220.
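    A starting point for such an analysis, using the third-party PyWavelets package (an assumption — the authors do not name their software), is a multi-level 2-D wavelet decomposition with per-scale detail energies:

```python
import numpy as np
import pywt  # PyWavelets

img = np.random.default_rng(0).random((512, 512))   # stand-in for an EIT/TRACE frame
coeffs = pywt.wavedec2(img, "haar", level=4)         # 2-D multiresolution decomposition
print("approximation at coarsest scale:", coeffs[0].shape)
for lvl, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
    energy = float(np.sum(cH**2 + cV**2 + cD**2))    # detail energy, coarsest first
    print(f"detail level {lvl}: energy {energy:.3g}")
```

    Per-scale energies of this kind give a quantitative handle on which spatial scales dominate a frame, replacing the purely visual judgment the abstract criticizes.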

  1. A 3D image analysis tool for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.

  2. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill siting in Kurdistan Province, western Iran

    SciTech Connect

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-10-15

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing a significant amount of spatial information, which makes GIS an important tool for land-use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. GIS was used to perform an initial screening process to eliminate unsuitable land, followed by a multi-criteria decision analysis (MCDA) to identify the most suitable sites using information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. To evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria resulting from field observation. Using this suitability index, 15 different sites were visited, and based on the numerical evaluation provided by the MCDA the most suitable sites were determined.
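    The suitability-index construction can be illustrated with a small numpy sketch; the rasters, weights, and exclusion rule below are toy stand-ins for the paper's 21 exclusionary and 14 non-exclusionary criteria:

```python
import numpy as np

def suitability(criteria, weights, exclusion_mask):
    """Weighted linear overlay of [0, 1] criterion rasters with hard exclusions."""
    index = sum(w * c for w, c in zip(weights, criteria))
    return np.where(exclusion_mask, 0.0, index)   # excluded land scores zero

rng = np.random.default_rng(1)
dist_water = rng.random((50, 50))   # toy raster: suitability vs distance to water
slope_suit = rng.random((50, 50))   # toy raster: suitability vs slope
excluded = slope_suit > 0.95        # e.g. protected areas from exclusionary criteria
s = suitability([dist_water, slope_suit], [0.6, 0.4], excluded)
print("most suitable cell:", np.unravel_index(s.argmax(), s.shape))
```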

  3. Image Reconstruction Using Analysis Model Prior.

    PubMed

    Han, Yu; Du, Huiqian; Lam, Fan; Mei, Wenbo; Fang, Liping

    2016-01-01

    The analysis model has been previously exploited as an alternative to the classical sparse synthesis model for designing image reconstruction methods. Applying a suitable analysis operator on the image of interest yields a cosparse outcome which enables us to reconstruct the image from undersampled data. In this work, we introduce an additional prior in the analysis context and theoretically study the uniqueness issues in terms of analysis operators in general position and the specific 2D finite difference operator. We establish bounds on the minimum number of measurements, which are lower than those in cases without the analysis model prior. Based on the idea of iterative cosupport detection (ICD), we develop a novel image reconstruction model and an effective algorithm, achieving significantly better reconstruction performance. Simulation results on synthetic and practical magnetic resonance (MR) images are also shown to illustrate our theoretical claims. PMID:27379171

  4. Image Reconstruction Using Analysis Model Prior

    PubMed Central

    Han, Yu; Du, Huiqian; Lam, Fan; Mei, Wenbo; Fang, Liping

    2016-01-01

    The analysis model has been previously exploited as an alternative to the classical sparse synthesis model for designing image reconstruction methods. Applying a suitable analysis operator on the image of interest yields a cosparse outcome which enables us to reconstruct the image from undersampled data. In this work, we introduce an additional prior in the analysis context and theoretically study the uniqueness issues in terms of analysis operators in general position and the specific 2D finite difference operator. We establish bounds on the minimum number of measurements, which are lower than those in cases without the analysis model prior. Based on the idea of iterative cosupport detection (ICD), we develop a novel image reconstruction model and an effective algorithm, achieving significantly better reconstruction performance. Simulation results on synthetic and practical magnetic resonance (MR) images are also shown to illustrate our theoretical claims. PMID:27379171

  5. Design criteria for a multiple input land use system. [digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.; Bryant, N. A.

    1975-01-01

    A design is presented that proposes the use of digital image processing techniques to interface existing geocoded data sets and information management systems with thematic maps and remote sensed imagery. The basic premise is that geocoded data sets can be referenced to a raster scan that is equivalent to a grid cell data set, and that images taken of thematic maps or from remote sensing platforms can be converted to a raster scan. A major advantage of the raster format is that x, y coordinates are implicitly recognized by their position in the scan, and z values can be treated as Boolean layers in a three-dimensional data space. Such a system permits the rapid incorporation of data sets, rapid comparison of data sets, and adaptation to variable scales by resampling the raster scans.

  6. Coastal flooding as a parameter in multi-criteria analysis for industrial site selection

    NASA Astrophysics Data System (ADS)

    Christina, C.; Memos, C.; Diakoulaki, D.

    2014-12-01

    Natural hazards can trigger major industrial accidents, which apart from affecting industrial installations may cause a series of accidents with serious impacts on human health and the environment far beyond the site boundary. Such accidents, also called Na-Tech (natural-technical) accidents, deserve particular attention since they can cause releases of hazardous substances possibly resulting in severe environmental pollution, explosions and/or fires. Many kinds of natural events, or in general terms natural causes of industrial accidents, have been implicated, such as landslides, hurricanes, high winds, tsunamis, lightning, extreme cold or heat, floods and heavy rains. The scope of this paper is to examine coastal flooding as a parameter in causing an industrial accident, such as the nuclear disaster in Fukushima, Japan, and the critical role of this parameter in industrial site selection. Land-use planning is a complex procedure that requires multi-criteria decision analysis involving economic, environmental and social parameters, and in this context the occurrence of a natural hazard such as coastal flooding should be set as a parameter by the decision makers. This paper evaluates how the risk of an accident triggered by coastal flooding influences the outcome of a multi-criteria decision analysis for industrial spatial planning, with flooding analyzed in the context of both sea- and inland-induced events.

  7. Imaging-based enrichment criteria using deep learning algorithms for efficient clinical trials in mild cognitive impairment.

    PubMed

    Ithapu, Vamsi K; Singh, Vikas; Okonkwo, Ozioma C; Chappell, Richard J; Dowling, N Maritza; Johnson, Sterling C

    2015-12-01

    The mild cognitive impairment (MCI) stage of Alzheimer's disease (AD) may be optimal for clinical trials to test potential treatments for preventing or delaying decline to dementia. However, MCI is heterogeneous in that not all cases progress to dementia within the time frame of a trial, and some may not have underlying AD pathology. Identifying the MCI cases most likely to decline during a trial, and thus most likely to benefit from treatment, will improve trial efficiency and the power to detect treatment effects. To this end, multimodal, imaging-derived inclusion criteria may be especially beneficial. Here, we present a novel multimodal imaging marker that predicts future cognitive and neural decline from [F-18]fluorodeoxyglucose positron emission tomography (PET), amyloid florbetapir PET, and structural magnetic resonance imaging, based on a new deep learning algorithm (randomized denoising autoencoder marker, rDAm). Using ADNI2 MCI data, we show that using rDAm as a trial enrichment criterion reduces the required sample-size estimates by at least a factor of five compared with the no-enrichment regime and leads to smaller trials with high statistical power, compared with existing methods. PMID:26093156

  8. SU-E-J-27: Appropriateness Criteria for Deformable Image Registration and Dose Propagation

    SciTech Connect

    Papanikolaou, P; Tuohy, Rachel; Mavroidis, P; Eng, T; Gutierrez, A; Stathakis, S

    2014-06-01

    Purpose: Several commercial software packages have recently been released that allow the user to apply deformable registration algorithms (DRA) for image fusion and dose propagation. Although the idea of anatomically tracking the daily patient dose in the context of adaptive radiotherapy, or merely adding the dose from prior treatment to the current one, is very intuitive, the accuracy and applicability of such algorithms need to be investigated, as their evaluation remains somewhat subjective. In our study, we used true anatomical data in which we introduced changes in the density, volume and location of segmented structures to test the sensitivity and accuracy of the DRA. Methods: The CT scan of a prostate patient was selected for this study. The CT images were first segmented to define structures such as the PTV, bladder, rectum, intestines and pelvic bone anatomy. To perform our study, we introduced anatomical changes in the reference patient image set in three different ways: (i) we kept the segmented volumes constant and changed the density of rectum and bladder in increments of 5%; (ii) we changed the volume of rectum and bladder in increments of 5%; and (iii) we kept the segmented volumes constant but changed their location by moving their COM in increments of 3 mm. Using the Velocity software, we evaluated the accuracy of the DRA for each incremental change in all three scenarios. Results: The DRA performs reasonably well when the differential density difference against the background is more than 5%. For the volume change study, the DRA results became unreliable for relative volume changes greater than 10%. Finally, for the location study, the DRA performance was acceptable for shifts below 9 mm. Conclusion: Site-specific and patient-specific QA for DRA is an important step in evaluating such algorithms prior to their use for dose propagation.

  9. On the predictive information criteria for model determination in seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data which are not independent given parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, in which the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above-mentioned criteria are global summary measures of model performance, but a more detailed analysis may be required to discover the reasons for poor global performance; in that case, a retrospective predictive analysis is performed on each individual observation. In this study we performed a Bayesian analysis of Italian data sets using four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). We then illustrate their performance as evaluated by the Bayes factor, predictive information criteria, and retrospective predictive analysis.
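    The Watanabe criterion cited above is usually computed as WAIC from pointwise posterior log-likelihoods. A minimal numpy/scipy sketch follows; the matrix of draws is synthetic, and this is the generic formula rather than the authors' code:

```python
import numpy as np
from scipy.special import logsumexp

def waic(log_lik):
    """WAIC from an (S posterior draws x N observations) log-likelihood matrix."""
    S = log_lik.shape[0]
    lppd = np.sum(logsumexp(log_lik, axis=0) - np.log(S))  # log pointwise pred. density
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))       # effective parameter count
    return -2.0 * (lppd - p_waic)                          # deviance scale: lower is better

# Toy example: pointwise log-likelihoods from 2000 posterior draws, 50 observations.
draws = np.random.default_rng(0).normal(-1.0, 0.1, size=(2000, 50))
print(waic(draws))
```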

  10. Assessing Interventions to Manage West Nile Virus Using Multi-Criteria Decision Analysis with Risk Scenarios.

    PubMed

    Hongoh, Valerie; Campagna, Céline; Panic, Mirna; Samuel, Onil; Gosselin, Pierre; Waaub, Jean-Philippe; Ravel, André; Samoura, Karim; Michel, Pascal

    2016-01-01

    The recent emergence of West Nile virus (WNV) in North America highlights vulnerability to climate sensitive diseases and stresses the importance of preventive efforts to reduce their public health impact. Effective prevention involves reducing environmental risk of exposure and increasing adoption of preventive behaviours, both of which depend on knowledge and acceptance of such measures. When making operational decisions about disease prevention and control, public health must take into account a wide range of operational, environmental, social and economic considerations in addition to intervention effectiveness. The current study aimed to identify, assess and rank possible risk reduction measures taking into account a broad set of criteria and perspectives applicable to the management of WNV in Quebec under increasing transmission risk scenarios, some of which may be related to ongoing warming in higher-latitude regions. A participatory approach was used to collect information on categories of concern to relevant stakeholders with respect to WNV prevention and control. Multi-criteria decision analysis was applied to examine stakeholder perspectives and their effect on strategy rankings under increasing transmission risk scenarios. Twenty-three preventive interventions were retained for evaluation using eighteen criteria identified by stakeholders. Combined evaluations revealed that, at an individual-level, inspecting window screen integrity, wearing light colored, long clothing, eliminating peridomestic larval sites and reducing outdoor activities at peak times were top interventions under six WNV transmission scenarios. At a regional-level, the use of larvicides was a preferred strategy in five out of six scenarios, while use of adulticides and dissemination of sterile male mosquitoes were found to be among the least favoured interventions in almost all scenarios. Our findings suggest that continued public health efforts aimed at reinforcing individual

  11. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    PubMed

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as being "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system. PMID:24489748
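    The framework itself is not reproduced in the abstract, but the additive scoring at the heart of Multi-Attribute Utility Theory can be sketched. The stream names, utilities, and weights below are invented for illustration, not drawn from the paper:

```python
import numpy as np

def maut_score(utilities, weights):
    """Additive multi-attribute utility: weighted sum of per-criterion utilities."""
    return float(np.asarray(utilities) @ np.asarray(weights))

# Hypothetical data streams scored in [0, 1] on timeliness, coverage, and
# cost-effectiveness (all already converted to utilities, higher = better).
streams = {"clinic reports":    [0.9, 0.6, 0.4],
           "web search trends": [0.7, 0.9, 0.9],
           "lab confirmations": [0.5, 0.8, 0.6]}
weights = [0.5, 0.3, 0.2]
for name, u in sorted(streams.items(), key=lambda kv: -maut_score(kv[1], weights)):
    print(f"{name}: {maut_score(u, weights):.2f}")
```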

  12. Assessing Interventions to Manage West Nile Virus Using Multi-Criteria Decision Analysis with Risk Scenarios

    PubMed Central

    Hongoh, Valerie; Campagna, Céline; Panic, Mirna; Samuel, Onil; Gosselin, Pierre; Waaub, Jean-Philippe; Ravel, André; Samoura, Karim; Michel, Pascal

    2016-01-01

    The recent emergence of West Nile virus (WNV) in North America highlights vulnerability to climate sensitive diseases and stresses the importance of preventive efforts to reduce their public health impact. Effective prevention involves reducing environmental risk of exposure and increasing adoption of preventive behaviours, both of which depend on knowledge and acceptance of such measures. When making operational decisions about disease prevention and control, public health must take into account a wide range of operational, environmental, social and economic considerations in addition to intervention effectiveness. The current study aimed to identify, assess and rank possible risk reduction measures taking into account a broad set of criteria and perspectives applicable to the management of WNV in Quebec under increasing transmission risk scenarios, some of which may be related to ongoing warming in higher-latitude regions. A participatory approach was used to collect information on categories of concern to relevant stakeholders with respect to WNV prevention and control. Multi-criteria decision analysis was applied to examine stakeholder perspectives and their effect on strategy rankings under increasing transmission risk scenarios. Twenty-three preventive interventions were retained for evaluation using eighteen criteria identified by stakeholders. Combined evaluations revealed that, at an individual-level, inspecting window screen integrity, wearing light colored, long clothing, eliminating peridomestic larval sites and reducing outdoor activities at peak times were top interventions under six WNV transmission scenarios. At a regional-level, the use of larvicides was a preferred strategy in five out of six scenarios, while use of adulticides and dissemination of sterile male mosquitoes were found to be among the least favoured interventions in almost all scenarios. Our findings suggest that continued public health efforts aimed at reinforcing individual

  13. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases

    PubMed Central

    2011-01-01

    The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular

  14. Selecting Essential Information for Biosurveillance—A Multi-Criteria Decision Analysis

    PubMed Central

    Generous, Nicholas; Margevicius, Kristen J.; Taylor-McCabe, Kirsten J.; Brown, Mac; Daniel, W. Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as “the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels.” However, the strategy does not specify how “essential information” is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as being “essential”. The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of “essential information” for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system. PMID:24489748

  15. Kepler mission exoplanet transit data analysis using fractal imaging

    NASA Astrophysics Data System (ADS)

    Dehipawala, S.; Tremberger, G.; Majid, Y.; Holden, T.; Lieberman, D.; Cheung, T.

    2012-10-01

    The Kepler mission is designed to survey a fist-sized patch of the sky within the Milky Way galaxy for the discovery of exoplanets, with emphasis on near Earth-size exoplanets in or near the habitable zone. The Kepler space telescope would detect the brightness fluctuation of a host star and extract periodic dimming in the lightcurve caused by exoplanets that cross in front of their host star. The photometric data of a host star could be interpreted as an image where fractal imaging would be applicable. Fractal analysis could elucidate the incomplete data limitation posed by the data integration window. The fractal dimension difference between the lower and upper halves of the image could be used to identify anomalies associated with transits and stellar activity as the buried signals are expected to be in the lower half of such an image. Using an image fractal dimension resolution of 0.04 and defining the whole image fractal dimension as the Chi-square expected value of the fractal dimension, a p-value can be computed and used to establish a numerical threshold for decision making that may be useful in further studies of lightcurves of stars with candidate exoplanets. Similar fractal dimension difference approaches would be applicable to the study of photometric time series data via the Higuchi method. The correlated randomness of the brightness data series could be used to support inferences based on image fractal dimension differences. Fractal compression techniques could be used to transform a lightcurve image, resulting in a new image with a new fractal dimension value, but this method has been found to be ineffective for images with high information capacity. The three studied criteria could be used together to further constrain the Kepler list of candidate lightcurves of stars with possible exoplanets that may be planned for ground-based telescope confirmation.
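    The Higuchi method mentioned for photometric time series can be sketched compactly; the synthetic lightcurve, toy transit model, and choice of kmax below are assumptions, not Kepler pipeline values:

```python
import numpy as np

def higuchi_fd(x, kmax=10):
    """Higuchi fractal dimension of a 1-D series (e.g. a photometric lightcurve)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    log_inv_k, log_L = [], []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):                      # k decimated sub-series
            idx = np.arange(m, N, k)
            if len(idx) < 2:
                continue
            length = np.abs(np.diff(x[idx])).sum()
            Lk.append(length * (N - 1) / ((len(idx) - 1) * k * k))
        log_inv_k.append(np.log(1.0 / k))
        log_L.append(np.log(np.mean(Lk)))
    return np.polyfit(log_inv_k, log_L, 1)[0]   # slope = fractal dimension

t = np.linspace(0.0, 30.0, 3000)
flux = 1.0 + 0.001 * np.random.default_rng(0).normal(size=t.size)
flux[(t % 10.0) < 0.2] -= 0.01                  # toy periodic transit dips
print(higuchi_fd(flux))
```

    A dimension near 2 indicates noise-dominated data, while buried periodic structure pulls the estimate down, which is the kind of contrast the abstract exploits.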

  16. Accuracy in Quantitative 3D Image Analysis

    PubMed Central

    Bassel, George W.

    2015-01-01

    Quantitative 3D imaging is becoming an increasingly popular and powerful approach to investigate plant growth and development. With the increased use of 3D image analysis, standards to ensure the accuracy and reproducibility of these data are required. This commentary highlights how image acquisition and postprocessing can introduce artifacts into 3D image data and proposes steps to increase both the accuracy and reproducibility of these analyses. It is intended to aid researchers entering the field of 3D image processing of plant cells and tissues and to help general readers in understanding and evaluating such data. PMID:25804539

  17. Item Response Theory Analysis of DSM-IV Cannabis Abuse and Dependence Criteria in Adolescents

    ERIC Educational Resources Information Center

    Hartman, Christie A.; Gelhorn, Heather; Crowley, Thomas J.; Sakai, Joseph T.; Stallings, Michael; Young, Susan E.; Rhee, Soo Hyun; Corley, Robin; Hewitt, John K.; Hopfer, Christian J.

    2008-01-01

    This study examines the DSM-IV criteria for cannabis abuse and dependence among adolescents. The results indicate that the abuse and dependence criteria did not reflect distinct levels of severity of cannabis involvement.

  18. Comparative Analysis of Thermoeconomic Evaluation Criteria for an Actual Heat Engine

    NASA Astrophysics Data System (ADS)

    Özel, Gülcan; Açıkkalp, Emin; Savaş, Ahmet Fevzi; Yamık, Hasan

    2016-07-01

    In the present study, an actual heat engine is investigated using different thermoeconomic evaluation criteria from the literature. A criterion that has not previously been investigated in detail is considered; it is termed the ecologico-economic criterion (F_{EC}) and is defined as the difference between the cost of the system's power output and the cost of its exergy destruction rate. All four criteria are applied to an irreversible Carnot heat engine; results are presented numerically and some suggestions are made.
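    In symbols (a sketch under assumed notation, since the abstract defines the criterion only verbally): with c_W and c_D the unit costs of power and of exergy destruction, \dot{W} the power output, and \dot{E}_D the exergy destruction rate,

```latex
F_{EC} = \dot{C}_W - \dot{C}_D = c_W \,\dot{W} - c_D \,\dot{E}_D
```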

  19. A multi-criteria decision analysis assessment of waste paper management options

    SciTech Connect

    Hanan, Deirdre; Burnley, Stephen; Cooke, David

    2013-03-15

    Highlights: ► Isolated communities have particular problems in terms of waste management. ► An MCDA tool allowed a group of non-experts to evaluate waste management options. ► The group preferred local waste management solutions to export to the mainland. ► Gasification of paper was the preferred option followed by recycling. ► The group concluded that they could be involved in the decision making process. - Abstract: The use of Multi-criteria Decision Analysis (MCDA) was investigated in an exercise using a panel of local residents and stakeholders to assess the options for managing waste paper on the Isle of Wight. Seven recycling, recovery and disposal options were considered by the panel who evaluated each option against seven environmental, financial and social criteria. The panel preferred options where the waste was managed on the island with gasification and recycling achieving the highest scores. Exporting the waste to the English mainland for incineration or landfill proved to be the least preferred options. This research has demonstrated that MCDA is an effective way of involving community groups in waste management decision making.

  20. Multi-criteria decision analysis for waste management in Saharawi refugee camps

    SciTech Connect

    Garfi, M.; Tondelli, S.; Bonoli, A.

    2009-10-15

    The aim of this paper is to compare different waste management solutions in Saharawi refugee camps (Algeria) and to test the feasibility of a decision-making method developed to be applied in particular conditions in which environmental and social aspects must be considered. It is based on multi-criteria analysis, and in particular on the analytic hierarchy process (AHP), a mathematical technique for multi-criteria decision making (Saaty, T.L., 1980. The Analytic Hierarchy Process. McGraw-Hill, New York, USA; Saaty, T.L., 1990. How to Make a Decision: The Analytic Hierarchy Process. European Journal of Operational Research; Saaty, T.L., 1994. Decision Making for Leaders: The Analytic Hierarchy Process in a Complex World. RWS Publications, Pittsburgh, PA), and on a participatory approach focusing on the local community's concerns. The research compares four different waste collection and management alternatives: waste collection by using three tipper trucks, disposal and burning in an open area; waste collection by using seven dumpers and disposal in a landfill; waste collection by using seven dumpers and three tipper trucks and disposal in a landfill; waste collection by using three tipper trucks and disposal in a landfill. The results show that the second and the third solutions provide better scenarios for waste management. Furthermore, the discussion of the results points out the multidisciplinarity of the approach, and the equilibrium between social, environmental and technical impacts. This is a very important aspect in a humanitarian and environmental project, confirming the appropriateness of the chosen method.
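    The AHP step cited from Saaty can be sketched as a principal-eigenvector computation on a pairwise comparison matrix; the matrix below encodes hypothetical judgments, not the study's elicited values:

```python
import numpy as np

# Saaty 1-9 pairwise comparison matrix for three criteria
# (environmental, social, technical) -- illustrative judgments only.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # principal-eigenvector priority weights

ci = (eigvals.real[k] - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58                            # Saaty's random index for n = 3
print("weights:", w.round(3), " consistency ratio:", round(cr, 3))
```

    A consistency ratio below 0.1 is conventionally taken to mean the judgments are acceptably coherent before the weights are used to score alternatives.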

  1. Optical Analysis of Microscope Images

    NASA Astrophysics Data System (ADS)

    Biles, Jonathan R.

    Microscope images were analyzed with coherent and incoherent light using analog optical techniques. These techniques were found to be useful for analyzing large numbers of nonsymbolic, statistical microscope images. In the first part phase coherent transparencies having 20-100 human multiple myeloma nuclei were simultaneously photographed at 100 power magnification using high resolution holographic film developed to high contrast. An optical transform was obtained by focussing the laser onto each nuclear image and allowing the diffracted light to propagate onto a one dimensional photosensor array. This method reduced the data to the position of the first two intensity minima and the intensity of successive maxima. These values were utilized to estimate the four most important cancer detection clues of nuclear size, shape, darkness, and chromatin texture. In the second part, the geometric and holographic methods of phase incoherent optical processing were investigated for pattern recognition of real-time, diffuse microscope images. The theory and implementation of these processors was discussed in view of their mutual problems of dimness, image bias, and detector resolution. The dimness problem was solved by either using a holographic correlator or a speckle free laser microscope. The latter was built using a spinning tilted mirror which caused the speckle to change so quickly that it averaged out during the exposure. To solve the bias problem low image bias templates were generated by four techniques: microphotography of samples, creation of typical shapes by computer graphics editor, transmission holography of photoplates of samples, and by spatially coherent color image bias removal. The first of these templates was used to perform correlations with bacteria images. The aperture bias was successfully removed from the correlation with a video frame subtractor. To overcome the limited detector resolution it is necessary to discover some analog nonlinear intensity

  2. Objective analysis of image quality of video image capture systems

    NASA Astrophysics Data System (ADS)

    Rowberg, Alan H.

    1990-07-01

    As Picture Archiving and Communication System (PACS) technology has matured, video image capture has become a common way of capturing digital images from many modalities. While digital interfaces, such as those which use the ACR/NEMA standard, will become more common in the future, and are preferred because of the accuracy of image transfer, video image capture will be the dominant method in the short term, and may continue to be used for some time because of the low cost and high speed often associated with such devices. Currently, virtually all installed systems use methods of digitizing the video signal that is produced for display on the scanner viewing console itself. A series of digital test images has been developed for display on either a GE CT9800 or a GE Signa MRI scanner. These images have been captured with each of five commercially available image capture systems, and the resultant images digitally transferred on floppy disk to a PC1286 computer containing Optimast image analysis software. Here the images can be displayed in a comparative manner for visual evaluation, in addition to being analyzed statistically. Each of the images has been designed to support certain tests, including noise, accuracy, linearity, gray scale range, stability, slew rate, and pixel alignment. These image capture systems vary widely in these characteristics, in addition to the presence or absence of other artifacts, such as shading and moiré pattern. Other accessories such as video distribution amplifiers and noise filters can also add or modify artifacts seen in the captured images, often giving unusual results. Each image is described, together with the tests that were performed using it. One image contains alternating black and white lines, each one pixel wide, after equilibration strips ten pixels wide. While some systems have a slew rate fast enough to track this correctly, others blur it to an average shade of gray, and do not resolve the lines, or give

  3. Spatial multi-criteria decision analysis to predict suitability for African swine fever endemicity in Africa

    PubMed Central

    2014-01-01

    Background African swine fever (ASF) is endemic in several countries of Africa and may pose a risk to all pig producing areas on the continent. Official ASF reporting is often rare and there remains limited awareness of the continent-wide distribution of the disease. In the absence of accurate ASF outbreak data and few quantitative studies on the epidemiology of the disease in Africa, we used spatial multi-criteria decision analysis (MCDA) to derive predictions of the continental distribution of suitability for ASF persistence in domestic pig populations as part of sylvatic or domestic transmission cycles. In order to incorporate the uncertainty in the relative importance of different criteria in defining suitability, we modelled decisions within the MCDA framework using a stochastic approach. The predictive performance of suitability estimates was assessed via a partial ROC analysis using ASF outbreak data reported to the OIE since 2005. Results Outputs from the spatial MCDA indicate that large areas of sub-Saharan Africa may be suitable for ASF persistence as part of either domestic or sylvatic transmission cycles. Areas with high suitability for pig to pig transmission (‘domestic cycles’) were estimated to occur throughout sub-Saharan Africa, whilst areas with high suitability for introduction from wildlife reservoirs (‘sylvatic cycles’) were found predominantly in East, Central and Southern Africa. Based on average AUC ratios from the partial ROC analysis, the predictive ability of suitability estimates for domestic cycles alone was considerably higher than suitability estimates for sylvatic cycles alone, or domestic and sylvatic cycles in combination. Conclusions This study provides the first standardised estimates of the distribution of suitability for ASF transmission associated with domestic and sylvatic cycles in Africa. We provide further evidence for the utility of knowledge-driven risk mapping in animal health, particularly in data

  4. Secure thin client architecture for DICOM image analysis

    NASA Astrophysics Data System (ADS)

    Mogatala, Harsha V. R.; Gallet, Jacqueline

    2005-04-01

    This paper presents a concept of Secure Thin Client (STC) Architecture for Digital Imaging and Communications in Medicine (DICOM) image analysis over the Internet. STC Architecture provides in-depth analysis and design of customized reports for DICOM images using drag-and-drop and data warehouse technology. Using a personal computer and a common set of browsing software, STC can be used for analyzing and reporting detailed patient information, type of examinations, date, Computed Tomography (CT) dose index, and other relevant information stored within the images' header files as well as in the hospital databases. STC Architecture is a three-tier architecture. The First Tier consists of a drag-and-drop web-based interface and a web server, which provides customized analysis and reporting ability to the users. The Second Tier consists of an online analytical processing (OLAP) server and a database system, which serves fast, real-time, aggregated multi-dimensional data using OLAP technology. The Third Tier consists of a smart algorithm-based software program which extracts DICOM tags from CT images in this particular application, irrespective of CT vendor, and transfers these tags into a secure database system. This architecture provides Winnipeg Regional Health Authorities (WRHA) with quality indicators for CT examinations in the hospitals. It also provides health care professionals with an analytical tool to optimize radiation dose and image quality parameters. The information is provided to the user by way of a secure socket layer (SSL) and role-based security criteria over the Internet. Although this particular application has been developed for WRHA, this paper also discusses the effort to extend the Architecture to other hospitals in the region. Any DICOM tag from any imaging modality could be tracked with this software.
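    The paper describes purpose-built extraction software; as a hedged sketch of the same idea using the open-source pydicom package (an assumption, not the authors' tool), vendor-independent header extraction might look like:

```python
import pydicom  # common open-source DICOM library

def extract_tags(path):
    """Read a few header fields from a DICOM file without loading pixel data."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return {
        "modality":   getattr(ds, "Modality", None),
        "study_date": getattr(ds, "StudyDate", None),
        "ctdi_vol":   getattr(ds, "CTDIvol", None),  # CT dose index, when present
    }

# print(extract_tags("exam0001.dcm"))  # hypothetical file path
```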

  5. Scale-Specific Multifractal Medical Image Analysis

    PubMed Central

    Braverman, Boris

    2013-01-01

    Fractal geometry has been applied widely in the analysis of medical images to characterize the irregular complex tissue structures that do not lend themselves to straightforward analysis with traditional Euclidean geometry. In this study, we treat the nonfractal behaviour of medical images over large-scale ranges by considering their box-counting fractal dimension as a scale-dependent parameter rather than a single number. We describe this approach in the context of the more generalized Rényi entropy, in which we can also compute the information and correlation dimensions of images. In addition, we describe and validate a computational improvement to box-counting fractal analysis. This improvement is based on integral images, which allows the speedup of any box-counting or similar fractal analysis algorithm, including estimation of scale-dependent dimensions. Finally, we applied our technique to images of invasive breast cancer tissue from 157 patients to show a relationship between the fractal analysis of these images over certain scale ranges and pathologic tumour grade (a standard prognosticator for breast cancer). Our approach is general and can be applied to any medical imaging application in which the complexity of pathological image structures may have clinical value. PMID:24023588
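
    The integral-image speedup mentioned above can be illustrated directly: a summed-area table gives each box sum in constant time, and local slopes of log N(eps) against log eps yield a scale-dependent dimension rather than a single number. A generic sketch, not the authors' code:

```python
import numpy as np

def box_counts(binary, sizes):
    """Count occupied boxes per scale, each box sum in O(1) via an integral image."""
    h, w = binary.shape
    s = np.zeros((h + 1, w + 1), dtype=np.int64)
    s[1:, 1:] = binary.astype(np.int64).cumsum(0).cumsum(1)   # summed-area table
    counts = []
    for eps in sizes:
        r0 = np.arange(0, h, eps); c0 = np.arange(0, w, eps)
        r1 = np.minimum(r0 + eps, h); c1 = np.minimum(c0 + eps, w)
        sums = (s[np.ix_(r1, c1)] - s[np.ix_(r0, c1)]
                - s[np.ix_(r1, c0)] + s[np.ix_(r0, c0)])
        counts.append(int((sums > 0).sum()))
    return np.array(counts)

binary = np.random.default_rng(0).random((256, 256)) > 0.7   # toy binary image
sizes = np.array([2, 4, 8, 16, 32])
n = box_counts(binary, sizes)
local_dims = -np.diff(np.log(n)) / np.diff(np.log(sizes))    # one estimate per scale pair
```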

  6. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms the OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
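
    A much-simplified analogue of the weight-sensitivity idea: instead of the full Monte Carlo and global sensitivity analysis used in the paper, this sketch perturbs one criterion weight at a time and reports the mean absolute change of the susceptibility surface; the names and the perturbation size are assumptions.

```python
import numpy as np

def weight_sensitivity(layers, base_w, rel_change=0.1):
    """One-at-a-time sensitivity of a weighted-overlay susceptibility map."""
    base = np.tensordot(base_w, layers, axes=1)       # baseline susceptibility
    impact = {}
    for i in range(len(base_w)):
        for sign in (+1.0, -1.0):
            w = np.array(base_w, dtype=float)
            w[i] *= 1.0 + sign * rel_change
            w /= w.sum()                              # weights must still sum to 1
            impact[(i, sign)] = float(
                np.abs(np.tensordot(w, layers, axes=1) - base).mean())
    return impact
```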

  7. Difference Image Analysis of Galactic Microlensing. II. Microlensing Events

    SciTech Connect

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.

    1999-09-01

    The MACHO collaboration has been carrying out difference image analysis (DIA) since 1996 with the aim of increasing the sensitivity to the detection of gravitational microlensing. This is a preliminary report on the application of DIA to galactic bulge images in one field. We show how the DIA technique significantly increases the number of detected lensing events, by removing the positional dependence of traditional photometry schemes and lowering the microlensing event detection threshold. This technique, unlike PSF photometry, gives the unblended colors and positions of the microlensing source stars. We present a set of criteria for selecting microlensing events from objects discovered with this technique. The 16 pixel and classical microlensing events discovered with the DIA technique are presented. (c) 1999 The American Astronomical Society.

  8. Materials characterization through quantitative digital image analysis

    SciTech Connect

    J. Philliber; B. Antoun; B. Somerday; N. Yang

    2000-07-01

    A digital image analysis system has been developed to allow advanced quantitative measurement of microstructural features. This capability is maintained as part of the microscopy facility at Sandia, Livermore. The system records images digitally, eliminating the use of film. Images obtained from other sources may also be imported into the system. Subsequent digital image processing enhances image appearance through contrast and brightness adjustments. The system measures a variety of user-defined microstructural features, including area fraction, particle size and spatial distributions, grain sizes, and orientations of elongated particles. These measurements are made in a semi-automatic mode through the use of macro programs and a computer-controlled translation stage. A routine has been developed to create large montages of 50+ separate images. Individual image frames are matched to the nearest pixel to create seamless montages. Results from three different studies are presented to illustrate the capabilities of the system.

  9. Launch commit criteria performance trending analysis, phase 1, revision A. SRM and QA mission services

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An assessment of quantitative methods and measures for trending launch commit criteria (LCC) performance is made. A statistical performance trending analysis pilot study was processed and compared to STS-26 mission data. This study used four selected shuttle measurement types (solid rocket booster, external tank, space shuttle main engine, and range safety switch safe and arm device) from the five missions prior to mission 51-L. After obtaining raw data coordinates, each set of measurements was processed to obtain statistical confidence bounds and mean data profiles for each of the selected measurement types. STS-26 measurements were compared to the statistical database profiles to verify the statistical capability of assessing occurrences of data trend anomalies and abnormal time-varying operational conditions associated with data amplitude and phase shifts.
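
    The trending logic sketched in this abstract amounts to building a mean profile with k-sigma confidence bounds from prior missions and flagging excursions in a new mission; a minimal sketch, assuming time-aligned measurement histories and Gaussian-style bounds.

```python
import numpy as np

def trend_envelope(histories, k=2.0):
    """Mean profile and +/- k*sigma bounds from prior-mission time histories.

    histories : (n_missions, n_samples) array for one LCC measurement type,
                assumed time-aligned across missions.
    """
    mean = histories.mean(axis=0)
    sigma = histories.std(axis=0, ddof=1)
    return mean, mean - k * sigma, mean + k * sigma

def flag_excursions(profile, lower, upper):
    """Sample indices where a new mission's profile leaves the envelope."""
    return np.where((profile < lower) | (profile > upper))[0]
```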

  10. Harnessing Ecosystem Models and Multi-Criteria Decision Analysis for the Support of Forest Management

    NASA Astrophysics Data System (ADS)

    Wolfslehner, Bernhard; Seidl, Rupert

    2010-12-01

    The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.

  11. Factor Analysis of the Image Correlation Matrix.

    ERIC Educational Resources Information Center

    Kaiser, Henry F.; Cerny, Barbara A.

    1979-01-01

    Whether to factor the image correlation matrix or to use a new model with an alpha factor analysis of it is discussed, with particular reference to the determinacy problem. It is pointed out that the distribution of the images is sensibly multivariate normal, making for "better" factor analyses. (Author/CTM)

  12. Viewing angle analysis of integral imaging

    NASA Astrophysics Data System (ADS)

    Wang, Hong-Xia; Wu, Chun-Hong; Yang, Yang; Zhang, Lan

    2007-12-01

    Integral imaging (II) is a technique capable of displaying 3D images with continuous parallax in full natural color. Owing to its outstanding advantages, it is becoming one of the most promising techniques for next-generation three-dimensional TV (3DTV) and visualization. However, most conventional integral images are restricted by a narrow viewing angle. One reason is that the range in which a reconstructed integral image can be displayed with consistent parallax is limited; the other is that the aperture of the system is finite. Many methods to enhance the viewing angle of integral images have been proposed. Nevertheless, except for Ren's MVW (Maximum Viewing Width), most of these methods involve complex hardware and modifications of the optical system, which usually bring other disadvantages, make operation more difficult, and raise the cost of the system. In order to simplify optical systems, this paper systematically analyzes the viewing angle of traditional integral images rather than modified ones; to keep costs low, the research was based on computer-generated integral images (CGII). The analysis clarifies how the viewing angle can be enhanced and how image overlap or image flipping can be avoided, and it also supports the development of optical instruments. Based on the theoretical analysis, preliminary calculations were carried out to demonstrate how other viewing properties closely related to the viewing angle, such as viewing distance, viewing zone, and lens pitch, affect the viewing angle.
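
    For orientation, a first-order estimate often quoted in the II literature (not taken from this abstract) relates the viewing angle to the lens pitch p and the lens-to-display gap g as theta = 2*arctan(p/(2g)):

```python
import math

def viewing_angle_deg(pitch_mm, gap_mm):
    """First-order viewing-angle estimate for a lens-array integral display."""
    return math.degrees(2.0 * math.atan(pitch_mm / (2.0 * gap_mm)))

print(viewing_angle_deg(1.0, 3.0))  # ~18.9 degrees for p = 1 mm, g = 3 mm
```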

  13. Multi-level multi-criteria analysis of alternative fuels for waste collection vehicles in the United States.

    PubMed

    Maimoun, Mousa; Madani, Kaveh; Reinhart, Debra

    2016-04-15

    Historically, the U.S. waste collection fleet was dominated by diesel-fueled waste collection vehicles (WCVs); the growing need for sustainable waste collection has urged decision makers to incorporate economically efficient alternative fuels while mitigating environmental impacts. The pros and cons of alternative fuels complicate the decision-making process, calling for a comprehensive study that assesses the multiple factors involved. Multi-criteria decision analysis (MCDA) methods allow decision makers to select the best alternatives with respect to selection criteria. In this study, two MCDA methods, Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) and Simple Additive Weighting (SAW), were used to rank fuel alternatives for the U.S. waste collection industry with respect to a multi-level environmental and financial decision matrix. The environmental criteria consisted of life-cycle emissions, tail-pipe emissions, water footprint (WFP), and power density, while the financial criteria comprised vehicle cost, fuel price, fuel price stability, and fueling station availability. The overall analysis showed that conventional diesel is still the best option, followed by hydraulic-hybrid WCVs, landfill gas (LFG) sourced natural gas, fossil natural gas, and biodiesel. The elimination of the WFP and power density criteria from the environmental criteria ranked biodiesel 100 (BD100) as an environmentally better alternative compared to other fossil fuels (diesel and natural gas). This result showed that considering the WFP and power density as environmental criteria can make a difference in the decision process. The elimination of the fueling station and fuel price stability criteria from the decision matrix ranked fossil natural gas second after LFG-sourced natural gas. This scenario was found to represent the status quo of the waste collection industry. A sensitivity analysis for the status quo scenario showed the overall ranking of diesel and
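
    A compact TOPSIS implementation of the kind used in such studies; the decision matrix below is purely illustrative and does not reproduce the paper's criteria values.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix  : (m, n) decision matrix, alternatives x criteria
    weights : (n,) criterion weights summing to 1
    benefit : (n,) booleans, True where larger values are better
    """
    v = matrix / np.linalg.norm(matrix, axis=0) * weights   # weighted normalized
    ideal = np.where(benefit, v.max(0), v.min(0))
    anti = np.where(benefit, v.min(0), v.max(0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                          # closeness, higher is better

# Hypothetical fuels scored on (fuel price, life-cycle emissions), both costs.
scores = topsis(np.array([[3.0, 10.0], [2.5, 14.0], [3.5, 6.0]]),
                np.array([0.5, 0.5]), np.array([False, False]))
ranking = scores.argsort()[::-1]  # best alternative first
```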

  14. Depth-based selective image reconstruction using spatiotemporal image analysis

    NASA Astrophysics Data System (ADS)

    Haga, Tetsuji; Sumi, Kazuhiko; Hashimoto, Manabu; Seki, Akinobu

    1999-03-01

    In industrial plants, a remote monitoring system that eliminates physical inspection tours is often considered desirable. However, the image sequence obtained from a mobile inspection robot is hard to interpret because objects of interest are often partially occluded by obstacles such as pillars or fences. Our aim is to improve the image sequence so as to increase the efficiency and reliability of remote visual inspection. We propose a new depth-based image processing technique, which removes needless objects from the foreground and recovers the occluded background electronically. Our algorithm is based on spatiotemporal analysis that enables fine and dense depth estimation, depth-based precise segmentation, and accurate interpolation. We apply this technique to a real image sequence obtained from the mobile inspection robot. The resulting image sequence is satisfactory in that the operator can make correct visual inspections with less fatigue.

  15. A Robust Actin Filaments Image Analysis Framework.

    PubMed

    Alioscha-Perez, Mitchel; Benadiba, Carine; Goossens, Katty; Kasas, Sandor; Dietler, Giovanni; Willaert, Ronnie; Sahli, Hichem

    2016-08-01

    The cytoskeleton is a highly dynamical protein network that plays a central role in numerous cellular physiological processes, and is traditionally divided into three components according to its chemical composition, i.e. actin, tubulin and intermediate filament cytoskeletons. Understanding the cytoskeleton dynamics is of prime importance to unveil mechanisms involved in cell adaptation to any stress type. Fluorescence imaging of cytoskeleton structures allows analyzing the impact of mechanical stimulation in the cytoskeleton, but it also imposes additional challenges in the image processing stage, such as the presence of imaging-related artifacts and heavy blurring introduced by (high-throughput) automated scans. However, although there exists a considerable number of image-based analytical tools to address the image processing and analysis, most of them are unfit to cope with the aforementioned challenges. Filamentous structures in images can be considered as a piecewise composition of quasi-straight segments (at least in some finer or coarser scale). Based on this observation, we propose a three-step actin filament extraction methodology: (i) first the input image is decomposed into a 'cartoon' part corresponding to the filament structures in the image, and a noise/texture part, (ii) on the 'cartoon' image, we apply a multi-scale line detector coupled with a (iii) quasi-straight filaments merging algorithm for fiber extraction. The proposed robust actin filaments image analysis framework allows extracting individual filaments in the presence of noise, artifacts and heavy blurring. Moreover, it provides numerous parameters such as filament orientation, position and length, useful for further analysis. Cell image decomposition is relatively under-exploited in biological image processing, and our study shows the benefits it provides when addressing such tasks. Experimental validation was conducted using publicly available datasets, and in osteoblasts grown in
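
    A rough sketch of the three-step idea, with total-variation denoising standing in for the authors' cartoon/texture decomposition and the Sato ridge filter standing in for their multi-scale line detector; the filament-merging step is omitted.

```python
from skimage.restoration import denoise_tv_chambolle
from skimage.filters import sato

def filament_map(image, weight=0.1, sigmas=(1, 2, 3)):
    """image: float grayscale in [0, 1]. Returns a binary filament mask."""
    cartoon = denoise_tv_chambolle(image, weight=weight)  # smooth structure part
    # The residual (image - cartoon) is the discarded noise/texture part.
    ridges = sato(cartoon, sigmas=sigmas, black_ridges=False)
    return ridges > ridges.mean() + 2.0 * ridges.std()    # crude binarization
```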

  17. Brown Adipose Reporting Criteria in Imaging STudies (BARCIST 1.0): Recommendations for Standardized FDG-PET/CT Experiments in Humans.

    PubMed

    Chen, Kong Y; Cypess, Aaron M; Laughlin, Maren R; Haft, Carol R; Hu, Houchun Harry; Bredella, Miriam A; Enerbäck, Sven; Kinahan, Paul E; Lichtenbelt, Wouter van Marken; Lin, Frank I; Sunderland, John J; Virtanen, Kirsi A; Wahl, Richard L

    2016-08-01

    Human brown adipose tissue (BAT) presence, metabolic activity, and estimated mass are typically measured by imaging [18F]fluorodeoxyglucose (FDG) uptake in response to cold exposure in regions of the body expected to contain BAT, using positron emission tomography combined with X-ray computed tomography (FDG-PET/CT). Efforts to describe the epidemiology and biology of human BAT are hampered by diverse experimental practices, making it difficult to directly compare results among laboratories. An expert panel was assembled by the National Institute of Diabetes and Digestive and Kidney Diseases on November 4, 2014 to discuss minimal requirements for conducting FDG-PET/CT experiments of human BAT, data analysis, and publication of results. This resulted in Brown Adipose Reporting Criteria in Imaging STudies (BARCIST 1.0). Since there are no fully validated best practices at this time, panel recommendations are meant to enhance comparability across experiments, but not to constrain experimental design or the questions that can be asked. PMID:27508870

  18. Linear digital imaging system fidelity analysis

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.

    1989-01-01

    The combined effects of image gathering, sampling and reconstruction are analyzed in terms of image fidelity. The analysis is based upon a standard end-to-end linear system model which is sufficiently general so that the results apply to most line-scan and sensor-array imaging systems. Shift-variant sampling effects are accounted for with an expected-value analysis based upon the use of a fixed deterministic input scene which is randomly shifted (mathematically) relative to the sampling grid. This random sample-scene phase approach has been used successfully by the author and associates in several previous related papers.

  19. Infrared image processing and data analysis

    NASA Astrophysics Data System (ADS)

    Ibarra-Castanedo, C.; González, D.; Klein, M.; Pilla, M.; Vallerand, S.; Maldague, X.

    2004-12-01

    Infrared thermography in nondestructive testing provides images (thermograms) in which zones of interest (defects) sometimes appear only as subtle signatures. In this context, raw images are often not appropriate, since most defects will be missed. In other cases, a quantitative analysis is needed, such as for defect detection and characterization. In this paper, various methods of data analysis required for preprocessing and/or processing of images are presented. References from the literature are provided for known methods, which are discussed briefly, while novelties are elaborated in more detail within the text, together with experimental results.

  1. Malware analysis using visualized image matrices.

    PubMed

    Han, KyoungSoo; Kang, BooJoong; Im, Eul Gyu

    2014-01-01

    This paper proposes a novel malware visual analysis method that contains not only a visualization method to convert binary files into images, but also a similarity calculation method between these images. The proposed method generates RGB-colored pixels on image matrices using the opcode sequences extracted from malware samples and calculates the similarities for the image matrices. In particular, our proposed methods can handle packed malware samples by applying them to the execution traces extracted through dynamic analysis. When the images are generated, we can reduce the overheads by extracting the opcode sequences only from the blocks that include the instructions related to staple behaviors such as functions and application programming interface (API) calls. In addition, we propose a technique that generates a representative image for each malware family in order to reduce the number of comparisons for the classification of unknown samples; the colored pixel information in the image matrices is used to calculate the similarities between the images. Our experimental results show that the image matrices of malware can effectively be used to classify malware families both statically and dynamically with accuracy of 0.9896 and 0.9732, respectively. PMID:25133202
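
    A toy analogue of the construction: hash consecutive opcode pairs to pixel positions, accumulate RGB values, and compare the matrices with cosine similarity. The paper's exact mapping and similarity measure differ; everything below is illustrative.

```python
import hashlib
import numpy as np

def opcode_image(opcodes, size=64):
    """Accumulate hashed opcode pairs into an RGB image matrix."""
    img = np.zeros((size, size, 3))
    for a, b in zip(opcodes, opcodes[1:]):
        h = hashlib.md5(f"{a}:{b}".encode()).digest()
        img[h[0] % size, h[1] % size] += np.array(list(h[2:5]), dtype=float)
    return img

def similarity(x, y):
    """Cosine similarity between two flattened image matrices."""
    a, b = x.ravel(), y.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```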

  2. Image texture analysis of crushed wheat kernels

    NASA Astrophysics Data System (ADS)

    Zayas, Inna Y.; Martin, C. R.; Steele, James L.; Dempster, Richard E.

    1992-03-01

    The development of new approaches for wheat hardness assessment may impact the grain industry in marketing, milling, and breeding. This study used image texture features for wheat hardness evaluation. Application of digital imaging to grain for grading purposes is principally based on morphometrical (shape and size) characteristics of the kernels. A composite sample of 320 kernels from 17 wheat varieties was collected after testing and crushing with a single-kernel hardness characterization meter. Six wheat classes were represented: HRW, HRS, SRW, SWW, Durum, and Club. In this study, parameters which characterize the texture or spatial distribution of gray levels of an image were determined and used to classify images of crushed wheat kernels. The texture parameters of crushed wheat kernel images differed depending on the class, hardness, and variety of the wheat. Image texture analysis of crushed wheat kernels showed promise for use in class, hardness, milling quality, and variety discrimination.
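
    Texture parameters of this kind are commonly derived from grey-level co-occurrence matrices; a sketch using scikit-image (the abstract does not state the paper's exact feature set):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(gray_u8):
    """Co-occurrence texture features of an 8-bit grayscale patch."""
    glcm = graycomatrix(gray_u8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return {p: float(graycoprops(glcm, p).mean())
            for p in ("contrast", "homogeneity", "energy", "correlation")}

# Toy patch standing in for a crushed-kernel image region.
patch = (np.random.default_rng(0).random((64, 64)) * 255).astype(np.uint8)
features = texture_features(patch)
```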

  3. Breast tomosynthesis imaging configuration analysis.

    PubMed

    Rayford, Cleveland E; Zhou, Weihua; Chen, Ying

    2013-01-01

    Traditional two-dimensional (2D) X-ray mammography is the most commonly used method for breast cancer diagnosis. Recently, a three-dimensional (3D) Digital Breast Tomosynthesis (DBT) system has been invented, which is likely to challenge the current mammography technology. The DBT system provides stunning 3D information, giving physicians increased detail of anatomical information, while reducing the chance of false negative screening. In this research, two reconstruction algorithms, Back Projection (BP) and Shift-And-Add (SAA), were used to investigate and compare View Angle (VA) and the number of projection images (N) with parallel imaging configurations. In addition, in order to better determine which method displayed better-quality imaging, Modulation Transfer Function (MTF) analyses were conducted with both algorithms, ultimately producing results which support better breast cancer detection. Research studies find evidence that early detection of the disease is the best way to conquer breast cancer, and earlier detection increases the life span of the affected person. PMID:23900440
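
    A minimal shift-and-add sketch under a parallel-geometry approximation: each projection is shifted in proportion to tan(angle) for the plane of interest and the shifted projections are averaged. Real DBT geometry, and the paper's BP implementation, are more involved.

```python
import numpy as np

def shift_and_add(projections, angles_deg, z, pixel_pitch=1.0):
    """Bring one plane at height z into focus by shifting and averaging."""
    plane = np.zeros_like(projections[0], dtype=float)
    for proj, ang in zip(projections, angles_deg):
        shift = int(round(z * np.tan(np.radians(ang)) / pixel_pitch))
        plane += np.roll(proj, shift, axis=1)  # integer shift along scan axis
    return plane / len(projections)
```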

  4. Breast cancer histopathology image analysis: a review.

    PubMed

    Veta, Mitko; Pluim, Josien P W; van Diest, Paul J; Viergever, Max A

    2014-05-01

    This paper presents an overview of methods that have been proposed for the analysis of breast cancer histopathology images. This research area has become particularly relevant with the advent of whole slide imaging (WSI) scanners, which can perform cost-effective and high-throughput histopathology slide digitization, and which aim at replacing the optical microscope as the primary tool used by pathologists. Breast cancer is the most prevalent form of cancer among women, and image analysis methods that target this disease have a huge potential to reduce the workload in a typical pathology lab and to improve the quality of the interpretation. This paper is meant as an introduction for nonexperts. It starts with an overview of the tissue preparation, staining and slide digitization processes, followed by a discussion of the different image processing techniques and applications, ranging from analysis of tissue staining to computer-aided diagnosis and prognosis of breast cancer patients. PMID:24759275

  5. Principal component analysis of scintimammographic images.

    PubMed

    Bonifazzi, Claudio; Cinti, Maria Nerina; Vincentis, Giuseppe De; Finos, Livio; Muzzioli, Valerio; Betti, Margherita; Nico, Lanconelli; Tartari, Agostino; Pani, Roberto

    2006-01-01

    The recent development of new gamma imagers based on scintillation arrays with high spatial resolution has strongly improved the possibility of detecting sub-centimeter cancers in scintimammography. However, Compton scattering contamination remains the main drawback, since it limits the sensitivity of tumor detection. Principal component image analysis (PCA), recently introduced in scintimammographic imaging, is a data reduction technique able to represent the radiation emitted from the chest, healthy breast tissue, and damaged tissue as separate images. From these images a scintimammographic image can be obtained in which the Compton contamination is "removed". In the present paper we compared the PCA-reconstructed images with the conventional scintimammographic images resulting from the photopeak (Ph) energy window. Data coming from a clinical trial were used. For both kinds of images the tumor presence was quantified by evaluating Student's t statistic for independent samples as a measure of the signal-to-noise ratio (SNR). Owing to the absence of Compton scattering, the PCA-reconstructed images show better noise suppression and allow more reliable diagnostics in comparison with the images obtained from the photopeak energy window, reducing the tendency to produce false positives. PMID:17646004
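
    The PCA step can be illustrated generically: a stack of co-registered images is centred pixel-wise and decomposed by SVD into component images; a textbook sketch, not the authors' pipeline.

```python
import numpy as np

def pca_component_images(images, n_components=3):
    """images: (n_images, rows, cols). Returns component images and singular values."""
    n, r, c = images.shape
    x = images.reshape(n, -1)
    x = x - x.mean(axis=0)                      # centre each pixel over the stack
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    return vt[:n_components].reshape(-1, r, c), s[:n_components]
```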

  6. Image analysis in comparative genomic hybridization

    SciTech Connect

    Lundsteen, C.; Maahr, J.; Christensen, B.

    1995-01-01

    Comparative genomic hybridization (CGH) is a new technique by which genomic imbalances can be detected by combining in situ suppression hybridization of whole genomic DNA and image analysis. We have developed software for rapid, quantitative CGH image analysis by a modification and extension of the standard software used for routine karyotyping of G-banded metaphase spreads in the Magiscan chromosome analysis system. The DAPI-counterstained metaphase spread is karyotyped interactively. Corrections for image shifts between the DAPI, FITC, and TRITC images are done manually by moving the three images relative to each other. The fluorescence background is subtracted. A mean filter is applied to smooth the FITC and TRITC images before the fluorescence ratio between the individual FITC- and TRITC-stained chromosomes is computed pixel by pixel inside the area of the chromosomes determined by the DAPI boundaries. Fluorescence intensity ratio profiles are generated, and peaks and valleys indicating possible gains and losses of test DNA are marked if the ratios fall below 0.75 or rise above 1.25. By combining the analysis of several metaphase spreads, consistent findings of gains and losses in all or almost all spreads indicate chromosomal imbalance. Chromosomal imbalances are detected either by visual inspection of fluorescence ratio (FR) profiles or by a statistical approach that compares FR measurements of the individual case with measurements of normal chromosomes. The complete analysis of one metaphase can be carried out in approximately 10 minutes. 8 refs., 7 figs., 1 tab.
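
    The ratio-profile logic maps directly onto a few lines of array code; a sketch assuming FITC labels the test DNA and the chromosome has been straightened so rows run along its axis.

```python
import numpy as np

def ratio_profile(fitc, tritc, mask, lo=0.75, hi=1.25):
    """Axis-wise FITC/TRITC ratio profile with gain/loss flags."""
    ratio = np.where(mask, fitc / np.maximum(tritc, 1e-9), np.nan)
    profile = np.nanmean(ratio, axis=1)         # average across chromosome width
    gains = np.where(profile > hi)[0]           # possible gains of test DNA
    losses = np.where(profile < lo)[0]          # possible losses
    return profile, gains, losses
```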

  7. Multi-criteria decision analysis in environmental sciences: ten years of applications and trends.

    PubMed

    Huang, Ivy B; Keisler, Jeffrey; Linkov, Igor

    2011-09-01

    Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by various stakeholder views. Multi-criteria decision analysis (MCDA) emerged as a formal methodology for integrating available technical information and stakeholder values to support decisions in many fields, and can be especially valuable in environmental decision making. This study reviews environmental applications of MCDA. Over 300 papers published between 2000 and 2009 reporting MCDA applications in the environmental field were identified through a series of queries in the Web of Science database. The papers were classified by their environmental application area and by decision or intervention type. In addition, the papers were classified by the MCDA methods used in the analysis (analytic hierarchy process, multi-attribute utility theory, and outranking). The results suggest that there has been significant growth in environmental applications of MCDA over the last decade across all environmental application areas. Multiple MCDA tools have been successfully used for environmental applications. Even though the use of specific methods and tools varies across application areas and geographic regions, our review of the few papers where several methods were used in parallel on the same problem indicates that the recommended course of action does not vary significantly with the method applied. PMID:21764422

  8. Sustainability assessment of tertiary wastewater treatment technologies: a multi-criteria analysis.

    PubMed

    Plakas, K V; Georgiadis, A A; Karabelas, A J

    2016-01-01

    Multi-criteria analysis gives researchers, designers and decision-makers the opportunity to examine decision options in a multi-dimensional fashion. On this basis, four tertiary wastewater treatment (WWT) technologies were assessed regarding their sustainability performance in producing recycled wastewater, considering a 'triple bottom line' approach (i.e. economic, environmental, and social). These are powdered activated carbon adsorption coupled with ultrafiltration membrane separation (PAC-UF), reverse osmosis, ozone/ultraviolet-light oxidation, and heterogeneous photo-catalysis coupled with low-pressure membrane separation (photocatalytic membrane reactor, PMR). The participatory method called Simple Multi-Attribute Rating Technique Exploiting Ranks (SMARTER) was employed for assigning weights to the selected sustainability indicators. This sustainability assessment approach resulted in the development of a composite index as a final metric for each WWT technology evaluated. The PAC-UF technology appears to be the most appropriate technology, attaining the highest composite value regarding sustainability performance. A scenario analysis confirmed the results of the original scenario in five out of seven cases. In parallel, the PMR was highlighted as the technology with the least variability in its performance. Nevertheless, additional actions and approaches are proposed to strengthen the objectivity of the final results. PMID:27054724
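
    SMARTER derives its weights from ranks alone, via rank-order-centroid (ROC) weights; a minimal sketch of that weighting step (the paper's indicator set is not reproduced here).

```python
def roc_weights(n):
    """Rank-order-centroid weights: rank i (1 = most important) gets
    w_i = (1/n) * sum_{j=i..n} 1/j; the weights sum to 1."""
    return [sum(1.0 / j for j in range(i, n + 1)) / n for i in range(1, n + 1)]

print(roc_weights(4))  # [0.5208, 0.2708, 0.1458, 0.0625] (rounded)
```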

  9. GIS, Geoscience, Multi-criteria Analysis and Integrated Management of the Coastal Zone

    NASA Astrophysics Data System (ADS)

    Kacimi, Y.; Barich, A.

    2011-12-01

    In this 3rd millennium, geology can be considered a science of decision that intervenes in all domains of society. It has outgrown its purely academic dimension and spread into domains that were until now out of reach. Combining different Geoscience sub-disciplines reflects a strong will to demonstrate the contribution of this science and its impact on daily life, especially by applying it to various innovative projects. Geophysics, geochemistry and structural geology are complementary disciplines that can be applied in perfect symbiosis in many domains such as construction, mining prospection, impact assessment, environment, etc. Data collected from these studies can be integrated into Geographic Information Systems (GIS) in order to perform a multi-criteria analysis, which generally gives very impressive results. From this point, it is easy to set up mining, eco-geotouristic and risk assessment models for land use projects, but also for the integrated management of the coastal zone (IMCZ). Touristic projects in Morocco focus on its coast, which extends for at least 3500 km; managing this zone for building marinas or touristic infrastructure requires a deep and detailed study of marine currents along the coast, for example by creating surveillance models and a coastal hazards map. An innovative project will include geophysical, geochemical and structural geology studies associated with a multi-criteria analysis. The data will be integrated into a GIS to establish a coastal map that will highlight low-risk erosion zones and thus facilitate the siting of ports and other construction projects. YES Morocco is a chapter of the International YES Network that aims to promote Geoscience in the service of society and the professional development of Young and Early Career Geoscientists. Our commitment to such a project will be of a qualitative nature within an associative framework that will involve

  10. Environmental Education Research Project, Content Analysis Criteria, Report on First Evaluation Trial.

    ERIC Educational Resources Information Center

    Linke, R. D.

    Ten criteria for use in assessing the emphasis on environmental education in textbooks and similar resource materials were developed and given to 30 members of the Australian Conservation Foundation Education and Training Committees throughout the country. Each rater applied the criteria to three chapters of a biology textbook "The Web of Life,"…

  11. The Politics of Determining Merit Aid Eligibility Criteria: An Analysis of the Policy Process

    ERIC Educational Resources Information Center

    Ness, Erik C.

    2010-01-01

    Despite the scholarly attention on the effects of merit aid on college access and choice, particularly on the significant effect that states' varied eligibility criteria play, no studies have examined the policy process through which merit aid criteria are determined. This is surprising given the recent attention to state-level policy dynamics and…

  12. The application of integral performance criteria to the analysis of discrete maneuvers in a driving simulator

    NASA Technical Reports Server (NTRS)

    Repa, B. S.; Zucker, R. S.; Wierwille, W. W.

    1977-01-01

    The influence of vehicle transient response characteristics on driver-vehicle performance in discrete maneuvers as measured by integral performance criteria was investigated. A group of eight ordinary drivers was presented with a series of eight vehicle transfer function configurations in a driving simulator. Performance in two discrete maneuvers was analyzed by means of integral performance criteria. Results are presented.
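
    The usual integral performance criteria are simple integrals of the error signal over the maneuver; a sketch of the three most common candidates (the abstract does not state which were used):

```python
import numpy as np

def integral_criteria(t, error):
    """IAE, ISE and ITAE for a sampled time-domain error signal."""
    return {
        "IAE": np.trapz(np.abs(error), t),       # integral of |e|
        "ISE": np.trapz(error ** 2, t),          # integral of e^2
        "ITAE": np.trapz(t * np.abs(error), t),  # time-weighted |e|
    }

t = np.linspace(0.0, 5.0, 501)
e = np.exp(-t) * np.cos(3.0 * t)                 # toy lane-deviation error
criteria = integral_criteria(t, e)
```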

  13. Appropriate use criteria for amyloid PET: a report of the Amyloid Imaging Task Force, the Society of Nuclear Medicine and Molecular Imaging, and the Alzheimer's Association.

    PubMed

    Johnson, Keith A; Minoshima, Satoshi; Bohnen, Nicolaas I; Donohoe, Kevin J; Foster, Norman L; Herscovitch, Peter; Karlawish, Jason H; Rowe, Christopher C; Carrillo, Maria C; Hartley, Dean M; Hedrick, Saima; Pappas, Virginia; Thies, William H

    2013-03-01

    Positron emission tomography (PET) of brain amyloid β is a technology that is becoming more available, but its clinical utility in medical practice requires careful definition. To provide guidance to dementia care practitioners, patients, and caregivers, the Alzheimer's Association and the Society of Nuclear Medicine and Molecular Imaging convened the Amyloid Imaging Taskforce (AIT). The AIT considered a broad range of specific clinical scenarios in which amyloid PET could potentially be used appropriately. Peer-reviewed, published literature was searched to ascertain available evidence relevant to these scenarios, and the AIT developed a consensus of expert opinion. Although empirical evidence of impact on clinical outcomes is not yet available, a set of specific appropriate use criteria (AUC) were agreed on that define the types of patients and clinical circumstances in which amyloid PET could be used. Both appropriate and inappropriate uses were considered and formulated, and are reported and discussed here. Because both dementia care and amyloid PET technology are in active development, these AUC will require periodic reassessment. Future research directions are also outlined, including diagnostic utility and patient-centered outcomes. PMID:23359661

  14. Quantitative analysis of qualitative images

    NASA Astrophysics Data System (ADS)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  15. Hybrid µCT-FMT imaging and image analysis

    PubMed Central

    Zafarnia, Sara; Babler, Anne; Jahnen-Dechent, Willi; Lammers, Twan; Lederle, Wiltrud; Kiessling, Fabian

    2015-01-01

    Fluorescence-mediated tomography (FMT) enables longitudinal and quantitative determination of the fluorescence distribution in vivo and can be used to assess the biodistribution of novel probes and to assess disease progression using established molecular probes or reporter genes. The combination with an anatomical modality, e.g., micro computed tomography (µCT), is beneficial for image analysis and for fluorescence reconstruction. We describe a protocol for multimodal µCT-FMT imaging including the image processing steps necessary to extract quantitative measurements. After preparing the mice and performing the imaging, the multimodal data sets are registered. Subsequently, an improved fluorescence reconstruction is performed, which takes into account the shape of the mouse. For quantitative analysis, organ segmentations are generated based on the anatomical data using our interactive segmentation tool. Finally, the biodistribution curves are generated using a batch-processing feature. We show the applicability of the method by assessing the biodistribution of a well-known probe that binds to bones and joints. PMID:26066033

  16. Particle Pollution Estimation Based on Image Analysis.

    PubMed

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic information and weather conditions, to predict PM2.5 index. The results demonstrate that the image analysis method provides good prediction of PM2.5 indexes, and different features have different significance levels in the prediction. PMID:26828757
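
    The pipeline reduces to whole-image feature extraction followed by supervised regression; a sketch with stand-in features and clearly fake placeholder data, since the abstract does not list the six features or the model used.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def image_features(rgb):
    """A few simple statistics standing in for the paper's six features."""
    gray = rgb.mean(axis=2)
    return [gray.mean(), gray.std(),
            np.abs(np.diff(gray, axis=0)).mean(),     # gradient energy (haze cue)
            rgb[..., 2].mean() - rgb[..., 0].mean()]  # blue-red balance

rng = np.random.default_rng(0)
X = np.array([image_features(rng.random((32, 32, 3))) for _ in range(20)])
y = rng.uniform(0, 300, size=20)                      # fake PM2.5 indexes
model = LinearRegression().fit(X, y)                  # illustration only
```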

  18. Membrane composition analysis by imaging mass spectrometry

    SciTech Connect

    Boxer, S G; Kraft, M L; Longo, M; Hutcheon, I D; Weber, P K

    2006-03-29

    Membranes on solid supports offer an ideal format for imaging. Secondary ion mass spectrometry (SIMS) can be used to obtain composition information on membrane-associated components. Using the NanoSIMS50, images of composition variations in membrane domains can be obtained with a lateral resolution better than 100 nm. By suitable calibration, these variations in composition can be translated into a quantitative analysis of the membrane composition. Progress towards imaging small phase-separated lipid domains, membrane-associated proteins and natural biological membranes will be described.

  19. Data analysis for GOPEX image frames

    NASA Technical Reports Server (NTRS)

    Levine, B. M.; Shaik, K. S.; Yan, T.-Y.

    1993-01-01

    The data analysis based on the image frames received at the Solid State Imaging (SSI) camera of the Galileo Optical Experiment (GOPEX) demonstration conducted between 9-16 Dec. 1992 is described. Laser uplink was successfully established between the ground and the Galileo spacecraft during its second Earth-gravity-assist phase in December 1992. SSI camera frames were acquired which contained images of detected laser pulses transmitted from the Table Mountain Facility (TMF), Wrightwood, California, and the Starfire Optical Range (SOR), Albuquerque, New Mexico. Laser pulse data were processed using standard image-processing techniques at the Multimission Image Processing Laboratory (MIPL) for preliminary pulse identification and to produce public release images. Subsequent image analysis corrected for background noise to measure received pulse intensities. Data were plotted to obtain histograms on a daily basis and were then compared with theoretical results derived from applicable weak-turbulence and strong-turbulence considerations. Processing steps are described and the theories are compared with the experimental results. Quantitative agreement was found in both turbulence regimes, and better agreement would have been found, given more received laser pulses. Future experiments should consider methods to reliably measure low-intensity pulses, and through experimental planning to geometrically locate pulse positions with greater certainty.

  20. VAICo: visual analysis for image comparison.

    PubMed

    Schmidt, Johanna; Gröller, M Eduard; Bruckner, Stefan

    2013-12-01

    Scientists, engineers, and analysts are confronted with ever larger and more complex sets of data, whose analysis poses special challenges. In many situations it is necessary to compare two or more datasets; hence there is a need for comparative visualization tools to help analyze differences or similarities among datasets. In this paper an approach for comparative visualization for sets of images is presented. Well-established techniques for comparing images frequently place them side by side. A major drawback of such approaches is that they do not scale well. Other image comparison methods encode differences in images by abstract parameters like color; in this case information about the underlying image data gets lost. This paper introduces a new method for visualizing differences and similarities in large sets of images which preserves contextual information, but also allows the detailed analysis of subtle variations. Our approach identifies local changes and applies cluster analysis techniques to embed them in a hierarchy. The results of this process are then presented in an interactive web application which allows users to rapidly explore the space of differences and drill down on particular features. We demonstrate the flexibility of our approach by applying it to multiple distinct domains. PMID:24051775

  1. Prediction of Depression in Cancer Patients With Different Classification Criteria, Linear Discriminant Analysis versus Logistic Regression

    PubMed Central

    Shayan, Zahra; Mezerji, Naser Mohammad Gholi; Shayan, Leila; Naseri, Parisa

    2016-01-01

    Background: Logistic regression (LR) and linear discriminant analysis (LDA) are two popular statistical models for the prediction of group membership. Although they are very similar, LDA makes more assumptions about the data. When categorical and continuous variables are used simultaneously, the optimal choice between the two models is questionable. In most studies, classification error (CE) is used to discriminate between subjects in several groups, but this index is not suitable for predicting the accuracy of the outcome. The present study compared the LR and LDA models using classification indices. Methods: This cross-sectional study selected 243 cancer patients. Sample sets of different sizes (n = 50, 100, 150, 200, 220) were randomly selected and the CE, B, and Q classification indices were calculated by the LR and LDA models. Results: CE revealed a lack of superiority of one model over the other, but the results showed that LR performed better than LDA for the B and Q indices in all situations. No significant effect of sample size on CE was noted for the selection of an optimal model. Assessment of the accuracy of prediction on real data indicated that the B and Q indices are appropriate for the selection of an optimal model. Conclusion: The results of this study showed that LR performs better in some cases and LDA in others when based on CE. The CE index is not appropriate for classification, whereas the B and Q indices performed better and offered more efficient criteria for comparison and discrimination between groups.
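
    Comparing the two models on classification error is straightforward with scikit-learn; a sketch on synthetic data, since the patient data are not available here.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the 243-patient data set.
X, y = make_classification(n_samples=243, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("LR", LogisticRegression(max_iter=1000)),
                    ("LDA", LinearDiscriminantAnalysis())]:
    err = 1.0 - model.fit(X_tr, y_tr).score(X_te, y_te)  # classification error
    print(name, round(err, 3))
```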

  2. Using soil function evaluation in multi-criteria decision analysis for sustainability appraisal of remediation alternatives.

    PubMed

    Volchko, Yevheniya; Norrman, Jenny; Rosén, Lars; Bergknut, Magnus; Josefsson, Sarah; Söderqvist, Tore; Norberg, Tommy; Wiberg, Karin; Tysklind, Mats

    2014-07-01

    Soil contamination is one of the major threats constraining proper functioning of the soil and thus provision of ecosystem services. Remedial actions typically only address the chemical soil quality by reducing total contaminant concentrations to acceptable levels guided by land use. However, emerging regulatory requirements on soil protection demand a holistic view on soil assessment in remediation projects thus accounting for a variety of soil functions. Such a view would require not only that the contamination concentrations are assessed and attended to, but also that other aspects are taking into account, thus addressing also physical and biological as well as other chemical soil quality indicators (SQIs). This study outlines how soil function assessment can be a part of a holistic sustainability appraisal of remediation alternatives using multi-criteria decision analysis (MCDA). The paper presents a method for practitioners for evaluating the effects of remediation alternatives on selected ecological soil functions using a suggested minimum data set (MDS) containing physical, biological and chemical SQIs. The measured SQIs are transformed into sub-scores by the use of scoring curves, which allows interpretation and the integration of soil quality data into the MCDA framework. The method is demonstrated at a study site (Marieberg, Sweden) and the results give an example of how soil analyses using the suggested MDS can be used for soil function assessment and subsequent input to the MCDA framework. PMID:24529453

  3. Rural tourism spatial distribution based on multi-criteria decision analysis and GIS

    NASA Astrophysics Data System (ADS)

    Zhang, Hongxian; Yang, Qingsheng

    2008-10-01

    Studying the spatial distribution of rural tourism can provide a scientific decision basis for developing rural economies. Traditional approaches to analysing tourism spatial distribution are limited in quantifying priority locations for tourism development at the level of small units: they only indicate the overall distribution of tourism and whether locations are suitable for tourism development, whereas the development ranking under different decision objectives should also be considered. This paper presents a way to rank locations for rural tourism development in space by integrating multi-criteria decision analysis (MCDA) and geographic information systems (GIS). Rural tourism should be developed first in locations with inconvenient transportation, an underdeveloped economy, and good tourism resources. Based on this objective, a tourism development priority utility is calculated for each town with MCDA and GIS, and towns with higher utility are selected for development first. The method was successfully used to rank locations for rural tourism in Ningbo City. The result shows that MCDA is an effective way to analyse the spatial distribution of rural tourism for specific decision objectives, and that rural tourism can promote economic development.

  4. Multi-Criteria Decision Making for a Spatial Decision Support System on the Analysis of Changing Risk

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; van Westen, Cees; Bakker, Wim H.; Aye, Zar Chi; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    Natural hazard risk management requires decision making in several stages. Decision making on alternatives for risk reduction planning starts with an intelligence phase for recognizing the decision problems and identifying the objectives. Developing the alternatives and assigning variables to each alternative by the decision makers constitute the design phase. The final phase evaluates the optimal choice by comparing the alternatives, defining indicators, assigning a weight to each, and ranking them. This process is referred to as Multi-Criteria Decision Making analysis (MCDM), Multi-Criteria Evaluation (MCE) or Multi-Criteria Analysis (MCA). In the framework of the ongoing 7th Framework Program "CHANGES" (2011-2014, Grant Agreement No. 263953) of the European Commission, a Spatial Decision Support System is under development with the aim of analysing changes in hydro-meteorological risk and providing support for selecting the best risk reduction alternative. This paper describes the module for Multi-Criteria Decision Making analysis (MCDM) that incorporates monetary and non-monetary criteria in the analysis of the optimal alternative. The MCDM module consists of several components. The first step is to define criteria (or indicators), which are subdivided into disadvantages (criteria that indicate the difficulty of implementing the risk reduction strategy, also referred to as costs) and advantages (criteria that indicate the favorability, also referred to as benefits). In the next step the stakeholders can use the developed web-based tool for prioritizing the criteria and the decision matrix. Public participation plays a role in decision making, and this is also planned through the use of a mobile web version in which the general local public can indicate their agreement with the proposed alternatives. The application is being tested through a case study related to risk reduction in a mountainous valley in the Alps affected by flooding. Four alternatives are evaluated in

  5. A pairwise image analysis with sparse decomposition

    NASA Astrophysics Data System (ADS)

    Boucher, A.; Cloppet, F.; Vincent, N.

    2013-02-01

    This paper aims to detect the evolution between two images representing the same scene. The evolution detection problem has many practical applications, especially in medical imaging. Indeed, the concept of a patient "file" implies the joint analysis of different acquisitions taken at different times, and the detection of significant modifications. The research presented in this paper is carried out within the application context of the development of computer-assisted diagnosis (CAD) applied to mammograms. It is performed on already registered pairs of images. As the registration is never perfect, we must develop a comparison method sufficiently well adapted to detect real but small differences between comparable tissues. In many applications, the assessment of similarity used during the registration step is also used in the interpretation step that prompts suspicious regions. In our case registration is assumed to match the spatial coordinates of similar anatomical elements. In this paper, in order to process the medical images at the tissue level, the image representation is based on elementary patterns, therefore seeking patterns, not pixels. Besides, as the studied images have low entropy, the decomposed signal is expressed in a parsimonious way. Parsimonious representations are known to help extract the significant structures of a signal and to generate a compact version of the data. This change of representation should allow us to compare the studied images quickly, thanks to the low weight of the images thus represented, while maintaining good representativeness. The good precision of our results shows the efficiency of the approach.

  6. Image analysis of insulation mineral fibres.

    PubMed

    Talbot, H; Lee, T; Jeulin, D; Hanton, D; Hobbs, L W

    2000-12-01

    We present two methods for measuring the diameter and length of man-made vitreous fibres based on the automated image analysis of scanning electron microscopy images. The fibres we want to measure are used in materials such as glass wool, which in turn are used for thermal and acoustic insulation. The measurement of the diameters and lengths of these fibres is used by the glass wool industry for quality control purposes. To obtain reliable quality estimators, the measurement of several hundred images is necessary. These measurements are usually obtained manually by operators. Manual measurements, although reliable when performed by skilled operators, are slow due to the need for the operators to rest often to retain their ability to spot faint fibres on noisy backgrounds. Moreover, the task of measuring thousands of fibres every day, even with the help of semi-automated image analysis systems, is dull and repetitive. The need for an automated procedure which could replace manual measurements is quite real. For each of the two methods that we propose to accomplish this task, we present the sample preparation, the microscope setting and the image analysis algorithms used for the segmentation of the fibres and for their measurement. We also show how a statistical analysis of the results can alleviate most measurement biases, and how we can estimate the true distribution of fibre lengths by diameter class by measuring only the lengths of the fibres visible in the field of view. PMID:11106965

  7. Automated eXpert Spectral Image Analysis

    Energy Science and Technology Software Center (ESTSC)

    2003-11-25

    AXSIA performs automated factor analysis of hyperspectral images. In such images, a complete spectrum is collected at each point in a 1-, 2- or 3-dimensional spatial array. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful information. Multivariate factor analysis techniques have proven effective for extracting the essential information from high-dimensional data sets into a limited number of factors that describe the spectral characteristics and spatial distributions of the pure components comprising the sample. AXSIA provides tools to estimate different types of factor models including Singular Value Decomposition (SVD), Principal Component Analysis (PCA), PCA with factor rotation, and Alternating Least Squares-based Multivariate Curve Resolution (MCR-ALS). As part of the analysis process, AXSIA can automatically estimate the number of pure components that comprise the data and can scale the data to account for Poisson noise. The data analysis methods are fundamentally based on eigenanalysis of the data crossproduct matrix coupled with orthogonal eigenvector rotation and constrained alternating least squares refinement. A novel method for automatically determining the number of significant components, which is based on the eigenvalues of the crossproduct matrix, has also been devised and implemented. The data can be compressed spectrally via PCA and spatially through wavelet transforms, and algorithms have been developed that perform factor analysis in the transform domain while retaining full spatial and spectral resolution in the final result. These latter innovations enable the analysis of larger-than-core-memory spectrum-images. AXSIA was designed to perform automated chemical phase analysis of spectrum-images acquired by a variety of chemical imaging techniques. Successful applications include Energy Dispersive X-ray Spectroscopy, X
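
    The eigenanalysis at the core of such tools can be sketched in a few lines of NumPy; the cube dimensions, the synthetic two-component Poisson data and the crude rank-selection rule below are illustrative assumptions, not AXSIA's actual algorithm:

        import numpy as np

        rng = np.random.default_rng(0)
        ny, nx, n_channels = 32, 32, 256

        # Synthetic spectrum-image: two pure components mixed linearly, Poisson counts.
        spectra_true = rng.random((2, n_channels))
        maps_true = rng.random((ny * nx, 2))
        X = rng.poisson(50 * maps_true @ spectra_true).astype(float)  # one spectrum per pixel

        Xc = X - X.mean(axis=0)                        # centre the spectra
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

        # Crude automatic rank estimate: components well above the eigenvalue noise floor.
        eigvals = s**2 / (X.shape[0] - 1)
        k = int(np.sum(eigvals > 10 * np.median(eigvals)))

        score_maps = (U[:, :k] * s[:k]).reshape(ny, nx, k)   # spatial distributions
        loadings = Vt[:k]                                    # spectral factors
        print(f"estimated components: {k}; score maps shape: {score_maps.shape}")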

  8. Objective facial photograph analysis using imaging software.

    PubMed

    Pham, Annette M; Tollefson, Travis T

    2010-05-01

    Facial analysis is an integral part of the surgical planning process. Clinical photography has long been an invaluable tool in the surgeon's practice, not only for accurate facial analysis but also for enhancing communication between the patient and surgeon, for evaluating postoperative results, for medicolegal documentation, and for educational and teaching opportunities. From 35-mm slide film to the digital technology of today, clinical photography has benefited greatly from technological advances. With the development of computer imaging software, objective facial analysis becomes easier to perform and less time consuming. Thus, while the original purpose of facial analysis remains the same, the process becomes much more efficient and allows for some objectivity. Although clinical judgment and artistry of technique are never compromised, the ability to perform objective facial photograph analysis using imaging software may become the standard in facial plastic surgery practices in the future. PMID:20511080

  9. SCORE: a novel multi-criteria decision analysis approach to assessing the sustainability of contaminated land remediation.

    PubMed

    Rosén, Lars; Back, Pär-Erik; Söderqvist, Tore; Norrman, Jenny; Brinkhoff, Petra; Norberg, Tommy; Volchko, Yevheniya; Norin, Malin; Bergknut, Magnus; Döberl, Gernot

    2015-04-01

    The multi-criteria decision analysis (MCDA) method provides for a comprehensive and transparent basis for performing sustainability assessments. Development of a relevant MCDA-method requires consideration of a number of key issues, e.g. (a) definition of assessment boundaries, (b) definition of performance scales, both temporal and spatial, (c) selection of relevant criteria (indicators) that facilitate a comprehensive sustainability assessment while avoiding double-counting of effects, and (d) handling of uncertainties. Adding to the complexity is the typically wide variety of inputs, including quantifications based on existing data, expert judgements, and opinions expressed in interviews. The SCORE (Sustainable Choice Of REmediation) MCDA-method was developed to provide a transparent assessment of the sustainability of possible remediation alternatives for contaminated sites relative to a reference alternative, considering key criteria in the economic, environmental, and social sustainability domains. The criteria were identified based on literature studies, interviews and focus-group meetings. SCORE combines a linear additive model to rank the alternatives with a non-compensatory approach to identify alternatives regarded as non-sustainable. The key strengths of the SCORE method are as follows: a framework that at its core is designed to be flexible and transparent; the possibility to integrate both quantitative and qualitative estimations on criteria; its ability, unlike other sustainability assessment tools used in industry and academia, to allow for the alteration of boundary conditions where necessary; the inclusion of a full uncertainty analysis of the results, using Monte Carlo simulation; and a structure that allows preferences and opinions of involved stakeholders to be openly integrated into the analysis. A major insight from practical application of SCORE is that its most important contribution may be that it initiates a process where criteria
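
    A hedged sketch of two SCORE ingredients named above -- the linear additive model with Monte Carlo uncertainty analysis and a non-compensatory screen -- with all numbers invented for illustration:

        import numpy as np

        rng = np.random.default_rng(1)
        weights = np.array([0.5, 0.3, 0.2])         # economic, environmental, social domains

        # Criterion scores relative to the reference alternative, with uncertainty
        # expressed as normal distributions (mean, standard deviation).
        mean = np.array([[ 1.0,  0.5, -0.2],        # alternative A
                         [ 0.2,  1.2,  0.8]])       # alternative B
        sd   = np.array([[ 0.3,  0.2,  0.4],
                         [ 0.2,  0.5,  0.3]])

        draws = rng.normal(mean, sd, size=(10_000, *mean.shape))   # Monte Carlo simulation
        totals = draws @ weights                                   # linear additive model
        p_positive = (totals > 0).mean(axis=0)                     # P(better than reference)

        # Non-compensatory screen: a catastrophic criterion score disqualifies the
        # alternative outright, regardless of its aggregated total.
        non_sustainable = (mean < -1.0).any(axis=1)
        for i, name in enumerate("AB"):
            flag = " (non-sustainable)" if non_sustainable[i] else ""
            print(f"alternative {name}: P(score > 0) = {p_positive[i]:.2f}{flag}")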

  10. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

    NASA Astrophysics Data System (ADS)

    Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

    2016-04-01

    Road and railway networks are among the key factors in a country's economic growth. Inadequate infrastructure networks can be detrimental to a society if transport between locations is hindered or delayed. Logistical hindrances can often be avoided, whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can trigger other natural hazards such as landslides and debris flow. Disruptions caused by landslides are similar to those of floods and increase maintenance costs considerably. The effect of natural disasters on society is likely to increase under a changing climate with increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce the infrastructural harm. Spatial multi-criteria analysis (SMCA) is a part of decision analysis, which provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through a SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and its impact on the resulting susceptibility. A least-cost path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements as well as historic incidences of flooding and landslides, in order to discuss the usefulness of the model in road planning.

  11. Motion Analysis From Television Images

    NASA Astrophysics Data System (ADS)

    Silberberg, George G.; Keller, Patrick N.

    1982-02-01

    The Department of Defense ranges have relied on photographic instrumentation for gathering data on firings for all types of ordnance. A large inventory of cameras is available on the market for these tasks. A new set of optical instrumentation is beginning to appear which, in many cases, can directly replace photographic cameras for a great deal of the work being performed now. These are television cameras modified so they can stop motion, see in the dark, perform under hostile environments, and provide real-time information. This paper discusses techniques for modifying television cameras so they can be used for motion analysis.

  12. Analysis of extensively washed hair from cocaine users and drug chemists to establish new reporting criteria.

    PubMed

    Morris-Kukoski, Cynthia L; Montgomery, Madeline A; Hammer, Rena L

    2014-01-01

    Samples from a self-proclaimed cocaine (COC) user, from 19 drug users (postmortem) and from 27 drug chemists were extensively washed and analyzed for COC, benzoylecgonine, norcocaine (NC), cocaethylene (CE) and aryl hydroxycocaines by liquid chromatography-tandem mass spectrometry. Published wash criteria and cutoffs were applied to the results. Additionally, the data were used to formulate new reporting criteria and interpretation guidelines for forensic casework. Applying the wash and reporting criteria, hair that was externally contaminated with COC was distinguished from hair collected from individuals known to have consumed COC. In addition, CE, NC and hydroxycocaine metabolites were only present in COC users' hair and not in drug chemists' hair. When properly applied, the use of an extended wash, along with the reporting criteria defined here, will exclude false-positive results from environmental contact with COC. PMID:25100648

  13. Discussion paper on applicability of oil and grease analysis for RCRA closure criteria

    SciTech Connect

    1995-02-01

    A site characterization (SC) was performed for the Building 9409-5 Diked Tank Storage Facility. The initial SC indicated areas which had oil and grease levels above the criteria of the currently proposed RCRA closure plan. After further investigation, it was demonstrated that the oil and grease parameter may not be an accurate indication of a release from this facility and should not be included as a contaminant of concern in the closure criteria.

  14. Up-to-seven criteria for hepatocellular carcinoma liver transplantation: A single center analysis

    PubMed Central

    Lei, Jian-Yong; Wang, Wen-Tao; Yan, Lu-Nan

    2013-01-01

    AIM: To determine whether the up-to-seven criteria should be used as inclusion criteria for liver transplantation for hepatocellular carcinoma. METHODS: Between April 2002 and July 2008, 220 patients who were diagnosed with hepatocellular carcinoma (HCC) and underwent liver transplantation (LT) at our liver transplantation center were included. These patients were divided into three groups according to the characteristics of their tumors (tumor diameter, tumor number): the Milan criteria group (Group 1), the within up-to-seven criteria group (Group 2) and the beyond up-to-seven criteria group (Group 3). We then compared the long-term survival and tumor recurrence of these three groups. RESULTS: The baseline characteristics of the transplant recipients were comparable among the three groups, except for the type of liver graft (deceased donor or living donor liver transplantation). There were also no significant differences in the pre-operative α-fetoprotein level. The 1-, 3-, and 5-year overall survival and tumor-free survival rates for the Milan criteria group were 94.8%, 91.4%, and 89.7% and 91.4%, 86.2%, and 86.2%, respectively; in the up-to-seven criteria group, these rates were 87.8%, 77.8%, and 76.6% and 85.6%, 75.6%, and 75.6%, respectively (P < 0.05). However, the overall and tumor-free survival rates of the advanced HCC patients (the group beyond the up-to-seven criteria) were much lower, at 75%, 53.3%, and 50% and 65.8%, 42.5%, and 41.7%, respectively (P < 0.01). CONCLUSION: Considering that patients in the up-to-seven criteria group exhibited a considerable but lower survival rate compared with the Milan criteria group, the up-to-seven criteria should be used carefully and selectively. PMID:24106409

  15. Endoscopic image analysis in semantic space.

    PubMed

    Kwitt, R; Vasconcelos, N; Rasiwasia, N; Uhl, A; Davis, B; Häfner, M; Wrba, F

    2012-10-01

    A novel approach to the design of a semantic, low-dimensional, encoding for endoscopic imagery is proposed. This encoding is based on recent advances in scene recognition, where semantic modeling of image content has gained considerable attention over the last decade. While the semantics of scenes are mainly comprised of environmental concepts such as vegetation, mountains or sky, the semantics of endoscopic imagery are medically relevant visual elements, such as polyps, special surface patterns, or vascular structures. The proposed semantic encoding differs from the representations commonly used in endoscopic image analysis (for medical decision support) in that it establishes a semantic space, where each coordinate axis has a clear human interpretation. It is also shown to establish a connection to Riemannian geometry, which enables principled solutions to a number of problems that arise in both physician training and clinical practice. This connection is exploited by leveraging results from information geometry to solve problems such as (1) recognition of important semantic concepts, (2) semantically-focused image browsing, and (3) estimation of the average-case semantic encoding for a collection of images that share a medically relevant visual detail. The approach can provide physicians with an easily interpretable, semantic encoding of visual content, upon which further decisions, or operations, can be naturally carried out. This is contrary to the prevalent practice in endoscopic image analysis for medical decision support, where image content is primarily captured by discriminative, high-dimensional, appearance features, which possess discriminative power but lack human interpretability. PMID:22717411

  16. Endoscopic Image Analysis in Semantic Space

    PubMed Central

    Kwitt, R.; Vasconcelos, N.; Rasiwasia, N.; Uhl, A.; Davis, B.; Häfner, M.; Wrba, F.

    2013-01-01

    A novel approach to the design of a semantic, low-dimensional, encoding for endoscopic imagery is proposed. This encoding is based on recent advances in scene recognition, where semantic modeling of image content has gained considerable attention over the last decade. While the semantics of scenes are mainly comprised of environmental concepts such as vegetation, mountains or sky, the semantics of endoscopic imagery are medically relevant visual elements, such as polyps, special surface patterns, or vascular structures. The proposed semantic encoding differs from the representations commonly used in endoscopic image analysis (for medical decision support) in that it establishes a semantic space, where each coordinate axis has a clear human interpretation. It is also shown to establish a connection to Riemannian geometry, which enables principled solutions to a number of problems that arise in both physician training and clinical practice. This connection is exploited by leveraging results from information geometry to solve problems such as 1) recognition of important semantic concepts, 2) semantically-focused image browsing, and 3) estimation of the average-case semantic encoding for a collection of images that share a medically relevant visual detail. The approach can provide physicians with an easily interpretable, semantic encoding of visual content, upon which further decisions, or operations, can be naturally carried out. This is contrary to the prevalent practice in endoscopic image analysis for medical decision support, where image content is primarily captured by discriminative, high-dimensional, appearance features, which possess discriminative power but lack human interpretability. PMID:22717411

  17. Criteria for Developing Criteria Sets.

    ERIC Educational Resources Information Center

    Martin, James L.

    Criteria sets are a necessary step in the systematic development of evaluation in education. Evaluation results from the combination of criteria and evidence. There is a need to develop explicit tools for evaluating criteria, similar to those used in evaluating evidence. The formulation of such criteria depends on distinguishing between terms…

  18. Unsupervised hyperspectral image analysis using independent component analysis (ICA)

    SciTech Connect

    S. S. Chiang; I. W. Ginsberg

    2000-06-30

    In this paper, an ICA-based approach is proposed for hyperspectral image analysis. It can be viewed as a random version of the commonly used linear spectral mixture analysis, in which the abundance fractions in a linear mixture model are considered to be unknown independent signal sources. It does not require the full rank of the separating matrix or orthogonality as most ICA methods do. More importantly, the learning algorithm is designed based on the independency of the material abundance vector rather than the independency of the separating matrix generally used to constrain the standard ICA. As a result, the designed learning algorithm is able to converge to non-orthogonal independent components. This is particularly useful in hyperspectral image analysis since many materials extracted from a hyperspectral image may have similar spectral signatures and may not be orthogonal. The AVIRIS experiments have demonstrated that the proposed ICA provides an effective unsupervised technique for hyperspectral image classification.
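
    For orientation, a standard FastICA unmixing of an unfolded hyperspectral cube is sketched below with scikit-learn. Note that off-the-shelf FastICA constrains the unmixing toward orthogonality, which is precisely the restriction the paper's modified learning algorithm removes; the sketch only illustrates the data layout and the abundance-as-source idea, on synthetic data:

        import numpy as np
        from sklearn.decomposition import FastICA

        ny, nx, n_bands = 40, 40, 120
        cube = np.random.rand(ny, nx, n_bands)      # stand-in for an AVIRIS scene

        X = cube.reshape(-1, n_bands)               # pixels as samples, bands as features
        ica = FastICA(n_components=5, random_state=0, max_iter=500)
        sources = ica.fit_transform(X)              # independent components per pixel
        abundance_maps = sources.reshape(ny, nx, -1)
        print("abundance maps:", abundance_maps.shape)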

  19. Multi-criteria multi-stakeholder decision analysis using a fuzzy-stochastic approach for hydrosystem management

    NASA Astrophysics Data System (ADS)

    Subagadis, Y. H.; Schütze, N.; Grundmann, J.

    2014-09-01

    The conventional methods used to solve multi-criteria multi-stakeholder problems are weakly formulated, as they normally incorporate only homogeneous information at a time and aggregate the objectives of different decision makers while neglecting water-society interactions. In this contribution, Multi-Criteria Group Decision Analysis (MCGDA) using a fuzzy-stochastic approach has been proposed to rank a set of alternatives in water management decisions, incorporating heterogeneous information under uncertainty. The decision-making framework takes hydrologically, environmentally, and socio-economically motivated conflicting objectives into consideration. The criteria related to the performance of the physical system are optimized using multi-criteria simulation-based optimization, and fuzzy linguistic quantifiers have been used to evaluate subjective criteria and to assess stakeholders' degree of optimism. The proposed methodology is applied to find effective and robust intervention strategies for the management of a coastal hydrosystem affected by saltwater intrusion due to excessive groundwater extraction for irrigated agriculture and municipal use. Preliminary results show that the MCGDA based on a fuzzy-stochastic approach gives useful support for robust decision making and is sensitive to the decision makers' degree of optimism.

  20. Curvelet Based Offline Analysis of SEM Images

    PubMed Central

    Shirazi, Syed Hamad; Haq, Nuhman ul; Hayat, Khizar; Naz, Saeeda; Haque, Ihsan ul

    2014-01-01

    Manual offline analysis of a scanning electron microscopy (SEM) image is a time-consuming process and requires continuous human intervention and effort. This paper presents an image processing-based method for automated offline analysis of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step aimed at noise removal, in order to avoid false edges. For texture analysis, the proposed method employs a state-of-the-art curvelet transform followed by segmentation through a combination of entropy filtering, thresholding and mathematical morphology (MM). The quantification is carried out by the application of a box-counting algorithm for fractal dimension (FD) calculation, with the ultimate goal of measuring parameters such as surface area and perimeter. The perimeter is estimated indirectly by counting the boundary boxes of the filled shapes. The proposed method, when applied to a representative set of SEM images, not only showed better results in image segmentation but also exhibited good accuracy in the calculation of surface area and perimeter. The proposed method outperforms the well-known watershed segmentation algorithm. PMID:25089617
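
    Box counting itself is a standard algorithm; a minimal NumPy version is sketched below. The paper's exact implementation and parameters are not given, so details such as the power-of-two cropping are assumptions:

        import numpy as np

        def fractal_dimension(mask: np.ndarray) -> float:
            """Estimate the FD of a 2-D boolean mask by box counting."""
            size = 2 ** int(np.floor(np.log2(min(mask.shape))))
            mask = mask[:size, :size]                  # crop to a power-of-two square
            scales, counts = [], []
            box = size
            while box >= 2:
                # Count boxes of side `box` containing at least one foreground pixel.
                view = mask.reshape(size // box, box, size // box, box)
                occupied = view.any(axis=(1, 3)).sum()
                scales.append(box)
                counts.append(max(occupied, 1))
                box //= 2
            # FD is the slope of log(count) versus log(1/scale).
            slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
            return slope

        disc = np.add.outer(np.arange(-64, 64) ** 2, np.arange(-64, 64) ** 2) <= 40 ** 2
        print(f"FD of a filled disc: {fractal_dimension(disc):.2f}")  # expect about 2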

  1. Medical image analysis with artificial neural networks.

    PubMed

    Jiang, J; Trundle, P; Ren, J

    2010-12-01

    Given that neural networks have been widely reported in the research community of medical imaging, we provide a focused literature survey on recent neural network developments in computer-aided diagnosis, medical image segmentation and edge detection towards visual content analysis, and medical image registration for its pre-processing and post-processing, with the aims of increasing awareness of how neural networks can be applied to these areas and to provide a foundation for further research and practical development. Representative techniques and algorithms are explained in detail to provide inspiring examples illustrating: (i) how a known neural network with fixed structure and training procedure could be applied to resolve a medical imaging problem; (ii) how medical images could be analysed, processed, and characterised by neural networks; and (iii) how neural networks could be expanded further to resolve problems relevant to medical imaging. In the concluding section, a highlight of comparisons among many neural network applications is included to provide a global view on computational intelligence with neural networks in medical imaging. PMID:20713305

  2. Fourier analysis: from cloaking to imaging

    NASA Astrophysics Data System (ADS)

    Wu, Kedi; Cheng, Qiluan; Wang, Guo Ping

    2016-04-01

    Regarding invisibility cloaks as an optical imaging system, we present a Fourier approach to analytically unify both Pendry cloaks and complementary media-based invisibility cloaks into one kind of cloak. By synthesizing different transfer functions, we can construct different devices to realize a series of interesting functions such as hiding objects (events), creating illusions, and performing perfect imaging. In this article, we give a brief review of recent works applying the Fourier approach to the analysis of invisibility cloaks and of optical imaging through scattering layers. We show that, to construct devices to conceal an object, no materials with extreme properties are required, making most, if not all, of the above functions realizable by using naturally occurring materials. As instances, we experimentally verify a method of directionally hiding distant objects and create illusions by using all-dielectric materials, and further demonstrate a non-invasive method of imaging objects completely hidden by scattering layers.

  3. Building a picture: Prioritisation of exotic diseases for the pig industry in Australia using multi-criteria decision analysis.

    PubMed

    Brookes, V J; Hernández-Jover, M; Cowled, B; Holyoake, P K; Ward, M P

    2014-01-01

    Diseases that are exotic to the pig industry in Australia were prioritised using a multi-criteria decision analysis framework that incorporated weights of importance for a range of criteria important to industry stakeholders. Measurements were collected for each disease for nine criteria that described potential disease impacts. A total score was calculated for each disease using a weighted sum value function that aggregated the nine disease criterion measurements and weights of importance for the criteria that were previously elicited from two groups of industry stakeholders. One stakeholder group placed most value on the impacts of disease on livestock, and one group placed more value on the zoonotic impacts of diseases. Prioritisation lists ordered by disease score were produced for both of these groups. Vesicular diseases were found to have the highest priority for the group valuing disease impacts on livestock, followed by acute forms of African and classical swine fever, then highly pathogenic porcine reproductive and respiratory syndrome. The group who valued zoonotic disease impacts prioritised rabies, followed by Japanese encephalitis, Eastern equine encephalitis and Nipah virus, interspersed with vesicular diseases. The multi-criteria framework used in this study systematically prioritised diseases using a multi-attribute theory based technique that provided transparency and repeatability in the process. Flexibility of the framework was demonstrated by aggregating the criterion weights from more than one stakeholder group with the disease measurements for the criteria. This technique allowed industry stakeholders to be active in resource allocation for their industry without the need to be disease experts. We believe it is the first prioritisation of livestock diseases using values provided by industry stakeholders. The prioritisation lists will be used by industry stakeholders to identify diseases for further risk analysis and disease spread modelling to

  4. Multi-criteria decision analysis for bioenergy in the Centre Region of Portugal

    NASA Astrophysics Data System (ADS)

    Esteves, T. C. J.; Cabral, P.; Ferreira, A. J. D.; Teixeira, J. C.

    2012-04-01

    With the consumption of fossil fuels, the resources essential to Man's survival are being rapidly contaminated. A sustainable future may be achieved by the use of renewable energies, allowing countries without non-renewable energy resources to guarantee energy sovereignty. Using bioenergy may mean a steep reduction and/or elimination of external dependency, enhancing a country's capital and potentially reducing the negative effects that result from the use of fossil fuels, such as loss of biodiversity and air, water, and soil pollution, … This work's main focus is to increase bioenergy use in the Centre Region of Portugal by allying R&D to facilitate the determination of bioenergy availability and distribution throughout the study area. This analysis is essential, given that this knowledge is still very limited in the study area. Geographic Information Systems (GIS) were the main tool used in this study, owing to their ability to seamlessly integrate various types of information (such as alphanumerical, statistical, geographical, …) and various sources of biomass (forest, agricultural, husbandry, municipal and industrial residues, shrublands, used vegetable oil and energy crops) to determine the bioenergy potential of the study area, as well as its spatial distribution. By allying GIS with multi-criteria decision analysis, the initial table-like information, which is difficult to comprehend, is transformed into tangible and easy-to-read results: both the intermediate and final results of the created models will facilitate the decision-making process. General results show that the major contributors to the bioenergy potential in the Centre Region of Portugal are forest residues, which are mostly located in the inner part of the study area. However, a more detailed analysis should be made to assess the viability of using energy crops. As a main conclusion, we can say that, although this region may not use only this type of energy to be completely

  5. Measuring toothbrush interproximal penetration using image analysis

    NASA Astrophysics Data System (ADS)

    Hayworth, Mark S.; Lyons, Elizabeth K.

    1994-09-01

    An image analysis method of measuring the effectiveness of a toothbrush in reaching the interproximal spaces of teeth is described. Artificial teeth are coated with a stain that approximates real plaque and then brushed with a toothbrush on a brushing machine. The teeth are then removed and turned sideways so that the interproximal surfaces can be imaged. The areas of stain that have been removed within masked regions that define the interproximal regions are measured and reported. These areas correspond to the interproximal areas of the tooth reached by the toothbrush bristles. The image analysis method produces more precise results (10-fold decrease in standard deviation) in a fraction (22%) of the time as compared to our prior visual grading method.
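
    The core measurement reduces to counting removed-stain pixels inside interproximal masks; a toy NumPy version follows, with synthetic arrays standing in for the real before/after tooth images and an invented mask region:

        import numpy as np

        rng = np.random.default_rng(0)
        before = rng.random((200, 200)) < 0.9              # stained pixels before brushing
        after = before & (rng.random((200, 200)) < 0.6)    # stain remaining afterwards
        mask = np.zeros((200, 200), dtype=bool)
        mask[:, 80:120] = True                             # hypothetical interproximal region

        removed = before & ~after & mask                   # stain cleared inside the mask
        penetration = removed.sum() / (before & mask).sum()
        print(f"interproximal stain removed: {100 * penetration:.1f}%")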

  6. Four Common Simplifications of Multi-Criteria Decision Analysis do not hold for River Rehabilitation

    PubMed Central

    2016-01-01

    River rehabilitation aims at alleviating the negative effects of human impacts such as loss of biodiversity and reduction of ecosystem services. Such interventions entail difficult trade-offs between different ecological and often socio-economic objectives. Multi-Criteria Decision Analysis (MCDA) is a very suitable approach that helps assess the current ecological state and prioritize river rehabilitation measures in a standardized way, based on stakeholder or expert preferences. Applications of MCDA in river rehabilitation projects are often simplified, i.e. using a limited number of objectives and indicators, assuming linear value functions, aggregating individual indicator assessments additively, and/or assuming risk neutrality of experts. Here, we demonstrate an implementation of MCDA expert preference assessments for river rehabilitation and provide ample material for other applications. To test whether the above simplifications reflect common expert opinion, we carried out very detailed interviews with five river ecologists and a hydraulic engineer. We defined essential objectives and measurable quality indicators (attributes), elicited the experts' preferences for objectives on a standardized scale (value functions) and their risk attitude, and identified suitable aggregation methods. The experts recommended an extensive objectives hierarchy including between 54 and 93 essential objectives and between 37 and 61 essential attributes. For 81% of these, they defined non-linear value functions, and in 76% they recommended multiplicative aggregation. The experts were risk averse or risk prone (but never risk neutral), depending on the current ecological state of the river and the experts' personal importance of objectives. We conclude that the four commonly applied simplifications clearly do not reflect the opinion of river rehabilitation experts. The optimal level of model complexity, however, remains highly case-study specific, depending on data and resource

  7. Morphometry of spermatozoa using semiautomatic image analysis.

    PubMed Central

    Jagoe, J R; Washbrook, N P; Hudson, E A

    1986-01-01

    Human sperm heads were detected and tracked using semiautomatic image analysis. Measurements of size and shape on two specimens from each of 26 men showed that the major component of variability both within and between subjects was the number of small elongated sperm heads. Variability of the computed features between subjects was greater than that between samples from the same subject. PMID:3805320

  8. Scale Free Reduced Rank Image Analysis.

    ERIC Educational Resources Information Center

    Horst, Paul

    In the traditional Guttman-Harris type image analysis, a transformation is applied to the data matrix such that each column of the transformed data matrix is the best least squares estimate of the corresponding column of the data matrix from the remaining columns. The model is scale free. However, it assumes (1) that the correlation matrix is…

  9. Using Image Analysis to Build Reading Comprehension

    ERIC Educational Resources Information Center

    Brown, Sarah Drake; Swope, John

    2010-01-01

    Content area reading remains a primary concern of history educators. In order to better prepare students for encounters with text, the authors propose the use of two image analysis strategies tied with a historical theme to heighten student interest in historical content and provide a basis for improved reading comprehension.

  10. Expert system for imaging spectrometer analysis results

    NASA Technical Reports Server (NTRS)

    Borchardt, Gary C.

    1985-01-01

    Information on an expert system for imaging spectrometer analysis results is outlined. Implementation requirements, the Simple Tool for Automated Reasoning (STAR) program that provides a software environment for the development and operation of rule-based expert systems, STAR data structures, and rule-based identification of surface materials are among the topics outlined.

  11. COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    EPA Science Inventory


    T. Martonen (1) and J. Schroeter (2)

    (1) Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, U.S. EPA, Research Triangle Park, NC 27711 USA and (2) Curriculum in Toxicology, Unive...

  12. Comparison of Image Quality Criteria between Digital Storage Phosphor Plate in Mammography and Full-Field Digital Mammography in the Detection of Breast Cancer

    PubMed Central

    Thevi Rajendran, Pushpa; Krishnapillai, Vijayalakshmi; Tamanang, Sulaiman; Kumari Chelliah, Kanaga

    2012-01-01

    Background: Digital mammography is slowly replacing screen film mammography. In digital mammography, 2 methods are available in acquiring images: digital storage phosphor plate and full-field digital mammography. The aim of this study was to compare the image quality acquired from the 2 methods of digital mammography in the detection of breast cancer. Methods: The study took place at the National Cancer Society, Kuala Lumpur, and followed 150 asymptomatic women for the duration of 1 year. Participating women gave informed consent and were exposed to 4 views from each system. Two radiologists independently evaluated the printed images based on the image quality criteria in mammography. McNemar's test was used to compare the image quality criteria between the systems. Results: The agreement between the radiologists for the digital storage phosphor plate was κ = 0.551 and for full-field digital mammography was κ = 0.523. Full-field digital mammography was significantly better compared with the digital storage phosphor plate in right and left mediolateral oblique views (P < 0.05) in the detection of microcalcifications, which are early signs of breast cancer. However, both systems were comparable in all other aspects of image quality. Conclusion: Digital mammography is a useful screening tool for the detection of early breast cancer and ensures better prognosis and quality of life. PMID:22977375
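
    McNemar's test on paired reader assessments reduces to the discordant-pair counts; a minimal sketch with invented counts (b = criterion met on full-field digital only, c = met on the storage-phosphor plate only):

        from scipy.stats import chi2

        b, c = 14, 4                                # discordant pairs (hypothetical counts)
        stat = (abs(b - c) - 1) ** 2 / (b + c)      # McNemar statistic, continuity-corrected
        p_value = chi2.sf(stat, df=1)
        print(f"chi2 = {stat:.2f}, p = {p_value:.3f}")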

  13. A multi-criteria analysis approach for ranking and selection of microorganisms for the production of oils for biodiesel production.

    PubMed

    Ahmad, Farah B; Zhang, Zhanying; Doherty, William O S; O'Hara, Ian M

    2015-08-01

    Oleaginous microorganisms have potential to be used to produce oils as alternative feedstock for biodiesel production. Microalgae (Chlorella protothecoides and Chlorella zofingiensis), yeasts (Cryptococcus albidus and Rhodotorula mucilaginosa), and fungi (Aspergillus oryzae and Mucor plumbeus) were investigated for their ability to produce oil from glucose, xylose and glycerol. Multi-criteria analysis (MCA) using analytic hierarchy process (AHP) and preference ranking organization method for the enrichment of evaluations (PROMETHEE) with graphical analysis for interactive aid (GAIA), was used to rank and select the preferred microorganisms for oil production for biodiesel application. This was based on a number of criteria viz., oil concentration, content, production rate and yield, substrate consumption rate, fatty acids composition, biomass harvesting and nutrient costs. PROMETHEE selected A. oryzae, M. plumbeus and R. mucilaginosa as the most prospective species for oil production. However, further analysis by GAIA Webs identified A. oryzae and M. plumbeus as the best performing microorganisms. PMID:25958151

  14. Good relationships between computational image analysis and radiological physics

    SciTech Connect

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-30

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  15. Good relationships between computational image analysis and radiological physics

    NASA Astrophysics Data System (ADS)

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-01

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  16. Evaluating integrated watershed management using multiple criteria analysis--a case study at Chittagong Hill Tracts in Bangladesh.

    PubMed

    Biswas, Shampa; Vacik, Harald; Swanson, Mark E; Haque, S M Sirajul

    2012-05-01

    Criteria and indicator assessment is one way to evaluate management strategies for mountain watersheds. One such framework, Integrated Watershed Management (IWM), was employed in the Chittagong Hill Tracts region of Bangladesh using a multi-criteria analysis approach. The IWM framework, consisting of the design and application of principles, criteria, indicators, and verifiers (PCIV), facilitates active participation by diverse professionals, experts, and interest groups in watershed management, explicitly addressing demands and problems and capturing their complexity in a transparent and understandable way. Management alternatives were developed to fulfill every key component of IWM, considering the developed PCIV set and the current situation of the study area. Different management strategies, each focusing on a different approach (biodiversity conservation, flood control, soil and water quality conservation, indigenous knowledge conservation, income generation, watershed conservation, and landscape conservation), were assessed qualitatively on their potential to improve the current situation according to each verifier of the criteria and indicator set. The Analytic Hierarchy Process (AHP), including sensitivity analysis, was employed to identify an appropriate management strategy according to the overall priorities (i.e., different weights of each principle) of key informants. The AHP process indicated that a strategy focused on conservation of biodiversity provided the best option to address watershed-related challenges in the Chittagong Hill Tracts, Bangladesh. PMID:21674224
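
    The AHP step -- deriving priority weights from pairwise comparisons and checking their consistency -- can be sketched as follows; the 3x3 judgment matrix is invented, and the study's actual matrices are larger:

        import numpy as np

        # A[i, j] = how much more important criterion i is than j (Saaty's 1-9 scale).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                # principal eigenvector = priorities

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
        ri = 0.58                                   # Saaty's random index for n = 3
        print("weights:", np.round(w, 3), f"CR = {ci / ri:.3f}")  # CR < 0.1 is acceptable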

  17. Frequency domain analysis of knock images

    NASA Astrophysics Data System (ADS)

    Qi, Yunliang; He, Xin; Wang, Zhi; Wang, Jianxin

    2014-12-01

    High speed imaging-based knock analysis has mainly focused on time domain information, e.g. the spark triggered flame speed, the time when end gas auto-ignition occurs and the end gas flame speed after auto-ignition. This study presents a frequency domain analysis on the knock images recorded using a high speed camera with direct photography in a rapid compression machine (RCM). To clearly visualize the pressure wave oscillation in the combustion chamber, the images were high-pass-filtered to extract the luminosity oscillation. The luminosity spectrum was then obtained by applying fast Fourier transform (FFT) to three basic colour components (red, green and blue) of the high-pass-filtered images. Compared to the pressure spectrum, the luminosity spectra better identify the resonant modes of pressure wave oscillation. More importantly, the resonant mode shapes can be clearly visualized by reconstructing the images based on the amplitudes of luminosity spectra at the corresponding resonant frequencies, which agree well with the analytical solutions for mode shapes of gas vibration in a cylindrical cavity.
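
    The frequency-domain step -- high-pass filtering a luminosity trace and applying the FFT -- can be sketched on a synthetic signal; the sampling rate, drift and the 6.5 kHz resonance below are invented stand-ins for the RCM data:

        import numpy as np

        fs = 100_000.0                              # camera frame rate (assumed), Hz
        t = np.arange(0, 0.01, 1 / fs)
        luminosity = 1.0 + 0.5 * t + 0.05 * np.sin(2 * np.pi * 6500 * t)  # drift + oscillation

        # Crude high-pass: subtract a moving average to remove the slow flame luminosity.
        kernel = np.ones(64) / 64
        oscillation = luminosity - np.convolve(luminosity, kernel, mode="same")

        spectrum = np.abs(np.fft.rfft(oscillation))
        freqs = np.fft.rfftfreq(len(oscillation), d=1 / fs)
        print(f"dominant oscillation: {freqs[np.argmax(spectrum[1:]) + 1]:.0f} Hz")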

  18. ImageJ: Image processing and analysis in Java

    NASA Astrophysics Data System (ADS)

    Rasband, W. S.

    2012-06-01

    ImageJ is a public domain Java image processing program inspired by NIH Image. It can display, edit, analyze, process, save and print 8-bit, 16-bit and 32-bit images. It can read many image formats including TIFF, GIF, JPEG, BMP, DICOM, FITS and "raw". It supports "stacks", a series of images that share a single window. It is multithreaded, so time-consuming operations such as image file reading can be performed in parallel with other operations.

  19. Multiscale likelihood analysis and image reconstruction

    NASA Astrophysics Data System (ADS)

    Willett, Rebecca M.; Nowak, Robert D.

    2003-11-01

    The nonparametric multiscale polynomial and platelet methods presented here are powerful new tools for signal and image denoising and reconstruction. Unlike traditional wavelet-based multiscale methods, these methods are both well suited to processing Poisson or multinomial data and capable of preserving image edges. At the heart of these new methods lie multiscale signal decompositions based on polynomials in one dimension and multiscale image decompositions based on what the authors call platelets in two dimensions. Platelets are localized functions at various positions, scales and orientations that can produce highly accurate, piecewise linear approximations to images consisting of smooth regions separated by smooth boundaries. Polynomial and platelet-based maximum penalized likelihood methods for signal and image analysis are both tractable and computationally efficient. Polynomial methods offer near minimax convergence rates for broad classes of functions including Besov spaces. Upper bounds on the estimation error are derived using an information-theoretic risk bound based on squared Hellinger loss. Simulations establish the practical effectiveness of these methods in applications such as density estimation, medical imaging, and astronomy.
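
    For reference, the squared Hellinger loss appearing in such risk bounds is a standard quantity; for densities p and q it is (textbook definition, not quoted from the paper):

        H^2(p, q) = \frac{1}{2} \int \left( \sqrt{p(x)} - \sqrt{q(x)} \right)^2 \, dx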

  20. Recent advances in morphological cell image analysis.

    PubMed

    Chen, Shengyong; Zhao, Mingzhu; Wu, Guang; Yao, Chunyan; Zhang, Jianwei

    2012-01-01

    This paper summarizes recent advances in image processing methods for morphological cell analysis. The topic of morphological analysis has received much attention with the increasing demands in both bioinformatics and biomedical applications. Among the many factors that affect the diagnosis of a disease, morphological cell analysis and statistics have made great contributions to a doctor's results and decisions. Morphological cell analysis covers cellular shape, cellular regularity, classification, statistics, diagnosis, and so forth. In the last 20 years, about 1000 publications have reported the use of morphological cell analysis in biomedical research. Relevant solutions encompass a rather wide application area, such as cell clump segmentation, morphological characteristics extraction, 3D reconstruction, abnormal cell identification, and statistical analysis. These reports are summarized in this paper to enable easy referral to suitable methods for practical solutions. Representative contributions and future research trends are also addressed. PMID:22272215

  1. Analysis of imaging system performance capabilities

    NASA Astrophysics Data System (ADS)

    Haim, Harel; Marom, Emanuel

    2013-06-01

    Performance analyses of optical imaging systems based on results obtained with classic one-dimensional (1D) resolution targets (such as the USAF resolution chart) differ significantly from those obtained with a newly proposed 2D target [1]. We hereby prove this claim and show how the novel 2D target should be used for the correct characterization of optical imaging systems in terms of resolution and contrast. We thereafter apply the consequences of these observations to the optimal design of some two-dimensional barcode structures.

  2. A Study of the Comparability of External Criteria for Hierarchical Cluster Analysis.

    PubMed

    Milligan, G W; Cooper, M C

    1986-10-01

    Five external criteria were used to evaluate the extent of recovery of the true structure in a hierarchical clustering solution. This was accomplished by comparing the partitions produced by the clustering algorithm with the partition that indicates the true cluster structure known to exist in the data. The five criteria examined were the Rand, the Morey and Agresti adjusted Rand, the Hubert and Arabie adjusted Rand, the Jaccard, and the Fowlkes and Mallows measures. The results of the study indicated that the Hubert and Arabie adjusted Rand index was best suited to the task of comparison across hierarchy levels. Deficiencies with the other measures are noted. PMID:26828221
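
    The Hubert and Arabie adjusted Rand index is available off the shelf in scikit-learn; a minimal usage example with invented partitions:

        from sklearn.metrics import adjusted_rand_score

        true_partition = [0, 0, 0, 1, 1, 1, 2, 2]   # known cluster structure
        recovered      = [0, 0, 1, 1, 1, 1, 2, 2]   # partition from the dendrogram cut
        print(f"ARI = {adjusted_rand_score(true_partition, recovered):.3f}")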

  3. Autonomous Image Analysis for Future Mars Missions

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Ruzon, M. A.; Bandari, E.; Roush, T. L.

    1999-01-01

    To explore high priority landing sites and to prepare for eventual human exploration, future Mars missions will involve rovers capable of traversing tens of kilometers. However, the current process by which scientists interact with a rover does not scale to such distances. Specifically, numerous command cycles are required to complete even simple tasks, such as, pointing the spectrometer at a variety of nearby rocks. In addition, the time required by scientists to interpret image data before new commands can be given and the limited amount of data that can be downlinked during a given command cycle constrain rover mobility and achievement of science goals. Experience with rover tests on Earth supports these concerns. As a result, traverses to science sites as identified in orbital images would require numerous science command cycles over a period of many weeks, months or even years, perhaps exceeding rover design life and other constraints. Autonomous onboard science analysis can address these problems in two ways. First, it will allow the rover to preferentially transmit "interesting" images, defined as those likely to have higher science content. Second, the rover will be able to anticipate future commands. For example, a rover might autonomously acquire and return spectra of "interesting" rocks along with a high-resolution image of those rocks in addition to returning the context images in which they were detected. Such approaches, coupled with appropriate navigational software, help to address both the data volume and command cycle bottlenecks that limit both rover mobility and science yield. We are developing fast, autonomous algorithms to enable such intelligent on-board decision making by spacecraft. Autonomous algorithms developed to date have the ability to identify rocks and layers in a scene, locate the horizon, and compress multi-spectral image data. We are currently investigating the possibility of reconstructing a 3D surface from a sequence of images

  4. Weighting of Criteria for Disease Prioritization Using Conjoint Analysis and Based on Health Professional and Student Opinion

    PubMed Central

    Stebler, Nadine; Schuepbach-Regula, Gertraud; Braam, Peter; Falzon, Laura Cristina

    2016-01-01

    Disease prioritization exercises have been used by several organizations to inform surveillance and control measures. Though most methodologies for disease prioritization are based on expert opinion, it is becoming more common to include different stakeholders in the prioritization exercise. This study was performed to compare the weighting of disease criteria, and the consequent prioritization of zoonoses, by both health professionals and students in Switzerland using a Conjoint Analysis questionnaire. The health professionals comprised public health and food safety experts, cantonal physicians and cantonal veterinarians, while the student group comprised first-year veterinary and agronomy students. Eight criteria were selected for this prioritization based on expert elicitation and literature review. These criteria, described on a 3-tiered scale, were evaluated through a choice-based Conjoint Analysis questionnaire with 25 choice tasks. Questionnaire results were analyzed to obtain importance scores (for each criterion) and mean utility values (for each criterion level), and the latter were then used to rank 16 zoonoses. While the most important criterion for both groups was “Severity of the disease in humans”, the second ranked criteria by the health professionals and students were “Economy” and “Treatment in humans”, respectively. Regarding the criterion “Control and Prevention”, health professionals tended to prioritize a disease when the control and preventive measures were described to be 95% effective, while students prioritized a disease if there were almost no control and preventive measures available. Bovine Spongiform Encephalopathy was the top-ranked disease by both groups. Health professionals and students agreed on the weighting of certain criteria such as “Severity” and “Treatment of disease in humans”, but disagreed on others such as “Economy” or “Control and Prevention”. Nonetheless, the overall disease ranking

  5. Weighting of Criteria for Disease Prioritization Using Conjoint Analysis and Based on Health Professional and Student Opinion.

    PubMed

    Stebler, Nadine; Schuepbach-Regula, Gertraud; Braam, Peter; Falzon, Laura Cristina

    2016-01-01

    Disease prioritization exercises have been used by several organizations to inform surveillance and control measures. Though most methodologies for disease prioritization are based on expert opinion, it is becoming more common to include different stakeholders in the prioritization exercise. This study was performed to compare the weighting of disease criteria, and the consequent prioritization of zoonoses, by both health professionals and students in Switzerland using a Conjoint Analysis questionnaire. The health professionals comprised public health and food safety experts, cantonal physicians and cantonal veterinarians, while the student group comprised first-year veterinary and agronomy students. Eight criteria were selected for this prioritization based on expert elicitation and literature review. These criteria, described on a 3-tiered scale, were evaluated through a choice-based Conjoint Analysis questionnaire with 25 choice tasks. Questionnaire results were analyzed to obtain importance scores (for each criterion) and mean utility values (for each criterion level), and the latter were then used to rank 16 zoonoses. While the most important criterion for both groups was "Severity of the disease in humans", the second ranked criteria by the health professionals and students were "Economy" and "Treatment in humans", respectively. Regarding the criterion "Control and Prevention", health professionals tended to prioritize a disease when the control and preventive measures were described to be 95% effective, while students prioritized a disease if there were almost no control and preventive measures available. Bovine Spongiform Encephalopathy was the top-ranked disease by both groups. Health professionals and students agreed on the weighting of certain criteria such as "Severity" and "Treatment of disease in humans", but disagreed on others such as "Economy" or "Control and Prevention". Nonetheless, the overall disease ranking lists were similar, and these may be

  6. Morphological analysis of infrared images for waterjets

    NASA Astrophysics Data System (ADS)

    Gong, Yuxin; Long, Aifang

    2013-03-01

    High-speed waterjets have been widely used in industry and investigated as a model of free shearing turbulence. This paper presents an investigation involving the flow visualization of a high-speed water jet; noise reduction of the raw thermogram using a high-pass morphological filter and a median filter; image enhancement using a white top-hat filter; and image segmentation using the multiple-thresholding method. The image processing results of the designed morphological (top-hat) filters proved ideal for further quantitative and in-depth analysis, and the filters can be used as a new morphological filter bank that may be of general use for analogous work.
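
    The named filtering steps map directly onto SciPy's grey-scale morphology; a hedged sketch, in which the window sizes, the synthetic frame and the single threshold standing in for multiple thresholding are all assumptions:

        import numpy as np
        from scipy import ndimage

        frame = np.random.rand(128, 128).astype(np.float32)  # stand-in for one thermogram

        denoised = ndimage.median_filter(frame, size=3)      # impulse-noise suppression
        enhanced = ndimage.white_tophat(denoised, size=15)   # keep bright jet structures
                                                             # smaller than the 15-px window
        # Multiple thresholding (here: a single statistical split as a simple stand-in).
        threshold = enhanced.mean() + 2 * enhanced.std()
        segmented = enhanced > threshold
        print("segmented pixels:", int(segmented.sum()))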

  7. Alzheimer's disease - a neurospirochetosis. Analysis of the evidence following Koch's and Hill's criteria

    PubMed Central

    2011-01-01

    It is established that chronic spirochetal infection can cause slowly progressive dementia, brain atrophy and amyloid deposition in late neurosyphilis. Recently it has been suggested that various types of spirochetes, in an analogous way to Treponema pallidum, could cause dementia and may be involved in the pathogenesis of Alzheimer's disease (AD). Here, we review all data available in the literature on the detection of spirochetes in AD and critically analyze the association and causal relationship between spirochetes and AD following the established criteria of Koch and Hill. The results show a statistically significant association between spirochetes and AD (P = 1.5 × 10⁻¹⁷, OR = 20, 95% CI = 8-60, N = 247). When neutral techniques recognizing all types of spirochetes were used, or the highly prevalent periodontal pathogen Treponemas were analyzed, spirochetes were observed in the brain in more than 90% of AD cases. Borrelia burgdorferi was detected in the brain in 25.3% of the AD cases analyzed and was 13 times more frequent in AD compared to controls. Periodontal pathogen Treponemas (T. pectinovorum, T. amylovorum, T. lecithinolyticum, T. maltophilum, T. medium, T. socranskii) and Borrelia burgdorferi were detected using species-specific PCR and antibodies. Importantly, co-infection with several spirochetes occurs in AD. The pathological and biological hallmarks of AD were reproduced in vitro by exposure of mammalian cells to spirochetes. The analysis of the reviewed data following Koch's and Hill's postulates shows a probable causal relationship between neurospirochetosis and AD. Persisting inflammation and amyloid deposition initiated and sustained by chronic spirochetal infection form, together with the various hypotheses suggested to play a role in the pathogenesis of AD, a comprehensive entity. As suggested by Hill, once the probability of a causal relationship is established, prompt action is needed. Support and attention should be given to this field of AD research

  8. Analysis of Time-Sharing Contract Agreements with Related Suggested Systems Evaluation Criteria.

    ERIC Educational Resources Information Center

    Chanoux, Jo Ann J.

    While avoiding evaluation or specification of individual companies, computer time-sharing commercial contract agreements are analyzed. Price and non-price contract elements are analyzed according to 22 evaluation criteria: confidentiality measures assumed by the vendor; consultation services available; package programs and user routines; languages…

  9. A Study of the Comparability of External Criteria for Hierarchical Cluster Analysis.

    ERIC Educational Resources Information Center

    Milligan, Glenn W.; Cooper, Martha C.

    1986-01-01

    Five external criteria were used to evaluate the extent of recovery of the true structure in a hierarchical clustering solution. The results of the study indicated that the Hubert and Arabie adjusted Rand index was best suited to the task of comparison across hierarchy levels. (Author/LMO)

  10. SENSITIVITY ANALYSIS OF THE APPLICATION OF CHEMICAL EXPOSURE CRITERIA FOR COMPARING SITES AND WATERSHEDS

    EPA Science Inventory

    A methodology was developed for deriving quantitative exposure criteria useful for comparing a site or watershed to a reference condition. The prototype method used indicators of exposures to oil contamination and combustion by-products, naphthalene and benzo(a)pyrene metabolites...

  11. Evaluation of WHO criteria for diagnosis of polycythemia vera: a prospective analysis.

    PubMed

    Silver, Richard T; Chow, William; Orazi, Attilio; Arles, Stephen P; Goldsmith, Stanley J

    2013-09-12

    We prospectively evaluated the accuracy of the 2007 World Health Organization (WHO) criteria for diagnosing polycythemia vera (PV), especially in "early-stage" patients. A total of 28 of 30 patients were diagnosed as PV owing to an elevated Cr-51 red cell mass (RCM), JAK2 positivity, and at least 1 minor criterion. A total of 18 PV patients did not meet the WHO criterion for an increased hemoglobin value and 8 did not meet the WHO criterion for an increased hematocrit value. Bone marrow morphology was very valuable for diagnosis. Low serum erythropoietin (EPO) values were specific for PV, but normal EPO values were found at presentation (20%). We recommend revision of the WHO criteria, especially to distinguish early-stage PV from essential thrombocythemia. Major criteria remain JAK2 positivity and increased red cell volume, but Cr-51 RCM is mandatory for patients who do not meet the defined elevated hemoglobin or hematocrit values (>18.5 g/dL and >60% in men; >16.5 g/dL and >56% in women, respectively). Minor criteria remain bone marrow histology or a low serum EPO value. For patients with a normal EPO value, marrow examination is mandatory for diagnostic confirmation. Because the therapies for myeloproliferative disorders differ, our data have major clinical implications. PMID:23900239

  12. Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox

    PubMed Central

    Lacerda, Luis Miguel; Ferreira, Hugo Alexandre

    2015-01-01

    Aim. In recent years, connectivity studies using neuroimaging data have increased the understanding of the organization of large-scale structural and functional brain networks. However, data analysis is time consuming as rigorous procedures must be assured, from structuring data and pre-processing to modality-specific data procedures. Until now, no single toolbox has been able to perform such investigations on truly multimodal image data from beginning to end, including the combination of different connectivity analyses. Thus, we have developed the Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox with the goal of reducing time wasted in data processing and allowing an innovative and comprehensive approach to brain connectivity. Materials and Methods. The MIBCA toolbox is a fully automated all-in-one connectivity toolbox that offers pre-processing, connectivity and graph theoretical analyses of multimodal image data such as diffusion-weighted imaging, functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). It was developed in the MATLAB environment and pipelines well-known neuroimaging software such as FreeSurfer, SPM, FSL, and Diffusion Toolkit. It further implements routines for the construction of structural, functional and effective or combined connectivity matrices, as well as routines for the extraction and calculation of imaging and graph-theory metrics, the latter also using functions from the Brain Connectivity Toolbox. Finally, the toolbox performs group statistical analysis and enables data visualization in the form of matrices, 3D brain graphs and connectograms. In this paper the MIBCA toolbox is presented by illustrating its capabilities using multimodal image data from a group of 35 healthy subjects (19-73 years old) with volumetric T1-weighted, diffusion tensor imaging, and resting-state fMRI data, and 10 subjects with 18F-Altanserin PET data as well. Results. It was observed both a high inter-hemispheric symmetry

  13. Analysis of Handling Qualities Design Criteria for Active Inceptor Force-Feel Characteristics

    NASA Technical Reports Server (NTRS)

    Malpica, Carlos A.; Lusardi, Jeff A.

    2013-01-01

    ratio. While these two studies produced boundaries for acceptable/unacceptable stick dynamics for rotorcraft, they were not able to provide guidance on how variations of the stick dynamics in the acceptable region impact handling qualities. More recently, a ground-based simulation study [5] suggested little benefit was to be obtained from variations of the damping ratio for a side-stick controller exhibiting high natural frequencies (greater than 17 rad/s) and damping ratios (greater than 2.0). A flight test campaign was conducted concurrently on the RASCAL JUH-60A in-flight simulator and the ACT/FHS EC-135 in-flight simulator [6]. Upon detailed analysis of the pilot evaluations, the study identified a clear preference for a high damping ratio and natural frequency of the center-stick inceptors. Side-stick controllers were found to be less sensitive to the damping. While these studies have compiled a substantial amount of data, in the form of qualitative and quantitative pilot opinion, a fundamental analysis of the effect of the inceptor force-feel system on flight control is found to be lacking. The study of Ref. [6] specifically concluded that a systematic analysis was necessary, since discrepancies with the assigned handling qualities showed that proposed analytical design metrics, or criteria, were not suitable. The overall goal of the present study is to develop a clearer fundamental understanding of the underlying mechanisms associated with the inceptor dynamics that govern the handling qualities, using a manageable analytical methodology.

  14. Pain related inflammation analysis using infrared images

    NASA Astrophysics Data System (ADS)

    Bhowmik, Mrinal Kanti; Bardhan, Shawli; Das, Kakali; Bhattacharjee, Debotosh; Nath, Satyabrata

    2016-05-01

    Medical Infrared Thermography (MIT) offers a potential non-invasive, non-contact and radiation-free imaging modality for the assessment of abnormal, painful inflammation in the human body. The assessment of inflammation mainly depends on the emission of heat from the skin surface. Arthritis is a disease of joint damage that generates inflammation in one or more anatomical joints of the body. Osteoarthritis (OA) is the most frequently occurring form of arthritis, and rheumatoid arthritis (RA) is the most threatening form. In this study, inflammatory analysis has been performed on infrared images of patients suffering from RA and OA. For the analysis, a dataset of 30 bilateral knee thermograms was captured from patients with RA and OA by following a thermogram acquisition standard. The thermograms are pre-processed, and areas of interest are extracted for further processing. The investigation of the spread of inflammation is performed along with the statistical analysis of the pre-processed thermograms. The objectives of the study include: i) generation of a novel thermogram acquisition standard for inflammatory pain diseases; ii) analysis of the spread of the inflammation related to RA and OA using K-means clustering (see the sketch below); iii) first- and second-order statistical analysis of pre-processed thermograms. The conclusion reflects that, in most of the cases, RA-oriented inflammation affects both knees, whereas inflammation related to OA is present in a single knee. Also, due to the spread of inflammation in OA, contralateral asymmetries are detected through the statistical analysis.
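
    A minimal sketch of objective ii), assuming a pre-processed thermogram stored as a 2-D array of temperatures: K-means groups pixels by temperature and the hottest cluster is taken as the candidate inflamed region. Function and parameter names are illustrative, not the authors' code.

        import numpy as np
        from sklearn.cluster import KMeans

        def segment_inflammation(thermogram, n_clusters=3):
            """Label each pixel by temperature cluster; the hottest cluster
            is a candidate inflamed region."""
            pixels = thermogram.reshape(-1, 1).astype(float)
            km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(pixels)
            labels = km.labels_.reshape(thermogram.shape)
            hottest = int(np.argmax(km.cluster_centers_.ravel()))
            return labels, labels == hottest

        # synthetic stand-in for a real knee thermogram
        thermo = np.random.default_rng(0).normal(33.0, 0.5, (120, 160))
        thermo[40:80, 60:110] += 2.0          # warmer patch mimicking inflammation
        labels, mask = segment_inflammation(thermo)
        print("inflamed-area fraction:", mask.mean())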

  15. Image analysis for measuring rod network properties

    NASA Astrophysics Data System (ADS)

    Kim, Dongjae; Choi, Jungkyu; Nam, Jaewook

    2015-12-01

    In recent years, metallic nanowires have been attracting significant attention as next-generation flexible transparent conductive films. The performance of such films depends on the network structure created by the nanowires. Gaining an understanding of their structure, such as the connectivity, coverage, and alignment of nanowires, requires knowledge of the individual nanowires inside microscopic images taken from the film. Although nanowires are flexible up to a certain extent, they are usually depicted as rigid rods in many analytical and computational studies. Herein, we propose a simple and straightforward algorithm based on filtering in the frequency domain for detecting rod-shaped objects inside binary images. The proposed algorithm uses a specially designed filter in the frequency domain to detect image segments, namely, connected components aligned in a certain direction. Those components are post-processed and combined, under a given merging rule, into a single rod object. In this study, the microscopic properties of the rod networks relevant to the opto-electric performance of transparent conductive films were measured: the alignment distribution, length distribution, and area fraction. To verify and find the optimum parameters for the proposed algorithm, numerical experiments were performed on synthetic images with predefined properties. By selecting proper parameters, the algorithm was used to investigate silver nanowire transparent conductive films fabricated by the dip coating method.
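
    The paper's filter design is not given in the abstract; the following is a hedged sketch of the general idea only: orientation-selective filtering in the frequency domain, keeping the angular band of the spectrum perpendicular to the target rod direction.

        import numpy as np

        def directional_filter(binary_img, theta, half_width_deg=10):
            """Keep spatial frequencies within an angular band, which emphasizes
            connected components aligned along direction `theta` (radians)."""
            f = np.fft.fftshift(np.fft.fft2(binary_img.astype(float)))
            h, w = binary_img.shape
            v, u = np.mgrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
            ang = np.arctan2(v, u)  # orientation of each frequency component
            # a rod at angle theta concentrates spectral energy perpendicular
            # to theta, so select that angular band (modulo pi)
            target = theta + np.pi / 2
            diff = np.angle(np.exp(2j * (ang - target))) / 2
            mask = np.abs(diff) < np.deg2rad(half_width_deg)
            out = np.fft.ifft2(np.fft.ifftshift(f * mask)).real
            return out > 0.5 * out.max()  # crude threshold on the response

        # toy check: a horizontal bar should survive a theta = 0 filter
        img = np.zeros((128, 128))
        img[60:64, 30:100] = 1.0
        print("kept pixels:", int(directional_filter(img, theta=0.0).sum()))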

  16. Scalable histopathological image analysis via active learning.

    PubMed

    Zhu, Yan; Zhang, Shaoting; Liu, Wei; Metaxas, Dimitris N

    2014-01-01

    Training an effective and scalable system for medical image analysis usually requires a large amount of labeled data, which incurs a tremendous annotation burden for pathologists. Recent progress in active learning can alleviate this issue, leading to a great reduction in the labeling cost without sacrificing prediction accuracy too much. However, most existing active learning methods disregard the "structured information" that may exist in medical images (e.g., data from individual patients), and make the simplifying assumption that unlabeled data are independently and identically distributed. Both assumptions may be unsuitable for real-world medical images. In this paper, we propose a novel batch-mode active learning method which explores and leverages such structured information in annotations of medical images to enforce diversity among the selected data, thereby maximizing the information gain. We formulate the active learning problem as an adaptive submodular function maximization problem subject to a partition matroid constraint, and further present an efficient greedy algorithm to achieve a good solution with a theoretically proven bound. We demonstrate the efficacy of our algorithm on thousands of histopathological images of breast microscopic tissues. PMID:25320821
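
    A simplified sketch of greedy selection under a partition matroid, with per-sample model uncertainty standing in for the submodular gain and patients as the partition (at most one pick per patient); the authors' objective and theoretical guarantee are more elaborate.

        import numpy as np

        def greedy_batch(uncertainty, groups, budget, per_group=1):
            """Pick the most uncertain samples while respecting the matroid
            constraint of at most `per_group` picks per patient group."""
            order = np.argsort(-uncertainty)           # most uncertain first
            taken, used = [], {}
            for i in order:
                g = int(groups[i])
                if used.get(g, 0) < per_group:         # matroid feasibility check
                    taken.append(int(i))
                    used[g] = used.get(g, 0) + 1
                if len(taken) == budget:
                    break
            return taken

        unc = np.array([0.9, 0.8, 0.7, 0.6, 0.5])      # model uncertainty per sample
        grp = np.array([0, 0, 1, 1, 2])                # patient id per sample
        print(greedy_batch(unc, grp, budget=3))        # -> [0, 2, 4]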

  17. Multiresolution simulated annealing for brain image analysis

    NASA Astrophysics Data System (ADS)

    Loncaric, Sven; Majcenic, Zoran

    1999-05-01

    Analysis of biomedical images is an important step in the quantification of various diseases such as human spontaneous intracerebral brain hemorrhage (ICH). In particular, the study of outcome in patients having ICH requires measurements of various ICH parameters, such as hemorrhage volume, and their change over time. A multiresolution probabilistic approach for the segmentation of CT head images is presented in this work. This method views segmentation as a pixel labeling problem; in this application the labels are: background, skull, brain tissue, and ICH. The proposed method is based on the Maximum A-Posteriori (MAP) estimation of the unknown pixel labels. The MAP method maximizes the a-posteriori probability of the segmented image given the observed (input) image. A Markov random field (MRF) model has been used for the posterior distribution. The MAP estimate of the segmented image is determined using the simulated annealing (SA) algorithm, which minimizes the energy function associated with the MRF posterior distribution. A multiresolution SA (MSA) has been developed to speed up the annealing process and is presented in detail in this work. A knowledge-based classification based on brightness, size, shape and relative position toward other regions is performed at the end of the procedure, identifying the regions as background, skull, brain, ICH and calcifications.
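
    A toy version of the approach under strong simplifications (known class means, a single resolution, a Potts-style prior): MAP labeling minimized by single-site simulated annealing with a geometric cooling schedule. All parameters are illustrative, not the paper's.

        import numpy as np

        rng = np.random.default_rng(0)

        def energy_delta(img, lab, y, x, new, means, beta=1.5):
            """Change in MRF posterior energy if pixel (y, x) switches to `new`."""
            old = lab[y, x]
            d = (img[y, x] - means[new]) ** 2 - (img[y, x] - means[old]) ** 2
            h, w = img.shape
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                yy, xx = y + dy, x + dx
                if 0 <= yy < h and 0 <= xx < w:  # Potts smoothness term
                    d += beta * (float(lab[yy, xx] != new) - float(lab[yy, xx] != old))
            return d

        def anneal(img, means, sweeps=50, t0=4.0, cool=0.9):
            lab = rng.integers(0, len(means), img.shape)
            t = t0
            for _ in range(sweeps):
                for y in range(img.shape[0]):
                    for x in range(img.shape[1]):
                        new = int(rng.integers(0, len(means)))
                        dE = energy_delta(img, lab, y, x, new, means)
                        if dE < 0 or rng.random() < np.exp(-dE / t):
                            lab[y, x] = new
                t *= cool  # geometric cooling schedule
            return lab

        truth = np.kron(np.array([[0, 1], [1, 0]]), np.ones((12, 12), int))
        noisy = truth + rng.normal(0.0, 0.3, truth.shape)
        seg = anneal(noisy, means=np.array([0.0, 1.0]))
        print("pixel agreement with truth:", (seg == truth).mean())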

  18. The synthesis and analysis of color images

    NASA Technical Reports Server (NTRS)

    Wandell, Brian A.

    1987-01-01

    A method is described for performing the synthesis and analysis of digital color images. The method is based on two principles. First, image data are represented with respect to the separate physical factors, surface reflectance and the spectral power distribution of the ambient light, that give rise to the perceived color of an object. Second, the encoding is made efficient by using a basis expansion for the surface spectral reflectance and spectral power distribution of the ambient light that takes advantage of the high degree of correlation across the visible wavelengths normally found in such functions. Within this framework, the same basic methods can be used to synthesize image data for color display monitors and printed materials, and to analyze image data into estimates of the spectral power distribution and surface spectral reflectances. The method can be applied to a variety of tasks. Examples of applications include the color balancing of color images, and the identification of material surface spectral reflectance when the lighting cannot be completely controlled.

  19. The synthesis and analysis of color images

    NASA Technical Reports Server (NTRS)

    Wandell, B. A.

    1985-01-01

    A method is described for performing the synthesis and analysis of digital color images. The method is based on two principles. First, image data are represented with respect to the separate physical factors, surface reflectance and the spectral power distribution of the ambient light, that give rise to the perceived color of an object. Second, the encoding is made efficient by using a basis expansion for the surface spectral reflectance and spectral power distribution of the ambient light that takes advantage of the high degree of correlation across the visible wavelengths normally found in such functions. Within this framework, the same basic methods can be used to synthesize image data for color display monitors and printed materials, and to analyze image data into estimates of the spectral power distribution and surface spectral reflectances. The method can be applied to a variety of tasks. Examples of applications include the color balancing of color images, and the identification of material surface spectral reflectance when the lighting cannot be completely controlled.
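
    A small sketch of the basis-expansion principle described above: fit a three-function linear basis to a family of smooth reflectance spectra via SVD, so each spectrum is encoded and reconstructed from three coefficients. The spectra here are synthetic stand-ins.

        import numpy as np

        rng = np.random.default_rng(1)
        wavelengths = np.linspace(400, 700, 31)  # visible range, 10 nm steps
        # smooth synthetic reflectances: sums of broad Gaussian bumps
        centers = rng.uniform(420.0, 680.0, (100, 3))
        spectra = np.array([
            sum(np.exp(-((wavelengths - c) / 80.0) ** 2) for c in row)
            for row in centers
        ])

        mean = spectra.mean(axis=0)
        _, _, Vt = np.linalg.svd(spectra - mean, full_matrices=False)
        basis = Vt[:3]  # three basis functions, exploiting spectral correlation

        coeffs = (spectra - mean) @ basis.T  # analysis: spectrum -> 3 numbers
        recon = mean + coeffs @ basis        # synthesis: 3 numbers -> spectrum
        print("max reconstruction error:", np.abs(recon - spectra).max())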

  20. Applying Multiple Criteria Decision Analysis to Comparative Benefit-Risk Assessment: Choosing among Statins in Primary Prevention.

    PubMed

    Tervonen, Tommi; Naci, Huseyin; van Valkenhoef, Gert; Ades, Anthony E; Angelis, Aris; Hillege, Hans L; Postmus, Douwe

    2015-10-01

    Decision makers in different health care settings need to weigh the benefits and harms of alternative treatment strategies. Such health care decisions include marketing authorization by regulatory agencies, practice guideline formulation by clinical groups, and treatment selection by prescribers and patients in clinical practice. Multiple criteria decision analysis (MCDA) is a family of formal methods that help make explicit the tradeoffs that decision makers accept between the benefit and risk outcomes of different treatment options. Despite the recent interest in MCDA, certain methodological aspects are poorly understood. This paper presents 7 guidelines for applying MCDA in benefit-risk assessment and illustrates their use in the selection of a statin drug for the primary prevention of cardiovascular disease. We provide guidance on the key methodological issues of how to define the decision problem, how to select a set of nonoverlapping evaluation criteria, how to synthesize and summarize the evidence, how to translate relative measures to absolute ones that permit comparisons between the criteria, how to define suitable scale ranges, how to elicit partial preference information from the decision makers, and how to incorporate uncertainty in the analysis. Our example on statins indicates that fluvastatin is likely to be the most preferred drug by our decision maker and that this result is insensitive to the amount of preference information incorporated in the analysis. PMID:25986470

  1. Quantitative image analysis of celiac disease

    PubMed Central

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the use of quantitative techniques that are currently employed for the analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  2. Evaluation of 3D multimodality image registration using receiver operating characteristic (ROC) analysis

    NASA Astrophysics Data System (ADS)

    Holton Tainter, Kerrie S.; Robb, Richard A.; Taneja, Udita; Gray, Joel E.

    1995-04-01

    Receiver operating characteristic (ROC) analysis has evolved as a useful method for evaluating discriminatory capability and the efficacy of visualization. The ability of such analysis to account for the variance in decision criteria of multiple observers, multiple readings, and a wide range of difficulty in detection among case studies makes ROC especially useful for interpreting the results of a viewing experiment. We are currently using ROC analysis to evaluate the effectiveness of using fused multispectral, or complementary multimodality, imaging data in the diagnostic process. The use of multispectral image recordings, gathered from multiple imaging modalities, to provide advanced image visualization and quantification capabilities in evaluating medical images is an important challenge facing medical imaging scientists. Such capabilities would potentially significantly enhance the ability of clinicians to extract scientific and diagnostic information from images. A first step in the effective use of multispectral information is the spatial registration of complementary image datasets so that a point-to-point correspondence exists between them. We are developing a paradigm for measuring the accuracy of existing image registration techniques which includes the ability to relate quantitative measurements, taken from the images themselves, to the decisions made by observers about the state of registration (SOR) of the 3D images. We have used ROC analysis to evaluate the ability of observers to discriminate between correctly registered and incorrectly registered multimodality fused images. We believe this experience is original and represents the first time that ROC analysis has been used to evaluate registered/fused images. We have simulated low-resolution and high-resolution images from real patient MR images of the brain, and fused them with the original MR to produce colorwash superposition images whose exact SOR is known. We have also attempted to extend this analysis to

  3. Imaging Brain Dynamics Using Independent Component Analysis

    PubMed Central

    Jung, Tzyy-Ping; Makeig, Scott; McKeown, Martin J.; Bell, Anthony J.; Lee, Te-Won; Sejnowski, Terrence J.

    2010-01-01

    The analysis of electroencephalographic (EEG) and magnetoencephalographic (MEG) recordings is important both for basic brain research and for medical diagnosis and treatment. Independent component analysis (ICA) is an effective method for removing artifacts and separating sources of the brain signals from these recordings. A similar approach is proving useful for analyzing functional magnetic resonance brain imaging (fMRI) data. In this paper, we outline the assumptions underlying ICA and demonstrate its application to a variety of electrical and hemodynamic recordings from the human brain. PMID:20824156

  4. Criteria for the use of regression analysis for remote sensing of sediment and pollutants

    NASA Technical Reports Server (NTRS)

    Whitlock, C. H.; Kuo, C. Y.; Lecroy, S. R.

    1982-01-01

    An examination of the limitations, requirements, and precision of the linear multiple-regression technique for quantification of marine environmental parameters is conducted. Both environmental and optical physics conditions have been defined for which an exact solution to the signal response equations is of the same form as the multiple regression equation. Various statistical parameters are examined to define criteria for selection of an unbiased fit when upwelled radiance values contain error and are correlated with each other. Field experimental data are examined to define data smoothing requirements in order to satisfy the criteria of Daniel and Wood (1971). Recommendations are made concerning improved selection of ground-truth locations to maximize variance and to minimize physical errors associated with the remote sensing experiment.
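
    A minimal numerical illustration of the underlying linear multiple-regression step: estimating a marine parameter (e.g., sediment concentration) from upwelled radiance in several bands by ordinary least squares. All data below are synthetic placeholders.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 40
        radiance = rng.uniform(0.0, 1.0, (n, 4))            # 4 spectral bands
        true_beta = np.array([5.0, 12.0, -3.0, 8.0, 1.5])   # intercept + 4 slopes
        X = np.column_stack([np.ones(n), radiance])
        y = X @ true_beta + rng.normal(0, 0.2, n)           # noisy "ground truth"

        beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)    # least-squares fit
        resid = y - X @ beta_hat
        r2 = 1 - resid.var() / y.var()
        print("estimated coefficients:", np.round(beta_hat, 2), "R^2:", round(r2, 3))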

  5. Effect analysis and design on array geometry for coincidence imaging radar based on effective rank theory

    NASA Astrophysics Data System (ADS)

    Zha, Guofeng; Wang, Hongqiang; Yang, Zhaocheng; Cheng, Yongqiang; Qin, Yuliang

    2016-03-01

    As a complementary imaging technology, coincidence imaging radar (CIR) achieves super-resolution in real-aperture staring radar imagery by employing temporal-spatial independent array detecting (TSIAD) signals. The characteristics of TSIAD signals are affected by the array geometry, and the imaging performance is influenced by the imaging position relative to the antenna array. In this paper, the effect of array geometry on the CIR system is investigated in detail based on the judgment criteria of effective rank theory. In the course of analyzing these influences, useful guidance on the design of the array geometry is provided for the CIR system. With this design guidance, target images are reconstructed using the Tikhonov regularization algorithm. Simulation results are presented to validate the analysis and the efficiency of the design guidance.
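
    A generic sketch of Tikhonov-regularized reconstruction, minimizing ||Ax - y||^2 + lam*||x||^2; the random matrix A is only a placeholder for the CIR measurement model, which the abstract does not specify.

        import numpy as np

        rng = np.random.default_rng(3)
        m, n = 80, 100                       # fewer measurements than unknowns
        A = rng.normal(size=(m, n))          # stand-in for the array response matrix
        x_true = np.zeros(n)
        x_true[::10] = 1.0                   # sparse scene of point targets
        y = A @ x_true + rng.normal(0, 0.05, m)

        lam = 1.0                            # regularization strength
        x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
        print("relative error:",
              np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))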

  6. Optimal site selection for siting a solar park using multi-criteria decision analysis and geographical information systems

    NASA Astrophysics Data System (ADS)

    Georgiou, Andreas; Skarlatos, Dimitrios

    2016-07-01

    Among the renewable power sources, solar power is rapidly becoming popular because it is inexhaustible, clean, and dependable. It has also become more efficient since the power conversion efficiency of photovoltaic solar cells has increased. Following these trends, solar power will become more affordable in years to come and considerable investments are to be expected. Despite the size of solar plants, the siting procedure is a crucial factor for their efficiency and financial viability. Many aspects influence such a decision: legal, environmental, technical, and financial, to name a few. This paper describes a general integrated framework to evaluate land suitability for the optimal placement of photovoltaic solar power plants, based on a combination of a geographic information system (GIS), remote sensing techniques, and multi-criteria decision-making methods. An application of the proposed framework to the Limassol district in Cyprus is further illustrated. The combination of a GIS and multi-criteria methods produces an excellent analysis tool that creates an extensive database of spatial and non-spatial data, which is used to simplify and solve problems involving multiple criteria. A set of environmental, economic, social, and technical constraints, based on recent Cypriot legislation, European Union policies, and expert advice, identifies the potential sites for solar park installation. The pairwise comparison method in the context of the analytic hierarchy process (AHP) is applied to estimate the criteria weights and establish their relative importance in site evaluation. In addition, four different methods to combine information layers and check their sensitivity were used. The first considered all the criteria as being equally important and assigned them equal weight, whereas the others grouped the criteria and graded them according to their perceived importance. The overall suitability of the study
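
    A compact sketch of the AHP pairwise-comparison step: the principal eigenvector of the comparison matrix yields the criteria weights, and the consistency ratio checks the judgments. The 3x3 matrix and the criterion names in the comment are invented for illustration.

        import numpy as np

        # Saaty-scale judgments for three hypothetical criteria, e.g.
        # irradiance vs. slope vs. distance to grid (values are invented)
        P = np.array([[1.0,   3.0,   5.0],
                      [1/3.0, 1.0,   2.0],
                      [1/5.0, 1/2.0, 1.0]])

        vals, vecs = np.linalg.eig(P)
        k = int(np.argmax(vals.real))        # principal eigenvalue
        w = np.abs(vecs[:, k].real)
        w /= w.sum()                         # normalized criteria weights

        n = P.shape[0]
        ci = (vals.real[k] - n) / (n - 1)    # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
        print("weights:", np.round(w, 3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable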

  7. F-106 data summary and model results relative to threat criteria and protection design analysis

    NASA Technical Reports Server (NTRS)

    Pitts, F. L.; Finelli, G. B.; Perala, R. A.; Rudolph, T. H.

    1986-01-01

    The NASA F-106 has acquired considerable data on the rates-of-change of EM parameters on the aircraft surface during 690 direct lightning strikes while penetrating thunderstorms at altitudes from 15,000 to 40,000 feet. The data are presently being used in updating previous lightning criteria and standards. The new lightning standards will, therefore, be the first which reflect actual aircraft responses measured at flight altitudes.

  8. Extended Tabu Search on Fuzzy Traveling Salesman Problem in Multi-criteria Analysis

    NASA Astrophysics Data System (ADS)

    Zheng, Yujun

    The paper proposes an extended tabu search algorithm for the traveling salesman problem (TSP) with fuzzy edge weights. The algorithm considers three important fuzzy ranking criteria including expected value, optimistic value and pessimistic value, and performs a three-stage search towards the Pareto front, involving a preferred criterion at each stage. Simulations demonstrate that our approach can produce a set of near optimal solutions for fuzzy TSP instances with up to 750 uniformly randomly generated nodes.

  9. GIS analysis of the siting criteria for the Mixed and Low-Level Waste Treatment Facility and the Idaho Waste Processing Facility

    SciTech Connect

    Hoskinson, R.L.

    1994-01-01

    This report summarizes a study conducted using the Arc/Info® geographic information system (GIS) to analyze the criteria used for site selection for the Mixed and Low-Level Waste Treatment Facility (MLLWTF) and the Idaho Waste Processing Facility (IWPF). The purpose of the analyses was to determine, based on predefined criteria, the areas on the INEL that best satisfied the criteria. The coverages used in this study were produced by importing into the GIS the AutoCAD files that produced the maps for a pre-site-selection draft report. The files were then converted to Arc/Info® GIS format. The initial analysis considered all of the criteria as having equal importance in determining the areas of the INEL that would best satisfy the requirements. Another analysis emphasized four of the criteria as "must" criteria which had to be satisfied. Additional analyses considered other criteria that were considered for, but not included in, the predefined criteria. This GIS analysis of the siting criteria for the IWPF and MLLWTF provides a logical, repeatable, and defensible approach to the determination of candidate locations for the facilities. The results of the analyses support the selection of the candidate locations.

  10. System analysis approach to deriving design criteria (loads) for Space Shuttle and its payloads. Volume 1: General statement of approach

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.; Bullock, T.; Holland, W. B.; Kross, D. A.; Kiefling, L. A.

    1981-01-01

    The Space Shuttle, the most complex transportation system designed to date, illustrates the requirement for an analysis approach that considers all major disciplines simultaneously. Its unique cross coupling, high sensitivity to aerodynamic uncertainties, and high performance requirements dictated a less conservative approach than those taken in previous programs. Analyses performed for the Space Shuttle and certain payloads, Space Telescope and Spacelab, are used as examples. These illustrate the requirements for system analysis approaches and criteria, including dynamic modeling requirements, test requirements, control requirements, and the resulting design verification approaches. A survey of the problem, potential approaches available as solutions, implications for future systems, and projected technology development areas are addressed.

  11. System analysis approach to deriving design criteria (Loads) for Space Shuttle and its payloads. Volume 2: Typical examples

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.; Bullock, T.; Holland, W. B.; Kross, D. A.; Kiefling, L. A.

    1981-01-01

    The achievement of an optimized design from the system standpoint, under the low-cost, high-risk constraints of the present-day environment, was analyzed. The Space Shuttle illustrates the requirement for an analysis approach that considers all major disciplines (coupling between structures, control, propulsion, thermal, aeroelastic, and performance effects) simultaneously. The Space Shuttle and certain payloads, Space Telescope and Spacelab, are examined. The requirements for system analysis approaches and criteria, including dynamic modeling requirements, test requirements, control requirements, and the resulting design verification approaches, are illustrated. A survey of the problem, potential approaches available as solutions, implications for future systems, and projected technology development areas are addressed.

  12. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    PubMed

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted to assure the good quality of waters, soils or sediments and to achieve desirable environmental quality objectives. Evaluating the reliability of available data is therefore of significant importance when analysing their possible use in the aforementioned processes. A thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws, but at the same time various opportunities for improvement. In this context, a new methodology, based on Multi-Criteria Decision Analysis (MCDA) techniques, has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects) in a transparent, quantitative way, through the use of expert knowledge, multiple criteria and fuzzy logic. The proposed methodology can be used for the production of weighted Species Sensitivity Weighted Distributions (SSWD), as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework that is used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but can easily be tailored to other environmental compartments (soil, air, sediments). PMID:26298253

  13. Image analysis of Renaissance copperplate prints

    NASA Astrophysics Data System (ADS)

    Hedges, S. Blair

    2008-02-01

    From the fifteenth to the nineteenth centuries, prints were a common form of visual communication, analogous to photographs. Copperplate prints have many finely engraved black lines which were used to create the illusion of continuous tone. Line densities generally are 100-2000 lines per square centimeter and a print can contain more than a million total engraved lines 20-300 micrometers in width. Because hundreds to thousands of prints were made from a single copperplate over decades, variation among prints can have historical value. The largest variation is plate-related, which is the thinning of lines over successive editions as a result of plate polishing to remove time-accumulated corrosion. Thinning can be quantified with image analysis and used to date undated prints and books containing prints. Print-related variation, such as over-inking of the print, is a smaller but significant source. Image-related variation can introduce bias if images were differentially illuminated or not in focus, but improved imaging technology can limit this variation. The Print Index, the percentage of an area composed of lines, is proposed as a primary measure of variation. Statistical methods also are proposed for comparing and identifying prints in the context of a print database.
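
    A direct reading of the proposed Print Index as code: the percentage of a sampled region covered by engraved lines, computed from a thresholded grayscale image. The region is synthetic and the threshold value is an assumption, not the author's calibration.

        import numpy as np

        def print_index(gray_region, threshold=128):
            """Percent of pixels darker than `threshold` (i.e., ink lines)."""
            lines = gray_region < threshold
            return 100.0 * lines.mean()

        region = np.full((200, 200), 220, dtype=np.uint8)  # bright paper
        region[::5, :] = 30                  # every 5th row is an engraved line
        print("Print Index: %.1f%%" % print_index(region))   # ~20%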

  14. Multispectral laser imaging for advanced food analysis

    NASA Astrophysics Data System (ADS)

    Senni, L.; Burrascano, P.; Ricci, M.

    2016-07-01

    A hardware-software apparatus for food inspection capable of realizing multispectral NIR laser imaging at four different wavelengths is herein discussed. The system was designed to operate in a through-transmission configuration to detect the presence of unwanted foreign bodies inside samples, whether packed or unpacked. A modified Lock-In technique was employed to counterbalance the significant signal intensity attenuation due to transmission across the sample and to extract the multispectral information more efficiently. The NIR laser wavelengths used to acquire the multispectral images can be varied to deal with different materials and to focus on specific aspects. In the present work the wavelengths were selected after a preliminary analysis to enhance the image contrast between foreign bodies and food in the sample, thus identifying the location and nature of the defects. Experimental results obtained from several specimens, with and without packaging, are presented and the multispectral image processing as well as the achievable spatial resolution of the system are discussed.
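
    The paper's modified Lock-In technique is not detailed in the abstract; the sketch below shows the conventional digital lock-in it builds on: multiply the heavily attenuated, noisy detector signal by quadrature references at the modulation frequency and average to low-pass. All signal parameters are invented.

        import numpy as np

        fs, f_mod = 50_000.0, 1_000.0           # sampling and modulation rates
        t = np.arange(0, 0.5, 1 / fs)           # 0.5 s record, integer cycles
        amp_true = 0.02                         # weak transmitted NIR signal
        rng = np.random.default_rng(4)
        signal = amp_true * np.sin(2 * np.pi * f_mod * t + 0.3)
        signal += rng.normal(0.0, 0.2, t.size)  # heavy broadband noise

        ref_i = np.sin(2 * np.pi * f_mod * t)   # in-phase reference
        ref_q = np.cos(2 * np.pi * f_mod * t)   # quadrature reference
        x = np.mean(signal * ref_i)             # averaging acts as a low-pass
        y = np.mean(signal * ref_q)
        amp = 2.0 * np.hypot(x, y)              # recovered modulation amplitude
        print("recovered amplitude: %.4f (true %.4f)" % (amp, amp_true))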

  15. ACR Appropriateness Criteria Myelopathy.

    PubMed

    Roth, Christopher J; Angevine, Peter D; Aulino, Joseph M; Berger, Kevin L; Choudhri, Asim F; Fries, Ian Blair; Holly, Langston T; Kendi, Ayse Tuba Karaqulle; Kessler, Marcus M; Kirsch, Claudia F; Luttrull, Michael D; Mechtler, Laszlo L; O'Toole, John E; Sharma, Aseem; Shetty, Vilaas S; West, O Clark; Cornelius, Rebecca S; Bykowski, Julie

    2016-01-01

    Patients presenting with myelopathic symptoms may have a number of causative intradural and extradural etiologies, including disc degenerative diseases, spinal masses, infectious or inflammatory processes, vascular compromise, and vertebral fracture. Patients may present acutely or insidiously and may progress toward long-term paralysis if not treated promptly and effectively. Noncontrast CT is the most appropriate first examination in acute trauma cases to diagnose vertebral fracture as the cause of acute myelopathy. In most nontraumatic cases, MRI is the modality of choice to evaluate the location, severity, and causative etiology of spinal cord myelopathy, and predicts which patients may benefit from surgery. Myelopathy from spinal stenosis and spinal osteoarthritis is best confirmed without MRI intravenous contrast. Many other myelopathic conditions are more easily visualized after contrast administration. Imaging performed should be limited to the appropriate spinal levels, based on history, physical examination, and clinical judgment. The ACR Appropriateness Criteria are evidence-based guidelines for specific clinical conditions that are reviewed every three years by a multidisciplinary expert panel. The guideline development and review include an extensive analysis of current medical literature from peer-reviewed journals, and the application of a well-established consensus methodology (modified Delphi) to rate the appropriateness of imaging and treatment procedures by the panel. In those instances in which evidence is lacking or not definitive, expert opinion may be used to recommend imaging or treatment. PMID:26653797

  16. Nursing image: an evolutionary concept analysis.

    PubMed

    Rezaei-Adaryani, Morteza; Salsali, Mahvash; Mohammadi, Eesa

    2012-12-01

    A long-term challenge to the nursing profession is the concept of image. In this study, we used the Rodgers' evolutionary concept analysis approach to analyze the concept of nursing image (NI). The aim of this concept analysis was to clarify the attributes, antecedents, consequences, and implications associated with the concept. We performed an integrative internet-based literature review to retrieve English literature published from 1980-2011. Findings showed that NI is a multidimensional, all-inclusive, paradoxical, dynamic, and complex concept. The media, invisibility, clothing style, nurses' behaviors, gender issues, and professional organizations are the most important antecedents of the concept. We found that NI is pivotal in staff recruitment and nursing shortage, resource allocation to nursing, nurses' job performance, workload, burnout and job dissatisfaction, violence against nurses, public trust, and salaries available to nurses. An in-depth understanding of the NI concept would assist nurses to eliminate negative stereotypes and build a more professional image for the nurse and the profession. PMID:23343236

  17. Shannon information and ROC analysis in imaging.

    PubMed

    Clarkson, Eric; Cushing, Johnathan B

    2015-07-01

    Shannon information (SI) and the ideal-observer receiver operating characteristic (ROC) curve are two different methods for analyzing the performance of an imaging system for a binary classification task, such as the detection of a variable signal embedded within a random background. In this work we describe a new ROC curve, the Shannon information receiver operator curve (SIROC), that is derived from the SI expression for a binary classification task. We then show that the ideal-observer ROC curve and the SIROC have many properties in common, and are equivalent descriptions of the optimal performance of an observer on the task. This equivalence is described mathematically by an integral transform that maps the ideal-observer ROC curve onto the SIROC. This then leads to an integral transform relating the minimum probability of error, as a function of the odds against a signal, to the conditional entropy, as a function of the same variable. This last relation then gives us the complete mathematical equivalence between ideal-observer ROC analysis and SI analysis of the classification task for a given imaging system. We also find that there is a close relationship between the area under the ideal-observer ROC curve, which is often used as a figure of merit for imaging systems, and the area under the SIROC. Finally, we show that the relationships between the two curves result in new inequalities relating SI to ROC quantities for the ideal observer. PMID:26367158
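
    A numerical sketch of the ROC quantities discussed above: build the ROC of a known two-Gaussian detection task by sweeping the decision threshold, then integrate it for the area under the curve. The distributions are illustrative, not from the paper.

        import numpy as np

        rng = np.random.default_rng(5)
        absent = rng.normal(0.0, 1.0, 100_000)    # test statistic, signal absent
        present = rng.normal(1.0, 1.0, 100_000)   # test statistic, signal present

        taus = np.linspace(-5.0, 6.0, 400)        # sweep of decision thresholds
        fpf = np.array([(absent > t).mean() for t in taus])   # false-positive fraction
        tpf = np.array([(present > t).mean() for t in taus])  # true-positive fraction
        auc = -np.trapz(tpf, fpf)                 # fpf decreases along the sweep
        print("AUC: %.3f (analytic value ~0.760 for these Gaussians)" % auc)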

  18. Simple Low Level Features for Image Analysis

    NASA Astrophysics Data System (ADS)

    Falcoz, Paolo

    As human beings, we perceive the world around us mainly through our eyes, and give what we see the status of “reality”; as such, we have historically tried to create ways of recording this reality so we could augment or extend our memory. From early attempts at photography, like the image produced in 1826 by the French inventor Nicéphore Niépce (Figure 2.1), to the latest high-definition camcorders, the number of recorded pieces of reality has increased exponentially, posing the problem of managing all that information. Most of the raw video material produced today has lost its memory-augmentation function, as it will hardly ever be viewed by any human; pervasive CCTVs are an example. They generate an enormous amount of data each day, but there is not enough “human processing power” to view them. Therefore the need for effective automatic image analysis tools is great, and a lot of effort has been put into them, both by academia and industry. In this chapter, a review of some of the most important image analysis tools is presented.

  19. Wavelet-based image analysis system for soil texture analysis

    NASA Astrophysics Data System (ADS)

    Sun, Yun; Long, Zhiling; Jang, Ping-Rey; Plodinec, M. John

    2003-05-01

    Soil texture is defined as the relative proportion of clay, silt and sand found in a given soil sample. It is an important physical property of soil that affects such phenomena as plant growth and agricultural fertility. Traditional methods used to determine soil texture are either time consuming (hydrometer), or subjective and experience-demanding (field tactile evaluation). Considering that textural patterns observed at soil surfaces are uniquely associated with soil textures, we propose an innovative approach to soil texture analysis, in which wavelet frames-based features representing texture contents of soil images are extracted and categorized by applying a maximum likelihood criterion. The soil texture analysis system has been tested successfully with an accuracy of 91% in classifying soil samples into one of three general categories of soil textures. In comparison with the common methods, this wavelet-based image analysis approach is convenient, efficient, fast, and objective.
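
    A hedged sketch of the feature step, substituting pywt's decimated wavelet transform for the paper's wavelet frames: subband energies form the feature vector, and an unknown sample is assigned to the nearest class mean, a maximum-likelihood rule under equal-covariance Gaussian assumptions. The textures are synthetic stand-ins.

        import numpy as np
        import pywt

        def texture_features(img, wavelet="db2", level=2):
            """Mean absolute value of each detail subband as a feature vector."""
            coeffs = pywt.wavedec2(img, wavelet, level=level)
            feats = []
            for detail in coeffs[1:]:                  # (cH, cV, cD) per level
                feats.extend(np.mean(np.abs(d)) for d in detail)
            return np.array(feats)

        rng = np.random.default_rng(6)
        coarse = rng.normal(size=(64, 64)).repeat(4, axis=0).repeat(4, axis=1)
        fine = rng.normal(size=(256, 256))             # finer-grained texture
        f_coarse, f_fine = texture_features(coarse), texture_features(fine)

        sample = texture_features(rng.normal(size=(256, 256)))  # unknown sample
        dists = [np.linalg.norm(sample - f) for f in (f_coarse, f_fine)]
        print("classified as:", ["coarse", "fine"][int(np.argmin(dists))])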

  20. PAMS photo image retrieval prototype alternatives analysis

    SciTech Connect

    Conner, M.L.

    1996-04-30

    Photography and Audiovisual Services uses a system called the Photography and Audiovisual Management System (PAMS) to perform order entry and billing services. The PAMS system utilizes Revelation Technologies database management software, AREV. Work is currently in progress to link the PAMS AREV system to a Microsoft SQL Server database engine to provide photograph indexing and query capabilities. The link between AREV and SQL Server will use a technique called "bonding." This photograph imaging subsystem will interface to the PAMS system and handle the image capture and retrieval portions of the project. The intent of this alternatives analysis is to examine the software and hardware alternatives available to meet the requirements for this project, and identify a cost-effective solution.

  1. Uses of software in digital image analysis: a forensic report

    NASA Astrophysics Data System (ADS)

    Sharma, Mukesh; Jha, Shailendra

    2010-02-01

    Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. It has wide applications in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. In this paper the authors explain these tasks, which fall into three categories: image compression, image enhancement and restoration, and measurement extraction. These are illustrated with examples such as signature comparison, counterfeit currency comparison and footwear sole impressions, using the software packages Canvas and Corel Draw.

  2. Analysis of Tank 241-AN-106 characterization and grout performance criteria

    SciTech Connect

    Liebetrau, A.M.; Anderson, C.M.

    1993-02-01

    This report provides an assessment of how well we can resolve the following issues concerning Tank 241-AN-106 at the Hanford Reservation, given the current state of information: how well we can characterize the contents of 241-AN-106, and whether the degree of characterization is sufficient to use 241-AN-106 wastes to develop tests of grout adequacy. The wastes must be characterized not only to ensure grout adequacy but also to provide assurance that the wastes can be successfully and safely transferred. In this report, we evaluate the adequacy of characterization for transfer and for tests of grout adequacy, and we evaluate the current status of acceptance criteria and grout formulation experiments.

  3. Analysis of katabatic flow using infrared imaging

    NASA Astrophysics Data System (ADS)

    Grudzielanek, M.; Cermak, J.

    2013-12-01

    We present a novel high-resolution IR method which is developed, tested and used for the analysis of katabatic flow. Modern thermal imaging systems allow for the recording of infrared picture sequences and thus the monitoring and analysis of dynamic processes. In order to identify, visualize and analyze dynamic air flow using infrared imaging, a highly reactive 'projection' surface is needed along the air flow. Here, a design for this type of analysis is proposed and evaluated. Air flow situations with strong air temperature gradients and fluctuations, such as katabatic flow, are particularly suitable for this new method. The method is applied here to analyze nocturnal cold air flows on gentle slopes. In combination with traditional methods, the vertical and temporal dynamics of cold air flow are analyzed. Several assumptions on cold air flow dynamics can be confirmed explicitly for the first time. By observing the cold air flow in terms of frequency, size and period of the cold air fluctuations, drops are identified and organized in a newly derived classification system of cold air flow phases. In addition, new flow characteristics are detected, like sharp cold air caps and turbulence inside the drops. Vertical temperature gradients inside cold air drops and their temporal evolution are presented in high-resolution Hovmöller-type diagrams and sequenced time-lapse infrared videos.

  4. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis☆

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis, assessing the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of the weights, using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. Comparisons of the obtained landslide susceptibility maps from both MCDA techniques with known landslides show that the AHP outperforms the OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
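
    A minimal sketch of the weight-sensitivity idea: jitter the criteria weights with Monte Carlo draws around a baseline and track how stable each cell's susceptibility rank is. Scores, weights, and the stability tolerance are invented for illustration, not the paper's data.

        import numpy as np

        rng = np.random.default_rng(7)
        scores = rng.uniform(0, 1, (500, 5))        # 500 map cells x 5 criteria
        w0 = np.array([0.35, 0.25, 0.2, 0.1, 0.1])  # baseline (e.g., AHP) weights

        base_rank = np.argsort(np.argsort(-(scores @ w0)))  # rank of each cell
        agree = np.zeros(len(scores))
        n_runs = 1000
        for _ in range(n_runs):
            w = rng.dirichlet(w0 * 50)              # weights jittered around w0
            rank = np.argsort(np.argsort(-(scores @ w)))
            agree += np.abs(rank - base_rank) <= 25  # within 5% of 500 positions
        print("mean rank stability:", (agree / n_runs).mean())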

  5. Skin age testing criteria: characterization of human skin structures by 500 MHz MRI multiple contrast and image processing

    NASA Astrophysics Data System (ADS)

    Sharma, Rakesh

    2010-07-01

    Ex vivo magnetic resonance microimaging (MRM) image characteristics are reported for human skin samples in different age groups. Excised human skin samples were imaged using a custom coil placed inside a 500 MHz NMR imager for high-resolution microimaging. The skin MRI images were processed for characterization of different skin structures. Contiguous cross-sectional T1-weighted 3D spin echo MRI, T2-weighted 3D spin echo MRI and proton density images were compared with skin histopathology and NMR peaks. In all skin specimens, epidermis and dermis thickening and hair follicle size were measured using MRM. Optimized TE and TR parameters and multicontrast enhancement provided better MRI visibility of different skin components. Within high-MR-signal regions near the custom coil, MRI images with short echo time were comparable with digitized histological sections for the skin structures of the epidermis, dermis and hair follicles in 6 (67%) of the nine specimens. Skin percent tissue composition; measurements of the epidermis, dermis, sebaceous gland and hair follicle size; and skin NMR peaks were signatures of skin type. The image processing determined the dimensionality of skin tissue components and skin typing. The ex vivo MRI images and histopathology of the skin may be used to measure skin structure, and skin NMR peaks combined with image processing may serve as a tool for determining skin typing and composition.

  6. An analysis of the qualification criteria for small radioactive material shipping packages

    SciTech Connect

    McClure, J.D.

    1983-05-01

    The RAM package design certification process has two important elements, testing and acceptance. These terms sound very similar but they have specific meanings. Qualification testing, in the context of this study, is the imposition of simulated accident test conditions upon the candidate package design. (Normal transportation environments may also be included.) Following qualification testing, the acceptance criteria provide the performance levels which, if demonstrated, indicate the ability of the RAM package to sustain the severity of the qualification testing sequence and yet maintain specified levels of package integrity. This study has used Severities of Transportation Accidents as a data base to examine the regulatory test criteria which are required to be met by small packages containing Type B quantities of radioactive material (RAM). The basic findings indicate that the present regulatory test standards provide significantly higher levels of protection for the surface transportation modes (truck, rail) than for RAM packages shipped by aircraft. It should also be noted that various risk assessment studies have shown that the risk to the public due to severe transport accidents by surface and air transport modes is very low. A key element in this study was the quantification of the severity of the transportation accident environment and the severity of the present regulatory test standards (called qualification test standards in this document) so that a direct comparison could be made between them to assess the effectiveness of the existing qualification test standards. The manner in which this was accomplished is described.

  7. Machine learning for medical images analysis.

    PubMed

    Criminisi, A

    2016-10-01

    This article discusses the application of machine learning for the analysis of medical images. Specifically: (i) We show how a special type of learning models can be thought of as automatically optimized, hierarchically-structured, rule-based algorithms, and (ii) We discuss how the issue of collecting large labelled datasets applies to both conventional algorithms as well as machine learning techniques. The size of the training database is a function of model complexity rather than a characteristic of machine learning methods. PMID:27374127

  8. Research on automatic human chromosome image analysis

    NASA Astrophysics Data System (ADS)

    Ming, Delie; Tian, Jinwen; Liu, Jian

    2007-11-01

    Human chromosome karyotyping is one of the essential tasks in cytogenetics, especially in genetic syndrome diagnoses. In this thesis, an automatic procedure is introduced for human chromosome image analysis. According to the different states of touching and overlapping chromosomes, several segmentation methods are proposed to achieve the best results. The medial axis is extracted by the middle-point algorithm. The chromosome band pattern is enhanced by an algorithm based on multiscale B-spline wavelets, extracted from the average gray profile, gradient profile and shape profile, and characterized by WDD (Weighted Density Distribution) descriptors. A multilayer classifier is used in classification. Experimental results demonstrate that the algorithms perform well.

  9. Application of risk-based multiple criteria decision analysis for selection of the best agricultural scenario for effective watershed management.

    PubMed

    Javidi Sabbaghian, Reza; Zarghami, Mahdi; Nejadhashemi, A Pouyan; Sharifi, Mohammad Bagher; Herman, Matthew R; Daneshvar, Fariborz

    2016-03-01

    Effective watershed management requires the evaluation of agricultural best management practice (BMP) scenarios which carefully consider the relevant environmental, economic, and social criteria involved. In the Multiple Criteria Decision-Making (MCDM) process, scenarios are first evaluated and then ranked to determine the most desirable outcome for the particular watershed. The main challenge of this process is the accurate identification of the best solution for the watershed in question, despite the various risk attitudes presented by the associated decision-makers (DMs). This paper introduces a novel approach for implementation of the MCDM process based on a comparative neutral-risk/risk-based decision analysis, which results in the selection of the most desirable scenario for use in the entire watershed. At the sub-basin level, each scenario includes multiple BMPs with scores that have been calculated using the criteria derived from two cases of neutral-risk and risk-based decision-making. The simple additive weighting (SAW) operator is applied for neutral-risk decision-making, while the ordered weighted averaging (OWA) and induced OWA (IOWA) operators are effective for risk-based decision-making. At the watershed level, the BMP scores of the sub-basins are aggregated to calculate each scenario's combined goodness measure; the most desirable scenario for the entire watershed is then selected based on the combined goodness measures. Our final results illustrate the type of operator and the risk attitudes needed to satisfy the relevant criteria across the sub-basins, and how they ultimately affect the final ranking of the given scenarios. The methodology proposed here has been successfully applied to the Honeyoey Creek-Pine Creek watershed in Michigan, USA, to evaluate various BMP scenarios and determine the best solution for both the stakeholders and overall stream health. PMID:26734840
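
    A compact sketch of the two aggregation operators named above: SAW is a weighted sum across criteria, while OWA sorts each scenario's criterion scores and weights them by rank, which encodes the decision-maker's risk attitude. Scores and weights below are invented for illustration.

        import numpy as np

        def saw(scores, weights):
            return scores @ weights                       # neutral-risk aggregation

        def owa(scores, order_weights):
            ordered = np.sort(scores, axis=1)[:, ::-1]    # best criterion first
            return ordered @ order_weights                # rank-based weighting

        scenarios = np.array([[0.9, 0.4, 0.6],            # rows: BMP scenarios
                              [0.7, 0.7, 0.7],
                              [0.5, 0.9, 0.5]])           # cols: env., econ., social
        w = np.array([0.5, 0.3, 0.2])                     # criterion importance
        pessimistic = np.array([0.1, 0.3, 0.6])           # stress the worst criterion
        print("SAW ranking:", np.argsort(-saw(scenarios, w)))
        print("OWA (risk-averse) ranking:", np.argsort(-owa(scenarios, pessimistic)))

    With the pessimistic order weights, the balanced second scenario overtakes the first, illustrating how the risk attitude can change the ranking even when the raw scores are unchanged.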

  10. Image reconstruction from Pulsed Fast Neutron Analysis

    NASA Astrophysics Data System (ADS)

    Bendahan, Joseph; Feinstein, Leon; Keeley, Doug; Loveman, Rob

    1999-06-01

    Pulsed Fast Neutron Analysis (PFNA) has been demonstrated to detect drugs and explosives in trucks and large cargo containers. PFNA uses a collimated beam of nanosecond-pulsed fast neutrons that interact with the cargo contents to produce gamma rays characteristic of their elemental composition. By timing the arrival of the emitted radiation at an array of gamma-ray detectors, a three-dimensional elemental density map, or image, of the cargo is created. The process to determine the elemental densities is complex and requires a number of steps. The first step consists of extracting from the characteristic gamma-ray spectra the counts associated with the elements of interest. Other steps are needed to correct for physical quantities such as gamma-ray production cross sections and angular distributions. The image processing also includes phenomenological corrections that take into account the neutron attenuation through the cargo and the attenuation of the gamma rays from the point where they were generated to the gamma-ray detectors. Additional processing is required to map the elemental densities from the data acquisition coordinate system to a rectilinear system. This paper describes the image processing used to compute the elemental densities from the counts observed in the gamma-ray detectors.

  11. Image reconstruction from Pulsed Fast Neutron Analysis

    SciTech Connect

    Bendahan, Joseph; Feinstein, Leon; Keeley, Doug; Loveman, Rob

    1999-06-10

    Pulsed Fast Neutron Analysis (PFNA) has been demonstrated to detect drugs and explosives in trucks and large cargo containers. PFNA uses a collimated beam of nanosecond-pulsed fast neutrons that interact with the cargo contents to produce gamma rays characteristic of their elemental composition. By timing the arrival of the emitted radiation at an array of gamma-ray detectors, a three-dimensional elemental density map, or image, of the cargo is created. The process to determine the elemental densities is complex and requires a number of steps. The first step consists of extracting from the characteristic gamma-ray spectra the counts associated with the elements of interest. Other steps are needed to correct for physical quantities such as gamma-ray production cross sections and angular distributions. The image processing also includes phenomenological corrections that take into account the neutron attenuation through the cargo and the attenuation of the gamma rays from the point where they were generated to the gamma-ray detectors. Additional processing is required to map the elemental densities from the data acquisition coordinate system to a rectilinear system. This paper describes the image processing used to compute the elemental densities from the counts observed in the gamma-ray detectors.

  12. Soil Surface Roughness through Image Analysis

    NASA Astrophysics Data System (ADS)

    Tarquis, A. M.; Saa-Requejo, A.; Valencia, J. L.; Moratiel, R.; Paz-Gonzalez, A.; Agro-Environmental Modeling

    2011-12-01

    Soil erosion is a complex phenomenon involving the detachment and transport of soil particles, storage and runoff of rainwater, and infiltration. The relative magnitude and importance of these processes depends on several factors, one of them being surface micro-topography, usually quantified through soil surface roughness (SSR). SSR greatly affects surface sealing and runoff generation, yet little information is available about the effect of roughness on the spatial distribution of runoff and on flow concentration. The methods commonly used to measure SSR involve measuring point elevation using a pin roughness meter or laser, both of which are labor intensive and expensive. Recently, a simple and inexpensive technique based on the percentage of shadow in soil surface images has been developed to determine SSR in the field, making widespread measurement practical. One of the first steps in this technique is image de-noising and thresholding to estimate the percentage of black pixels in the studied area. In this work, a series of soil surface images have been analyzed by applying several wavelet de-noising and thresholding algorithms to study the variation in the percentage of shadow and the shadow size distribution. Funding provided by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no. AGL2010-21501/AGR and by Xunta de Galicia through project no. INCITE08PXIB1621 is greatly appreciated.
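
    The shadow-percentage step described above reduces to de-noising followed by a binary threshold and a pixel count. A minimal sketch, assuming an 8-bit grayscale image on disk and using Gaussian smoothing as a stand-in for the paper's wavelet de-noising (file name and parameters are hypothetical):

        import cv2

        def shadow_percentage(path, blur_sigma=2.0):
            """Percentage of shadow (black) pixels in a soil-surface image."""
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            smooth = cv2.GaussianBlur(gray, (0, 0), blur_sigma)  # de-noise
            # Otsu picks the threshold separating shadow from lit soil
            _, binary = cv2.threshold(smooth, 0, 255,
                                      cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            return 100.0 * (binary == 0).sum() / binary.size

        print(shadow_percentage("soil_surface.png"))  # hypothetical file name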

  13. Noise analysis in laser speckle contrast imaging

    NASA Astrophysics Data System (ADS)

    Yuan, Shuai; Chen, Yu; Dunn, Andrew K.; Boas, David A.

    2010-02-01

    Laser speckle contrast imaging (LSCI) is becoming an established method for full-field imaging of blood flow dynamics in animal models. A reliable quantitative model with comprehensive noise analysis is necessary to fully utilize this technique in biomedical applications and clinical trials. In this study, we investigated several major noise sources in LSCI: periodic physiology noise, shot noise, and statistical noise. (1) We observed periodic physiology noise in our experiments and found that its sources consist principally of motions induced by heart beats and/or ventilation. (2) We found that shot noise caused an offset of speckle contrast (SC) values, and this offset is directly related to the incident light intensity. (3) A mathematical model of statistical noise was also developed. The model indicated that statistical noise in speckle contrast imaging is related to the SC values and the total number of pixels used in the SC calculation. Our experimental results are consistent with theoretical predictions, as well as with other published work.
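
    The speckle contrast referenced above is the local ratio of standard deviation to mean intensity over a small sliding window, and the statistical-noise result implies that K estimates tighten as the window (the pixel count) grows. A sketch of the SC map computation on synthetic speckle data; the window sizes and the exponential intensity model are assumptions:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def speckle_contrast(img, win=7):
            """Local speckle contrast K = std/mean over a win x win window."""
            img = img.astype(np.float64)
            mean = uniform_filter(img, win)
            mean_sq = uniform_filter(img * img, win)
            var = np.maximum(mean_sq - mean * mean, 0.0)
            return np.sqrt(var) / (mean + 1e-12)

        rng = np.random.default_rng(0)
        # exponential intensities approximate fully developed static speckle
        raw = rng.exponential(scale=100.0, size=(256, 256))
        for w in (5, 7, 9):
            K = speckle_contrast(raw, w)
            # larger windows -> lower statistical spread of the K estimate
            print(w, K.mean().round(3), K.std().round(3))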

  14. Difference Image Analysis of Galactic Microlensing. I. Data Analysis

    SciTech Connect

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.

    1999-08-20

    This is a preliminary report on the application of Difference Image Analysis (DIA) to Galactic bulge images. The aim of this analysis is to increase the sensitivity to the detection of gravitational microlensing. We discuss how the DIA technique simplifies the process of discovering microlensing events by detecting only objects that have variable flux. We illustrate how the DIA technique is not limited to detection of so-called ''pixel lensing'' events but can also be used to improve photometry for classical microlensing events by removing the effects of blending. We will present a method whereby DIA can be used to reveal the true unblended colors, positions, and light curves of microlensing events. We discuss the need for a technique to obtain the accurate microlensing timescales from blended sources and present a possible solution to this problem using the existing Hubble Space Telescope color-magnitude diagrams of the Galactic bulge and LMC. The use of such a solution with both classical and pixel microlensing searches is discussed. We show that one of the major causes of systematic noise in DIA is differential refraction. A technique for removing this systematic by effectively registering images to a common air mass is presented. Improvements to commonly used image differencing techniques are discussed. (c) 1999 The American Astronomical Society.

  15. Analysis of image quality based on perceptual preference

    NASA Astrophysics Data System (ADS)

    Xue, Liqin; Hua, Yuning; Zhao, Guangzhou; Qi, Yaping

    2007-11-01

    This paper deals with image quality analysis, considering the impact of psychological factors involved in assessment. The attributes of image quality requirements were partitioned according to visual perception characteristics, and image quality preferences were obtained by the factor analysis method. The features of image quality that support subjective preference were identified, and the adequacy of the image proved to be the top requirement for display image quality improvement. The approach will benefit research on subjective quantitative methods of image quality assessment.

  16. Wavelet Analysis of Space Solar Telescope Images

    NASA Astrophysics Data System (ADS)

    Zhu, Xi-An; Jin, Sheng-Zhen; Wang, Jing-Yu; Ning, Shu-Nian

    2003-12-01

    The scientific satellite SST (Space Solar Telescope) is an important research project strongly supported by the Chinese Academy of Sciences. Every day, SST acquires 50 GB of data (after processing), but only 10 GB can be transmitted to the ground because of the limited time of satellite passage and limited channel capacity. Therefore, the data must be compressed before transmission. Wavelet analysis is a technique developed over the last 10 years with great potential for application. We start with a brief introduction to the essential principles of wavelet analysis, and then describe the main idea of embedded zerotree wavelet coding, used for compressing the SST images. The results show that this coding is adequate for the job.
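
    Embedded zerotree coding itself is involved, but the idea it exploits, that most wavelet coefficients of a solar image are negligible and can be discarded before entropy coding, can be sketched with PyWavelets. The wavelet choice, decomposition level, and keep-fraction below are assumptions, not the SST settings:

        import numpy as np
        import pywt

        def wavelet_compress(img, wavelet="bior4.4", level=4, keep=0.05):
            """Keep only the largest `keep` fraction of wavelet coefficients."""
            coeffs = pywt.wavedec2(img, wavelet, level=level)
            arr, slices = pywt.coeffs_to_array(coeffs)
            thresh = np.quantile(np.abs(arr), 1.0 - keep)    # magnitude cutoff
            arr = pywt.threshold(arr, thresh, mode="hard")   # zero small coeffs
            coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
            return pywt.waverec2(coeffs, wavelet)

        img = np.random.rand(256, 256)        # stand-in for an SST frame
        rec = wavelet_compress(img)
        print("RMS error:", np.sqrt(np.mean((img - rec[:256, :256]) ** 2)))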

  17. A review and classification of approaches for dealing with uncertainty in multi-criteria decision analysis for healthcare decisions.

    PubMed

    Broekhuizen, Henk; Groothuis-Oudshoorn, Catharina G M; van Til, Janine A; Hummel, J Marjan; IJzerman, Maarten J

    2015-05-01

    Multi-criteria decision analysis (MCDA) is increasingly used to support decisions in healthcare involving multiple and conflicting criteria. Although uncertainty is usually carefully addressed in health economic evaluations, whether and how the different sources of uncertainty are dealt with in MCDA, and with what methods, is less well known. The objective of this study is to review how uncertainty can be explicitly taken into account in MCDA and to discuss which approach may be appropriate for healthcare decision makers. A literature review was conducted in the Scopus and PubMed databases. Two reviewers independently categorized studies according to research areas, the type of MCDA used, and the approach used to quantify uncertainty. Selected full-text articles were read for methodological details. The search strategy identified 569 studies. The five approaches most often identified were fuzzy set theory (45% of studies), probabilistic sensitivity analysis (15%), deterministic sensitivity analysis (31%), the Bayesian framework (6%), and grey theory (3%). A large number of papers considered the analytic hierarchy process in combination with fuzzy set theory (31%). Only 3% of studies were published in healthcare-related journals. In conclusion, our review identified five different approaches to take uncertainty into account in MCDA. The deterministic approach is most likely sufficient for most healthcare policy decisions because of its low complexity and straightforward implementation. However, more complex approaches may be needed when multiple sources of uncertainty must be considered simultaneously. PMID:25630758

  18. Cost and schedule control systems criteria for contract performance measurement: contractor reporting/data analysis guide

    SciTech Connect

    Not Available

    1980-11-01

    The DOE Cost and Schedule Control Systems Criteria (CSCSC) require that a contractor's management control systems include methods and procedures designed to ensure that they will accomplish, in addition to other requirements, a summarization of data elements to the level of reporting to DOE specified in the contract under separate clause. Reports provided to DOE must relate contract cost, schedule, and technical accomplishment to a baseline plan, within the framework of both the contract Work Breakdown Structure (WBS) and the contractor's organizational structure. This Guide describes the reports available from contractors, with emphasis on the Cost Performance Report (CPR), and provides a framework for using the reported data as a basis for decision making. This Guide was developed to assist DOE Project Managers in assessing contractor performance through proper use of the CPR and supporting reports.

  19. The Scientific Image in Behavior Analysis.

    PubMed

    Keenan, Mickey

    2016-05-01

    Throughout the history of science, the scientific image has played a significant role in communication. With recent developments in computing technology, there has been an increase in the kinds of opportunities now available for scientists to communicate in more sophisticated ways. Within behavior analysis, though, we are only just beginning to appreciate the importance of going beyond the printing press to elucidate basic principles of behavior. The aim of this manuscript is to stimulate appreciation of both the role of the scientific image and the opportunities provided by a quick response code (QR code) for enhancing the functionality of the printed page. I discuss the limitations of imagery in behavior analysis ("Introduction"), and I show examples of what can be done with animations and multimedia for teaching philosophical issues that arise when teaching about private events ("Private Events 1 and 2"). Animations are also useful for bypassing ethical issues when showing examples of challenging behavior ("Challenging Behavior"). Each of these topics can be accessed only by scanning the QR code provided. This contingency has been arranged to help the reader embrace this new technology. In so doing, I hope to show its potential for going beyond the limitations of the printing press. PMID:27606187

  20. Deciding on Science: An Analysis of Higher Education Science Student Major Choice Criteria

    NASA Astrophysics Data System (ADS)

    White, Stephen Wilson

    The number of college students choosing to major in science, technology, engineering, and math (STEM) in the United States affects the size and quality of the American workforce (Winters, 2009). The number of graduates in these academic fields has been on the decline in the United States since the 1960s, which, according to Lips and McNeil (2009), has resulted in a diminished ability of the United States to compete in science and engineering on the world stage. The purpose of this research was to learn why students chose a STEM major and determine what decision criteria influenced this decision. According to Ajzen's (1991) theory of planned behavior (TPB), the key components of decision-making can be quantified and used as predictors of behavior. In this study the STEM majors' decision criteria were compared between different institution types (two-year, public four-year, and private four-year), and between demographic groups (age and sex). Career, grade, intrinsic, self-efficacy, and self-determination were reported as motivational factors by a majority of science majors participating in this study. Few students reported being influenced by friends and family when deciding to major in science. Science students overwhelmingly attributed the desire to solve meaningful problems as central to their decision to major in science. A majority of students surveyed credited a teacher for influencing their desire to pursue science as a college major. This new information about the motivational construct of the studied group of science majors can be applied to the previously stated problem of not enough STEM majors in the American higher education system to provide workers required to fill the demand of a globally STEM-competitive United States (National Academy of Sciences, National Academy of Engineering, & Institute of Medicine, 2010).

  1. Extracting roads based on Retinex and improved Canny operator with shape criteria in vague and unevenly illuminated aerial images

    NASA Astrophysics Data System (ADS)

    Ronggui, Ma; Weixing, Wang; Sheng, Liu

    2012-01-01

    An automatic road extraction method for vague aerial images is proposed in this paper. First, a high-resolution but low-contrast image is enhanced by using a Retinex-based algorithm. Then, the enhanced image is segmented with an improved Canny edge detection operator that can automatically threshold the image into a binary edge image. Subsequently, the linear and curved road segments are regularized by the Hough line transform and extracted based on several thresholds of road size and shape, using a number of morphological operators such as thinning (skeletonization), junction detection, and endpoint detection. In the experiments, a number of vague aerial images with poor uniformity were selected for testing. Similarity- and discontinuity-based algorithms, such as Otsu thresholding, merge-and-split, edge detection-based algorithms, and a graph-based algorithm, were compared with the new method. The experimental and comparison results show that the studied method can enhance vague, low-contrast, and unevenly illuminated color aerial road images; it can detect most road edges with fewer spurious elements and trace roads with good quality. The method in this study is promising.
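
    The pipeline above (Retinex enhancement, then a Canny pass whose thresholds are chosen automatically) can be approximated in a few lines of OpenCV. Single-scale Retinex and an Otsu-derived Canny threshold are used here as plausible stand-ins for the paper's improved operators; the file name and parameters are assumptions:

        import cv2
        import numpy as np

        def single_scale_retinex(gray, sigma=40.0):
            """log(image) - log(illumination), illumination = Gaussian blur."""
            f = gray.astype(np.float64) + 1.0
            r = np.log(f) - np.log(cv2.GaussianBlur(f, (0, 0), sigma))
            return cv2.normalize(r, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

        def auto_canny(gray):
            """Canny with thresholds tied to the Otsu level (a common heuristic)."""
            otsu, _ = cv2.threshold(gray, 0, 255,
                                    cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            return cv2.Canny(gray, 0.5 * otsu, otsu)

        img = cv2.imread("aerial_road.png", cv2.IMREAD_GRAYSCALE)  # hypothetical
        edges = auto_canny(single_scale_retinex(img))
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                                minLineLength=60, maxLineGap=10)
        print(0 if lines is None else len(lines), "line segments")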

  2. Monotonic correlation analysis of image quality measures for image fusion

    NASA Astrophysics Data System (ADS)

    Kaplan, Lance M.; Burks, Stephen D.; Moore, Richard K.; Nguyen, Quang

    2008-04-01

    The next generation of night vision goggles will fuse image-intensified and long-wave infrared imagery to create a hybrid image that will enable soldiers to better interpret their surroundings during nighttime missions. Paramount to the development of such goggles is the exploitation of image quality (IQ) measures to automatically determine the best image fusion algorithm for a particular task. This work introduces a novel monotonic correlation coefficient to investigate how well candidate IQ features correlate with actual human performance, which is measured by a perception study. The paper demonstrates how monotonic correlation can identify worthy features that could be overlooked by traditional correlation values.
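
    The paper's monotonic correlation coefficient is not reproduced here, but its intent, crediting any monotone rather than merely linear relation between an IQ feature and human performance, can be illustrated two ways: Spearman's rank correlation, and the fit quality of an isotonic regression. Both are standard constructions, not the authors' definition, and the data are synthetic:

        import numpy as np
        from scipy.stats import spearmanr, pearsonr
        from sklearn.isotonic import IsotonicRegression

        rng = np.random.default_rng(1)
        iq = rng.uniform(0, 1, 100)                         # hypothetical IQ feature
        perf = np.tanh(4 * iq) + rng.normal(0, 0.05, 100)   # monotone, nonlinear

        print("Pearson :", pearsonr(iq, perf)[0].round(3))   # underrates the link
        print("Spearman:", spearmanr(iq, perf)[0].round(3))  # rank-based, monotone

        # isotonic fit: R^2 of the best monotone nondecreasing predictor
        fit = IsotonicRegression().fit_transform(iq, perf)
        r2 = 1 - np.sum((perf - fit) ** 2) / np.sum((perf - perf.mean()) ** 2)
        print("Isotonic R^2:", round(float(r2), 3))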

  3. Monte Carlo-based interval transformation analysis for multi-criteria decision analysis of groundwater management strategies under uncertain naphthalene concentrations and health risks

    NASA Astrophysics Data System (ADS)

    Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong

    2016-08-01

    A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration, and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) reducing computational time through the Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives is considered for each planning horizon on the basis of four criteria. Results indicated that the most desirable management strategy was action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management horizon.
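
    The core of the MCITA idea, interval inputs made tractable by sampling, can be sketched as follows: each criterion value is an interval, Monte Carlo draws realizations from it, and the frequency with which each management action ranks first summarizes its desirability. The actions, intervals, and weights below are hypothetical:

        import numpy as np

        rng = np.random.default_rng(2)
        # 4 actions x 3 criteria (cost, concentration, risk) as [low, high] intervals
        lo = np.array([[2., 5., 1.], [3., 4., 2.], [1., 6., 1.], [4., 3., 3.]])
        hi = lo + np.array([[1., 2., 1.], [1., 1., 1.], [2., 2., 2.], [1., 1., 1.]])
        weights = np.array([0.5, 0.3, 0.2])   # all criteria are "lower is better"

        wins = np.zeros(len(lo))
        for _ in range(10000):
            sample = rng.uniform(lo, hi)      # one realization of all intervals
            cost = sample @ weights           # weighted aggregate, to minimize
            wins[np.argmin(cost)] += 1
        print("P(action is best):", wins / wins.sum())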

  4. Predicting the onset of filamentous bulking in biological wastewater treatment systems by exploiting image analysis information.

    PubMed

    Banadda, E N; Smets, I Y; Jenné, R; Van Impe, J F

    2005-08-01

    The performance of the activated sludge process is limited by the ability of the sedimentation tank (1) to separate the activated sludge from the treated effluent and (2) to concentrate it. Apart from bad operating strategies or poorly designed clarifiers, settling failures can mainly be attributed to filamentous bulking. Image analysis is a promising technique that can be used for early detection of filamentous bulking. The aim of this paper is therefore twofold. First, correlations are sought between image analysis information (i.e., the total filament length per image, the mean form factor, the mean equivalent floc diameter, the mean floc roundness, and the mean floc reduced radius of gyration) and classical measurements (i.e., the Sludge Volume Index (SVI)). Second, this information is explored and exploited in order to identify dynamic ARX and state space-type models, whose performance is compared based on two criteria. PMID:16021475
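
    An ARX model of the kind identified above regresses the current SVI on its own past values and on past image-analysis inputs. A minimal one-input least-squares sketch; the model orders and the simulated data are hypothetical, not the paper's plant data:

        import numpy as np

        def fit_arx(y, u, na=2, nb=2):
            """Fit y[t] = sum_i a_i*y[t-i] + sum_j b_j*u[t-j] by least squares."""
            n = max(na, nb)
            rows = [np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
                    for t in range(n, len(y))]
            Phi, target = np.array(rows), y[n:]
            theta, *_ = np.linalg.lstsq(Phi, target, rcond=None)
            return theta[:na], theta[na:]    # a-coefficients, b-coefficients

        rng = np.random.default_rng(3)
        u = rng.normal(size=300)             # e.g. total filament length per image
        y = np.zeros(300)                    # e.g. Sludge Volume Index
        for t in range(2, 300):              # simulate a true ARX system
            y[t] = (0.6 * y[t-1] - 0.2 * y[t-2] + 0.8 * u[t-1]
                    + 0.1 * u[t-2] + 0.05 * rng.normal())
        a, b = fit_arx(y, u)
        print("a:", a.round(2), "b:", b.round(2))   # should recover the truth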

  5. Analysis on Awareness of Functional Dyspepsia and Rome Criteria Among Japanese Internists by the Self-administered Questionnaires

    PubMed Central

    Tsuboi, Hirohito

    2014-01-01

    Background/Aims Functional dyspepsia (FD) is one of the commonest diseases in the field of internal medicine. The Japanese Society of Gastroenterology (JSGE) has been promoting the term and concept of FD. The aim of this survey was to elucidate the state of understanding of FD and the Rome criteria, and attitudes toward FD, among Japanese internists. Methods Data were collected at lifelong education courses for certified members of the Japanese Society of Internal Medicine. Self-administered questionnaires were delivered to the medical doctors prior to the lectures. Results The analysis covered 1,623 internists (24-90 years old) among 1,660 medical doctors out of 4,264 attendees. The terms related to FD were known to 62.0-68.9% of internists, whereas 95.5% understood chronic gastritis. Internists who had been taking care of FD patients informed them of their condition as chronic gastritis (50.0%), FD in Japanese Kanji characters (50.8%), or FD in Kanji and Katakana (18.6%). Logistic regression analysis revealed that positive factors for the understanding of FD and intensive care for FD patients were being a practitioner, caring for many patients, and being a physician certified by the JSGE. The existence of the Rome criteria was known to 39.9% of internists, and 31.8% of those put it to practical use. Certification by the JSGE was a positive factor for awareness, but not for utilization. Conclusions The results suggest the need to promote the medical term FD in Japan and to revise the Rome criteria for routine clinical practice. Precise recognition of FD may enhance efficient patient-based clinical practice. PMID:24466450

  6. Effect of tree nuts on metabolic syndrome criteria: a systematic review and meta-analysis of randomised controlled trials

    PubMed Central

    Blanco Mejia, Sonia; Kendall, Cyril W C; Viguiliouk, Effie; Augustin, Livia S; Ha, Vanessa; Cozma, Adrian I; Mirrahimi, Arash; Maroleanu, Adriana; Chiavaroli, Laura; Leiter, Lawrence A; de Souza, Russell J; Jenkins, David J A; Sievenpiper, John L

    2014-01-01

    Objective To provide a broader evidence summary to inform dietary guidelines of the effect of tree nuts on criteria of the metabolic syndrome (MetS). Design We conducted a systematic review and meta-analysis of the effect of tree nuts on criteria of the MetS. Data sources We searched MEDLINE, EMBASE, CINAHL and the Cochrane Library (through 4 April 2014). Eligibility criteria for selecting studies We included relevant randomised controlled trials (RCTs) of ≥3 weeks reporting at least one criterion of the MetS. Data extraction Two or more independent reviewers extracted all relevant data. Data were pooled using the generic inverse variance method using random effects models and expressed as mean differences (MD) with 95% CIs. Heterogeneity was assessed by the Cochran Q statistic and quantified by the I2 statistic. Study quality and risk of bias were assessed. Results Eligibility criteria were met by 49 RCTs including 2226 participants who were otherwise healthy or had dyslipidaemia, MetS or type 2 diabetes mellitus. Tree nut interventions lowered triglycerides (MD=−0.06 mmol/L (95% CI −0.09 to −0.03 mmol/L)) and fasting blood glucose (MD=−0.08 mmol/L (95% CI −0.16 to −0.01 mmol/L)) compared with control diet interventions. There was no effect on waist circumference, high-density lipoprotein cholesterol or blood pressure with the direction of effect favouring tree nuts for waist circumference. There was evidence of significant unexplained heterogeneity in all analyses (p<0.05). Conclusions Pooled analyses show a MetS benefit of tree nuts through modest decreases in triglycerides and fasting blood glucose with no adverse effects on other criteria across nut types. As our conclusions are limited by the short duration and poor quality of the majority of trials, as well as significant unexplained between-study heterogeneity, there remains a need for larger, longer, high-quality trials. Trial registration number NCT01630980. PMID:25074070
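
    The pooling described above (generic inverse variance with random effects) is commonly implemented with the DerSimonian-Laird estimator. A compact sketch with hypothetical trial-level mean differences and standard errors, not the review's data:

        import numpy as np

        def random_effects_pool(md, se):
            """DerSimonian-Laird random-effects pooled mean difference."""
            w = 1.0 / se**2                          # fixed-effect weights
            fixed = np.sum(w * md) / np.sum(w)
            q = np.sum(w * (md - fixed) ** 2)        # Cochran's Q
            df = len(md) - 1
            c = np.sum(w) - np.sum(w**2) / np.sum(w)
            tau2 = max(0.0, (q - df) / c)            # between-study variance
            wr = 1.0 / (se**2 + tau2)                # random-effects weights
            pooled = np.sum(wr * md) / np.sum(wr)
            se_pooled = np.sqrt(1.0 / np.sum(wr))
            i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # I^2 in %
            return pooled, (pooled - 1.96 * se_pooled,
                            pooled + 1.96 * se_pooled), i2

        # hypothetical triglyceride mean differences (mmol/L) from 5 RCTs
        md = np.array([-0.05, -0.10, -0.02, -0.08, -0.04])
        se = np.array([0.02, 0.04, 0.03, 0.05, 0.02])
        print(random_effects_pool(md, se))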

  7. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria for pulmonary impairments. In general, rib movement is assessed with fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique for quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). The bone suppression technique, based on a massive-training artificial neural network (MTANN), was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, forming a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify movements of the ribs and distinguish them from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movement: restricted movement appeared as reduced velocity vectors and left-right asymmetric distributions on the vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movement without additional radiation dose.
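
    The velocity-vector maps described above amount to frame-to-frame motion estimation on the bone-only image sequence. Dense optical flow averaged over local blocks is one plausible way to obtain such a map; the authors' exact local-matching scheme is not specified here, and the file names and parameters are assumptions:

        import cv2
        import numpy as np

        # two consecutive frames of a (hypothetical) dynamic bone-image sequence
        prev = cv2.imread("bone_t0.png", cv2.IMREAD_GRAYSCALE)
        curr = cv2.imread("bone_t1.png", cv2.IMREAD_GRAYSCALE)

        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            pyr_scale=0.5, levels=3, winsize=21,
                                            iterations=3, poly_n=5,
                                            poly_sigma=1.1, flags=0)

        # average the dense flow over local blocks: one vector per region
        step = 32
        h, w = flow.shape[:2]
        vectors = [(x, y, *flow[y:y+step, x:x+step].mean(axis=(0, 1)))
                   for y in range(0, h, step) for x in range(0, w, step)]

        # left-right asymmetry of vertical velocity, as used for scoliosis cases
        vy = flow[..., 1]
        print("mean |v_y| left vs right:",
              np.abs(vy[:, :w//2]).mean(), np.abs(vy[:, w//2:]).mean())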

  8. Decerns: A framework for multi-criteria decision analysis

    SciTech Connect

    Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; Sullivan, Terry

    2015-02-27

    A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical risk management problems is introduced. The Decerns framework contains a library of modules that form the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on the analysis of a multicriteria location problem.

  9. Characterization of Tank 23H Supernate Per Saltstone Waste Acceptance Criteria Analysis Requirements -2005

    SciTech Connect

    Oji, L

    2005-05-05

    Variable depth Tank 23H samples (22-inch sample [HTF-014] and 185-inch sample [HTF-013]) were pulled from Tank 23H in February, 2005 for characterization. The characterization of the Tank 23H low activity waste is part of the overall liquid waste processing activities. This characterization examined the species identified in the Saltstone Waste Acceptance Criteria (WAC) for the transfer of waste into the Salt-Feed Tank (SFT). The samples were delivered to the Savannah River National Laboratory (SRNL) and analyzed. Apart from radium-226 with an average measured detection limit of < 2.64E+03 pCi/mL, which is about the same order of magnitude as the WAC limit (< 8.73E+03 pCi/mL), none of the species analyzed was found to approach the limits provided in the Saltstone WAC. The concentration of most of the species analyzed for the Tank 23H samples were 2-5 orders of magnitude lower than the WAC limits. The achievable detection limits for a number of the analytes were several orders of magnitude lower than the WAC limits, but one or two orders of magnitude higher than the requested detection limits. Analytes which fell into this category included plutonium-241, europium-154/155, antimony-125, tin-126, ruthenium/rhodium-106, selenium-79, nickel-59/63, ammonium ion, copper, total nickel, manganese and total organic carbon.

  10. Characterization of Tank 23H Supernate Per Saltstone Waste Acceptance Criteria Analysis Requirements-2005

    SciTech Connect

    Oji, L

    2005-06-01

    Variable depth Tank 23H samples (22-inch sample [HTF-014] and 185-inch sample [HTF-013]) were pulled from Tank 23H in February, 2005 for characterization. The characterization of the Tank 23H low activity waste is part of the overall liquid waste processing activities. This characterization examined the species identified in the Saltstone Waste Acceptance Criteria (WAC) for the transfer of waste into the Salt-Feed Tank (SFT). The samples were delivered to the Savannah River National Laboratory (SRNL) and analyzed. Apart from radium-226 with an average measured detection limit of < 2.64E+03 pCi/mL, which is about the same order of magnitude as the WAC limit (< 8.73E+03 pCi/mL), none of the species analyzed was found to approach the limits provided in the Saltstone WAC. The concentration of most of the species analyzed for the Tank 23H samples were 2-5 orders of magnitude lower than the WAC limits. The achievable detection limits for a number of the analytes were several orders of magnitude lower than the WAC limits, but one or two orders of magnitude higher than the requested detection limits. Analytes which fell into this category included plutonium-241, europium-154/155, antimony-125, tin-126, ruthenium/rhodium-106, selenium-79, nickel-59/63, ammonium ion, copper, total nickel, manganese and total organic carbon.

  11. F-106 data summary and model results relative to threat criteria and protection design analysis

    NASA Technical Reports Server (NTRS)

    Pitts, F. L.; Finelli, G. B.; Perala, R. A.; Rudolph, T. H.

    1986-01-01

    The NASA F-106 has acquired considerable data on the rates of change of electromagnetic parameters on the aircraft surface during 690 direct lightning strikes while penetrating thunderstorms at altitudes ranging from 15,000 to 40,000 feet. These in-situ measurements have provided the basis for the first statistical quantification of the lightning electromagnetic threat to aircraft appropriate for determining lightning indirect effects on aircraft. The data are presently being used to update previous lightning criteria and standards developed over the years from ground-based measurements. The new lightning standards will, therefore, be the first to reflect actual aircraft responses measured at flight altitudes. The modeling technique developed to interpret and understand the direct-strike electromagnetic data acquired on the F-106 provides a means to model the interaction of the lightning channel with the F-106. The reasonable agreement between model results and measured responses gives confidence that the model may be credibly applied to other aircraft types and used in the prediction of internal coupling effects in the design of lightning protection for new aircraft.

  12. Multi-criteria analysis for the determination of the best WEEE management scenario in Cyprus.

    PubMed

    Rousis, K; Moustakas, K; Malamis, S; Papadopoulos, A; Loizidou, M

    2008-01-01

    Waste from electrical and electronic equipment (WEEE) constitutes one of the most complicated solid waste streams in terms of its composition, and, as a result, it is difficult to be effectively managed. In view of the environmental problems derived from WEEE management, many countries have established national legislation to improve the reuse, recycling and other forms of recovery of this waste stream so as to apply suitable management schemes. In this work, alternative systems are examined for the WEEE management in Cyprus. These systems are evaluated by developing and applying the Multi-Criteria Decision Making (MCDM) method PROMETHEE. In particular, through this MCDM method, 12 alternative management systems were compared and ranked according to their performance and efficiency. The obtained results show that the management schemes/systems based on partial disassembly are the most suitable for implementation in Cyprus. More specifically, the optimum scenario/system that can be implemented in Cyprus is that of partial disassembly and forwarding of recyclable materials to the native existing market and disposal of the residues at landfill sites. PMID:18262405

  13. Analysis of autostereoscopic three-dimensional images using multiview wavelets.

    PubMed

    Saveljev, Vladimir; Palchikova, Irina

    2016-08-10

    We propose that multiview wavelets can be used in processing multiview images. The reference functions for the synthesis/analysis of multiview images are described. The synthesized binary images were observed experimentally as three-dimensional visual images. The symmetric multiview B-spline wavelets are proposed. The locations recognized in the continuous wavelet transform correspond to the layout of the test objects. The proposed wavelets can be applied to the multiview, integral, and plenoptic images. PMID:27534470

  14. Vector processing enhancements for real-time image analysis.

    SciTech Connect

    Shoaf, S.; APS Engineering Support Division

    2008-01-01

    A real-time image analysis system was developed for beam imaging diagnostics. An Apple Power Mac G5 with an Active Silicon LFG frame grabber was used to capture video images that were processed and analyzed. Software routines were created to utilize vector-processing hardware to reduce the time to process images as compared to conventional methods. These improvements allow for more advanced image processing diagnostics to be performed in real time.

  15. MaZda--a software package for image texture analysis.

    PubMed

    Szczypiński, Piotr M; Strzelecki, Michał; Materka, Andrzej; Klepaczko, Artur

    2009-04-01

    MaZda, a software package for 2D and 3D image texture analysis, is presented. It provides a complete path for quantitative analysis of image textures, including computation of texture features, procedures for feature selection and extraction, algorithms for data classification, and various data visualization and image segmentation tools. Initially, MaZda was aimed at the analysis of magnetic resonance image textures. However, it has proven effective in the analysis of other types of textured images, including X-ray and camera images. The software has been utilized by numerous researchers in diverse applications and has proven to be an efficient and reliable tool for quantitative image analysis, supporting more accurate and objective medical diagnosis. MaZda has also been used successfully in the food industry to assess food product quality. MaZda can be downloaded for public use from the Institute of Electronics, Technical University of Lodz webpage. PMID:18922598

  16. Thermal image analysis for detecting facemask leakage

    NASA Astrophysics Data System (ADS)

    Dowdall, Jonathan B.; Pavlidis, Ioannis T.; Levine, James

    2005-03-01

    Due to the modern advent of near-ubiquitous access to rapid international transportation, the epidemiologic trends of highly communicable diseases can be devastating. With the recent emergence of diseases matching this pattern, such as Severe Acute Respiratory Syndrome (SARS), an area of overt concern has been the transmission of infection through respiratory droplets. Approved facemasks are typically effective physical barriers for preventing the spread of viruses through droplets, but breaches in a mask's integrity can lead to an elevated risk of exposure and subsequent infection. Quality control mechanisms in place during the manufacturing process ensure that masks are defect-free when leaving the factory, but there remains little to detect damage caused by transportation or during usage. A system that could monitor masks in real time while they were in use would facilitate a more secure environment for treatment and screening. To fulfill this necessity, we have devised a touchless method to detect mask breaches in real time by utilizing the emissive properties of the mask in the thermal infrared spectrum. Specifically, we use a specialized thermal imaging system to detect minute air leakage in masks based on the principles of heat transfer and thermodynamics. The advantage of this passive modality is that thermal imaging does not require contact with the subject and can provide instant visualization and analysis. These capabilities can prove invaluable for protecting personnel in scenarios with elevated levels of transmission risk such as hospital clinics, border checkpoints, and airports.

  17. Vision-sensing image analysis for GTAW process control

    SciTech Connect

    Long, D.D.

    1994-11-01

    Image analysis of a gas tungsten arc welding (GTAW) process was completed using video images from a charge coupled device (CCD) camera inside a specially designed coaxial (GTAW) electrode holder. Video data was obtained from filtered and unfiltered images, with and without the GTAW arc present, showing weld joint features and locations. Data Translation image processing boards, installed in an IBM PC AT 386 compatible computer, and Media Cybernetics image processing software were used to investigate edge flange weld joint geometry for image analysis.

  18. ASCI 2010 appropriateness criteria for cardiac magnetic resonance imaging: a report of the Asian Society of Cardiovascular Imaging cardiac computed tomography and cardiac magnetic resonance imaging guideline working group

    PubMed Central

    Choi, Byoung Wook; Chan, Carmen; Jinzaki, Masahiro; Tsai, I-Chen; Yong, Hwan Seok; Yu, Wei

    2010-01-01

    There has been a growing need for standard Asian population guidelines for cardiac CT and cardiac MR due to differences in culture, healthcare system, ethnicity and disease prevalence. The Asian Society of Cardiovascular Imaging, as the only society dedicated to cardiovascular imaging in Asia, formed a cardiac CT and cardiac MR guideline working group in order to help Asian practitioners establish cardiac CT and cardiac MR services. In this ASCI cardiac MR appropriateness criteria report, 23 Technical Panel members representing various Asian countries were invited to rate 50 indications that are frequently encountered in clinical practice in Asia. Indications were rated on a scale of 1–9 and categorized as ‘appropriate’ (7–9), ‘uncertain’ (4–6), or ‘inappropriate’ (1–3). According to the median scores of the 23 members, the final ratings for the indications were 24 appropriate, 18 uncertain and 8 inappropriate, with 22 ‘highly agreed’ (19 appropriate and 3 inappropriate) indications. This report is expected to have a significant impact on cardiac MR practice in many Asian countries by promoting the appropriate use of cardiac MR. PMID:20734234

  19. Image analysis by integration of disparate information

    NASA Technical Reports Server (NTRS)

    Lemoigne, Jacqueline

    1993-01-01

    Image analysis often starts with some preliminary segmentation which provides a representation of the scene needed for further interpretation. Segmentation can be performed in several ways, categorized as pixel-based, edge-based, and region-based. Each of these approaches is affected differently by various factors, and the final result may be improved by integrating several or all of these methods, thus taking advantage of their complementary nature. In this paper, we propose an approach that integrates pixel-based and edge-based results by utilizing an iterative relaxation technique. This approach has been implemented on a massively parallel computer and tested on remotely sensed imagery from the Landsat Thematic Mapper (TM) sensor.

  20. Criteria for the use of regression analysis for remote sensing of sediment and pollutants

    NASA Technical Reports Server (NTRS)

    Whitlock, C. H.; Kuo, C. Y.; Lecroy, S. R. (Principal Investigator)

    1982-01-01

    Data analysis procedures for the quantification of water quality parameters that are already identified and known to exist within the water body are considered. The linear multiple-regression technique was examined as a procedure for defining and calibrating data analysis algorithms for such instruments as spectrometers and multispectral scanners.

  1. Geostatistical analysis of groundwater level using Euclidean and non-Euclidean distance metrics and variable variogram fitting criteria

    NASA Astrophysics Data System (ADS)

    Theodoridou, Panagiota G.; Karatzas, George P.; Varouchakis, Emmanouil A.; Corzo Perez, Gerald A.

    2015-04-01

    Groundwater level is an important input in hydrological modelling. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using Kriging techniques, the selection of the optimal variogram model is very important for method performance. This work compares three different criteria (the least squares sum method, the Akaike Information Criterion, and Cressie's Indicator) for assessing how well a theoretical variogram fits the experimental one, and investigates their impact on the prediction results. Moreover, five different distance functions (Euclidean, Minkowski, Manhattan, Canberra, and Bray-Curtis) are applied to calculate the distance between observations, which affects both the variogram calculation and the Kriging estimator. Cross-validation analysis with Ordinary Kriging is applied using each distance metric in turn together with each of the three variogram fitting criteria. The spatial dependence of the observations in the tested dataset is studied by fitting classical variogram models and the Matérn model. The proposed comparison analysis was performed on a data set of two hundred fifty hydraulic head measurements distributed over an alluvial aquifer that covers an area of 210 km2. The study area is located in the Prefecture of Drama, which belongs to the Water District of East Macedonia (Greece), and was selected for its hydrogeological data availability and geological homogeneity. The analysis showed that the combination of the Akaike Information Criterion for variogram fitting assessment and the Bray-Curtis distance metric provided the most accurate cross-validation results. The power-law variogram model provided the best fit to the experimental data. For this dataset, the proposed approach improves the prediction efficiency of Ordinary Kriging in comparison to the classical Euclidean distance metric. Therefore, maps of the spatial
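
    The cross-validation machinery above hinges on two interchangeable pieces: the distance metric used to build the experimental variogram and the criterion used to score a fitted model. A sketch of both, using scipy's built-in Bray-Curtis distance and an AIC score for a power-law variogram; the data are synthetic and the binning is an assumption:

        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.optimize import curve_fit

        def experimental_variogram(xy, z, metric="braycurtis", n_bins=15):
            """Bin half squared differences of z by pairwise distance."""
            d = pdist(xy, metric=metric)
            g = 0.5 * pdist(z[:, None], metric="sqeuclidean")  # (z_i - z_j)^2 / 2
            edges = np.linspace(0, d.max(), n_bins + 1)
            idx = np.digitize(d, edges) - 1
            h = np.array([d[idx == i].mean() if np.any(idx == i) else np.nan
                          for i in range(n_bins)])
            gamma = np.array([g[idx == i].mean() if np.any(idx == i) else np.nan
                              for i in range(n_bins)])
            return h, gamma

        def power_law(h, c, alpha):
            return c * h ** alpha

        rng = np.random.default_rng(4)
        xy = rng.uniform(0, 10, (250, 2))            # 250 head observations
        z = xy[:, 0] + 0.5 * rng.normal(size=250)    # trend + noise

        h, gamma = experimental_variogram(xy, z)
        ok = ~np.isnan(gamma)
        params, _ = curve_fit(power_law, h[ok], gamma[ok], p0=[1.0, 1.0])
        rss = np.sum((gamma[ok] - power_law(h[ok], *params)) ** 2)
        aic = ok.sum() * np.log(rss / ok.sum()) + 2 * 2   # k = 2 parameters
        print("c, alpha:", params.round(3), "AIC:", round(aic, 2))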

  2. Imaging Tests for the Diagnosis and Staging of Pancreatic Adenocarcinoma: A Meta-Analysis.

    PubMed

    Treadwell, Jonathan R; Zafar, Hanna M; Mitchell, Matthew D; Tipton, Kelley; Teitelbaum, Ursina; Jue, Jane

    2016-07-01

    Imaging tests are central to the diagnosis and staging of pancreatic adenocarcinoma. We performed a systematic review and meta-analysis of the pertinent evidence on 5 imaging tests (computed tomography (CT), magnetic resonance imaging, CT angiography, endoscopic ultrasound with fine-needle aspiration, and combined positron emission tomography with CT). Searches of several databases up to March 1, 2014, yielded 9776 articles, and 24 provided comparative effectiveness of 2 or more imaging tests. Multiple reviewers applied study inclusion criteria, extracted data from each study, rated the risk of bias, and graded the strength of evidence. Data included accuracy of diagnosis and resectability in primary untreated pancreatic adenocarcinoma, including tumor stage, nodal stage, metastases, and vascular involvement. Where possible, study results were combined using bivariate meta-analysis. Studies were at low or moderate risk of bias. Most comparisons between imaging tests were insufficient to permit conclusions, due to imprecision or inconsistency among study results. However, moderate-grade evidence revealed that CT and magnetic resonance imaging had similar sensitivities and specificities for both diagnosis and vascular involvement. Other conclusions were based on low-grade evidence. In general, more direct evidence is needed to inform decisions about imaging tests for pancreatic adenocarcinoma. PMID:26745859

  3. APPLICATION OF PRINCIPAL COMPONENT ANALYSIS TO RELAXOGRAPHIC IMAGES

    SciTech Connect

    STOYANOVA,R.S.; OCHS,M.F.; BROWN,T.R.; ROONEY,W.D.; LI,X.; LEE,J.H.; SPRINGER,C.S.

    1999-05-22

    Standard analysis methods for processing inversion-recovery MR images have traditionally used single-pixel techniques, in which each pixel is independently fit to an exponential recovery and spatial correlations in the data set are ignored. By analyzing the image as a complete dataset, improved error analysis and automatic segmentation can be achieved. Here, the authors apply principal component analysis (PCA) to a series of relaxographic images. This procedure decomposes the 3-dimensional data set into three separate images and corresponding recovery times. They interpret the three images as spatial representations of gray matter (GM), white matter (WM), and cerebrospinal fluid (CSF) content.
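
    Treating the inversion-recovery series as one dataset, as the authors do, reduces to stacking each image as a row, centering, and taking the leading singular vectors; each retained component is then reshaped back into a spatial map. A minimal sketch with a synthetic three-tissue series; the recovery times and masks are illustrative assumptions:

        import numpy as np

        def pca_images(series, n_components=3):
            """PCA across an image series of shape (n_times, ny, nx)."""
            n_t, ny, nx = series.shape
            X = series.reshape(n_t, -1)
            X = X - X.mean(axis=0)               # center each pixel's recovery
            U, S, Vt = np.linalg.svd(X, full_matrices=False)
            comps = Vt[:n_components].reshape(n_components, ny, nx)  # spatial maps
            scores = U[:, :n_components] * S[:n_components]          # time courses
            return comps, scores

        # synthetic inversion-recovery series: 12 inversion times, 64x64 image
        t = np.linspace(0.05, 3.0, 12)[:, None, None]
        gm = 1 - 2 * np.exp(-t / 1.0)            # three tissue recovery curves
        wm = 1 - 2 * np.exp(-t / 0.7)
        csf = 1 - 2 * np.exp(-t / 3.5)
        masks = np.zeros((3, 64, 64))
        masks[0, :20] = 1; masks[1, 20:44] = 1; masks[2, 44:] = 1
        series = gm * masks[0] + wm * masks[1] + csf * masks[2]
        maps, curves = pca_images(series)
        print(maps.shape, curves.shape)          # (3, 64, 64) (12, 3)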

  4. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of a foot thermal image.) PMID:26556680

  5. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    NASA Astrophysics Data System (ADS)

    Liang, Jianming; Järvi, Timo; Kiuru, Aaro; Kormano, Martti; Svedström, Erkki

    2003-12-01

    The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopical examination takes only about 2 seconds for perfusion study with only low radiation dose to patient, involving no preparation, no radioactive isotopes, and no contrast media.

  6. QuickSilver: A Phase II Study Using Magnetic Resonance Imaging Criteria to Identify “Good Prognosis” Rectal Cancer Patients Eligible for Primary Surgery

    PubMed Central

    2015-01-01

    Background Recently, two nonrandomized, prospective cohort studies used magnetic resonance imaging (MRI) to assess the circumferential resection margin to identify “good prognosis” rectal tumors eligible for primary surgery and have reported favorable outcomes. Objective The objective of this project was to conduct a Phase II trial to assess the safety and feasibility of MRI criteria to identify “good prognosis” rectal tumors eligible for primary surgery in the North American setting. Methods Patients with newly diagnosed primary rectal cancer attending surgical clinics at participating centers will be invited to participate in the study. The inclusion criteria for the study are: (1) diagnosis of rectal cancer (0-15 cm) from the anal verge on endoscopy and proximal extent of tumor at or below the sacral promontory on computed tomography (CT) or MRI; (2) meets all MRI criteria for “good prognosis” rectal tumor as defined by the study protocol; (3) 18 years or older; and (4) able to provide written consent. The initial assessment will include: (1) clinical and endoscopic examination of the primary tumor; (2) CT chest, abdomen, and pelvis; and (3) pelvic MRI. All potentially eligible cases will be presented at a multidisciplinary cancer conference to assess for eligibility based on the MRI criteria for “good prognosis” tumor which include: (1) predicted circumferential resection margin (CRM) > 1 mm; (2) definite T2, T2/early T3, or definite T3 tumor with < 5 mm of extramural depth of invasion (EMD); (3) any N0, N1, or N2; and (4) absence of extramural venous invasion (EMVI). All patients fulfilling the MRI criteria for “good prognosis” rectal cancer and the inclusion and exclusion criteria will be invited to participate in the study and proceed to primary surgery. The safety of the MRI criteria will be evaluated by assessing the positive CRM rate and is the primary outcome for the study. Results We expect to have a minimum of 300 potentially

  7. Wild Fire Risk Map in the Eastern Steppe of Mongolia Using Spatial Multi-Criteria Analysis

    NASA Astrophysics Data System (ADS)

    Nasanbat, Elbegjargal; Lkhamjav, Ochirkhuyag

    2016-06-01

    Grassland fire is a cause of major disturbance to ecosystems and economies throughout the world. This paper investigates wildfire risk zones on the Eastern Steppe of Mongolia. The study selected variables for wildfire risk assessment from a combination of data sources, including socio-economic data, climate records, Geographic Information Systems, remotely sensed imagery, and statistical yearbook information, and the results were evaluated using field validation data. The data were organized into main groups of factors (environmental, socio-economic, climate, and fire information) comprising eleven input variables, which were classified into five risk-level categories according to importance criteria and ranks. All of the explanatory variables were integrated into a spatial model and used to estimate the wildfire risk index. Within the index, five categories were created, based on spatial statistics, to adequately assess the respective fire risk: very high risk, high risk, moderate risk, low risk, and very low risk. Approximately 68 percent of the study area was predicted to fall within the very high, high, and moderate risk zones. The percentages of actual fires in each fire risk zone were as follows: very high risk, 42 percent; high risk, 26 percent; moderate risk, 13 percent; low risk, 8 percent; and very low risk, 11 percent. The overall prediction accuracy of the model was 62 percent. The model and results could support spatial decision-making processes and preventative wildfire management strategies, and could also help improve ecological and biodiversity conservation management.
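
    The risk-index construction described above is a weighted overlay: each input layer is reclassified to a common risk scale, multiplied by its importance weight, summed, and the result binned into the five zones. A small raster sketch; the layer names, weights, and quantile binning are hypothetical:

        import numpy as np

        rng = np.random.default_rng(6)
        shape = (100, 100)
        # hypothetical input layers, each already reclassified to a 1-5 risk scale
        layers = {"vegetation": rng.integers(1, 6, shape),
                  "temperature": rng.integers(1, 6, shape),
                  "distance_to_roads": rng.integers(1, 6, shape),
                  "fire_history": rng.integers(1, 6, shape)}
        weights = {"vegetation": 0.35, "temperature": 0.25,
                   "distance_to_roads": 0.15, "fire_history": 0.25}

        index = sum(weights[k] * layers[k].astype(float) for k in layers)

        # split the continuous index into five zones by quantile
        edges = np.quantile(index, [0.2, 0.4, 0.6, 0.8])
        zones = np.digitize(index, edges)   # 0 = very low ... 4 = very high
        for name, z in zip(["very low", "low", "moderate", "high", "very high"],
                           range(5)):
            print(name, round(100.0 * (zones == z).mean(), 1), "%")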

  8. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning

    NASA Astrophysics Data System (ADS)

    Teichert, K.; Süss, P.; Serna, J. I.; Monz, M.; Küfer, K. H.; Thieke, C.

    2011-06-01

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context, optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal, and to re-adjust the choice accordingly during navigation. The method is applied to an artificial case and two clinical head-and-neck cases. In all cases, no configuration dominates its competitor over the whole Pareto set. For example, in one of the head-and-neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than the classical method of comparing single treatment plans.
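
    The set comparison described above needs two primitives: a dominance test between points of the two Pareto sets and a distance from each point to the competing set. A minimal sketch for minimization objectives, with two hypothetical two-objective fronts (the clinical plan data are not reproduced here):

        import numpy as np

        def dominated_by_set(p, other):
            """True if some point of `other` is <= p in all objectives, < in one."""
            le = np.all(other <= p, axis=1)
            lt = np.any(other < p, axis=1)
            return bool(np.any(le & lt))

        def compare_pareto_sets(A, B):
            """For each point of A: is it dominated by B, and how far is B?"""
            dominated = np.array([dominated_by_set(p, B) for p in A])
            dist = np.array([np.min(np.linalg.norm(B - p, axis=1)) for p in A])
            return dominated, dist

        # hypothetical 2-objective Pareto fronts (e.g. target dose vs. cord dose)
        A = np.array([[1.0, 5.0], [2.0, 3.0], [4.0, 2.0]])   # 7-beam plan set
        B = np.array([[1.5, 4.0], [2.5, 2.5], [5.0, 1.5]])   # 9-beam plan set
        dom, dist = compare_pareto_sets(A, B)
        print("A points dominated by B:", dom, "| distance to B:", dist.round(2))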

  9. ACR Appropriateness Criteria Crohn Disease.

    PubMed

    Kim, David H; Carucci, Laura R; Baker, Mark E; Cash, Brooks D; Dillman, Jonathan R; Feig, Barry W; Fowler, Kathryn J; Gage, Kenneth L; Noto, Richard B; Smith, Martin P; Yaghmai, Vahid; Yee, Judy; Lalani, Tasneem

    2015-10-01

    Crohn disease is a chronic inflammatory disorder involving the gastrointestinal tract, characterized by episodic flares and times of remission. Underlying structural damage occurs progressively, with recurrent bouts of inflammation. The diagnosis and management of this disease process is dependent on several clinical, laboratory, imaging, endoscopic, and histologic factors. In recent years, with the maturation of CT enterography, and MR enterography, imaging has played an increasingly important role in relation to Crohn Disease. In addition to these specialized examination modalities, ultrasound and routine CT have potential uses. Fluoroscopy, radiography, and nuclear medicine may be less beneficial depending on the clinical scenario. The imaging modality best suited to evaluating this disease may change, depending on the target population, severity of presentation, and specific clinical situation. This document presents seven clinical scenarios (variants) in both the adult and pediatric populations and rates the appropriateness of the available imaging options. They are summarized in a consolidated table, and the underlying rationale and supporting literature are presented in the accompanying narrative. The ACR Appropriateness Criteria are evidence-based guidelines for specific clinical conditions that are reviewed every three years by a multidisciplinary expert panel. The guideline development and review include an extensive analysis of current medical literature from peer-reviewed journals and the application of a well established consensus methodology (modified Delphi) to rate the appropriateness of imaging and treatment procedures by the panel. In those instances in which evidence is lacking or not definitive, expert opinion may be used to recommend imaging or treatment. PMID:26435118

  10. Quartic Rotation Criteria and Algorithms.

    ERIC Educational Resources Information Center

    Clarkson, Douglas B.; Jennrich, Robert I.

    1988-01-01

    Most of the current analytic rotation criteria for simple structure in factor analysis are summarized and identified as members of a general symmetric family of quartic criteria. A unified development of algorithms for orthogonal and direct oblique rotation using arbitrary criteria from this family is presented. (Author/TJH)

  11. High resolution ultraviolet imaging spectrometer for latent image analysis.

    PubMed

    Lyu, Hang; Liao, Ningfang; Li, Hongsong; Wu, Wenmin

    2016-03-21

    In this work, we present a close-range ultraviolet imaging spectrometer with high spatial resolution and reasonably high spectral resolution. As transmissive optical components cause chromatic aberration in the ultraviolet (UV) spectral range, an all-reflective imaging scheme is introduced to improve image quality. The proposed instrument consists of an oscillating mirror, a Cassegrain objective, a Michelson structure, an Offner relay, and a UV-enhanced CCD. The finished spectrometer has a spatial resolution of 29.30 μm on the target plane; the spectral range covers both the near and middle UV bands, with approximately 100 wavelength samples over the range of 240-370 nm. The control computer coordinates all the components of the instrument and enables capturing a series of images, which can be reconstructed into an interferogram datacube. The datacube can be converted into a spectrum datacube, which contains spectral information for each pixel with many wavelength samples. A spectral calibration is carried out using a high-pressure mercury discharge lamp. A test run demonstrated that this interferometric configuration can obtain high-resolution spectrum datacubes. A pattern recognition algorithm is introduced to analyze the datacube and distinguish latent traces from the base materials. This design is particularly good at identifying latent traces in the application field of forensic imaging. PMID:27136837
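
    The core step of turning a Michelson interferogram into a spectrum is a Fourier transform along the optical-path-difference axis. The single-pixel sketch below illustrates this with synthetic UV lines; the line positions, sampling and windowing are illustrative assumptions, not the instrument's actual processing chain.

```python
import numpy as np
from scipy.signal import find_peaks

n = 1024
opd = np.linspace(-50e-6, 50e-6, n)                  # optical path difference (m)
lines = np.array([260e-9, 313e-9, 365e-9])           # assumed UV emission lines (m)
igram = sum(np.cos(2 * np.pi * opd / wl) for wl in lines)   # single-pixel interferogram

spec = np.abs(np.fft.rfft(igram * np.hanning(n)))    # spectrum vs. wavenumber
wavenumber = np.fft.rfftfreq(n, d=opd[1] - opd[0])   # cycles per metre
peaks, _ = find_peaks(spec, height=0.5 * spec.max())
print("recovered wavelengths (nm):", np.sort(1e9 / wavenumber[peaks]).round(1))
```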

  12. A framework for joint image-and-shape analysis

    NASA Astrophysics Data System (ADS)

    Gao, Yi; Tannenbaum, Allen; Bouix, Sylvain

    2014-03-01

    Techniques in medical image analysis are often used for comparison or regression on image intensities. In general, the domain of the image is a given Cartesian grid. Shape analysis, on the other hand, studies the similarities and differences among spatial objects of arbitrary geometry and topology. Usually, there is no function defined on the domain of the shapes. Recently, there has been a growing need for defining and analyzing functions on the shape space, and for a coupled analysis of both the shapes and the functions defined on them. Following this direction, in this work we present a coupled analysis for both images and shapes. As a result, statistically significant discrepancies in the image intensities as well as in the underlying shapes are detected. The method is applied to brain images of schizophrenia patients and heart images of atrial fibrillation patients.

  13. A multi-criteria analysis of options for energy recovery from municipal solid waste in India and the UK.

    PubMed

    Yap, H Y; Nixon, J D

    2015-12-01

    Energy recovery from municipal solid waste plays a key role in sustainable waste management and energy security. However, there are numerous technologies that vary in suitability for different economic and social climates. This study sets out to develop and apply a multi-criteria decision making methodology that can be used to evaluate the trade-offs between the benefits, opportunities, costs and risks of alternative energy from waste technologies in both developed and developing countries. The technologies considered are mass burn incineration, refuse derived fuel incineration, gasification, anaerobic digestion and landfill gas recovery. By incorporating qualitative and quantitative assessments, a preference ranking of the alternative technologies is produced. The effect of variations in decision criteria weightings is analysed in a sensitivity analysis. The methodology is applied principally to compare and assess energy recovery from waste options in the UK and India. These two countries have been selected as they could both benefit from further development of their waste-to-energy strategies, but have different technical and socio-economic challenges to consider. It is concluded that gasification is the preferred technology for the UK, whereas anaerobic digestion is the preferred technology for India. We believe that the presented methodology will be of particular value for waste-to-energy decision-makers in both developed and developing countries. PMID:26275797
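
    A minimal sketch of the kind of weighted multi-criteria ranking described above, with a crude sensitivity check on the weights. The option scores and weights are invented for illustration; the paper's actual benefits-opportunities-costs-risks model is more elaborate.

```python
import numpy as np

options = ["mass burn", "RDF incineration", "gasification",
           "anaerobic digestion", "landfill gas"]
# rows: options; columns: benefits, opportunities, costs, risks (0-1, higher = better)
scores = np.array([[0.6, 0.5, 0.4, 0.5],
                   [0.6, 0.5, 0.5, 0.5],
                   [0.7, 0.8, 0.4, 0.4],
                   [0.5, 0.7, 0.7, 0.6],
                   [0.4, 0.4, 0.8, 0.7]])
weights = np.array([0.3, 0.2, 0.3, 0.2])

order = np.argsort(scores @ weights)[::-1]
print("ranking:", [options[i] for i in order])

# Sensitivity: perturb the weights and count how often each option comes out on top
rng = np.random.default_rng(1)
tops = []
for _ in range(1000):
    w = weights * rng.uniform(0.8, 1.2, size=4)
    w /= w.sum()
    tops.append(int(np.argmax(scores @ w)))
print("top-option frequency:", np.bincount(tops, minlength=5) / 1000)
```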

  14. Analysis of physical processes via imaging vectors

    NASA Astrophysics Data System (ADS)

    Volovodenko, V.; Efremova, N.; Efremov, V.

    2016-06-01

    Practically all modeling processes are in one way or another random. The most fully developed theoretical foundation covers Markov processes, which can be represented in different forms. A Markov process is a random process that undergoes transitions from one state to another on a state space, where the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it. In Markov processes, the proposition (model) of the future does not change in the event of expansion and/or strong information progression relative to preceding time. Modeling physical fields generally involves processes that change in time, i.e., non-stationary processes. In this case, applying the Laplace transformation introduces unjustified descriptive complications, whereas a transition to other representations yields an explicit simplification. The method of imaging vectors provides constructive mathematical models and the necessary transitions in the modeling process and the analysis itself. The flexibility of a model built on a polynomial basis allows a rapid change of the mathematical model and accelerates further analysis. It should be noted that the mathematical description permits an operator representation. Conversely, an operator representation of the structures, algorithms and data-processing procedures significantly improves the flexibility of the modeling process.

  15. LANDSAT-4 image data quality analysis

    NASA Technical Reports Server (NTRS)

    Anuta, P. E. (Principal Investigator)

    1982-01-01

    Work done on evaluating the geometric and radiometric quality of early LANDSAT-4 sensor data is described. Band-to-band and channel-to-channel registration evaluations were carried out using a line correlator. Visual blink comparisons were run on an image display to observe band-to-band registration over 512 x 512 pixel blocks. The results indicate a 0.5-pixel line misregistration between the 1.55-1.75 and 2.08-2.35 micrometer bands and the first four bands. A misregistration of the thermal IR band by four 30-m lines and columns was also observed. Radiometric evaluation included mean and variance analysis of individual detectors and principal components analysis. Results indicate that detector bias for all bands is very close to or within tolerance. Bright spots were observed in the thermal IR band on an 18-line by 128-pixel grid. No explanation for this was pursued. The general overall quality of the TM was judged to be very high.

  16. Dynamic chest image analysis: model-based pulmonary perfusion analysis with pyramid images

    NASA Astrophysics Data System (ADS)

    Liang, Jianming; Haapanen, Arto; Jaervi, Timo; Kiuru, Aaro J.; Kormano, Martti; Svedstrom, Erkki; Virkki, Raimo

    1998-07-01

    The aim of the study 'Dynamic Chest Image Analysis' is to develop computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected at different phases of the respiratory/cardiac cycles in a short period of time. We have proposed a framework for ventilation study with an explicit ventilation model based on pyramid images. In this paper, we extend the framework to pulmonary perfusion study. A perfusion model and the truncated pyramid are introduced. The perfusion model aims at extracting accurate, geographic perfusion parameters, and the truncated pyramid helps in understanding perfusion at multiple resolutions and speeding up the convergence process in optimization. Three cases are included to illustrate the experimental results.

  17. Multi-criteria evaluation of CMIP5 GCMs for climate change impact analysis

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Rana, Arun; Moradkhani, Hamid; Sharma, Ashish

    2015-12-01

    Climate change is expected to have severe impacts on the global hydrological cycle along with the food-water-energy nexus. Currently, there are many climate models used in predicting important climatic variables. Though there have been advances in the field, there are still many problems to be resolved related to reliability, uncertainty, and computing needs, among many others. In the present work, we have analyzed the performance of 20 different global climate models (GCMs) from the Coupled Model Intercomparison Project Phase 5 (CMIP5) dataset over the Columbia River Basin (CRB) in the Pacific Northwest USA. We demonstrate a statistical multicriteria approach, using univariate and multivariate techniques, for selecting suitable GCMs to be used for climate change impact analysis in the region. Univariate methods include the mean, standard deviation, coefficient of variation, relative change (variability), Mann-Kendall test, and Kolmogorov-Smirnov test (KS-test); the multivariate methods used were principal component analysis (PCA), singular value decomposition (SVD), canonical correlation analysis (CCA), and cluster analysis. The analysis is performed on raw GCM data, i.e., before bias correction, for the precipitation and temperature variables of all 20 models, to capture the reliability and nature of each model at the regional scale. The analysis is based on spatially averaged datasets of GCMs and observations for the period 1970 to 2000. Each GCM is ranked based on its performance evaluated against gridded observational data on various temporal scales (daily, monthly, and seasonal). The results provide insight into each of the methods, the statistical properties they address, and their use in ranking GCMs. Further, an evaluation was also performed for raw GCM simulations against different sets of gridded observational data in the area.
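
    As an example of one univariate screening step from the toolbox listed above, the sketch below ranks synthetic GCM series against a synthetic observed series by the two-sample Kolmogorov-Smirnov statistic; all series and model names are fabricated placeholders.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
obs = rng.gamma(shape=2.0, scale=3.0, size=372)   # e.g. monthly precipitation, 31 years
models = {f"GCM{i:02d}": rng.gamma(2.0, 3.0 + 0.1 * i, size=372)
          for i in range(1, 6)}                   # fabricated model outputs

stats = {name: ks_2samp(obs, sim).statistic for name, sim in models.items()}
for name, d in sorted(stats.items(), key=lambda kv: kv[1]):
    print(f"{name}: KS statistic = {d:.3f}")      # smaller = closer to observations
```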

  18. Medical Image Analysis by Cognitive Information Systems - a Review.

    PubMed

    Ogiela, Lidia; Takizawa, Makoto

    2016-10-01

    This publication presents a review of medical image analysis systems. The paradigms of cognitive information systems are presented through examples of medical image analysis systems, together with the semantic processes they apply to different types of medical images. Cognitive information systems are defined on the basis of methods for the semantic analysis and interpretation of information - here, medical images - applied to the cognitive meaning of the medical images contained in the analyzed data sets. Semantic analysis is proposed to analyze the meaning of the data; meaning is carried by information, for example by medical images. Medical image analysis is presented and discussed as applied to various types of medical images showing selected human organs with different pathologies; those images were analyzed using different classes of cognitive information systems. Cognitive information systems dedicated to medical image analysis are also defined for decision-support tasks. This is important, for example, in diagnostic and therapy processes and in the selection of semantic aspects/features from the analyzed data sets; those features enable a new way of analysis. PMID:27526188

  19. Image based SAR product simulation for analysis

    NASA Technical Reports Server (NTRS)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new product simulation method is described that also employs a real SAR image as input; this can be denoted 'image-based simulation'. Different methods to perform this SAR prediction are presented, and their advantages and disadvantages are discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used for verification of the concept: input images from ascending orbits were converted into images from a descending orbit, and the results were compared to the available real imagery to verify that the prediction technique produces meaningful image data.

  20. Imaging biomarkers in multiple Sclerosis: From image analysis to population imaging.

    PubMed

    Barillot, Christian; Edan, Gilles; Commowick, Olivier

    2016-10-01

    The production of imaging data in medicine increases more rapidly than the capacity of computing models to extract information from it. The grand challenges of better understanding the brain, offering better care for neurological disorders, and stimulating new drug design will not be achieved without significant advances in computational neuroscience. The road to success is to develop a new, generic computational methodology and to confront and validate this methodology on relevant diseases with adapted computational infrastructures. This new concept sustains the need to build new research paradigms to better understand the natural history of the pathology at the early phase, and to better aggregate data that will provide the most complete representation of the pathology in order to better correlate imaging with other relevant features such as clinical, biological or genetic data. In this context, one of the major challenges of neuroimaging in clinical neurosciences is to detect quantitative signs of pathological evolution as early as possible to prevent disease progression, evaluate therapeutic protocols or even better understand and model the natural history of a given neurological pathology. Many diseases encompass brain alterations often not visible on conventional MRI sequences, especially in normal-appearing brain tissues (NABT). MRI often has low specificity for differentiating between possible pathological changes that could help in discriminating between the different pathological stages or grades. The objective of medical image analysis procedures is to define new quantitative neuroimaging biomarkers to track the evolution of the pathology at different levels. This paper illustrates this issue in one acute neuro-inflammatory pathology: multiple sclerosis (MS). It exhibits the current medical image analysis approaches and explains how this field of research will evolve in the next decade to integrate larger scales of information at the temporal, cellular

  1. Image pattern recognition supporting interactive analysis and graphical visualization

    NASA Technical Reports Server (NTRS)

    Coggins, James M.

    1992-01-01

    Image Pattern Recognition attempts to infer properties of the world from image data. Such capabilities are crucial for making measurements from satellite or telescope images related to Earth and space science problems. Such measurements can be the required product itself, or the measurements can be used as input to a computer graphics system for visualization purposes. At present, the field of image pattern recognition lacks a unified scientific structure for developing and evaluating image pattern recognition applications. The overall goal of this project is to begin developing such a structure. This report summarizes results of a 3-year research effort in image pattern recognition addressing the following three principal aims: (1) to create a software foundation for the research and identify image pattern recognition problems in Earth and space science; (2) to develop image measurement operations based on Artificial Visual Systems; and (3) to develop multiscale image descriptions for use in interactive image analysis.

  2. Atomic force microscope, molecular imaging, and analysis.

    PubMed

    Chen, Shu-wen W; Teulon, Jean-Marie; Godon, Christian; Pellequer, Jean-Luc

    2016-01-01

    Image visibility is a central issue in analyzing all kinds of microscopic images. An increase in intensity contrast helps to raise image visibility and thereby reveal fine image features. Accordingly, a proper evaluation of results obtained with the current imaging parameters can be used as feedback for future imaging experiments. In this work, we have applied the Laplacian function of image intensity as either an additive component (Laplacian mask) or a multiplying factor (Laplacian weight) for enhancing the contrast of high-resolution AFM images of two molecular systems: an unknown protein imaged in air, provided by AFM COST Action TD1002 (http://www.afm4nanomedbio.eu/), and tobacco mosaic virus (TMV) particles imaged in liquid. Based on both visual inspection and a quantitative representation of contrast measurements, we found that the Laplacian weight is more effective than the Laplacian mask for the unknown protein, whereas for the TMV system the strengthened Laplacian mask is superior to the Laplacian weight. The present results indicate that a mathematical function, as exemplified by the Laplacian function, may yield varied processing effects with different operations. To interpret the diversity of molecular structure and topology in images, an explicit expression of the processing procedures should be included in scientific reports alongside the instrumental setups. PMID:26224520
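
    A hedged sketch of the two operations named above, applied to a synthetic height image; the exact formulas and coefficients used in the paper are not reproduced, so alpha and beta below are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

rng = np.random.default_rng(7)
img = gaussian_filter(rng.random((256, 256)), sigma=4)   # smooth synthetic surface
lap = laplace(img)

alpha = 1.0                                   # assumed mask strength
masked = img - alpha * lap                    # "Laplacian mask": additive sharpening

beta = 2.0                                    # assumed weight strength
weight = 1.0 + beta * np.abs(lap) / (np.abs(lap).max() + 1e-12)
weighted = img * weight                       # "Laplacian weight": multiplying factor

for name, out in [("mask", masked), ("weight", weighted)]:
    print(f"{name}: std {out.std():.4f} vs original {img.std():.4f}")
```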

  3. Chapter 1 Eligibility Factors and Weights: Using Probit Analysis To Determine Eligibility Criteria.

    ERIC Educational Resources Information Center

    Willis, John A.

    Kanawha County (West Virginia) schools use Z-scores to identify elementary students eligible for Chapter 1 services in reading and mathematics. A probit analysis of over 500 previously served students was used to determine the variables and weights in the Z-score equations. Independent variables were chosen from those commonly used to identify…

  4. Optimal design and evaluation criteria for acoustic emission pulse signature analysis

    NASA Technical Reports Server (NTRS)

    Houghton, J. R.; Townsend, M. A.; Packman, P. F.

    1977-01-01

    Successful pulse recording and evaluation is strongly dependent on the instrumentation system selected and the method of analyzing the pulse signature. The paper studies system design, signal analysis techniques, and their interdependences with a view toward defining optimal approaches to pulse signal analysis. For this purpose, the instrumentation system is modeled, and analytical pulses, representative of the types of acoustic emissions to be distinguished, are passed through the system. Particular attention is given to comparing frequency-spectrum analysis with deconvolution, referred to as time-domain reconstruction of the pulse or pulse train. The possibility of optimal transducer-filter system parameters is investigated. Deconvolution of a pulse is shown to be a superior approach for transient pulse analysis. Reshaping of a transducer output back to the original input pulse is possible and gives an accurate representation of the generating pulse in the time domain. Any definable transducer and filter system can be used for measurement of pulses by means of the deconvolution method. Selection of design variables for general usage is discussed.
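
    A minimal sketch of the deconvolution idea: recover the generating pulse by dividing out the (assumed) transducer response in the frequency domain, with Wiener-style regularization to keep the division stable. The pulse, the response, and the regularization level are synthetic assumptions.

```python
import numpy as np

n = 1024
t = np.arange(n) * 1e-7                                 # 0.1 us sampling (assumed)
pulse = np.exp(-((t - 10e-6) / 1e-6) ** 2)              # generating pulse
h = np.exp(-t / 5e-6) * np.sin(2 * np.pi * 2e5 * t)     # assumed transducer response
measured = np.convolve(pulse, h)[:n]                    # what the instrument records

P, H = np.fft.rfft(measured), np.fft.rfft(h)
eps = 1e-3 * np.max(np.abs(H)) ** 2                     # regularization level (assumed)
recovered = np.fft.irfft(P * np.conj(H) / (np.abs(H) ** 2 + eps), n=n)
print("pulse peak error (samples):",
      abs(int(np.argmax(recovered)) - int(np.argmax(pulse))))
```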

  5. ST segment/heart rate slope as a predictor of coronary artery disease: comparison with quantitative thallium imaging and conventional ST segment criteria

    SciTech Connect

    Finkelhor, R.S.; Newhouse, K.E.; Vrobel, T.R.; Miron, S.D.; Bahler, R.C.

    1986-08-01

    The ST segment shift relative to exercise-induced increments in heart rate, the ST/heart rate slope (ST/HR slope), has been proposed as a more accurate ECG criterion for diagnosing significant coronary artery disease (CAD). Its clinical utility, with the use of a standard treadmill protocol, was compared with quantitative stress thallium (Tl) imaging and standard treadmill criteria in 64 unselected patients who underwent coronary angiography. The overall diagnostic accuracy of the ST/HR slope was an improvement over Tl and conventional ST criteria (81%, 67%, and 69%). For patients failing to reach 85% of their age-predicted maximal heart rate, its diagnostic accuracy was comparable with Tl (77% and 74%). Its sensitivity in patients without prior myocardial infarction was equivalent to that of thallium (91% and 95%). The ST/HR slope was directly related to the angiographic severity (Gensini score) of CAD in patients without a prior infarction (r = 0.61, p less than 0.001). The ST/HR slope was an improved ECG criterion for diagnosing CAD and compared favorably with Tl imaging.

  6. Wndchrm – an open source utility for biological image analysis

    PubMed Central

    Shamir, Lior; Orlov, Nikita; Eckley, D Mark; Macura, Tomasz; Johnston, Josiah; Goldberg, Ilya G

    2008-01-01

    Background Biological imaging is an emerging field, covering a wide range of applications in biological and clinical research. However, while machinery for automated experimentation and data acquisition has been developing rapidly in the past years, automated image analysis often introduces a bottleneck in high content screening. Methods Wndchrm is an open source utility for biological image analysis. The software works by first extracting image content descriptors from the raw image, image transforms, and compound image transforms. Then, the most informative features are selected, and the feature vector of each image is used for classification and similarity measurement. Results Wndchrm has been tested using several publicly available biological datasets, and provided results that compare favorably with the performance of task-specific algorithms developed for these datasets. The simple user interface allows researchers who are not knowledgeable in computer vision methods and have no background in computer programming to apply image analysis to their data. Conclusion We suggest that wndchrm can be effectively used for a wide range of biological image analysis tasks. Using wndchrm can allow scientists to perform automated biological image analysis while avoiding the costly challenge of implementing computer vision and pattern recognition algorithms. PMID:18611266

  7. HOW TO DEAL WITH WASTE ACCEPTANCE UNCERTAINTY USING THE WASTE ACCEPTANCE CRITERIA FORECASTING AND ANALYSIS CAPABILITY SYSTEM (WACFACS)

    SciTech Connect

    Redus, K. S.; Hampshire, G. J.; Patterson, J. E.; Perkins, A. B.

    2002-02-25

    The Waste Acceptance Criteria Forecasting and Analysis Capability System (WACFACS) is used to plan for, evaluate, and control the supply of approximately 1.8 million yd³ of low-level radioactive, TSCA, and RCRA hazardous wastes from over 60 environmental restoration projects from FY02 through FY10 to the Oak Ridge Environmental Management Waste Management Facility (EMWMF). WACFACS is a validated decision support tool that propagates uncertainties inherent in site-related contaminant characterization data, disposition volumes during EMWMF operations, and project schedules to quantitatively determine the confidence that risk-based performance standards are met. Trade-offs in schedule, volumes of waste lots, and allowable concentrations of contaminants are performed to optimize project waste disposition, regulatory compliance, and disposal cell management.

  8. Wave-Optics Analysis of Pupil Imaging

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.; Bos, Brent J.

    2006-01-01

    Pupil imaging performance is analyzed from the perspective of physical optics. A multi-plane diffraction model is constructed by propagating the scalar electromagnetic field, surface by surface, along the optical path comprising the pupil imaging optical system. Modeling results are compared with pupil images collected in the laboratory. The experimental setup, although generic for pupil imaging systems in general, has application to the James Webb Space Telescope (JWST) optical system characterization where the pupil images are used as a constraint to the wavefront sensing and control process. Practical design considerations follow from the diffraction modeling which are discussed in the context of the JWST Observatory.
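
    As a sketch of one building block of such a multi-plane diffraction model (not the authors' JWST code), the following propagates a sampled scalar field between planes with the angular-spectrum method; the grid size, wavelength and distance are arbitrary choices, and evanescent components are simply clamped.

```python
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z (all units metres)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))  # clamp evanescent
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

# Circular pupil propagated 0.5 m at 633 nm (illustrative values)
n, dx = 512, 10e-6
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)
pupil = (X ** 2 + Y ** 2 < (1e-3) ** 2).astype(complex)
out = angular_spectrum(pupil, 633e-9, dx, 0.5)
print("on-axis intensity:", round(float(np.abs(out[n // 2, n // 2]) ** 2), 3))
```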

  9. Identification in residue analysis based on liquid chromatography with tandem mass spectrometry: Experimental evidence to update performance criteria.

    PubMed

    Mol, Hans G J; Zomer, Paul; García López, Mónica; Fussell, Richard J; Scholten, Jos; de Kok, Andre; Wolheim, Anne; Anastassiades, Michelangelo; Lozano, Ana; Fernandez Alba, Amadeo

    2015-05-11

    Liquid chromatography-tandem mass spectrometry (LC-MS/MS) is one of the most widely used techniques for identification (and quantification) of residues and contaminants across a number of different chemical domains. Although the same analytical technique is used, the parameters and criteria for identification vary depending on where in the world the analysis is performed and for what purpose (e.g. determination of pesticides, veterinary drugs, forensic toxicology, sports doping). The rationale for these differences is not clear, and in most cases the criteria are essentially based on expert opinion rather than underpinned by experimental data. In the current study, the variability of the two key identification parameters, retention time and ion ratio, was assessed and compared against the requirements set out in different legal and guidance documents. The study involved the analysis of 120 pesticides, representing various chemical classes, polarities, molecular weights, and detector response factors, in 21 different fruit and vegetable matrices of varying degrees of complexity. The samples were analysed non-fortified, and fortified at 10, 50 and 200 μg kg⁻¹, in five laboratories using different LC-MS/MS instruments and conditions. In total, over 135,000 extracted-ion chromatograms were manually verified to provide an extensive data set for the assessment. The experimental data do not support relative tolerances for retention time, or different tolerances for ion ratios depending on the relative abundance of the two product ions measured. Retention times in today's chromatographic systems are sufficiently stable to justify an absolute tolerance of ±0.1 min. Ion ratios are stable as long as sufficient response is obtained for both product ions. Ion ratio deviations are typically within ±20% (relative), and within ±45% (relative) in cases where the responses of the product ions are close to the limit of detection. Ion ratio tolerances up to 50% did not result in false positives and

  10. Modified distance in average linkage based on M-estimator and MADn criteria in hierarchical cluster analysis

    NASA Astrophysics Data System (ADS)

    Muda, Nora; Othman, Abdul Rahman

    2015-10-01

    The process of grouping a set of objects into classes of similar objects is called clustering. It divides a large group of observations into smaller groups so that the observations within each group are relatively similar and the observations in different groups are relatively dissimilar. In this study, an agglomerative method in hierarchical cluster analysis is chosen, and clusters are constructed using an average linkage technique. The average linkage technique requires a distance between clusters, calculated as the average distance between all pairs of points, one from each group. This average distance is not robust when an outlier is present, so the average distance in average linkage needs to be modified to overcome the outlier problem. To do so, outliers are detected using the MADn criterion and the average distance is recalculated without them. Next, the distance in average linkage is calculated based on a modified one-step M-estimator (MOM). The resulting clusters are presented in a dendrogram. To evaluate the goodness of the modified distance in average linkage clustering, a bootstrap analysis is conducted on the dendrogram, and the bootstrap value (BP) is assessed for each branch that forms a group, to ensure the reliability of the branches constructed. This study found that the average linkage technique with the modified distance is significantly superior to the usual average linkage technique when an outlier is present; the two techniques perform similarly when there is no outlier.
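
    A loose sketch of the robust average-linkage idea: pairwise inter-cluster distances are screened with a MADn rule before averaging. The cutoff constant and the screening-on-distances shortcut are assumptions; the paper's MOM-based procedure differs in detail.

```python
import numpy as np

def madn(x):
    """Normalized median absolute deviation."""
    return 1.4826 * np.median(np.abs(x - np.median(x)))

def robust_average_linkage(A, B, cutoff=2.24):   # cutoff is an assumed constant
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2).ravel()
    scale = madn(d)
    keep = d if scale == 0 else d[np.abs(d - np.median(d)) <= cutoff * scale]
    return float(keep.mean())

rng = np.random.default_rng(4)
A = rng.normal(0, 1, size=(20, 2))
B = np.vstack([rng.normal(5, 1, size=(19, 2)), [[40.0, 40.0]]])   # one gross outlier
plain = float(np.linalg.norm(A[:, None] - B[None], axis=2).mean())
print("plain average linkage:  ", round(plain, 2))
print("robust average linkage: ", round(robust_average_linkage(A, B), 2))
```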

  11. An optimal control approach to pilot/vehicle analysis and Neal-Smith criteria

    NASA Technical Reports Server (NTRS)

    Bacon, B. J.; Schmidt, D. K.

    1984-01-01

    The approach of Neal and Smith was merged with the advances in pilot modeling by means of optimal control techniques. While confirming the findings of Neal and Smith, a methodology that explicitly includes the pilot's objective in attitude tracking was developed. More importantly, the method yields the required system bandwidth along with a better pilot model directly applicable to closed-loop analysis of systems in any order.

  12. Cost and Schedule Control Systems Criteria for contract performance measurement. Data Analysis Guide

    SciTech Connect

    1986-03-01

    The Data Analysis Guide has been prepared to aid both DOE and industry personnel in the effective use of contract performance measurement data. It suggests techniques for analyzing contractor cost and schedule data to give insight into current contract performance status and help validate contractor estimates of future contract performance. The techniques contained herein should be modified and tailored to fit particular project and special needs.

  13. The Dynairship. [structural design criteria and feasibility analysis of an airplane - airship

    NASA Technical Reports Server (NTRS)

    Miller, W. M., Jr.

    1975-01-01

    A feasibility analysis for the construction and use of a combination airplane-airship named 'Dynairship' is undertaken. Payload capacities, fuel consumption, and the structural design of the craft are discussed and compared to a conventional commercial aircraft (a Boeing 747). Cost estimates of construction and operation of the craft are also discussed. The various uses of the craft are examined (e.g., in police work, materials handling, and ocean surveillance), and aerodynamic configurations and photographs are shown.

  14. Advanced image analysis for the preservation of cultural heritage

    NASA Astrophysics Data System (ADS)

    France, Fenella G.; Christens-Barry, William; Toth, Michael B.; Boydston, Kenneth

    2010-02-01

    The Library of Congress' Preservation Research and Testing Division has established an advanced preservation studies scientific program for research and analysis of the diverse range of cultural heritage objects in its collection. Using this system, the Library is currently developing specialized integrated research methodologies for extending preservation analytical capacities through non-destructive hyperspectral imaging of cultural objects. The research program has revealed key information to support preservation specialists, scholars and other institutions. The approach requires close and ongoing collaboration between a range of scientific and cultural heritage personnel - imaging and preservation scientists, art historians, curators, conservators and technology analysts. A research project on the Pierre L'Enfant Plan of Washington DC (1791) was undertaken to implement and advance the image analysis capabilities of the imaging system. Innovative imaging options and analysis techniques allow greater processing and analysis capacities, establishing the imaging technique as the initial non-invasive analysis and documentation step in all cultural heritage analyses. Mapping spectral responses, organic and inorganic data, topographic and semi-microscopic imaging, and creating full-spectrum images have greatly extended this capacity beyond a simple image-capture technique. Linking hyperspectral data with other non-destructive analyses has further enhanced the research potential of this image analysis technique.

  15. Three modality image registration of brain SPECT/CT and MR images for quantitative analysis of dopamine transporter imaging

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Yuzuho; Takeda, Yuta; Hara, Takeshi; Zhou, Xiangrong; Matsusako, Masaki; Tanaka, Yuki; Hosoya, Kazuhiko; Nihei, Tsutomu; Katafuchi, Tetsuro; Fujita, Hiroshi

    2016-03-01

    Important features in Parkinson's disease (PD) are degeneration and loss of dopamine neurons in the corpus striatum. 123I-FP-CIT can visualize the activity of the dopamine neurons. The activity ratio of background to corpus striatum is used for the diagnosis of PD and dementia with Lewy bodies (DLB). The specific activity can be observed in the corpus striatum on SPECT images, but the location and shape of the corpus striatum are often lost on SPECT images alone because of the low uptake. In contrast, MR images can visualize the location of the corpus striatum. The purpose of this study was to realize a quantitative image analysis for SPECT images by using an image registration technique with brain MR images that can determine the region of the corpus striatum. In this study, an image fusion technique was used to fuse SPECT and MR images via the CT image acquired by the SPECT/CT scanner. Mutual information (MI) between the CT and MR images was used for the registration. Six SPECT/CT and four MR scans of phantom materials were taken with varying orientations. Of the 24 registration combinations, 16 were registered within 1.3 mm. By applying the approach to 32 clinical SPECT/CT and MR cases, all of the cases were registered within 0.86 mm. In conclusion, our registration method has potential for superimposing MR images on SPECT images.
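
    A sketch of the mutual-information similarity that drives such a registration, computed from a joint intensity histogram; synthetic images stand in for the CT and MR volumes. A registration optimizer would maximize this quantity over the transform parameters.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """MI between two images from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

rng = np.random.default_rng(3)
ct = rng.random((128, 128))
mr_aligned = np.sqrt(ct) + 0.05 * rng.random((128, 128))   # intensity-related image
mr_shifted = np.roll(mr_aligned, 8, axis=1)                # misregistered copy
print("MI aligned:   ", round(mutual_information(ct, mr_aligned), 3))
print("MI misaligned:", round(mutual_information(ct, mr_shifted), 3))
```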

  16. Characterization and analysis of infrared images

    NASA Astrophysics Data System (ADS)

    Raglin, Adrienne; Wetmore, Alan; Ligon, David

    2006-05-01

    Stokes images in the long-wave infrared (LWIR) and methods for processing polarimetric data continue to be areas of interest. Stokes images, which are sensitive to geometry and material differences, are acquired by measuring the polarization state of the received electromagnetic radiation. The polarimetric data from Stokes images may provide enhancements to conventional IR imagery data. It is generally agreed that polarimetric images can reveal information about objects or features within a scene that is not available through other imaging techniques. This additional information may motivate different approaches to segmentation, detection, and recognition of objects or features. Previous research using horizontal and vertical polarization data supports the use of this type of data for image processing tasks. In this work we analyze a sample polarimetric image to show both improved segmentation of objects and derivation of their inherent 3-D geometry.

  17. Image segmentation by iterative parallel region growing with application to data compression and image analysis

    NASA Technical Reports Server (NTRS)

    Tilton, James C.

    1988-01-01

    Image segmentation can be a key step in data compression and image analysis. However, the segmentation results produced by most previous approaches to region growing are suspect because they depend on the order in which portions of the image are processed. An iterative parallel segmentation algorithm avoids this problem by performing globally best merges first. Such a segmentation approach, and two implementations of the approach on NASA's Massively Parallel Processor (MPP) are described. Application of the segmentation approach to data compression and image analysis is then described, and results of such application are given for a LANDSAT Thematic Mapper image.
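
    A toy sequential sketch of the 'globally best merges first' idea (the paper's MPP implementation is parallel and image-based): at each step, the adjacent pair of regions with the smallest mean-intensity difference is merged. The 1-D data and stopping rule are illustrative assumptions.

```python
import numpy as np

values = np.array([1.0, 1.1, 5.0, 5.2, 9.0])        # 1-D "image" of initial segments
regions = {i: [i] for i in range(len(values))}      # region id -> member pixels
means = {i: float(values[i]) for i in regions}

while len(regions) > 2:                             # assumed stopping rule
    ids = sorted(regions)                           # 1-D adjacency: consecutive ids
    a, b = min(zip(ids, ids[1:]), key=lambda p: abs(means[p[0]] - means[p[1]]))
    regions[a] += regions.pop(b)                    # globally cheapest merge first
    means[a] = float(values[regions[a]].mean())
    del means[b]

print("final segments:", regions)
```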

  18. Analysis of scanning probe microscope images using wavelets.

    PubMed

    Gackenheimer, C; Cayon, L; Reifenberger, R

    2006-03-01

    The utility of wavelet transforms for analysis of scanning probe images is investigated. Simulated scanning probe images are analyzed using wavelet transforms and compared to a parallel analysis using more conventional Fourier transform techniques. The wavelet method introduced in this paper is particularly useful as an image recognition algorithm to enhance nanoscale objects of a specific scale that may be present in scanning probe images. In its present form, the applied wavelet is optimal for detecting objects with rotational symmetry. The wavelet scheme is applied to the analysis of scanning probe data to better illustrate the advantages that this new analysis tool offers. The wavelet algorithm developed for analysis of scanning probe microscope (SPM) images has been incorporated into the WSxM software which is a versatile freeware SPM analysis package. PMID:16439061

  19. Covariance Based Pre-Filters and Screening Criteria for Conjunction Analysis

    NASA Astrophysics Data System (ADS)

    George, E.; Chan, K.

    2012-09-01

    Several relationships are developed relating object size, initial covariance and range at closest approach to probability of collision. These relationships address the following questions:
    - Given the objects' initial covariance and combined hard body size, what is the maximum possible value of the probability of collision (Pc)?
    - Given the objects' initial covariance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit?
    - Given the objects' initial covariance and the combined hard body radius, what is the minimum miss distance for which the probability of collision does not exceed the tolerance limit?
    - Given the objects' initial covariance and the miss distance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit?
    The first relationship above allows the elimination of object pairs from conjunction analysis (CA) on the basis of the initial covariance and hard-body sizes of the objects. The application of this pre-filter to present-day catalogs with estimated covariance results in the elimination of approximately 35% of object pairs as unable to ever conjunct with a probability of collision exceeding 1×10⁻⁶. Because Pc is directly proportional to object size and inversely proportional to covariance size, this pre-filter will have a significantly larger impact on future catalogs, which are expected to contain a much larger fraction of small debris tracked only by a limited subset of available sensors. This relationship also provides a mathematically rigorous basis for eliminating objects from analysis entirely based on element set age or quality - a practice commonly done by rough rules of thumb today. Further, these relations can be used to determine the required geometric screening radius for all objects. This analysis reveals the screening volumes for small objects are much larger than needed, while the screening volumes for
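
    A minimal numerical sketch of the quantity behind all four questions: Pc as the integral of a 2-D Gaussian relative-position density over the combined hard-body circle in the encounter plane. The diagonal covariance and the metre-scale numbers are illustrative assumptions, not the paper's closed-form relationships.

```python
import numpy as np

def collision_probability(miss, sigma_x, sigma_y, radius, n=801):
    """Pc for a miss distance along x, diagonal covariance, hard-body radius."""
    x = np.linspace(-radius, radius, n)
    X, Y = np.meshgrid(x, x)
    inside = X ** 2 + Y ** 2 <= radius ** 2
    pdf = (np.exp(-0.5 * (((X - miss) / sigma_x) ** 2 + (Y / sigma_y) ** 2))
           / (2 * np.pi * sigma_x * sigma_y))
    dA = (x[1] - x[0]) ** 2
    return float((pdf * inside).sum() * dA)

# Maximum Pc over miss distance, for a fixed covariance and hard-body size
sig, r = 200.0, 10.0                       # metres (assumed)
misses = np.linspace(0.0, 1000.0, 51)
pcs = [collision_probability(m, sig, sig, r) for m in misses]
print(f"max Pc = {max(pcs):.2e} at miss = {misses[int(np.argmax(pcs))]:.0f} m")
```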

  20. Image analysis for dental bone quality assessment using CBCT imaging

    NASA Astrophysics Data System (ADS)

    Suprijanto; Epsilawati, L.; Hajarini, M. S.; Juliastuti, E.; Susanti, H.

    2016-03-01

    Cone beam computerized tomography (CBCT) is one of the X-ray imaging modalities applied in dentistry. This modality can visualize the oral region in 3D at high resolution. CBCT jaw images carry information for the assessment of bone quality that is often used for pre-operative implant planning. We propose a comparison method based on normalized histograms (NH) of the region of the inter-dental septum and premolar teeth. Furthermore, the NH characteristics of normal and abnormal bone conditions are compared and analyzed. Four test parameters are proposed, i.e. the difference between teeth and bone average intensity (s), the ratio between bone and teeth average intensity (n) of the NH, the difference between teeth and bone peak value (Δp) of the NH, and the ratio between teeth and bone NH range (r). The results showed that n, s, and Δp have potential as classification parameters for dental calcium density.
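
    A sketch of how the four test parameters might be computed from normalized histograms of two regions of interest. The synthetic intensity samples, bin settings, thresholds, and the reading of Δp as a peak-position difference are assumptions; the paper's exact ROI definitions are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)
teeth = rng.normal(180, 12, size=2000).clip(0, 255)   # synthetic teeth ROI samples
bone = rng.normal(120, 20, size=2000).clip(0, 255)    # synthetic bone ROI samples

def normalized_hist(x, bins=64):
    h, edges = np.histogram(x, bins=bins, range=(0, 255))
    return h / h.max(), 0.5 * (edges[:-1] + edges[1:])

ht, centers = normalized_hist(teeth)
hb, _ = normalized_hist(bone)

s = teeth.mean() - bone.mean()                        # average-intensity difference
nr = bone.mean() / teeth.mean()                       # average-intensity ratio
dp = centers[np.argmax(ht)] - centers[np.argmax(hb)]  # peak separation (one reading of Δp)
r = np.ptp(centers[ht > 0.1]) / np.ptp(centers[hb > 0.1])   # NH range ratio (0.1 assumed)
print(f"s={s:.1f}  n={nr:.2f}  delta_p={dp:.1f}  r={r:.2f}")
```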

  1. Analysis of Anechoic Chamber Testing of the Hurricane Imaging Radiometer

    NASA Technical Reports Server (NTRS)

    Fenigstein, David; Ruf, Chris; James, Mark; Simmons, David; Miller, Timothy; Buckley, Courtney

    2010-01-01

    The Hurricane Imaging Radiometer System (HIRAD) is a new airborne passive microwave remote sensor developed to observe hurricanes. HIRAD incorporates synthetic thinned array radiometry technology, which uses Fourier synthesis to reconstruct images from an array of correlated antenna elements. The HIRAD system response to a point emitter has been measured in an anechoic chamber. With these data, a Fourier inversion image reconstruction algorithm has been developed. A performance analysis of the apparatus is presented, along with an overview of the image reconstruction algorithm.

  2. Image and Data-analysis Tools For Paleoclimatic Reconstructions

    NASA Astrophysics Data System (ADS)

    Pozzi, M.

    A directory of tools and computing resources chosen to address the problems of paleoclimatic reconstruction is proposed here. The following points are discussed in particular: 1) Numerical analysis of paleo-data (fossil abundances, species analyses, isotopic signals, chemical-physical parameters, biological data): a) statistical analyses (univariate, diversity, rarefaction, correlation, ANOVA, F and T tests, Chi^2); b) multidimensional analyses (principal components, correspondence, cluster analysis, seriation, discriminant, autocorrelation, spectral analysis); c) neural analyses (backpropagation net, Kohonen feature map, Hopfield net, genetic algorithms). 2) Graphical analysis (visualization tools) of paleo-data (quantitative and qualitative fossil abundances, species analyses, isotopic signals, chemical-physical parameters): a) 2-D data analyses (graph, histogram, ternary, survivorship); b) 3-D data analyses (direct volume rendering, isosurfaces, segmentation, surface reconstruction, surface simplification, generation of tetrahedral grids). 3) Quantitative and qualitative digital image analysis (macro- and microfossil image analysis, Scanning Electron Microscope and Optical Polarized Microscope image capture and analysis, morphometric data analysis, 3-D reconstructions): a) 2D image analysis (correction of image defects, enhancement of image detail, converting texture and directionality to grey-scale or colour differences, visual enhancement using pseudo-colour, pseudo-3D, thresholding of image features, binary image processing, measurements, stereological measurements, measuring features on a white background); b) 3D image analysis (basic stereological procedures; two-dimensional structures: area fraction from the point count, volume fraction from the point count; three-dimensional structures: surface area and the line intercept count; three-dimensional microstructures: line length and the

  3. Image analysis of neuropsychological test responses

    NASA Astrophysics Data System (ADS)

    Smith, Stephen L.; Hiller, Darren L.

    1996-04-01

    This paper reports recent advances in the development of an automated approach to neuropsychological testing. High performance image analysis algorithms have been developed as part of a convenient and non-invasive computer-based system to provide an objective assessment of patient responses to figure-copying tests. Tests of this type are important in determining the neurological function of patients following stroke through evaluation of their visuo-spatial performance. Many conventional neuropsychological tests suffer from the serious drawback that subjective judgement on the part of the tester is required in the measurement of the patient's response which leads to a qualitative neuropsychological assessment that can be both inconsistent and inaccurate. Results for this automated approach are presented for three clinical populations: patients suffering right hemisphere stroke are compared with adults with no known neurological disorder and a population comprising normal school children of 11 years is presented to demonstrate the sensitivity of the technique. As well as providing a more reliable and consistent diagnosis this technique is sufficiently sensitive to monitor a patient's progress over a period of time and will provide the neuropsychologist with a practical means of evaluating the effectiveness of therapy or medication administered as part of a rehabilitation program.

  4. Membrane-targeting peptides for nanoparticle-facilitated cellular imaging and analysis

    NASA Astrophysics Data System (ADS)

    Breger, Joyce; Delehanty, James B.; Boeneman Gemmill, Kelly; Field, Lauren D.; Blanco-Canosa, Juan B.; Dawson, Philip E.; Huston, Alan L.; Medintz, Igor L.

    2015-03-01

    The controlled delivery of nanomaterials to the plasma membrane is critical for the development of nanoscale probes that can eventually enable cellular imaging and analysis of membrane processes. Chief among the requisite criteria are delivery/targeting modalities that result in the long-term residence (e.g., days) of the nanoparticles on the plasma membrane while simultaneously not interfering with regular cellular physiology and homeostasis. Our laboratory has developed a suite of peptidyl motifs that target semiconductor nanocrystals (quantum dots, QDs) to the plasma membrane, where they remain resident for up to three days. Notably, only a small percentage of the QDs are endocytosed over this time course, and cellular viability is maintained. This talk will highlight the utility of these peptide-QD constructs for cellular imaging and analysis.

  5. EVALUATION OF COLOR ALTERATION ON FABRICS BY IMAGE ANALYSIS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Evaluation of color changes is usually done manually and is often inconsistent. Image analysis provides a method in which to evaluate color-related testing that is not only simple, but also consistent. Image analysis can also be used to measure areas that were considered too large for the colorimet...

  6. Demonstration of a modelling-based multi-criteria decision analysis procedure for prioritisation of occupational risks from manufactured nanomaterials.

    PubMed

    Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio

    2016-11-01

    Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are not yet readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. In order to study how variations in the input data affect our results, we performed a probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables. PMID:26853193
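
    A crude sketch of the Monte Carlo sensitivity idea: sample uncertain exposure and hazard inputs, propagate them to a margin of exposure (MoE), and inspect the spread. All distributions and values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 10_000
exposure = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)   # mg/m3 (assumed)
noael = rng.normal(loc=2.0, scale=0.3, size=n).clip(min=0.1)     # mg/m3 (assumed)
moe = noael / exposure                                           # margin of exposure

print("median MoE:", round(float(np.median(moe)), 1))
print("5th-95th percentile:", np.percentile(moe, [5, 95]).round(1))
print("fraction with MoE < 100:", float((moe < 100).mean()))
```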

  7. Slide Set: Reproducible image analysis and batch processing with ImageJ.

    PubMed

    Nanes, Benjamin A

    2015-11-01

    Most imaging studies in the biological sciences rely on analyses that are relatively simple. However, manual repetition of analysis tasks across multiple regions in many images can complicate even the simplest analysis, making record keeping difficult, increasing the potential for error, and limiting reproducibility. While fully automated solutions are necessary for very large data sets, they are sometimes impractical for the small- and medium-sized data sets common in biology. Here we present the Slide Set plugin for ImageJ, which provides a framework for reproducible image analysis and batch processing. Slide Set organizes data into tables, associating image files with regions of interest and other relevant information. Analysis commands are automatically repeated over each image in the data set, and multiple commands can be chained together for more complex analysis tasks. All analysis parameters are saved, ensuring transparency and reproducibility. Slide Set includes a variety of built-in analysis commands and can be easily extended to automate other ImageJ plugins, reducing the manual repetition of image analysis without the set-up effort or programming expertise required for a fully automated solution. PMID:26554504

  8. Slide Set: reproducible image analysis and batch processing with ImageJ

    PubMed Central

    Nanes, Benjamin A.

    2015-01-01

    Most imaging studies in the biological sciences rely on analyses that are relatively simple. However, manual repetition of analysis tasks across multiple regions in many images can complicate even the simplest analysis, making record keeping difficult, increasing the potential for error, and limiting reproducibility. While fully automated solutions are necessary for very large data sets, they are sometimes impractical for the small- and medium-sized data sets that are common in biology. This paper introduces Slide Set, a framework for reproducible image analysis and batch processing with ImageJ. Slide Set organizes data into tables, associating image files with regions of interest and other relevant information. Analysis commands are automatically repeated over each image in the data set, and multiple commands can be chained together for more complex analysis tasks. All analysis parameters are saved, ensuring transparency and reproducibility. Slide Set includes a variety of built-in analysis commands and can be easily extended to automate other ImageJ plugins, reducing the manual repetition of image analysis without the set-up effort or programming expertise required for a fully automated solution. PMID:26554504

  9. A revised analysis of Lawson criteria and its implications for ICF

    SciTech Connect

    Panarella, E.

    1995-12-31

    Recently, a re-examination of the breakeven conditions for D-T plasmas has been presented. The results show that breakeven might not follow the Lawson nτ rule, and in particular the plasma containment time seems to have lost the importance that it previously had. Moreover, a minimum particle density of the order of ~10¹⁵ cm⁻³ has been found to be required for breakeven, which indicates that the inertial confinement fusion effort is in the right position to reach the fusion goal. In light of these results, a reassessment of Lawson's analysis has been undertaken. Lawson considered the case of a pulsed system that followed this idealized cycle: the gas is heated instantaneously to a temperature T, which is maintained for a time t, after which the gas is allowed to cool. Conduction loss is neglected entirely, and the energy used to heat the gas and supply the radiation loss is regained as useful heat. In order to illustrate how the analysis by Lawson can be improved, the cycle to which the gas is subjected should be divided into three phases: 1st phase: rapid heating of the gas for a time t₁ to bring it from the original ambient temperature to the fusion temperature T; 2nd phase: continuous injection of energy into the plasma for a time t₂ to maintain the temperature T; 3rd phase: no further injection of energy, and cooling of the gas to the ambient temperature in a time t₃.

  10. GIS-Based Multi-Criteria Analysis for Arabica Coffee Expansion in Rwanda

    PubMed Central

    Nzeyimana, Innocent; Hartemink, Alfred E.; Geissen, Violette

    2014-01-01

    The Government of Rwanda is implementing policies to increase the area of Arabica coffee production. Information on the suitable areas for sustainably growing Arabica coffee is still scarce. This study aimed to analyze suitable areas for Arabica coffee production. We analyzed the spatial distribution of actual and potential production zones for Arabica coffee, their productivity levels and predicted potential yields. We used a geographic information system (GIS) for a weighted overlay analysis to assess the major production zones of Arabica coffee and their qualitative productivity indices. Actual coffee yields were measured in the field and were used to assess potential productivity zones and yields using ordinary kriging with ArcGIS software. The production of coffee covers about 32 000 ha, or 2.3% of all cultivated land in the country. The major zones of production are the Kivu Lake Borders, Central Plateau, Eastern Plateau, and Mayaga agro-ecological zones, where coffee is mainly cultivated on moderate slopes. In the highlands, coffee is grown on steep slopes that can exceed 55%. About 21% of the country has a moderate yield potential, ranging between 1.0 and 1.6 t coffee ha−1, and 70% has a low yield potential (<1.0 t coffee ha−1). Only 9% of the country has a high yield potential of 1.6–2.4 t coffee ha−1. Those areas are found near Lake Kivu where the dominant soil Orders are Inceptisols and Ultisols. Moderate yield potential is found in the Birunga (volcano), Congo-Nile watershed Divide, Impala and Imbo zones. Low-yield regions (<1 t ha−1) occur in the eastern semi-dry lowlands, Central Plateau, Eastern Plateau, Buberuka Highlands, and Mayaga zones. The weighted overlay analysis and ordinary kriging indicated a large spatial variability of potential productivity indices. Increasing the area and productivity of coffee in Rwanda thus has considerable potential. PMID:25299459
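
    A minimal sketch of a GIS-style weighted overlay: criterion rasters rescaled to 0-1 suitability are combined with weights into a suitability index and classified. The rasters, weights and class breaks below are synthetic, not the Rwanda data.

```python
import numpy as np

rng = np.random.default_rng(2)
shape = (100, 100)
slope = rng.random(shape)          # criterion rasters, already rescaled to 0-1
rainfall = rng.random(shape)
soil = rng.random(shape)
altitude = rng.random(shape)

weights = {"slope": 0.3, "rainfall": 0.3, "soil": 0.25, "altitude": 0.15}
suitability = (weights["slope"] * slope + weights["rainfall"] * rainfall +
               weights["soil"] * soil + weights["altitude"] * altitude)

# Classify into low / moderate / high / very high (breaks are assumptions)
classes = np.digitize(suitability, [0.4, 0.55, 0.7])
print("share per class:", np.bincount(classes.ravel(), minlength=4) / classes.size)
```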

  11. GIS-based multi-criteria analysis for Arabica coffee expansion in Rwanda.

    PubMed

    Nzeyimana, Innocent; Hartemink, Alfred E; Geissen, Violette

    2014-01-01

    The Government of Rwanda is implementing policies to increase the area of Arabica coffee production. Information on the suitable areas for sustainably growing Arabica coffee is still scarce. This study aimed to analyze suitable areas for Arabica coffee production. We analyzed the spatial distribution of actual and potential production zones for Arabica coffee, their productivity levels and predicted potential yields. We used a geographic information system (GIS) for a weighted overlay analysis to assess the major production zones of Arabica coffee and their qualitative productivity indices. Actual coffee yields were measured in the field and were used to assess potential productivity zones and yields using ordinary kriging with ArcGIS software. The production of coffee covers about 32 000 ha, or 2.3% of all cultivated land in the country. The major zones of production are the Kivu Lake Borders, Central Plateau, Eastern Plateau, and Mayaga agro-ecological zones, where coffee is mainly cultivated on moderate slopes. In the highlands, coffee is grown on steep slopes that can exceed 55%. About 21% of the country has a moderate yield potential, ranging between 1.0 and 1.6 t coffee ha-1, and 70% has a low yield potential (<1.0 t coffee ha-1). Only 9% of the country has a high yield potential of 1.6-2.4 t coffee ha-1. Those areas are found near Lake Kivu where the dominant soil Orders are Inceptisols and Ultisols. Moderate yield potential is found in the Birunga (volcano), Congo-Nile watershed Divide, Impala and Imbo zones. Low-yield regions (<1 t ha-1) occur in the eastern semi-dry lowlands, Central Plateau, Eastern Plateau, Buberuka Highlands, and Mayaga zones. The weighted overlay analysis and ordinary kriging indicated a large spatial variability of potential productivity indices. Increasing the area and productivity of coffee in Rwanda thus has considerable potential. PMID:25299459

  12. Lymphovascular and perineural invasion as selection criteria for adjuvant therapy in intrahepatic cholangiocarcinoma: a multi-institution analysis

    PubMed Central

    Fisher, Sarah B; Patel, Sameer H; Kooby, David A; Weber, Sharon; Bloomston, Mark; Cho, Clifford; Hatzaras, Ioannis; Schmidt, Carl; Winslow, Emily; Staley III, Charles A; Maithel, Shishir K

    2012-01-01

    Objectives Criteria for the selection of patients for adjuvant chemotherapy in intrahepatic cholangiocarcinoma (IHCC) are lacking. Some authors advocate treating patients with lymph node (LN) involvement; however, nodal assessment is often inadequate or not performed. This study aimed to identify surrogate criteria based on characteristics of the primary tumour. Methods A total of 58 patients who underwent resection for IHCC between January 2000 and January 2010 at any of three institutions were identified. Primary outcome was overall survival (OS). Results Median OS was 23.0 months. Median tumour size was 6.5 cm and the median number of lesions was one. Overall, 16% of patients had positive margins, 38% had perineural invasion (PNI), 40% had lymphovascular invasion (LVI) and 22% had LN involvement. A median of two LNs were removed and a median of zero were positive. Lymph nodes were not sampled in 34% of patients. Lymphovascular and perineural invasion were associated with reduced OS [9.6 months vs. 32.7 months (P= 0.020) and 10.7 months vs. 32.7 months (P= 0.008), respectively]. Lymph node involvement indicated a trend towards reduced OS (10.7 months vs. 30.0 months; P= 0.063). The presence of either LVI or PNI in node-negative patients was associated with a reduction in OS similar to that in node-positive patients (12.1 months vs. 10.7 months; P= 0.541). After accounting for adverse tumour factors, only LVI and PNI remained associated with decreased OS on multivariate analysis (hazard ratio 4.07, 95% confidence interval 1.60–10.40; P= 0.003). Conclusions Lymphovascular and perineural invasion are separately associated with a reduction in OS similar to that in patients with LN-positive disease. As nodal dissection is often not performed and the number of nodes retrieved is frequently inadequate, these tumour-specific factors should be considered as criteria for selection for adjuvant chemotherapy. PMID:22762399

  13. A linear mixture analysis-based compression for hyperspectral image analysis

    SciTech Connect

    C. I. Chang; I. W. Ginsberg

    2000-06-30

    In this paper, the authors present a fully constrained least squares linear spectral mixture analysis-based compression technique for hyperspectral image analysis, particularly target detection and classification. Unlike most compression techniques that deal directly with image gray levels, the proposed compression approach generates the abundance fractional images of potential targets present in an image scene and then encodes these fractional images so as to achieve data compression. Since the vital information used for image analysis is generally preserved and retained in the abundance fractional images, the loss of information may have very little impact on image analysis. On some occasions, it even improves analysis performance. Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data experiments demonstrate that it can effectively detect and classify targets while achieving very high compression ratios.
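
    The fully constrained least squares unmixing underlying this scheme can be sketched with a standard trick: non-negativity is enforced exactly by non-negative least squares, and the sum-to-one constraint approximately by appending a heavily weighted row of ones. A minimal sketch under those assumptions, with toy endmember and pixel data; the paper's exact algorithm may differ.

        import numpy as np
        from scipy.optimize import nnls

        def fcls_abundances(E, pixel, delta=1e3):
            """Abundances for one pixel under non-negativity (exact, via nnls)
            and sum-to-one (approximate, via a heavily weighted row of ones).
            E is the (bands x endmembers) endmember matrix."""
            bands, p = E.shape
            E_aug = np.vstack([E, delta * np.ones((1, p))])
            y_aug = np.append(pixel, delta)
            a, _ = nnls(E_aug, y_aug)
            return a

        # Toy example: a 5-band pixel mixed from 3 endmembers.
        rng = np.random.default_rng(0)
        E = rng.random((5, 3))
        pixel = E @ np.array([0.6, 0.3, 0.1])
        print(fcls_abundances(E, pixel))   # close to [0.6, 0.3, 0.1]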

  14. Multi-criteria decision analysis of concentrated solar power with thermal energy storage and dry cooling.

    PubMed

    Klein, Sharon J W

    2013-12-17

    Decisions about energy backup and cooling options for parabolic trough (PT) concentrated solar power have technical, economic, and environmental implications. Although PT development has increased rapidly in recent years, energy policies do not address backup or cooling option requirements, and very few studies directly compare the diverse implications of these options. This is the first study to compare the annual capacity factor, levelized cost of energy (LCOE), water consumption, land use, and life cycle greenhouse gas (GHG) emissions of PT with different backup options (minimal backup (MB), thermal energy storage (TES), and fossil fuel backup (FF)) and different cooling options (wet (WC) and dry (DC)). Multi-criteria decision analysis was used with five preference scenarios to identify the highest-scoring energy backup-cooling combination for each preference scenario. MB-WC had the highest score in the Economic and Climate Change-Economy scenarios, while FF-DC and FF-WC had the highest scores in the Equal and Availability scenarios, respectively. TES-DC had the highest score for the Environmental scenario. DC was ranked 1-3 in all preference scenarios. Direct comparisons between GHG emissions and LCOE and between GHG emissions and land use suggest a preference for TES if backup is required for PT plants to compete with baseload generators. PMID:24245524
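
    In its simplest weighted-sum form, the scenario-based scoring used in such a multi-criteria decision analysis reduces to a matrix product. The sketch below is illustrative only: the performance scores and weight vectors are invented, not the study's data.

        import numpy as np

        # Rows: backup-cooling combinations; columns: criteria normalized to
        # [0, 1], higher = better. All values are invented for illustration.
        options = ["MB-WC", "MB-DC", "TES-WC", "TES-DC", "FF-WC", "FF-DC"]
        scores = np.array([
            [0.3, 0.9, 0.4, 0.8, 0.9],   # capacity factor, LCOE, water, land, GHG
            [0.3, 0.8, 0.9, 0.8, 0.9],
            [0.8, 0.5, 0.3, 0.4, 0.7],
            [0.8, 0.4, 0.9, 0.4, 0.7],
            [0.9, 0.6, 0.2, 0.6, 0.2],
            [0.9, 0.5, 0.8, 0.6, 0.2],
        ])

        # Each preference scenario is a weight vector summing to 1.
        scenarios = {
            "Equal": np.full(5, 0.2),
            "Economic": np.array([0.2, 0.6, 0.05, 0.05, 0.1]),
            "Environmental": np.array([0.05, 0.05, 0.3, 0.3, 0.3]),
        }

        for name, w in scenarios.items():
            print(name, "->", options[int(np.argmax(scores @ w))])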

  15. Flux Analysis of the Trypanosoma brucei Glycolysis Based on a Multiobjective-Criteria Bioinformatic Approach

    PubMed Central

    Ghozlane, Amine; Bringaud, Frédéric; Soueidan, Hayssam; Dutour, Isabelle; Jourdan, Fabien; Thébault, Patricia

    2012-01-01

    Trypanosoma brucei is a protozoan parasite of major interest in the search for new drug-target genes. This parasite alternates its life cycle between the mammalian host (bloodstream form) and the insect vector (procyclic form), two stages with divergent glucose metabolism that are amenable to in vitro culture. While the metabolic network of the bloodstream forms has been well characterized, the flux distribution between the different branches of the glucose metabolic network in the procyclic form has not been addressed so far. We present a computational analysis (called Metaboflux) that exploits the metabolic topology of the procyclic form and allows the incorporation of multipurpose experimental data to increase the biological relevance of the model. The alternatives resulting from the structural complexity of networks are formulated as an optimization problem solved by a metaheuristic, where experimental data are modeled in a multiobjective function. Our results show that the current metabolic model is in agreement with experimental data and confirms the observed high metabolic flexibility of glucose metabolism. In addition, Metaboflux offers a rational explanation for the high flexibility in the ratio between final products from glucose metabolism, that is, flux redistribution through the malic enzyme steps. PMID:23097667

  16. Analysis of airborne MAIS imaging spectrometric data for mineral exploration

    SciTech Connect

    Wang Jinnian; Zheng Lanfen; Tong Qingxi

    1996-11-01

    High spectral resolution imaging spectrometric systems have made quantitative analysis and mapping of surface composition possible. The key issue is the quantitative approach for analyzing surface parameters from imaging spectrometer data. This paper describes the methods and the stages of quantitative analysis: (1) extracting surface reflectance from the imaging spectrometer image; laboratory and in-flight field measurements are conducted for calibration of imaging spectrometer data, and atmospheric correction has also been used to obtain ground reflectance via the empirical line method and radiative transfer modeling. (2) Determining the quantitative relationship between absorption band parameters from the imaging spectrometer data and the chemical composition of minerals. (3) Spectral comparison between the spectra of a spectral library and the spectra derived from the imagery. Wavelet analysis-based spectrum-matching techniques for quantitative analysis of imaging spectrometer data have been developed. Airborne MAIS imaging spectrometer data were used for analysis, and the results have been applied to mineral and petroleum exploration in the Tarim Basin area, China. 8 refs., 8 figs.
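
    Stage (1), the empirical line method, fits a per-band linear relation between sensor values and the field-measured reflectance of calibration targets. A minimal sketch; the function and array names are hypothetical.

        import numpy as np

        def empirical_line(cube, target_dns, target_refl):
            """Fit a per-band linear (gain, offset) relation from calibration
            targets and apply it to the whole cube.
            cube: (rows, cols, bands) sensor values;
            target_dns / target_refl: (n_targets, bands) sensor values and
            field-measured reflectances over the same ground targets."""
            out = np.empty(cube.shape, dtype=float)
            for b in range(cube.shape[2]):
                gain, offset = np.polyfit(target_dns[:, b], target_refl[:, b], 1)
                out[:, :, b] = gain * cube[:, :, b] + offset
            return out

        # Toy check: two targets per band define an exact linear relation.
        cube = np.random.default_rng(0).random((10, 10, 4)) * 1000
        dns = np.array([[100.0] * 4, [900.0] * 4])
        refl = np.array([[0.05] * 4, [0.60] * 4])
        refl_cube = empirical_line(cube, dns, refl)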

  17. Low-cost image analysis system

    SciTech Connect

    Lassahn, G.D.

    1995-01-01

    The author has developed an Automatic Target Recognition system based on parallel processing using transputers. This approach gives a powerful, fast image processing system at relatively low cost. The system scans multi-sensor (e.g., several infrared bands) image data to find any identifiable target, such as a physical object or a type of vegetation.

  18. Analysis of Images from Experiments Investigating Fragmentation of Materials

    SciTech Connect

    Kamath, C; Hurricane, O

    2007-09-10

    Image processing techniques have been used extensively to identify objects of interest in image data and extract representative characteristics for these objects. However, this can be a challenge due to the presence of noise in the images and the variation across images in a dataset. When the number of images to be analyzed is large, the algorithms used must also be relatively insensitive to the choice of parameters and lend themselves to partial or full automation. This not only avoids manual analysis which can be time consuming and error-prone, but also makes the analysis reproducible, thus enabling comparisons between images which have been processed in an identical manner. In this paper, we describe our approach to extracting features for objects of interest in experimental images. Focusing on the specific problem of fragmentation of materials, we show how we can extract statistics for the fragments and the gaps between them.
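
    A minimal sketch of the kind of fragment statistics described, using thresholding and connected-component labeling; the threshold choice and toy image are illustrative, not the paper's algorithm.

        import numpy as np
        from scipy import ndimage

        def fragment_stats(image, threshold):
            """Threshold the image, label connected fragments, and return
            per-fragment areas, centroids, and the gap (background) fraction."""
            mask = image > threshold
            labels, n = ndimage.label(mask)
            idx = np.arange(1, n + 1)
            areas = ndimage.sum(mask, labels, index=idx)
            centroids = ndimage.center_of_mass(mask, labels, index=idx)
            return areas, centroids, 1.0 - mask.mean()

        rng = np.random.default_rng(1)
        img = ndimage.gaussian_filter(rng.random((128, 128)), sigma=3)
        areas, cents, gap = fragment_stats(img, np.percentile(img, 75))
        print(len(areas), "fragments; gap fraction", round(gap, 2))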

  19. Multimodal digital color imaging system for facial skin lesion analysis

    NASA Astrophysics Data System (ADS)

    Bae, Youngwoo; Lee, Youn-Heum; Jung, Byungjo

    2008-02-01

    In dermatology, various digital imaging modalities have been used as important tools to quantitatively evaluate the treatment effect on skin lesions. Cross-polarization color imaging has been used to evaluate skin chromophore (melanin and hemoglobin) information and parallel-polarization imaging to evaluate skin texture information. In addition, UV-A induced fluorescence imaging has been widely used to evaluate various skin conditions such as sebum, keratosis, sun damage, and vitiligo. In order to maximize the evaluation efficacy for various skin lesions, it is necessary to integrate these imaging modalities into a single imaging system. In this study, we propose a multimodal digital color imaging system that provides four different digital color images: a standard color image, parallel- and cross-polarization color images, and a UV-A induced fluorescent color image. Herein, we describe the imaging system and present examples of image analysis. By analyzing the color information and morphological features of facial skin lesions, we are able to evaluate various skin lesions comparably and simultaneously. In conclusion, the multimodal color imaging system can be utilized as an important assistive tool in dermatology.

  20. Whole-breast irradiation: a subgroup analysis of criteria to stratify for prone position treatment

    SciTech Connect

    Ramella, Sara; Trodella, Lucio; Ippolito, Edy; Fiore, Michele; Cellini, Francesco; Stimato, Gerardina; Gaudino, Diego; Greco, Carlo; Ramponi, Sara; Cammilluzzi, Eugenio; Cesarini, Claudio; Piermattei, Angelo; Cesario, Alfredo; D'Angelillo, Rolando Maria

    2012-07-01

    To select among breast cancer patients, according to breast volume size, those who may benefit from 3D conformal radiotherapy after conservative surgery applied with the prone-position technique. Thirty-eight patients with early-stage breast cancer were grouped according to the target volume (TV) measured in the supine position: small (≤400 mL), medium (400-700 mL), and large (≥700 mL). An ad-hoc designed and built device was used for prone set-up to displace the contralateral breast away from the tangential field borders. All patients underwent treatment planning computed tomography in both the supine and prone positions. Dosimetric data to explore dose distribution and the volume of normal tissue irradiated were calculated for each patient in both positions. The homogeneity index, hot spot areas, the maximum dose, and the lung constraints were significantly reduced in the prone position (p < 0.05). The maximum heart distance and the V5Gy did not vary consistently between the 2 positions (p = 0.06 and p = 0.7, respectively). The number of necessary monitor units was significantly higher in the supine position (312 vs. 232, p < 0.0001). The subgroup analysis pointed out the advantage in lung sparing in all TV groups (small, medium, and large) for all the evaluated dosimetric constraints (central lung distance, maximum lung distance, and V5Gy; p < 0.0001). In the small TV group, a dose reduction in nontarget areas of 22% in the prone position was detected (p = 0.056); in the medium and large TV groups, the difference was about -10% (p = NS). The decrease in hot spot areas in nontarget tissues was 73%, 47%, and 80% for small, medium, and large TVs in the prone position, respectively. Although prone breast radiotherapy is normally proposed in patients with breasts of large dimensions, this study gives evidence of a dosimetric benefit in all patient subgroups irrespective of breast volume size.

  1. Analysis of deaths in patients awaiting heart transplantation: impact on patient selection criteria.

    PubMed Central

    Haywood, G. A.; Rickenbacher, P. R.; Trindade, P. T.; Gullestad, L.; Jiang, J. P.; Schroeder, J. S.; Vagelos, R.; Oyer, P.; Fowler, M. B.

    1996-01-01

    OBJECTIVE: To analyse the clinical characteristics of patients who died on the Stanford heart transplant waiting list and to develop a method for risk stratifying status 2 patients (outpatients). METHODS: Data were reviewed from all patients over 18 years, excluding retransplants, who were accepted for heart transplantation over an eight year period from 1986 to 1994. RESULTS: 548 patients were accepted for heart transplantation; 53 died on the waiting list, and 52 survived on the waiting list for over one year. On multivariate analysis only peak oxygen consumption (peak VO2: 11.7 (SD 2.7) v 15.1 (5.2) ml/kg/min, P = 0.02) and cardiac output (3.97 (1.03) v 4.79 (1.06) litres/min, P = 0.04) were found to be independent prognostic risk factors. Peak VO2 and cardiac index (CI) were then analysed in the last 141 consecutive patients accepted for cardiac transplantation. All deaths and 88% of the deteriorations to status 1 on the waiting list occurred in patients with either a CI < 2.0 or a VO2 < 12. In those with a CI < 2.0 and a VO2 < 12, 38% died or deteriorated to status 1 in the first year on the waiting list. Patients with CI > or = 2.0 and a VO2 > or = 12 all survived throughout follow up. Using a Cox's proportional hazards model with CI and peak VO2 as covariates, tables were constructed predicting the chance of surviving for (a) 60 days and (b) 1 year on the waiting list. CONCLUSIONS: These data provide a basis for risk stratification of status 2 patients on the heart transplant waiting list. PMID:8665337
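
    The survival look-up tables described can be reproduced in outline with a Cox proportional hazards fit. The sketch below uses the lifelines library (an assumption; the study does not name its software) and entirely invented waiting-list data.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Invented waiting-list data: days on list, death indicator, and the
        # two covariates the study retained (cardiac index, peak VO2).
        df = pd.DataFrame({
            "days_on_list": [45, 200, 365, 30, 150, 400, 90, 365, 365, 60],
            "died": [1, 0, 0, 1, 1, 0, 0, 0, 1, 1],
            "cardiac_index": [1.6, 2.4, 2.2, 1.5, 1.9, 2.6, 1.8, 2.1, 2.0, 1.7],
            "peak_vo2": [10, 16, 14, 9, 11, 18, 12, 15, 13, 10],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="days_on_list", event_col="died")

        # Predicted survival for a high-risk and a low-risk profile at the
        # study's two horizons (60 days and 1 year on the waiting list).
        profiles = pd.DataFrame({"cardiac_index": [1.8, 2.4], "peak_vo2": [10, 16]})
        print(cph.predict_survival_function(profiles, times=[60, 365]))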

  2. PIZZARO: Forensic analysis and restoration of image and video data.

    PubMed

    Kamenicky, Jan; Bartos, Michal; Flusser, Jan; Mahdian, Babak; Kotera, Jan; Novozamsky, Adam; Saic, Stanislav; Sroubek, Filip; Sorel, Michal; Zita, Ales; Zitova, Barbara; Sima, Zdenek; Svarc, Petr; Horinek, Jan

    2016-07-01

    This paper introduces a set of methods for image and video forensic analysis. They were designed to help assess image and video credibility and origin and to restore and increase image quality by diminishing unwanted blur, noise, and other possible artifacts. The motivation came from the best practices used in criminal investigations utilizing images and/or videos. The determination of the image source, the verification of the image content, and image restoration were identified as the most important issues whose automation can facilitate criminalists' work. Novel theoretical results complemented with existing approaches (LCD re-capture detection and denoising) were implemented in the PIZZARO software tool, which consists of the image processing functionality as well as reporting and archiving functions to ensure the repeatability of image analysis procedures and thus fulfills the formal aspects of image/video analysis work. A comparison of the newly proposed methods with state-of-the-art approaches is shown. Real use cases are presented, which illustrate the functionality of the developed methods and demonstrate their applicability in different situations. The use cases as well as the method design were developed in close cooperation between scientists from the Institute of Criminalistics, the National Drug Headquarters of the Criminal Police and Investigation Service of the Police of the Czech Republic, and image processing experts from the Czech Academy of Sciences. PMID:27182830

  3. Dehazing method through polarimetric imaging and multi-scale analysis

    NASA Astrophysics Data System (ADS)

    Cao, Lei; Shao, Xiaopeng; Liu, Fei; Wang, Lin

    2015-05-01

    An approach for haze removal utilizing polarimetric imaging and multi-scale analysis has been developed to address a key problem in remote sensing: haze weakens image interpretation through poor visibility and short detection distance. On the one hand, the polarization of the airlight and of the object radiance during the imaging procedure is taken into account. On the other hand, the fact that objects and haze possess different frequency distributions is exploited: multi-scale analysis through the wavelet transform allows the low-frequency components, where haze resides, and the high-frequency coefficients, which carry image details and edges, to be processed separately. With the polarization state measured via the Stokes parameters, three linearly polarized images (0°, 45°, and 90°) were taken in hazy weather, from which the best polarized image Imin and the worst one Imax were synthesized. These two haze-contaminated polarized images were then decomposed into different spatial layers by wavelet analysis; the low-frequency images were processed with a polarization dehazing algorithm, while the high-frequency components were manipulated with a nonlinear transform. The final haze-free image was reconstructed by inverse wavelet transform. Experimental results verify that the proposed dehazing method strongly improves image visibility and increases detection distance through haze for imaging warning and remote sensing systems.
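
    The overall split-process-reconstruct structure of the method reads naturally as a wavelet-domain pipeline. The sketch below (using PyWavelets) is a schematic of that structure only: the placeholder low-pass function stands in for the polarization dehazing step, and the detail gain for the nonlinear transform.

        import numpy as np
        import pywt

        def dehaze_multiscale(image, lowpass_fn, detail_gain=1.5, level=2):
            """Decompose into wavelet subbands, apply a dehazing step to the
            low-frequency approximation (where haze concentrates) and a simple
            gain to the high-frequency details (edges), then reconstruct."""
            coeffs = pywt.wavedec2(image, "db4", level=level)
            coeffs[0] = lowpass_fn(coeffs[0])
            coeffs[1:] = [tuple(detail_gain * d for d in details)
                          for details in coeffs[1:]]
            return pywt.waverec2(coeffs, "db4")

        # Toy usage: a contrast stretch stands in for the polarization step.
        img = np.random.random((128, 128))
        stretch = lambda a: (a - a.min()) / (np.ptp(a) + 1e-9)
        out = dehaze_multiscale(img, stretch)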

  4. Spatio-spectral image analysis using classical and neural algorithms

    SciTech Connect

    Roberts, S.; Gisler, G.R.; Theiler, J.

    1996-12-31

    Remote imaging at high spatial resolution has a number of environmental, industrial, and military applications. Analysis of high-resolution multi-spectral images usually involves either spectral analysis of single pixels in a multi- or hyper-spectral image or spatial analysis of multiple pixels in a panchromatic or monochromatic image. Although insufficient for some pattern recognition applications individually, the combination of spatial and spectral analytical techniques may allow the identification of more complex signatures that might not otherwise be manifested in the individual spatial or spectral domains. We report on some preliminary investigation of unsupervised classification methodologies (using both "classical" and "neural" algorithms) to identify potentially revealing features in these images. We apply dimension-reduction preprocessing to the images, cluster, and compare the clusterings obtained by different algorithms. Our classification results are analyzed both visually and with a suite of objective, quantitative measures.
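
    A minimal sketch of the dimension-reduction-then-cluster-and-compare workflow, with a random stand-in for the multispectral cube and two off-the-shelf clusterers in place of the paper's specific classical and neural algorithms.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.decomposition import PCA
        from sklearn.metrics import adjusted_rand_score
        from sklearn.mixture import GaussianMixture

        # Random stand-in for a multispectral cube, flattened to (pixels, bands).
        rng = np.random.default_rng(2)
        X = rng.random((64 * 64, 8))

        # Dimension-reduction preprocessing, then two unsupervised clusterers.
        Xr = PCA(n_components=3).fit_transform(X)
        km = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Xr)
        gm = GaussianMixture(n_components=4, random_state=0).fit_predict(Xr)

        # Objective comparison of the two clusterings.
        print("agreement (ARI):", adjusted_rand_score(km, gm))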

  5. Dynamic infrared imaging in identification of breast cancer tissue with combined image processing and frequency analysis.

    PubMed

    Joro, R; Lääperi, A-L; Soimakallio, S; Järvenpää, R; Kuukasjärvi, T; Toivonen, T; Saaristo, R; Dastidar, P

    2008-01-01

    Five combinations of image-processing algorithms were applied preoperatively to dynamic infrared (IR) images of six breast cancer patients to establish optimal enhancement of cancer tissue before frequency analysis. Mid-wave photovoltaic (PV) IR cameras with 320x254 and 640x512 pixels were used. The signal-to-noise ratio and the specificity for breast cancer were evaluated for the image-processing combinations from the image series of each patient. Before image processing and frequency analysis, the effect of patient movement was minimized with a stabilization program developed and tested in the study, which stabilizes image slices using surface markers set as measurement points on the skin of the imaged breast. A mathematical equation for a superiority value was developed for comparison of the key ratios of the image-processing combinations. The ability of each combination to locate the mammography finding of breast cancer in each patient was compared. Our results show that data collected with a 640x512-pixel mid-wave PV camera, applying image-processing methods optimizing signal-to-noise ratio, morphological image processing, and linear image restoration before frequency analysis, possess the greatest superiority value, showing the cancer area most clearly, also at the match centre of the mammography estimation. PMID:18666012

  6. Vector sparse representation of color image using quaternion matrix analysis.

    PubMed

    Xu, Yi; Yu, Licheng; Xu, Hongteng; Zhang, Hao; Nguyen, Truong

    2015-04-01

    Traditional sparse image models treat a color image pixel as a scalar, representing color channels separately or concatenating the color channels as a monochrome image. In this paper, we propose a vector sparse representation model for color images using quaternion matrix analysis. As a new tool for color image representation, its potential applications in several image-processing tasks are presented, including color image reconstruction, denoising, inpainting, and super-resolution. The proposed model represents the color image as a quaternion matrix, where a quaternion-based dictionary learning algorithm is presented using the K-quaternion singular value decomposition (K-QSVD; generalized K-means clustering for QSVD) method. It conducts the sparse basis selection in quaternion space, which uniformly transforms the channel images to an orthogonal color space. In this new color space, significantly, the inherent color structures can be completely preserved during vector reconstruction. Moreover, the proposed sparse model is more efficient compared with current sparse models for image restoration tasks, owing to the lower redundancy between the atoms of different color channels. The experimental results demonstrate that the proposed sparse image model successfully avoids the hue bias issue and shows its potential as a general and powerful tool in the color image analysis and processing domain. PMID:25643407

  7. An approach to multi-temporal MODIS image analysis using image classification and segmentation

    NASA Astrophysics Data System (ADS)

    Senthilnath, J.; Bajpai, Shivesh; Omkar, S. N.; Diwakar, P. G.; Mani, V.

    2012-11-01

    This paper discusses an approach for river mapping and flood evaluation based on multi-temporal time series analysis of satellite images utilizing pixel spectral information for image classification and region-based segmentation for extracting water-covered regions. Analysis of MODIS satellite images is applied in three stages: before flood, during flood and after flood. Water regions are extracted from the MODIS images using image classification (based on spectral information) and image segmentation (based on spatial information). Multi-temporal MODIS images from "normal" (non-flood) and flood time-periods are processed in two steps. In the first step, image classifiers such as Support Vector Machines (SVM) and Artificial Neural Networks (ANN) separate the image pixels into water and non-water groups based on their spectral features. The classified image is then segmented using spatial features of the water pixels to remove the misclassified water. From the results obtained, we evaluate the performance of the method and conclude that the use of image classification (SVM and ANN) and region-based image segmentation is an accurate and reliable approach for the extraction of water-covered regions.
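
    The two-step water-extraction scheme, pixelwise spectral classification followed by a region-based cleanup, can be sketched as below; the training labels, band count, and minimum region size are invented for illustration.

        import numpy as np
        from scipy import ndimage
        from sklearn.svm import SVC

        # Invented training data: pixel spectra with water / non-water labels.
        rng = np.random.default_rng(3)
        X_train = rng.random((500, 7))
        y_train = (X_train[:, 1] < 0.3).astype(int)   # stand-in labels
        clf = SVC(kernel="rbf").fit(X_train, y_train)

        # Step 1: spectral classification of every pixel in a scene.
        scene = rng.random((100, 100, 7))
        water = clf.predict(scene.reshape(-1, 7)).reshape(100, 100)

        # Step 2: region-based cleanup, dropping small connected components
        # that are likely misclassified water.
        labels, n = ndimage.label(water)
        sizes = ndimage.sum(water, labels, index=np.arange(1, n + 1))
        for i, s in zip(np.arange(1, n + 1), sizes):
            if s < 20:
                water[labels == i] = 0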

  8. Evaluating community investments in the mining sector using multi-criteria decision analysis to integrate SIA with business planning

    SciTech Connect

    Esteves, A.M.

    2008-05-15

    Gaining senior management's commitment to long-term social development projects, which are characterised by uncertainty and complexity, is made easier if projects are shown to benefit the site's strategic goals. However, even though the business case for community investment may have been accepted at a general level, as a strategy for competitive differentiation, risk mitigation and a desire to deliver - and to be seen to deliver - a 'net benefit' to affected communities, mining operations are still faced with implementation challenges. Case study research on mining companies, including interviews with social investment decision-makers, has assisted in developing the Social Investment Decision Analysis Tool (SIDAT), a decision model for evaluating social projects in order to create value for both the company and the community. Multi-criteria decision analysis techniques integrating business planning processes with social impact assessment have proved useful in assisting mining companies think beyond the traditional drivers (i.e. seeking access to required lands and peaceful relations with neighbours), to broader issues of how they can meet their business goals and contribute to sustainable development in the regions in which they operate.

  9. Using Multi-Criteria Analysis for the Study of Human Impact on Agro-Forestry Ecosystem in the Region of Khenchela (algeria)

    NASA Astrophysics Data System (ADS)

    Bouzekri, A.; Benmessaoud, H.

    2016-06-01

    The objective of this work is to study and analyze the human impact on the agro-forestry-pastoral ecosystem of the Khenchela region through the application of multi-criteria analysis methods integrated with geographic information systems. Our methodology is based on a weighted linear combination of four criteria chosen as representative in our analysis, relating to roads, urban areas, water resources and agricultural space. The results show the effect of urbanization and socio-economic activity on the degradation of the physical environment and indicate that 32% of the total area is very sensitive to human impact.

  10. Multiple sclerosis medical image analysis and information management.

    PubMed

    Liu, Lifeng; Meier, Dominik; Polgar-Turcsanyi, Mariann; Karkocha, Pawel; Bakshi, Rohit; Guttmann, Charles R G

    2005-01-01

    Magnetic resonance imaging (MRI) has become a central tool for patient management, as well as research, in multiple sclerosis (MS). Measurements of disease burden and activity derived from MRI through quantitative image analysis techniques are increasingly being used. There are many complexities and challenges in building computerized processing pipelines to ensure efficiency, reproducibility, and quality control for MRI scans from MS patients. Such paradigms require advanced image processing and analysis technologies, as well as integrated database management systems to ensure the most utility for clinical and research purposes. This article reviews pipelines available for quantitative clinical MRI research in MS, including image segmentation, registration, time-series analysis, performance validation, visualization techniques, and advanced medical imaging software packages. To address the complex demands of the sequential processes, the authors developed a workflow management system that uses a centralized database and distributed computing system for image processing and analysis. The implementation of their system includes a web-form-based Oracle database application for information management and event dispatching, and multiple modules for image processing and analysis. The seamless integration of processing pipelines with the database makes it more efficient for users to navigate complex, multistep analysis protocols, reduces the user's learning curve, reduces the time needed for combining and activating different computing modules, and allows for close monitoring for quality-control purposes. The authors' system can be extended to general applications in clinical trials and to routine processing for image-based clinical research. PMID:16385023

  11. Development of a quantitative autoradiography image analysis system

    SciTech Connect

    Hoffman, T.J.; Volkert, W.A.; Holmes, R.A.

    1986-03-01

    A low cost image analysis system suitable for quantitative autoradiography (QAR) analysis has been developed. Autoradiographs can be digitized using a conventional Newvicon television camera interfaced to an IBM-XT microcomputer. Software routines for image digitization and capture permit the acquisition of thresholded or windowed images with graphic overlays that can be stored on storage devices. Image analysis software performs all background and non-linearity corrections prior to display as black/white or pseudocolor images. The relationship of pixel intensity to a standard radionuclide concentration allows the production of quantitative maps of tissue radiotracer concentrations. An easily modified subroutine is provided for adaptation to use appropriate operational equations when parameters such as regional cerebral blood flow or regional cerebral glucose metabolism are under investigation. This system could provide smaller research laboratories with the capability of QAR analysis at relatively low cost.

  12. An image analysis system for near-infrared (NIR) fluorescence lymph imaging

    NASA Astrophysics Data System (ADS)

    Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.

    2011-03-01

    Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of an NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software [3, 4] is underdeveloped, requiring extensive time and effort to analyze a NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.

  13. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
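
    The two-color reduction works because the ratio of two wavelength-filtered phosphor images is, to first order, a function of temperature alone, so a calibration curve maps ratio to temperature. A minimal sketch with invented calibration points; array names are hypothetical.

        import numpy as np

        def temperature_map(i_a, i_b, calib_ratio, calib_temp):
            """Two-color phosphor reduction: convert the ratio of two
            wavelength-filtered intensity images to temperature via a
            monotonic calibration curve (calib_ratio must be increasing)."""
            ratio = i_a / np.clip(i_b, 1e-9, None)
            return np.interp(ratio, calib_ratio, calib_temp)

        # Invented calibration points (not real phosphor data).
        calib_ratio = np.array([0.2, 0.5, 1.0, 1.8, 3.0])
        calib_temp = np.array([300.0, 350.0, 400.0, 450.0, 500.0])
        rng = np.random.default_rng(5)
        T = temperature_map(rng.random((64, 64)) + 0.2,
                            rng.random((64, 64)) + 0.2,
                            calib_ratio, calib_temp)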

  14. Carotid plaque characterization using CT and MRI scans for synergistic image analysis

    NASA Astrophysics Data System (ADS)

    Getzin, Matthew; Xu, Yiqin; Rao, Arhant; Madi, Saaussan; Bahadur, Ali; Lennartz, Michelle R.; Wang, Ge

    2014-09-01

    Noninvasive determination of plaque vulnerability has been a holy grail of medical imaging. Despite advances in tomographic technologies, there is currently no effective way to identify vulnerable atherosclerotic plaques with high sensitivity and specificity. Computed tomography (CT) and magnetic resonance imaging (MRI) are widely used, but neither provides sufficient information on plaque properties. Thus, we are motivated to combine CT and MRI imaging to determine if the composite information can better reflect the histological determination of plaque vulnerability. Two human endarterectomy specimens (1 symptomatic carotid and 1 stable femoral) were imaged using Scanco Medical Viva CT40 and Bruker Pharmascan 16 cm 7T horizontal MRI/MRS systems. μCT scans were done at 55 kVp and a tube current of 70 mA. Samples underwent RARE-VTR and MSME pulse sequences to measure T1 and T2 values and proton density. The specimens were processed for histology and scored for vulnerability using the American Heart Association criteria. Single-modality analyses were performed through segmentation of key imaging biomarkers (i.e. calcification and lumen), image registration, measurement of the fibrous capsule, and multi-component T1 and T2 decay modeling. Feature differences were analyzed between the symptomatic carotid and the femoral plaque, serving as the unstable and stable controls, respectively. By building on the techniques used in this study, synergistic CT+MRI analysis may provide a promising solution for plaque characterization in vivo.

  15. Rapid analysis and exploration of fluorescence microscopy images.

    PubMed

    Pavie, Benjamin; Rajaram, Satwik; Ouyang, Austin; Altschuler, Jason M; Steininger, Robert J; Wu, Lani F; Altschuler, Steven J

    2014-01-01

    Despite rapid advances in high-throughput microscopy, quantitative image-based assays still pose significant challenges. While a variety of specialized image analysis tools are available, most traditional image-analysis-based workflows have steep learning curves (for fine tuning of analysis parameters) and result in long turnaround times between imaging and analysis. In particular, cell segmentation, the process of identifying individual cells in an image, is a major bottleneck in this regard. Here we present an alternate, cell-segmentation-free workflow based on PhenoRipper, an open-source software platform designed for the rapid analysis and exploration of microscopy images. The pipeline presented here is optimized for immunofluorescence microscopy images of cell cultures and requires minimal user intervention. Within half an hour, PhenoRipper can analyze data from a typical 96-well experiment and generate image profiles. Users can then visually explore their data, perform quality control on their experiment, ensure response to perturbations and check reproducibility of replicates. This facilitates a rapid feedback cycle between analysis and experiment, which is crucial during assay optimization. This protocol is useful not just as a first pass analysis for quality control, but also may be used as an end-to-end solution, especially for screening. The workflow described here scales to large data sets such as those generated by high-throughput screens, and has been shown to group experimental conditions by phenotype accurately over a wide range of biological systems. The PhenoBrowser interface provides an intuitive framework to explore the phenotypic space and relate image properties to biological annotations. Taken together, the protocol described here will lower the barriers to adopting quantitative analysis of image based screens. PMID:24686220

  16. A critical assessment of the performance criteria in confirmatory analysis for veterinary drug residue analysis using mass spectrometric detection in selected reaction monitoring mode.

    PubMed

    Berendsen, Bjorn J A; Meijer, Thijs; Wegh, Robin; Mol, Hans G J; Smyth, Wesley G; Armstrong Hewitt, S; van Ginkel, Leen; Nielen, Michel W F

    2016-05-01

    Besides the identification point system to assure an adequate set-up of instrumentation, European Commission Decision 2002/657/EC includes performance criteria regarding relative ion abundances in mass spectrometry and chromatographic retention time. In confirmatory analysis, the ion ratio, i.e. the relative abundance of two product ions acquired in selected reaction monitoring (SRM) mode, should be within certain ranges for confirmation of the identity of a substance. The acceptable tolerance of the ion ratio varies with the relative abundance of the two product ions, and for retention time CD 2002/657/EC allows a tolerance of 5%. Because of rapid technical advances in analytical instruments and new approaches applied in the field of contaminant testing in food products (multi-compound and multi-class methods), a critical assessment of these criteria is justified. In this study a large number of representative, though challenging, sample extracts were prepared, including muscle, urine, milk and liver, spiked with 100 registered and banned veterinary drugs at levels ranging from 0.5 to 100 µg/kg. These extracts were analysed in SRM mode under different chromatographic conditions and with mass spectrometers from different vendors. In the initial study, robust data were collected using four different instrumental set-ups. Based on a unique and highly relevant data set, consisting of over 39 000 data points, the ion ratio and retention time criteria for applicability in confirmatory analysis were assessed. The outcomes were verified in a collaborative trial including laboratories from all over the world. It was concluded that the ion ratio deviation is not related to the value of the ion ratio, but rather to the intensity of the lowest product ion. Therefore a fixed ion ratio deviation tolerance of 50% (relative) is proposed, which is also applicable for compounds present at sub-ppb levels or having poor ionisation efficiency. Furthermore, it was observed that retention time
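
    For concreteness, the confirmation check can be sketched as below. The banded tolerances follow the commonly cited 2002/657/EC table as recalled here (worth verifying against the Decision itself), and the flat 50% relative tolerance is the study's proposal.

        def ion_ratio_ok(ratio_sample, ratio_reference, fixed_tolerance=None):
            """Check a sample ion ratio (lower/higher product-ion intensity)
            against a reference standard. Default: banded relative tolerances
            as tabulated in Decision 2002/657/EC; pass fixed_tolerance=0.5 for
            the flat 50% (relative) tolerance proposed in the study above."""
            if fixed_tolerance is None:
                r = ratio_reference * 100        # as % of the base peak
                if r > 50:
                    tol = 0.20
                elif r > 20:
                    tol = 0.25
                elif r > 10:
                    tol = 0.30
                else:
                    tol = 0.50
            else:
                tol = fixed_tolerance
            return abs(ratio_sample - ratio_reference) <= tol * ratio_reference

        print(ion_ratio_ok(0.45, 0.55))   # True  (within the +/-20% band)
        print(ion_ratio_ok(0.02, 0.06))   # False (outside the +/-50% band)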

  17. Research of second harmonic generation images based on texture analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yao; Li, Yan; Gong, Haiming; Zhu, Xiaoqin; Huang, Zufang; Chen, Guannan

    2014-09-01

    Texture analysis plays a crucial role in identifying objects or regions of interest in an image. It has been applied to a variety of medical image processing tasks, ranging from the detection of disease and the segmentation of specific anatomical structures to differentiation between healthy and pathological tissues. Second harmonic generation (SHG) microscopy, as a potential noninvasive tool for imaging biological tissues, has been widely used in medicine, with reduced phototoxicity and photobleaching. In this paper, we clarify the principles of texture analysis, including statistical, transform, structural and model-based methods, and give examples of its applications, reviewing studies of the technique. Moreover, we apply texture analysis to SHG images for the differentiation of human skin scar tissues. A texture analysis method based on local binary patterns (LBP) and the wavelet transform was used to extract texture features of SHG images from collagen in normal and abnormal scars, and the scar SHG images were then classified into normal or abnormal ones. Compared with other texture analysis methods with respect to receiver operating characteristic analysis, LBP combined with the wavelet transform was demonstrated to achieve higher accuracy. It can provide a new way for the clinical diagnosis of scar types. Finally, future developments of texture analysis in SHG images are discussed.
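
    A minimal sketch of an LBP-plus-wavelet feature vector of the kind used here, via scikit-image and PyWavelets; the parameter choices (P, R, wavelet) are illustrative, and the downstream classifier is left out.

        import numpy as np
        import pywt
        from skimage.feature import local_binary_pattern

        def texture_features(image, P=8, R=1.0, wavelet="db2"):
            """Concatenate a uniform-LBP histogram with wavelet subband
            energies, the two descriptors combined in the study above."""
            lbp = local_binary_pattern(image, P, R, method="uniform")
            hist, _ = np.histogram(lbp, bins=np.arange(P + 3), density=True)
            cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)
            energies = [float(np.mean(c ** 2)) for c in (cA, cH, cV, cD)]
            return np.concatenate([hist, energies])

        img = (np.random.random((64, 64)) * 255).astype(np.uint8)
        print(texture_features(img).shape)   # (10 LBP bins + 4 energies,)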

  18. Repetition, Power Imbalance, and Intentionality: Do These Criteria Conform to Teenagers' Perception of Bullying? A Role-Based Analysis

    ERIC Educational Resources Information Center

    Cuadrado-Gordillo, Isabel

    2012-01-01

    The criteria that researchers use to classify aggressive behaviour as bullying are "repetition", "power imbalance", and "intent to hurt". However, studies that have analyzed adolescents' perceptions of bullying find that most adolescents do not simultaneously consider these three criteria. This paper examines adolescents' perceptions of bullying…

  19. An exploratory analysis of Indiana and Illinois biotic assemblage data in support of state nutrient criteria development

    EPA Science Inventory

    EPA recognizes the importance of nutrient criteria in protecting designated uses from eutrophication effects associated with elevated phosphorus and nitrogen in streams and has worked with states over the past 12 years to assist them in developing nutrient criteria. Towards that ...

  20. Multistage hierarchy for fast image analysis

    NASA Astrophysics Data System (ADS)

    Grudin, Maxim A.; Harvey, David M.; Timchenko, Leonid I.

    1996-12-01

    In this paper, a novel approach is proposed, which allows for an efficient reduction of the amount of visual data required for representing structural information in the image. This is a multistage architecture which investigates partial correlations between structural image components. Mathematical description of the multistage hierarchical processing is provided, together with the network architecture. Initially the image is partitioned to be processed in parallel channels. In each channel, the structural components are transformed and subsequently separated, depending on their structural significance, to be then combined with the components from other channels for further processing. The output result is represented as a pattern vector, whose components are computed one at a time to allow the quickest possible response. The input gray-scale image is transformed before the processing begins, so that each pixel contains information about the spatial structure of its neighborhood. The most correlated information is extracted first, making the algorithm tolerant to minor structural changes.

  1. Introducing PLIA: Planetary Laboratory for Image Analysis

    NASA Astrophysics Data System (ADS)

    Peralta, J.; Hueso, R.; Barrado, N.; Sánchez-Lavega, A.

    2005-08-01

    We present a graphical software tool developed under IDL software to navigate, process and analyze planetary images. The software has a complete Graphical User Interface and is cross-platform. It can also run under the IDL Virtual Machine without the need to own an IDL license. The set of tools included allow image navigation (orientation, centring and automatic limb determination), dynamical and photometric atmospheric measurements (winds and cloud albedos), cylindrical and polar projections, as well as image treatment under several procedures. Being written in IDL, it is modular and easy to modify and grow for adding new capabilities. We show several examples of the software capabilities with Galileo-Venus observations: image navigation, photometric corrections, wind profiles obtained by cloud tracking, cylindrical projections and cloud photometric measurements. Acknowledgements: This work has been funded by Spanish MCYT PNAYA2003-03216, fondos FEDER and Grupos UPV 15946/2004. R. Hueso acknowledges a post-doc fellowship from Gobierno Vasco.

  2. Histology image analysis for carcinoma detection and grading

    PubMed Central

    He, Lei; Long, L. Rodney; Antani, Sameer; Thoma, George R.

    2012-01-01

    This paper presents an overview of the image analysis techniques in the domain of histopathology, specifically for the objective of automated carcinoma detection and classification. As in other biomedical imaging areas such as radiology, many computer-assisted diagnosis (CAD) systems have been implemented to aid histopathologists and clinicians in cancer diagnosis and research, attempting to significantly reduce the labor and subjectivity of traditional manual intervention with histology images. The task of automated histology image analysis is usually not simple, due to the unique characteristics of histology imaging, including the variability in image preparation techniques, clinical interpretation protocols, and the complex structures and very large size of the images themselves. In this paper we discuss those characteristics, provide relevant background information about slide preparation and interpretation, and review the application of digital image processing techniques to the field of histology image analysis. In particular, emphasis is given to state-of-the-art image segmentation methods for feature extraction and disease classification. Four major carcinomas, of the cervix, prostate, breast, and lung, are selected to illustrate the functions and capabilities of existing CAD systems. PMID:22436890

  3. MR brain image analysis in dementia: From quantitative imaging biomarkers to ageing brain models and imaging genetics.

    PubMed

    Niessen, Wiro J

    2016-10-01

    MR brain image analysis has been a consistently active research area in medical image analysis over the past two decades. In this article, it is discussed how the field developed from the construction of tools for automatic quantification of brain morphology, function, connectivity and pathology, to creating models of the ageing brain in normal ageing and disease, and tools for integrated analysis of imaging and genetic data. The current and future role of the field in improved understanding of the development of neurodegenerative disease is discussed, together with its potential for aiding in early and differential diagnosis and prognosis of different types of dementia. For the latter, the use of reference imaging data and reference models derived from large clinical and population imaging studies, and the application of machine learning techniques to these reference data, are expected to play a key role. PMID:27344937

  4. Localised manifold learning for cardiac image analysis

    NASA Astrophysics Data System (ADS)

    Bhatia, Kanwal K.; Price, Anthony N.; Hajnal, Jo V.; Rueckert, Daniel

    2012-02-01

    Manifold learning is increasingly being used to discover the underlying structure of medical image data. Traditional approaches operate on whole images with a single measure of similarity used to compare entire images. In this way, information on the locality of differences is lost and smaller trends may be masked by dominant global differences. In this paper, we propose the use of multiple local manifolds to analyse regions of images without any prior knowledge of which regions are important. Localised manifolds are created by partitioning images into regular subsections with a manifold constructed for each patch. We propose a framework for incorporating information from the neighbours of each patch to calculate a coherent embedding. This generates a simultaneous dimensionality reduction of all patches and results in the creation of embeddings which are spatially-varying. Additionally, a hierarchical method is presented to enable a multi-scale embedding solution. We use this to extract spatially-varying respiratory and cardiac motions from cardiac MRI. Although there is a complex interplay between these motions, we show how they can be separated on a regional basis. We demonstrate the utility of the localised joint embedding over a global embedding of whole images and over embedding individual patches independently.

  5. Radar images analysis for scattering surfaces characterization

    NASA Astrophysics Data System (ADS)

    Piazza, Enrico

    1998-10-01

    In view of the different problems and techniques related to the detection and recognition of airplanes and vehicles moving on the airport surface, the present work mainly deals with the processing of images gathered by a high-resolution radar sensor. The radar images used to test the investigated algorithms come from sequences of images obtained in field experiments carried out by the Electronic Engineering Department of the University of Florence. The radar is the Ka-band radar operating at the 'Leonardo da Vinci' Airport in Fiumicino (Rome). The images obtained from the radar scan converter are digitized and put into x, y (pixel) co-ordinates. For a correct matching of the images, these are converted into true geometrical co-ordinates (meters) on the basis of fixed points on an airport map. By correlating the airplane 2-D multipoint template with actual radar images, the value of the signal at the points involved in the template can be extracted. Results for a large number of observations show a typical response for the main section of the fuselage and the wings. For the fuselage, the back-scattered echo is low at the prow, becomes larger near the centre of the aircraft and then decreases again toward the tail. For the wings, the signal grows with a fairly regular slope from the fuselage to the tips, where the signal is the strongest.

  6. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  7. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  8. Unsupervised analysis of small animal dynamic Cerenkov luminescence imaging

    NASA Astrophysics Data System (ADS)

    Spinelli, Antonello E.; Boschi, Federico

    2011-12-01

    Clustering analysis (CA) and principal component analysis (PCA) were applied to dynamic Cerenkov luminescence images (dCLI). In order to investigate the performance of the proposed approaches, two distinct dynamic data sets obtained by injecting mice with 32P-ATP and 18F-FDG were acquired using the IVIS 200 optical imager. The k-means clustering algorithm was applied to dCLI and implemented using Interactive Data Language (IDL) 8.1. We show that cluster analysis allows us to obtain good agreement between the clustered regions and the corresponding emission regions, such as the bladder, the liver, and the tumor. We also show a good correspondence between the time-activity curves of the different regions obtained using CA and manual region-of-interest analysis on the dCLI and PCA images. We conclude that CA provides an automatic, unsupervised method for the analysis of preclinical dynamic Cerenkov luminescence image data.
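
    The clustering step reduces to running k-means on per-pixel time-activity curves. A sketch in Python with scikit-learn (the study itself used IDL); the dynamic sequence here is random stand-in data.

        import numpy as np
        from sklearn.cluster import KMeans

        # Random stand-in for a dynamic sequence: (frames, rows, cols).
        rng = np.random.default_rng(4)
        seq = rng.random((20, 64, 64))

        # Each pixel becomes a time-activity curve; cluster the curves.
        frames, rows, cols = seq.shape
        curves = seq.reshape(frames, -1).T           # (pixels, frames)
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(curves)
        cluster_map = labels.reshape(rows, cols)

        # Mean time-activity curve per cluster, for comparison with manual
        # region-of-interest analysis.
        for k in range(4):
            tac = curves[labels == k].mean(axis=0)
            print("cluster", k, "peak frame:", int(np.argmax(tac)))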

  9. Digital Image Analysis for DETECHIP(®) Code Determination.

    PubMed

    Lyon, Marcus; Wilson, Mark V; Rouhier, Kerry A; Symonsbergen, David J; Bastola, Kiran; Thapa, Ishwor; Holmes, Andrea E; Sikich, Sharmin M; Jackson, Abby

    2012-08-01

    DETECHIP(®) is a molecular sensing array used for identification of a large variety of substances. Previous methodology for the analysis of DETECHIP(®) used human vision to distinguish color changes induced by the presence of the analyte of interest. This paper describes several analysis techniques using digital images of DETECHIP(®). Both a digital camera and a flatbed desktop photo scanner were used to obtain JPEG images. Color information within these digital images was obtained through the measurement of red-green-blue (RGB) values using software such as GIMP, Photoshop and ImageJ. Several different techniques were used to evaluate these color changes. It was determined that the flatbed scanner produced the clearest and most reproducible images. Furthermore, codes obtained using a macro written for use within ImageJ showed improved consistency versus previous methods. PMID:25267940
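
    The RGB measurement itself is a small operation once the scan is on disk; the sketch below mirrors in Python what the paper did with ImageJ/GIMP macros. The file name and well coordinates are hypothetical.

        import numpy as np
        from PIL import Image

        def mean_rgb(path, box):
            """Mean red/green/blue values over a rectangular region of a
            scanned array image; box = (left, upper, right, lower) in pixels,
            matching what one would measure in ImageJ or GIMP."""
            img = Image.open(path).convert("RGB")
            patch = np.asarray(img.crop(box), dtype=float)
            return patch.reshape(-1, 3).mean(axis=0)

        # Hypothetical usage: file name and well location are invented.
        # r, g, b = mean_rgb("detechip_scan.jpg", box=(120, 80, 160, 120))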

  10. Anima: modular workflow system for comprehensive image data analysis.

    PubMed

    Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa

    2014-01-01

    Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps starting from data import and pre-processing to segmentation and statistical analysis; and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is a fully open source and available with documentation at www.anduril.org/anima. PMID:25126541

  11. Anima: Modular Workflow System for Comprehensive Image Data Analysis

    PubMed Central

    Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa

    2014-01-01

    Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps starting from data import and pre-processing to segmentation and statistical analysis; and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is a fully open source and available with documentation at www.anduril.org/anima. PMID:25126541

  12. Value-Based Assessment of New Medical Technologies: Towards a Robust Methodological Framework for the Application of Multiple Criteria Decision Analysis in the Context of Health Technology Assessment.

    PubMed

    Angelis, Aris; Kanavos, Panos

    2016-05-01

    In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making. PMID:26739955
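
    A hedged sketch of the additive multi-attribute value scoring that MAVT-based MCDA typically reduces to; the criteria, weights and scores below are invented for illustration and are not the authors' framework:

```python
# Additive multi-attribute value model: overall value = sum of weighted
# partial value scores. Criteria, weights and scores are hypothetical.
weights = {"efficacy": 0.4, "safety": 0.3, "innovation": 0.2, "socioeconomic": 0.1}
assert abs(sum(weights.values()) - 1.0) < 1e-9   # weights reflect relative importance

# Partial value scores on a common 0-100 scale for two hypothetical technologies.
scores = {
    "technology_A": {"efficacy": 70, "safety": 80, "innovation": 50, "socioeconomic": 60},
    "technology_B": {"efficacy": 85, "safety": 60, "innovation": 70, "socioeconomic": 40},
}

for tech, s in scores.items():
    overall = sum(weights[c] * s[c] for c in weights)
    print(tech, overall)   # ranking across a wider, explicit set of criteria
```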

  13. Basic research planning in mathematical pattern recognition and image analysis

    NASA Technical Reports Server (NTRS)

    Bryant, J.; Guseman, L. F., Jr.

    1981-01-01

    Fundamental problems encountered while attempting to develop automated techniques for applications of remote sensing are discussed under the following categories: (1) geometric and radiometric preprocessing; (2) spatial, spectral, temporal, syntactic, and ancillary digital image representation; (3) image partitioning, proportion estimation, and error models in object scene inference; (4) parallel processing and image data structures; and (5) continuing studies in polarization; computer architectures and parallel processing; and the applicability of "expert systems" to interactive analysis.

  14. An Analysis of the Magneto-Optic Imaging System

    NASA Technical Reports Server (NTRS)

    Nath, Shridhar

    1996-01-01

    The Magneto-Optic Imaging system is being used for the detection of defects in airframes and other aircraft structures. The system has been successfully applied to detecting surface cracks, but has difficulty in the detection of sub-surface defects such as corrosion. The intent of the grant was to understand the physics of the MOI better, in order to use it effectively for detecting corrosion and for classifying surface defects. Finite element analysis, image classification, and image processing are addressed.

  15. Uncooled LWIR imaging: applications and market analysis

    NASA Astrophysics Data System (ADS)

    Takasawa, Satomi

    2015-05-01

    The evolution of infrared (IR) imaging sensor technology for the defense market has played an important role in developing the commercial market, as dual use of the technology has expanded. In particular, technologies for both pixel-pitch reduction and vacuum packaging have evolved drastically in the area of uncooled Long-Wave IR (LWIR; 8-14 μm wavelength region) imaging sensors, increasing the opportunity to create new applications. From a macroscopic point of view, the uncooled LWIR imaging market is divided into two areas. One is a high-end market that requires uncooled LWIR imaging sensors with sensitivity as close as possible to that of cooled sensors, while the other is a low-end market driven by miniaturization and price reduction. In the latter case especially, approaches to the consumer market have recently appeared, such as the application of uncooled LWIR imaging sensors to night vision for automobiles and smartphones. The appearance of such commodity products will surely change existing business models. Further technological innovation is necessary to create a consumer market, and there will be room for other companies, such as suppliers of lens and getter materials, to enter the consumer market.

  16. Continuous-wave terahertz scanning image resolution analysis and restoration

    NASA Astrophysics Data System (ADS)

    Li, Qi; Yin, Qiguo; Yao, Rui; Ding, Shenghui; Wang, Qi

    2010-03-01

    Resolution of continuous-wave (CW) terahertz scanning images is limited by many factors, among which the aperture effect of the finite focus diameter is very important. We have investigated the factors that affect terahertz (THz) image resolution in detail through theoretical analysis and simulation. In addition, in order to enhance THz image resolution, the Richardson-Lucy algorithm has been introduced as a promising approach to recover image details. By analyzing the imaging theory, it is proposed that the intensity distribution function of the actual THz laser focal spot can be used as an approximate point spread function (PSF) in the restoration algorithm. The focal spot image can be obtained with a pyroelectric camera, and the mean-filtered focal spot image is used as the PSF. Simulation and experiment show that the implemented algorithm is comparatively effective.
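
    The restoration step can be reproduced with the Richardson-Lucy implementation in scikit-image; in this sketch a Gaussian spot stands in for the measured focal-spot PSF described above:

```python
# Richardson-Lucy restoration of a blurred scan. In the paper the PSF is the
# mean-filtered focal-spot image from a pyroelectric camera; here a Gaussian
# spot is used as a stand-in.
import numpy as np
from scipy.ndimage import convolve
from skimage.restoration import richardson_lucy

def gaussian_psf(size=15, sigma=3.0):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

rng = np.random.default_rng(0)
scene = np.zeros((64, 64)); scene[20:30, 20:30] = 1.0          # toy object
psf = gaussian_psf()
blurred = convolve(scene, psf) + 0.01 * rng.standard_normal(scene.shape)
restored = richardson_lucy(np.clip(blurred, 0, 1), psf, 30)    # 30 iterations
```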

  17. Multi-Scale Fractal Analysis of Image Texture and Pattern

    NASA Technical Reports Server (NTRS)

    Emerson, Charles W.; Lam, Nina Siu-Ngan; Quattrochi, Dale A.

    1999-01-01

    Analyses of the fractal dimension of Normalized Difference Vegetation Index (NDVI) images of homogeneous land covers near Huntsville, Alabama revealed that the fractal dimension of an image of an agricultural land cover indicates greater complexity as pixel size increases, a forested land cover gradually grows smoother, and an urban image remains roughly self-similar over the range of pixel sizes analyzed (10 to 80 meters). A similar analysis of Landsat Thematic Mapper images of the East Humboldt Range in Nevada taken four months apart shows a more complex relation between pixel size and fractal dimension. The major visible difference between the spring and late summer NDVI images is the absence of high elevation snow cover in the summer image. This change significantly alters the relation between fractal dimension and pixel size. The slope of the fractal dimension-resolution relation provides indications of how image classification or feature identification will be affected by changes in sensor spatial resolution.
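
    The study above uses estimators tailored to remote sensing imagery; purely as a generic illustration of the underlying idea (detail scaling with measurement resolution), a minimal box-counting sketch:

```python
# Box-counting estimate of the fractal dimension of a binary pattern:
# count occupied boxes at several box sizes and fit the log-log slope,
# using N(s) ~ s**(-D).
import numpy as np

def box_count_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        h, w = mask.shape
        trimmed = mask[: h - h % s, : w - w % s]        # drop ragged edges
        blocks = trimmed.reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())    # occupied boxes of side s
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Toy test: a filled square should give D close to 2.
mask = np.zeros((128, 128), dtype=bool)
mask[16:112, 16:112] = True
print(box_count_dimension(mask))
```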

  19. A Multi-Criteria Decision Analysis Model to Assess the Safety of Botanicals Utilizing Data on History of Use

    PubMed Central

    Neely, T.; Walsh-Mason, B.; Russell, P.; Horst, A. Van Der; O’Hagan, S.; Lahorkar, P.

    2011-01-01

    Botanicals (herbal materials and extracts) are widely used in traditional medicines throughout the world. Many have an extensive history of safe use over several hundreds of years. There is now a growing consumer interest in food and cosmetic products which contain botanicals. There are many publications describing safety assessment approaches for botanicals based on the history of safe use. However, they do not define what constitutes a history of safe use, a decision that is ultimately a subjective one. Multi-criteria decision analysis (MCDA) is a model that has been developed to assess the safety of botanical ingredients using a history-of-use approach. The model evaluates the similarity of the botanical ingredient of interest to its historic counterpart (the comparator), the evidence supporting the history of use, and any evidence of concern. The assessment made is whether a botanical ingredient is as safe as its comparator botanical, which has a history of use. In order to establish compositional similarity between the botanical ingredient and its comparator, an analytical ‘similarity scoring’ approach has been developed. Applicability of the model is discussed with an example, Brahmi (Bacopa monnieri). This evolution of the risk assessment of botanicals gives an objective, transparent, and transferable safety assessment approach. PMID:22025816

  20. Optical image acquisition system for colony analysis

    NASA Astrophysics Data System (ADS)

    Wang, Weixing; Jin, Wenbiao

    2006-02-01

    For the counting of both colonies and plaques, there is a large number of applications including food, dairy, beverages, hygiene, environmental monitoring, water, toxicology, sterility testing, AMES testing, pharmaceuticals, paints, sterile fluids and fungal contamination. Recently, many researchers and developers have made efforts to develop such systems. Investigation shows that some existing systems have problems, as they represent a new class of technology products. One of the main problems is image acquisition. In order to acquire colony images with good quality, an illumination box was constructed as follows: the box includes front lighting and back lighting, which can be selected by users based on the properties of the colony dishes. With the illumination box, lighting is uniform, and the colony dish can be placed in the same position every time, which makes image processing easy. A digital camera at the top of the box is connected to a PC via a USB cable, and all camera functions are controlled by the computer.
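
    A minimal sketch of the counting step such an acquisition system feeds, assuming the uniform back-lighting described above; the file name, threshold and size cutoff are placeholders:

```python
# Count colonies in a back-lit dish image by thresholding and labeling
# connected components.
import numpy as np
from PIL import Image
from scipy import ndimage

gray = np.asarray(Image.open("colony_dish.png").convert("L"), dtype=float)
binary = gray < 100                                  # dark colonies, bright background
labels, n = ndimage.label(binary)                    # connected-component labeling
sizes = ndimage.sum(binary, labels, range(1, n + 1)) # area of each component
print("colonies:", int((sizes > 20).sum()))          # drop tiny noise specks
```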

  1. System Matrix Analysis for Computed Tomography Imaging

    PubMed Central

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high-quality CT images be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
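
    A simplified 2D sketch of a Siddon-style computation of one system-matrix row, i.e. the intersection length of a single ray with each pixel of a square grid; the real method handles 3D geometry and scanner-specific ray sets:

```python
# Parametric (Siddon-style) ray tracing: collect the parameter values where
# the ray crosses grid lines, then turn consecutive crossings into per-pixel
# intersection lengths.
import numpy as np

def ray_pixel_lengths(p0, p1, n):
    """Intersection length of segment p0->p1 with each cell of an n x n grid
    whose cell boundaries lie at integer coordinates 0..n."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    ts = [0.0, 1.0]
    for axis in range(2):                      # crossings of x- and y-grid lines
        if d[axis] != 0:
            t = (np.arange(n + 1) - p0[axis]) / d[axis]
            ts.extend(t[(t > 0) & (t < 1)])
    ts = np.unique(ts)
    row = np.zeros((n, n))
    for ta, tb in zip(ts[:-1], ts[1:]):
        mid = p0 + 0.5 * (ta + tb) * d         # segment midpoint picks the cell
        i, j = int(mid[0]), int(mid[1])
        if 0 <= i < n and 0 <= j < n:
            row[j, i] = (tb - ta) * np.linalg.norm(d)
    return row

print(ray_pixel_lengths((0.0, 0.5), (4.0, 0.5), 4)[0])   # -> [1. 1. 1. 1.]
```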

  3. [Analysis of bone tissues by intravital imaging].

    PubMed

    Mizuno, Hiroki; Yamashita, Erika; Ishii, Masaru

    2016-05-01

    In recent years,"the fluorescent imaging techniques"has made rapid advances, it has become possible to observe the dynamics of living cells in individuals or tissues. It has been considered that it is extremely difficult to observe the living bone marrow directly because bone marrow is surrounded by a hard calcareous. But now, we established a method for observing the cells constituting the bone marrow of living mice in real time by the use of the intravital two-photon imaging system. In this article, we show the latest data and the reports about the hematopoietic stem cells and the leukemia cells by using the intravital imaging techniques, and also discuss its further application. PMID:27117619

  4. Texture Analysis for Classification of Risat-Ii Images

    NASA Astrophysics Data System (ADS)

    Chakraborty, D.; Thakur, S.; Jeyaram, A.; Krishna Murthy, Y. V. N.; Dadhwal, V. K.

    2012-08-01

    RISAT-II, or Radar Imaging Satellite-II, is a microwave imaging satellite launched by ISRO to take images of the earth during day and night, in all weather conditions. This satellite enhances ISRO's capability for disaster management applications together with forestry, agricultural, urban and oceanographic applications. Conventional pixel-based classification techniques cannot classify these types of images, since they do not take into account the texture information of the image. This paper presents a method to classify high-resolution RISAT-II microwave images based on texture analysis. It suppresses the speckle noise from the microwave image before analyzing its texture, since speckle is essentially a form of noise that degrades the quality of an image and makes interpretation (visual or digital) more difficult. A local adaptive median filter is developed that uses local statistics to detect speckle noise in the microwave image and to replace it with a local median value. The Local Binary Pattern (LBP) operator is proposed to measure the texture around each pixel of the speckle-suppressed microwave image. It considers a series of circles (2D) centered on the pixel with incremental radius values, and the pixels intersected on the perimeter of the circles of radius r (where r = 1, 3 and 5) are used for measuring the LBP of the center pixel. The significance of LBP is that it measures the texture around each pixel of the image and is computationally simple. The ISODATA method is used to cluster the transformed LBP image. The proposed method adequately classifies RISAT-II X band microwave images without human intervention.
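
    A hedged sketch of the texture pipeline: multi-radius LBP images followed by unsupervised clustering, with scikit-image's LBP operator and k-means standing in for ISODATA (which scikit-learn does not provide); the input is a random placeholder for a despeckled image:

```python
# Multi-radius local binary patterns, then per-pixel clustering of the
# resulting texture features.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
img = (rng.random((64, 64)) * 255).astype(np.uint8)   # placeholder image

# One LBP image per radius (r = 1, 3, 5 as in the paper), stacked as features.
features = np.stack(
    [local_binary_pattern(img, P=8 * r, R=r, method="uniform") for r in (1, 3, 5)],
    axis=-1,
)
labels = KMeans(n_clusters=4, n_init=10).fit_predict(features.reshape(-1, 3))
classified = labels.reshape(img.shape)                # unsupervised class map
```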

  5. Four challenges in medical image analysis from an industrial perspective.

    PubMed

    Weese, Jürgen; Lorenz, Cristian

    2016-10-01

    Today's medical imaging systems produce a huge amount of images containing a wealth of information. However, the information is hidden in the data and image analysis algorithms are needed to extract it, to make it readily available for medical decisions and to enable an efficient workflow. Advances in medical image analysis over the past 20 years mean there are now many algorithms and ideas available that make it possible to address medical image analysis tasks in commercial solutions with sufficient performance in terms of accuracy, reliability and speed. At the same time new challenges have arisen. Firstly, there is a need for more generic image analysis technologies that can be efficiently adapted for a specific clinical task. Secondly, efficient approaches for ground truth generation are needed to match the increasing demands regarding validation and machine learning. Thirdly, algorithms for analyzing heterogeneous image data are needed. Finally, anatomical and organ models play a crucial role in many applications, and algorithms to construct patient-specific models from medical images with a minimum of user interaction are needed. These challenges are complementary to the on-going need for more accurate, more reliable and faster algorithms, and dedicated algorithmic solutions for specific applications. PMID:27344939

  6. Disability in Physical Education Textbooks: An Analysis of Image Content

    ERIC Educational Resources Information Center

    Taboas-Pais, Maria Ines; Rey-Cao, Ana

    2012-01-01

    The aim of this paper is to show how images of disability are portrayed in physical education textbooks for secondary schools in Spain. The sample was composed of 3,316 images published in 36 textbooks by 10 publishing houses. A content analysis was carried out using a coding scheme based on categories employed in other similar studies and adapted…

  7. An Online Image Analysis Tool for Science Education

    ERIC Educational Resources Information Center

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  8. Ringed impact craters on Venus: An analysis from Magellan images

    NASA Technical Reports Server (NTRS)

    Alexopoulos, Jim S.; Mckinnon, William B.

    1992-01-01

    We have analyzed cycle 1 Magellan images covering approximately 90 percent of the venusian surface and have identified 55 unequivocal peak-ring craters and multiringed impact basins. This comprehensive study (52 peak-ring craters and at least 3 multiringed impact basins) complements our earlier independent analysis of Arecibo and Venera images and initial Magellan data and that of the Magellan team.

  9. Higher Education Institution Image: A Correspondence Analysis Approach.

    ERIC Educational Resources Information Center

    Ivy, Jonathan

    2001-01-01

    Investigated how marketing is used to convey higher education institution type image in the United Kingdom and South Africa. Using correspondence analysis, revealed the unique positionings created by old and new universities and technikons in these countries. Also identified which marketing tools they use in conveying their image. (EV)

  10. Geopositioning Precision Analysis of Multiple Image Triangulation Using Lro Nac Lunar Images

    NASA Astrophysics Data System (ADS)

    Di, K.; Xu, B.; Liu, B.; Jia, M.; Liu, Z.

    2016-06-01

    This paper presents an empirical analysis of the geopositioning precision of multiple image triangulation using Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images at the Chang'e-3(CE-3) landing site. Nine LROC NAC images are selected for comparative analysis of geopositioning precision. Rigorous sensor models of the images are established based on collinearity equations with interior and exterior orientation elements retrieved from the corresponding SPICE kernels. Rational polynomial coefficients (RPCs) of each image are derived by least squares fitting using vast number of virtual control points generated according to rigorous sensor models. Experiments of different combinations of images are performed for comparisons. The results demonstrate that the plane coordinates can achieve a precision of 0.54 m to 2.54 m, with a height precision of 0.71 m to 8.16 m when only two images are used for three-dimensional triangulation. There is a general trend that the geopositioning precision, especially the height precision, is improved with the convergent angle of the two images increasing from several degrees to about 50°. However, the image matching precision should also be taken into consideration when choosing image pairs for triangulation. The precisions of using all the 9 images are 0.60 m, 0.50 m, 1.23 m in along-track, cross-track, and height directions, which are better than most combinations of two or more images. However, triangulation with selected fewer images could produce better precision than that using all the images.

  11. Analysis of PETT images in psychiatric disorders

    SciTech Connect

    Brodie, J.D.; Gomez-Mont, F.; Volkow, N.D.; Corona, J.F.; Wolf, A.P.; Wolkin, A.; Russell, J.A.G.; Christman, D.; Jaeger, J.

    1983-01-01

    A quantitative method is presented for studying the pattern of metabolic activity in a set of Positron Emission Transaxial Tomography (PETT) images. Using complex Fourier coefficients as a feature vector for each image, cluster, principal components, and discriminant function analyses are used to empirically describe metabolic differences between control subjects and patients with DSM III diagnosis for schizophrenia or endogenous depression. We also present data on the effects of neuroleptic treatment on the local cerebral metabolic rate of glucose utilization (LCMRGI) in a group of chronic schizophrenics using the region of interest approach. 15 references, 4 figures, 3 tables.
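
    A minimal sketch of the feature construction described above, with random placeholders for the PETT images: low-order complex Fourier coefficients as feature vectors, followed by principal components for the subsequent cluster or discriminant analysis:

```python
# Fourier-coefficient feature vectors for a stack of images, reduced by PCA.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
images = rng.random((20, 32, 32))               # 20 placeholder scans

coeffs = np.fft.fft2(images)[:, :4, :4]         # keep low-frequency coefficients
features = np.column_stack(
    [coeffs.real.reshape(20, -1), coeffs.imag.reshape(20, -1)]
)
scores = PCA(n_components=3).fit_transform(features)
print(scores.shape)                             # (20, 3): inputs for clustering
```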

  12. SLAR image interpretation keys for geographic analysis

    NASA Technical Reports Server (NTRS)

    Coiner, J. C.

    1972-01-01

    A means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture is suggested by providing interpretation keys as an easily implemented interpretation model. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.

  13. Challenges and opportunities for quantifying roots and rhizosphere interactions through imaging and image analysis.

    PubMed

    Downie, H F; Adu, M O; Schmidt, S; Otten, W; Dupuy, L X; White, P J; Valentine, T A

    2015-07-01

    The morphology of roots and root systems influences the efficiency by which plants acquire nutrients and water, anchor themselves and provide stability to the surrounding soil. Plant genotype and the biotic and abiotic environment significantly influence root morphology, growth and ultimately crop yield. The challenge for researchers interested in phenotyping root systems is, therefore, not just to measure roots and link their phenotype to the plant genotype, but also to understand how the growth of roots is influenced by their environment. This review discusses progress in quantifying root system parameters (e.g. in terms of size, shape and dynamics) using imaging and image analysis technologies and also discusses their potential for providing a better understanding of root:soil interactions. Significant progress has been made in image acquisition techniques; however, trade-offs exist between sample throughput, sample size, image resolution and information gained. All of these factors impact on downstream image analysis processes. While there have been significant advances in computation power, limitations still exist in statistical processes involved in image analysis. Utilizing and combining different imaging systems, integrating measurements and image analysis where possible, and amalgamating data will allow researchers to gain a better understanding of root:soil interactions. PMID:25211059

  14. Spatially Weighted Principal Component Analysis for Imaging Classification

    PubMed Central

    Guo, Ruixin; Ahn, Mihye; Zhu, Hongtu

    2014-01-01

    The aim of this paper is to develop a supervised dimension reduction framework, called Spatially Weighted Principal Component Analysis (SWPCA), for high dimensional imaging classification. Two main challenges in imaging classification are the high dimensionality of the feature space and the complex spatial structure of imaging data. In SWPCA, we introduce two sets of novel weights including global and local spatial weights, which enable a selective treatment of individual features and incorporation of the spatial structure of imaging data and class label information. We develop an efficient two-stage iterative SWPCA algorithm and its penalized version along with the associated weight determination. We use both simulation studies and real data analysis to evaluate the finite-sample performance of our SWPCA. The results show that SWPCA outperforms several competing principal component analysis (PCA) methods, such as supervised PCA (SPCA), and other competing methods, such as sparse discriminant analysis (SDA). PMID:26089629

  15. Electron Microscopy and Image Analysis for Selected Materials

    NASA Technical Reports Server (NTRS)

    Williams, George

    1999-01-01

    This particular project was completed in collaboration with the metallurgical diagnostics facility. The objective of this research had four major components. First, we required training in the operation of the environmental scanning electron microscope (ESEM) for imaging of selected materials including biological specimens. The types of materials range from cyanobacteria and diatoms to cloth, metals, sand, composites and other materials. Second, to obtain training in surface elemental analysis technology using energy dispersive x-ray (EDX) analysis, and in the preparation of x-ray maps of these same materials. Third, to provide training for the staff of the metallurgical diagnostics and failure analysis team in the area of image processing and image analysis technology using NIH Image software. Finally, we were to assist in the sample preparation, observing, imaging, and elemental analysis for Mr. Richard Hoover, one of NASA MSFC's solar physicists and Marshall's principal scientist for the agency-wide virtual Astrobiology Institute. These materials have been collected from various places around the world including the Fox Tunnel in Alaska, Siberia, Antarctica, ice core samples from near Lake Vostoc, thermal vents in the ocean floor, hot springs and many others. We were successful in our efforts to obtain high quality, high resolution images of various materials including selected biological ones. Surface analyses (EDX) and x-ray maps were easily prepared with this technology. We also discovered and used some applications for NIH Image software in the metallurgical diagnostics facility.

  16. Automated Analysis of Mammography Phantom Images

    NASA Astrophysics Data System (ADS)

    Brooks, Kenneth Wesley

    The present work stems from the hypothesis that humans are inconsistent when making subjective analyses of images and that human decisions for moderately complex images may be performed by a computer with complete objectivity, once a human acceptance level has been established. The following goals were established to test the hypothesis: (1) investigate observer variability within the standard mammographic phantom evaluation process; (2) evaluate options for high-resolution image digitization and utilize the most appropriate technology for standard mammographic phantom film digitization; (3) develop a machine-based vision system for evaluating standard mammographic phantom images to eliminate effects of human variabilities; and (4) demonstrate the completed system's performance against human observers for accreditation and for manufacturing quality control of standard mammographic phantom images. The following methods and procedures were followed to achieve the goals of the research: (1) human variabilities in the American College of Radiology accreditation process were simulated by observer studies involving 30 medical physicists and these were compared to the same number of diagnostic radiologists and untrained control group of observers; (2) current digitization technologies were presented and performance test procedures were developed; three devices were tested which represented commercially available high, intermediate and low-end contrast and spatial resolution capabilities; (3) optimal image processing schemes were applied and tested which performed low, intermediate and high-level computer vision tasks; and (4) the completed system's performance was tested against human observers for accreditation and for manufacturing quality control of standard mammographic phantom images. The results from application of the procedures were as follows: (1) the simulated American College of Radiology mammography accreditation program phantom evaluation process demonstrated

  17. The Inner Magnetospheric Imager (IMI): Instrument heritage and orbit viewing analysis

    NASA Technical Reports Server (NTRS)

    Wilson, Gordon R.

    1992-01-01

    For the last two years an engineering team in the Program Development Office at MSFC has been doing design studies for the proposed Inner Magnetospheric Imager (IMI) mission. This team had a need for more information about the instruments that this mission would carry so that they could get a better handle on instrument volume, mass, power, and telemetry needs as well as information to help assess the possible cost of such instruments and what technology development they would need. To get this information, an extensive literature search was conducted as well as interviews with several members of the IMI science working group. The results of this heritage survey are summarized below. There was also a need to evaluate the orbits proposed for this mission from the standpoint of their suitability for viewing the various magnetospheric features that are planned for this mission. This was accomplished by first, identifying the factors which need to be considered in selecting an orbit, second, translating these considerations into specific criteria, and third, evaluating the proposed orbits against these criteria. The specifics of these criteria and the results of the orbit analysis are contained in the last section of this report.

  18. Image analysis in dual modality tomography for material classification

    NASA Astrophysics Data System (ADS)

    Basarab-Horwath, I.; Daniels, A. T.; Green, R. G.

    2001-08-01

    A dual modality tomographic system is described for material classification in a simulated multi-component flow regime. It combines two tomographic modalities, electrical current and light, to image the interrogated area. Derived image parameters did not allow material classification. Principal component analysis (PCA) was performed on this data set, producing a new parameter set which allowed material classification. This procedure reduces the dimensionality of the data set and also offers a pre-processing technique prior to analysis by another classifier.

  19. Non-Imaging Software/Data Analysis Requirements

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The analysis software needs of the non-imaging planetary data user are discussed. Assumptions as to the nature of the planetary science data centers where the data are physically stored are advanced, the scope of the non-imaging data is outlined, and facilities that users are likely to need to define and access data are identified. Data manipulation and analysis needs and display graphics are discussed.

  20. Segmentation and learning in the quantitative analysis of microscopy images

    NASA Astrophysics Data System (ADS)

    Ruggiero, Christy; Ross, Amy; Porter, Reid

    2015-02-01

    In material science and bio-medical domains the quantity and quality of microscopy images is rapidly increasing and there is a great need to automatically detect, delineate and quantify particles, grains, cells, neurons and other functional "objects" within these images. These are challenging problems for image processing because of the variability in object appearance that inevitably arises in real world image acquisition and analysis. One of the most promising (and practical) ways to address these challenges is interactive image segmentation. These algorithms are designed to incorporate input from a human operator to tailor the segmentation method to the image at hand. Interactive image segmentation is now a key tool in a wide range of applications in microscopy and elsewhere. Historically, interactive image segmentation algorithms have tailored segmentation on an image-by-image basis, and information derived from operator input is not transferred between images. But recently there has been increasing interest to use machine learning in segmentation to provide interactive tools that accumulate and learn from the operator input over longer periods of time. These new learning algorithms reduce the need for operator input over time, and can potentially provide a more dynamic balance between customization and automation for different applications. This paper reviews the state of the art in this area, provides a unified view of these algorithms, and compares the segmentation performance of various design choices.

  1. Analysis on correlation imaging based on fractal interpolation

    NASA Astrophysics Data System (ADS)

    Li, Bailing; Zhang, Wenwen; Chen, Qian; Gu, Guohua

    2015-10-01

    A fractal interpolation algorithm is discussed in detail, and the statistical self-similarity characteristics of the light field in a correlation experiment are analyzed. For correlation imaging experiments under low sampling frequency, an image analysis approach based on the fractal interpolation algorithm is proposed. This approach aims to improve the resolution of the original image, which contains a small number of pixels, and to highlight its fuzzy contour features. Using this method, a new model of the light field is established. For different moments of the intensity in the receiving plane, a local field division is also established, and the iterated function system based on the experimental data set can then be obtained by choosing an appropriate compression ratio under a scientific error estimate. On the basis of the iterated function system, an explicit fractal interpolation function expression is given in this paper. The simulation results show that the correlation image reconstructed by fractal interpolation is a good approximation to the original image, and the number of pixels after interpolation is significantly increased. This method effectively addresses the problem of pixel deficiency and significantly improves object outlines in the image. The deviation rate is adopted as a parameter to evaluate the effect of the algorithm objectively. In summary, the fractal interpolation method proposed in this paper not only preserves the overall image but also increases the local information of the original image.

  2. The ImageJ ecosystem: An open platform for biomedical image analysis.

    PubMed

    Schindelin, Johannes; Rueden, Curtis T; Hiner, Mark C; Eliceiri, Kevin W

    2015-01-01

    Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available-from commercial to academic, special-purpose to Swiss army knife, small to large-but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on the life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts the life sciences, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368

  3. Image Segmentation Analysis for NASA Earth Science Applications

    NASA Technical Reports Server (NTRS)

    Tilton, James C.

    2010-01-01

    NASA collects large volumes of imagery data from satellite-based Earth remote sensing sensors. Nearly all of the computerized image analysis of this data is performed pixel-by-pixel, in which an algorithm is applied directly to individual image pixels. While this analysis approach is satisfactory in many cases, it is usually not fully effective in extracting the full information content from the high spatial resolution image data that is now becoming increasingly available from these sensors. The field of object-based image analysis (OBIA) has arisen in recent years to address the need to move beyond pixel-based analysis. The Recursive Hierarchical Segmentation (RHSEG) software developed by the author is being used to facilitate moving from pixel-based image analysis to OBIA. The key unique aspect of RHSEG is that it tightly intertwines region growing segmentation, which produces spatially connected region objects, with region object classification, which groups sets of region objects together into region classes. No other practical, operational image segmentation approach has this tight integration of region growing object finding with region classification. This integration is made possible by the recursive, divide-and-conquer implementation utilized by RHSEG, in which the input image data is recursively subdivided until the image data sections are small enough to successfully mitigate the combinatorial explosion caused by the need to compute the dissimilarity between each pair of image pixels. RHSEG's tight integration of region growing object finding and region classification is what enables the high spatial fidelity of the image segmentations produced by RHSEG. This presentation will provide an overview of the RHSEG algorithm and describe how it is currently being used to support OBIA for Earth Science applications such as snow/ice mapping and finding archaeological sites from remotely sensed data.

  4. Hyperspectral image analysis using artificial color

    NASA Astrophysics Data System (ADS)

    Fu, Jian; Caulfield, H. John; Wu, Dongsheng; Tadesse, Wubishet

    2010-03-01

    By definition, HSC (HyperSpectral Camera) images are much richer in spectral data than, say, a COTS (Commercial-Off-The-Shelf) color camera. But data are not information. If we do the task right, useful information can be derived from the data in HSC images. Nature faced essentially the identical problem. The incident light is so complex spectrally that measuring it with high resolution would provide far more data than animals can handle in real time. Nature's solution was to do irreversible POCS (Projections Onto Convex Sets) to achieve huge reductions in data with minimal reduction in information. Thus we can arrange for our manmade systems to do what nature did - project the HSC image onto two or more broad, overlapping curves. The task we have undertaken in the last few years is to develop this idea that we call Artificial Color. What we report here is the use of the measured HSC image data projected onto two or three convex, overlapping, broad curves in analogy with the sensitivity curves of human cone cells. Testing two quite different HSC images in that manner produced the desired result: good discrimination or segmentation that can be done very simply and hence is likely to be doable in real time with specialized computers. Using POCS on the HSC data to reduce the processing complexity produced excellent discrimination in those two cases. For technical reasons discussed here, the figures of merit for the kind of pattern recognition we use are incommensurate with the figures of merit of conventional pattern recognition. We used some force fitting to make a comparison nevertheless, because it shows what is also obvious qualitatively. In our tasks our method works better.
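
    The projection itself is a simple inner product per channel; a minimal sketch with invented Gaussian response curves standing in for the broad overlapping curves used by the authors:

```python
# Project a hyperspectral pixel onto two broad, overlapping response curves,
# in analogy with cone sensitivities. Band grid and curves are hypothetical.
import numpy as np

wavelengths = np.linspace(400, 1000, 120)             # nm, placeholder band centers
curve_a = np.exp(-((wavelengths - 550) / 120) ** 2)   # broad, overlapping responses
curve_b = np.exp(-((wavelengths - 750) / 120) ** 2)

spectrum = np.random.default_rng(2).random(120)       # one hyperspectral pixel
a, b = spectrum @ curve_a, spectrum @ curve_b         # two "artificial color" channels
print(a, b)   # the spectral cube collapses to two channels for fast segmentation
```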

  5. Analysis of Multipath Pixels in SAR Images

    NASA Astrophysics Data System (ADS)

    Zhao, J. W.; Wu, J. C.; Ding, X. L.; Zhang, L.; Hu, F. M.

    2016-06-01

    As the received radar signal is the sum of signal contributions overlaid in one single pixel regardless of the travel path, the multipath effect should be seriously tackled: multiple-bounce returns are added to direct-scatter echoes, which leads to ghost scatterers. Most existing solutions to the multipath problem attempt to recover the signal propagation path. To facilitate the signal propagation simulation process, many aspects must be given in advance, such as the sensor parameters, the geometry of the objects (shape, location, orientation, mutual position between adjacent buildings) and the physical parameters of the surface (roughness, correlation length, permittivity), which determine the strength of the radar signal backscattered to the SAR sensor. However, it is not practical to obtain a highly detailed object model of an unfamiliar area by field survey, as this is laborious and time-consuming. In this paper, SAR imaging simulation based on RaySAR is conducted first, aiming at a basic understanding of multipath effects and at further comparison. Besides the pre-imaging simulation, the post-imaging product, i.e. the radar images, is also taken into consideration. Both Cosmo-SkyMed ascending and descending SAR images of Lupu Bridge in Shanghai are used for the experiment. As a result, the reflectivity map and signal distribution map of different bounce levels are simulated and validated against a 3D real model. Statistical indexes such as phase stability, mean amplitude, amplitude dispersion, coherence and mean-sigma ratio in case of layover are analyzed in combination with the RaySAR output.

  6. Iterative weighted risk estimation for nonlinear image restoration with analysis priors

    NASA Astrophysics Data System (ADS)

    Ramani, Sathish; Rosen, Jeffrey; Liu, Zhihao; Fessler, Jeffrey A.

    2012-03-01

    Image acquisition systems invariably introduce blur, which necessitates the use of deblurring algorithms for image restoration. Restoration techniques involving regularization require appropriate selection of the regularization parameter that controls the quality of the restored result. We focus on the problem of automatic adjustment of this parameter for nonlinear image restoration using analysis-type regularizers such as total variation (TV). For this purpose, we use two variants of Stein's unbiased risk estimate (SURE), Predicted-SURE and Projected-SURE, that are applicable for parameter selection in inverse problems involving Gaussian noise. These estimates require the Jacobian matrix of the restoration algorithm evaluated with respect to the data. We derive analytical expressions to recursively update the desired Jacobian matrix for a fast variant of the iterative reweighted least-squares restoration algorithm that can accommodate a variety of regularization criteria. Our method can also be used to compute a nonlinear version of the generalized cross-validation (NGCV) measure for parameter tuning. We demonstrate using simulations that Predicted-SURE, Projected-SURE, and NGCV-based adjustment of the regularization parameter yields near-MSE-optimal results for image restoration using TV, an analysis-type ℓ1-regularization, and a smooth convex edge-preserving regularizer.

  7. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

    In the context of an explosive increase in value of the damage caused by natural disasters, an alarming challenge in the third millennium is the rapid growth of urban population in vulnerable areas. Cities are, by definition, very fragile socio-ecological systems with a high level of vulnerability when it comes to environmental changes and that are responsible for important transformations of the space, determining dysfunctions shown in the state of the natural variables (Parker and Mitchell, 1995, The OFDA/CRED International Disaster Database). A contributing factor is the demographic dynamic that affects urban areas. The aim of this study is to estimate the overall vulnerability of the urban area of Bucharest in the context of the seismic hazard, by using environmental, socio-economic, and physical measurable variables in the framework of a spatial multi-criteria analysis. For this approach the capital city of Romania was chosen based on its high vulnerability due to the explosive urban development and the advanced state of degradation of the buildings (most of the building stock being built between 1940 and 1977). Combining these attributes with the seismic hazard induced by the Vrancea source, Bucharest was ranked as the 10th capital city worldwide in the terms of seismic risk. Over 40 years of experience in the natural risk field shows that the only directly accessible way to reduce the natural risk is by reducing the vulnerability of the space (Adger et al., 2001, Turner et al., 2003; UN/ISDR, 2004, Dayton-Johnson, 2004, Kasperson et al., 2005; Birkmann, 2006 etc.). In effect, reducing the vulnerability of urban spaces would imply lower costs produced by natural disasters. By applying the SMCA method, the result reveals a circular pattern, signaling as hot spots the Bucharest historic centre (located on a river terrace and with aged building stock) and peripheral areas (isolated from the emergency centers and defined by precarious social and economic

  8. Independent component analysis applications on THz sensing and imaging

    NASA Astrophysics Data System (ADS)

    Balci, Soner; Maleski, Alexander; Nascimento, Matheus Mello; Philip, Elizabath; Kim, Ju-Hyung; Kung, Patrick; Kim, Seongsin M.

    2016-05-01

    We report an Independent Component Analysis (ICA) technique applied to THz spectroscopy and imaging to achieve blind source separation. A reference water vapor absorption spectrum was extracted via ICA, then ICA was utilized on a THz spectroscopic image in order to remove the absorption of water molecules from each pixel. For this purpose, silica gel was chosen as the material of interest for its strong water absorption. The resulting image clearly showed that ICA effectively removed the water content in the detected signal, allowing us to image the silica gel beads distinctly even though they were totally embedded in water before ICA was applied.
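
    A minimal FastICA sketch on synthetic one-dimensional mixtures, mirroring the separation of a shared water-vapor component from a sample response; the signals are invented stand-ins:

```python
# Blind source separation of two synthetic mixtures with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 1, 500)
water = np.sin(40 * t) ** 2                      # stand-in water-vapor absorption
sample = np.exp(-((t - 0.5) / 0.1) ** 2)         # stand-in silica-gel response

# Two observed mixtures (e.g. two pixels with different water content).
mixes = np.column_stack([0.7 * water + 0.3 * sample,
                         0.4 * water + 0.6 * sample])
sources = FastICA(n_components=2, random_state=0).fit_transform(mixes)
# Each column of `sources` recovers one component up to scale and sign.
```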

  9. Synthetic aperture sonar imaging using joint time-frequency analysis

    NASA Astrophysics Data System (ADS)

    Wang, Genyuan; Xia, Xiang-Gen

    1999-03-01

    The non-ideal motion of the hydrophone usually induces the aperture error of the synthetic aperture sonar (SAS), which is one of the most important factors degrading the SAS imaging quality. In the SAS imaging, the return signals are usually nonstationary due to the non-ideal hydrophone motion. In this paper, joint time-frequency analysis (JTFA), as a good technique for analyzing nonstationary signals, is used in the SAS imaging. Based on the JTFA of the sonar return signals, a novel SAS imaging algorithm is proposed. The algorithm is verified by simulation examples.
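
    The abstract does not prescribe a particular transform; as one common JTFA choice, a short-time Fourier transform applied to a synthetic nonstationary return:

```python
# Short-time Fourier transform of a chirp-like return whose frequency drifts,
# as it would under non-ideal hydrophone motion.
import numpy as np
from scipy.signal import stft

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
chirp = np.cos(2 * np.pi * (50 * t + 100 * t**2))   # instantaneous frequency drifts

f, tau, Z = stft(chirp, fs=fs, nperseg=128)
print(Z.shape)   # frequency bins x time frames; |Z| localizes the drift in time
```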

  10. Fiji - an Open Source platform for biological image analysis

    PubMed Central

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2013-01-01

    Fiji is a distribution of the popular Open Source software ImageJ focused on biological image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image processing algorithms. Fiji facilitates the transformation of novel algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities. PMID:22743772

  11. Guidance on priority setting in health care (GPS-Health): the inclusion of equity criteria not captured by cost-effectiveness analysis

    PubMed Central

    2014-01-01

    This Guidance for Priority Setting in Health Care (GPS-Health), initiated by the World Health Organization, offers a comprehensive map of equity criteria that are relevant to health care priority setting and should be considered in addition to cost-effectiveness analysis. The guidance, in the form of a checklist, is especially targeted at decision makers who set priorities at national and sub-national levels, and those who interpret findings from cost-effectiveness analysis. It is also targeted at researchers conducting cost-effectiveness analysis to improve reporting of their results in the light of these other criteria. The guidance was developed through a series of expert consultation meetings and involved three steps: i) methods and normative concepts were identified through a systematic review; ii) the review findings were critically assessed in the expert consultation meetings, which resulted in a draft checklist of normative criteria; iii) the checklist was validated through an extensive hearing process with input from a range of relevant stakeholders. The GPS-Health incorporates criteria related to the disease an intervention targets (severity of disease, capacity to benefit, and past health loss); characteristics of social groups an intervention targets (socioeconomic status, area of living, gender, race, ethnicity, religion and sexual orientation); and non-health consequences of an intervention (financial protection, economic productivity, and care for others). PMID:25246855

  12. Parameter-Based Performance Analysis of Object-Based Image Analysis Using Aerial and Quickbird-2 Images

    NASA Astrophysics Data System (ADS)

    Kavzoglu, T.; Yildiz, M.

    2014-09-01

    Opening new possibilities for research, very high resolution (VHR) imagery acquired by recent commercial satellites and aerial systems requires advanced approaches and techniques that can handle large volumes of data with high local variance. Delineation of land use/cover information from VHR images is a hot research topic in remote sensing. In recent years, object-based image analysis (OBIA) has become a popular solution for image analysis tasks as it considers shape, texture and content information associated with the image objects. The most important stage of OBIA is the image segmentation process applied prior to classification. Determination of optimal segmentation parameters is of crucial importance for the performance of the selected classifier. In this study, the effectiveness and applicability of the segmentation method in relation to its parameters was analysed using two VHR images, an aerial photo and a Quickbird-2 image. The multi-resolution segmentation technique was employed with its optimal parameters of scale, shape and compactness, which were defined after an extensive trial process on the data sets. A nearest neighbour classifier was applied on the segmented images, and then accuracy assessment was applied. Results show that segmentation parameters have a direct effect on the classification accuracy, and low values of scale-shape combinations produce the highest classification accuracies. Also, the compactness parameter was found to have minimal effect on the construction of image objects; hence it can be set to a constant value in image classification.

  13. Eliciting and Combining Decision Criteria Using a Limited Palette of Utility Functions and Uncertainty Distributions: Illustrated by Application to Pest Risk Analysis.

    PubMed

    Holt, Johnson; Leach, Adrian W; Schrader, Gritta; Petter, Françoise; Macleod, Alan; van der Gaag, Dirk Jan; Baker, Richard H A; Mumford, John D

    2013-07-01

    Utility functions in the form of tables or matrices have often been used to combine discretely rated decision-making criteria. Matrix elements are usually specified individually, so no one rule or principle can be easily stated for the utility function as a whole. A series of five matrices are presented that aggregate criteria two at a time using simple rules that express a varying degree of constraint of the lower rating over the higher. A further nine possible matrices were obtained by using a different rule either side of the main axis of the matrix to describe situations where the criteria have a differential influence on the outcome. Uncertainties in the criteria are represented by three alternative frequency distributions from which the assessors select the most appropriate. The output of the utility function is a distribution of rating frequencies that is dependent on the distributions of the input criteria. In pest risk analysis (PRA), seven of these utility functions were required to mimic the logic by which assessors for the European and Mediterranean Plant Protection Organization arrive at an overall rating of pest risk. The framework enables the development of PRAs that are consistent and easy to understand, criticize, compare, and change. When tested in workshops, PRA practitioners thought that the approach accorded with both the logic and the level of resolution that they used in the risk assessments. PMID:23834916
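
    A minimal sketch of one such pairwise aggregation rule, expressed as a rating matrix in which the lower rating constrains the outcome; the scale and rule choice are illustrative, not the published matrices:

```python
# Combine two ordinally rated criteria through a "minimum" rule matrix:
# element [i, j] = min(i, j), so the lower rating fully constrains the result.
import numpy as np

RATINGS = ["very low", "low", "high", "very high"]
matrix = np.minimum.outer(np.arange(4), np.arange(4))

def combine(r1, r2):
    return RATINGS[matrix[RATINGS.index(r1), RATINGS.index(r2)]]

print(combine("low", "very high"))   # -> "low"
# A weaker constraint rule (e.g. rounding the mean of the two indices) would
# fill the matrix differently, and an asymmetric matrix captures criteria
# with unequal influence, as described above.
```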

  14. Texture analysis on MRI images of non-Hodgkin lymphoma.

    PubMed

    Harrison, L; Dastidar, P; Eskola, H; Järvenpää, R; Pertovaara, H; Luukkaala, T; Kellokumpu-Lehtinen, P-L; Soimakallio, S

    2008-04-01

    The aim here is to show that texture parameters of magnetic resonance imaging (MRI) data change in lymphoma tissue during chemotherapy. Ten patients having non-Hodgkin lymphoma masses in the abdomen were imaged for chemotherapy response evaluation three consecutive times. The analysis was performed with the MaZda texture analysis (TA) application. The best discrimination in lymphoma MRI texture was obtained within T2-weighted images between the pre-treatment and the second response evaluation stage. TA proved to be a promising quantitative means of representing lymphoma tissue changes during medication follow-up. PMID:18342845
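
    A sketch of representative co-occurrence texture features of the kind MaZda computes, via scikit-image (recent versions spell these functions graycomatrix/graycoprops) on a random placeholder region of interest:

```python
# Gray-level co-occurrence matrix (GLCM) texture features for an 8-bit ROI.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

roi = np.random.default_rng(4).integers(0, 256, (64, 64), dtype=np.uint8)
glcm = graycomatrix(roi, distances=[1], angles=[0],
                    levels=256, symmetric=True, normed=True)
for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, graycoprops(glcm, prop)[0, 0])
# Tracking such features across imaging sessions quantifies tissue change.
```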

  15. The Land Analysis System (LAS) for multispectral image processing

    USGS Publications Warehouse

    Wharton, S. W.; Lu, Y. C.; Quirk, Bruce K.; Oleson, Lyndon R.; Newcomer, J. A.; Irani, Frederick M.

    1988-01-01

    The Land Analysis System (LAS) is an interactive software system available in the public domain for the analysis, display, and management of multispectral and other digital image data. LAS provides over 240 applications functions and utilities, a flexible user interface, complete online and hard-copy documentation, extensive image-data file management, reformatting, conversion utilities, and high-level device independent access to image display hardware. The authors summarize the capabilities of the current release of LAS (version 4.0) and discuss plans for future development. Particular emphasis is given to the issue of system portability and the importance of removing and/or isolating hardware and software dependencies.

  16. (Hyper)-graphical models in biomedical image analysis.

    PubMed

    Paragios, Nikos; Ferrante, Enzo; Glocker, Ben; Komodakis, Nikos; Parisot, Sarah; Zacharaki, Evangelia I

    2016-10-01

    Computational vision, visual computing and biomedical image analysis have made tremendous progress over the past two decades. This is mostly due to the development of efficient learning and inference algorithms which allow better and richer modeling of image and visual understanding tasks. Hyper-graph representations are among the most prominent tools for such perception tasks, which they address by casting perception as a graph optimization problem. In this paper, we briefly introduce the importance of such representations, discuss their strengths and limitations, provide appropriate strategies for their inference, and present their application to a variety of problems in biomedical image analysis. PMID:27377331
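
    As a hedged illustration of the classic pairwise special case of this idea (binary segmentation as an s-t min-cut, far simpler than the hypergraph models the paper surveys; image, costs, and smoothness weight below are placeholders), a Python sketch using networkx:

        import networkx as nx
        import numpy as np

        # Binary MRF segmentation as min-cut: quadratic unary costs as
        # terminal edges, a Potts smoothness term as neighbour edges.
        rng = np.random.default_rng(8)
        img = np.clip(rng.normal(0.3, 0.15, (20, 20)), 0, 1)
        img[5:15, 5:15] += 0.4                  # bright foreground square

        g, lam = nx.DiGraph(), 0.3              # lam = smoothness weight
        for (i, j), v in np.ndenumerate(img):
            p = (i, j)
            g.add_edge("s", p, capacity=(v - 1.0) ** 2)  # cost of label 1
            g.add_edge(p, "t", capacity=(v - 0.0) ** 2)  # cost of label 0
            for q in ((i + 1, j), (i, j + 1)):           # 4-neighbourhood
                if q[0] < img.shape[0] and q[1] < img.shape[1]:
                    g.add_edge(p, q, capacity=lam)
                    g.add_edge(q, p, capacity=lam)

        cut_value, (src_side, sink_side) = nx.minimum_cut(g, "s", "t")
        labels = np.zeros(img.shape, dtype=int)
        for p in sink_side - {"t"}:
            labels[p] = 1                       # sink side takes label 1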

  17. Infrared thermal facial image sequence registration analysis and verification

    NASA Astrophysics Data System (ADS)

    Chen, Chieh-Li; Jian, Bo-Lin

    2015-03-01

    To study the emotional responses of subjects to the International Affective Picture System (IAPS), infrared thermal facial image sequences are preprocessed for registration before further analysis, so that the variance caused by minor and irregular subject movements is reduced. Without affecting the comfort level, and inducing minimal harm, this study proposes an infrared thermal facial image sequence registration process that reduces the deviations caused by unconscious head movements of the subjects. A fixed reference image for registration is produced through localization of the centroid of the eye region followed by image translation and rotation. The thermal image sequence is then automatically registered using the proposed two-stage genetic algorithm. The deviation before and after image registration is quantified by image quality indices. The results show that the proposed infrared thermal image sequence registration process localizes facial images accurately, which will benefit the correlation analysis of psychological information related to the facial area.

  18. Pathology imaging informatics for quantitative analysis of whole-slide images

    PubMed Central

    Kothari, Sonal; Phan, John H; Stokes, Todd H; Wang, May D

    2013-01-01

    Objectives With the objective of bringing clinical decision support systems to reality, this article reviews histopathological whole-slide imaging informatics methods, associated challenges, and future research opportunities. Target audience This review targets pathologists and informaticians who have a limited understanding of the key aspects of whole-slide image (WSI) analysis and/or a limited knowledge of state-of-the-art technologies and analysis methods. Scope First, we discuss the importance of imaging informatics in pathology and highlight the challenges posed by histopathological WSI. Next, we provide a thorough review of current methods for: quality control of histopathological images; feature extraction that captures image properties at the pixel, object, and semantic levels; predictive modeling that utilizes image features for diagnostic or prognostic applications; and data and information visualization that explores WSI for de novo discovery. In addition, we highlight future research directions and discuss the impact of large public repositories of histopathological data, such as the Cancer Genome Atlas, on the field of pathology informatics. Following the review, we present a case study to illustrate a clinical decision support system that begins with quality control and ends with predictive modeling for several cancer endpoints. Currently, state-of-the-art software tools only provide limited image processing capabilities instead of complete data analysis for clinical decision-making. We aim to inspire researchers to conduct more research in pathology imaging informatics so that clinical decision support can become a reality. PMID:23959844

  19. Cloud based toolbox for image analysis, processing and reconstruction tasks.

    PubMed

    Bednarz, Tomasz; Wang, Dadong; Arzhaeva, Yulia; Lagerstrom, Ryan; Vallotton, Pascal; Burdett, Neil; Khassapov, Alex; Szul, Piotr; Chen, Shiping; Sun, Changming; Domanski, Luke; Thompson, Darren; Gureyev, Timur; Taylor, John A

    2015-01-01

    This chapter describes a novel way of carrying out image analysis, reconstruction and processing tasks using a cloud-based service provided on the Australian National eResearch Collaboration Tools and Resources (NeCTAR) infrastructure. The toolbox allows users free access to a wide range of useful blocks of functionality (imaging functions) that can be connected together in workflows, allowing the creation of more complex algorithms that can be re-run on different data sets, shared with others, or further adjusted. The functions provided cover cellular imaging, advanced X-ray image analysis, computed tomography, and 3D medical imaging and visualisation. The service is currently available at www.cloudimaging.net.au. PMID:25381109

  20. Segmented infrared image analysis for rotating machinery fault diagnosis

    NASA Astrophysics Data System (ADS)

    Duan, Lixiang; Yao, Mingchao; Wang, Jinjiang; Bai, Tangbo; Zhang, Laibin

    2016-07-01

    As a noncontact and non-intrusive technique, infrared image analysis is promising for machinery defect diagnosis. However, the weak information content and strong noise in infrared images limit its performance. To address this issue, this paper presents an image segmentation approach to enhance feature extraction in infrared image analysis. A region selection criterion named the dispersion degree is also formulated to discriminate fault-representative regions from unrelated background information. Feature extraction and fusion methods are then applied to obtain features from the selected regions for further diagnosis. Experimental studies on a rotor fault simulator demonstrate that the presented segmented feature enhancement approach outperforms analysis of the original image using both a Naïve Bayes classifier and a support vector machine.
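
    The dispersion degree formula itself is not reproduced in this record; as a loosely analogous stand-in (entirely an assumption), regions of a segmented infrared image can be ranked by a simple variance-to-mean statistic:

        import numpy as np

        def region_scores(image, labels):
            """Score each segmented region by a dispersion-like statistic
            (intensity variance over mean); higher-scoring regions are
            treated as more fault-representative.  This is a hypothetical
            stand-in for the paper's dispersion degree criterion."""
            scores = {}
            for lab in np.unique(labels):
                vals = image[labels == lab].astype(float)
                scores[lab] = vals.var() / (vals.mean() + 1e-9)
            return scores

        img = np.random.rand(120, 160)                       # stand-in image
        labels = (np.random.rand(120, 160) * 3).astype(int)  # toy segmentation
        print(region_scores(img, labels))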

  1. Image analysis and compression: renewed focus on texture

    NASA Astrophysics Data System (ADS)

    Pappas, Thrasyvoulos N.; Zujovic, Jana; Neuhoff, David L.

    2010-01-01

    We argue that a key to further advances in the fields of image analysis and compression is a better understanding of texture. We review a number of applications that critically depend on texture analysis, including image and video compression, content-based retrieval, visual to tactile image conversion, and multimodal interfaces. We introduce the idea of "structurally lossless" compression of visual data that allows significant differences between the original and decoded images, which may be perceptible when they are viewed side-by-side, but do not affect the overall quality of the image. We then discuss the development of objective texture similarity metrics, which allow substantial point-by-point deviations between textures that according to human judgment are essentially identical.

  2. PML diagnostic criteria

    PubMed Central

    Aksamit, Allen J.; Clifford, David B.; Davis, Larry; Koralnik, Igor J.; Sejvar, James J.; Bartt, Russell; Major, Eugene O.; Nath, Avindra

    2013-01-01

    Objective: To establish criteria for the diagnosis of progressive multifocal leukoencephalopathy (PML). Methods: We reviewed available literature to identify various diagnostic criteria employed. Several search strategies employing the terms “progressive multifocal leukoencephalopathy” with or without “JC virus” were performed with PubMed, SCOPUS, and EMBASE search engines. The articles were reviewed by a committee of individuals with expertise in the disorder in order to determine the most useful applicable criteria. Results: A consensus statement was developed employing clinical, imaging, pathologic, and virologic evidence in support of the diagnosis of PML. Two separate pathways, histopathologic and clinical, for PML diagnosis are proposed. Diagnostic classification includes certain, probable, possible, and not PML. Conclusion: Definitive diagnosis of PML requires neuropathologic demonstration of the typical histopathologic triad (demyelination, bizarre astrocytes, and enlarged oligodendroglial nuclei) coupled with the techniques to show the presence of JC virus. The presence of clinical and imaging manifestations consistent with the diagnosis and not better explained by other disorders coupled with the demonstration of JC virus by PCR in CSF is also considered diagnostic. Algorithms for establishing the diagnosis have been recommended. PMID:23568998

  3. Geostationary microwave imagers detection criteria

    NASA Technical Reports Server (NTRS)

    Stacey, J. M.

    1986-01-01

    Geostationary orbit is investigated as a vantage point from which to remotely sense the surface features of the planet and its atmosphere with microwave sensors. The geometrical relationships associated with geostationary altitude are developed to produce an efficient search pattern for the detection of emitting media and metal objects. Power transfer equations are derived from first principles and explain the expected values of the signal-to-clutter ratios for the detection of aircraft, ships, and buoys, and for the detection of natural features where they are manifested as cold and warm eddies. The transport of microwave power is described for modeled detection, where the direction of power flow is explained by the Zeroth and Second Laws of Thermodynamics. Mathematical expressions are derived that elucidate the detectability of natural emitting media and metal objects. Signal-to-clutter ratio comparisons are drawn among detectable objects that show relative detectability with a thermodynamic sensor and with a short-pulse radar.

  4. Automated fine structure image analysis method for discrimination of diabetic retinopathy stage using conjunctival microvasculature images

    PubMed Central

    Khansari, Maziyar M; O’Neill, William; Penn, Richard; Chau, Felix; Blair, Norman P; Shahidi, Mahnaz

    2016-01-01

    The conjunctiva is a densely vascularized mucus membrane covering the sclera of the eye with a unique advantage of accessibility for direct visualization and non-invasive imaging. The purpose of this study is to apply an automated quantitative method for discrimination of different stages of diabetic retinopathy (DR) using conjunctival microvasculature images. Fine structural analysis of conjunctival microvasculature images was performed by ordinary least square regression and Fisher linear discriminant analysis. Conjunctival images between groups of non-diabetic and diabetic subjects at different stages of DR were discriminated. The automated method’s discriminate rates were higher than those determined by human observers. The method allowed sensitive and rapid discrimination by assessment of conjunctival microvasculature images and can be potentially useful for DR screening and monitoring. PMID:27446692
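
    The record names ordinary least squares regression and Fisher linear discriminant analysis; a minimal hedged sketch of the discrimination step with scikit-learn (the vessel feature extraction itself is not reproduced, and the data below are placeholders):

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        # X: one row of microvasculature features per conjunctival image;
        # y: DR stage label per image (hypothetical placeholder data).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(90, 6))
        y = np.repeat([0, 1, 2], 30)       # e.g. no DR / mild / severe

        lda = LinearDiscriminantAnalysis()
        scores = cross_val_score(lda, X, y, cv=5)
        print("mean discrimination accuracy:", scores.mean())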

  6. Multispectral image analysis for algal biomass quantification.

    PubMed

    Murphy, Thomas E; Macon, Keith; Berberoglu, Halil

    2013-01-01

    This article reports a novel multispectral image processing technique for rapid, noninvasive quantification of biomass concentration in attached and suspended algae cultures. Monitoring the biomass concentration is critical for efficient production of biofuel feedstocks, food supplements, and bioactive chemicals. Particularly, noninvasive and rapid detection techniques can significantly aid in providing delay-free process control feedback in large-scale cultivation platforms. In this technique, three-band spectral images of Anabaena variabilis cultures were acquired and separated into their red, green, and blue components. A correlation between the magnitude of the green component and the areal biomass concentration was generated. The correlation predicted the biomass concentrations of independently prepared attached and suspended cultures with errors of 7 and 15%, respectively, and the effect of varying lighting conditions and background color were investigated. This method can provide necessary feedback for dilution and harvesting strategies to maximize photosynthetic conversion efficiency in large-scale operation. PMID:23554374
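
    A hedged sketch of the green-channel idea (the calibration coefficients below are placeholders; the actual correlation must be fitted to images of cultures with known biomass):

        import numpy as np

        def mean_green(rgb_image):
            """Average green-channel magnitude over the culture area."""
            return rgb_image[..., 1].astype(float).mean()

        # Hypothetical linear calibration against areal biomass (g/m^2),
        # fitted beforehand from reference cultures.
        def biomass_from_image(rgb_image, slope=-0.12, intercept=28.0):
            return slope * mean_green(rgb_image) + intercept

        img = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
        print(biomass_from_image(img))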

  7. A criticism of applications with multi-criteria decision analysis that are used for the site selection for the disposal of municipal solid wastes

    SciTech Connect

    Kemal Korucu, M.; Erdagi, Bora

    2012-12-15

    Highlights: • The existing structure of multi-criteria decision analysis for site selection is criticized. • Fundamental problematic points based on this critique are defined. • Modifications are suggested in order to solve these problematic points. • A new structure for the decision-making mechanism is proposed. • The feasibility of the new method is evaluated. - Abstract: The main aim of this study is to criticize the process of selecting the most appropriate site for the disposal of municipal solid wastes, one of the problematic issues of waste management operations. Such problems are symptoms of the existing problematic human-nature relationship associated with the syndrome called the ecological crisis. In this regard, solving the site selection problem, which is just a small part of a larger whole, for the good of ecological rationality and social justice is only possible by founding a new and comprehensive type of human-nature relationship. In this study, the existing structure of applications using multi-criteria decision analysis with three main criteria for site selection is criticized as a problematic point in the discussions on ecological problems. Based on this critique, fundamental problematic points, to which current applications offer no solutions, are defined. Next, modifications are suggested in order to resolve these problematic points. Finally, the criticism addressed to the three-criteria structure and the feasibility of the new four-criteria method are evaluated. As a result, it is emphasized that the new structure with four main criteria may be effective in solving the fundamental problematic points.

  8. Computerized microscopic image analysis of follicular lymphoma

    NASA Astrophysics Data System (ADS)

    Sertel, Olcay; Kong, Jun; Lozanski, Gerard; Catalyurek, Umit; Saltz, Joel H.; Gurcan, Metin N.

    2008-03-01

    Follicular Lymphoma (FL) is a cancer arising from the lymphatic system. Originating from follicle center B cells, FL is mainly comprised of centrocytes (usually middle-to-small sized cells) and centroblasts (relatively large malignant cells). According to the World Health Organization's recommendations, there are three histological grades of FL characterized by the number of centroblasts per high-power field (hpf) of area 0.159 mm2. In current practice, these cells are manually counted from ten representative fields of follicles after visual examination of hematoxylin and eosin (H&E) stained slides by pathologists. Several studies clearly demonstrate the poor reproducibility of this grading system with very low inter-reader agreement. In this study, we are developing a computerized system to assist pathologists with this process. A hybrid approach that combines information from several slides with different stains has been developed. Thus, follicles are first detected from digitized microscopy images with immunohistochemistry (IHC) stains, (i.e., CD10 and CD20). The average sensitivity and specificity of the follicle detection tested on 30 images at 2×, 4× and 8× magnifications are 85.5+/-9.8% and 92.5+/-4.0%, respectively. Since the centroblasts detection is carried out in the H&E-stained slides, the follicles in the IHC-stained images are mapped to H&E-stained counterparts. To evaluate the centroblast differentiation capabilities of the system, 11 hpf images have been marked by an experienced pathologist who identified 41 centroblast cells and 53 non-centroblast cells. A non-supervised clustering process differentiates the centroblast cells from noncentroblast cells, resulting in 92.68% sensitivity and 90.57% specificity.

  9. Measurement and analysis of image sensors

    NASA Astrophysics Data System (ADS)

    Vitek, Stanislav

    2005-06-01

    Astronomical applications require high precision in sensing and processing image data. At present, large CCD sensors are used for various reasons. Before CCD sensors can be replaced with CMOS sensing devices, it is important to know the transfer characteristics of the CCD sensors in use. For special applications such as robotic telescopes (fully automatic, without human interaction), specially designed smart sensors, which integrate more functions and offer more features than CCDs, appear to be a good choice.

  10. Analysis of pregerminated barley using hyperspectral image analysis.

    PubMed

    Arngren, Morten; Hansen, Per Waaben; Eriksen, Birger; Larsen, Jan; Larsen, Rasmus

    2011-11-01

    Pregermination is one of many serious degradations of barley used for malting. Under certain conditions a pregerminated barley kernel cannot regerminate and is reduced to animal feed of lower quality. Identifying pregermination at an early stage is therefore essential in order to segregate barley kernels into low or high quality. Current standard methods to quantify pregerminated barley include visual approaches, e.g. identifying the root sprout, or an embryo staining method, which is time-consuming. We present an approach using a near-infrared (NIR) hyperspectral imaging system in a mathematical modeling framework to identify pregerminated barley at an early stage of approximately 12 h of pregermination. Our model assigns pregermination as the sole cause of a single kernel's lack of germination and is unable to identify dormancy, kernel damage, etc. The analysis is based on more than 750 Rosalina barley kernels pregerminated for 8 different durations between 0 and 60 h based on the BRF method. Regerminating the kernels reveals a grouping of the pregerminated kernels into three categories: normal, delayed and limited germination. Our model employs a supervised classification framework based on a set of extracted features insensitive to kernel orientation. An out-of-sample classification error of 32% (CI(95%): 29-35%) is obtained for single kernels when grouped into the three categories, and an error of 3% (CI(95%): 0-15%) is achieved at the bulk kernel level. The model provides class probabilities for each kernel, which can assist in achieving homogeneous germination profiles. This research can be developed further to establish an automated and faster procedure as an alternative to the standard procedures for pregerminated barley. PMID:21932866

  11. Multispectral/hyperspectral image enhancement for biological cell analysis

    SciTech Connect

    Nuffer, Lisa L.; Medvick, Patricia A.; Foote, Harlan P.; Solinsky, James C.

    2006-08-01

    The paper shows new techniques for analyzing cell images taken with a microscope using multiple filters to form a datacube of spectral image planes. Because of the many neighboring spectral samples, much of the datacube appears as redundant, similar tissue. The analysis is based on the nonGaussian statistics of the image data, allowing for remapping of the data into image components that are dissimilar, and hence isolate subtle, spatial object regions of interest in the tissues. This individual component image set can be recombined into a single RGB color image useful in real-time location of regions of interest. The algorithms are susceptible to parallelization using Field Programmable Gate Array hardware processing.
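
    The record describes remapping spectral planes into mutually dissimilar components using non-Gaussian statistics but does not name the algorithm; independent component analysis is one standard technique of exactly that kind, sketched here with scikit-learn as an assumption:

        import numpy as np
        from sklearn.decomposition import FastICA

        # datacube: (rows, cols, bands) multispectral cell image
        rows, cols, bands = 128, 128, 10
        datacube = np.random.rand(rows, cols, bands)   # stand-in data

        X = datacube.reshape(-1, bands)                # pixels x bands
        ica = FastICA(n_components=3, random_state=0)
        components = ica.fit_transform(X)              # dissimilar images

        # Rescale the three components to [0, 1] and stack as an RGB
        # image for real-time inspection of regions of interest.
        rgb = components.reshape(rows, cols, 3)
        rgb = (rgb - rgb.min()) / (rgb.max() - rgb.min())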

  12. Method for measuring anterior chamber volume by image analysis

    NASA Astrophysics Data System (ADS)

    Zhai, Gaoshou; Zhang, Junhong; Wang, Ruichang; Wang, Bingsong; Wang, Ningli

    2007-12-01

    Anterior chamber volume (ACV) is very important for an oculist making a rational pathological diagnosis for patients who have eye diseases such as glaucoma, yet it is difficult to measure accurately. In this paper, a method is devised to measure anterior chamber volume based on JPEG-formatted image files transformed from medical images acquired with an anterior-chamber optical coherence tomographer (AC-OCT) and the corresponding image-processing software. The algorithms for image analysis and ACV calculation are implemented in VC++, and a series of anterior chamber images of typical patients is analyzed; the calculated anterior chamber volumes are verified to be in accord with clinical observation. This shows that the measurement method is effective and feasible and has the potential to improve the accuracy of ACV calculation. Meanwhile, further measures should be taken to simplify the manual preprocessing of the images.
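
    A hedged sketch of the final volume computation (the paper's own reconstruction from AC-OCT slices may differ; the areas, slice spacing, and rectangular-rule integration below are assumptions):

        import numpy as np

        def chamber_volume(areas_mm2, spacing_mm):
            """Approximate ACV by integrating segmented cross-section
            areas of consecutive slices (rectangular rule)."""
            return float(np.sum(areas_mm2) * spacing_mm)

        print(chamber_volume([10.2, 11.8, 12.1, 11.5, 9.9], 0.25), "mm^3")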

  13. Seismoelectric beamforming imaging: a sensitivity analysis

    NASA Astrophysics Data System (ADS)

    El Khoury, P.; Revil, A.; Sava, P.

    2015-06-01

    The electrical current density generated by the propagation of a seismic wave at the interface characterized by a drop in electrical, hydraulic or mechanical properties produces an electrical field of electrokinetic nature. This field can be measured remotely with a signal-to-noise ratio depending on the background noise and signal attenuation. The seismoelectric beamforming approach is an emerging imaging technique based on scanning a porous material using appropriately delayed seismic sources. The idea is to focus the hydromechanical energy on a regular spatial grid and measure the converted electric field remotely at each focus time. This method can be used to image heterogeneities with a high definition and to provide structural information to classical geophysical methods. A numerical experiment is performed to investigate the resolution of the seismoelectric beamforming approach with respect to the main wavelength of the seismic waves. The 2-D model consists of a fictitious water-filled bucket in which a cylindrical sandstone core sample is set up vertically. The hydrophones/seismic sources are located on a 50-cm diameter circle in the bucket and the seismic energy is focused on the grid points in order to scan the medium and determine the geometry of the porous plug using the output electric potential image. We observe that the resolution of the method is given by a density of eight scanning points per wavelength. Additional numerical tests were also performed to see the impact of a wrong velocity model upon the seismoelectric map displaying the heterogeneities of the material.
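
    The focusing step amounts to firing each source with a delay chosen so that all wavefronts arrive at the scan point simultaneously; a minimal sketch (velocity and focus point are placeholders consistent with the record's 50-cm source circle):

        import numpy as np

        v = 1500.0                          # acoustic velocity in water (m/s)
        angles = np.linspace(0, 2 * np.pi, 16, endpoint=False)
        sources = 0.25 * np.c_[np.cos(angles), np.sin(angles)]  # 50 cm circle
        focus = np.array([0.05, -0.02])     # one grid point to scan

        dist = np.linalg.norm(sources - focus, axis=1)
        delays = (dist.max() - dist) / v    # fire the farthest source first
        # Repeating this for every grid point and recording the electric
        # potential at each focus time builds the seismoelectric map.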

  14. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that, unlike standard principal component analysis (PCA), produces components with sparse loadings; combined with the Hotelling T² statistical analysis method, it is used to compare, qualify, and detect faults in the tested systems.
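
    A hedged sketch of the sparse-PCA-plus-Hotelling-T² ingredient using scikit-learn's standard SparsePCA (the authors' modified SPCA is not public; the data, component count, and the uncorrelated-score simplification below are assumptions):

        import numpy as np
        from sklearn.decomposition import SparsePCA

        # Rows are scans, columns are the ~88 image quality metrics.
        rng = np.random.default_rng(1)
        baseline = rng.normal(size=(40, 88))   # metrics from reference scans
        test = rng.normal(size=(5, 88))        # metrics from scans under test

        spca = SparsePCA(n_components=5, random_state=1)
        scores_base = spca.fit_transform(baseline)
        scores_test = spca.transform(test)

        # Hotelling T^2 of each test scan against the baseline scores,
        # simplified by treating the component scores as uncorrelated.
        mean = scores_base.mean(axis=0)
        var = scores_base.var(axis=0, ddof=1)
        t2_base = ((scores_base - mean) ** 2 / var).sum(axis=1)
        t2_test = ((scores_test - mean) ** 2 / var).sum(axis=1)
        print("flagged:", t2_test > np.percentile(t2_base, 95))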

  15. A guide to human in vivo microcirculatory flow image analysis.

    PubMed

    Massey, Michael J; Shapiro, Nathan I

    2016-01-01

    Various noninvasive microscopic camera technologies have been used to visualize the sublingual microcirculation in patients. We describe a comprehensive approach to bedside in vivo sublingual microcirculation video image capture and analysis techniques in the human clinical setting. We present a user perspective and guide suitable for clinical researchers and developers interested in the capture and analysis of sublingual microcirculatory flow videos. We review basic differences in the cameras, optics, light sources, operation, and digital image capture. We describe common techniques for image acquisition and discuss aspects of video data management, including data transfer, metadata, and database design and utilization to facilitate the image analysis pipeline. We outline image analysis techniques and reporting including video preprocessing and image quality evaluation. Finally, we propose a framework for future directions in the field of microcirculatory flow videomicroscopy acquisition and analysis. Although automated scoring systems have not been sufficiently robust for widespread clinical or research use to date, we discuss promising innovations that are driving new development. PMID:26861691

  16. New approach to gallbladder ultrasonic images analysis and lesions recognition.

    PubMed

    Bodzioch, Sławomir; Ogiela, Marek R

    2009-03-01

    This paper presents a new approach to gallbladder ultrasonic image processing and analysis towards detection of disease symptoms on processed images. First, in this paper, there is presented a new method of filtering gallbladder contours from USG images. A major stage in this filtration is to segment and section off areas occupied by the said organ. In most cases this procedure is based on filtration that plays a key role in the process of diagnosing pathological changes. Unfortunately ultrasound images present among the most troublesome methods of analysis owing to the echogenic inconsistency of structures under observation. This paper provides for an inventive algorithm for the holistic extraction of gallbladder image contours. The algorithm is based on rank filtration, as well as on the analysis of histogram sections on tested organs. The second part concerns detecting lesion symptoms of the gallbladder. Automating a process of diagnosis always comes down to developing algorithms used to analyze the object of such diagnosis and verify the occurrence of symptoms related to given affection. Usually the final stage is to make a diagnosis based on the detected symptoms. This last stage can be carried out through either dedicated expert systems or more classic pattern analysis approach like using rules to determine illness basing on detected symptoms. This paper discusses the pattern analysis algorithms for gallbladder image interpretation towards classification of the most frequent illness symptoms of this organ. PMID:19124224

  17. Effect of nutrition survey 'cleaning criteria' on estimates of malnutrition prevalence and disease burden: secondary data analysis.

    PubMed

    Crowe, Sonya; Seal, Andrew; Grijalva-Eternod, Carlos; Kerac, Marko

    2014-01-01

    Tackling childhood malnutrition is a global health priority. A key indicator is the estimated prevalence of malnutrition, measured by nutrition surveys. Most aspects of survey design are standardised, but data 'cleaning criteria' are not. These aim to exclude extreme values which may represent measurement or data-entry errors. The effect of different cleaning criteria on malnutrition prevalence estimates was unknown. We applied five commonly used data cleaning criteria (WHO 2006; EPI-Info; WHO 1995 fixed; WHO 1995 flexible; SMART) to 21 national Demographic and Health Survey datasets. These included a total of 163,228 children, aged 6-59 months. We focused on wasting (low weight-for-height), a key indicator for treatment programmes. Choice of cleaning criteria had a marked effect: SMART were least inclusive, resulting in the lowest reported malnutrition prevalence, while WHO 2006 were most inclusive, resulting in the highest. Across the 21 countries, the proportion of records excluded was 3 to 5 times greater when using SMART compared to WHO 2006 criteria, resulting in differences in the estimated prevalence of total wasting of between 0.5 and 3.8%, and differences in severe wasting of 0.4-3.9%. The magnitude of difference was associated with the standard deviation of the survey sample, a statistic that can reflect both population heterogeneity and data quality. Using these results to estimate case-loads for treatment programmes resulted in large differences for all countries. Wasting prevalence and caseload estimations are strongly influenced by choice of cleaning criterion. Because key policy and programming decisions depend on these statistics, variations in analytical practice could lead to inconsistent and potentially inappropriate implementation of malnutrition treatment programmes. We therefore call for mandatory reporting of cleaning criteria use so that results can be compared and interpreted appropriately. International consensus is urgently needed.
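
    The contrast between criteria comes down to the flag limits applied to z-scores; as a hedged illustration for weight-for-height z-scores (WHZ), using the commonly cited limits for two of the criteria (these should be checked against the original sources before any real use):

        import numpy as np

        def who2006_flags(whz):
            """WHO 2006 style: fixed limits around the reference mean."""
            return (whz < -5) | (whz > 5)

        def smart_flags(whz):
            """SMART style: limits around the observed survey mean."""
            m = np.nanmean(whz)
            return (whz < m - 3) | (whz > m + 3)

        whz = np.random.default_rng(2).normal(-0.8, 1.4, 5000)  # toy survey
        for name, flags in (("WHO 2006", who2006_flags(whz)),
                            ("SMART", smart_flags(whz))):
            kept = whz[~flags]
            print(name, "excluded:", round(flags.mean(), 4),
                  "wasting prevalence:", round((kept < -2).mean(), 4))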

  18. Towards a Quantitative OCT Image Analysis

    PubMed Central

    Garcia Garrido, Marina; Beck, Susanne C.; Mühlfriedel, Regine; Julien, Sylvie; Schraermeyer, Ulrich; Seeliger, Mathias W.

    2014-01-01

    Background Optical coherence tomography (OCT) is an invaluable diagnostic tool for the detection and follow-up of retinal pathology in patients and experimental disease models. However, as morphological structures and layering in health as well as their alterations in disease are complex, segmentation procedures have not yet reached a satisfactory level of performance. Therefore, raw images and qualitative data are commonly used in clinical and scientific reports. Here, we assess the value of OCT reflectivity profiles as a basis for a quantitative characterization of the retinal status in a cross-species comparative study. Methods Spectral-Domain Optical Coherence Tomography (OCT), confocal Scanning-Laser Ophthalmoscopy (SLO), and Fluorescein Angiography (FA) were performed in mice (Mus musculus), gerbils (Gerbillus perpallidus), and cynomolgus monkeys (Macaca fascicularis) using the Heidelberg Engineering Spectralis system, and additional SLOs and FAs were obtained with the HRA I (same manufacturer). Reflectivity profiles were extracted from 8-bit greyscale OCT images using the ImageJ software package (http://rsb.info.nih.gov/ij/). Results Reflectivity profiles obtained from OCT scans of all three animal species correlated well with ex vivo histomorphometric data. Each of the retinal layers showed a typical pattern that varied in relative size and degree of reflectivity across species. In general, plexiform layers showed a higher level of reflectivity than nuclear layers. A comparison of reflectivity profiles from specialized retinal regions (e.g. visual streak in gerbils, fovea in non-human primates) with respective regions of human retina revealed multiple similarities. In a model of Retinitis Pigmentosa (RP), the value of reflectivity profiles for the follow-up of therapeutic interventions was demonstrated. Conclusions OCT reflectivity profiles provide a detailed, quantitative description of retinal layers and structures including specialized retinal regions.
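
    As a hedged sketch of the profile-extraction step (the study used ImageJ rather than Python, and the lateral averaging window and image orientation below are assumptions):

        import numpy as np

        def reflectivity_profile(bscan_8bit, col_from, col_to):
            """Mean grey value per depth row of an 8-bit OCT B-scan,
            averaged laterally over a window of A-scans; depth is
            assumed to run along image rows."""
            window = bscan_8bit[:, col_from:col_to].astype(float)
            return window.mean(axis=1)

        bscan = np.random.randint(0, 256, (496, 512), dtype=np.uint8)
        profile = reflectivity_profile(bscan, 200, 312)
        # Peaks and troughs of `profile` can then be matched to retinal
        # layers and compared with histomorphometric thicknesses.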

  19. Analysis of Published Criteria for Clinically Inactive Disease in a Large Juvenile Dermatomyositis Cohort Shows That Skin Disease Is Underestimated

    PubMed Central

    Almeida, Beverley; Campanilho‐Marques, Raquel; Arnold, Katie; Pilkington, Clarissa A.; Wedderburn, Lucy R.; Armon, Kate; Briggs, Vanja; Ellis‐Gage, Joe; Roper, Holly; Watts, Joanna; Baildam, Eileen; Hanna, Louise; Lloyd, Olivia; McCann, Liza; Roberts, Ian; McGovern, Ann; Riley, Phil; Al‐Abadi, Eslam; Ryder, Clive; Scott, Janis; Southwood, Taunton; Thomas, Beverley; Amin, Tania; Burton, Deborah; Jackson, Gillian; Van Rooyen, Vanessa; Wood, Mark; Wyatt, Sue; Browne, Michael; Davidson, Joyce; Ferguson, Sue; Gardner‐Medwin, Janet; Martin, Neil; Waxman, Liz; Foster, Helen; Friswell, Mark; Jandial, Sharmila; Qiao, Lisa; Sen, Ethan; Smith, Eve; Stevenson, Vicky; Swift, Alison; Wade, Debbie; Watson, Stuart; Crate, Lindsay; Frost, Anna; Jordan, Mary; Mosley, Ellen; Satyapal, Rangaraj; Stretton, Elizabeth; Venning, Helen; Warrier, Kishore; Almeida, Beverley; Arnold, Katie; Beard, Laura; Brown, Virginia; Campanilho‐Marques, Raquel; Enayat, Elli; Glackin, Yvonne; Halkon, Elizabeth; Hasson, Nathan; Juggins, Audrey; Kassoumeri, Laura; Lunt, Sian; Maillard, Sue; Nistala, Kiran; Pilkington, Clarissa; Simou, Stephanie; Smith, Sally; Varsani, Hemlata; Wedderburn, Lucy; Murray, Kevin; Ioannou, John; Suffield, Linda; Al‐Obaidi, Muthana; Leach, Sam; Lee, Helen; Smith, Helen; Inness, Emma; Kendall, Eunice; Mayers, David; Wilkinson, Nick; Clinch, Jacqui; Pluess‐Hall, Helen

    2015-01-01

    Objective The Pediatric Rheumatology International Trials Organisation (PRINTO) recently published criteria for classification of patients with juvenile dermatomyositis (DM) as having clinically inactive disease. The criteria require that at least 3 of 4 conditions be met, i.e., creatine kinase level ≤150 units/liter, Childhood Myositis Assessment Scale score ≥48, Manual Muscle Testing in 8 muscles score ≥78, and physician's global assessment of overall disease activity (PGA) ≤0.2. The present study was undertaken to test these criteria in a UK cohort of patients with juvenile DM. Methods We assessed 1,114 patient visits for the 4 items in the PRINTO criteria for clinically inactive disease. Each visit was analyzed to determine whether skin disease was present. The Disease Activity Score (DAS) for juvenile DM was determined in 59 patients. Results At 307 of the 1,114 visits, clinically inactive disease was achieved based on the 3 muscle criteria (but with a PGA of >0.2); rash was present at 65.8% of these visits and nailfold capillary abnormalities at 35.2%. When PGA ≤0.2 was one of the 3 criteria that were met, the frequency of skin signs was significantly lower (rash in 23.1% and nailfold capillary abnormalities in 8.7%). If PGA was considered an essential criterion for clinically inactive disease (P‐CID), patients with active skin disease were less likely to be categorized as having clinically inactive disease (a median DAS skin score of 0 [of a possible maximum of 9] in visits where the PGA was ≤0.2, versus a median DAS skin score of 4 in patients meeting the 3 muscle criteria [with a PGA of >0.2]; P < 0.001). Use of the P‐CID led to improvements in the positive predictive value and the positive likelihood ratio (85.4% and 11.0, respectively, compared to 72.9% and 5.1 with the current criteria). Conclusion There was a high frequency of skin disease among patients with juvenile DM who did not meet the PGA criterion for inactive disease but met

  20. Integrated wavelets for medical image analysis

    NASA Astrophysics Data System (ADS)

    Heinlein, Peter; Schneider, Wilfried

    2003-11-01

    Integrated wavelets are a new method for discretizing the continuous wavelet transform (CWT). Independent of the choice of discrete scale and orientation parameters they yield tight families of convolution operators. Thus these families can easily be adapted to specific problems. After presenting the fundamental ideas, we focus primarily on the construction of directional integrated wavelets and their application to medical images. We state an exact algorithm for implementing this transform and present applications from the field of digital mammography. The first application covers the enhancement of microcalcifications in digital mammograms. Further, we exploit the directional information provided by integrated wavelets for better separation of microcalcifications from similar structures.
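
    Integrated wavelets themselves are not available in common libraries; for orientation only, here is the standard sampled discretization of the CWT that the paper's construction improves upon, using PyWavelets (the signal and wavelet choice are assumptions):

        import numpy as np
        import pywt

        # Standard CWT discretization: sample the scale axis and filter.
        # Integrated wavelets instead integrate over scale bands so that
        # any choice of discrete scales yields a tight operator family.
        t = np.linspace(0, 1, 512)
        row = np.sin(2 * np.pi * 12 * t)     # one image row as a signal
        scales = np.arange(1, 33)
        coeffs, freqs = pywt.cwt(row, scales, "mexh")
        print(coeffs.shape)                  # (n_scales, n_samples)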

  1. Imaging for dismantlement verification: information management and analysis algorithms

    SciTech Connect

    Seifert, Allen; Miller, Erin A.; Myjak, Mitchell J.; Robinson, Sean M.; Jarman, Kenneth D.; Misner, Alex C.; Pitts, W. Karl; Woodring, Mitchell L.

    2010-09-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute. However, this process must be performed with care. Computing the perimeter, area, and intensity of an object, for example, might reveal sensitive information relating to shape, size, and material composition. This paper presents three analysis algorithms that reduce full image information to non-sensitive feature information. Ultimately, the algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We evaluate the algorithms on both their technical performance in image analysis, and their application with and without an explicitly constructed information barrier. The underlying images can be highly detailed, since they are dynamically generated behind the information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography.
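
    A hedged sketch of the attribute idea: reduce the image to a single yes/no answer behind the barrier, so that no sensitive measurements leave the system (the thresholds and the scikit-image-based analysis below are placeholders, not the actual algorithms):

        import numpy as np
        from skimage import measure

        def feature_present(image, intensity_thresh=0.5,
                            min_area=500, max_area=5000):
            """Return only True/False: does any object in the image fall
            within an expected size range?  Neither image data nor exact
            measurements ever leave this function."""
            mask = image > intensity_thresh
            for region in measure.regionprops(measure.label(mask)):
                if min_area <= region.area <= max_area:
                    return True
            return False

        img = np.random.rand(256, 256)       # stand-in radiograph
        print(feature_present(img))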

  2. Image analysis tools and emerging algorithms for expression proteomics

    PubMed Central

    English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.

    2012-01-01

    Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput ‘shotgun’ proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user’s and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS. PMID:21046614

  3. A TSVD Analysis of Microwave Inverse Scattering for Breast Imaging

    PubMed Central

    Shea, Jacob D.; Van Veen, Barry D.; Hagness, Susan C.

    2013-01-01

    A variety of methods have been applied to the inverse scattering problem for breast imaging at microwave frequencies. While many techniques have been leveraged toward a microwave imaging solution, they are all fundamentally dependent on the quality of the scattering data. Evaluating and optimizing the information contained in the data are, therefore, instrumental in understanding and achieving optimal performance from any particular imaging method. In this paper, a method of analysis is employed for the evaluation of the information contained in simulated scattering data from a known dielectric profile. The method estimates optimal imaging performance by mapping the data through the inverse of the scattering system. The inverse is computed by truncated singular-value decomposition of a system of scattering equations. The equations are made linear by use of the exact total fields in the imaging volume, which are available in the computational domain. The analysis is applied to anatomically realistic numerical breast phantoms. The utility of the method is demonstrated for a given imaging system through the analysis of various considerations in system design and problem formulation. The method offers an avenue for decoupling the problem of data selection from the problem of image formation from that data. PMID:22113770
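
    The core of TSVD inversion fits in a few lines of numpy; the random linear system below merely stands in for the linearized scattering equations described in the record:

        import numpy as np

        def tsvd_solve(A, b, k):
            """Solve A x = b keeping only the k largest singular values."""
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

        rng = np.random.default_rng(3)
        A = rng.normal(size=(200, 100))      # toy scattering operator
        x_true = rng.normal(size=100)        # toy dielectric contrast
        b = A @ x_true + 0.01 * rng.normal(size=200)

        for k in (10, 50, 100):
            err = np.linalg.norm(tsvd_solve(A, b, k) - x_true)
            print(f"k={k:3d}  reconstruction error {err:.3f}")

    Sweeping the truncation level k is exactly how such an analysis probes how much usable information the scattering data contain.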

  4. Evaluating the performance of clinical criteria for predicting mismatch repair gene mutations in Lynch syndrome: a comprehensive analysis of 3,671 families.

    PubMed

    Steinke, Verena; Holzapfel, Stefanie; Loeffler, Markus; Holinski-Feder, Elke; Morak, Monika; Schackert, Hans K; Görgens, Heike; Pox, Christian; Royer-Pokora, Brigitte; von Knebel-Doeberitz, Magnus; Büttner, Reinhard; Propping, Peter; Engel, Christoph

    2014-07-01

    Carriers of mismatch repair (MMR) gene mutations have a high lifetime risk for colorectal and endometrial cancers, as well as other malignancies. As mutation analysis to detect these patients is expensive and time-consuming, clinical criteria and tumor-tissue analysis are widely used as pre-screening methods. The aim of our study was to evaluate the performance of commonly applied clinical criteria (the Amsterdam I and II Criteria, and the original and revised Bethesda Guidelines) and the results of tumor-tissue analysis in predicting MMR gene mutations. We analyzed 3,671 families from the German HNPCC Registry and divided them into nine mutually exclusive groups with different clinical criteria. A total of 680 families (18.5%) were found to have a pathogenic MMR gene mutation. Among all 1,284 families with microsatellite instability-high (MSI-H) colorectal cancer, the overall mutation detection rate was 53.0%. Mutation frequencies and their distribution between the four MMR genes differed significantly between clinical groups (p < 0.001). The highest frequencies were found in families fulfilling the Amsterdam Criteria (46.4%). Families with loss of MSH2 expression had higher mutation detection rates (69.5%) than families with loss of MLH1 expression (43.1%). MMR mutations were found significantly more often in families with at least one MSI-H small-bowel cancer (p < 0.001). No MMR mutations were found among patients under 40 years old with only colorectal adenoma. Familial clustering of Lynch syndrome-related tumors, early age of onset, and familial occurrence of small-bowel cancer were clinically relevant predictors for Lynch syndrome. PMID:24493211

  5. Computer-aided breast MR image feature analysis for prediction of tumor response to chemotherapy

    SciTech Connect

    Aghaei, Faranak; Tan, Maxine; Liu, Hong; Zheng, Bin; Hollingsworth, Alan B.; Qian, Wei

    2015-11-15

    Purpose: To identify a new clinical marker based on quantitative kinetic image feature analysis and assess its feasibility for predicting tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial or no response (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second tested an attribute-selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance, with areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 for classifying between CR and NR cases. Using the feature fusion method, the maximum AUC was 0.85 ± 0.05. Using the ANN-based classifier, the AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired before chemotherapy has the potential to generate a useful clinical marker for predicting tumor response to chemotherapy.
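
    A hedged sketch of the leave-one-case-out evaluation of a small neural-network classifier with scikit-learn (the authors' wrapper-based feature selection is omitted, and the feature matrix below is a placeholder):

        import numpy as np
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(4)
        X = rng.normal(size=(68, 39))      # 39 kinetic features per patient
        y = np.array([1] * 25 + [0] * 43)  # 25 CR vs 43 NR (toy labels)

        ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                            random_state=4)
        prob = cross_val_predict(ann, X, y, cv=LeaveOneOut(),
                                 method="predict_proba")[:, 1]
        print("leave-one-case-out AUC:", roc_auc_score(y, prob))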

  6. Ballistics projectile image analysis for firearm identification.

    PubMed

    Li, Dongguang

    2006-10-01

    This paper is based upon the observation that, when a bullet is fired, it creates characteristic markings on the cartridge case and projectile. From these markings, over 30 different features can be distinguished, which, in combination, produce a "fingerprint" for a firearm. By analyzing features within such a set of firearm fingerprints, it is possible to identify not only the type and model of a firearm, but also each individual weapon, just as effectively as human fingerprint identification. A new analytic system based on the fast Fourier transform for identifying projectile specimens by the line-scan imaging technique is proposed in this paper. The paper develops optical, photonic, and mechanical techniques to map the topography of the surfaces of forensic projectiles for the purpose of identification. Experiments discussed in this paper were performed on images acquired from 16 different weapons. Experimental results show that the proposed system can be used for firearm identification efficiently and precisely through digitizing and analyzing the fired projectile specimens. PMID:17022254
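
    As a hedged, much-simplified stand-in for the proposed system, two line-scan surface profiles can be compared through low-frequency FFT magnitude signatures:

        import numpy as np

        def spectrum_signature(profile, n_coeffs=32):
            """Magnitudes of the first low-frequency FFT coefficients of
            a line-scan profile, normalised to unit length."""
            f = np.abs(np.fft.rfft(profile - profile.mean()))[1:n_coeffs + 1]
            return f / np.linalg.norm(f)

        def similarity(profile_a, profile_b):
            return float(spectrum_signature(profile_a)
                         @ spectrum_signature(profile_b))

        t = np.linspace(0, 8 * np.pi, 1024)
        rng = np.random.default_rng(5)
        groove = np.sin(7 * t) + 0.2 * rng.normal(size=t.size)
        print(similarity(groove, groove))    # ~1.0 for matching signatures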

  7. Analysis operator learning and its application to image reconstruction.

    PubMed

    Hawe, Simon; Kleinsteuber, Martin; Diepold, Klaus

    2013-06-01

    Exploiting a priori known structural information lies at the core of many image reconstruction methods that can be stated as inverse problems. The synthesis model, which assumes that images can be decomposed into a linear combination of very few atoms of some dictionary, is now a well-established tool for the design of image reconstruction algorithms. An interesting alternative is the analysis model, where the signal is multiplied by an analysis operator and the outcome is assumed to be sparse. This approach has only recently gained increasing interest. The quality of reconstruction methods based on an analysis model depends critically on the choice of a suitable operator. In this paper, we present an algorithm for learning an analysis operator from training images. Our method is based on l(p)-norm minimization on the set of full-rank matrices with normalized columns. We carefully introduce the employed conjugate gradient method on manifolds and explain the underlying geometry of the constraints. Moreover, we compare our approach to state-of-the-art methods for image denoising, inpainting, and single-image super-resolution. Our numerical results show competitive performance of our general approach in all presented applications compared to the specialized state-of-the-art techniques. PMID:23412611
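
    A toy illustration of the analysis model's central quantity (not the paper's learning algorithm): for the classic hand-crafted finite-difference operator, piecewise-constant signals give much sparser analysis coefficients than noise:

        import numpy as np

        # Analysis model: Omega @ x should be sparse for natural signals.
        def analysis_sparsity(omega, x, p=0.7):
            return (np.abs(omega @ x) ** p).sum()     # l_p, 0 < p <= 1

        n = 100
        omega = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]  # finite differences
        piecewise = np.repeat([0.0, 1.0, 0.3, 0.8], 25)
        noise = np.random.default_rng(7).normal(size=n)
        print(analysis_sparsity(omega, piecewise),    # small: sparse
              analysis_sparsity(omega, noise))        # large: dense

    Operator learning replaces this hand-crafted Omega with one optimized over training images, subject to the full-rank and normalized-column constraints the record mentions.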

  8. The Spectral Image Processing System (SIPS) - Interactive visualization and analysis of imaging spectrometer data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1993-01-01

    The Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, has developed a prototype interactive software system called the Spectral Image Processing System (SIPS) using IDL (the Interactive Data Language) on UNIX-based workstations. SIPS is designed to take advantage of the combination of high spectral resolution and spatial data presentation unique to imaging spectrometers. It streamlines analysis of these data by allowing scientists to rapidly interact with entire datasets. SIPS provides visualization tools for rapid exploratory analysis and numerical tools for quantitative modeling. The user interface is X-Windows-based, user friendly, and provides 'point and click' operation. SIPS is being used for multidisciplinary research concentrating on use of physically based analysis methods to enhance scientific results from imaging spectrometer data. The objective of this continuing effort is to develop operational techniques for quantitative analysis of imaging spectrometer data and to make them available to the scientific community prior to the launch of imaging spectrometer satellite systems such as the Earth Observing System (EOS) High Resolution Imaging Spectrometer (HIRIS).

  9. Subcellular chemical and morphological analysis by stimulated Raman scattering microscopy and image analysis techniques

    PubMed Central

    D’Arco, Annalisa; Brancati, Nadia; Ferrara, Maria Antonietta; Indolfi, Maurizio; Frucci, Maria; Sirleto, Luigi

    2016-01-01

    The visualization of heterogeneous morphology, and the segmentation and quantification of image features, are crucial points for nonlinear optics microscopy applications, spanning from imaging of living cells or tissues to biomedical diagnostics. In this paper, a methodology combining stimulated Raman scattering microscopy and image analysis techniques is presented. The basic idea is to join the vibrational contrast of stimulated Raman scattering with the strength of image analysis techniques in order to delineate subcellular morphology with chemical specificity. Validation tests on label-free imaging of polystyrene beads and of adipocyte cells are reported and discussed. PMID:27231626

  11. A parallel solution for high resolution histological image analysis.

    PubMed

    Bueno, G; González, R; Déniz, O; García-Rojo, M; González-García, J; Fernández-Carrobles, M M; Vállez, N; Salido, J

    2012-10-01

    This paper describes a general methodology for developing parallel image processing algorithms based on message passing for high-resolution images (on the order of several gigabytes). These algorithms have been applied to histological images and must be executed on massively parallel processing architectures. Advances in new technologies for complete slide digitalization in pathology have been combined with developments in biomedical informatics. However, the efficient use of these digital slide systems is still a challenge. The image processing that these slides undergo is still limited both in terms of data processed and processing methods. The work presented here focuses on the need to design and develop parallel image processing tools capable of obtaining and analyzing the entire gamut of information included in digital slides. Tools have been developed to assist pathologists in image analysis and diagnosis, and they cover low- and high-level image processing methods applied to histological images. Code portability, reusability and scalability have been tested on the following parallel computing architectures: distributed memory with massively parallel processors and two networks, INFINIBAND and Myrinet, composed of 17 and 1024 nodes respectively. The proposed parallel framework is a flexible, high-performance solution; it shows that the efficient processing of digital microscopic images is possible and may offer important benefits to pathology laboratories. PMID:22522064
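
    A minimal message-passing sketch of the tile-distribution pattern such frameworks rely on, using mpi4py (the framework itself is not public; the strip split and the per-strip analysis below are placeholders):

        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Rank 0 splits a large slide into horizontal strips and scatters
        # them; every rank analyzes its strip; results are gathered.
        if rank == 0:
            slide = np.random.rand(4096, 4096)       # stand-in image
            strips = np.array_split(slide, size, axis=0)
        else:
            strips = None

        strip = comm.scatter(strips, root=0)
        local_result = float(strip.mean())           # stand-in analysis
        results = comm.gather(local_result, root=0)
        if rank == 0:
            print("per-strip results:", results)

    Run with, e.g., mpirun -n 4 python strips.py; for gigabyte-scale slides the scatter step would typically be replaced by each node reading its tiles directly from shared storage.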

  12. SNR analysis of 3D magnetic resonance tomosynthesis (MRT) imaging

    NASA Astrophysics Data System (ADS)

    Kim, Min-Oh; Kim, Dong-Hyun

    2012-03-01

    In conventional 3D Fourier transform (3DFT) MR imaging, the signal-to-noise ratio (SNR) is governed by the well-known relationship of being proportional to the voxel size and the square root of the imaging time. Here, we introduce an alternative 3D imaging approach, termed MRT (Magnetic Resonance Tomosynthesis), which can generate a set of tomographic MR images similar to multiple 2D projection images in x-ray. A multiple-oblique-view (MOV) pulse sequence is designed to acquire the tomography-like images used in the tomosynthesis process, and an iterative back-projection (IBP) reconstruction method is used to reconstruct 3D images. SNR analysis shows that the resolution-SNR tradeoff is not governed by the same relationship as in the typical 3DFT MR imaging case. The proposed method provides a higher SNR than the conventional 3D imaging method, with a partial loss of slice-direction resolution. It is expected that this method can be useful for extremely low SNR cases.
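
    For reference, the conventional relationship alluded to can be written in LaTeX as (with Δx Δy Δz the voxel volume and T_acq the total data acquisition time):

        \mathrm{SNR}_{\mathrm{3DFT}} \;\propto\; \Delta x \, \Delta y \, \Delta z \, \sqrt{T_{\mathrm{acq}}}

    The MRT analysis shows a different tradeoff: SNR is gained at the cost of a partial loss of slice-direction resolution rather than through the voxel-volume term alone.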

  13. Vibration analysis using digital image processing for in vitro imaging systems

    NASA Astrophysics Data System (ADS)

    Wang, Zhonghua; Wang, Shaohong; Gonzalez, Carlos

    2011-09-01

    A non-invasive self-measurement method for analyzing vibrations within a biological imaging system is presented. This method uses the system's own imaging sensor, digital image processing, and a custom dot-matrix calibration target for in-situ vibration measurements. By taking a series of images of the target within a fixed field of view and time interval, and averaging the dot profiles in each image, the in-plane coherent spacing of each dot can be identified in both the horizontal and vertical directions. The incoherent movement in the pattern spacing caused by vibration is then resolved from each image. Accounting for the CMOS imager's rolling shutter, vibrations are measured at two different sampling times, intra-frame and inter-frame; the former is set by the frame time and the latter by the image sampling time. A power spectral density (PSD) analysis is then performed on both measurements to provide the incoherent system displacements and identify potential vibration sources. The PSD plots provide descriptive statistics of the displacement distribution due to random vibration content. This approach has been successful in identifying vibration sources and measuring vibration geometric moments in imaging systems.
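
    The final PSD step could be reproduced with Welch's method; the sketch below (assuming SciPy, with a made-up sampling rate and a random stand-in for the extracted displacement series) illustrates that analysis step only.

        # Illustrative PSD computation for a displacement time series extracted
        # from the dot-spacing measurements. The sampling rate and the data
        # here are hypothetical stand-ins, not the paper's measurements.
        import numpy as np
        from scipy.signal import welch
        from scipy.integrate import trapezoid

        fs = 120.0                               # frames per second (hypothetical)
        displacement = np.random.randn(4096)     # stand-in displacement series (pixels)

        freqs, psd = welch(displacement, fs=fs, nperseg=1024)
        peak_hz = freqs[np.argmax(psd)]          # dominant vibration frequency
        rms = np.sqrt(trapezoid(psd, freqs))     # RMS displacement from the PSD integral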

  14. A Fluorescent Live Imaging Screening Assay Based on Translocation Criteria Identifies Novel Cytoplasmic Proteins Implicated in G Protein-coupled Receptor Signaling Pathways.

    PubMed

    Lecat, Sandra; Matthes, Hans W D; Pepperkok, Rainer; Simpson, Jeremy C; Galzi, Jean-Luc

    2015-05-01

    Several cytoplasmic proteins involved in G protein-coupled receptor signaling cascades, such as beta-arrestin2, are known to translocate to the plasma membrane upon receptor activation. Based on this example, and in order to identify new cytoplasmic proteins implicated in the ON-and-OFF cycle of G protein-coupled receptors, a live-imaging screen of fluorescently labeled cytoplasmic proteins was performed using translocation criteria. The screening of 193 fluorescently tagged human proteins identified eight that responded to activation of the tachykinin NK2 receptor by a change in their intracellular localization. We previously presented the functional characterization of one of these proteins, REDD1, which translocates to the plasma membrane. Here we report the results of the entire screen. The process of cell activation was recorded on videos at different time points, and all the videos can be viewed on a dedicated website. The proteins BAIAP3 and BIN1 partially translocated to the plasma membrane upon activation of NK2 receptors. ARHGAP12 and PKM2 translocated toward membrane blebs. Three proteins that associate with the cytoskeleton were of particular interest: PLEKHH2 rearranged from individual dots located near the cell-substrate adhesion surface into lines of dots; the speriolin-like protein SPATC1L redistributed to cell-cell junctions; and the chloride intracellular channel protein CLIC2 translocated from actin-enriched plasma membrane bundles to cell-cell junctions upon activation of NK2 receptors. CLIC2 and one of its close paralogs, CLIC4, were further shown to respond with the same translocation pattern to muscarinic M3 and lysophosphatidic acid (LPA) receptors. This screen allowed us to identify potential actors in signaling pathways downstream of G protein-coupled receptors and could be scaled up for high-content screening. PMID:25759509
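
    A translocation criterion of this kind is often quantified as a change in the membrane-to-cytoplasm fluorescence ratio over time; the sketch below is a hypothetical illustration of such a metric (the masks, normalization, and flagging rule are assumptions, not the screen's actual criteria).

        # Illustrative translocation metric: ratio of mean fluorescence in a
        # membrane mask to that in a cytoplasmic mask, tracked over time.
        # Masks and the flagging rule are hypothetical, not from the screen.
        import numpy as np

        def translocation_index(frames, membrane_mask, cytoplasm_mask):
            """frames: (T, H, W) image sequence; masks: boolean (H, W) arrays."""
            ratios = np.array([f[membrane_mask].mean() / f[cytoplasm_mask].mean()
                               for f in frames])
            return ratios / ratios[0]   # normalize to the pre-stimulation baseline

        # A protein could then be flagged as translocating if the index rises
        # substantially above 1 after receptor activation.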

  15. Towards large-scale histopathological image analysis: hashing-based image retrieval.

    PubMed

    Zhang, Xiaofan; Liu, Wei; Dundar, Murat; Badve, Sunil; Zhang, Shaoting

    2015-02-01

    Automatic analysis of histopathological images has been widely pursued, leveraging computational image-processing methods and modern machine-learning techniques. Both computer-aided diagnosis (CAD) and content-based image-retrieval (CBIR) systems have been successfully developed for diagnosis, disease detection, and decision support in this area. Recently, with the ever-increasing amount of annotated medical data, large-scale and data-driven methods have emerged that promise to bridge the semantic gap between images and diagnostic information. In this paper, we focus on developing scalable image-retrieval techniques to cope intelligently with massive histopathological images. Specifically, we present a supervised kernel hashing technique which leverages a small amount of supervised information to learn to compress a 10,000-dimensional image feature vector into only tens of binary bits while preserving its informative signature. These binary codes are then indexed into a hash table that enables real-time retrieval of images from a large database. Critically, the supervised information is employed to bridge the semantic gap between low-level image features and high-level diagnostic information. We build a scalable image-retrieval framework based on the supervised hashing technique and validate its performance on several thousand histopathological images acquired from breast microscopic tissues. Extensive evaluations are carried out in terms of image classification (i.e., benign versus actionable categorization) and retrieval tests. Our framework achieves about 88.1% classification accuracy as well as promising time efficiency; for example, it can execute around 800 queries in only 0.01 s, comparing favorably with other commonly used dimensionality-reduction and feature-selection methods. PMID:25314696
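
    The retrieval mechanics can be illustrated with plain random-hyperplane hashing, an unsupervised stand-in for the paper's supervised kernel hashing: project features onto random hyperplanes, keep the sign bits as a short binary code, and rank database items by Hamming distance.

        # Illustrative random-hyperplane hashing (an unsupervised stand-in for
        # the paper's supervised kernel hashing): compress feature vectors to
        # short binary codes and retrieve by Hamming distance.
        import numpy as np

        rng = np.random.default_rng(0)
        n_bits, dim = 32, 10_000
        planes = rng.standard_normal((n_bits, dim))   # random projection hyperplanes

        def hash_code(x: np.ndarray) -> int:
            """Pack the sign pattern of the projections into one 32-bit code."""
            bits = (planes @ x) > 0
            return int(np.packbits(bits).view(np.uint32)[0])

        def hamming(a: int, b: int) -> int:
            return bin(a ^ b).count("1")

        # Index a toy database of features, then rank items for a query.
        database = rng.standard_normal((1000, dim))
        codes = [hash_code(v) for v in database]
        qcode = hash_code(rng.standard_normal(dim))
        nearest = sorted(range(len(codes)), key=lambda i: hamming(codes[i], qcode))[:10]

    Because each code fits in a machine word, comparing a query against thousands of stored codes is a bitwise operation, which is what makes sub-hundredth-of-a-second query batches plausible.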

  16. Rapid enumeration of viable bacteria by image analysis

    NASA Technical Reports Server (NTRS)

    Singh, A.; Pyle, B. H.; McFeters, G. A.

    1989-01-01

    A direct viable counting method for enumerating viable bacteria was modified and made compatible with image analysis. A comparison was made between viable cell counts determined by the spread plate method and direct viable counts obtained using epifluorescence microscopy either manually or by automatic image analysis. Cultures of Escherichia coli, Salmonella typhimurium, Vibrio cholerae, Yersinia enterocolitica and Pseudomonas aeruginosa were incubated at 35 degrees C in a dilute nutrient medium containing nalidixic acid. Filtered samples were stained for epifluorescence microscopy and analysed manually as well as by image analysis. Cells enlarged after incubation were considered viable. The viable cell counts determined using image analysis were higher than those obtained by either the direct manual count of viable cells or spread plate methods. The volume of sample filtered or the number of cells in the original sample did not influence the efficiency of the method. However, the optimal concentration of nalidixic acid (2.5-20 micrograms per ml) and length of incubation (4-8 h) varied with the culture tested. The results of this study showed that under optimal conditions, the modification of the direct viable count method in combination with image analysis microscopy provided an efficient and quantitative technique for counting viable bacteria in a short time.
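
    The "enlarged cells are viable" criterion maps naturally onto a size-filtered object count; the sketch below (assuming scikit-image, with a hypothetical area cutoff not taken from the study) shows the idea.

        # Illustrative count of "enlarged" (viable) cells in an epifluorescence
        # image: segment, measure object areas, and count objects above a size
        # cutoff. The area threshold is hypothetical, not from the study.
        import numpy as np
        from skimage import filters, measure

        def count_viable(image: np.ndarray, min_viable_area: float = 100.0) -> int:
            mask = image > filters.threshold_otsu(image)   # separate cells from background
            labels = measure.label(mask)                   # connected components
            areas = [r.area for r in measure.regionprops(labels)]
            return sum(a >= min_viable_area for a in areas)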

  17. Automated Analysis of Dynamic Ca2+ Signals in Image Sequences

    PubMed Central

    Francis, Michael; Waldrup, Josh; Qian, Xun; Taylor, Mark S.

    2014-01-01

    Intracellular Ca2+ signals are commonly studied with fluorescent Ca2+ indicator dyes and microscopy techniques. However, quantitative analysis of Ca2+ imaging data is time consuming and subject to bias. Automated signal analysis algorithms based on region of interest (ROI) detection have been implemented for one-dimensional line scan measurements, but no current algorithm integrates optimized identification and analysis of ROIs in two-dimensional image sequences. Here, an algorithm for rapid acquisition and analysis of ROIs in image sequences is described. It fits ellipses to noise-filtered signals in order to determine optimal ROI placement, and computes the Ca2+ signal parameters of amplitude, duration and spatial spread. This algorithm was implemented as a freely available plugin for ImageJ (NIH) software. Together with analysis scripts written for the open-source statistical processing software R, this approach provides a high-capacity pipeline for performing quick statistical analysis of experimental output. The authors suggest that use of this analysis protocol will lead to a more complete and unbiased characterization of physiologic Ca2+ signaling. PMID:24962784
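
    Extracting amplitude and duration from a single ROI's trace is the core per-event computation; the sketch below is a minimal version assuming SciPy, with a hypothetical smoothing width and detection threshold rather than the plugin's actual settings.

        # Illustrative extraction of Ca2+ event parameters (amplitude, duration)
        # from one ROI's fluorescence trace. Smoothing width and threshold are
        # hypothetical, not the plugin's actual settings.
        import numpy as np
        from scipy.ndimage import gaussian_filter1d

        def event_params(trace, fs=10.0, k=3.0):
            """trace: F/F0 time series; fs: frames/s; k: threshold in noise SDs."""
            smooth = gaussian_filter1d(trace, sigma=2)   # noise filtering
            baseline, noise = np.median(smooth), smooth.std()
            active = smooth > baseline + k * noise       # super-threshold samples
            amplitude = smooth.max() - baseline
            duration = active.sum() / fs                 # seconds above threshold
            return amplitude, duration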

  18. Police witness identification images: a geometric morphometric analysis.

    PubMed

    Hayes, Susan; Tullberg, Cameron

    2012-11-01

    Research into witness identification images typically occurs within the laboratory and involves subjective likeness and recognizability judgments. This study analyzed whether actual witness identification images systematically alter the facial shapes of the suspects described. The shape analysis tool of geometric morphometrics was applied to 46 homologous facial landmarks displayed on 50 witness identification images and their corresponding arrest photographs, using principal component analysis and multivariate regressions. The results indicate that, compared with arrest photographs, witness identification images systematically depict suspects with lowered and medially located eyebrows (p < 0.000001). This effect occurred independently of the police artist, and did not occur with composites produced under laboratory conditions. There are several possible explanations for this finding, including any or all of the following: the suspect was frowning at the time of the incident; the witness had negative feelings toward the suspect; the effect reflects unfamiliar face processing; or the suspect displayed fear at the time of the arrest photograph. PMID:22536846
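
    The geometric-morphometric pipeline of landmark alignment followed by PCA can be sketched briefly; the version below (assuming SciPy and scikit-learn, with random stand-in landmark data and a single-reference alignment rather than full iterative generalized Procrustes analysis) shows the shape-space step.

        # Illustrative geometric-morphometric step: Procrustes-align landmark
        # configurations, then run PCA on the aligned shapes. Data are random
        # stand-ins; a full analysis would iterate alignment to a mean shape.
        import numpy as np
        from scipy.spatial import procrustes
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        faces = rng.standard_normal((50, 46, 2))   # 50 faces x 46 (x, y) landmarks

        reference = faces[0]                       # align all shapes to one reference
        aligned = np.array([procrustes(reference, f)[1] for f in faces])

        pca = PCA(n_components=5)
        scores = pca.fit_transform(aligned.reshape(50, -1))   # shape-space coordinates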