Science.gov

Sample records for achievable image quality

  1. Dose reduction of up to 89% while maintaining image quality in cardiovascular CT achieved with prospective ECG gating

    NASA Astrophysics Data System (ADS)

    Londt, John H.; Shreter, Uri; Vass, Melissa; Hsieh, Jiang; Ge, Zhanyu; Adda, Olivier; Dowe, David A.; Sablayrolles, Jean-Louis

    2007-03-01

    We present the results of a dose and image quality performance evaluation of a novel, prospectively ECG-gated coronary CT angiography acquisition mode (SnapShot Pulse, LightSpeed VCT-XT scanner, GE Healthcare, Waukesha, WI), and compare it to conventional retrospectively ECG-gated helical acquisition in clinical and phantom studies. Image quality phantoms were used to measure noise, slice sensitivity profile, in-plane resolution, low-contrast detectability, and dose for the two acquisition modes. Clinical image quality and diagnostic confidence were evaluated in a study of 31 patients scanned with the two acquisition modes. Radiation dose reduction in clinical practice was evaluated by tracking 120 consecutive patients scanned with the prospectively gated scan mode. In the phantom measurements, the prospectively gated mode yielded equivalent or better image quality at dose reductions of up to 89% compared with non-ECG-modulated conventional helical scans. In the clinical study, image quality was rated excellent by the expert radiologists reviewing the cases, with identical pathology identified using the two acquisition modes. The average dose to patients in the clinical practice study was 5.6 mSv, representing a 50% reduction compared with a similar patient population scanned in the conventional helical mode.

  2. Achieving Quality in Occupational Health

    NASA Technical Reports Server (NTRS)

    O'Donnell, Michele (Editor); Hoffler, G. Wyckliffe (Editor)

    1997-01-01

    The conference convened approximately 100 registered participants, including invited guest speakers, NASA presenters, and a broad spectrum of the Occupational Health disciplines representing NASA Headquarters and all NASA Field Centers. Centered on the theme, "Achieving Quality in Occupational Health," conferees heard presentations from award-winning occupational health program professionals within the Agency and from private industry; updates on ISO 9000 status, quality assurance, and information technologies; workshops on ergonomics and respiratory protection; an overview from the newly commissioned NASA Occupational Health Assessment Team; and a keynote speech on improving women's health. In addition, NASA occupational health specialists presented 24 poster sessions and oral presentations on various aspects of current practice at their field centers.

  3. Gifted Student Academic Achievement and Program Quality

    ERIC Educational Resources Information Center

    Jordan, Katrina Ann Woolsey

    2010-01-01

    Gifted academic achievement has been identified as a major area of interest for educational researchers. The purpose of this study was to ascertain whether there was a relation between the quality of gifted programs as perceived by teachers, coordinators and supervisors of the gifted and the achievement of the same gifted students in 6th and 7th…

  4. Retinal Image Quality During Accommodation

    PubMed Central

    López-Gil, N.; Martin, J.; Liu, T.; Bradley, A.; Díaz-Muñoz, D.; Thibos, L.

    2013-01-01

    Purpose: We asked whether retinal image quality is maximal during accommodation, or sub-optimal due to accommodative error, when subjects perform an acuity task. Methods: Subjects viewed a monochromatic (552 nm), high-contrast letter target placed at various viewing distances. Wavefront aberrations of the accommodating eye were measured near the endpoint of an acuity staircase paradigm. Refractive state, defined as the optimum target vergence for maximizing retinal image quality, was computed by through-focus wavefront analysis to find the power of the virtual correcting lens that maximizes the visual Strehl ratio. Results: Despite changes in ocular aberrations and pupil size during binocular viewing, retinal image quality and visual acuity typically remain high for all target vergences. When accommodative errors lead to sub-optimal retinal image quality, acuity and measured image quality both decline. However, the effect of accommodative errors on visual acuity is mitigated by the pupillary constriction associated with accommodation and binocular convergence, and by binocular summation of dissimilar retinal image blur. Under monocular viewing conditions some subjects displayed significant accommodative lag that reduced visual performance, an effect that was exacerbated by pharmacological dilation of the pupil. Conclusions: Spurious measurement of accommodative error can be avoided when the image quality metric used to determine refractive state is compatible with the focusing criteria used by the visual system to control accommodation. Real focusing errors of the accommodating eye do not necessarily produce a reliably measurable loss of image quality or a clinically significant loss of visual performance, probably because of the increased depth-of-focus due to pupil constriction. When retinal image quality is close to the maximum achievable (given the eye's higher-order aberrations), acuity is also near maximum. A combination of accommodative lag, reduced image quality, and reduced…

  5. Social image quality

    NASA Astrophysics Data System (ADS)

    Qiu, Guoping; Kheiri, Ahmed

    2011-01-01

    Current subjective image quality assessments have been developed in laboratory environments, under controlled conditions, and are dependent on the participation of limited numbers of observers. In this research, with the help of Web 2.0 and social media technology, a new method for building a subjective image quality metric has been developed in which the observers are Internet users. A website with a simple user interface, which enables Internet users from anywhere at any time to vote for the better-quality version of a pair of the same image, has been constructed. Users' votes are recorded and used to rank the images according to their perceived visual quality. We have developed three rank aggregation algorithms to process the recorded pair-comparison data: the first uses a naive approach, the second employs a Condorcet method, and the third uses Dykstra's extension of the Bradley-Terry method. The website has been collecting data for about three months and had accumulated over 10,000 votes at the time of writing this paper. Results show that the Internet and its allied technologies, such as crowdsourcing, offer a promising new paradigm for image and video quality assessment, in which hundreds of thousands of Internet users can contribute to building more robust image quality metrics. We have made Internet-user-generated social image quality (SIQ) data for a public image database available online (http://www.hdri.cs.nott.ac.uk/siq/) to provide the image quality research community with a new source of ground truth data. The website continues to collect votes and will include more public image databases; it will also be extended to include videos to collect social video quality (SVQ) data. All data will be publicly available on the website in due course.
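
A Bradley-Terry fit is a standard way to turn pair-comparison votes like these into a ranking. The sketch below (a minimal majorization-minimization iteration; a simplifying assumption, not the paper's exact implementation) estimates a latent quality score per image from a win-count matrix:

```python
import numpy as np

def bradley_terry(wins, iters=200):
    """Estimate Bradley-Terry strengths from a pairwise win-count matrix.

    wins[i, j] = number of votes preferring image i over image j.
    Returns scores normalized to sum to 1; higher means better quality.
    """
    n = wins.shape[0]
    p = np.ones(n)
    total = wins + wins.T            # comparisons made between each pair
    w = wins.sum(axis=1)             # total wins per image
    for _ in range(iters):
        denom = total / (p[:, None] + p[None, :])
        np.fill_diagonal(denom, 0.0)
        p = w / denom.sum(axis=1)    # MM update
        p /= p.sum()                 # fix the arbitrary scale
    return p

# Toy vote matrix for three images; image 0 is consistently preferred.
votes = np.array([[0, 8, 9],
                  [2, 0, 6],
                  [1, 4, 0]], dtype=float)
scores = bradley_terry(votes)
print(np.argsort(-scores))           # ranking, best image first
```

With equal numbers of comparisons per pair, the recovered ranking follows the total win counts, as expected.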

  6. Image quality assessment in the low quality regime

    NASA Astrophysics Data System (ADS)

    Pinto, Guilherme O.; Hemami, Sheila S.

    2012-03-01

    Traditionally, image quality estimators have been designed and optimized to operate over the entire quality range of images in a database, from very low quality to visually lossless. However, if quality estimation is limited to a smaller quality range, their performance drops dramatically, and many image applications operate only over such a smaller range. This paper is concerned with one such range, the low-quality regime (LQR), defined as the interval of perceived quality scores over which there is a linear relationship between perceived quality scores and perceived utility scores; it lies at the low-quality end of image databases. Using this definition, this paper describes a subjective experiment to determine the low-quality regime for databases of distorted images that include perceived quality scores but not perceived utility scores, such as CSIQ and LIVE. The performance of several image utility and quality estimators is evaluated in the low-quality regime, indicating that utility estimators can be successfully applied to estimate perceived quality in this regime. Omission of the lowest-frequency image content is shown to be crucial to the performance of both kinds of estimators. Additionally, this paper establishes an upper bound on the performance of quality estimators in the LQR, using a family of quality estimators based on VIF. The resulting optimal quality estimator indicates that estimating quality in the low-quality regime is robust to the exact frequency pooling weights, and that near-optimal performance can be achieved by a variety of estimators, provided that they substantially emphasize the appropriate frequency content.

  7. Coherent diffractive imaging: towards achieving atomic resolution.

    PubMed

    Dietze, S H; Shpyrko, O G

    2015-11-01

    The next generation of X-ray sources will feature highly brilliant X-ray beams that will enable the imaging of local nanoscale structures with unprecedented resolution. A general formalism to predict the achievable spatial resolution in coherent diffractive imaging, based solely on diffracted intensities, is provided. The coherent dose necessary to reach atomic resolution depends significantly on the atomic-scale structure: disordered or amorphous materials require a roughly three orders of magnitude lower dose than the expected scaling for uniform-density materials. Additionally, dose reductions for crystalline materials are predicted at certain resolutions based only on their unit-cell dimensions and structure factors. PMID:26524315

  8. Automatic no-reference image quality assessment.

    PubMed

    Li, Hongjun; Hu, Wei; Xu, Zi-Neng

    2016-01-01

    No-reference image quality assessment aims to predict the visual quality of distorted images without examining the original image as a reference. Most no-reference image quality metrics proposed so far are designed for one or a set of predefined specific distortion types and are unlikely to generalize to images degraded by other types of distortion. There is a strong need for no-reference image quality assessment methods that are applicable to various distortions. In this paper, the authors propose a no-reference image quality assessment method based on a natural image statistics model in the wavelet transform domain. A generalized Gaussian density model is employed to summarize the marginal distribution of the wavelet coefficients of the test images, and the fitted model parameters are used to evaluate image quality. The proposed algorithm is tested on three large-scale benchmark databases. Experimental results demonstrate that the proposed algorithm is easy to implement and computationally efficient. Furthermore, our method can be applied to many well-known types of image distortion, and achieves good prediction performance. PMID:27468398
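
Fitting a generalized Gaussian density to coefficient histograms, as described above, is commonly done by moment matching. A minimal sketch (assuming zero-mean samples; the wavelet decomposition itself is omitted, so synthetic samples stand in for subband coefficients):

```python
import math
import numpy as np

def fit_ggd_shape(coeffs):
    """Moment-matching estimate of the GGD shape parameter beta.

    For zero-mean GGD samples, (E|x|)^2 / E[x^2] equals
    rho(beta) = Gamma(2/beta)^2 / (Gamma(1/beta) * Gamma(3/beta)),
    so we invert rho on a grid of candidate betas.
    """
    x = np.asarray(coeffs, dtype=float)
    r = np.mean(np.abs(x)) ** 2 / np.mean(x ** 2)
    betas = np.linspace(0.2, 5.0, 2000)
    rho = np.array([math.gamma(2 / b) ** 2 / (math.gamma(1 / b) * math.gamma(3 / b))
                    for b in betas])
    return float(betas[np.argmin(np.abs(rho - r))])

rng = np.random.default_rng(0)
print(round(fit_ggd_shape(rng.laplace(size=100_000)), 2))  # Laplacian: beta near 1
print(round(fit_ggd_shape(rng.normal(size=100_000)), 2))   # Gaussian: beta near 2
```

Heavier-tailed subbands give smaller beta; natural-image wavelet coefficients typically fit well below beta = 1, which is what distortion shifts away from.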

  9. Achieving Quality Learning in Higher Education.

    ERIC Educational Resources Information Center

    Nightingale, Peggy; O'Neil, Mike

    This volume on quality learning in higher education discusses issues of good practice, particularly action learning and Total Quality Management (TQM)-type strategies, and illustrates them with seven case studies from Australia and the United Kingdom. Chapter 1 discusses issues and problems in defining quality in higher education. Chapter 2 looks at…

  10. Evaluation of image quality

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    This presentation outlines, in viewgraph format, a general approach to the evaluation of display system quality for aviation applications. This approach is based on the assumption that it is possible to develop a model of the display which captures most of its significant properties. The display characteristics should include spatial and temporal resolution, intensity quantizing effects, spatial sampling, delays, etc. The model must be sufficiently well specified to permit generation of stimuli that simulate the output of the display system. The first step in the evaluation of display quality is an analysis of the tasks to be performed using the display. For example, if a display is used by a pilot during a final approach, the aesthetic aspects of the display may be less relevant than its dynamic characteristics; the opposite task requirements may apply to imaging systems used for displaying navigation charts. Thus, display quality is defined with regard to one or more tasks. Given a set of relevant tasks, there are many ways to approach display evaluation, ranging from visual inspection and rapid evaluation to part-task simulation and full mission simulation. The work described here focuses on two complementary approaches to rapid evaluation. The first approach uses a model of the human visual system to predict performance on the selected tasks; this model-based evaluation permits very rapid and inexpensive evaluation of various design decisions. The second rapid evaluation approach employs specifically designed critical tests that embody many important characteristics of actual tasks; these are used in situations where a validated model is not available. These rapid evaluation tests are being implemented in a workstation environment.

  11. Achieving Quality Health Services for Adolescents.

    PubMed

    2016-08-01

    This update of the 2008 statement from the American Academy of Pediatrics redirects the discussion of quality health care from the theoretical to the practical within the medical home. This statement reviews the evolution of the medical home concept and challenges the provision of quality adolescent health care within the patient-centered medical home. Areas of attention for quality adolescent health care are reviewed, including developmentally appropriate care, confidentiality, location of adolescent care, providers who offer such care, the role of research in advancing care, and the transition to adult care. PMID:27432849

  12. Achieving and sustaining quality in healthcare.

    PubMed

    Sr Mary Jean Ryan

    2004-01-01

    SSM Health Care (SSMHC), the first healthcare recipient of the Malcolm Baldrige National Quality Award, has been cited by both Baldrige and the Joint Commission on Accreditation of Healthcare Organizations as having a culture of continuous quality improvement (CQI). SSM Health Care began to implement CQI systemwide in 1990. CQI provided the foundation for other strategies that served to further weave quality improvement into the fabric of the organization's culture. It gave SSMHC's people the tools and techniques to make improvements, created an environment of teamwork, and introduced the concept of improving processes. Using the Baldrige Criteria for Performance Excellence as a business model helped SSMHC to see how various organizational functions should link and to discover gaps in the linkage within its own organization. Baldrige feedback reports identified opportunities that could then be prioritized and the resulting improvements implemented. Overall, the Baldrige model gave a focused approach to what had been scattered improvement efforts. SSM Health Care considers the Baldrige model the best way for an organization to get better faster. PMID:15055826

  13. Achieving indoor air quality through contaminant control

    SciTech Connect

    Katzel, J.

    1995-07-10

    Federal laws outlining industry's responsibilities in creating a healthy, hazard-free workspace are well known. OSHA's rules on interior air pollution establish threshold limit values (TLVs) and permissible exposure limits (PELs) for more than 500 potentially hazardous substances found in manufacturing operations. Until now, OSHA has promulgated regulations only for the manufacturing environment. However, its recently proposed indoor air quality (IAQ) ruling, if implemented, will apply to all workspaces. It regulates IAQ, including environmental tobacco smoke, and requires employers to write and implement IAQ compliance plans.

  14. Image Enhancement, Image Quality, and Noise

    NASA Technical Reports Server (NTRS)

    Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.; Hines, Glenn D.

    2005-01-01

    The Multiscale Retinex with Color Restoration (MSRCR) is a non-linear image enhancement algorithm that provides simultaneous dynamic range compression, color constancy, and rendition. The overall effect is to brighten areas of poor contrast/lightness, but not at the expense of saturating areas of good contrast/brightness. The downside is that, given the poor signal-to-noise ratio that most image acquisition devices exhibit in dark regions, noise can also be greatly enhanced, affecting overall image quality. In this paper, we discuss the impact of the MSRCR on the overall quality of an enhanced image as a function of the strength of shadows in the image, and as a function of the root-mean-square (RMS) signal-to-noise ratio (SNR) of the image.
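
The core retinex operation is a log-ratio of an image against its Gaussian-blurred surround. The sketch below shows a single scale only (a simplifying assumption: the full MSRCR adds multiple scales, weighting, and a color restoration step, and the sigma here is illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(img, sigma=30.0):
    """Single-scale retinex: log(image) minus log of its Gaussian surround.

    Dark regions with local detail are boosted because the surround
    estimates the local illumination that the log-ratio divides out.
    """
    img = np.asarray(img, dtype=float) + 1.0  # offset avoids log(0)
    surround = gaussian_filter(img, sigma)    # local average illumination
    return np.log(img) - np.log(surround)

# A uniform image has no detail to enhance: the output is (near) zero.
flat = np.full((16, 16), 100.0)
print(np.abs(single_scale_retinex(flat, sigma=3.0)).max())
```

This also makes the noise amplification discussed above visible: in a dark, noisy region the surround is small, so the log-ratio magnifies small fluctuations.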

  15. The Relationship of Classroom Quality to Kindergarten Achievement

    ERIC Educational Resources Information Center

    Burson, Susan J.

    2010-01-01

    This quantitative study focuses on the relationship between classroom quality and children's academic achievement. Specifically, it examines how classroom quality in three broad domains (emotional climate, classroom management, and instructional support) impacts kindergarten achievement growth in mathematics and reading. The researcher collected…

  16. Quality assessment for hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Chen, Yuheng; Chen, Xinhua; Zhou, Jiankang; Shen, Weimin

    2014-11-01

    Image quality assessment is an essential value-judgement approach for many applications. Multi- and hyperspectral imaging involves more evaluation criteria than grey-scale or RGB imaging, so its image quality assessment must cover a wider range of factors. This paper presents an integrated spectral imaging quality assessment project in which spectral-based, radiometric-based, and spatial-based statistical evaluations of three hyperspectral imagers are jointly performed. The spectral response function is derived from discrete-illumination images, and spectral performance is assessed from its FWHM and spectral excursion values. The radiometric response of the different spectral channels, under both on-ground and airborne imaging conditions, is judged by SNR computation based on local RMS extraction and statistics. The spatial response of the instruments is evaluated by MTF computation using the slanted-edge method. This systematic work in hyperspectral imaging quality assessment, carried out with the help of several domestic collaborating institutions, is significant for the development of on-ground and in-orbit instrument performance evaluation techniques and also serves as a reference for requirement demonstration and design optimization in instrument development.

  17. Propagation, structural similarity, and image quality

    NASA Astrophysics Data System (ADS)

    Pérez, Jorge; Mas, David; Espinosa, Julián; Vázquez, Carmen; Illueca, Carlos

    2012-06-01

    Retinal image quality is usually analysed through parameters typical of instrumental optics, i.e., the PSF, the MTF, and wavefront aberrations. Although these parameters are important, they are hard to translate into visual quality parameters, since human vision exhibits some tolerance to certain aberrations. This is particularly important in post-surgery eyes, where uncommon aberrations are induced and their effect on final image quality is not clear. Natural images usually show a strong dependency between one point and its neighbourhood. This fact aids image interpretation and should be considered when determining final image quality. The aim of this work is to propose an objective index that allows comparing natural images on the retina and, from them, obtaining relevant information about the visual quality of a particular subject. To this end, we propose individual eye modelling. The morphological data of the subject's eye are considered, and light propagation through the ocular media is calculated by means of a Fourier-transform-based method. The retinal PSF so obtained is convolved with the natural scene under consideration, and the resulting image is compared with the ideal one using the structural similarity index. The technique is applied to two eyes with a multifocal corneal profile (PresbyLASIK) and can be used to determine the real extent of the achieved pseudoaccommodation.

  18. Foveated wavelet image quality index

    NASA Astrophysics Data System (ADS)

    Wang, Zhou; Bovik, Alan C.; Lu, Ligang; Kouloheris, Jack L.

    2001-12-01

    The human visual system (HVS) is highly non-uniform in sampling, coding, processing and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. Currently, most image quality measurement methods are designed for uniform resolution images. These methods do not correlate well with the perceived foveated image quality. Wavelet analysis delivers a convenient way to simultaneously examine localized spatial as well as frequency information. We developed a new image quality metric called foveated wavelet image quality index (FWQI) in the wavelet transform domain. FWQI considers multiple factors of the HVS, including the spatial variance of the contrast sensitivity function, the spatial variance of the local visual cut-off frequency, the variance of human visual sensitivity in different wavelet subbands, and the influence of the viewing distance on the display resolution and the HVS features. FWQI can be employed for foveated region of interest (ROI) image coding and quality enhancement. We show its effectiveness by using it as a guide for optimal bit assignment of an embedded foveated image coding system. The coding system demonstrates very good coding performance and scalability in terms of foveated objective as well as subjective quality measurement.

  19. Achieving quality excellence at the Diablo Canyon Nuclear Power Plant

    SciTech Connect

    Skidmore, S.M.; Taggart, D.A.

    1988-01-01

    Quality assurance methods at the Diablo Canyon plant were transformed from the then typical industry practices that often alienated professional and technical people, as well as craftsmen and their foremen, to a cooperative method that allowed plant personnel to work together as a team. It has created an attitude to do it right the first time. The roles of quality professionals were expanded to include teaching and coaching to facilitate enhanced communication between and within functional organizations. This included regular presentations to managers and line personnel in an informal group participative atmosphere. These presentations have become widely known at the plant as quality awareness tailboard sessions. These presentations are intended to increase personnel sensitivity to the subject of quality and quality management. Economic achievement of excellence in quality is essential to remain competitive in today's marketplace. The proactive team-oriented approach of quality assurance achieves the bottom line of high quality with concurrently enhanced productivity and cost-effectiveness.

  20. Can wavefront coding infrared imaging system achieve decoded images approximating to in-focus infrared images?

    NASA Astrophysics Data System (ADS)

    Feng, Bin; Zhang, Chengshuo; Xu, Baoshu; Shi, Zelin

    2015-11-01

    Artefacts and noise degrade the decoded image of a wavefront coding infrared imaging system, so the decoded image is usually inferior to the in-focus image from a conventional infrared imaging system. A previous letter showed that the decoded image fell short of the in-focus infrared image. For comparison, a bar-target experiment at a temperature of 20°C and two groups of outdoor experiments at temperatures of 28°C and 70°C were conducted. The experimental results show that a wavefront coding infrared imaging system can achieve decoded images approximating the corresponding in-focus infrared images.

  1. Color image attribute and quality measurements

    NASA Astrophysics Data System (ADS)

    Gao, Chen; Panetta, Karen; Agaian, Sos

    2014-05-01

    Color image quality measures have been used for many computer vision tasks. In practical applications, no-reference (NR) measures are desirable because reference images are not always accessible. However, only limited success has been achieved. Most existing NR quality assessments require that the type of image distortion be known a priori. In this paper, three NR color image attributes (colorfulness, sharpness, and contrast) are quantified by new metrics. Using these metrics, a new Color Quality Measure (CQM), based on a linear combination of the three attributes, is presented. We evaluated the performance of several state-of-the-art no-reference measures for comparison purposes. Experimental results demonstrate that the CQM correlates well with evaluations obtained from human observers and operates in real time. The results also show that the presented CQM outperforms previous work in ranking image quality among images with the same or different content. Finally, the performance of the CQM is independent of distortion type, as demonstrated in the experimental results.
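
Colorfulness is the easiest of the three attributes to illustrate. The sketch below uses the well-known Hasler-Suesstrunk opponent-channel statistic as a stand-in (an assumption: the paper defines its own new metrics, which may differ):

```python
import numpy as np

def colorfulness(img):
    """Hasler-Suesstrunk colorfulness for an HxWx3 RGB image in [0, 255].

    A standard no-reference colorfulness attribute, shown here to
    illustrate the idea; not necessarily the paper's exact metric.
    """
    r = img[..., 0].astype(float)
    g = img[..., 1].astype(float)
    b = img[..., 2].astype(float)
    rg = r - g                        # red-green opponent channel
    yb = 0.5 * (r + g) - b            # yellow-blue opponent channel
    std_root = np.hypot(rg.std(), yb.std())
    mean_root = np.hypot(rg.mean(), yb.mean())
    return std_root + 0.3 * mean_root

gray = np.full((64, 64, 3), 128, dtype=np.uint8)  # achromatic: score 0
red = np.zeros((64, 64, 3), dtype=np.uint8)
red[..., 0] = 255                                  # saturated red patch
print(colorfulness(gray) < colorfulness(red))      # True
```

Because it needs no reference image and no knowledge of the distortion, an attribute like this can feed directly into a linear combination such as the CQM described above.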

  2. Lessons learned in WISE image quality

    NASA Astrophysics Data System (ADS)

    Kendall, Martha; Duval, Valerie G.; Larsen, Mark F.; Heinrichsen, Ingolf H.; Esplin, Roy W.; Shannon, Mark; Wright, Edward L.

    2010-08-01

    The Wide-Field Infrared Survey Explorer (WISE) mission, launched in December 2009, is a true success story. The mission is performing beyond expectations on-orbit and maintained cost and schedule throughout. How does such a thing happen? A team constantly focused on mission success is a key factor. Mission success is more than a program meeting its ultimate science goals; it is also meeting schedule and cost goals to avoid cancellation. The WISE program can attribute some of its success in achieving the image quality needed to meet science goals to lessons learned along the way. A requirement was missed in early decomposition, the absence of which would have adversely affected end-to-end system image quality. Fortunately, the ability of the cross-organizational team to focus on fixing the problem without pointing fingers or waiting for paperwork was crucial in achieving a timely solution. Asking layman questions early in the program could have revealed requirement flowdown misunderstandings between spacecraft control stability and image processing needs. Such is the lesson learned with the WISE spacecraft Attitude Determination & Control Subsystem (ADCS) jitter control and the image data reduction needs. Spacecraft motion can affect image quality in numerous ways. Something as seemingly benign as different terminology being used by teammates in separate groups working on data reduction, spacecraft ADCS, the instrument, mission operations, and the science proved to be a risk to system image quality. While the spacecraft was meeting the allocated jitter requirement, the drift rate variation need was not being met. This missing need was noticed about a year before launch and, with a dedicated team effort, an adjustment was made to the spacecraft ADCS control. WISE is meeting all image quality requirements on-orbit thanks to a diligent team noticing something was missing before it was too late and applying their best effort to find a solution.

  3. Video and image quality

    NASA Astrophysics Data System (ADS)

    Aldridge, Jim

    1995-09-01

    This paper presents some of the results of a UK government research program into methods of improving the effectiveness of CCTV surveillance systems. The paper identifies the major components of video security systems and the primary causes of unsatisfactory images. A method is outlined for relating the picture-detail limitations imposed by each system component to overall system performance. The paper also points out some possible difficulties arising from the use of emerging new technology.

  4. Should Achievement Tests Be Used To Judge School Quality?

    ERIC Educational Resources Information Center

    Bauer, Scott C.

    A study was conducted to provide empirical evidence to answer the question of whether student scores on standardized achievement tests represent reasonable measures of instructional quality. Using a research protocol designed by W. Popham and the local study directors, individual test items from a nationally marketed standardized achievement test…

  5. Fovea based image quality assessment

    NASA Astrophysics Data System (ADS)

    Guo, Anan; Zhao, Debin; Liu, Shaohui; Cao, Guangyao

    2010-07-01

    Humans are the ultimate receivers of the visual information contained in an image, so a reasonable method of image quality assessment (IQA) should follow the properties of the human visual system (HVS). In recent years, IQA methods based on HVS models have been slowly replacing classical schemes such as mean squared error (MSE) and peak signal-to-noise ratio (PSNR). Structural similarity (SSIM), regarded as one of the most popular HVS-based full-reference IQA methods, shows clear performance improvements over traditional metrics; however, it does not perform well when an image's structure is seriously degraded or masked by noise. In this paper, a new, efficient fovea-based structural similarity image quality assessment (FSSIM) is proposed. It adaptively enlarges the distortions at attended positions and changes the weights of the three components in SSIM. FSSIM predicts the quality of an image in three steps. First, it computes the luminance, contrast, and structure comparison terms; second, it computes a saliency map by extracting fovea information from the reference image using features of the HVS; third, it pools the three terms according to the processed saliency map. Finally, the widely used LIVE IQA database is used to evaluate the performance of FSSIM. Experimental results indicate that the consistency and relevance between FSSIM and the mean opinion score (MOS) are both clearly better than those of SSIM and PSNR.
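
The three SSIM comparison terms pooled in the first step above can be sketched directly. This minimal version computes them globally over the whole image (an assumption for brevity: standard SSIM evaluates them in local windows and averages, and FSSIM additionally reweights by the fovea-based saliency map):

```python
import numpy as np

def ssim_components(x, y, L=255.0):
    """Global luminance (l), contrast (c), and structure (s) terms of SSIM.

    C1, C2, C3 are the usual stabilizing constants for dynamic range L.
    Identical images yield (1, 1, 1); distortion lowers the terms.
    """
    C1 = (0.01 * L) ** 2
    C2 = (0.03 * L) ** 2
    C3 = C2 / 2.0
    mx, my = x.mean(), y.mean()
    sx, sy = x.std(), y.std()
    sxy = ((x - mx) * (y - my)).mean()
    l = (2 * mx * my + C1) / (mx ** 2 + my ** 2 + C1)
    c = (2 * sx * sy + C2) / (sx ** 2 + sy ** 2 + C2)
    s = (sxy + C3) / (sx * sy + C3)
    return l, c, s

rng = np.random.default_rng(1)
img = rng.integers(0, 256, (32, 32)).astype(float)
noisy = np.clip(img + rng.normal(0.0, 25.0, img.shape), 0.0, 255.0)
print(ssim_components(img, img))    # identical images: all terms equal 1
print(ssim_components(img, noisy))  # noise lowers contrast/structure terms
```

Saliency-weighted pooling then simply replaces the uniform average of per-window l*c*s products with a weighted one.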

  6. Landsat image data quality studies

    NASA Technical Reports Server (NTRS)

    Schueler, C. F.; Salomonson, V. V.

    1985-01-01

    Preliminary results of the Landsat-4 Image Data Quality Analysis (LIDQA) program to characterize the data obtained using the Thematic Mapper (TM) instrument on board the Landsat-4 and Landsat-5 satellites are reported. The obtained data were compared with TM design specifications using four criteria: spatial resolution, geometric fidelity, information content, and image quality relative to Multispectral Scanner (MSS) data. The overall performance of the TM was rated excellent despite minor instabilities and radiometric anomalies in the data. Spatial performance of the TM exceeded design specifications in terms of both image sharpness and geometric accuracy, and the image utility of the TM data was at least twice that of MSS data. The separability of alfalfa and sugar beet fields in a TM image is demonstrated.

  7. Scene reduction for subjective image quality assessment

    NASA Astrophysics Data System (ADS)

    Lewandowska (Tomaszewska), Anna

    2016-01-01

    Evaluation of image quality is important for many image processing systems, such as those used for acquisition, compression, restoration, enhancement, or reproduction. Its measurement is often accompanied by user studies in which a group of observers rank or rate the results of several algorithms. Such user studies, known as subjective image quality assessment experiments, can be very time consuming and do not guarantee conclusive results. This paper is intended to help design an efficient and rigorous quality assessment experiment. We propose a method of limiting the number of scenes that need to be tested, which can significantly reduce the experimental effort while still capturing relevant scene-dependent effects. To achieve this, we employ a clustering technique and evaluate it on the basis of compactness and separation criteria. The correlation between the results obtained from a set of images in an initial database and the results obtained from the reduced experiment is analyzed. Finally, we propose a procedure for reducing the number of initial scenes. Four different assessment techniques were tested: single stimulus, double stimulus, forced choice, and similarity judgments. We conclude that in most cases, 9 to 12 judgments per evaluated algorithm are sufficient to reduce the initial set of images for a large scene collection.
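
    The scene-reduction step described above can be sketched with a plain k-means clustering plus compactness/separation evaluation (an illustrative NumPy sketch, not the author's implementation; the per-scene feature vectors are assumed to be computed elsewhere):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means; X is (n_scenes, n_features), float."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):            # guard against empty clusters
                centers[j] = X[labels == j].mean(0)
    return labels, centers

def compactness_separation(X, labels, centers):
    """Smaller compactness and larger separation indicate a better clustering."""
    compact = np.mean([np.linalg.norm(X[i] - centers[labels[i]])
                       for i in range(len(X))])
    d = np.sqrt(((centers[:, None] - centers[None]) ** 2).sum(-1))
    sep = d[np.triu_indices(len(centers), 1)].min()
    return compact, sep

def representatives(X, labels, centers):
    """One scene per cluster: the member closest to its cluster center."""
    reps = []
    for j in range(len(centers)):
        members = np.where(labels == j)[0]
        reps.append(members[np.argmin(np.linalg.norm(X[members] - centers[j],
                                                     axis=1))])
    return reps
```

    The reduced experiment would then use only the representative scenes, one per cluster.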

  8. Quality metrics for sensor images

    NASA Technical Reports Server (NTRS)

    Ahumada, AL

    1993-01-01

    Methods are needed for evaluating the quality of augmented visual displays (AVID). Computational quality metrics will help summarize, interpolate, and extrapolate the results of human performance tests with displays. The FLM Vision group at NASA Ames has been developing computational models of visual processing and using them to develop computational metrics for similar problems. For example, display modeling systems use metrics for comparing proposed displays, halftone optimization methods use metrics to evaluate the difference between the halftone and the original, and image compression methods minimize the predicted visibility of compression artifacts. The visual discrimination models take as input two arbitrary images A and B and compute an estimate of the probability that a human observer will report that A is different from B. If A is an image that one desires to display and B is the actual displayed image, such an estimate can be regarded as an image quality metric reflecting how well B approximates A. There are additional complexities associated with the problem of evaluating the quality of radar and IR enhanced displays for AVID tasks. One important problem is the question of whether intruding obstacles are detectable in such displays. Although the discrimination model can handle detection situations by making B the original image A plus the intrusion, this detection model makes the inappropriate assumption that the observer knows where the intrusion will be. Effects of signal uncertainty need to be added to our models. A pilot needs to make decisions rapidly. The models need to predict not just the probability of a correct decision, but the probability of a correct decision by the time the decision needs to be made. That is, the models need to predict latency as well as accuracy. Luce and Green have generated models for auditory detection latencies. Similar models are needed for visual detection. Most image quality models are designed for static imagery.

  9. Educational Administration Program Quality and the Impact on Student Achievement

    ERIC Educational Resources Information Center

    Byrd, Jimmy K.; Slater, Robert O.; Brooks, John

    2006-01-01

    The purpose of this study was to determine if there was a connection between quality of education obtained by superintendents in educational administration programs and school effectiveness as measured by student achievement. The best fitting model based on the model deviance test and accounting for the greatest variation in the outcome variable,…

  10. Quantitative statistical methods for image quality assessment.

    PubMed

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148

  12. Learning to rank for blind image quality assessment.

    PubMed

    Gao, Fei; Tao, Dacheng; Gao, Xinbo; Li, Xuelong

    2015-10-01

    Blind image quality assessment (BIQA) aims to predict perceptual image quality scores without access to reference images. State-of-the-art BIQA methods typically require subjects to score a large number of images to train a robust model. However, subjective quality scores are imprecise, biased, and inconsistent, and it is challenging to obtain a large-scale database, or to extend existing databases, because of the inconvenience of collecting images, training the subjects, conducting subjective experiments, and realigning human quality evaluations. To combat these limitations, this paper explores and exploits preference image pairs (PIPs) such as the quality of image Ia is better than that of image Ib for training a robust BIQA model. The preference label, representing the relative quality of two images, is generally precise and consistent, and is not sensitive to image content, distortion type, or subject identity; such PIPs can be generated at a very low cost. The proposed BIQA method is one of learning to rank. We first formulate the problem of learning the mapping from the image features to the preference label as one of classification. In particular, we investigate the utilization of a multiple kernel learning algorithm based on group lasso to provide a solution. A simple but effective strategy to estimate perceptual image quality scores is then presented. Experiments show that the proposed BIQA method is highly effective and achieves a performance comparable with that of state-of-the-art BIQA algorithms. Moreover, the proposed method can be easily extended to new distortion categories. PMID:25616080
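
    The core idea above, learning a quality scorer from preference image pairs, can be sketched with a simple linear model trained on feature differences (a hedged stand-in for the paper's group-lasso multiple kernel learner; the image features are assumed to be extracted elsewhere):

```python
import numpy as np

def train_preference_ranker(feat_better, feat_worse, lr=0.1, epochs=200):
    """Learn a linear quality scorer from preference image pairs (PIPs).

    Each pair states 'image a is better than image b'; we fit w so that
    w . f(a) > w . f(b), i.e. logistic regression on feature differences.
    """
    d = np.asarray(feat_better, float) - np.asarray(feat_worse, float)
    w = np.zeros(d.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-d @ w))   # P(pair is ordered correctly)
        w += lr * d.T @ (1.0 - p) / len(d)  # gradient ascent on log-likelihood
    return w

def quality_score(w, features):
    """Higher score means predicted better perceptual quality."""
    return float(np.asarray(features, float) @ w)
```

    The preference labels never need absolute scores, which is what makes the pairs cheap to generate.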

  13. Quantitative image quality evaluation for cardiac CT reconstructions

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.; Balhorn, William; Okerlund, Darin R.

    2016-03-01

    Maintaining image quality in the presence of motion is always desirable and challenging in clinical Cardiac CT imaging. Different image-reconstruction algorithms are available on current commercial CT systems that attempt to achieve this goal. It is widely accepted that image-quality assessment should be task-based and involve specific tasks, observers, and associated figures of merits. In this work, we developed an observer model that performed the task of estimating the percentage of plaque in a vessel from CT images. We compared task performance of Cardiac CT image data reconstructed using a conventional FBP reconstruction algorithm and the SnapShot Freeze (SSF) algorithm, each at default and optimal reconstruction cardiac phases. The purpose of this work is to design an approach for quantitative image-quality evaluation of temporal resolution for Cardiac CT systems. To simulate heart motion, a moving coronary type phantom synchronized with an ECG signal was used. Three different percentage plaques embedded in a 3 mm vessel phantom were imaged multiple times under motion free, 60 bpm, and 80 bpm heart rates. Static (motion free) images of this phantom were taken as reference images for image template generation. Independent ROIs from the 60 bpm and 80 bpm images were generated by vessel tracking. The observer performed estimation tasks using these ROIs. Ensemble mean square error (EMSE) was used as the figure of merit. Results suggest that the quality of SSF images is superior to the quality of FBP images in higher heart-rate scans.
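
    The figure of merit used above, ensemble mean square error over repeated plaque-percentage estimates, can be computed directly; it decomposes into squared bias plus variance of the estimates (a minimal NumPy sketch):

```python
import numpy as np

def ensemble_mse(estimates, truth):
    """Ensemble mean square error over repeated noisy estimates.

    estimates: plaque-percentage estimates from repeated scans.
    truth: the known plaque percentage of the phantom.
    Lower EMSE means better task performance, i.e. better effective image quality.
    """
    e = np.asarray(estimates, float)
    return float(np.mean((e - truth) ** 2))

def bias_variance(estimates, truth):
    """EMSE decomposition: EMSE = bias^2 + variance."""
    e = np.asarray(estimates, float)
    bias = e.mean() - truth
    return bias ** 2, float(e.var())
```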

  14. Retinal Image Quality during Accommodation in Adult Myopic Eyes

    PubMed Central

    Sreenivasan, Vidhyapriya; Aslakson, Emily; Kornaus, Andrew; Thibos, Larry N.

    2014-01-01

    Purpose Reduced retinal image contrast produced by accommodative lag is implicated with myopia development. Here, we measure accommodative error and retinal image quality from wavefront aberrations in myopes and emmetropes when they perform visually demanding and naturalistic tasks. Methods Wavefront aberrations were measured in 10 emmetropic and 11 myopic adults at three distances (100, 40, and 20 cm) while performing four tasks (monocular acuity, binocular acuity, reading, and movie watching). For the acuity tasks, measurements of wavefront error were obtained near the end point of the acuity experiment. Refractive state was defined as the target vergence that optimizes image quality using a visual contrast metric (VSMTF) computed from wavefront errors. Results Accommodation was most accurate (and image quality best) during binocular acuity whereas accommodation was least accurate (and image quality worst) while watching a movie. When viewing distance was reduced, accommodative lag increased and image quality (as quantified by VSMTF) declined for all tasks in both refractive groups. For any given viewing distance, computed image quality was consistently worse in myopes than in emmetropes, more so for the acuity than for reading/movie watching. Although myopes showed greater lags and worse image quality for the acuity experiments compared to emmetropes, acuity was not measurably worse in myopes compared to emmetropes. Conclusions Retinal image quality present when performing a visually demanding task (e.g., during clinical examination) is likely to be greater than for less demanding tasks (e.g., reading/movie watching). Although reductions in image quality lead to reductions in acuity, the image quality metric VSMTF is not necessarily an absolute indicator of visual performance because myopes achieved slightly better acuity than emmetropes despite showing greater lags and worse image quality. Reduced visual contrast in myopes compared to emmetropes is consistent

  15. Improving Secondary Ion Mass Spectrometry Image Quality with Image Fusion

    PubMed Central

    Tarolli, Jay G.; Jackson, Lauren M.; Winograd, Nicholas

    2014-01-01

    The spatial resolution of chemical images acquired with cluster secondary ion mass spectrometry (SIMS) is limited not only by the size of the probe utilized to create the images, but also by detection sensitivity. As the probe size is reduced to below 1 µm, for example, a low signal in each pixel limits lateral resolution due to counting statistics considerations. Although it can be useful to implement numerical methods to mitigate this problem, here we investigate the use of image fusion to combine information from scanning electron microscope (SEM) data with chemically resolved SIMS images. The advantage of this approach is that the higher intensity and, hence, spatial resolution of the electron images can help to improve the quality of the SIMS images without sacrificing chemical specificity. Using a pan-sharpening algorithm, the method is illustrated using synthetic data, experimental data acquired from a metallic grid sample, and experimental data acquired from a lawn of algae cells. The results show that up to an order of magnitude increase in spatial resolution is possible to achieve. A cross-correlation metric is utilized for evaluating the reliability of the procedure. PMID:24912432
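
    A ratio-based pan-sharpening step of the kind described above can be sketched as follows (a generic Brovey-style sketch, not the authors' exact algorithm; the SIMS map is assumed to be already registered and upsampled to the SEM grid):

```python
import numpy as np

def pan_sharpen(sims_lowres, sem_highres):
    """Fuse a low-resolution SIMS chemical map with a high-resolution SEM image.

    sims_lowres: (h, w) chemical image (counts), on the SEM pixel grid.
    sem_highres: (h, w) electron image supplying high-frequency spatial detail.
    Ratio sharpening: modulate the chemical map by the ratio of the SEM image
    to its local mean, so chemical specificity is preserved while edge detail
    comes from the SEM data.
    """
    k = 5                                   # box-filter size for the local mean
    pad = k // 2
    padded = np.pad(sem_highres.astype(float), pad, mode="edge")
    h, w = sem_highres.shape
    local_mean = np.zeros((h, w), dtype=float)
    for i in range(k):                      # plain 5x5 box filter
        for j in range(k):
            local_mean += padded[i:i + h, j:j + w]
    local_mean /= k * k
    ratio = sem_highres / np.maximum(local_mean, 1e-9)
    return sims_lowres * ratio
```

    Where the SEM image is locally flat, the ratio is 1 and the chemical map passes through unchanged; near SEM edges, the ratio imprints the high-resolution structure.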

  16. Image quality assessment for CT used on small animals

    NASA Astrophysics Data System (ADS)

    Cisneros, Isabela Paredes; Agulles-Pedrós, Luis

    2016-07-01

    Image acquisition with a CT scanner is nowadays necessary in almost every kind of medical study. Its purpose, producing anatomical images with the best achievable quality, implies the highest diagnostic radiation exposure to patients. Image quality can be measured quantitatively from parameters such as noise, uniformity, and resolution; this measurement allows the determination of the scanner's optimal operating parameters for the best diagnostic image. A Philips human CT scanner is the first in Colombia intended exclusively for veterinary use. The aim of this study was to measure the CT image quality parameters using an acrylic phantom and then, using the computational tool MATLAB, determine these parameters as a function of current value and visualization window, in order to reduce delivered dose while keeping appropriate image quality.
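
    The noise and uniformity measurements on a phantom image can be sketched as ROI statistics (an illustrative NumPy sketch; the ROI placement and radius are assumptions, not the study's protocol):

```python
import numpy as np

def roi_stats(image, center, radius):
    """Mean and standard deviation inside a circular ROI."""
    yy, xx = np.indices(image.shape)
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    vals = image[mask]
    return float(vals.mean()), float(vals.std())

def noise_and_uniformity(image, center_roi, edge_rois, radius=10):
    """Noise = std of the central ROI; uniformity = max |edge mean - center mean|.

    Mirrors the usual acrylic-phantom protocol: one ROI at the center and
    several near the edges of the phantom.
    """
    c_mean, c_std = roi_stats(image, center_roi, radius)
    deviations = [abs(roi_stats(image, r, radius)[0] - c_mean)
                  for r in edge_rois]
    return c_std, max(deviations)
```

    Repeating this over a series of tube-current settings gives noise as a function of mAs, from which an acceptable dose/quality trade-off can be chosen.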

  17. Assessing product image quality for online shopping

    NASA Astrophysics Data System (ADS)

    Goswami, Anjan; Chung, Sung H.; Chittar, Naren; Islam, Atiq

    2012-01-01

    Assessing product-image quality is important in the context of online shopping. A high-quality image that conveys more information about a product can boost the buyer's confidence and attract more attention. However, the notion of image quality for product images is not the same as in other domains. The perceived quality of product images depends not only on various photographic quality features but also on high-level features such as clarity of the foreground or goodness of the background. In this paper, we define a notion of product-image quality based on such features. We conduct a crowd-sourced experiment to collect user judgments on thousands of eBay's images. We formulate a multi-class classification problem for modeling image quality by classifying images into good, fair, and poor quality based on the guided perceptual notions from the judges. We also conduct regression experiments using average crowd-sourced human judgments as the target, computing a pseudo-regression score as the expected average of the predicted classes as well as a score from the regression technique. We design many experiments with various sampling and voting schemes on the crowd-sourced data and construct various experimental image quality models. Most of our models have reasonable accuracy (greater than or equal to 70%) on the test data set. We observe that our computed image quality score has a high rank correlation (0.66) with average votes from the crowd-sourced human judgments.

  18. Infrared image quality evaluation method without reference image

    NASA Astrophysics Data System (ADS)

    Yue, Song; Ren, Tingting; Wang, Chengsheng; Lei, Bo; Zhang, Zhijie

    2013-09-01

    Since infrared image quality depends on many factors, such as the optical performance and electrical noise of the thermal imager, image quality evaluation is an important issue that can benefit both subsequent image processing and improvement of thermal imager capability. There are two ways to evaluate infrared image quality: with or without a reference image. For real-time thermal images, the method without a reference image is preferred, because it is difficult to obtain a standard image. Although various kinds of evaluation methods exist, there is no general metric for image quality evaluation. This paper introduces a novel method to evaluate infrared images without a reference image from five aspects: noise, clarity, information volume and levels, information in the frequency domain, and the capability of automatic target recognition. Generally, the basic image quality is obtained from the first four aspects, and the quality of the target is acquired from the last. The proposed method is tested on several infrared images captured by different thermal imagers; the indicators are calculated and compared with human vision results. The evaluation shows that this method successfully describes the characteristics of infrared images and that the result is consistent with the human visual system.
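
    Three of the five aspects (noise, clarity, information volume) can be approximated with simple no-reference statistics (an illustrative NumPy sketch; these are generic formulations, not necessarily the paper's exact definitions):

```python
import numpy as np

def entropy(image, bins=256):
    """Information volume: Shannon entropy of the gray-level histogram (bits)."""
    hist, _ = np.histogram(image, bins=bins, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def clarity(image):
    """Clarity: mean gradient magnitude; sharper edges give larger values."""
    img = image.astype(float)
    gy, gx = np.gradient(img)
    return float(np.mean(np.hypot(gx, gy)))

def noise_estimate(image):
    """Rough noise estimate: std of the residual after a 3x3 mean filter."""
    img = image.astype(float)
    padded = np.pad(img, 1, mode="edge")
    smooth = sum(padded[i:i + img.shape[0], j:j + img.shape[1]]
                 for i in range(3) for j in range(3)) / 9.0
    return float((img - smooth).std())
```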

  19. Achieving adequate BMPs for stormwater quality management

    SciTech Connect

    Jones-Lee, A.; Lee, G.F.

    1994-12-31

    There is considerable controversy about the technical appropriateness and the cost-effectiveness of requiring cities to control contaminants in urban stormwater discharges to meet state water quality standards equivalent to US EPA numeric chemical water quality criteria. At this time, and likely for the next 10 years, urban stormwater discharges will be exempt from regulation to achieve state water quality standards in receiving waters, owing to the high cost to cities of managing contaminants in stormwater runoff-discharge so as to prevent exceedances of water quality standards in the receiving waters. Instead of requiring the same degree of contaminant control for stormwater discharges as is required for point-source discharges of municipal and industrial wastewaters, those responsible for urban stormwater discharges will have to implement Best Management Practices (BMPs) for contaminant control. The recommended approach for implementation of BMPs involves the use of site-specific evaluations of what, if any, real problems (use impairment) are caused by stormwater-associated contaminants in the waters receiving that stormwater discharge. From this type of information, BMPs can then be developed to control those contaminants in stormwater discharges that are, in fact, impairing the beneficial uses of receiving waters.

  20. Aerial image retargeting (AIR): achieving litho-friendly designs

    NASA Astrophysics Data System (ADS)

    Yehia Hamouda, Ayman; Word, James; Anis, Mohab; Karim, Karim S.

    2011-04-01

    In this work, we present a new technique to detect non-litho-friendly design areas based on their aerial image signature. The aerial image is calculated for the litho target (pre-OPC). This is followed by fixing (retargeting) the design to achieve a litho-friendly OPC target. The technique was applied and tested on a 28 nm metal layer and shows a large improvement in process window performance. An optimized Aerial Image Retargeting (AIR) recipe is very computationally efficient: its runtime consumes less than 1% of the OPC flow runtime.

  1. Exploring High-Achieving Students' Images of Mathematicians

    ERIC Educational Resources Information Center

    Aguilar, Mario Sánchez; Rosas, Alejandro; Zavaleta, Juan Gabriel Molina; Romo-Vázquez, Avenilde

    2016-01-01

    The aim of this study is to describe the images that a group of high-achieving Mexican students hold of mathematicians. For this investigation, we used a research method based on the Draw-A-Scientist Test (DAST) with a sample of 63 Mexican high school students. The group of students' pictorial and written descriptions of mathematicians assisted us…

  2. An Underwater Color Image Quality Evaluation Metric.

    PubMed

    Yang, Miao; Sowmya, Arcot

    2015-12-01

    Quality evaluation of underwater images is a key goal of underwater video image retrieval and intelligent processing. To date, no metric has been proposed for underwater color image quality evaluation (UCIQE). The special absorption and scattering characteristics of the water medium do not allow direct application of natural color image quality metrics, especially across different underwater environments. In this paper, subjective testing for underwater image quality was organized. The statistical distribution of underwater image pixels in the CIELab color space, related to subjective evaluation, indicates that sharpness and colorfulness correlate well with subjective image quality perception. Based on these findings, a new UCIQE metric, a linear combination of chroma, saturation, and contrast, is proposed to quantify the non-uniform color cast, blurring, and low contrast that characterize underwater engineering and monitoring images. Experiments are conducted to illustrate the performance of the proposed UCIQE metric and its capability to measure underwater image enhancement results. They show that the proposed metric has performance comparable to the leading natural color image quality metrics and the underwater grayscale image quality metrics available in the literature, and can predict with higher accuracy the relative amount of degradation for similar image content in underwater environments. Importantly, UCIQE is a simple and fast solution for real-time underwater video processing. The effectiveness of the presented measure is also demonstrated by subjective evaluation: the results show good correlation between UCIQE and the subjective mean opinion score. PMID:26513783
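
    A metric of this linear-combination form can be sketched as follows (the coefficients shown are the commonly cited UCIQE weights; the contrast and saturation formulations here are generic choices and may differ in detail from the paper's definitions, and conversion to CIELab is assumed to be done beforehand):

```python
import numpy as np

def uciqe(lab_image, c1=0.4680, c2=0.2745, c3=0.2576):
    """UCIQE-style score: chroma std + luminance contrast + mean saturation.

    lab_image: (h, w, 3) array in CIELab (L in [0, 100], a/b centered on 0).
    """
    L = lab_image[..., 0].astype(float)
    a = lab_image[..., 1].astype(float)
    b = lab_image[..., 2].astype(float)
    chroma = np.hypot(a, b)
    sigma_c = chroma.std()                                 # colorfulness spread
    con_l = np.percentile(L, 99) - np.percentile(L, 1)     # luminance contrast
    # per-pixel saturation, bounded in [0, 1] (a generic formulation)
    saturation = chroma / np.maximum(np.hypot(chroma, L), 1e-9)
    mu_s = saturation.mean()
    return float(c1 * sigma_c + c2 * con_l + c3 * mu_s)
```

    A grayscale image (a = b = 0) scores only on the contrast term, while a colorful image with the same luminance always scores higher.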

  3. Wavelet based image quality self measurements

    NASA Astrophysics Data System (ADS)

    Al-Jawad, Naseer; Jassim, Sabah

    2010-04-01

    Noise is generally considered a degradation in image quality; moreover, image quality is often judged from the appearance and clarity of image edges. The performance of most applications is affected by image quality and by the level of different types of degradation, so measuring image quality and identifying the type of noise or degradation is a key factor in raising application performance, and can be very challenging. The wavelet transform is nowadays widely used in different applications, which mostly benefit from its localization in the frequency domain. The coefficients of the high-frequency sub-bands in the wavelet domain follow a Laplace histogram. In this paper we propose using the Laplace distribution histogram to measure image quality and also to identify the type of degradation affecting a given image. Image quality and the level of degradation are usually measured against a reference image of reasonable quality; the Laplace distribution histogram instead provides a self-testing quality measurement. The measurement constructs the theoretical Laplace distribution histogram of a high-frequency wavelet sub-band from the sub-band's actual standard deviation, and then compares it with the empirical Laplace distribution histogram using the histogram intersection method. All experiments were performed using the extended Yale database.
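
    The self-measurement idea, comparing a sub-band's empirical histogram against the Laplace histogram predicted from its own standard deviation, can be sketched as follows (an illustrative NumPy sketch; the wavelet decomposition producing the sub-band coefficients is assumed to be done elsewhere):

```python
import numpy as np

def laplace_histogram(coeffs, bins):
    """Theoretical Laplace histogram matching the sub-band's standard deviation."""
    b = coeffs.std() / np.sqrt(2.0)        # Laplace scale from the std
    centers = 0.5 * (bins[:-1] + bins[1:])
    pdf = np.exp(-np.abs(centers) / b) / (2.0 * b)
    hist = pdf * np.diff(bins)
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    """In [0, 1]; 1 means the two normalized histograms coincide."""
    return float(np.minimum(h1, h2).sum())

def subband_quality(coeffs, n_bins=64):
    """Self-measure: how closely a high-frequency wavelet sub-band follows
    the Laplace shape predicted from its own standard deviation."""
    lim = max(abs(coeffs.min()), abs(coeffs.max()), 1e-9)
    bins = np.linspace(-lim, lim, n_bins + 1)
    emp, _ = np.histogram(coeffs, bins=bins)
    emp = emp / emp.sum()
    return histogram_intersection(emp, laplace_histogram(coeffs, bins))
```

    Coefficients that really are Laplace-distributed score near 1; degradations that reshape the sub-band histogram (e.g. heavy blur or clipping) lower the intersection.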

  4. Process perspective on image quality evaluation

    NASA Astrophysics Data System (ADS)

    Leisti, Tuomas; Halonen, Raisa; Kokkonen, Anna; Weckman, Hanna; Mettänen, Marja; Lensu, Lasse; Ritala, Risto; Oittinen, Pirkko; Nyman, Göte

    2008-01-01

    The psychological complexity of multivariate image quality evaluation makes it difficult to develop general image quality metrics. Quality evaluation includes several mental processes, and ignoring these processes and using only a few test images can lead to biased results. Using a qualitative/quantitative (Interpretation Based Quality, IBQ) methodology, we examined the process of pair-wise comparison in a setting where the quality of images printed by a laser printer on different paper grades was evaluated. The test image consisted of a picture of a table covered with several objects. Three other images were also used: photographs of a woman, a cityscape, and countryside. In addition to the pair-wise comparisons, observers (N=10) were interviewed about the subjective quality attributes they used in making their quality decisions. An examination of the individual pair-wise comparisons revealed serious inconsistencies in observers' evaluations of the test image content, but not of the other contents. The qualitative analysis showed that this inconsistency was due to the observers' focus of attention. The lack of an easily recognizable context in the test image may have contributed to this inconsistency. To obtain reliable knowledge of the effect of image context or attention on subjective image quality, a qualitative methodology is needed.

  5. JPEG2000 still image coding quality.

    PubMed

    Chen, Tzong-Jer; Lin, Sheng-Chieh; Lin, You-Chen; Cheng, Ren-Gui; Lin, Li-Hui; Wu, Wei

    2013-10-01

    This work compares the image quality produced by two popular JPEG2000 programs. The two medical image compression implementations are both coded using JPEG2000, but they differ in interface, convenience, speed of computation, and in characteristic options influenced by the encoder, quantization, tiling, etc. The differences in image quality and compression ratio are also affected by the modality and the compression algorithm implementation. Do they provide the same quality? The quality of compressed medical images from the two image compression programs, Apollo and JJ2000, was evaluated extensively using objective metrics. The algorithms were applied to three medical image modalities at compression ratios ranging from 10:1 to 100:1, and the quality of the reconstructed images was then evaluated using five objective metrics. The Spearman rank correlation coefficients were measured under every metric for the two programs. We found that JJ2000 and Apollo exhibited indistinguishable image quality for all images evaluated using the above five metrics (r > 0.98, p < 0.001). It can be concluded that the image quality of the JJ2000 and Apollo algorithms is statistically equivalent for medical image compression. PMID:23589187
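
    Two of the objective tools mentioned above, PSNR and the Spearman rank correlation, can be sketched directly (generic formulations; the study's full set of five metrics is not reproduced here):

```python
import numpy as np

def psnr(original, compressed, peak=255.0):
    """Peak signal-to-noise ratio in dB between an original and a reconstruction."""
    mse = np.mean((original.astype(float) - compressed.astype(float)) ** 2)
    return float("inf") if mse == 0 else float(10 * np.log10(peak ** 2 / mse))

def spearman_rho(x, y):
    """Spearman rank correlation between two score lists (no ties assumed)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum()))
```

    Comparing two codecs then reduces to computing each metric over the same reconstructions and checking how strongly the two score lists rank-correlate.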

  6. Cognitive issues in image quality measurement

    NASA Astrophysics Data System (ADS)

    de Ridder, Huib

    2001-01-01

    Designers of imaging systems, image processing algorithms, etc., usually take for granted that methods for assessing perceived image quality produce unbiased estimates of the viewers' quality impression. Quality judgments, however, are affected by the judgment strategies induced by the experimental procedures. In this paper, the results of two experiments are presented that illustrate the influence judgment strategies can have on quality judgments. The first experiment concerns contextual effects due to the composition of the stimulus sets. Subjects assessed the sharpness of two differently composed sets of blurred versions of one static image. The sharpness judgments for the blurred images present in both stimulus sets were found to depend on the composition of the set as well as on the scaling technique employed. In the second experiment, subjects assessed either the overall quality or the overall impairment of manipulated and standard JPEG-coded images containing two main artifacts. The results indicate a systematic difference between the quality and impairment judgments that could be interpreted as instruction-based differential weighting of the two artifacts. Again, some influence of scaling technique was observed. The results of both experiments underscore the important role judgment strategies play in the psychophysical evaluation of image quality. Ignoring this influence on quality judgments may lead to invalid conclusions about the viewers' impression of image quality.

  7. Phase congruency assesses hyperspectral image quality

    NASA Astrophysics Data System (ADS)

    Shao, Xiaopeng; Zhong, Cheng

    2012-10-01

    Blind image quality assessment (QA) is a difficult task, especially for hyperspectral imagery degraded by noise, distortion, defocus, and other complex factors. Subjective hyperspectral imagery QA methods essentially measure image degradation in terms of human perceptual visual quality. Noise and blur, the image quality features that most strongly determine quality, are employed to predict the objective quality of each band. We demonstrate a novel no-reference hyperspectral imagery QA model based on phase congruency (PC), a dimensionless quantity that provides an absolute measure of the significance of feature points. First, a log-Gabor wavelet is used to calculate the phase congruency of the frequencies of each band image; under the assumption that noise is additive, the relationship between noise and PC can be derived from this transformation. Second, a PC focus-measure evaluation model is proposed to evaluate blur caused by different amounts of defocus. The ratio and mean factors of edge blur level and noise are defined to assess the quality of each band image. This image QA method obtains excellent correlation with subjective image quality scores without any reference. Finally, the PC information is used to improve the quality of some band images.
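
    A radial log-Gabor filter of the kind used for phase-congruency computation can be sketched in the frequency domain (an illustrative NumPy sketch; the center frequency and bandwidth are assumed example values, and the full phase-congruency computation over multiple scales/orientations with noise compensation is omitted):

```python
import numpy as np

def log_gabor_radial(size, f0, sigma_ratio=0.55):
    """Radial log-Gabor transfer function on an FFT frequency grid.

    f0: center frequency (cycles/pixel); sigma_ratio: bandwidth parameter.
    Log-Gabor filters have zero DC response, which is why they are the
    usual basis for phase-congruency computations.
    """
    fy = np.fft.fftfreq(size[0])
    fx = np.fft.fftfreq(size[1])
    f = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    f[0, 0] = 1.0                       # avoid log(0); DC is zeroed below
    g = np.exp(-(np.log(f / f0) ** 2) / (2 * np.log(sigma_ratio) ** 2))
    g[0, 0] = 0.0                       # zero DC gain
    return g

def filter_band(image, f0):
    """Even/odd (real/imaginary) responses for one log-Gabor scale."""
    G = log_gabor_radial(image.shape, f0)
    resp = np.fft.ifft2(np.fft.fft2(image.astype(float)) * G)
    return resp.real, resp.imag
```

    Phase congruency at a pixel is then computed from the consistency of the even/odd response phases across scales, with a noise threshold subtracted; that step is where the noise-PC relationship mentioned above enters.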

  8. Quality Science Teacher Professional Development and Student Achievement

    NASA Astrophysics Data System (ADS)

    Dubner, J.

    2007-12-01

    Studies show that socio-economic background and parental education account for 50-60 percent of a child's achievement in school. School and other influences account for the remaining 40-50 percent. In contrast to most other professions, schools require no real apprenticeship training of science teachers. Overall, only 38 percent of United States teachers have had any on-the-job training in their first teaching position, and in some cases this consisted of a few meetings over the course of a year between the beginning teacher and the assigned mentor or master teacher. Since individual teachers determine the bulk of a student's school experiences, interventions focused on teachers have the greatest likelihood of affecting students. To address this deficiency, partnerships between scientists and K-12 teachers are increasingly recognized as an excellent method for improving teacher preparedness and the quality of science education. The basic premise of Columbia University's Summer Research Program for Science Teachers (founded in 1990) is simple: teachers cannot effectively teach science if they have no firsthand experience doing science, hence the Program's motto, "Practice what you teach." Columbia University's Summer Research Program for Science Teachers provides strong evidence that a teacher research program is a very effective form of professional development for secondary school science teachers and has a direct correlation to increased student achievement in science. The author will present the methodology of the program's evaluation, citing statistically significant data. The author will also show the economic benefits of teacher participation in this form of professional development.

  9. Retinal image quality assessment using generic features

    NASA Astrophysics Data System (ADS)

    Fasih, Mahnaz; Langlois, J. M. Pierre; Ben Tahar, Houssem; Cheriet, Farida

    2014-03-01

    Retinal image quality assessment is an important step in automated eye disease diagnosis. Diagnosis accuracy is highly dependent on the quality of retinal images, because poor image quality might prevent the observation of significant eye features and disease manifestations. A robust algorithm is therefore required in order to evaluate the quality of images in a large database. We developed an algorithm for retinal image quality assessment based on generic features that is independent of segmentation methods. It exploits local sharpness and texture features by applying the cumulative probability of blur detection metric and the run-length encoding algorithm, respectively. The quality features are combined to evaluate the image's suitability for diagnostic purposes. Based on the recommendations of medical experts and our experience, we compared a global and a local approach. A support vector machine with radial basis functions was used as a nonlinear classifier in order to classify images into gradable and ungradable groups. We applied our methodology to 65 images of size 2592×1944 pixels that had been graded by a medical expert. The expert evaluated 38 images as gradable and 27 as ungradable. The results indicate very good agreement between the proposed algorithm's predictions and the medical expert's judgment: the sensitivity and specificity for the local approach are respectively 92% and 94%. The algorithm demonstrates sufficient robustness to identify relevant images for automated diagnosis.
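
    As a hedged illustration of the run-length texture cue mentioned above, the sketch below quantizes an image to a few gray levels, collects horizontal run lengths, and computes the classic short-run-emphasis statistic. The function names and the four-level quantization are illustrative assumptions, not the paper's exact feature set.

```python
import numpy as np

def horizontal_run_lengths(img, levels=4):
    """Quantize to `levels` gray levels and collect horizontal run lengths.
    Smooth or blurred regions yield long runs; fine texture yields short ones."""
    q = np.clip((np.asarray(img, float) / 256.0 * levels).astype(int), 0, levels - 1)
    runs = []
    for row in q:
        length = 1
        for a, b in zip(row[:-1], row[1:]):
            if a == b:
                length += 1
            else:
                runs.append(length)
                length = 1
        runs.append(length)
    return runs

def short_run_emphasis(runs):
    """Classic run-length statistic: near 1 for fine texture, small for smooth areas."""
    runs = np.asarray(runs, float)
    return float(np.sum(1.0 / runs ** 2) / len(runs))
```

    In a pipeline like the one described, such statistics would be concatenated with a sharpness feature and passed to an RBF-kernel classifier.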

  10. Seven challenges for image quality research

    NASA Astrophysics Data System (ADS)

    Chandler, Damon M.; Alam, Md M.; Phan, Thien D.

    2014-02-01

    Image quality assessment has been a topic of recent intense research due to its usefulness in a wide variety of applications. Owing in large part to efforts within the HVEI community, image-quality research has particularly benefited from improved models of visual perception. However, over the last decade, research in image quality has largely shifted from the previous broader objective of gaining a better understanding of human vision to the current limited objective of better fitting the available ground-truth data. In this paper, we discuss seven open challenges in image quality research. These challenges stem from the lack of complete perceptual models for: natural images; suprathreshold distortions; interactions between distortions and images; images containing multiple and nontraditional distortions; and images containing enhancements. We also discuss challenges related to computational efficiency. The objective of this paper is not only to highlight the limitations in our current knowledge of image quality, but to also emphasize the need for additional fundamental research in quality perception.

  11. Combined terahertz imaging system for enhanced imaging quality

    NASA Astrophysics Data System (ADS)

    Dolganova, Irina N.; Zaytsev, Kirill I.; Metelkina, Anna A.; Yakovlev, Egor V.; Karasik, Valeriy E.; Yurchenko, Stanislav O.

    2016-06-01

    An improved terahertz (THz) imaging system is proposed for enhanced image quality. The imaging scheme includes a THz source and a detection system operated in both active and passive modes. A THz reshaper is proposed to illuminate the object plane homogeneously. The form and internal structure of the reshaper were studied by numerical simulation. Using different test objects, we compare imaging quality in the active and passive THz imaging modes. The imaging contrast and modulation transfer functions of the active and passive modes reveal their respective drawbacks at high and low spatial frequencies. The experimental results confirm the benefit of combining both imaging modes into a hybrid one. The proposed algorithm for forming a hybrid THz image is an effective approach to retrieving maximum information about the remote object.

  12. Optimization of synthetic aperture image quality

    NASA Astrophysics Data System (ADS)

    Moshavegh, Ramin; Jensen, Jonas; Villagomez-Hoyos, Carlos A.; Stuart, Matthias B.; Hemmsen, Martin Christian; Jensen, Jørgen Arendt

    2016-04-01

    Synthetic Aperture (SA) imaging produces high-quality images and velocity estimates of both slow and fast flow at high frame rates. However, grating lobe artifacts can appear in both transmission and reception. These affect the image quality and the frame rate. Optimization of the parameters affecting the image quality of SA is therefore of great importance, and this paper proposes an advanced procedure for optimizing the parameters essential for acquiring optimal image quality while generating high-resolution SA images. Optimization of the image quality is mainly performed based on measures such as the F-number, the number of emissions, and the aperture size. They are considered to be the acquisition factors that contribute most to the quality of the high-resolution images in SA. The image quality performance is therefore quantified in terms of the full width at half maximum (FWHM) and the cystic resolution (CTR). The results of the study showed that SA imaging with only 32 emissions and a maximum sweep angle of 22 degrees yields very good image quality compared with using 256 emissions and the full aperture size. The number of emissions and the maximum sweep angle in SA can therefore be optimized to reach reasonably good performance and to increase the frame rate by lowering the required number of emissions. All measurements are performed using the experimental SARUS scanner connected to a λ/2-pitch transducer. A wire phantom and a tissue-mimicking phantom containing anechoic cysts are scanned using the optimized parameters for the transducer. Measurements coincide with simulations.

  13. Image quality based x-ray dose control in cardiac imaging

    NASA Astrophysics Data System (ADS)

    Davies, Andrew G.; Kengyelics, Stephen M.; Gislason-Lee, Amber J.

    2015-03-01

    An automated closed-loop dose control system balances the radiation dose delivered to patients and the quality of images produced in cardiac x-ray imaging systems. Using computer simulations, this study compared two designs of automatic x-ray dose control in terms of the radiation dose and quality of images produced. The first design, common in x-ray systems today, maintained a constant dose rate at the image receptor. The second design maintained a constant image quality in the output images. A computer model represented patients as a polymethylmethacrylate phantom (which has similar x-ray attenuation to soft tissue), containing a detail representative of an artery filled with contrast medium. The model predicted the entrance surface dose to the phantom and the contrast-to-noise ratio of the detail as an index of image quality. Results showed that for the constant dose control system, phantom dose increased substantially with phantom size (a fivefold increase between 20 cm and 30 cm thick phantoms), yet the image quality decreased by 43% over the same range. For the constant quality control, phantom dose increased at a greater rate with phantom thickness (more than a tenfold increase between 20 cm and 30 cm phantoms). Image quality based dose control could tailor the x-ray output to just achieve the quality required, which would reduce dose to patients in cases where the current dose control produces images of unnecessarily high quality. However, maintaining higher levels of image quality for large patients would result in a significant dose increase over current practice.
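
    The contrast-to-noise index and the quantum-limited dose model underlying the two control designs can be sketched as follows. This is a simplified model assuming noise scales as 1/sqrt(dose); the function names are hypothetical, not the study's simulation code.

```python
import numpy as np

def contrast_to_noise_ratio(detail, background):
    """CNR = |mean(detail) - mean(background)| / std(background)."""
    detail = np.asarray(detail, float)
    background = np.asarray(background, float)
    return float(abs(detail.mean() - background.mean()) / background.std())

def dose_for_target_cnr(cnr_at_reference_dose, target_cnr, reference_dose=1.0):
    """Quantum-limited model: noise ~ 1/sqrt(dose), so CNR ~ sqrt(dose).
    Solving CNR(d) = CNR(d0) * sqrt(d / d0) for d gives the dose below."""
    return reference_dose * (target_cnr / cnr_at_reference_dose) ** 2
```

    Under this model, doubling the required CNR costs four times the dose, which is consistent with the steep dose growth reported for the constant-quality design on thick phantoms.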

  14. Automatic quality assessment of planetary images

    NASA Astrophysics Data System (ADS)

    Sidiropoulos, P.; Muller, J.-P.

    2015-10-01

    A significant fraction of planetary images are corrupted beyond the point that much scientific meaning can be extracted. For example, transmission errors result in missing data which is unrecoverable. The available planetary image datasets include many such "bad data", which both occupy valuable scientific storage resources and create false impressions about planetary image availability for specific planetary objects or target areas. In this work, we demonstrate a pipeline that we have developed to automatically assess the quality of planetary images. Additionally, this method discriminates between different types of image degradation, such as low quality originating from camera flaws or low quality triggered by atmospheric conditions, etc. Examples of quality assessment results for Viking Orbiter imagery will also be presented.

  15. Does High School Facility Quality Affect Student Achievement? A Two-Level Hierarchical Linear Model

    ERIC Educational Resources Information Center

    Bowers, Alex J.; Urick, Angela

    2011-01-01

    The purpose of this study is to isolate the independent effects of high school facility quality on student achievement using a large, nationally representative U.S. database of student achievement and school facility quality. Prior research on linking school facility quality to student achievement has been mixed. Studies that relate overall…

  16. Perceptual image quality: Effects of tone characteristics

    PubMed Central

    Delahunt, Peter B.; Zhang, Xuemei; Brainard, David H.

    2007-01-01

    Tone mapping refers to the conversion of luminance values recorded by a digital camera or other acquisition device, to the luminance levels available from an output device, such as a monitor or a printer. Tone mapping can improve the appearance of rendered images. Although there are a variety of algorithms available, there is little information about the image tone characteristics that produce pleasing images. We devised an experiment where preferences for images with different tone characteristics were measured. The results indicate that there is a systematic relation between image tone characteristics and perceptual image quality for images containing faces. For these images, a mean face luminance level of 46–49 CIELAB L* units and a luminance standard deviation (taken over the whole image) of 18 CIELAB L* units produced the best renderings. This information is relevant for the design of tone-mapping algorithms, particularly as many images taken by digital camera users include faces. PMID:17235365
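
    The CIELAB lightness values cited above can be computed from relative luminance with the standard CIE formula; the sketch below (the helper names are hypothetical) shows the conversion and the mean/std statistics the study reports. Notably, 18% mid-gray maps to an L* of about 49.5, near the preferred face-luminance range.

```python
import numpy as np

def luminance_to_Lstar(Y):
    """CIE lightness L* from relative luminance Y in [0, 1] (white at Y = 1)."""
    Y = np.asarray(Y, float)
    eps, kappa = 216 / 24389, 24389 / 27   # standard CIE junction constants
    f = np.where(Y > eps, np.cbrt(Y), (kappa * Y + 16) / 116)
    return 116 * f - 16

def tone_statistics(Y_image):
    """Mean and standard deviation of L* over the whole image,
    the two tone characteristics varied in the experiment."""
    L = luminance_to_Lstar(Y_image)
    return float(L.mean()), float(L.std())
```
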

  17. Image Quality Ranking Method for Microscopy.

    PubMed

    Koho, Sami; Fazeli, Elnaz; Eriksson, John E; Hänninen, Pekka E

    2016-01-01

    Automated analysis of microscope images is necessitated by the increased need for high-resolution follow-up of events in time. Manually finding the right images to analyze, or to eliminate from data analysis, is a common day-to-day problem in microscopy research today, and the constantly growing size of image datasets does not help matters. We propose a simple method and a software tool for sorting images within a dataset according to their relative quality. We demonstrate the applicability of our method in finding good-quality images in a STED microscope sample-preparation optimization image dataset. The results are validated by comparisons to subjective opinion scores, as well as five state-of-the-art blind image quality assessment methods. We also show how our method can be applied to eliminate useless out-of-focus images in a High-Content-Screening experiment. We further evaluate the ability of our image quality ranking method to detect out-of-focus images, by extensive simulations, and by comparing its performance against previously published, well-established microscopy autofocus metrics. PMID:27364703
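
    A minimal sketch of quality-based sorting, using the variance of a Laplacian response as a stand-in focus/quality score (a classic autofocus metric of the kind the authors compare against, not their exact ranking measure):

```python
import numpy as np

def laplacian_energy(img):
    """Variance of a 4-neighbor Laplacian: near zero for flat or badly
    defocused images, large for sharp, detailed ones."""
    img = np.asarray(img, float)
    lap = (img[:-2, 1:-1] + img[2:, 1:-1] +
           img[1:-1, :-2] + img[1:-1, 2:] - 4 * img[1:-1, 1:-1])
    return float(lap.var())

def rank_images(images, score=laplacian_energy):
    """Return image indices sorted best-first by the quality score."""
    return sorted(range(len(images)), key=lambda i: score(images[i]), reverse=True)
```
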

  20. End-to-end image quality assessment

    NASA Astrophysics Data System (ADS)

    Raventos, Joaquin

    2012-05-01

    An innovative computerized benchmarking approach (US patent pending, September 2011), based on extensive application of photometry, geometrical optics, and digital media and using a randomized target, allows a standard observer to assess the image quality of video imaging systems at different daytime and low-light luminance levels. It takes into account the target's contrast and color characteristics, as well as the observer's visual acuity and dynamic response. This includes human vision as part of the "extended video imaging system" (EVIS), and allows image quality assessment by several standard observers simultaneously.

  1. MIQM: a multicamera image quality measure.

    PubMed

    Solh, Mashhour; AlRegib, Ghassan

    2012-09-01

    Although several subjective and objective quality assessment methods have been proposed in the literature for images and videos from single cameras, no comparable effort has been devoted to the quality assessment of multicamera images. With the increasing popularity of multiview applications, quality assessment of multicamera images and videos is becoming fundamental to the development of these applications. Image quality is affected by several factors, such as camera configuration, number of cameras, and the calibration process. In order to develop an objective metric specifically designed for multicamera systems, we identified and quantified two types of visual distortions in multicamera images: photometric distortions and geometric distortions. The relative distortion between individual camera scenes is a major factor in determining the overall perceived quality. In this paper, we show that such distortions can be translated into luminance, contrast, spatial motion, and edge-based structure components. We propose three different indices that can quantify these components. We provide examples to demonstrate the correlation among these components and the corresponding indices. Then, we combine these indices into one multicamera image quality measure (MIQM). Results and comparisons with other measures, such as peak signal-to-noise ratio, mean structural similarity, and visual information fidelity, show that MIQM outperforms the other measures in capturing the perceptual fidelity of multicamera images. Finally, we verify the results against subjective evaluation. PMID:22645264
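
    The luminance, contrast, and structure components mentioned above can be illustrated with SSIM-style indices between two camera views. This is a simplified sketch using the standard definitions, not the exact MIQM formulation, and the stabilizing constants are arbitrary small values.

```python
import numpy as np

def luminance_index(x, y, c1=1e-4):
    mx, my = x.mean(), y.mean()
    return (2 * mx * my + c1) / (mx ** 2 + my ** 2 + c1)

def contrast_index(x, y, c2=1e-4):
    sx, sy = x.std(), y.std()
    return (2 * sx * sy + c2) / (sx ** 2 + sy ** 2 + c2)

def structure_index(x, y, c3=1e-4):
    cov = ((x - x.mean()) * (y - y.mean())).mean()
    return (cov + c3) / (x.std() * y.std() + c3)

def combined_index(x, y):
    """Product of the three components; 1.0 for identical views,
    lower as photometric or geometric mismatch grows."""
    return luminance_index(x, y) * contrast_index(x, y) * structure_index(x, y)
```

    A uniform brightness offset between views, for instance, lowers only the luminance component while contrast and structure stay at 1.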

  2. No-reference stereoscopic image quality assessment

    NASA Astrophysics Data System (ADS)

    Akhter, Roushain; Parvez Sazzad, Z. M.; Horita, Y.; Baltes, J.

    2010-02-01

    Display of stereo images is widely used to enhance the viewing experience of three-dimensional imaging and communication systems. In this paper, we propose a method for estimating the quality of stereoscopic images using segmented image features and disparity. This method is inspired by the human visual system. We believe the perceived distortion and disparity of any stereoscopic display is strongly dependent on local features, such as edge (non-plane) and non-edge (plane) areas. Therefore, a no-reference perceptual quality assessment is developed for JPEG-coded stereoscopic images based on segmented local features of artifacts and disparity. Local feature information, such as edge and non-edge area based relative disparity estimation, as well as the blockiness and the blur within the blocks of images, is evaluated in this method. Two subjective stereo image databases are used to evaluate the performance of our method. The subjective experimental results indicate our model has sufficient prediction performance.
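
    A crude blockiness cue of the kind used for JPEG artifacts can be sketched as the ratio of gradient energy at 8-pixel block boundaries to that elsewhere. This is an illustrative metric, not the paper's exact measure; values well above 1 suggest visible blocking.

```python
import numpy as np

def blockiness(img, block=8):
    """Mean |horizontal difference| at block boundaries divided by the
    mean |horizontal difference| inside blocks."""
    img = np.asarray(img, float)
    dh = np.abs(np.diff(img, axis=1))            # |I[:, j+1] - I[:, j]|
    cols = np.arange(dh.shape[1])
    boundary = (cols % block) == block - 1       # j = 7, 15, ... cross block edges
    eps = 1e-12
    return float((dh[:, boundary].mean() + eps) /
                 (dh[:, ~boundary].mean() + eps))
```

    A smooth ramp scores about 1 (no boundary emphasis), while an image of constant 8-pixel blocks scores arbitrarily high.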

  3. Rendered virtual view image objective quality assessment

    NASA Astrophysics Data System (ADS)

    Lu, Gang; Li, Xiangchun; Zhang, Yi; Peng, Kai

    2013-08-01

    Research on rendered virtual view image (RVVI) objective quality assessment is important for integrated imaging systems and image quality assessment (IQA). Traditional IQA algorithms cannot be applied directly on the system receiver side due to inter-view displacement and the absence of an original reference. This study proposed a block-based neighbor reference (NbR) IQA framework for RVVI IQA. Neighbor views used for rendering are employed for quality assessment in the proposed framework. A symphonious factor handling noise and inter-view displacement is defined and applied to evaluate the contribution of the obtained quality index in each block pair. A three-stage experiment scheme is also presented to validate the proposed framework and evaluate its homogeneity performance when compared to full-reference IQA. Experimental results show the proposed framework is useful in RVVI objective quality assessment at the system receiver side and in benchmarking different rendering algorithms.

  4. Continuous assessment of perceptual image quality

    NASA Astrophysics Data System (ADS)

    Hamberg, Roelof; de Ridder, Huib

    1995-12-01

    The study addresses whether subjects are able to assess the perceived quality of an image sequence continuously. To this end, a new method for assessing time-varying perceptual image quality is presented by which subjects continuously indicate the perceived strength of image quality by moving a slider along a graphical scale. The slider's position on this scale is sampled every second. In this way, temporal variations in quality can be monitored quantitatively, and a means is provided by which differences between, for example, alternative transmission systems can be analyzed in an informative way. The usability of this method is illustrated by an experiment in which, for a period of 815 s, subjects assessed the quality of still pictures comprising time-varying degrees of sharpness. Copyright (c) 1995 Optical Society of America

  5. Quality measures in applications of image restoration.

    PubMed

    Kriete, A; Naim, M; Schafer, L

    2001-01-01

    We describe a new method for the estimation of image quality in image restoration applications. We demonstrate this technique on a simulated data set of fluorescent beads, in comparison with restoration by three different deconvolution methods. Both the number of iterations and a regularisation factor are varied to enforce changes in the resulting image quality. First, the data sets are directly compared by an accuracy measure. These values serve to validate the image quality descriptor, which is developed on the basis of optical information theory. This most general measure takes into account the spectral energies and the noise, weighted in a logarithmic fashion. It is demonstrated that this method is particularly helpful as a user-oriented method to control the output of iterative image restorations and to eliminate the guesswork in choosing a suitable number of iterations. PMID:11587324
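
    A descriptor in the spirit described, log-weighting spectral signal energy against noise, can be sketched Shannon-style. The inputs and function name here are assumptions for illustration, not the authors' exact formula.

```python
import numpy as np

def information_capacity(signal_power_spectrum, noise_power):
    """Sum of 0.5 * log2(1 + S(f)/N) over frequency bins: spectral energies
    weighted logarithmically against the noise floor. As restoration iterations
    amplify noise faster than signal, a measure like this peaks and then falls,
    suggesting a stopping point."""
    s = np.asarray(signal_power_spectrum, float)
    return float(0.5 * np.sum(np.log2(1.0 + s / noise_power)))
```
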

  6. Toward clinically relevant standardization of image quality.

    PubMed

    Samei, Ehsan; Rowberg, Alan; Avraham, Ellie; Cornelius, Craig

    2004-12-01

    In recent years, notable progress has been made on the standardization of medical image presentation through the definition and implementation of the Digital Imaging and Communications in Medicine (DICOM) Grayscale Standard Display Function (GSDF). In parallel, the American Association of Physicists in Medicine (AAPM) Task Group 18 has provided much-needed guidelines and tools for visual and quantitative assessment of medical display quality. In spite of these advances, however, there are still notable gaps in the effectiveness of DICOM GSDF in assuring consistent, high-quality display of medical images. In addition, the degree of correlation between display technical data and the diagnostic usability and performance of displays remains unclear. This article proposes three specific steps that DICOM, AAPM, and ACR may collectively take to bridge the gap between technical performance and clinical use: (1) DICOM does not provide means and acceptance criteria to evaluate the conformance of a display device to GSDF or to address other image quality characteristics. DICOM can expand beyond luminance response, extending the measurable, quantifiable elements of TG18 such as reflection and resolution. (2) In a large picture archiving and communication system (PACS) installation, it is critical to continually track the appropriate use and performance of multiple display devices. DICOM may help with this task by adding a Device Service Class to the standard to provide for communication and control of image quality parameters between applications and devices. (3) The question of the clinical significance of image quality metrics has rarely been addressed by prior efforts. In cooperation with AAPM, the American College of Radiology (ACR), and the Society for Computer Applications in Radiology (SCAR), DICOM may help to initiate research that will determine the clinical consequence of variations in image quality metrics (e.g., GSDF conformance) and to define what constitutes image quality from a

  7. Image quality evaluation using moving targets

    NASA Astrophysics Data System (ADS)

    Artmann, Uwe

    2013-03-01

    The basic concept of testing a digital imaging device is to reproduce a known target and to analyze the resulting image. This semi-reference approach can be used for various different aspects of image quality. Each part of the imaging chain can have an influence on the results: lens, sensor, image processing, and the target itself. The results are valid only for the complete system. If we want to test a single component, we have to make sure that we change only one and keep all others constant. When testing mobile imaging devices, we run into the problem that hardly anything can be manually controlled by the tester. Manual exposure control is not available for most devices, the focus cannot be influenced, and hardly any settings for the image processing are available. Due to the limitations in the hardware, the image pipeline in the digital signal processor (DSP) of mobile imaging devices is a critical part of the image quality evaluation. The processing power of the DSPs allows sharpening, tonal correction, and noise reduction to be non-linear and adaptive. This makes it very hard to describe the behavior for an objective image quality evaluation. The image quality is highly influenced by the signal processing for noise and resolution, and the processing is the main reason for the loss of low-contrast fine details, the so-called texture blur. We present our experience in describing the image processing in more detail. All standardized test methods use a defined chart and require that the chart and the camera are not moved in any way during the test. In this paper, we present our results investigating the influence of chart movement during the test. Different structures, optimized for different aspects of image quality evaluation, are moved with a defined speed during the capturing process. The chart movement will change the input for the signal processing depending on the speed of the target during the test. The basic theoretical changes in the image will be the

  8. No training blind image quality assessment

    NASA Astrophysics Data System (ADS)

    Chu, Ying; Mou, Xuanqin; Ji, Zhen

    2014-03-01

    State-of-the-art blind image quality assessment (IQA) methods generally extract perceptual features from training images and send them into a support vector machine (SVM) to learn a regression model, which can then be used to predict the quality scores of testing images. However, these methods need complicated training and learning, and the evaluation results are sensitive to image contents and learning strategies. In this paper, two novel blind IQA metrics that require no training or learning are proposed. The new methods extract perceptual features, i.e., the shape consistency of conditional histograms, from the joint histograms of neighboring divisive normalization transform coefficients of distorted images, and then compare the length attribute of the extracted features with that of the reference and degraded images in the LIVE database. In the first method, a cluster center is found in the feature attribute space of the natural reference images, and the distance between the feature attribute of the distorted image and the cluster center is adopted as the quality label. The second method utilizes the feature attributes and subjective scores of all the images in the LIVE database to construct a dictionary, and the final quality score is calculated by interpolating the subjective scores of nearby words in the dictionary. Unlike traditional SVM-based blind IQA methods, the proposed metrics have explicit expressions that reflect the relationship between the perceptual features and image quality well. Experimental results on the publicly available LIVE, CSIQ, and TID2008 databases have shown the effectiveness of the proposed methods, and the performance is fairly acceptable.
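
    The two training-free strategies, distance to a cluster center of pristine-image features and interpolation over a score dictionary, can be sketched generically. The feature vectors and scores below are placeholders, not the paper's actual histogram-shape features.

```python
import numpy as np

def quality_by_center_distance(feature, natural_features):
    """Distance of a feature vector from the centroid of pristine-image
    features; larger distance -> stronger predicted degradation."""
    center = np.mean(np.asarray(natural_features, float), axis=0)
    return float(np.linalg.norm(np.asarray(feature, float) - center))

def quality_by_dictionary(feature, dict_features, dict_scores, k=2):
    """Inverse-distance interpolation of the subjective scores of the
    k nearest dictionary entries."""
    d = np.linalg.norm(np.asarray(dict_features, float) -
                       np.asarray(feature, float), axis=1)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + 1e-12)
    return float(np.sum(w * np.asarray(dict_scores, float)[nearest]) / np.sum(w))
```
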

  9. Image Acquisition and Quality in Digital Radiography.

    PubMed

    Alexander, Shannon

    2016-09-01

    Medical imaging has undergone dramatic changes and technological breakthroughs since the introduction of digital radiography. This article presents information on the development of digital radiography and types of digital radiography systems. Aspects of image quality and radiation exposure control are highlighted as well. In addition, the article includes related workplace changes and medicolegal considerations in the digital radiography environment. PMID:27601691

  10. Image gathering and restoration - Information and visual quality

    NASA Technical Reports Server (NTRS)

    Mccormick, Judith A.; Alter-Gartenberg, Rachel; Huck, Friedrich O.

    1989-01-01

    A method is investigated for optimizing the end-to-end performance of image gathering and restoration for visual quality. To achieve this objective, one must inevitably confront the problems that the visual quality of restored images depends on perceptual rather than mathematical considerations and that these considerations vary with the target, the application, and the observer. The method adopted in this paper is to optimize image gathering informationally and to restore images interactively to obtain the visually preferred trade-off among fidelity, resolution, sharpness, and clarity. The results demonstrate that this method leads to significant improvements in visual quality over that obtained by the traditional digital processing methods. These traditional methods allow a significant loss of visual quality to occur because they treat the design of the image-gathering system and the formulation of the image-restoration algorithm as two separate tasks and fail to account for the transformations between the continuous and the discrete representations in image gathering and reconstruction.

  11. Using Collaborative Course Development to Achieve Online Course Quality Standards

    ERIC Educational Resources Information Center

    Chao, Ining Tracy; Saj, Tami; Hamilton, Doug

    2010-01-01

    The issue of quality is becoming front and centre as online distance education moves into the mainstream of higher education. Many believe collaborative course development is the best way to design quality online courses. This research uses a case study approach to probe into the collaborative course development process and the implementation of…

  12. How healthcare organizations use the Internet to market quality achievements.

    PubMed

    Revere, Lee; Robinson, Leroy

    2010-01-01

    The increasingly competitive environment is having a strong bearing on the strategic marketing practices of hospitals. The Internet is a fairly new marketing tool, and it has the potential to dramatically influence healthcare consumers. This exploratory study investigates how hospitals use the Internet as a tool to market the quality of their services. Significant evidence exists that customers use the Internet to find information about potential healthcare providers, including information concerning quality. Data were collected from a random sample of 45 U.S. hospitals from the American Hospital Association database. The data included hospital affiliation, number of staffed beds, accreditation status, Joint Commission quality awards, and number of competing hospitals. The study's findings show that system-affiliated hospitals do not provide more, or less, quality information on their websites than do non-system-affiliated hospitals. The findings suggest that the amount of quality information provided on a hospital website is not dependent on hospital size. Research provides evidence that hospitals with more Joint Commission awards promote their quality accomplishments more so than their counterparts that earned fewer Joint Commission awards. The findings also suggest that the more competitors in a marketplace the more likely a hospital is to promote its quality as a potential differential advantage. The study's findings indicate that a necessary element of any hospital's competitive strategy should be to include the marketing of its quality on the organization's website. PMID:20210072

  13. Holographic projection with higher image quality.

    PubMed

    Qu, Weidong; Gu, Huarong; Tan, Qiaofeng

    2016-08-22

    The spatial resolution of holographic projection, limited by the size of the spatial light modulator (SLM), can hardly be increased, and speckle noise always appears and degrades image quality. In this paper, a holographic projection with higher image quality is presented. The spatial resolution of the reconstructed image is twice that of the existing holographic projection, and speckle is suppressed well at the same time. Finally, the effectiveness of the holographic projection is verified in experiments. PMID:27557197

  14. Color image processing for date quality evaluation

    NASA Astrophysics Data System (ADS)

    Lee, Dah Jye; Archibald, James K.

    2010-01-01

    Many agricultural non-contact visual inspection applications use color image processing techniques because color is often a good indicator of product quality. Color evaluation is an essential step in the processing and inventory control of fruits and vegetables that directly affects profitability. Most color spaces, such as RGB and HSV, represent colors with three-dimensional data, which makes color image processing a challenging task. Since most agricultural applications only require analysis on a predefined set or range of colors, mapping these relevant colors to a small number of indexes allows simple and efficient color image processing for quality evaluation. This paper presents a simple but efficient color mapping and image processing technique designed specifically for real-time quality evaluation of Medjool dates. In contrast with more complex color image processing techniques, the proposed color mapping method makes it easy for a human operator to specify and adjust color-preference settings for different color groups representing distinct quality levels. Using this color mapping technique, the color image is first converted to a color map in which each pixel carries a single color index representing its color value. Fruit maturity level is evaluated based on these color indices. A skin lamination threshold is then determined based on the fruit surface characteristics, and this adaptive threshold is used to detect delaminated fruit skin and hence determine the fruit quality. This robust color grading technique has been applied to real-time Medjool date grading.
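The index-mapping idea above can be sketched as a nearest-reference-color lookup. The reference colors and class assignments below are hypothetical stand-ins for the operator-specified color-preference settings described in the paper:

```python
import numpy as np

# Hypothetical reference colors (RGB) for four maturity classes; in the
# paper these settings are specified and adjusted by a human operator.
REFERENCE_COLORS = np.array([
    [185, 120,  60],   # index 0: light amber (immature)
    [140,  75,  40],   # index 1: amber
    [ 90,  45,  25],   # index 2: dark brown (mature)
    [ 40,  20,  15],   # index 3: very dark (overripe)
], dtype=float)

def color_map(image):
    """Map each RGB pixel to the index of the nearest reference color."""
    pixels = image.reshape(-1, 3).astype(float)
    # Squared Euclidean distance from every pixel to every reference color
    d = ((pixels[:, None, :] - REFERENCE_COLORS[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1).reshape(image.shape[:2])

# A 2x2 test image with one pixel near each reference color
img = np.array([[[186, 121, 61], [39, 19, 14]],
                [[141, 76, 41], [91, 44, 26]]], dtype=np.uint8)
indices = color_map(img)
```

Downstream steps such as the maturity evaluation then operate on the small integer index map rather than on three-dimensional color data.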

  15. Perceptual image quality and telescope performance ranking

    NASA Astrophysics Data System (ADS)

    Lentz, Joshua K.; Harvey, James E.; Marshall, Kenneth H.; Salg, Joseph; Houston, Joseph B.

    2010-08-01

    Launch Vehicle Imaging Telescopes (LVIT) are expensive, high-quality devices intended for improving the safety of vehicle personnel, ground support, civilians, and physical assets during launch activities. If allowed to degrade from the combination of wear, environmental factors, and ineffective or inadequate maintenance, these devices lose their ability to provide adequate-quality imagery to analysts to prevent catastrophic events such as the NASA Space Shuttle Challenger accident in 1986 and the Columbia disaster of 2003. A software tool incorporating aberrations and diffraction that was developed for maintenance evaluation and modeling of telescope imagery is presented. This tool provides MTF-based image quality metric outputs which are correlated to ascent imagery analysts' perception of image quality, allowing a prediction of the usefulness of imagery which would be produced by a telescope under different simulated conditions.

  16. Achieving quality in a government hospital: departmental responsibility.

    PubMed

    Haron, Yafa; Segal, Zvi; Barhoum, Masad

    2009-01-01

    Quality improvement in health care organizations requires structural reorganization, system reform, and the development of an appropriate organizational "culture." In 2007, the Division of Quality and Excellence in Civil Service in Israel developed a concept to improve quality management in governmental institutions throughout the country. To put this strategy into practice, Western Galilee Hospital, a governmental hospital in northern Israel, developed a plan to advance a quality management system in which each department and unit is autonomously responsible for its own quality and excellence. Since the hospital has been certified under ISO 9001 for more than 10 years (the only hospital in Israel to hold this certificate), the main challenge now is to improve the quality and excellence system in every department. The aim of this article is to describe the implementation of a comprehensive program designed to raise the ability of managers and workers in Western Galilee Hospital to address all of the government's requirements for quality and excellence in service in Israel. PMID:19369858

  17. Peripheral Aberrations and Image Quality for Contact Lens Correction

    PubMed Central

    Shen, Jie; Thibos, Larry N.

    2011-01-01

    Purpose: Contact lenses reduced the degree of hyperopic field curvature present in myopic eyes, and rigid contact lenses reduced sphero-cylindrical image blur on the peripheral retina, but their effect on higher-order aberrations and the overall optical quality of the eye in the peripheral visual field was still unknown. The purpose of our study was to evaluate peripheral wavefront aberrations and image quality across the visual field before and after contact lens correction. Methods: A commercial Hartmann-Shack aberrometer was used to measure ocular wavefront errors in 5° steps out to 30° of eccentricity along the horizontal meridian in uncorrected eyes and when the same eyes were corrected with soft or rigid contact lenses. Wavefront aberrations and image quality were determined for the full elliptical pupil encountered in off-axis measurements. Results: Ocular higher-order aberrations increase away from the fovea in the uncorrected eye. Third-order aberrations are larger and increase faster with eccentricity than the other higher-order aberrations. Contact lenses increase all higher-order aberrations except the 3rd-order Zernike terms. Nevertheless, a net increase in image quality across the horizontal visual field for objects located at the foveal far point is achieved with rigid lenses, whereas soft contact lenses reduce image quality. Conclusions: Second-order aberrations limit image quality more than higher-order aberrations in the periphery. Although second-order aberrations are reduced by contact lenses, the resulting gain in image quality is partially offset by increased amounts of higher-order aberrations. To fully realize the benefits of correcting higher-order aberrations in the peripheral field requires improved correction of second-order aberrations as well. PMID:21873925
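The comparison between second- and higher-order aberrations above is commonly made via the root-sum-square of variance-normalized (ANSI/Noll) Zernike coefficients, for which the RMS wavefront error is simply the root of the sum of squared terms (piston excluded). The coefficient values below are illustrative, not data from the study:

```python
import numpy as np

def rms_wavefront_error(coeffs):
    """RMS wavefront error from variance-normalized Zernike coefficients
    (piston excluded): the root-sum-square of the terms."""
    return float(np.sqrt(np.sum(np.square(coeffs))))

def higher_order_rms(coeffs_by_radial_order):
    """Root-sum-square of all terms with radial order >= 3 ('higher order')."""
    ho = [c for order, cs in coeffs_by_radial_order.items() if order >= 3
          for c in cs]
    return rms_wavefront_error(ho)

# Hypothetical coefficients (microns), grouped by Zernike radial order
eye = {2: [0.8, -0.5, 0.1],            # defocus and astigmatism
       3: [0.12, -0.08, 0.05, 0.02],   # coma, trefoil
       4: [0.06, 0.01, -0.03, 0.02, 0.01]}
total = rms_wavefront_error([c for cs in eye.values() for c in cs])
ho = higher_order_rms(eye)
```

With numbers of this shape, the second-order terms dominate the total RMS error, mirroring the paper's conclusion that second-order aberrations limit peripheral image quality more than higher-order ones.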

  18. Computerized measurement of mammographic display image quality

    NASA Astrophysics Data System (ADS)

    Chakraborty, Dev P.; Sivarudrappa, Mahesh; Roehrig, Hans

    1999-05-01

    Since the video monitor is widely believed to be the weak link in the imaging chain, it is critical to include it in the total image quality evaluation. Yet, most physical measurements of mammographic image quality are presently limited to measurements on the digital matrix, not the displayed image. A method is described to quantitatively measure the image quality of mammographic monitors using ACR phantom-based test patterns. The image of the test pattern is digitized using a charge-coupled device (CCD) camera, and the resulting image file is analyzed by an existing phantom analysis method (Computer Analysis of Mammography Phantom Images, CAMPI). The new method is called CCD-CAMPI, and it yields the Signal-to-Noise Ratio (SNR) for an arbitrary target shape (e.g., speck, mass, or fiber). In this work we show the feasibility of this idea for speck targets. Also performed were a physical image quality characterization of the monitor (so-called Fourier measures) and an analysis by another template-matching method due to Tapiovaara and Wagner (TW), which is closely related to CAMPI. The methods were applied to a MegaScan monitor. Test patterns containing a complete speck group superposed on a noiseless background were displayed on the monitor, and a series of CCD images were acquired. These images were subjected to CCD-CAMPI and TW analyses. It was found that the SNR values for the CCD-CAMPI method tracked those of the TW method, although the latter measurements were considerably less precise. The TW SNR measure was also about 25% larger than the CCD-CAMPI determination. These differences could be understood from the manner in which the two methods evaluate the noise. The overall accuracy of the CAMPI SNR determination was 4.1% for single images when expressed as a coefficient of variation. While the SNR measures are predictable from the Fourier measures, the number of images and effort required is prohibitive, and the approach is not suited to Quality Control (QC). Unlike the Fourier

  19. Effect of optical aberrations on image quality and visual performance

    NASA Astrophysics Data System (ADS)

    Ravikumar, Sowmya

    In addition to the effects of diffraction, retinal image quality in the human eye is degraded by optical aberrations. Although the paraxial geometric optics description of defocus consists of a simple blur circle whose size determines the extent of blur, in reality the interactions between monochromatic and chromatic aberrations create a complex pattern of retinal image degradation. My thesis work hypothesizes that although both monochromatic and chromatic optical aberrations generally reduce image quality below the best achievable, the underlying causes of retinal image quality degradation are characteristic of the nature of the aberration, its interactions with other aberrations, and the composition of the stimulus. To establish a controlled methodology, a computational model of the retinal image with various levels of aberrations was used to create filters equivalent to those produced by real optical aberrations. Visual performance was measured psychophysically by using these special filters, which separately modulated amplitude and phase in the retinal image. In order to include chromatic aberration in the optical interactions, a computational polychromatic model of the eye was created and validated. The model starts with monochromatic wavefront maps and derives a composite white-light point-spread function whose quality was assessed using metrics of image quality. Finally, in order to assess the effectiveness of simultaneous multifocal intra-ocular lenses in correcting the eye's optical aberrations, a polychromatic computational model of a pseudophakic eye was constructed. This model incorporated the special chromatic properties unique to an eye corrected with hybrid refractive-diffractive optical elements. Results showed that normal optical aberrations reduced visual performance not only by reducing image contrast but also by altering the phase structure of the image. Longitudinal chromatic aberration had a greater effect on image quality in isolation

  20. A database for spectral image quality

    NASA Astrophysics Data System (ADS)

    Le Moan, Steven; George, Sony; Pedersen, Marius; Blahová, Jana; Hardeberg, Jon Yngve

    2015-01-01

    We introduce a new image database dedicated to multi-/hyperspectral image quality assessment. A total of nine scenes representing pseudo-flat surfaces of different materials (textile, wood, skin, etc.) were captured by means of a 160-band hyperspectral system with a spectral range between 410 and 1000 nm. Five spectral distortions were designed, applied to the spectral images, and subsequently compared in a psychometric experiment, in order to provide a basis for applications such as the evaluation of spectral image difference measures. The database can be downloaded freely from http://www.colourlab.no/cid.

  1. Evaluation of image quality in computed radiography based mammography systems

    NASA Astrophysics Data System (ADS)

    Singh, Abhinav; Bhwaria, Vipin; Valentino, Daniel J.

    2011-03-01

    Mammography is the most widely accepted procedure for the early detection of breast cancer, and Computed Radiography (CR) is a cost-effective technology for digital mammography. We have demonstrated that CR image quality is viable for digital mammography. The image quality of mammograms acquired using Computed Radiography technology was evaluated using the Modulation Transfer Function (MTF), Noise Power Spectrum (NPS), and Detective Quantum Efficiency (DQE). The measurements were made using a 28 kVp beam (RQA M-II) with 2 mm of Al as a filter and a target/filter combination of Mo/Mo. The acquired image bit depth was 16 bits, and the pixel pitch for scanning was 50 microns. A step-wedge phantom (to measure the contrast-to-noise ratio (CNR)) and the CDMAM 3.4 contrast-detail phantom were also used to assess image quality. The CNR values were observed at varying thicknesses of PMMA. The CDMAM 3.4 phantom results were plotted and compared to the EUREF acceptable and achievable values. Image quality was assessed using these physical metrics. A lower DQE was observed even with a higher MTF, possibly due to a higher noise component arising from the way the scanner was configured. The CDMAM phantom scores demonstrated a contrast-detail performance comparable to the EUREF values. A cost-effective CR machine was optimized for high-resolution and high-contrast imaging.
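The contrast-to-noise ratio measured with a step-wedge phantom is, in its simplest form, the mean signal difference between a target region and a background region divided by the background noise. A minimal sketch on simulated ROIs (the pixel values are arbitrary, not measured data):

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: mean signal difference over background noise."""
    return (signal_roi.mean() - background_roi.mean()) / background_roi.std()

# Simulated flat-field ROIs: background at 100 and a step at 120,
# both with Gaussian noise of standard deviation 5
rng = np.random.default_rng(0)
background = rng.normal(100.0, 5.0, size=(64, 64))
signal = rng.normal(120.0, 5.0, size=(64, 64))
value = cnr(signal, background)   # close to (120 - 100) / 5 = 4
```

In a phantom study the two ROIs would be drawn from adjacent steps of the wedge at each PMMA thickness, and the CNR tracked as thickness varies.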

  2. Blind image quality assessment through anisotropy.

    PubMed

    Gabarda, Salvador; Cristóbal, Gabriel

    2007-12-01

    We describe an innovative methodology for determining the quality of digital images. The method is based on measuring the variance of the expected entropy of a given image upon a set of predefined directions. Entropy can be calculated on a local basis by using a spatial/spatial-frequency distribution as an approximation for a probability density function. The generalized Rényi entropy and the normalized pseudo-Wigner distribution (PWD) have been selected for this purpose. As a consequence, a pixel-by-pixel entropy value can be calculated, and therefore entropy histograms can be generated as well. The variance of the expected entropy is measured as a function of the directionality, and it has been taken as an anisotropy indicator. For this purpose, directional selectivity can be attained by using an oriented 1-D PWD implementation. Our main purpose is to show how such an anisotropy measure can be used as a metric to assess both the fidelity and quality of images. Experimental results show that an index such as this presents some desirable features that resemble those from an ideal image quality function, constituting a suitable quality index for natural images. Namely, in-focus, noise-free natural images have shown a maximum of this metric in comparison with other degraded, blurred, or noisy versions. This result provides a way of identifying in-focus, noise-free images from other degraded versions, allowing an automatic and nonreference classification of images according to their relative quality. It is also shown that the new measure is well correlated with classical reference metrics such as the peak signal-to-noise ratio. PMID:18059913
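As a rough illustration of the idea, and not the paper's Rényi/pseudo-Wigner machinery, one can take the variance of entropies computed from directional pixel differences as a crude anisotropy indicator: a strongly oriented image yields very different entropies across directions, while noise does not.

```python
import numpy as np

def entropy(values, bins=32):
    """Shannon entropy (bits) of a histogram of the given values."""
    hist, _ = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def anisotropy(image):
    """Variance of directional-difference entropies over four orientations.
    A crude stand-in for the oriented pseudo-Wigner measure in the paper."""
    img = image.astype(float)
    diffs = [
        np.diff(img, axis=1),           # horizontal
        np.diff(img, axis=0),           # vertical
        img[1:, 1:] - img[:-1, :-1],    # diagonal
        img[1:, :-1] - img[:-1, 1:],    # anti-diagonal
    ]
    return np.var([entropy(d.ravel()) for d in diffs])

# Vertical stripes are strongly oriented; white noise is isotropic
rng = np.random.default_rng(1)
stripes = np.tile(np.arange(64) % 2 * 255.0, (64, 1))
noise = rng.uniform(0, 255, size=(64, 64))
a_stripes = anisotropy(stripes)
a_noise = anisotropy(noise)
```

The paper's full method replaces the simple differences with an oriented 1-D pseudo-Wigner distribution and the Shannon entropy with the generalized Rényi entropy, computed pixel by pixel.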

  3. Monotonic correlation analysis of image quality measures for image fusion

    NASA Astrophysics Data System (ADS)

    Kaplan, Lance M.; Burks, Stephen D.; Moore, Richard K.; Nguyen, Quang

    2008-04-01

    The next generation of night vision goggles will fuse image-intensified and long-wave infrared imagery to create a hybrid image that will enable soldiers to better interpret their surroundings during nighttime missions. Paramount to the development of such goggles is the exploitation of image quality (IQ) measures to automatically determine the best image fusion algorithm for a particular task. This work introduces a novel monotonic correlation coefficient to investigate how well candidate IQ features correlate with actual human performance, as measured by a perception study. The paper demonstrates how monotonic correlation can identify worthy features that could be overlooked by traditional correlation values.
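A standard monotonic correlation measure (not necessarily the paper's novel coefficient) is Spearman's rank correlation, which equals 1 under any strictly increasing transform of either variable, whereas Pearson's linear correlation does not:

```python
import numpy as np

def rankdata(x):
    """Ranks 1..n by sort order (assumes no ties)."""
    order = np.argsort(x)
    ranks = np.empty(len(x))
    ranks[order] = np.arange(1, len(x) + 1)
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx = rankdata(np.asarray(x, dtype=float))
    ry = rankdata(np.asarray(y, dtype=float))
    return np.corrcoef(rx, ry)[0, 1]

# Illustrative data: human performance related to an IQ feature through
# a nonlinear but strictly monotonic function
iq_feature = np.array([0.1, 0.4, 0.2, 0.8, 0.6])
performance = iq_feature ** 3
rho = spearman(iq_feature, performance)       # exactly 1: same rank order
r = np.corrcoef(iq_feature, performance)[0, 1]  # < 1: relation is nonlinear
```

This is why a monotonic coefficient can surface features whose relationship to perception is consistent but nonlinear, which a traditional linear correlation would under-rate.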

  4. Creating Quality Choices: Charters. Closing the Achievement Gap Series

    ERIC Educational Resources Information Center

    Read, Tory

    2008-01-01

    The "Closing the Achievement Gap" series explores the Casey Foundation's education investments and presents stories, results, and lessons learned. This publication presents stories about two successful charter efforts and Foundation contributions. Contributions include: (1) Supporting a variety of activities and programs; (2) Financial support for…

  5. What Does Quality Programming Mean for High Achieving Students?

    ERIC Educational Resources Information Center

    Samudzi, Cleo

    2008-01-01

    The Missouri Academy of Science, Mathematics and Computing (Missouri Academy) is a two-year accelerated, early-entrance-to-college, residential school that matches the level, complexity and pace of the curriculum with the readiness and motivation of high achieving high school students. The school is a part of Northwest Missouri State University…

  6. ACHIEVING IRRIGATION RETURN FLOW QUALITY CONTROL THROUGH IMPROVED LEGAL SYSTEMS

    EPA Science Inventory

    The key to irrigated agricultural return flow quality control is proper utilization and management of the resource itself, and an accepted tool in our society is the law. This project is designed to develop legal alternatives that will facilitate the implementation of improved wa...

  7. Subjective matters: from image quality to image psychology

    NASA Astrophysics Data System (ADS)

    Fedorovskaya, Elena A.; De Ridder, Huib

    2013-03-01

    From the advent of digital imaging through several decades of studies, the human vision research community systematically focused on perceived image quality and digital artifacts due to resolution, compression, gamma, dynamic range, capture and reproduction noise, blur, etc., to help overcome existing technological challenges and shortcomings. Technological advances have made digital images and digital multimedia nearly flawless in quality, and ubiquitous and pervasive in usage, providing us with the exciting but at the same time demanding possibility of turning to the domain of human experience, including higher psychological functions such as cognition, emotion, awareness, social interaction, consciousness, and Self. In this paper we outline the evolution of human-centered multidisciplinary studies related to imaging and propose steps and potential foci of future research.

  8. Perceived Image Quality Improvements from the Application of Image Deconvolution to Retinal Images from an Adaptive Optics Fundus Imager

    NASA Astrophysics Data System (ADS)

    Soliz, P.; Nemeth, S. C.; Erry, G. R. G.; Otten, L. J.; Yang, S. Y.

    Aim: The objective of this project was to apply an image restoration methodology based on wavefront measurements obtained with a Shack-Hartmann sensor and to evaluate the restored image quality based on medical criteria. Methods: Implementing an adaptive optics (AO) technique, a fundus imager was used to achieve low-order correction to images of the retina; the high-order correction was provided by deconvolution. A Shack-Hartmann wavefront sensor measures aberrations, and the wavefront measurement is the basis for activating a deformable mirror. Image restoration to remove the remaining aberrations is achieved by direct deconvolution using the point spread function (PSF), or by blind deconvolution; the PSF is estimated using the measured wavefront aberrations. Direct application of classical deconvolution methods such as inverse filtering, Wiener filtering, or iterative blind deconvolution (IBD) to the AO retinal images obtained from the adaptive optical imaging system is not satisfactory because of the very large image size, the difficulty in modeling the system noise, and inaccuracy in PSF estimation. Our approach combines direct and blind deconvolution to exploit available system information, avoid non-convergence, and avoid time-consuming iterative processes. Results: The deconvolution was applied to human subject data, and the resulting restored images were compared by a trained ophthalmic researcher. Qualitative analysis showed significant improvements. Neovascularization that cannot be resolved with the standard fundus camera can be visualized with the adaptive optics device. The individual nerve fiber bundles are easily resolved, as are melanin structures in the choroid. Conclusion: This project demonstrated that computer-enhanced, adaptive optics images show greater detail of anatomical and pathological structures.

  9. Measuring image quality in overlapping areas of panoramic composed images

    NASA Astrophysics Data System (ADS)

    Mitjà, Carles; Bover, Toni; Escofet, Jaume

    2012-06-01

    Several professional photographic applications use the merging of consecutive overlapping images, either to obtain bigger files by means of stitching techniques or to obtain an extended field of view (FOV) for panoramic images. All of these applications share the fact that the final composed image is obtained by overlapping the neighboring areas of consecutive individual images taken as a mosaic or a series of tiles over the scene, from the same point of view. Any individual image taken with a given lens can carry residual aberrations, several of which most strongly affect the borders of the image frame. Furthermore, the amount of distortion aberration present in the images of a given lens will be reversed in position for the two overlapping areas of a pair of consecutive takes. Finally, the different images used in composing the final one have corresponding overlapping areas taken with different perspective. From all of the above it follows that the software employed must remap all the pixel information in order to resize and match image features in those overlapping areas, providing a final composed image with the desired perspective projection. The work presented analyses two panoramic-format images taken with a pair of lenses and composed by means of state-of-the-art stitching software. A series of images is taken to cover an FOV three times the original lens FOV, the images are merged by means of software in common use in professional panoramic photography, and the final image quality is evaluated through a series of targets positioned at strategic locations over the whole field of view. This allows measuring the resulting resolution and Modulation Transfer Function (MTF). The results are shown compared with the previous measures on the original individual images.

  10. FFDM image quality assessment using computerized image texture analysis

    NASA Astrophysics Data System (ADS)

    Berger, Rachelle; Carton, Ann-Katherine; Maidment, Andrew D. A.; Kontos, Despina

    2010-04-01

    Quantitative measures of image quality (IQ) are routinely obtained during the evaluation of imaging systems. These measures, however, do not necessarily correlate with the IQ of the actual clinical images, which can also be affected by factors such as patient positioning. No quantitative method currently exists to evaluate clinical IQ. Therefore, we investigated the potential of using computerized image texture analysis to quantitatively assess IQ. Our hypothesis is that image texture features can be used to assess IQ as a measure of the image signal-to-noise ratio (SNR). To test feasibility, the "Rachel" anthropomorphic breast phantom (Model 169, Gammex RMI) was imaged with a Senographe 2000D FFDM system (GE Healthcare) using 220 unique exposure settings (target/filter, kV, and mAs combinations). The mAs were varied from 10%-300% of that required for an average glandular dose (AGD) of 1.8 mGy. A 2.5 cm2 retroareolar region of interest (ROI) was segmented from each image. The SNR was computed from the ROIs segmented from images linear with dose (i.e., raw images) after flat-field and offset correction. Image texture features of skewness, coarseness, contrast, energy, homogeneity, and fractal dimension were computed from the Premium View(TM) postprocessed image ROIs. Multiple linear regression demonstrated a strong association between the computed image texture features and SNR (R2=0.92, p<=0.001). When including kV, target and filter as additional predictor variables, a stronger association with SNR was observed (R2=0.95, p<=0.001). The strong associations indicate that computerized image texture analysis can be used to measure image SNR and potentially aid in automating IQ assessment as a component of the clinical workflow. Further work is underway to validate our findings in larger clinical datasets.
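For a uniform ROI of a raw (dose-linear) image, the SNR is simply the mean pixel value over its standard deviation. A sketch with simulated Poisson (quantum) noise, together with the skewness texture feature mentioned above (the signal level is arbitrary):

```python
import numpy as np

def roi_snr(roi):
    """SNR of a uniform region: mean signal over pixel noise."""
    roi = roi.astype(float)
    return roi.mean() / roi.std()

def skewness(roi):
    """Third standardized moment, one of the texture features in the study."""
    x = roi.astype(float).ravel()
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

# Simulated raw-image ROI: flat signal of 500 counts with Poisson noise,
# so the expected SNR is roughly sqrt(500) ~ 22
rng = np.random.default_rng(42)
roi = rng.poisson(500.0, size=(100, 100))
snr = roi_snr(roi)
skew = skewness(roi)
```

In the study, such per-ROI SNR values computed from the raw images served as the reference that the texture features (measured on the postprocessed images) were regressed against.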

  11. Geometric assessment of image quality using digital image registration techniques

    NASA Technical Reports Server (NTRS)

    Tisdale, G. E.

    1976-01-01

    Image registration techniques were developed to perform a geometric quality assessment of multispectral and multitemporal image pairs. Based upon LANDSAT tapes, accuracies to a small fraction of a pixel were demonstrated. Because it is insensitive to the choice of registration areas, the technique is well suited to performance in an automatic system. It may be implemented at megapixel-per-second rates using a commercial minicomputer in combination with a special purpose digital preprocessor.

  12. Image quality measures and their performance

    NASA Technical Reports Server (NTRS)

    Eskicioglu, Ahmet M.; Fisher, Paul S.; Chen, Si-Yuan

    1994-01-01

    A number of quality measures are evaluated for gray scale image compression. They are all bivariate exploiting the differences between corresponding pixels in the original and degraded images. It is shown that although some numerical measures correlate well with the observers' response for a given compression technique, they are not reliable for an evaluation across different techniques. The two graphical measures (histograms and Hosaka plots), however, can be used to appropriately specify not only the amount, but also the type of degradation in reconstructed images.

  13. Achievements in scientific photography. Volume 28 - The optical image and recording media

    NASA Astrophysics Data System (ADS)

    Chibisov, K. V.

    Papers are presented on such topics as the properties of optical data recording systems, image quality, image processing, and recording media. Particular consideration is given to mathematical models for the formation of optical images; trends in the development of quality criteria for photographic systems; hybrid optoelectronic systems of image processing; and photothermoplastic recording media.

  14. An alternative approach to achieving water quality-based limits

    SciTech Connect

    Hart, C.M.; Graeser, W.C.

    1995-12-01

    Since May 1982, members of the Iron and Steel Industry have been required to meet effluent limits based on Best Available Technology (BAT) for a process water discharge to a receiving stream. US Steel Clairton Works has been successful in meeting these limits in the last three years; however, the current regulatory thrust is toward more stringent limits based on water quality. In cases of smaller streams, such as the receiving stream for Clairton Works' process outfall, these limits can be very rigid. This paper will discuss the alternative approaches investigated to meet the new, more stringent limits, including the solution chosen.

  15. Preparation of patients for anaesthesia - achieving quality care.

    PubMed

    Lau, L; Jan, G; Chan, T F

    2002-04-01

    Implementation of anaesthesia begins with a preoperative assessment of the surgical patient and development of an anaesthetic plan. Preparation of the patient includes the preoperative assessment, review of preoperative tests, optimisation of medical conditions, adequate preoperative fasting, appropriate premedication, and the explanation of anaesthetic risk to patients. The goals of preoperative preparation are to reduce the morbidity of surgery, to increase the quality while decreasing the cost of perioperative care, and to return the patient to desirable functioning as quickly as possible. A knowledgeable anaesthesiologist is the 'final clinical gatekeeper', who coordinates perioperative management and ensures that the patient is in the optimal state for anaesthesia and surgery. PMID:11937664

  16. Managing health care variability to achieve quality care.

    PubMed

    Simmons, J C

    2001-05-01

    While much has been written about variation and health care, one area that has received little attention is variation within hospitals related to operations management, which can lead to wasted money and human resources. Two Boston researchers who have been studying this area say that addressing these variations, using techniques found in other major industries across the country, could give hospitals a new tool for addressing patient safety issues, nursing shortages, cost containment, and overall better quality of care. PMID:11400326

  17. Image Quality Analysis of Various Gastrointestinal Endoscopes: Why Image Quality Is a Prerequisite for Proper Diagnostic and Therapeutic Endoscopy

    PubMed Central

    Ko, Weon Jin; An, Pyeong; Ko, Kwang Hyun; Hahm, Ki Baik; Hong, Sung Pyo

    2015-01-01

    Arising from human curiosity in terms of the desire to look within the human body, endoscopy has undergone significant advances in modern medicine. Direct visualization of the gastrointestinal (GI) tract by traditional endoscopy was first introduced over 50 years ago, after which fairly rapid advancement from rigid esophagogastric scopes to flexible scopes and high-definition videoscopes has occurred. In an effort towards early detection of precancerous lesions in the GI tract, several high-technology imaging scopes have been developed, including narrow band imaging, autofocus imaging, magnified endoscopy, and confocal microendoscopy. However, these modern developments have skewed fundamental imaging technology toward red-green-blue imaging, and this focus has obscured the advantages of other endoscope techniques. In this review article, we describe the importance of image quality analysis, using a survey to consider the diversity of endoscope system selection, in order to better achieve diagnostic and therapeutic goals. These aims can be achieved through the adoption of modern endoscopy systems that obtain high image quality. PMID:26473119

  18. Quality evaluation of fruit by hyperspectral imaging

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This chapter presents new applications of hyperspectral imaging for measuring the optical properties of fruits and assessing their quality attributes. A brief overview is given of current techniques for measuring optical properties of turbid and opaque biological materials. Then a detailed descripti...

  19. Image Quality Indicator for Infrared Inspections

    NASA Technical Reports Server (NTRS)

    Burke, Eric

    2011-01-01

    The quality of images generated during an infrared thermal inspection depends on many system variables, settings, and parameters, including the focal length setting of the IR camera lens. If any relevant parameter is incorrect or sub-optimal, the resulting IR images will usually exhibit inherent unsharpness and lack of resolution. Traditional reference standards and image quality indicators (IQIs) are made of representative hardware samples and contain representative flaws of concern. These standards are used to verify that representative flaws can be detected with the current IR system settings. However, these traditional standards do not enable the operator to quantify the quality limitations of the resulting images, i.e. determine the inherent maximum image sensitivity and image resolution. As a result, the operator does not have the ability to optimize the IR inspection system prior to data acquisition. The innovative IQI described here eliminates this limitation and enables the operator to objectively quantify and optimize the relevant variables of the IR inspection system, resulting in enhanced image quality with consistency and repeatability in the inspection application. The IR IQI consists of various copper foil features of known sizes that are printed on a dielectric non-conductive board. The significant difference in thermal conductivity between the two materials ensures that each appears with a distinct grayscale or brightness in the resulting IR image. Therefore, the IR image of the IQI exhibits high contrast between the copper features and the underlying dielectric board, which is required to detect the edges of the various copper features. The copper features consist of individual elements of various shapes and sizes, or of element-pairs of known shapes and sizes and with known spacing between the elements creating the pair. For example, filled copper circles with various diameters can be used as individual elements to quantify the image sensitivity

  20. Analysis of image quality based on perceptual preference

    NASA Astrophysics Data System (ADS)

    Xue, Liqin; Hua, Yuning; Zhao, Guangzhou; Qi, Yaping

    2007-11-01

    This paper deals with image quality analysis, considering the impact of the psychological factors involved in assessment. The attributes of image quality requirements were partitioned according to visual perception characteristics, and image quality preferences were obtained by the factor analysis method. The features of image quality that support subjective preference were identified; the adequacy of the image was shown to be the top requirement for display image quality improvement. The approach will benefit research on subjective quantitative methods of image quality assessment.

  1. Prediction of Viking lander camera image quality

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

    1976-01-01

    Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

  2. Physical measures of image quality in mammography

    NASA Astrophysics Data System (ADS)

    Chakraborty, Dev P.

    1996-04-01

    A recently introduced method for quantitative analysis of images of the American College of Radiology (ACR) mammography accreditation phantom has been extended to include signal-to-noise ratio (SNR) measurements, and has been applied to survey the image quality of 54 mammography machines from 17 hospitals. Participants sent us phantom images to be evaluated for each mammography machine at their hospital. Each phantom was loaned to us for obtaining images of the wax insert plate on a reference machine at our institution. The images were digitized and analyzed to yield indices that quantified the image quality of the machines precisely. We have developed methods for normalizing for the variation of the individual speck sizes between different ACR phantoms, for the variation of the speck sizes within a microcalcification group, and for variations in overall speeds of the mammography systems. In terms of the microcalcification SNR, the variability of the x-ray machines was 40.5% when no allowance was made for phantom or mAs variations. This dropped to 17.1% when phantom variability was accounted for, and to 12.7% when mAs variability was also allowed for. Our work shows the feasibility of practical, low-cost, objective and accurate evaluations, as a useful adjunct to the present ACR method.

  3. Naturalness and interestingness of test images for visual quality evaluation

    NASA Astrophysics Data System (ADS)

    Halonen, Raisa; Westman, Stina; Oittinen, Pirkko

    2011-01-01

    Balanced and representative test images are needed to study perceived visual quality in various application domains. This study investigates naturalness and interestingness as image quality attributes in the context of test images. Taking a top-down approach, we aim to find the dimensions which constitute naturalness and interestingness in test images and the relationship between these high-level quality attributes. We compare existing collections of test images (e.g. Sony sRGB images, ISO 12640 images, Kodak images, Nokia images and test images developed within our group) in an experiment combining quality sorting and structured interviews. Based on the data gathered, we analyze the viewer-supplied criteria for naturalness and interestingness across image types, quality levels and judges. This study advances our understanding of subjective image quality criteria and enables the validation of current test images, furthering their development.

  4. Exposure reduction and image quality in orthodontic radiology: a review of the literature

    SciTech Connect

    Taylor, T.S.; Ackerman, R.J. Jr.; Hardman, P.K.

    1988-01-01

    This article summarizes the use of rare earth screen technology to achieve high-quality panoramic and cephalometric radiographs with sizable reductions in patient radiation dosage. Collimation, shielding, quality control, and darkroom procedures are reviewed to further reduce patient risk and improve image quality. 34 references.

  5. No-reference image quality metric based on image classification

    NASA Astrophysics Data System (ADS)

    Choi, Hyunsoo; Lee, Chulhee

    2011-12-01

    In this article, we present a new no-reference (NR) objective image quality metric based on image classification. We also propose a new blocking metric and a new blur metric. Both metrics are NR metrics since they need no information from the original image. The blocking metric was computed by considering that the visibility of horizontal and vertical blocking artifacts can change depending on background luminance levels. When computing the blur metric, we took into account the fact that blurring in edge regions is generally more sensitive to the human visual system. Since different compression standards usually produce different compression artifacts, we classified images into two classes using the proposed blocking metric: one class that contained blocking artifacts and another class that did not contain blocking artifacts. Then, we used different quality metrics based on the classification results. Experimental results show that each metric correlated well with subjective ratings, and the proposed NR image quality metric consistently provided good performance with various types of content and distortions.
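A blocking metric of this kind can be sketched by comparing luminance differences across 8x8 block boundaries with differences inside blocks. The function below is an illustrative simplification, not the authors' metric (which additionally weights visibility by background luminance); the name and the simple ratio form are our assumptions.

```python
import numpy as np

def blockiness(img, block=8):
    """Crude no-reference blockiness score.

    Compares mean absolute luminance differences across vertical
    8x8-block boundaries with those inside blocks; values well above 1
    suggest visible blocking artifacts.  Illustrative only -- a real
    metric would also weight visibility by background luminance and
    handle horizontal boundaries.
    """
    img = np.asarray(img, dtype=np.float64)
    dh = np.abs(np.diff(img, axis=1))       # horizontal neighbour diffs
    cols = np.arange(1, img.shape[1])       # column index of each diff
    at_boundary = (cols % block) == 0       # diffs straddling a boundary
    boundary = dh[:, at_boundary].mean()
    interior = dh[:, ~at_boundary].mean()
    return boundary / (interior + 1e-12)
```

On a JPEG-like image made of flat 8x8 tiles the ratio is large; on a smooth gradient it stays near 1, which is the discriminating behaviour a classifier could use.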

  6. Medical Imaging Image Quality Assessment with Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Michail, C. M.; Karpetas, G. E.; Fountos, G. P.; Kalyvas, N. I.; Martini, Niki; Koukou, Vaia; Valais, I. G.; Kandarakis, I. S.

    2015-09-01

    The aim of the present study was to assess the image quality of PET scanners through a thin layer chromatography (TLC) plane source. The source was simulated using a previously validated Monte Carlo model. The model was developed using the GATE MC package, and reconstructed images were obtained with the STIR software for tomographic image reconstruction, with cluster computing. The PET scanner simulated in this study was the GE DiscoveryST. A plane source consisting of a TLC plate was simulated as a layer of silica gel on an aluminum (Al) foil substrate, immersed in an 18F-FDG bath solution (1 MBq). Image quality was assessed in terms of the Modulation Transfer Function (MTF). MTF curves were estimated from transverse reconstructed images of the plane source. Images were reconstructed by the maximum likelihood estimation (MLE)-OSMAPOSL algorithm. OSMAPOSL reconstruction was assessed using various subsets (3 to 21) and iterations (1 to 20), as well as various beta (hyper)parameter values. MTF values were found to increase up to the 12th iteration and to remain almost constant thereafter. The MTF improves with lower beta values. The simulated PET evaluation method based on the TLC plane source can also be useful in research for the further development of PET and SPECT scanners through GATE simulations.
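The MTF estimation step described here, taking profiles across the reconstructed image of a thin plane source, reduces to the normalized Fourier magnitude of a line spread function (LSF). A minimal sketch, with function name, sampling and normalization chosen by us rather than taken from the paper:

```python
import numpy as np

def mtf_from_lsf(lsf, pixel_mm=1.0):
    """MTF as the normalised magnitude of the Fourier transform of a
    line spread function, e.g. a profile taken across the reconstructed
    image of a thin plane source.  Illustrative sketch only."""
    lsf = np.asarray(lsf, dtype=float)
    lsf = lsf - lsf.min()                         # remove DC pedestal
    spectrum = np.abs(np.fft.rfft(lsf))
    mtf = spectrum / spectrum[0]                  # unity at zero frequency
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm) # cycles/mm
    return freqs, mtf
```

A perfectly sharp (delta-like) LSF yields a flat MTF of 1, while any blur pulls the curve down at higher spatial frequencies.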

  7. Image registration for DSA quality enhancement.

    PubMed

    Buzug, T M; Weese, J

    1998-01-01

    A generalized framework for histogram-based similarity measures is presented and applied to the image-enhancement task in digital subtraction angiography (DSA). The class of differentiable, strictly convex weighting functions is identified as suitable weightings of histograms for measuring the degree of clustering that goes along with registration. With respect to computation time, the energy similarity measure is the function of choice for the registration of mask and contrast image prior to subtraction. The robustness of the energy measure is studied for geometrical image distortions like rotation and scaling. Additionally, it is investigated how the histogram binning and inhomogeneous motion inside the templates influence the quality of the similarity measure. Finally, the registration success for the automated procedure is compared with the manually shift-corrected image pair of the head. PMID:9719851
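The energy similarity measure belongs to the class of histogram weightings built from a strictly convex function, here f(p) = p^2, which rewards a tightly clustered histogram. A hedged sketch of the idea (the bin count and the use of the difference-image histogram are our assumptions about the setup, not the paper's code):

```python
import numpy as np

def energy_measure(mask, contrast, bins=64):
    """Histogram 'energy' similarity for DSA-style registration:
    histogram the subtraction image and sum squared bin probabilities.
    The measure peaks when the images are aligned, because a
    well-registered subtraction image has a tightly clustered
    grey-level histogram.  Illustrative sketch only."""
    diff = mask.astype(float) - contrast.astype(float)
    hist, _ = np.histogram(diff, bins=bins)
    p = hist / hist.sum()
    return np.sum(p ** 2)   # convex weighting f(p) = p^2
```

Registration then amounts to searching over shifts/rotations of the mask for the transform that maximizes this score.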

  8. Quality in university physics teaching: is it being achieved?

    NASA Astrophysics Data System (ADS)

    1998-11-01

    This was the title of a Physics Discipline Workshop held at the University of Leeds on 10 and 11 September 1998. Organizer Ashley Clarke of the university's Physics and Astronomy Department collected together an interesting variety of speakers polygonically targeting the topic, although as workshops go the audience didn't have to do much work except listen. There were representatives from 27 university physics departments who must have gone away with a lot to think about and possibly some new academic year resolutions to keep. But as a non-university no-longer teacher of (school) physics I was impressed with the general commitment to the idea that if you get the right quality of learning the teaching must be OK. I also learned (but have since forgotten) a lot of new acronyms. The keynote talk was by Gillian Hayes, Associate Director of the Quality Assurance Agency for Higher Education (QAA). She explained the role and implementation of the Subject Reviews that QAA is making for all subjects in all institutions of higher education on a five- to seven-year cycle. Physics Education hopes to publish an article about all this from QAA shortly. In the meantime, suffice it to say that the review looks at six aspects of provision, essentially from the point of view of enhancing students' experiences and learning. No doubt all participants would agree with this (they'd better if they want to score well on the Review) but may have been more worried by the next QAA speaker, Norman Jackson, who drummed in the basic facts of life as HE moves from an elite provision system to a mass provision system. He had an interesting graph showing how in the last ten years or so more students were getting firsts and upper seconds and fewer getting thirds. It seems that all those A-level students getting better grades than they used to are carrying on their good luck to degree level. But they still can't do maths (allegedly) and I doubt whether Jon Ogborn (IoP Advancing Physics Project

  9. Quality After-School Programming and Its Relationship to Achievement-Related Behaviors and Academic Performance

    ERIC Educational Resources Information Center

    Grassi, Annemarie M.

    2012-01-01

    The purpose of this study is to understand the relationship between quality social support networks developed through high quality afterschool programming and achievement amongst middle school and high school aged youth. This study seeks to develop a deeper understanding of how quality after-school programs influence a youth's developmental…

  10. Dried fruits quality assessment by hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Gargiulo, Aldo; Bonifazi, Giuseppe

    2012-05-01

    Dried fruit products have different market values according to their quality. Such quality is usually quantified in terms of the freshness of the products, as well as the presence of contaminants (pieces of shell, husk, and small stones), defects, mould and decay. The combination of these parameters, in terms of relative presence, represents a fundamental set of attributes conditioning the human-senses-detectable attributes of dried fruits (visual appearance, organoleptic properties, etc.) and their overall quality as marketable products. Sorting-selection strategies exist, but they sometimes fail when a higher degree of detection is required, especially when discriminating between dried fruits of relatively small dimensions and when aiming to perform an "early detection" of the pathogen agents responsible for future mould and decay development. Surface characteristics of dried fruits can be investigated by hyperspectral imaging (HSI). In this paper, specific "ad hoc" applications proposing quality detection logics based on a hyperspectral imaging (HSI) approach are described, compared and critically evaluated. Reflectance spectra of selected dried fruits (hazelnuts) of different quality, characterized by the presence of different contaminants and defects, were acquired by a laboratory device equipped with two HSI systems working in two different spectral ranges: the visible-near infrared field (400-1000 nm) and the near infrared field (1000-1700 nm). The spectra were processed and the results evaluated adopting both a simple and fast wavelength band ratio approach and a more sophisticated classification logic based on principal component analysis (PCA).
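The two processing routes mentioned, a wavelength band ratio and PCA, can be sketched on a reflectance cube as follows; the band indices, component count and function names are illustrative assumptions, not the paper's choices:

```python
import numpy as np

def band_ratio(cube, b1, b2):
    """Per-pixel reflectance ratio between two wavelength bands of a
    hyperspectral cube (rows x cols x bands)."""
    return cube[..., b1] / (cube[..., b2] + 1e-12)

def pca_scores(cube, n_components=3):
    """Project each pixel spectrum onto the leading principal
    components of the band-covariance matrix."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b)
    X = X - X.mean(axis=0)                     # centre the spectra
    cov = np.cov(X, rowvar=False)              # b x b band covariance
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1][:n_components]
    return (X @ vecs[:, order]).reshape(h, w, n_components)
```

A band-ratio image highlights pixels whose spectra differ between the two chosen wavelengths (e.g. sound kernel vs. shell fragment), while the first few PCA score images compress most of the spectral variance for a downstream classifier.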

  11. Retinal image quality in the rodent eye.

    PubMed

    Artal, P; Herreros de Tejada, P; Muñoz Tedó, C; Green, D G

    1998-01-01

    Many rodents do not see well. For a target to be resolved by a rat or a mouse, it must subtend a visual angle of a degree or more. It is commonly assumed that this poor spatial resolving capacity is due to neural rather than optical limitations, but the quality of the retinal image has not been well characterized in these animals. We have modified a double-pass apparatus, initially designed for the human eye, so it could be used with rodents to measure the modulation transfer function (MTF) of the eye's optics. That is, the double-pass retinal image of a monochromatic (lambda = 632.8 nm) point source was digitized with a CCD camera. From these double-pass measurements, the single-pass MTF was computed under a variety of conditions of focus and with different pupil sizes. Even with the eye in best focus, the image quality in both rats and mice is exceedingly poor. With a 1-mm pupil, for example, the MTF in the rat had an upper limit of about 2.5 cycles/deg, rather than the 28 cycles/deg one would obtain if the eye were a diffraction-limited system. These images are about 10 times worse than the comparable retinal images in the human eye. Using our measurements of the optics and the published behavioral and electrophysiological contrast sensitivity functions (CSFs) of rats, we have calculated the CSF that the rat would have if it had perfect rather than poor optics. We find, interestingly, that diffraction-limited optics would produce only slight improvement overall. That is, in spite of retinal images which are of very low quality, the upper limit of visual resolution in rodents is neurally determined. Rats and mice seem to have eyes in which the optics and retina/brain are well matched. PMID:9682864

  12. No-reference image quality assessment based on nonsubsample shearlet transform and natural scene statistics

    NASA Astrophysics Data System (ADS)

    Wang, Guan-jun; Wu, Zhi-yong; Yun, Hai-jiao; Cui, Ming

    2016-03-01

    A novel no-reference (NR) image quality assessment (IQA) method is proposed for assessing image quality across multifarious distortion categories. The new method transforms distorted images into the shearlet domain using a non-subsampled shearlet transform (NSST), and designs an image quality feature vector to describe images using natural scene statistics features: coefficient distribution, energy distribution and structural correlation (SC) across orientations and scales. The final image quality is obtained from distortion classification and regression models trained by a support vector machine (SVM). Experimental results on the LIVE2 IQA database indicate that the method can assess image quality effectively, and that the extracted features are sensitive to the category and severity of distortion. Furthermore, the proposed method is database independent and has a higher correlation rate and lower root mean squared error (RMSE) with respect to human perception than other high-performance NR IQA methods.

  13. Achieving molecular selectivity in imaging using multiphoton Raman spectroscopy techniques

    SciTech Connect

    Holtom, Gary R.; Thrall, Brian D.; Chin, Beek Yoke; Wiley, H. Steven; Colson, Steven D.

    2000-12-01

    In the case of most imaging methods, contrast is generated either by physical properties of the sample (Differential Interference Contrast, Phase Contrast), or by fluorescent labels that are localized to a particular protein or organelle. Standard Raman and infrared methods for obtaining images are based upon the intrinsic vibrational properties of molecules, and thus obviate the need for attached fluorophores. Unfortunately, they have significant limitations for live-cell imaging. However, an active Raman method, called Coherent Anti-Stokes Raman Scattering (CARS), is well suited for microscopy, and provides a new means for imaging specific molecules. Vibrational imaging techniques, such as CARS, avoid problems associated with photobleaching and photo-induced toxicity often associated with the use of fluorescent labels with live cells. Because the laser configuration needed to implement CARS technology is similar to that used in other multiphoton microscopy methods, such as two-photon fluorescence and harmonic generation, it is possible to combine imaging modalities, thus generating simultaneous CARS and fluorescence images. A particularly powerful aspect of CARS microscopy is its ability to selectively image deuterated compounds, thus allowing the visualization of molecules, such as lipids, that are chemically indistinguishable from the native species.

  14. Measuring image quality performance on image versions saved with different file format and compression ratio

    NASA Astrophysics Data System (ADS)

    Mitjà, Carles; Escofet, Jaume; Bover, Toni

    2012-06-01

    Digitization of existing documents containing images is an important body of work for many archives, ranging from individuals to institutional organizations. The methods and file formats used in this digitization are usually a trade-off between budget, file volume size and image quality, although not necessarily in this order. The use of the most common and standardized file formats, JPEG and TIFF, prompts the operator to decide on a compression ratio, which affects both the final file volume size and the quality of the resulting image version. The image quality achieved by a system can be evaluated by means of several measures and methods, the Modulation Transfer Function (MTF) being one of the most used. The methods employed by the compression algorithms affect the two basic features of image content, edges and textures, in different ways. These basic features are also differently affected by the amount of noise generated at the digitization stage. Therefore, the target used in the measurement should be related to the features usually present in general imaging. This work presents a comparison between the results obtained by measuring the MTF of images taken with a professional camera system and saved in several file formats and compression ratios. To accomplish the needs stated earlier, the MTF measurement has been done by two separate methods, using the slanted-edge and dead leaves targets respectively. The measurement results are shown and compared in relation to the respective file volume sizes.
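The slanted-edge route to the MTF differentiates the edge spread function (ESF) to obtain a line spread function, then Fourier-transforms it. The sketch below is a deliberately simplified single-profile version (no sub-pixel binning across the slanted edge), with the windowing choice and function name ours:

```python
import numpy as np

def mtf_from_edge(profile):
    """MTF estimated from one edge profile: differentiate the edge
    spread function to get the line spread function, then take the
    normalised FFT magnitude.  Simplified single-row sketch of the
    slanted-edge idea -- a real implementation bins many rows at
    sub-pixel offsets before this step."""
    esf = np.asarray(profile, dtype=float)
    lsf = np.gradient(esf)                # ESF derivative = LSF
    lsf = lsf * np.hanning(lsf.size)      # taper to limit spectral leakage
    spectrum = np.abs(np.fft.rfft(lsf))
    return spectrum / spectrum[0]         # unity at zero frequency
```

Running this on the same edge saved at several compression ratios would show the MTF curve dropping as compression smooths the edge.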

  15. Image quality assessment and human visual system

    NASA Astrophysics Data System (ADS)

    Gao, Xinbo; Lu, Wen; Tao, Dacheng; Li, Xuelong

    2010-07-01

    This paper summarizes the state of the art of image quality assessment (IQA) and the human visual system (HVS). IQA provides an objective index or real value to measure the quality of a specified image. Since human beings are the ultimate receivers of visual information in practical applications, the most reliable IQA is to build a computational model that mimics the HVS. According to the properties and cognitive mechanisms of the HVS, the available HVS-based IQA methods can be divided into two categories: bionics methods and engineering methods. This paper briefly introduces the basic theories and development histories of these two kinds of HVS-based IQA methods. Finally, some promising research issues are pointed out.

  16. Image quality vs. sensitivity: fundamental sensor system engineering

    NASA Astrophysics Data System (ADS)

    Schueler, Carl F.

    2008-08-01

    This paper focuses on the fundamental system engineering tradeoff driving almost all remote sensing design efforts, affecting complexity, cost, performance, schedule, and risk: image quality vs. sensitivity. This single trade encompasses every aspect of performance, including radiometric accuracy, dynamic range and precision, as well as spatial, spectral, and temporal coverage and resolution. This single trade also encompasses every aspect of design, including mass, dimensions, power, orbit selection, spacecraft interface, sensor and spacecraft functional trades, pointing or scanning architecture, sensor architecture (e.g., field-of-view, optical form, aperture, f/#, material properties), electronics, mechanical and thermal properties. The relationship between image quality and sensitivity is introduced based on the concepts of modulation transfer function (MTF) and signal-to-noise ratio (SNR) with examples to illustrate the balance to be achieved by the system architect to optimize cost, complexity, performance and risk relative to end-user requirements.
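The core trade can be made concrete with two textbook relations: the detector-footprint MTF falls as a sinc of pixel pitch times spatial frequency, while shot-noise-limited SNR grows with collected signal, which scales with pixel area and integration time. A sketch under those standard assumptions; the function names and the read-noise figure are illustrative, not from the paper:

```python
import numpy as np

def detector_mtf(f, pitch):
    """Detector footprint MTF: |sinc(pitch * f)|, with f in cycles per
    metre and pitch in metres.  Larger pixels depress the MTF at a
    given spatial frequency."""
    return np.abs(np.sinc(pitch * f))

def photon_snr(signal_electrons, read_noise_e=10.0):
    """Shot-noise-limited SNR for a pixel collecting N electrons,
    with an assumed read-noise floor.  Larger pixels collect more
    electrons, raising SNR."""
    return signal_electrons / np.sqrt(signal_electrons + read_noise_e ** 2)
```

Doubling the pixel pitch roughly quadruples the collected signal (better SNR) while lowering the MTF at every frequency, which is exactly the balance the system architect must strike.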

  17. Retinal image quality, reading and myopia.

    PubMed

    Collins, Michael J; Buehren, Tobias; Iskander, D Robert

    2006-01-01

    Analysis was undertaken of the retinal image characteristics of the best-spectacle-corrected eyes of progressing myopes (n = 20, mean age = 22 years; mean spherical equivalent = -3.84 D) and a control group of emmetropes (n = 20, mean age = 23 years; mean spherical equivalent = 0.00 D) before and after a 2-h reading task. Retinal image quality was calculated based upon wavefront measurements taken with a Hartmann-Shack sensor with fixation on both a far (5.5 m) and near (individual reading distance) target. The visual Strehl ratio based on the optical transfer function (VSOTF) was significantly worse for the myopes prior to reading for both the far (p = 0.01) and near (p = 0.03) conditions. The myopic group showed significant reductions in various aspects of retinal image quality compared with the emmetropes, involving components of the modulation transfer function, phase transfer function and point spread function, often along the vertical meridian of the eye. The depth of focus of the myopes (0.54 D) was larger (p = 0.02) than that of the emmetropes (0.42 D), and the distribution of refractive power (away from the optimal sphero-cylinder) was greater in the myopic eyes (variance of distributions p < 0.05). We found evidence that the lead and lag of accommodation are influenced by the higher-order aberrations of the eye (e.g. significant correlations between lead/lag and the peak of the visual Strehl ratio based on the MTF). This could indicate that the higher accommodation lags seen in myopes are providing optimized retinal image characteristics. The interaction between low- and high-order aberrations of the eye plays a significant role in reducing the retinal image quality of myopic eyes compared with emmetropes. PMID:15913701

  18. Towards real-time image quality assessment

    NASA Astrophysics Data System (ADS)

    Geary, Bobby; Grecos, Christos

    2011-03-01

    We introduce a real-time implementation and evaluation of a new, fast, accurate full-reference image quality metric. The popular general image quality metric known as the Structural Similarity Index Metric (SSIM) has been shown to be effective, efficient and useful, finding many practical and theoretical applications. Recently the authors proposed an enhanced version of the SSIM algorithm known as the Rotated Gaussian Discrimination Metric (RGDM). This approach uses a Gaussian-like discrimination function to evaluate local contrast and luminance. RGDM was inspired by an exploration of local statistical parameter variations in relation to the variation of Mean Opinion Score (MOS) for a range of particular distortion types. In this paper we outline the salient features of the derivation of RGDM and show how analyses of the local statistics of each distortion type necessitate variation in the discrimination function width. Results on the LIVE image database show a tight banding of the RGDM metric value when plotted against mean opinion score, indicating the usefulness of this metric. We then explore a number of strategies for algorithmic speed-up, including the application of integral images for patch-based computation optimisation, cost reduction for the evaluation of the discrimination function, and general loop unrolling. We also employ fast Single Instruction Multiple Data (SIMD) intrinsics and explore data-parallel decomposition on a multi-core Intel processor.
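The integral-image speed-up used for patch-based metrics such as SSIM replaces each k x k patch sum with four table look-ups, making the per-patch cost independent of patch size. A minimal sketch of the technique (our own formulation, not the authors' implementation):

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row and column prepended, so that
    ii[i, j] is the sum of img[:i, :j]."""
    ii = np.cumsum(np.cumsum(img.astype(np.float64), axis=0), axis=1)
    return np.pad(ii, ((1, 0), (1, 0)))

def patch_means(img, k):
    """Mean of every k x k patch in O(1) per patch via four corner
    look-ups into the integral image."""
    ii = integral_image(img)
    sums = ii[k:, k:] - ii[:-k, k:] - ii[k:, :-k] + ii[:-k, :-k]
    return sums / (k * k)
```

The same trick extends to local variances and covariances (tables of squared and cross terms), which is what makes sliding-window SSIM-style metrics feasible in real time.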

  19. The 'Fulcrum for Quality'. The physics of provider reform: six key variables to achieve 'MEDQIC'.

    PubMed

    King, R D

    1996-01-01

    As health care providers move from a fee-for-service, revenue-driven system to total capitation, the need to focus on achieving the most efficient delivery of quality integrated care will be paramount to survival. The energy to survive a totally capitated reimbursement system will depend on the degree to which a provider is able to evolve from a revenue-driven to a quality-driven organizational culture. This evolution is going to be very difficult for many health care providers who are committed to a managing-money mentality rather than a managing-quality mentality to achieve the "bottom line" and regard human resources as expendable and quality as unmeasurable rhetoric. The "Fulcrum for Quality" demonstrates how six key variables change when moving from a fee-for-service to a capitated reimbursement system: reimbursement methodology, utilization, operational expenses, information systems, management and communication. In a fee-for-service system, providers experienced minimal restrictions on the revenue limits they could achieve. Consequently, inefficiencies and poor quality had little to do with a provider's ability to increase profits. Under a totally capitated system that limits revenue, providers will find that quality and efficiency directly affect profits. The "Fulcrum for Quality" discusses the transformation of the six key variables and the changes a medical practice manager will need to make in order to achieve the most efficient delivery of quality integrated care. PMID:10154121

  20. Model-based quantification of image quality

    NASA Technical Reports Server (NTRS)

    Hazra, Rajeeb; Miller, Keith W.; Park, Stephen K.

    1989-01-01

    In 1982, Park and Schowengerdt published an end-to-end analysis of a digital imaging system quantifying three principal degradation components: (1) image blur - blurring caused by the acquisition system, (2) aliasing - caused by insufficient sampling, and (3) reconstruction blur - blurring caused by imperfect interpolative reconstruction. This analysis, which measures degradation as the square of the radiometric error, includes the sample-scene phase as an explicit random parameter and characterizes the image degradation caused by imperfect acquisition and reconstruction together with the effects of undersampling and random sample-scene phases. In a recent paper, Mitchell and Netravali displayed the visual effects of the above-mentioned degradations and presented a subjective analysis of their relative importance in determining image quality. The primary aim of the research is to use the analysis of Park and Schowengerdt to correlate their mathematical criteria for measuring image degradations with subjective visual criteria. Insight gained from this research can be exploited in the end-to-end design of optical systems, so that system parameters (transfer functions of the acquisition and display systems) can be designed relative to each other to obtain the best possible results using quantitative measurements.

  1. The optimal polarizations for achieving maximum contrast in radar images

    NASA Technical Reports Server (NTRS)

    Swartz, A. A.; Yueh, H. A.; Kong, J. A.; Novak, L. M.; Shin, R. T.

    1988-01-01

    There is considerable interest in determining the optimal polarizations that maximize contrast between two scattering classes in polarimetric radar images. A systematic approach is presented for obtaining the optimal polarimetric matched filter, i.e., the filter which produces maximum contrast between two scattering classes. The maximization procedure involves solving an eigenvalue problem where the eigenvector corresponding to the maximum contrast ratio is the optimal polarimetric matched filter. To exhibit the physical significance of this filter, it is transformed into its associated transmitting and receiving polarization states, written in terms of horizontal and vertical vector components. For the special case where the transmitting polarization is fixed, the receiving polarization which maximizes the contrast ratio is also obtained. Polarimetric filtering is then applied to synthetic aperture radar images obtained from the Jet Propulsion Laboratory. It is shown, both numerically and through the use of radar imagery, that maximum image contrast can be realized when the data are processed with the optimal polarimetric matched filter.
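Maximizing a contrast ratio of the form (w^H A w)/(w^H B w) between two scattering classes is a generalized eigenvalue problem. A plain NumPy sketch of that step; the matrix names, the assumption of Hermitian positive-definite class covariances, and the normalization are ours:

```python
import numpy as np

def optimal_filter(A, B):
    """Weight vector w maximising the contrast ratio
        (w^H A w) / (w^H B w)
    between two scattering classes with Hermitian positive-definite
    covariance matrices A and B.  The maximiser is the eigenvector of
    B^{-1} A with the largest eigenvalue (a Rayleigh-quotient
    argument); this is an illustrative sketch, not the paper's code."""
    vals, vecs = np.linalg.eig(np.linalg.solve(B, A))
    w = vecs[:, np.argmax(vals.real)]   # eigenvector of top eigenvalue
    return w / np.linalg.norm(w)        # unit-norm filter
```

The resulting w can then be decomposed into transmit and receive polarization states, as the abstract describes.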

  2. The mobile image quality survey game

    NASA Astrophysics Data System (ADS)

    Rasmussen, D. René

    2012-01-01

    In this paper we discuss human assessment of the quality of photographic still images that are degraded in various ways relative to an original, for example due to compression or noise. In particular, we examine and present results from a technique in which observers view images on a mobile device, perform pairwise comparisons, identify defects in the images, and interact with the display to indicate the location of the defects. The technique measures the response time and accuracy of the responses. By posing the survey in a form similar to a game and providing performance feedback to the observer, the technique attempts to increase the engagement of the observers and to avoid exhausting them, a factor that is often a problem for subjective surveys. The results are compared with the known physical magnitudes of the defects and with results from similar web-based surveys. The strengths and weaknesses of the technique are discussed. Possible extensions of the technique to video quality assessment are also discussed.

  3. Hyperspectral and multispectral imaging for evaluating food safety and quality

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Spectral imaging technologies have been developed rapidly during the past decade. This paper presents hyperspectral and multispectral imaging technologies in the area of food safety and quality evaluation, with an introduction, demonstration, and summarization of the spectral imaging techniques avai...

  4. Does Practice Make Perfect? Independent Reading Quantity, Quality and Student Achievement

    ERIC Educational Resources Information Center

    Topping, K. J.; Samuels, J.; Paul, T.

    2007-01-01

    Does reading practice make perfect? Or is reading achievement related to the quality of practice as well as the quantity? To answer these questions, data on 45,670 students in grades 1-12 who read over 3 million books were analyzed. Measures largely of quantity (engaged reading volume) and purely of quality (success in reading comprehension)…

  5. School Quality and Cognitive Achievement Production: A Case Study of Rural Pakistan.

    ERIC Educational Resources Information Center

    Behrman, Jere R.; And Others

    1997-01-01

    Examines determinants of cognitive achievement in rural Pakistan, controlling for cognitive ability, family background, various school quality measures, and educational attainment. Estimates indicate substantial variation in school effectiveness. Investments that improve teacher quality and increase student exposure to teachers are likely to have…

  6. Teacher Quality in Educational Production. Tracking, Decay, and Student Achievement. NBER Working Paper No. 14442

    ERIC Educational Resources Information Center

    Rothstein, Jesse

    2008-01-01

    Growing concerns over the achievement of U.S. students have led to proposals to reward good teachers and penalize (or fire) bad ones. The leading method for assessing teacher quality is "value added" modeling (VAM), which decomposes students' test scores into components attributed to student heterogeneity and to teacher quality. Implicit in the…

  7. Quality of Teaching Mathematics and Learning Achievement Gains: Evidence from Primary Schools in Kenya

    ERIC Educational Resources Information Center

    Ngware, Moses W.; Ciera, James; Musyoka, Peter K.; Oketch, Moses

    2015-01-01

    This paper examines the contribution of quality mathematics teaching to student achievement gains. Quality of mathematics teaching is assessed through teacher demonstration of the five strands of mathematical proficiency, the level of cognitive task demands, and teacher mathematical knowledge. Data is based on 1907 grade 6 students who sat for the…

  8. On pictures and stuff: image quality and material appearance

    NASA Astrophysics Data System (ADS)

    Ferwerda, James A.

    2014-02-01

    Realistic images are a puzzle because they serve as visual representations of objects while also being objects themselves. When we look at an image we are able to perceive both the properties of the image and the properties of the objects represented by the image. Research on image quality has typically focused on improving image properties (resolution, dynamic range, frame rate, etc.) while ignoring the issue of whether images are serving their role as visual representations. In this paper we describe a series of experiments that investigate how well images of different quality convey information about the properties of the objects they represent. In the experiments we focus on the effects that two image properties (contrast and sharpness) have on the ability of images to represent the gloss of depicted objects. We found that different experimental methods produced differing results. Specifically, when the stimulus images were presented using simultaneous pair comparison, observers were influenced by the surface properties of the images and conflated changes in image contrast and sharpness with changes in object gloss. On the other hand, when the stimulus images were presented sequentially, observers were able to disregard the image plane properties and more accurately match the gloss of the objects represented by the different quality images. These findings suggest that in understanding image quality it is useful to distinguish between the quality of the imaging medium and the quality of the visual information represented by that medium.

  9. Pleiades-Hr Innovative Techniques for Radiometric Image Quality Commissioning

    NASA Astrophysics Data System (ADS)

    Blanchet, G.; Lebeque, L.; Fourest, S.; Latry, C.; Porez-Nadal, F.; Lacherade, S.; Thiebaut, C.

    2012-07-01

    The first Pleiades-HR satellite, part of a constellation of two, was launched on December 17, 2011. This satellite produces high resolution optical images. In order to achieve good image quality, Pleiades-HR first had to undergo an important 6-month commissioning phase. This phase consists of calibrating and assessing the radiometric and geometric image quality to offer the best images to end users. This new satellite has benefited from technology improvements in various fields which make it stand out from other Earth observation satellites. In particular, its best-in-class agility performance enables new calibration and assessment techniques. This paper is dedicated to presenting these innovative techniques that have been tested for the first time for the Pleiades-HR radiometric commissioning. Radiometric activities concern compression, absolute calibration, detector normalization, refocusing operations, MTF (Modulation Transfer Function) assessment, signal-to-noise ratio (SNR) estimation, and tuning of the ground processing parameters. The radiometric performance of each activity is summarized in this paper.

  10. Image analysis for dental bone quality assessment using CBCT imaging

    NASA Astrophysics Data System (ADS)

    Suprijanto; Epsilawati, L.; Hajarini, M. S.; Juliastuti, E.; Susanti, H.

    2016-03-01

    Cone beam computerized tomography (CBCT) is one of the X-ray imaging modalities applied in dentistry. This modality can visualize the oral region in 3D and at high resolution. The CBCT jaw image has potential information for the assessment of bone quality that is often used for pre-operative implant planning. We propose a comparison method based on the normalized histogram (NH) of the region of the inter-dental septum and premolar teeth. Furthermore, the NH characteristics from normal and abnormal bone conditions are compared and analyzed. Four test parameters are proposed, i.e., the difference between teeth and bone average intensity (s), the ratio between bone and teeth average intensity (n), the difference between the teeth and bone peak values of the NH (Δp), and the ratio between the teeth and bone NH ranges (r). The results showed that n, s, and Δp have potential to be classification parameters of dental calcium density.
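    The four test parameters can be sketched as below. The intensity samples, bin count, and intensity range are illustrative assumptions, not the paper's values; only the definitions of s, n, Δp, and r follow the abstract.

```python
import numpy as np

def nh_parameters(teeth, bone, bins=64):
    """Compute (s, n, dp, r) from teeth/bone intensity samples:
    s  = difference of average intensities,
    n  = ratio of bone to teeth average intensity,
    dp = difference of normalized-histogram peak values,
    r  = ratio of intensity ranges."""
    ht, _ = np.histogram(teeth, bins=bins, range=(0, 255), density=True)
    hb, _ = np.histogram(bone, bins=bins, range=(0, 255), density=True)
    s = teeth.mean() - bone.mean()
    n = bone.mean() / teeth.mean()
    dp = ht.max() - hb.max()
    r = np.ptp(teeth) / np.ptp(bone)
    return s, n, dp, r

# Synthetic stand-ins: teeth brighter and narrower than bone
rng = np.random.default_rng(0)
teeth = rng.normal(200, 10, 5000).clip(0, 255)
bone = rng.normal(120, 25, 5000).clip(0, 255)
s, n, dp, r = nh_parameters(teeth, bone)
```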

  11. Stereoscopic image quality assessment using disparity-compensated view filtering

    NASA Astrophysics Data System (ADS)

    Song, Yang; Yu, Mei; Jiang, Gangyi; Shao, Feng; Peng, Zongju

    2016-03-01

    Stereoscopic image quality assessment (IQA) plays a vital role in stereoscopic image/video processing systems. We propose a new quality assessment for stereoscopic image that uses disparity-compensated view filtering (DCVF). First, because a stereoscopic image is composed of different frequency components, DCVF is designed to decompose it into high-pass and low-pass components. Then, the qualities of different frequency components are acquired according to their phase congruency and coefficient distribution characteristics. Finally, support vector regression is utilized to establish a mapping model between the component qualities and subjective qualities, and stereoscopic image quality is calculated using this mapping model. Experiments on the LIVE 3-D IQA database and NBU 3-D IQA databases demonstrate that the proposed method can evaluate stereoscopic image quality accurately. Compared with several state-of-the-art quality assessment methods, the proposed method is more consistent with human perception.

  12. Finger vein image quality evaluation using support vector machines

    NASA Astrophysics Data System (ADS)

    Yang, Lu; Yang, Gongping; Yin, Yilong; Xiao, Rongyang

    2013-02-01

    In an automatic finger-vein recognition system, finger-vein image quality is significant for the segmentation, enhancement, and matching processes. In this paper, we propose a finger-vein image quality evaluation method using support vector machines (SVMs). We extract three features, including the gradient, image contrast, and information capacity, from the input image. An SVM model is built on the training images with annotated quality labels (i.e., high/low) and then applied to unseen images for quality evaluation. To resolve the class-imbalance problem in the training data, we perform oversampling for the minority class with a random synthetic minority oversampling technique. Cross-validation is also employed to verify the reliability and stability of the learned model. Our experimental results show the effectiveness of our method in evaluating the quality of finger-vein images; by discarding the low-quality images detected by our method, the overall finger-vein recognition performance is considerably improved.
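    The class-balancing step can be sketched as follows: the minority class is oversampled by interpolating between random pairs of its members (the idea behind synthetic minority oversampling). The feature vectors here are synthetic stand-ins for the gradient / contrast / information-capacity features, and the exact oversampling variant used in the paper may differ.

```python
import numpy as np

def smote_like(minority, n_new, rng):
    """Generate n_new synthetic samples by interpolating between
    random pairs of minority-class feature vectors."""
    i = rng.integers(0, len(minority), n_new)
    j = rng.integers(0, len(minority), n_new)
    t = rng.uniform(0, 1, (n_new, 1))
    return minority[i] + t * (minority[j] - minority[i])

rng = np.random.default_rng(6)
low_quality = rng.normal(0, 1, (20, 3))    # minority class (3 features)
high_quality = rng.normal(2, 1, (200, 3))  # majority class

# Oversample the minority class up to the majority-class size
synthetic = smote_like(low_quality, len(high_quality) - len(low_quality), rng)
balanced_minority = np.vstack([low_quality, synthetic])
```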

  13. Comparison of image compression techniques for high quality based on properties of visual perception

    NASA Astrophysics Data System (ADS)

    Algazi, V. Ralph; Reed, Todd R.

    1991-12-01

    The growing interest and importance of high quality imaging has several roots: Imaging and graphics, or more broadly multimedia, as the predominant means of man-machine interaction on computers, and the rapid maturing of advanced television technology. Because of their economic importance, proposed advanced television standards are being discussed and evaluated for rapid adoption. These advanced standards are based on well known image compression techniques, used for very low bit rate video communications as well. In this paper, we examine the expected improvement in image quality that advanced television and imaging techniques should bring about. We then examine and discuss the data compression techniques which are commonly used, to determine if they are capable of providing the achievable gain in quality, and to assess some of their limitations. We also discuss briefly the potential of these techniques for very high quality imaging and display applications, which extend beyond the range of existing and proposed television standards.

  14. Image quality metrics for optical coherence angiography.

    PubMed

    Lozzi, Andrea; Agrawal, Anant; Boretsky, Adam; Welle, Cristin G; Hammer, Daniel X

    2015-07-01

    We characterized image quality in optical coherence angiography (OCA) en face planes of the mouse cortical capillary network, in terms of signal-to-noise ratio (SNR) and Weber contrast (Wc), through a novel mask-based segmentation method. The method was used to compare two adjacent B-scan processing algorithms, (1) average absolute difference (AAD) and (2) standard deviation (SD), while varying the number of lateral cross-sections acquired (also known as the gate length, N). AAD and SD are identical at N = 2 and exhibited similar image quality for N < 10. However, AAD is relatively less susceptible to bulk tissue motion artifact than SD. SNR and Wc were 15% and 35% higher for AAD from N = 25 to 100. In addition, data sets were acquired with two objective lenses of different magnifications to quantify the effect of lateral resolution on fine capillary detection. The lower power objective yielded a significant mean broadening of 17% in the full width at half maximum (FWHM) diameter. These results may guide study and device designs for OCA capillary and blood flow quantification. PMID:26203372
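    The two B-scan processing algorithms can be sketched as follows, assuming repeated B-scans stacked along the first axis (shape N × depth × width). Note that for N = 2 the population standard deviation of two samples is |x₁ − x₂|/2, so AAD and SD differ only by a constant factor there.

```python
import numpy as np

def aad(bscans):
    """Average absolute difference between adjacent B-scans in the gate."""
    return np.mean(np.abs(np.diff(bscans, axis=0)), axis=0)

def sd(bscans):
    """Per-pixel standard deviation across the gate of N B-scans."""
    return np.std(bscans, axis=0)

# Synthetic gate of N = 2 noise-only B-scans (no flow signal)
rng = np.random.default_rng(1)
gate = rng.normal(0, 0.1, (2, 16, 16))

angio_aad = aad(gate)
angio_sd = sd(gate)
```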

  15. Digital mammography--DQE versus optimized image quality in clinical environment: an on site study

    NASA Astrophysics Data System (ADS)

    Oberhofer, Nadia; Fracchetti, Alessandro; Springeth, Margareth; Moroder, Ehrenfried

    2010-04-01

    The intrinsic quality of the detection systems of 7 different digital mammography units (5 direct radiography, DR; 2 computed radiography, CR), expressed by DQE, has been compared with their image quality/dose performance in clinical use. DQE measurements followed IEC 62220-1-2, using a tungsten test object for MTF determination. For image quality assessment two different methods were applied: 1) measurement of the contrast-to-noise ratio (CNR) according to the European guidelines and 2) contrast-detail (CD) evaluation. The latter was carried out with the phantom CDMAM ver. 3.4 and the commercial software CDMAM Analyser ver. 1.1 (both Artinis) for automated image analysis. The overall image quality index IQFinv proposed by the software has been validated. Correspondence between the two methods was demonstrated by establishing a linear correlation between CNR and IQFinv. All systems were optimized with respect to image quality and average glandular dose (AGD) within the constraints of automatic exposure control (AEC). For each unit, a good image quality level was defined by means of CD analysis, and the corresponding CNR value was taken as the target value. The goal was to achieve constant image quality, i.e., the target CNR value, at minimum dose for different PMMA phantom thicknesses. All DR systems exhibited higher DQE and significantly better image quality than the CR systems. Generally, switching (where available) to a target/filter combination with an x-ray spectrum of higher mean energy permitted dose savings at equal image quality. However, several systems did not allow the AEC to be modified in order to apply an optimal radiographic technique in clinical use. The best image quality/dose ratio was achieved by a unit with an a-Se detector and W anode only recently available on the market.
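    The CNR figure of merit can be sketched as below, using the common formulation of signal/background mean difference divided by background noise; the ROI samples are synthetic, and the guideline's exact ROI placement and noise definition may differ in detail.

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: mean difference over background noise."""
    return (signal_roi.mean() - background_roi.mean()) / background_roi.std()

# Synthetic ROIs: background ~ N(100, 5), contrast object ~ N(115, 5)
rng = np.random.default_rng(2)
background = rng.normal(100.0, 5.0, 10_000)
signal = rng.normal(115.0, 5.0, 10_000)

value = cnr(signal, background)   # roughly (115 - 100) / 5 = 3
```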

  16. Quality Control of Diffusion Weighted Images

    PubMed Central

    Liu, Zhexing; Wang, Yi; Gerig, Guido; Gouttard, Sylvain; Tao, Ran; Fletcher, Thomas; Styner, Martin

    2013-01-01

    Diffusion Tensor Imaging (DTI) has become an important MRI procedure to investigate the integrity of white matter in the brain in vivo. DTI is estimated from a series of acquired Diffusion Weighted Imaging (DWI) volumes. DWI data suffer from inherently low SNR and the overall long scanning time of multiple directional encodings, with a correspondingly large risk of encountering several kinds of artifacts. These artifacts can be too severe for a correct and stable estimation of the diffusion tensor. Thus, a quality control (QC) procedure is absolutely necessary for DTI studies. Currently, routine DTI QC procedures are conducted manually by visually checking the DWI data set gradient by gradient and slice by slice. The results often suffer from low consistency across data sets, lack of agreement between experts, and the difficulty of judging motion artifacts by qualitative inspection. Additionally, considerable manpower is needed for this step due to the large number of images to QC, which is common for group comparison and longitudinal studies, especially with increasing numbers of diffusion gradient directions. We present a framework for automatic DWI QC. We developed a tool called DTIPrep, which pipelines the QC steps with a detailed protocoling and reporting facility and is fully open source. This framework/tool has been successfully applied to several DTI studies with several hundred DWIs in our lab as well as in collaborating labs in Utah and Iowa. In our studies, the tool provides a crucial piece for robust DTI analysis in brain white matter studies. PMID:24353379

  17. Low-Achieving Readers, High Expectations: Image Theatre Encourages Critical Literacy

    ERIC Educational Resources Information Center

    Rozansky, Carol Lloyd; Aagesen, Colleen

    2010-01-01

    Students in an eighth-grade, urban, low-achieving reading class were introduced to critical literacy through engagement in Image Theatre. Developed by liberatory dramatist Augusto Boal, Image Theatre gives participants the opportunity to examine texts in the triple role of interpreter, artist, and sculptor (i.e., image creator). The researchers…

  18. Image quality assessment using multi-method fusion.

    PubMed

    Liu, Tsung-Jung; Lin, Weisi; Kuo, C-C Jay

    2013-05-01

    A new methodology for objective image quality assessment (IQA) with multi-method fusion (MMF) is presented in this paper. The research is motivated by the observation that there is no single method that can give the best performance in all situations. To achieve MMF, we adopt a regression approach. The new MMF score is set to be the nonlinear combination of scores from multiple methods with suitable weights obtained by a training process. In order to improve the regression results further, we divide the distorted images into three to five groups based on the distortion types and perform regression within each group, which is called "context-dependent MMF" (CD-MMF). One task in CD-MMF is to determine the context automatically, which is achieved by a machine learning approach. To further reduce the complexity of MMF, we apply algorithms to select a small subset from the candidate method set. The result is very good even if only three quality assessment methods are included in the fusion process. The proposed MMF method using support vector regression is shown to outperform a large number of existing IQA methods by a significant margin when tested on six representative databases. PMID:23288335
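    The fusion idea can be sketched as below. The paper uses support vector regression; to keep the sketch dependency-free, a plain affine least-squares fit is substituted, and the method scores and subjective ratings are synthetic.

```python
import numpy as np

# Synthetic training data: scores from three candidate IQA methods and
# subjective ratings that are (by construction) a weighted mix of them.
rng = np.random.default_rng(3)
n = 200
scores = rng.uniform(0, 1, (n, 3))
subjective = scores @ np.array([0.5, 0.3, 0.2]) + rng.normal(0, 0.01, n)

# Affine fusion model: fused = w1*s1 + w2*s2 + w3*s3 + b
X = np.column_stack([scores, np.ones(n)])
w, *_ = np.linalg.lstsq(X, subjective, rcond=None)

fused = X @ w
corr = np.corrcoef(fused, subjective)[0, 1]   # fit quality on training data
```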

  19. Image Quality Characteristics of Handheld Display Devices for Medical Imaging

    PubMed Central

    Yamazaki, Asumi; Liu, Peter; Cheng, Wei-Chung; Badano, Aldo

    2013-01-01

    Handheld devices such as mobile phones and tablet computers have become widespread with thousands of available software applications. Recently, handhelds are being proposed as part of medical imaging solutions, especially in emergency medicine, where immediate consultation is required. However, handheld devices differ significantly from medical workstation displays in terms of display characteristics. Moreover, the characteristics vary significantly among device types. We investigate the image quality characteristics of various handheld devices with respect to luminance response, spatial resolution, spatial noise, and reflectance. We show that the luminance characteristics of the handheld displays are different from those of workstation displays complying with grayscale standard target response suggesting that luminance calibration might be needed. Our results also demonstrate that the spatial characteristics of handhelds can surpass those of medical workstation displays particularly for recent generation devices. While a 5 mega-pixel monochrome workstation display has horizontal and vertical modulation transfer factors of 0.52 and 0.47 at the Nyquist frequency, the handheld displays released after 2011 can have values higher than 0.63 at the respective Nyquist frequencies. The noise power spectra for workstation displays are higher than 1.2×10−5 mm2 at 1 mm−1, while handheld displays have values lower than 3.7×10−6 mm2. Reflectance measurements on some of the handheld displays are consistent with measurements for workstation displays with, in some cases, low specular and diffuse reflectance coefficients. The variability of the characterization results among devices due to the different technological features indicates that image quality varies greatly among handheld display devices. PMID:24236113

  20. Quality assessment for spectral domain optical coherence tomography (OCT) images

    NASA Astrophysics Data System (ADS)

    Liu, Shuang; Paranjape, Amit S.; Elmaanaoui, Badr; Dewelle, Jordan; Rylander, H. Grady, III; Markey, Mia K.; Milner, Thomas E.

    2009-02-01

    Retinal nerve fiber layer (RNFL) thickness, a measure of glaucoma progression, can be measured in images acquired by spectral domain optical coherence tomography (OCT). The accuracy of RNFL thickness estimation, however, is affected by the quality of the OCT images. In this paper, a new parameter, signal deviation (SD), which is based on the standard deviation of the intensities in OCT images, is introduced for objective assessment of OCT image quality. Two other objective assessment parameters, signal to noise ratio (SNR) and signal strength (SS), are also calculated for each OCT image. The results of the objective assessment are compared with subjective assessment. In the subjective assessment, one OCT expert graded the image quality according to a three-level scale (good, fair, and poor). The OCT B-scan images of the retina from six subjects are evaluated by both objective and subjective assessment. From the comparison, we demonstrate that the objective assessment successfully differentiates between the acceptable quality images (good and fair images) and poor quality OCT images as graded by OCT experts. We evaluate the performance of the objective assessment under different quality assessment parameters and demonstrate that SD is the best at distinguishing between fair and good quality images. The accuracy of RNFL thickness estimation is improved significantly after poor quality OCT images are rejected by automated objective assessment using the SD, SNR, and SS.
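    Intensity-based quality indices of the kind described above can be sketched as follows. The exact definitions of SD, SNR, and SS in the paper may differ; here signal deviation is simply the standard deviation of image intensities, the SNR estimate is a common peak-over-noise form, and the B-scans are synthetic.

```python
import numpy as np

def signal_deviation(img):
    """Signal deviation: standard deviation of image intensities."""
    return img.std()

def snr_db(img, noise_floor):
    """Simple SNR estimate: peak intensity over an assumed noise floor."""
    return 20 * np.log10(img.max() / noise_floor)

# Synthetic B-scans: a 'good' scan with structure-like intensity spread
# and a 'poor' low-contrast scan
rng = np.random.default_rng(4)
good = rng.normal(0.5, 0.2, (64, 64)).clip(0, 1)
poor = rng.normal(0.5, 0.02, (64, 64)).clip(0, 1)

snr = snr_db(good, noise_floor=0.01)
```

A quality gate would then reject scans whose SD or SNR falls below empirically chosen thresholds, as in the automated assessment described above.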

  1. Depressive Symptoms in 3rd Grade Teachers: Relations to Classroom Quality and Student Achievement

    PubMed Central

    McLean, Leigh; Connor, Carol McDonald

    2014-01-01

    This study investigated associations among third grade teachers’ (n = 27) symptoms of depression, quality of the classroom-learning environment (CLE), and students’ (n = 523, mean age 8.6 years) math and literacy performance. Teachers’ depressive symptoms in the winter negatively predicted students’ spring mathematics achievement. This depended on students’ fall mathematics scores; students who began the year with weaker math skills and were in classrooms where teachers reported more depressive symptoms achieved smaller gains than did peers whose teachers reported fewer symptoms. Teachers’ depressive symptoms were negatively associated with quality of CLE, and quality of CLE mediated the association between depressive symptoms and student achievement. Findings point to the importance of teachers’ mental health, with implications for policy and practice. PMID:25676719

  2. Depressive symptoms in third-grade teachers: relations to classroom quality and student achievement.

    PubMed

    McLean, Leigh; McDonald Connor, Carol

    2015-01-01

    This study investigated associations among third-grade teachers' (N = 27) symptoms of depression, quality of the classroom-learning environment (CLE), and students' (N = 523, mean age = 8.6 years) math and literacy performance. Teachers' depressive symptoms in the winter negatively predicted students' spring mathematics achievement. This depended on students' fall mathematics scores; students who began the year with weaker math skills and were in classrooms where teachers reported more depressive symptoms achieved smaller gains than did peers whose teachers reported fewer symptoms. Teachers' depressive symptoms were negatively associated with quality of CLE, and quality of CLE mediated the association between depressive symptoms and student achievement. The findings point to the importance of teachers' mental health, with implications for policy and practice. PMID:25676719

  3. Taking image quality factor into the OPC model tuning flow

    NASA Astrophysics Data System (ADS)

    Wang, Ching-Heng; Liu, Qingwei; Zhang, Liguo

    2007-03-01

    All OPC model builders are in search of a physically realistic model that is adequately calibrated and contains the information that can be used for process predictions and analysis of a given process. But there is still some unknown physics in the process, and wafer data sets are not perfect. In most cases, even using the average values of different empirical data sets will still take inaccurate measurements into the model fitting process, which makes the fitting more time consuming and may also cause loss of convergence and stability. Image quality is one of the most worrisome obstacles faced by next-generation lithography. Nowadays, considerable effort is devoted to enhancing the contrast, as well as understanding its impact on devices. It is a persistent problem for 193nm micro-lithography and will carry us for at least three generations, culminating with immersion lithography. This work weights different wafer data points with a weighting function. The weighting function depends on the normalized image log slope (NILS), which reflects the image quality. Using this approach, we can filter out wrong information from the process and make the OPC model more accurate. CalibreWorkbench is the platform used in this study, which has been proven to have excellent performance on 0.13um, 90nm and 65nm production and development model setups. Leveraging its automatic optical-tuning function, we practiced the best weighting approach to achieve the most efficient and convergent tuning flow.
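    The weighting idea can be sketched as below: gauge points with higher NILS (better image quality, hence more trustworthy CD measurements) get more weight in the fitting cost. The specific weighting form, the floor value, and the CD/NILS numbers are assumptions for illustration, not the paper's.

```python
import numpy as np

def nils_weights(nils, floor=0.1):
    """Normalized weights proportional to NILS, floored so that no
    gauge point is discarded entirely (floor value is an assumption)."""
    w = np.maximum(np.asarray(nils, dtype=float), floor)
    return w / w.sum()

def weighted_rms_error(measured_cd, modeled_cd, nils):
    """NILS-weighted RMS fitting error over the gauge set."""
    w = nils_weights(nils)
    r = np.asarray(measured_cd, dtype=float) - np.asarray(modeled_cd, dtype=float)
    return np.sqrt(np.sum(w * r**2))

# Hypothetical gauges: the low-NILS point (0.3) contributes least
err = weighted_rms_error([50.0, 52.0, 48.0], [50.5, 51.0, 49.0],
                         [2.0, 1.5, 0.3])
```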

  4. The influence of statistical variations on image quality

    NASA Astrophysics Data System (ADS)

    Hultgren, Bror; Hertel, Dirk; Bullitt, Julian

    2006-01-01

    For more than thirty years imaging scientists have constructed metrics to predict psychovisually perceived image quality. Such metrics are based on a set of objectively measurable basis functions such as Noise Power Spectrum (NPS), Modulation Transfer Function (MTF), and characteristic curves of tone and color reproduction. Although these basis functions constitute a set of primitives that fully describes an imaging system from the standpoint of information theory, we found that in practical imaging systems the basis functions themselves are determined by system-specific primitives, i.e., technology parameters. In the example of a printer, MTF and NPS are largely determined by dot structure. In addition, MTF is determined by color registration, and NPS by streaking and banding. Since any given imaging system is only a single representation of a class of more or less identical systems, the family of imaging systems and the single system are not described by a unique set of image primitives. For an image produced by a given imaging system, the set of image primitives describing that particular image will be a single instantiation of the underlying statistical distribution of those primitives. If we knew precisely the set of imaging primitives that describe the given image, we should be able to predict its image quality. Since only the distributions are known, we can only predict the distribution of image quality for a given image as produced by the larger class of 'identical systems'. We will demonstrate the combinatorial effect of the underlying statistical variations in the image primitives on the objectively measured image quality of a population of printers as well as on the perceived image quality of a set of test images. We will also discuss the choice of test image sets and the impact of scene content on the distribution of perceived image quality.

  5. SU-E-I-43: Pediatric CT Dose and Image Quality Optimization

    SciTech Connect

    Stevens, G; Singh, R

    2014-06-01

    Purpose: To design an approach to optimize radiation dose and image quality for pediatric CT imaging, and to evaluate expected performance. Methods: A methodology was designed to quantify relative image quality as a function of CT image acquisition parameters. Image contrast and image noise were used to indicate expected conspicuity of objects, and a wide-cone system was used to minimize scan time for motion avoidance. A decision framework was designed to select acquisition parameters as a weighted combination of image quality and dose. Phantom tests were used to acquire images at multiple techniques to demonstrate expected contrast, noise and dose. Anthropomorphic phantoms with contrast inserts were imaged on a 160mm CT system with tube voltage capabilities as low as 70kVp. Previously acquired clinical images were used in conjunction with simulation tools to emulate images at different tube voltages and currents to assess human observer preferences. Results: Examination of image contrast, noise, dose and tube/generator capabilities indicates a clinical task and object-size dependent optimization. Phantom experiments confirm that system modeling can be used to achieve the desired image quality and noise performance. Observer studies indicate that clinical utilization of this optimization requires a modified approach to achieve the desired performance. Conclusion: This work indicates the potential to optimize radiation dose and image quality for pediatric CT imaging. In addition, the methodology can be used in an automated parameter selection feature that can suggest techniques given a limited number of user inputs. G Stevens and R Singh are employees of GE Healthcare.

  6. Using short-wave infrared imaging for fruit quality evaluation

    NASA Astrophysics Data System (ADS)

    Zhang, Dong; Lee, Dah-Jye; Desai, Alok

    2013-12-01

    Quality evaluation of agricultural and food products is important for processing, inventory control, and marketing. Fruit size and surface quality are two important quality factors for high-quality fruit such as Medjool dates. Fruit size is usually measured by length, which can be determined easily with simple image processing techniques. Surface quality evaluation, on the other hand, requires a more complicated design, in both image acquisition and image processing. Skin delamination is considered a major factor that affects fruit quality and its value. This paper presents an efficient histogram analysis and image processing technique that is designed specifically for real-time surface quality evaluation of Medjool dates. This approach, based on short-wave infrared imaging, provides excellent image contrast between the fruit surface and delaminated skin, which allows significant simplification of the image processing algorithm and reduction of computational power requirements. The proposed quality grading method requires a very simple training procedure to obtain a gray scale image histogram for each quality level. Using histogram comparison, each date is assigned to one of four quality levels, and an optimal threshold is calculated for segmenting skin delamination areas from the fruit surface. The percentage of the fruit surface that has skin delamination can then be calculated for quality evaluation. This method has been implemented for commercial production and proven to be efficient and accurate.
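    The histogram-comparison grading step can be sketched as below: each fruit's gray scale histogram is compared against one reference histogram per quality level, and the fruit is assigned the closest level. The reference histograms, the L1 distance, and the intensity distributions are illustrative assumptions.

```python
import numpy as np

def grade(hist, references):
    """Return the index of the reference histogram closest in L1 distance."""
    dists = [np.abs(hist - ref).sum() for ref in references]
    return int(np.argmin(dists))

# Build one synthetic reference histogram per quality level
bins = 32
rng = np.random.default_rng(5)
refs = []
for mean in (60, 100, 140, 180):     # four hypothetical quality levels
    sample = rng.normal(mean, 15, 5000).clip(0, 255)
    h, _ = np.histogram(sample, bins=bins, range=(0, 255), density=True)
    refs.append(h)

# Grade a probe fruit whose intensities match level-1 statistics
probe = rng.normal(100, 15, 5000).clip(0, 255)
h, _ = np.histogram(probe, bins=bins, range=(0, 255), density=True)
level = grade(h, refs)
```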

  7. Quality Prediction of Asymmetrically Distorted Stereoscopic 3D Images.

    PubMed

    Wang, Jiheng; Rehman, Abdul; Zeng, Kai; Wang, Shiqi; Wang, Zhou

    2015-11-01

    Objective quality assessment of distorted stereoscopic images is a challenging problem, especially when the distortions in the left and right views are asymmetric. Existing studies suggest that simply averaging the quality of the left and right views well predicts the quality of symmetrically distorted stereoscopic images, but generates substantial prediction bias when applied to asymmetrically distorted stereoscopic images. In this paper, we first build a database that contains both single-view and symmetrically and asymmetrically distorted stereoscopic images. We then carry out a subjective test, where we find that the quality prediction bias of the asymmetrically distorted images could lean toward opposite directions (overestimate or underestimate), depending on the distortion types and levels. Our subjective test also suggests that eye dominance effect does not have strong impact on the visual quality decisions of stereoscopic images. Furthermore, we develop an information content and divisive normalization-based pooling scheme that improves upon structural similarity in estimating the quality of single-view images. Finally, we propose a binocular rivalry-inspired multi-scale model to predict the quality of stereoscopic images from that of the single-view images. Our results show that the proposed model, without explicitly identifying image distortion types, successfully eliminates the prediction bias, leading to significantly improved quality prediction of the stereoscopic images. PMID:26087491

  8. LANDSAT-4 image data quality analysis

    NASA Technical Reports Server (NTRS)

    Anuta, P. E. (Principal Investigator)

    1982-01-01

    Work done to evaluate the geometric and radiometric quality of early LANDSAT-4 sensor data is described. Band-to-band and channel-to-channel registration evaluations were carried out using a line correlator. Visual blink comparisons were run on an image display to observe band-to-band registration over 512 x 512 pixel blocks. The results indicate a 0.5 pixel line misregistration between the 1.55-1.75 and 2.08-2.35 micrometer bands and the first four bands. A misregistration of four 30 m lines and columns was also observed for the thermal IR band. Radiometric evaluation included mean and variance analysis of individual detectors and principal components analysis. Results indicate that detector bias for all bands is close to or within tolerance. Bright spots were observed in the thermal IR band on an 18 line by 128 pixel grid; no explanation for this was pursued. The general overall quality of the TM was judged to be very high.
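
    The per-detector mean and variance analysis mentioned above can be sketched as follows, assuming a TM-style whisk-broom band in which consecutive image lines cycle through the detectors (the detector count below is a parameter, not the sensor's actual value):

    ```python
    import numpy as np

    def detector_stats(band, n_detectors):
        """Mean and variance per detector for a band whose image lines
        cycle through the detectors. Diverging means across detectors
        indicate relative detector bias (striping)."""
        means = np.array([band[d::n_detectors].mean()
                          for d in range(n_detectors)])
        variances = np.array([band[d::n_detectors].var()
                              for d in range(n_detectors)])
        return means, variances
    ```

    A biased detector shows up as one mean sitting away from the others while its within-detector variance stays comparable.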

  9. Metric-based no-reference quality assessment of heterogeneous document images

    NASA Astrophysics Data System (ADS)

    Nayef, Nibal; Ogier, Jean-Marc

    2015-01-01

    No-reference image quality assessment (NR-IQA) aims at computing an image quality score that best correlates with either human-perceived image quality or an objective quality measure, without any prior knowledge of reference images. Although learning-based NR-IQA methods have achieved the best state-of-the-art results so far, those methods perform well only on the datasets on which they were trained. Such datasets usually contain homogeneous documents, whereas in reality document images come from different sources, and it is unrealistic to collect training samples from every possible capturing device and every document type. Hence, we argue that a metric-based IQA method is more suitable for heterogeneous documents. We propose an NR-IQA method whose objective quality measure is OCR accuracy. The method combines distortion-specific quality metrics; the final quality score is calculated taking into account the proportions of, and the dependency among, the different distortions. Experimental results show that the method achieves results competitive with learning-based NR-IQA methods on standard datasets, and performs better on heterogeneous documents.
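
    A toy version of combining distortion-specific metric scores by distortion proportions might look like this. The paper also models the dependency among distortions, which is omitted here; the function name and the proportion-weighted average are illustrative:

    ```python
    def combined_quality(metric_scores, proportions):
        """Weight each distortion-specific quality score (e.g. for blur,
        noise, low contrast) by the estimated proportion of that
        distortion in the document image."""
        total = sum(proportions)
        return sum(s * p for s, p in zip(metric_scores, proportions)) / total
    ```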

  10. Lesion insertion in projection domain for computed tomography image quality assessment

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Ma, Chi; Yu, Zhicong; Leng, Shuai; Yu, Lifeng; McCollough, Cynthia

    2015-03-01

    To perform task-based image quality assessment in CT, it is desirable to have a large number of realistic patient images with known diagnostic truth. One effective way to achieve this is to create hybrid images that combine patient images with simulated lesions. Because conventional hybrid images generated in the image domain fail to reflect the impact of scan and reconstruction parameters on lesion appearance, this study explored a projection-domain approach. Liver lesion models were forward projected according to the geometry of a commercial CT scanner to acquire lesion projections. The lesion projections were then inserted into patient projections (decoded from commercial CT raw data with the assistance of the vendor) and reconstructed to acquire hybrid images. To validate the accuracy of the forward projection geometry, simulated images reconstructed from the forward projections of a digital ACR phantom were compared to physically acquired ACR phantom images. To validate the hybrid images, lesion models were inserted into patient images and visually assessed. Results showed that the simulated and physically acquired phantom images were highly similar in terms of HU accuracy and high-contrast resolution. The lesions in the hybrid images had a realistic appearance and merged naturally into the liver background. In addition, the inserted lesions demonstrated reconstruction-parameter-dependent appearance. Compared to the conventional image-domain approach, our method enables more realistic hybrid images for image quality assessment.
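
    The property that makes projection-domain insertion work is the linearity of the forward projection: line integrals of attenuation add, so a lesion's projections can simply be added to the patient's projections before reconstruction. A toy parallel-beam projector (axis sums at multiples of 90 degrees, so plain NumPy suffices) is enough to demonstrate this; the actual study used the vendor's fan/cone-beam geometry:

    ```python
    import numpy as np

    def forward_project(img, n_angles=4):
        """Toy parallel-beam forward projector: line integrals (column
        sums) of a square image at n_angles rotations of 90 degrees."""
        return np.array([np.rot90(img, k).sum(axis=0)
                         for k in range(n_angles)])

    # Linearity is what licenses projection-domain lesion insertion:
    # forward_project(patient + lesion)
    #     == forward_project(patient) + forward_project(lesion)
    ```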

  11. Food quality assessment by NIR hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Whitworth, Martin B.; Millar, Samuel J.; Chau, Astor

    2010-04-01

    Near infrared reflectance (NIR) spectroscopy is well established in the food industry for rapid compositional analysis of bulk samples. NIR hyperspectral imaging provides new opportunities to measure the spatial distribution of components such as moisture and fat, and to identify and measure specific regions of composite samples. An NIR hyperspectral imaging system has been constructed for food research applications, incorporating a SWIR camera with a cooled 14 bit HgCdTe detector and N25E spectrograph (Specim Ltd, Finland). Samples are scanned in a pushbroom mode using a motorised stage. The system has a spectral resolution of 256 pixels covering a range of 970-2500 nm and a spatial resolution of 320 pixels covering a swathe adjustable from 8 to 300 mm. Images are acquired at a rate of up to 100 lines per second, enabling samples to be scanned within a few seconds. Data are captured using SpectralCube software (Specim) and analysed using ENVI and IDL (ITT Visual Information Solutions). Several food applications are presented. The strength of individual absorbance bands enables the distribution of particular components to be assessed. Examples are shown for detection of added gluten in wheat flour and to study the effect of processing conditions on fat distribution in chips/French fries. More detailed quantitative calibrations have been developed to study evolution of the moisture distribution in baguettes during storage at different humidities, to assess freshness of fish using measurements of whole cod and fillets, and for prediction of beef quality by identification and separate measurement of lean and fat regions.
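
    Mapping the strength of a single absorbance band across a pushbroom hypercube can be sketched as below. The band/reference wavelengths and the simple two-point baseline are illustrative only, not the quantitative calibrations developed in the paper:

    ```python
    import numpy as np

    def absorbance_map(hypercube, wavelengths, band_nm, ref_nm):
        """Relative absorbance image for one band of a hypercube shaped
        (lines, samples, wavelengths): reference reflectance minus band
        reflectance, higher where the target component absorbs more."""
        band = int(np.argmin(np.abs(wavelengths - band_nm)))
        ref = int(np.argmin(np.abs(wavelengths - ref_nm)))
        return hypercube[..., ref] - hypercube[..., band]
    ```

    For moisture mapping one might pick a band near the ~1450 nm water absorption feature and a nearby non-absorbing reference wavelength.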

  12. Achieving performance breakthroughs in an HMO business process through quality planning.

    PubMed

    Hanan, K B

    1993-01-01

    Kaiser Permanente's Georgia Region commissioned a quality planning team to design a new process to improve payments to its suppliers and vendors. The result of the team's effort was a 73 percent reduction in cycle time. This team's experiences point to the advantages of process redesign as a quality planning model, as well as some general guidelines for its most effective use in teams. If quality planning project teams are carefully configured, sufficiently expert in the existing process, and properly supported by management, organizations can achieve potentially dramatic improvements in process performance using this approach. PMID:10130708

  13. Perceptual Quality Assessment for Multi-Exposure Image Fusion.

    PubMed

    Ma, Kede; Zeng, Kai; Wang, Zhou

    2015-11-01

    Multi-exposure image fusion (MEF) is considered an effective quality enhancement technique widely adopted in consumer electronics, but little work has been dedicated to the perceptual quality assessment of multi-exposure fused images. In this paper, we first build an MEF database and carry out a subjective user study to evaluate the quality of images generated by different MEF algorithms. There are several useful findings. First, considerable agreement has been observed among human subjects on the quality of MEF images. Second, no single state-of-the-art MEF algorithm produces the best quality for all test images. Third, the existing objective quality models for general image fusion are very limited in predicting perceived quality of MEF images. Motivated by the lack of appropriate objective models, we propose a novel objective image quality assessment (IQA) algorithm for MEF images based on the principle of the structural similarity approach and a novel measure of patch structural consistency. Our experimental results on the subjective database show that the proposed model well correlates with subjective judgments and significantly outperforms the existing IQA models for general image fusion. Finally, we demonstrate the potential application of the proposed model by automatically tuning the parameters of MEF algorithms. PMID:26068317
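
    The patch structural-consistency idea can be caricatured as follows: compare a fused patch against the source patch that should dominate it, taken here to be the highest-contrast one. This is a simplified stand-in; the paper's actual measure builds a weighted structure model from all exposures:

    ```python
    import numpy as np

    def patch_consistency(fused_patch, source_patches):
        """Correlation-like score in [-1, 1] between a fused patch and
        the highest-contrast source patch (a crude proxy for the
        'desired' structure at that location)."""
        contrasts = [p.std() for p in source_patches]
        desired = source_patches[int(np.argmax(contrasts))]
        f = fused_patch - fused_patch.mean()
        d = desired - desired.mean()
        denom = np.sqrt((f * f).sum() * (d * d).sum()) + 1e-12
        return float((f * d).sum() / denom)
    ```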

  14. Mathematics Teacher Quality: Its Distribution and Relationship with Student Achievement in Turkey

    ERIC Educational Resources Information Center

    Özel, Zeynep Ebrar Yetkiner; Özel, Serkan

    2013-01-01

    A main purpose of the present study was to investigate the distribution of qualified mathematics teachers in relation to students' socioeconomic status (SES), as measured by parental education, among Turkish middle schools. Further, relationships between mathematics teacher quality indicators and students' mathematics achievement were…

  15. The Effects of Two Intervention Programs on Teaching Quality and Student Achievement

    ERIC Educational Resources Information Center

    Azkiyah, S. N.; Doolaard, Simone; Creemers, Bert P. M.; Van Der Werf, M. P. C.

    2014-01-01

    This paper compares the effectiveness of two interventions aimed to improve teaching quality and student achievement in Indonesia. The first intervention was the use of education standards, while the second one was the combination of education standards with a teacher improvement program. The study involved 50 schools, 52 teachers, and 1660…

  16. Transactional Relationships between Latinos' Friendship Quality and Academic Achievement during the Transition to Middle School

    ERIC Educational Resources Information Center

    Sebanc, Anne M.; Guimond, Amy B.; Lutgen, Jeff

    2016-01-01

    This study investigates whether friendship quality, academic achievement, and mastery goal orientation predict each other across the transition to middle school. Participants were 146 Latino students (75 girls) followed from the end of elementary school through the first year of middle school. Measures included positive and negative friendship…

  17. The Effect of the Adoption of the Quality Philosophy by Teachers on Student Achievement

    ERIC Educational Resources Information Center

    Sandifer, Cody Clark

    2009-01-01

    The purpose of this study was to determine if the adoption of the Deming philosophy by teachers and use of the LtoJ[R] process resulted in greater academic achievement. Results of internal consistency analysis indicated that the instrument, the "Commitment to Quality Inventory for Educators," was a reliable measure of the Deming philosophy for…

  18. The Relationship of IEP Quality to Curricular Access and Academic Achievement for Students with Disabilities

    ERIC Educational Resources Information Center

    La Salle, Tamika P.; Roach, Andrew T.; McGrath, Dawn

    2013-01-01

    The purpose of this study was to investigate the quality of Individualized Education Programs (IEPs) and its influence on academic achievement, inclusion in general education classrooms, and curricular access for students with disabilities. 130 teachers from the state of Indiana were asked to submit the most recent IEP of one of their students in…

  19. Friendship Quality and School Achievement: A Longitudinal Analysis during Primary School

    ERIC Educational Resources Information Center

    Zucchetti, Giulia; Candela, Filippo; Sacconi, Beatrice; Rabaglietti, Emanuela

    2015-01-01

    This study examined the longitudinal relationship between friendship quality (positive and negative) and school achievement among 228 school-age children (51% girls, M = 8.09, SD = 0.41). A three-wave cross-lagged analysis was used to determine the direction of influence between these domains across school years. Findings revealed that: (a) school…

  20. Student Course Taking and Teacher Quality: Their Effects on Achievement and Growth

    ERIC Educational Resources Information Center

    Heck, Ronald H.; Mahoe, Rochelle

    2010-01-01

    Purpose: The purpose of this paper is to examine the relationship between high school students' curricular positions, their perceptions of the quality of their teachers, and school academic process variables on students' growth rates and ending achievement in mathematics and science. Design/methodology/approach: Multilevel latent curve modeling is…

  1. Teacher-Child Relationship Quality and Academic Achievement of Chinese American Children in Immigrant Families

    ERIC Educational Resources Information Center

    Ly, Jennifer; Zhou, Qing; Chu, Keira; Chen, Stephen H.

    2012-01-01

    This study examined the cross-sectional relations between teacher-child relationship quality (TCRQ) and math and reading achievement in a socio-economically diverse sample of Chinese American first- and second-grade children in immigrant families (N=207). Teachers completed a questionnaire measuring TCRQ dimensions including closeness, conflict,…

  2. The Relation among School District Health, Total Quality Principles for School Organization and Student Achievement

    ERIC Educational Resources Information Center

    Marshall, Jon; Pritchard, Ruie; Gunderson, Betsey

    2004-01-01

    The purpose of this study was to determine the congruence among W. E. Deming's 14 points for Total Quality Management (TQM), the organizational health of school districts, and student achievement. Based on Kanter's (1983) concept of a Culture of Pride with a Climate of Success, healthy districts were defined as having an organizational culture…

  3. Can Schools Achieve Both Quality and Equity? Investigating the Two Dimensions of Educational Effectiveness

    ERIC Educational Resources Information Center

    Kyriakides, L.; Creemers, B. P. M.

    2011-01-01

    This article investigates the extent to which schools can achieve both equity and quality. Data emerged from two effectiveness studies in teaching mathematics and Greek language, which were conducted to test the validity of the dynamic model of educational effectiveness. Separate multilevel analyses for each subject were conducted and it was found…

  4. 77 FR 1687 - EPA Workshops on Achieving Water Quality Through Integrated Municipal Stormwater and Wastewater...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-11

    ... solutions to multiple causes of water pollution. The Agency anticipates that the framework document will... AGENCY EPA Workshops on Achieving Water Quality Through Integrated Municipal Stormwater and Wastewater Plans Under the Clean Water Act (CWA) AGENCY: Environmental Protection Agency (EPA). ACTION:...

  5. Social Capital, Human Capital and Parent-Child Relation Quality: Interacting for Children's Educational Achievement?

    ERIC Educational Resources Information Center

    von Otter, Cecilia; Stenberg, Sten-Åke

    2015-01-01

    We analyse the utility of social capital for children's achievement, and if this utility interacts with family human capital and the quality of the parent-child relationship. Our focus is on parental activities directly related to children's school work. Our data stem from a Swedish cohort born in 1953 and consist of both survey and register data.…

  6. Depressive Symptoms in Third-Grade Teachers: Relations to Classroom Quality and Student Achievement

    ERIC Educational Resources Information Center

    McLean, Leigh; Connor, Carol McDonald

    2015-01-01

    This study investigated associations among third-grade teachers' (N = 27) symptoms of depression, quality of the classroom-learning environment (CLE), and students' (N = 523, M[subscript age] = 8.6 years) math and literacy performance. Teachers' depressive symptoms in the winter negatively predicted students' spring mathematics achievement. This…

  7. Improving quality and reducing inequities: a challenge in achieving best care

    PubMed Central

    Nicewander, David A.; Qin, Huanying; Ballard, David J.

    2006-01-01

    The health care quality chasm is better described as a gulf for certain segments of the population, such as racial and ethnic minority groups, given the gap between actual care received and ideal or best care quality. The landmark Institute of Medicine report Crossing the Quality Chasm: A New Health System for the 21st Century challenges all health care organizations to pursue six major aims of health care improvement: safety, timeliness, effectiveness, efficiency, equity, and patient-centeredness. “Equity” aims to ensure that quality care is available to all and that the quality of care provided does not differ by race, ethnicity, or other personal characteristics unrelated to a patient's reason for seeking care. Baylor Health Care System is in the unique position of being able to examine the current state of equity in a typical health care delivery system and to lead the way in health equity research. Its organizational vision, “culture of quality,” and involved leadership bode well for achieving equitable best care. However, inequities in access, use, and outcomes of health care must be scrutinized; the moral, ethical, and economic issues they raise and the critical injustice they create must be remedied if this goal is to be achieved. Eliminating any observed inequities in health care must be synergistically integrated with quality improvement. Quality performance indicators currently collected and evaluated indicate that Baylor Health Care System often performs better than the national average. However, there are significant variations in care by age, gender, race/ethnicity, and socioeconomic status that indicate the many remaining challenges in achieving “best care” for all. PMID:16609733

  8. Contrast sensitivity function calibration based on image quality prediction

    NASA Astrophysics Data System (ADS)

    Han, Yu; Cai, Yunze

    2014-11-01

    Contrast sensitivity functions (CSFs) describe visual stimuli based on their spatial frequency. However, CSF calibration is limited by the size of the sample collection and this remains an open issue. In this study, we propose an approach for calibrating CSFs that is based on the hypothesis that a precise CSF model can accurately predict image quality. Thus, CSF calibration is regarded as the inverse problem of image quality prediction according to our hypothesis. A CSF could be calibrated by optimizing the performance of a CSF-based image quality metric using a database containing images with known quality. Compared with the traditional method, this would reduce the work involved in sample collection dramatically. In the present study, we employed three image databases to optimize some existing CSF models. The experimental results showed that the performance of a three-parameter CSF model was better than that of other models. The results of this study may be helpful in CSF and image quality research.
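
    The calibration-as-inverse-problem idea can be sketched as a search over CSF parameters that maximizes the correlation between a CSF-weighted metric's predictions and known subjective scores. The three-parameter form and the grid search below are placeholders, not the specific model optimized in the paper:

    ```python
    import numpy as np

    def csf(f, a, b, c):
        """Generic three-parameter band-pass CSF over spatial frequency f."""
        return a * np.power(f, b) * np.exp(-c * f)

    def calibrate_csf(freqs, predict_quality, subjective_scores, param_grid):
        """Return the CSF parameters whose weighting best correlates the
        metric's predictions with the subjective quality database."""
        best_params, best_corr = None, -np.inf
        for params in param_grid:
            predicted = predict_quality(csf(freqs, *params))
            corr = np.corrcoef(predicted, subjective_scores)[0, 1]
            if corr > best_corr:
                best_params, best_corr = params, corr
        return best_params, best_corr
    ```

    `predict_quality` stands for any CSF-based image quality metric evaluated on the whole database; in the real problem it would be continuous optimization rather than a grid.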

  9. Mutual information as a measure of image quality for 3D dynamic lung imaging with EIT

    PubMed Central

    Crabb, M G; Davidson, J L; Little, R; Wright, P; Morgan, A R; Miller, C A; Naish, J H; Parker, G J M; Kikinis, R; McCann, H; Lionheart, W R B

    2014-01-01

    We report on a pilot study of dynamic lung electrical impedance tomography (EIT) at the University of Manchester. Low-noise EIT data at 100 frames per second (fps) were obtained from healthy male subjects during controlled breathing, followed by magnetic resonance imaging (MRI) subsequently used for spatial validation of the EIT reconstruction. The torso surface in the MR image and electrode positions obtained using MRI fiducial markers informed the construction of a 3D finite element model extruded along the caudal-distal axis of the subject. Small changes in the boundary that occur during respiration were accounted for by incorporating the sensitivity with respect to boundary shape into a robust temporal difference reconstruction algorithm. EIT and MRI images were co-registered using the open source medical imaging software, 3D Slicer. A quantitative comparison of quality of different EIT reconstructions was achieved through calculation of the mutual information with a lung-segmented MR image. EIT reconstructions using a linear shape correction algorithm reduced boundary image artefacts, yielding better contrast of the lungs, and had 10% greater mutual information compared with a standard linear EIT reconstruction. PMID:24710978
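
    The mutual information between a lung-segmented MR image and a co-registered EIT reconstruction can be computed from their joint gray-level histogram. A minimal version (the bin count is arbitrary here, and the study's exact estimator may differ):

    ```python
    import numpy as np

    def mutual_information(img_a, img_b, bins=32):
        """Mutual information (in nats) between two co-registered images,
        estimated from the joint gray-level histogram."""
        joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
        p = joint / joint.sum()
        px = p.sum(axis=1, keepdims=True)   # marginal of img_a
        py = p.sum(axis=0, keepdims=True)   # marginal of img_b
        nz = p > 0
        return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())
    ```

    A reconstruction that shares more structure with the segmented MR image yields a higher score, which is how the shape-corrected and standard reconstructions were compared.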

  10. Research iris serial images quality assessment method based on HVS

    NASA Astrophysics Data System (ADS)

    Li, Zhi-hui; Zhang, Chang-hai; Ming, Xing; Zhao, Yong-hua

    2006-01-01

    Iris recognition can be widely used in security and customs, and it provides better security than other human-feature recognition methods such as fingerprint or face recognition. Iris image quality is crucial to recognition performance, so reliable image quality assessment is necessary for evaluating iris images. However, there is no uniform criterion for image quality assessment. Quality assessment methods are either objective or subjective; in practice, subjective evaluation is laborious and not effective for iris recognition, so an objective method should be used. Exploiting the multi-scale and selectivity characteristics of the human visual system (HVS) model, this paper presents a new iris image quality assessment method: the region of interest (ROI) is located, wavelet-transform zero-crossings are used to find multi-scale edges, and a multi-scale fusion measure is used to assess iris image quality. In experiments, both objective and subjective evaluation methods were used to assess iris images. The results show that the method is effective for iris image quality assessment.

  11. Objective assessment of image quality and dose reduction in CT iterative reconstruction

    SciTech Connect

    Vaishnav, J. Y. Jung, W. C.; Popescu, L. M.; Zeng, R.; Myers, K. J.

    2014-07-15

    Purpose: Iterative reconstruction (IR) algorithms have the potential to reduce radiation dose in CT diagnostic imaging. As these algorithms become available on the market, a standardizable method of quantifying the dose reduction that a particular IR method can achieve would be valuable. Such a method would assist manufacturers in making promotional claims about dose reduction, buyers in comparing different devices, physicists in independently validating the claims, and the United States Food and Drug Administration in regulating the labeling of CT devices. However, the nonlinear nature of commercially available IR algorithms poses challenges to objectively assessing image quality, a necessary step in establishing the amount of dose reduction that a given IR algorithm can achieve without compromising that image quality. This review paper seeks to consolidate information relevant to objectively assessing the quality of CT IR images, and thereby measuring the level of dose reduction that a given IR algorithm can achieve. Methods: The authors discuss task-based methods for assessing the quality of CT IR images and evaluating dose reduction. Results: The authors explain and review recent literature on signal detection and localization tasks in CT IR image quality assessment, the design of an appropriate phantom for these tasks, possible choices of observers (including human and model observers), and methods of evaluating observer performance. Conclusions: Standardizing the measurement of dose reduction is a problem of broad interest to the CT community and to public health. A necessary step in the process is the objective assessment of CT image quality, for which various task-based methods may be suitable. This paper attempts to consolidate recent literature that is relevant to the development and implementation of task-based methods for the assessment of CT IR image quality.
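
    Among the model observers used in task-based assessment, one of the simplest is a non-prewhitening matched filter scored on a two-alternative forced-choice (2AFC) detection task. A minimal sketch (the template and scoring are generic, not tied to any particular phantom or the review's specific recommendations):

    ```python
    import numpy as np

    def npw_scores(images, signal_template):
        """Non-prewhitening matched filter: score each image by its
        inner product with the expected signal."""
        t = signal_template.ravel()
        return np.array([img.ravel() @ t for img in images])

    def percent_correct_2afc(signal_scores, noise_scores):
        """Fraction of signal/noise pairings in which the signal-present
        image scores higher (2AFC percent correct)."""
        wins = sum(s > n for s in signal_scores for n in noise_scores)
        return wins / (len(signal_scores) * len(noise_scores))
    ```

    Dose reduction claims can then be framed as: the IR images at reduced dose yield at least the percent correct of the reference images at full dose.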

  12. The Relationship between University Students' Academic Achievement and Perceived Organizational Image

    ERIC Educational Resources Information Center

    Polat, Soner

    2011-01-01

    The purpose of present study was to determine the relationship between university students' academic achievement and perceived organizational image. The sample of the study was the senior students at the faculties and vocational schools in Umuttepe Campus at Kocaeli University. Because the development of organizational image is a long process, the…

  13. Wavelet image processing applied to optical and digital holography: past achievements and future challenges

    NASA Astrophysics Data System (ADS)

    Jones, Katharine J.

    2005-08-01

    The link between wavelets and optics goes back to the work of Dennis Gabor, who both invented holography and developed Gabor decompositions. Holography involves 3-D images; Gabor decompositions involve 1-D signals. Gabor decompositions are the predecessors of wavelets. Wavelet image processing of holography, both optical and digital, will be examined with respect to past achievements and future challenges.

  14. Automated FMV image quality assessment based on power spectrum statistics

    NASA Astrophysics Data System (ADS)

    Kalukin, Andrew

    2015-05-01

    Factors that degrade image quality in video and other sensor collections, such as noise, blurring, and poor resolution, also affect the spatial power spectrum of imagery. Prior research in human vision and image science from the last few decades has shown that the image power spectrum can be useful for assessing the quality of static images. The research in this article explores the possibility of using the image power spectrum to automatically evaluate full-motion video (FMV) imagery frame by frame. This procedure makes it possible to identify anomalous images and scene changes, and to keep track of gradual changes in quality as collection progresses. This article will describe a method to apply power spectral image quality metrics for images subjected to simulated blurring, blocking, and noise. As a preliminary test on videos from multiple sources, image quality measurements for image frames from 185 videos are compared to analyst ratings based on ground sampling distance. The goal of the research is to develop an automated system for tracking image quality during real-time collection, and to assign ratings to video clips for long-term storage, calibrated to standards such as the National Imagery Interpretability Rating System (NIIRS).
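
    A frame-by-frame feature along these lines is the radially averaged power spectrum: blur depresses the high-frequency bins while noise lifts them, so the bin profile can flag anomalous frames. The bin count and normalization below are arbitrary choices, not the article's calibrated metric:

    ```python
    import numpy as np

    def radial_power_spectrum(frame, n_bins=16):
        """Radially averaged spatial power spectrum of a grayscale frame."""
        power = np.abs(np.fft.fftshift(np.fft.fft2(frame))) ** 2
        h, w = frame.shape
        y, x = np.indices((h, w))
        r = np.hypot(y - h / 2.0, x - w / 2.0)
        bins = np.minimum((r / r.max() * n_bins).astype(int), n_bins - 1)
        return np.array([power[bins == b].mean() if np.any(bins == b) else 0.0
                         for b in range(n_bins)])
    ```

    Tracking this profile over successive frames gives the gradual-quality-change signal described above; sudden jumps mark scene changes or dropped focus.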

  15. The study of surgical image quality evaluation system by subjective quality factor method

    NASA Astrophysics Data System (ADS)

    Zhang, Jian J.; Xuan, Jason R.; Yang, Xirong; Yu, Honggang; Koullick, Edouard

    2016-03-01

    The GreenLightTM procedure is an effective and economical treatment for benign prostate hyperplasia (BPH); almost a million patients have been treated with GreenLightTM worldwide. During the surgical procedure, the surgeon or physician relies on the video monitoring system to survey and confirm surgical progress. A few obstructions can greatly affect the image quality of the monitoring video: laser glare from tissue and body fluid, air bubbles and debris generated by tissue evaporation, and bleeding, to name a few. In order to improve the physician's visual experience during a laser surgical procedure, the system performance parameters related to image quality need to be well defined. However, image quality is the integrated set of perceptions of the overall degree of excellence of an image, that is, the perceptually weighted combination of significant attributes (contrast, graininess, and so on) of an image considered in its marketplace or application; consequently, there is no standard definition of overall image or video quality, especially for the no-reference case (without a standard chart as reference). In this study, Subjective Quality Factor (SQF) and acutance are used for no-reference image quality evaluation. Basic image quality parameters, such as sharpness, color accuracy, size of obstruction, and transmission of obstruction, are used as subparameters to define the rating scale for image quality evaluation or comparison. Sample image groups were evaluated by human observers according to the rating scale. Surveys of physician groups were also conducted with lab-generated sample videos. The study shows that human subjective perception is a trustworthy way of evaluating image quality. A more systematic investigation of the relationship between video quality and the image quality of each frame will be conducted in a future study.

  16. The effect of teacher quality on the achievement of students in Integrated Physics and Chemistry

    NASA Astrophysics Data System (ADS)

    Alexander, Rima

    For many years, researchers, policy makers, and the education community have explored various school variables and their impact on student achievement (Darling-Hammond, 2000; Ferguson and Womack 1993; Ferguson and Ladd 1996; Rice, 2003; Rockoff, 2003; Rowan, Chiang, and Miller 1997; Sanders and Horn, 1996; Wright Horn and Sanders, 1997). Invariably, the issue of teacher quality arises. Teacher quality is the single most influential factor under school control that affects student achievement (Darling-Hammond, 2000; Rice, 2003; Rockoff, 2003; Sanders and Horn, 1996; Wright Horn and Sanders, 1997). Generally, students taught by highly qualified teachers perform better on standardized tests than students with less qualified teachers (Ferguson and Womack 1993; Ferguson and Ladd 1996; Rowan, Chiang, and Miller 1997). Previous research indicates that teachers indeed matter for the improvement of student achievement, but getting good measures of what is meant by teacher quality is a continuing challenge (Goldhaber, 2002). The purpose of this study was to describe the effect of teacher quality on the achievement of students in Integrated Physics and Chemistry (IPC). In order to achieve this purpose, this study addressed the following research question: how does the achievement of students taught by highly qualified IPC teachers compare to the achievement of students taught by less-qualified IPC teachers? A causal-comparative methodology was employed to address this research question. The independent variable was teacher quality (highly qualified or less qualified). The teacher attributes examined in this study were: (1) teachers' educational background; (2) content knowledge; (3) pedagogical knowledge; and (4) certification. The dependent variable was student achievement in Integrated Physics and Chemistry, as measured by an end-of-course IPC District Assessment of Curriculum (IPC DAC). Descriptive statistics were computed for the independent variable in the study. A Chi Square was performed on the data

  17. Advancing the Quality of Solar Occultation Retrievals through Solar Imaging

    NASA Astrophysics Data System (ADS)

    Gordley, L. L.; Hervig, M. E.; Marshall, B. T.; Russell, J. E.; Bailey, S. M.; Brown, C. W.; Burton, J. C.; Deaver, L. E.; Magill, B. E.; McHugh, M. J.; Paxton, G. J.; Thompson, R. E.

    2008-12-01

    The quality of retrieved profiles (e.g., mixing ratio, temperature, pressure, and extinction) from solar occultation sensors depends strongly on the angular fidelity of the measurements. The SOFIE instrument, launched on board the AIM (Aeronomy of Ice in the Mesosphere) satellite on April 25, 2007, was designed to provide very high precision broadband measurements for the study of Polar Mesospheric Clouds (PMCs), which appear near 83 km, just below the high-latitude summer mesopause. The SOFIE instrument achieves unprecedented angular fidelity by imaging the sun on a 2D detector array and tracking the edges with an uncertainty of <0.1 arc seconds. This makes possible retrieved mixing ratio profiles with high vertical resolution, refraction-based temperature and pressure from the tropopause to the lower mesosphere, and transmission with accuracy sufficient to infer cosmic smoke extinction. Details of the approach and recent results will be presented.

  18. Full-reference quality assessment of stereoscopic images by learning sparse monocular and binocular features

    NASA Astrophysics Data System (ADS)

    Li, Kemeng; Shao, Feng; Jiang, Gangyi; Yu, Mei

    2014-11-01

    Perceptual stereoscopic image quality assessment (SIQA) aims to use computational models to measure image quality in a manner consistent with human visual perception. In this research, we simulate monocular and binocular visual perception and propose a monocular-binocular feature fidelity (MBFF) induced index for SIQA. To be more specific, in the training stage, we learn monocular and binocular dictionaries from the training database, so that the latent response properties can be represented as a set of basis vectors. In the quality estimation stage, we compute monocular feature fidelity (MFF) and binocular feature fidelity (BFF) indexes based on the estimated sparse coefficient vectors, and compute a global energy response similarity (GERS) index by considering energy changes. The final quality score is obtained by incorporating them together. Experimental results on four public 3D image quality assessment databases demonstrate that, in comparison with the most related existing methods, the devised algorithm achieves high consistency with subjective assessment.
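
    The fidelity comparison between sparse coefficient vectors can be illustrated with a minimal sketch. This is not the authors' published formulation: the SSIM-style fidelity form and the stabilizing constant `C` below are assumptions made for illustration.

```python
import numpy as np

def feature_fidelity(x, y, C=1e-6):
    """SSIM-style fidelity between two sparse coefficient vectors.

    Returns 1.0 when the vectors are identical and decays toward 0
    as their energy overlap decreases. C avoids division by zero.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return (2.0 * np.dot(x, y) + C) / (np.dot(x, x) + np.dot(y, y) + C)

# identical coefficient vectors -> perfect fidelity
print(round(feature_fidelity([1.0, 0.0, 2.0], [1.0, 0.0, 2.0]), 6))  # 1.0
```

    A full index in the spirit of the paper would pool such per-patch fidelities over the monocular and binocular dictionaries and combine them with an energy-similarity term.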

  19. Color image quality assessment with biologically inspired feature and machine learning

    NASA Astrophysics Data System (ADS)

    Deng, Cheng; Tao, Dacheng

    2010-07-01

    In this paper, we present a new no-reference quality assessment metric for color images using biologically inspired features (BIFs) and machine learning. In this metric, we first adopt a biologically inspired model to mimic the visual cortex and represent a color image with BIFs, which unify color units, intensity units, and C1 units. Then, in order to reduce complexity and benefit classification, the high-dimensional features are projected onto a low-dimensional representation with manifold learning. Finally, a multiclass classification process is performed on this new low-dimensional representation of the image, and the quality assessment is based on the learned classification result so as to match that of human observers. Instead of computing a final score, our method classifies the quality according to the quality scale recommended by the ITU. Preliminary results show that the developed metric can achieve good quality evaluation performance.

  20. Preoperative treatment planning with intraoperative optimization can achieve consistent high-quality implants in prostate brachytherapy

    SciTech Connect

    Kudchadker, Rajat J.; Pugh, Thomas J.; Swanson, David A.; Bruno, Teresa L.; Bolukbasi, Yasemin; Frank, Steven J.

    2012-01-01

    Advances in brachytherapy treatment planning systems have allowed the opportunity for brachytherapy to be planned intraoperatively as well as preoperatively. The relative advantages and disadvantages of each approach have been the subject of extensive debate, and some contend that the intraoperative approach is vital to the delivery of optimal therapy. The purpose of this study was to determine whether high-quality permanent prostate implants can be achieved consistently using a preoperative planning approach that allows for, but does not necessitate, intraoperative optimization. To achieve this purpose, we reviewed the records of 100 men with intermediate-risk prostate cancer who had been prospectively treated with brachytherapy monotherapy between 2006 and 2009 at our institution. All patients were treated with iodine-125 stranded seeds; the planned target dose was 145 Gy. Only 8 patients required adjustments to the plan on the basis of intraoperative findings. Consistency and quality were assessed by calculating the correlation coefficient between the planned and implanted amounts of radioactivity and by examining the mean values of the dosimetric parameters obtained on preoperative and day 30 postoperative treatment planning. The amount of radioactivity implanted was essentially identical to that planned (mean planned radioactivity, 41.27 U vs. mean delivered radioactivity, 41.36 U; R² = 0.99). The mean planned and day 30 prostate V100 values were 99.9% and 98.6%, respectively. The mean planned and day 30 prostate D90 values were 186.3 and 185.1 Gy, respectively. Consistent, high-quality prostate brachytherapy treatment plans can be achieved using a preoperative planning approach, mostly without the need for intraoperative optimization. Good quality assurance measures during simulation, treatment planning, implantation, and postimplant evaluation are paramount for achieving a high level of quality and consistency.

  1. Is image quality a function of contrast perception?

    NASA Astrophysics Data System (ADS)

    Haun, Andrew M.; Peli, Eli

    2013-03-01

    In this retrospective we trace in broad strokes the development of image quality measures based on the study of the early stages of the human visual system (HVS), where contrast encoding is fundamental. We find that while presenters at the Human Vision and Electronic Imaging meetings have frequently striven to find points of contact between the study of human contrast psychophysics and the development of computer vision and image quality algorithms, progress has not always been made on these terms, although an indirect impact of vision science on more recent image quality metrics can be observed.

  2. New image quality assessment method using wavelet leader pyramids

    NASA Astrophysics Data System (ADS)

    Chen, Xiaolin; Yang, Xiaokang; Zheng, Shibao; Lin, Weiyao; Zhang, Rui; Zhai, Guangtao

    2011-06-01

    In this paper, we propose a wavelet leader pyramid-based visual information fidelity method for image quality assessment. Motivated by the observations that the human visual system (HVS) is more sensitive to edge and contour regions and that human visual sensitivity varies with spatial frequency, we first introduce two-dimensional wavelet leader pyramids to robustly extract the multiscale information of edges. Based on the wavelet leader pyramids, we further propose a visual information fidelity metric to evaluate the quality of images by quantifying the information loss between the original and the distorted images. Experimental results show that our method outperforms many state-of-the-art image quality metrics.
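
    The wavelet leader construction (the maximum coefficient magnitude over the dyadic cone of finer scales beneath each coefficient) can be sketched in one dimension with a Haar transform. This is a simplified toy: the paper works with two-dimensional pyramids, and full leader definitions also include neighboring cones.

```python
import numpy as np

def haar_details(signal):
    """Detail coefficients of an orthonormal Haar transform.

    Assumes the signal length is a power of two.
    Returns a list: details[0] is the finest scale.
    """
    s = np.asarray(signal, dtype=float)
    details = []
    while len(s) > 1:
        details.append((s[0::2] - s[1::2]) / np.sqrt(2.0))
        s = (s[0::2] + s[1::2]) / np.sqrt(2.0)
    return details

def wavelet_leaders(signal):
    """Leader = max |detail| over the dyadic cone below each coefficient."""
    details = haar_details(signal)
    leaders = []
    for j, d in enumerate(details):
        lead = np.abs(d).copy()
        # fold in the finer scales covering the same time interval
        for jf in range(j):
            fine = np.abs(details[jf]).reshape(len(d), -1)
            lead = np.maximum(lead, fine.max(axis=1))
        leaders.append(lead)
    return leaders
```

    A fidelity metric along the paper's lines would then compare the leader pyramids of the reference and distorted images scale by scale.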

  3. Dynamic flat panel detector versus image intensifier in cardiac imaging: dose and image quality

    NASA Astrophysics Data System (ADS)

    Vano, E.; Geiger, B.; Schreiner, A.; Back, C.; Beissel, J.

    2005-12-01

    The practical aspects of the dosimetric and imaging performance of a digital x-ray system for cardiology procedures were evaluated. The system was configured with an image intensifier (II) and later upgraded to a dynamic flat panel detector (FD). Entrance surface air kerma (ESAK) to phantoms of 16, 20, 24 and 28 cm of polymethyl methacrylate (PMMA) and the image quality of a test object were measured. Images were evaluated directly on the monitor and with numerical methods (noise and signal-to-noise ratio). Information contained in the DICOM header for dosimetry audit purposes was also tested. ESAK values per frame (or kerma rate) for the most commonly used cine and fluoroscopy modes for different PMMA thicknesses and for field sizes of 17 and 23 cm for II, and 20 and 25 cm for FD, produced similar results in the evaluated system with both technologies, ranging between 19 and 589 µGy/frame (cine) and 5 and 95 mGy min-1 (fluoroscopy). Image quality for these dose settings was better for the FD version. The 'study dosimetric report' is comprehensive, and its numerical content is sufficiently accurate. There is potential in the future to set those systems with dynamic FD to lower doses than are possible in the current II versions, especially for digital cine runs, or to benefit from improved image quality.

  4. Improving the Quality of Imaging in the Emergency Department.

    PubMed

    Blackmore, C Craig; Castro, Alexandra

    2015-12-01

    Imaging is critical for the care of emergency department (ED) patients. However, much of the imaging performed for acute care today represents overutilization, creating substantial cost without significant benefit. Further, the value of imaging is not easily defined, as imaging affects outcomes only indirectly, through interaction with treatment. Improving the quality, including the appropriateness, of emergency imaging requires an understanding of how imaging contributes to patient care. The six-tier efficacy hierarchy of Fryback and Thornbury enables understanding of the value of imaging on multiple levels, ranging from technical efficacy to medical decision-making and higher-level patient and societal outcomes. The imaging efficacy hierarchy also allows definition of imaging quality through the Institute of Medicine (IOM)'s quality domains of safety, effectiveness, patient-centeredness, timeliness, efficiency, and equitability, and provides a foundation for quality improvement. In this article, the authors elucidate the Fryback and Thornbury framework to define the value of imaging in the ED and to relate emergency imaging to the IOM quality domains. PMID:26568040

  5. Quaternion structural similarity: a new quality index for color images.

    PubMed

    Kolaman, Amir; Yadid-Pecht, Orly

    2012-04-01

    One of the most important issues for researchers developing image processing algorithms is image quality. Methodical quality evaluation, by showing images to several human observers, is slow, expensive, and highly subjective. On the other hand, a visual quality metric (VQM) is a fast, cheap, and objective tool for evaluating image quality. Although most VQMs are good at predicting the quality of an image degraded by a single degradation, they perform poorly for a combination of two degradations. An example of such a degradation is the color crosstalk (CTK) effect, which introduces blur together with desaturation. CTK is expected to become a bigger issue in image quality as the industry moves toward smaller sensors. In this paper, we develop a VQM that better evaluates the quality of an image degraded by a combined blur/desaturation degradation and performs as well as other VQMs on single degradations such as blur, compression, and noise. We show why standard scalar techniques are insufficient to measure a combined blur/desaturation degradation and explain why a vectorial approach is better suited. We introduce quaternion image processing (QIP), which is a true vectorial approach with many uses in the fields of physics and engineering. Our new VQM is a vectorial expansion of structural similarity using QIP, which gave it its name: Quaternion Structural SIMilarity (QSSIM). We built a new database of a combined blur/desaturation degradation and conducted a quality survey with human subjects. An extensive comparison between QSSIM and other VQMs on several image quality databases, including our new database, shows the superiority of this new approach in predicting the visual quality of color images. PMID:22203713
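
    The vectorial idea can be sketched with a global (non-windowed) SSIM-style score in which each RGB pixel is treated as the vector part of a pure quaternion. This is only an illustration of the approach: QSSIM proper is computed over local windows, and the constants and global pooling below are assumptions.

```python
import numpy as np

def qssim_global(img1, img2, C1=0.01, C2=0.03):
    """Global SSIM-style score on color images encoded as pure quaternions.

    img1, img2: float arrays of shape (H, W, 3) with values in [0, 1].
    The variance/covariance terms of SSIM are replaced by their
    quaternion (vector) counterparts, so blur and desaturation both
    reduce the structure term.
    """
    q1 = img1.reshape(-1, 3)
    q2 = img2.reshape(-1, 3)
    mu1, mu2 = q1.mean(axis=0), q2.mean(axis=0)
    d1, d2 = q1 - mu1, q2 - mu2
    var1 = (d1 ** 2).sum(axis=1).mean()
    var2 = (d2 ** 2).sum(axis=1).mean()
    cov = (d1 * d2).sum(axis=1).mean()
    lum = (2 * mu1.dot(mu2) + C1) / (mu1.dot(mu1) + mu2.dot(mu2) + C1)
    struct = (2 * cov + C2) / (var1 + var2 + C2)
    return lum * struct
```

    An identical pair scores 1.0, and a desaturated copy scores strictly below 1.0, which a per-channel scalar SSIM averaged over R, G, and B would also show; the quaternion framing matters once local windows and combined blur/desaturation are involved.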

  6. Automated retinal image quality assessment on the UK Biobank dataset for epidemiological studies.

    PubMed

    Welikala, R A; Fraz, M M; Foster, P J; Whincup, P H; Rudnicka, A R; Owen, C G; Strachan, D P; Barman, S A

    2016-04-01

    Morphological changes in the retinal vascular network are associated with future risk of many systemic and vascular diseases. However, uncertainty over the presence and nature of some of these associations exists. Analysis of data from large population based studies will help to resolve these uncertainties. The QUARTZ (QUantitative Analysis of Retinal vessel Topology and siZe) retinal image analysis system allows automated processing of large numbers of retinal images. However, an image quality assessment module is needed to achieve full automation. In this paper, we propose such an algorithm, which uses the segmented vessel map to determine the suitability of retinal images for use in the creation of vessel morphometric data suitable for epidemiological studies. This includes an effective 3-dimensional feature set and support vector machine classification. A random subset of 800 retinal images from UK Biobank (a large prospective study of 500,000 middle aged adults; where 68,151 underwent retinal imaging) was used to examine the performance of the image quality algorithm. The algorithm achieved a sensitivity of 95.33% and a specificity of 91.13% for the detection of inadequate images. The strong performance of this image quality algorithm will make rapid automated analysis of vascular morphometry feasible on the entire UK Biobank dataset (and other large retinal datasets), with minimal operator involvement, and at low cost. PMID:26894596
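
    The reported sensitivity (95.33%) and specificity (91.13%) follow from the classifier's confusion matrix on the rated subset. A minimal sketch of that computation (the toy labels below are illustrative, not the QUARTZ data):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity and specificity for a binary detector.

    y_true / y_pred: iterables of 0/1, where 1 marks an inadequate
    image (the 'positive' class in this evaluation).
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# toy check: 3 of 4 inadequate images caught, 9 of 10 adequate kept
truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
pred  = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1]
print(sensitivity_specificity(truth, pred))  # (0.75, 0.9)
```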

  7. Investigation into the impact of tone reproduction on the perceived image quality of fine art reproductions

    NASA Astrophysics Data System (ADS)

    Farnand, Susan; Jiang, Jun; Frey, Franziska

    2012-01-01

    A project, supported by the Andrew W. Mellon Foundation, evaluating current practices in fine art image reproduction, determining the image quality generally achievable, and establishing a suggested framework for art image interchange was recently completed. (Information regarding the Mellon project and related work may be found at www.artimaging.rit.edu.) To determine the image quality currently being achieved, experimentation was conducted in which a set of objective targets and pieces of artwork in various media were imaged by participating museums and other cultural heritage institutions. Prints and images for display made from the delivered image files at the Rochester Institute of Technology were used as stimuli in psychometric testing in which observers were asked to evaluate the prints as reproductions of the original artwork and as stand-alone images. The results indicated that there were limited differences between assessments made with and without the original present for printed reproductions. For displayed images, the differences were more significant, with lower contrast images being ranked lower and higher contrast images generally ranked higher when the original was not present. This was true for experiments conducted both in a dimly lit laboratory and via the web, indicating that more than viewing conditions were driving this shift.

  8. Effect of image quality on calcification detection in digital mammography

    SciTech Connect

    Warren, Lucy M.; Mackenzie, Alistair; Cooke, Julie; Given-Wilson, Rosalind M.; Wallis, Matthew G.; Chakraborty, Dev P.; Dance, David R.; Bosmans, Hilde; Young, Kenneth C.

    2012-06-15

    Purpose: This study aims to investigate whether microcalcification detection varies significantly when mammographic images are acquired using different image qualities, including different detectors, dose levels, and image processing algorithms. An additional aim was to determine how the standard European method of measuring image quality, using threshold gold thickness measured with a CDMAM phantom, and the associated limits in current EU guidelines relate to calcification detection. Methods: One hundred and sixty-two normal breast images were acquired on an amorphous selenium direct digital (DR) system. Microcalcification clusters extracted from magnified images of slices of mastectomies were electronically inserted into half of the images. The calcification clusters had a subtle appearance. All images were adjusted using a validated mathematical method to simulate the appearance of images from a computed radiography (CR) imaging system at the same dose, from both systems at half this dose, and from the DR system at a quarter of this dose. The original 162 images were processed with both Hologic and Agfa (Musica-2) image processing. All other image qualities were processed with Agfa (Musica-2) image processing only. Seven experienced observers marked and rated any identified suspicious regions. Free response operating characteristic (FROC) and ROC analyses were performed on the data. The lesion sensitivity at a nonlesion localization fraction (NLF) of 0.1 was also calculated. Images of the CDMAM mammographic test phantom were acquired using the automatic setting on the DR system. These images were modified to the additional image qualities used in the observer study. The images were analyzed using automated software. In order to assess the relationship between threshold gold thickness and calcification detection, a power law was fitted to the data.
Results: There was a significant reduction in calcification detection using CR compared with DR: the alternative FROC

  9. Effect of image quality on calcification detection in digital mammography

    PubMed Central

    Warren, Lucy M.; Mackenzie, Alistair; Cooke, Julie; Given-Wilson, Rosalind M.; Wallis, Matthew G.; Chakraborty, Dev P.; Dance, David R.; Bosmans, Hilde; Young, Kenneth C.

    2012-01-01

    Purpose: This study aims to investigate whether microcalcification detection varies significantly when mammographic images are acquired using different image qualities, including different detectors, dose levels, and image processing algorithms. An additional aim was to determine how the standard European method of measuring image quality, using threshold gold thickness measured with a CDMAM phantom, and the associated limits in current EU guidelines relate to calcification detection. Methods: One hundred and sixty-two normal breast images were acquired on an amorphous selenium direct digital (DR) system. Microcalcification clusters extracted from magnified images of slices of mastectomies were electronically inserted into half of the images. The calcification clusters had a subtle appearance. All images were adjusted using a validated mathematical method to simulate the appearance of images from a computed radiography (CR) imaging system at the same dose, from both systems at half this dose, and from the DR system at a quarter of this dose. The original 162 images were processed with both Hologic and Agfa (Musica-2) image processing. All other image qualities were processed with Agfa (Musica-2) image processing only. Seven experienced observers marked and rated any identified suspicious regions. Free response operating characteristic (FROC) and ROC analyses were performed on the data. The lesion sensitivity at a nonlesion localization fraction (NLF) of 0.1 was also calculated. Images of the CDMAM mammographic test phantom were acquired using the automatic setting on the DR system. These images were modified to the additional image qualities used in the observer study. The images were analyzed using automated software. In order to assess the relationship between threshold gold thickness and calcification detection, a power law was fitted to the data.
Results: There was a significant reduction in calcification detection using CR compared with DR: the alternative FROC

  10. Image quality and dose efficiency of high energy phase sensitive x-ray imaging: Phantom studies

    PubMed Central

    Wong, Molly Donovan; Wu, Xizeng; Liu, Hong

    2014-01-01

    The goal of this preliminary study was to perform an image quality comparison of high energy phase sensitive imaging with low energy conventional imaging at similar radiation doses. The comparison was performed with the following phantoms: American College of Radiology (ACR), contrast-detail (CD), acrylic edge, and tissue-equivalent. Visual comparison of the phantom images indicated comparable or improved image quality for all phantoms. Quantitative comparisons were performed through ACR and CD observer studies, both of which indicated higher image quality in the high energy phase sensitive images. The results of this study demonstrate the ability of high energy phase sensitive imaging to overcome existing challenges with the clinical implementation of phase contrast imaging and improve the image quality for a similar radiation dose as compared to conventional imaging near typical mammography energies. In addition, the results illustrate the capability of phase sensitive imaging to sustain the image quality improvement at high x-ray energies and for breast-simulating phantoms, both of which indicate the potential to benefit fields such as mammography. Future studies will continue to investigate the potential for dose reduction and image quality improvement provided by high energy phase sensitive contrast imaging. PMID:24865208

  11. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  12. A new assessment method for image fusion quality

    NASA Astrophysics Data System (ADS)

    Li, Liu; Jiang, Wanying; Li, Jing; Yuchi, Ming; Ding, Mingyue; Zhang, Xuming

    2013-03-01

    Image fusion quality assessment plays a critically important role in the field of medical imaging. To evaluate image fusion quality effectively, many assessment methods have been proposed. Examples include mutual information (MI), root mean square error (RMSE), and the universal image quality index (UIQI). These image fusion assessment methods, however, do not reflect human visual inspection effectively. To address this problem, we propose in this paper a novel image fusion assessment method which combines the nonsubsampled contourlet transform (NSCT) with regional mutual information. In the proposed method, the source medical images are first decomposed into different levels by the NSCT. Then the maximum NSCT coefficients of the decomposed directional images at each level are used to compute the regional mutual information (RMI). Finally, multi-channel RMI is computed as the weighted sum of the RMI values obtained at the various levels of the NSCT. The advantage of the proposed method lies in the fact that the NSCT can represent image information using multiple directions and multiple scales, and therefore it conforms to the multi-channel characteristic of the human visual system, leading to its outstanding image assessment performance. The experimental results using CT and MRI images demonstrate that the proposed assessment method outperforms such assessment methods as MI and the UIQI-based measure in evaluating image fusion quality, and it can provide results consistent with human visual assessment.
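
    At its core, the RMI term reduces to mutual information estimated from a joint histogram of two image regions. A minimal sketch of that building block (the NSCT decomposition and the level weighting from the paper are not reproduced; the bin count is an assumption):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information (in bits) between two equally sized image regions."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)          # marginal of a
    py = pxy.sum(axis=0, keepdims=True)          # marginal of b
    nz = pxy > 0                                 # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
```

    A region compared with itself yields its own entropy (the maximum), while two independent regions yield a value near zero, which is what makes MI usable as a fusion-quality signal.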

  13. Raman chemical imaging system for food safety and quality inspection

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Raman chemical imaging technique combines Raman spectroscopy and digital imaging to visualize composition and structure of a target, and it offers great potential for food safety and quality research. In this study, a laboratory-based Raman chemical imaging platform was designed and developed. The i...

  14. Study on the improvement of overall optical image quality via digital image processing

    NASA Astrophysics Data System (ADS)

    Tsai, Cheng-Mu; Fang, Yi Chin; Lin, Yu Chin

    2008-12-01

    This paper studies the effects of improving overall optical image quality via digital image processing (DIP) and compares the enhanced optical image with the non-processed optical image. From the standpoint of the optical system, image quality is strongly influenced by chromatic and monochromatic aberrations. However, overall image capture systems, such as cellphones and digital cameras, include not only the basic optical system but also many other components, such as the electronic circuit system and transducer system, whose quality can directly affect the image quality of the whole picture. Therefore, in this work digital image processing technology is utilized to improve the overall image. Experiments show that the system modulation transfer function (MTF) based on the proposed DIP technology, applied to a comparatively poor optical system, can be comparable to, and possibly even superior to, the system MTF derived from a good optical system.
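
    The MTF being compared can be estimated from an edge profile: the line-spread function (LSF) is the derivative of the edge-spread function (ESF), and the MTF is the normalized magnitude of its Fourier transform. A minimal sketch (simple finite difference; no slanted-edge oversampling or windowing, which a real measurement would use):

```python
import numpy as np

def mtf_from_esf(esf):
    """MTF from a sampled edge-spread function (ESF).

    LSF = discrete derivative of the ESF;
    MTF = DFT magnitude of the LSF, normalized to 1 at zero frequency.
    """
    lsf = np.diff(np.asarray(esf, dtype=float))
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# an ideal step edge has a delta-function LSF: every element is 1.0
print(mtf_from_esf([0.0] * 8 + [1.0] * 8))
```

    A blurred edge (a gradual ramp instead of a step) produces an MTF that falls off with frequency, which is the degradation DIP-based sharpening attempts to compensate.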

  15. Toward a Blind Deep Quality Evaluator for Stereoscopic Images Based on Monocular and Binocular Interactions.

    PubMed

    Shao, Feng; Tian, Weijun; Lin, Weisi; Jiang, Gangyi; Dai, Qionghai

    2016-05-01

    During recent years, blind image quality assessment (BIQA) has been intensively studied with different machine learning tools. Existing BIQA metrics, however, are not designed for stereoscopic images. We believe this problem can be resolved by separating 3D images and capturing the essential attributes of images via a deep neural network. In this paper, we propose a blind deep quality evaluator (DQE) for stereoscopic images (denoted 3D-DQE) based on monocular and binocular interactions. The key technical steps in the proposed 3D-DQE are to train two separate 2D deep neural networks (2D-DNNs) on 2D monocular images and cyclopean images to model the process of monocular and binocular quality prediction, and to combine the measured 2D monocular and cyclopean quality scores using different weighting schemes. Experimental results on four public 3D image quality assessment databases demonstrate that in comparison with existing methods, the devised algorithm achieves highly consistent alignment with subjective assessment. PMID:26960225

  16. Automated quality assessment in three-dimensional breast ultrasound images.

    PubMed

    Schwaab, Julia; Diez, Yago; Oliver, Arnau; Martí, Robert; van Zelst, Jan; Gubern-Mérida, Albert; Mourri, Ahmed Bensouda; Gregori, Johannes; Günther, Matthias

    2016-04-01

    Automated three-dimensional breast ultrasound (ABUS) is a valuable adjunct to x-ray mammography for breast cancer screening of women with dense breasts. High image quality is essential for proper diagnostics and computer-aided detection. We propose an automated image quality assessment system for ABUS images that detects artifacts at the time of acquisition. Therefore, we study three aspects that can corrupt ABUS images: the nipple position relative to the rest of the breast, the shadow caused by the nipple, and the shape of the breast contour on the image. Image processing and machine learning algorithms are combined to detect these artifacts based on 368 clinical ABUS images that have been rated manually by two experienced clinicians. At a specificity of 0.99, 55% of the images that were rated as low quality are detected by the proposed algorithms. The areas under the ROC curves of the single classifiers are 0.99 for the nipple position, 0.84 for the nipple shadow, and 0.89 for the breast contour shape. The proposed algorithms work fast and reliably, which makes them adequate for online evaluation of image quality during acquisition. The presented concept may be extended to further image modalities and quality aspects. PMID:27158633
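
    The areas under the ROC curves reported above can be computed without plotting the curve at all, via the rank-sum (Mann-Whitney) statistic. A self-contained sketch with tie handling:

```python
def roc_auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic.

    labels: 0/1 ground truth; scores: higher means more likely positive.
    """
    pairs = sorted(zip(scores, labels))
    ranks = {}
    i = 0
    while i < len(pairs):
        # assign tied scores their average 1-based rank
        j = i
        while j < len(pairs) and pairs[j][0] == pairs[i][0]:
            j += 1
        avg = (i + j + 1) / 2.0
        for k in range(i, j):
            ranks[k] = avg
        i = j
    pos = [ranks[k] for k, (_, lab) in enumerate(pairs) if lab == 1]
    n_pos, n_neg = len(pos), len(pairs) - len(pos)
    return (sum(pos) - n_pos * (n_pos + 1) / 2.0) / (n_pos * n_neg)

print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

    An AUC of 0.99, as reported for the nipple-position classifier, means a randomly chosen low-quality image outranks a randomly chosen good one 99% of the time.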

  17. Teacher-child relationship quality and academic achievement of Chinese American children in immigrant families.

    PubMed

    Ly, Jennifer; Zhou, Qing; Chu, Keira; Chen, Stephen H

    2012-08-01

    This study examined the cross-sectional relations between teacher-child relationship quality (TCRQ) and math and reading achievement in a socio-economically diverse sample of Chinese American first- and second-grade children in immigrant families (N=207). Teachers completed a questionnaire measuring TCRQ dimensions including closeness, conflict, and intimacy, and children completed a questionnaire measuring overall TCRQ. Standardized tests were used to assess children's math and reading skills. Analyses were conducted to (a) test the factor structure of measures assessing TCRQ among Chinese American children, (b) examine the associations between teacher- and child-rated TCRQ and children's academic achievement, controlling for demographic characteristics, and (c) examine the potential role of child gender as a moderator in the relations between TCRQ and achievement. Results indicated that teacher-rated TCRQ Warmth was positively associated with Chinese American children's reading achievement. Two child gender-by-TCRQ interactions were found: (a) teacher-rated TCRQ Conflict was negatively associated with girls' (but not boys') math achievement, and (b) child-rated Overall TCRQ was positively associated with boys' (but not girls') reading achievement. These findings highlight the valuable role of TCRQ in the academic success of school-aged children in immigrant families. PMID:22710020

  18. Retinal image quality assessment through a visual similarity index

    NASA Astrophysics Data System (ADS)

    Pérez, Jorge; Espinosa, Julián; Vázquez, Carmen; Mas, David

    2013-04-01

    Retinal image quality is commonly analyzed through parameters inherited from instrumental optics. These parameters are defined for 'good optics' so they are hard to translate into visual quality metrics. Instead of using point or artificial functions, we propose a quality index that takes into account properties of natural images. These images usually show strong local correlations that help to interpret the image. Our aim is to derive an objective index that quantifies the quality of vision by taking into account the local structure of the scene, instead of focusing on a particular aberration. As we show, this index highly correlates with visual acuity and allows inter-comparison of natural images around the retina. The usefulness of the index is proven through the analysis of real eyes before and after undergoing corneal surgery, which usually are hard to analyze with standard metrics.

  19. Testing scanners for the quality of output images

    NASA Astrophysics Data System (ADS)

    Concepcion, Vicente P.; Nadel, Lawrence D.; D'Amato, Donald P.

    1995-01-01

    Document scanning is the means through which documents are converted to their digital image representation for electronic storage or distribution. Among the types of documents being scanned by government agencies are tax forms, patent documents, office correspondence, mail pieces, engineering drawings, microfilm, archived historical papers, and fingerprint cards. Increasingly, the resulting digital images are used as the input for further automated processing including: conversion to a full-text-searchable representation via machine printed or handwritten (optical) character recognition (OCR), postal zone identification, raster-to-vector conversion, and fingerprint matching. These diverse document images may be bi-tonal, gray scale, or color. Spatial sampling frequencies range from about 200 pixels per inch to over 1,000. The quality of the digital images can have a major effect on the accuracy and speed of any subsequent automated processing, as well as on any human-based processing which may be required. During imaging system design, there is, therefore, a need to specify the criteria by which image quality will be judged and, prior to system acceptance, to measure the quality of images produced. Unfortunately, there are few, if any, agreed-upon techniques for measuring document image quality objectively. In the output images, it is difficult to distinguish image degradation caused by the poor quality of the input paper or microfilm from that caused by the scanning system. We propose several document image quality criteria and have developed techniques for their measurement. These criteria include spatial resolution, geometric image accuracy, (distortion), gray scale resolution and linearity, and temporal and spatial uniformity. The measurement of these criteria requires scanning one or more test targets along with computer-based analyses of the test target images.

  20. No-reference visual quality assessment for image inpainting

    NASA Astrophysics Data System (ADS)

    Voronin, V. V.; Frantc, V. A.; Marchuk, V. I.; Sherstobitov, A. I.; Egiazarian, K.

    2015-03-01

    Inpainting has received a lot of attention in recent years, and quality assessment is an important task in evaluating different image reconstruction approaches. In many cases inpainting methods blur sharp transitions and image contours when recovering large areas of missing pixels, and often fail to recover curved boundary edges. Quantitative metrics for inpainting results currently do not exist, and researchers use human comparisons to evaluate their methodologies and techniques. Most objective quality assessment methods rely on a reference image, which is often not available in inpainting applications. Researchers therefore usually resort to subjective quality assessment by human observers, a difficult and time-consuming procedure. This paper focuses on a machine learning approach to no-reference visual quality assessment for image inpainting based on properties of the human visual system. Our method is based on the observation that Local Binary Patterns describe the local structural information of an image well. We use a support vector regression model, trained on human-assessed images, to predict the perceived quality of inpainted images. We demonstrate how our predicted quality value correlates with qualitative opinion in a human observer study. Results are shown on a human-scored dataset for different inpainting methods.
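    The feature-extraction step of such an LBP-based predictor can be sketched as follows. This is a minimal NumPy illustration (a 3×3, 256-bin LBP histogram), not the authors' implementation; the regression stage (SVR in the paper) is omitted, and the neighbour ordering is an arbitrary choice.

```python
import numpy as np

def lbp_codes(img):
    """Compute 3x3 Local Binary Pattern codes for interior pixels."""
    c = img[1:-1, 1:-1]
    # 8 neighbours in a fixed clockwise order (one bit each)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        n = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= (n >= c).astype(np.uint8) << bit
    return codes

def lbp_histogram(img, bins=256):
    """Normalised LBP histogram, usable as a structural feature vector."""
    codes = lbp_codes(img)
    h, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return h / h.sum()
```

    In a full pipeline, such histograms from many inpainted images would be fed, together with human opinion scores, to a regressor such as scikit-learn's SVR.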

  1. Improvement of image quality by polarization mixing

    NASA Astrophysics Data System (ADS)

    Kasahara, Ryosuke; Itoh, Izumi; Hirai, Hideaki

    2014-03-01

    Information about the polarization of light is valuable because it contains information about the light source illuminating an object, the illumination angle, and the object material. However, polarization information strongly depends on the direction of the light source, and it is difficult to use a polarization image with various recognition algorithms outdoors because the angle of the sun varies. We propose an image enhancement method for utilizing polarization information in such situations where the light source is not fixed. We take two approaches to overcome this problem. First, we compute an image that is the combination of a polarization image and the corresponding brightness image. Because of the angle of the light source, the polarization contains no information about some scenes. Therefore, it is difficult to use only polarization information in every scene for applications such as object detection. However, if we use a combination of a polarization image and a brightness image, the brightness image can compensate for the missing scene information. The second approach is finding features that depend less on the direction of the light source. We propose a method for extracting scene features based on a calculation of the reflection model including polarization effects. A polarization camera that has micro-polarizers on each pixel of the image sensor was built and used for capturing images. We discuss examples that demonstrate the improved visibility of objects achieved by applying our proposed method, e.g., to lane markers on wet roads.

  2. Meat quality evaluation by hyperspectral imaging technique: an overview.

    PubMed

    Elmasry, Gamal; Barbin, Douglas F; Sun, Da-Wen; Allen, Paul

    2012-01-01

    During the last two decades, a number of methods have been developed to objectively measure meat quality attributes. The hyperspectral imaging technique, as one of these methods, has been regarded as a smart and promising analytical tool for analyses conducted in research and industry. Recently there has been a renewed interest in using hyperspectral imaging in the quality evaluation of different food products. The main inducement for developing the hyperspectral imaging system is to integrate both spectroscopy and imaging techniques in one system, enabling direct identification of different components and their spatial distribution in the tested product. By combining spatial and spectral details, hyperspectral imaging has proved to be a promising technology for objective meat quality evaluation. The literature presented in this paper clearly reveals that hyperspectral imaging approaches have huge potential for gaining rapid information about the chemical structure and related physical properties of all types of meat. In addition to its ability to effectively quantify and characterize quality attributes of some important visual features of meat such as color, quality grade, marbling, maturity, and texture, it is able to measure multiple chemical constituents simultaneously without tedious sample preparation. Although this technology has not yet been sufficiently exploited in meat processing and quality assessment, its potential is promising. Developing a quality evaluation system based on hyperspectral imaging technology to assess meat quality parameters and to ensure authentication would bring economic benefits to the meat industry by increasing consumer confidence in the quality of meat products.
This paper provides a detailed overview of the recently developed approaches and latest research efforts exerted in hyperspectral imaging technology developed for evaluating the quality of different meat products and the possibility of its widespread

  3. Image quality assessment by preprocessing and full reference model combination

    NASA Astrophysics Data System (ADS)

    Bianco, S.; Ciocca, G.; Marini, F.; Schettini, R.

    2009-01-01

    This paper focuses on full-reference image quality assessment and presents different computational strategies aimed at improving the robustness and accuracy of some well-known and widely used state-of-the-art models, namely the Structural Similarity (SSIM) approach by Wang and Bovik and the S-CIELAB spatial-color model by Zhang and Wandell. We investigate the hypothesis that combining error images with a visual attention model could allow a better fit of the psycho-visual data of the LIVE Image Quality Assessment Database Release 2. We show that the proposed quality assessment metric correlates better with the experimental data.
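    For reference, the core SSIM comparison of Wang and Bovik between two images can be computed as below. Practical implementations evaluate this statistic in local (typically Gaussian-weighted) windows and average the resulting map; this single-window global version is only a sketch of the formula itself.

```python
import numpy as np

def ssim_global(x, y, L=255.0):
    """SSIM computed over the whole image as a single window.

    L is the dynamic range; C1, C2 are the standard stabilising constants.
    """
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2))
```

    Identical images score exactly 1; luminance, contrast, and structural distortions each pull the score below 1.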

  4. Method and tool for generating and managing image quality allocations through the design and development process

    NASA Astrophysics Data System (ADS)

    Sparks, Andrew W.; Olson, Craig; Theisen, Michael J.; Addiego, Chris J.; Hutchins, Tiffany G.; Goodman, Timothy D.

    2016-05-01

    Performance models for infrared imaging systems require image quality parameters; optical design engineers need image quality design goals; systems engineers develop image quality allocations to test imaging systems against. It is a challenge to maintain consistency and traceability amongst the various expressions of image quality. We present a method and parametric tool for generating and managing expressions of image quality during the system modeling, requirements specification, design, and testing phases of an imaging system design and development project.

  5. Image Quality Assessment Based on Inter-Patch and Intra-Patch Similarity

    PubMed Central

    Zhou, Fei; Lu, Zongqing; Wang, Can; Sun, Wen; Xia, Shu-Tao; Liao, Qingmin

    2015-01-01

    In this paper, we propose a full-reference (FR) image quality assessment (IQA) scheme, which evaluates image fidelity from two aspects: the inter-patch similarity and the intra-patch similarity. The scheme is performed in a patch-wise fashion so that a quality map can be obtained. On one hand, we investigate the disparity between one image patch and its adjacent ones. This disparity is visually described by an inter-patch feature, where the hybrid effect of luminance masking and contrast masking is taken into account. The inter-patch similarity is further measured by modifying the normalized correlation coefficient (NCC). On the other hand, we also attach importance to the impact of image contents within one patch on the IQA problem. For the intra-patch feature, we consider image curvature as an important complement of image gradient. According to local image contents, the intra-patch similarity is measured by adaptively comparing image curvature and gradient. Besides, a nonlinear integration of the inter-patch and intra-patch similarity is presented to obtain an overall score of image quality. The experiments conducted on six publicly available image databases show that our scheme achieves better performance in comparison with several state-of-the-art schemes. PMID:25793282

  6. Image quality assessment using Takagi-Sugeno-Kang fuzzy model

    NASA Astrophysics Data System (ADS)

    Đorđević, Dragana; Kukolj, Dragan; Schelkens, Peter

    2015-03-01

    The main aim of this paper is to present a non-linear image quality assessment model based on a fuzzy logic estimator, namely the Takagi-Sugeno-Kang fuzzy model. This image quality assessment model uses a clustered space of input objective metrics. The main advantages of the introduced quality model are the simplicity and understandability of its fuzzy rules. A 3rd-order polynomial model was chosen as the reference model. The parameters of the Takagi-Sugeno-Kang fuzzy model are optimized according to the criteria for mapping the selected set of input objective quality measures to the Mean Opinion Score (MOS) scale.
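    The inference step of a first-order Takagi-Sugeno-Kang model is a firing-strength-weighted average of linear rule consequents. A minimal sketch follows; the Gaussian membership functions and all parameter arrays are illustrative assumptions, not the clustered parameters the paper optimizes.

```python
import numpy as np

def tsk_predict(x, centers, sigmas, coeffs):
    """First-order TSK inference.

    centers, sigmas: (rules, dims) Gaussian membership parameters.
    coeffs: (rules, dims + 1) linear consequents y_r = b_r + a_r . x.
    Output is the firing-strength-weighted average of the rule outputs.
    """
    x = np.asarray(x, dtype=float)
    # firing strength of each rule (product of Gaussian memberships)
    w = np.exp(-np.sum((centers - x) ** 2 / (2.0 * sigmas ** 2), axis=1))
    y = coeffs[:, 0] + coeffs[:, 1:] @ x  # per-rule linear consequent
    return float(np.sum(w * y) / np.sum(w))
```

    With objective quality metrics as the input vector, the output plays the role of the predicted MOS value.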

  7. Objective analysis of image quality of video image capture systems

    NASA Astrophysics Data System (ADS)

    Rowberg, Alan H.

    1990-07-01

    As Picture Archiving and Communication System (PACS) technology has matured, video image capture has become a common way of capturing digital images from many modalities. While digital interfaces, such as those which use the ACR/NEMA standard, will become more common in the future, and are preferred because of the accuracy of image transfer, video image capture will be the dominant method in the short term, and may continue to be used for some time because of the low cost and high speed often associated with such devices. Currently, virtually all installed systems use methods of digitizing the video signal that is produced for display on the scanner viewing console itself. A series of digital test images has been developed for display on either a GE CT9800 or a GE Signa MRI scanner. These images have been captured with each of five commercially available image capture systems, and the resultant images digitally transferred on floppy disk to a PC1286 computer containing Optimast image analysis software. Here the images can be displayed in a comparative manner for visual evaluation, in addition to being analyzed statistically. Each of the images has been designed to support certain tests, including noise, accuracy, linearity, gray scale range, stability, slew rate, and pixel alignment. These image capture systems vary widely in these characteristics, in addition to the presence or absence of other artifacts, such as shading and moire patterns. Other accessories such as video distribution amplifiers and noise filters can also add or modify artifacts seen in the captured images, often giving unusual results. Each image is described, together with the tests which were performed using them. One image contains alternating black and white lines, each one pixel wide, after equilibration strips ten pixels wide. While some systems have a slew rate fast enough to track this correctly, others blur it to an average shade of gray, and do not resolve the lines, or give

  8. Feature maps driven no-reference image quality prediction of authentically distorted images

    NASA Astrophysics Data System (ADS)

    Ghadiyaram, Deepti; Bovik, Alan C.

    2015-03-01

    Current blind image quality prediction models rely on benchmark databases comprised of singly and synthetically distorted images, thereby learning image features that are only adequate to predict human perceived visual quality on such inauthentic distortions. However, real world images often contain complex mixtures of multiple distortions. Rather than a) discounting the effect of these mixtures of distortions on an image's perceptual quality and considering only the dominant distortion or b) using features that are only proven to be efficient for singly distorted images, we deeply study the natural scene statistics of authentically distorted images, in different color spaces and transform domains. We propose a feature-maps-driven statistical approach which avoids any latent assumptions about the type of distortion(s) contained in an image, and focuses instead on modeling the remarkable consistencies in the scene statistics of real world images in the absence of distortions. We design a deep belief network that takes model-based statistical image features derived from a very large database of authentically distorted images as input and discovers good feature representations by generalizing over different distortion types, mixtures, and severities, which are later used to learn a regressor for quality prediction. We demonstrate the remarkable competence of our features for improving automatic perceptual quality prediction on a benchmark database and on the newly designed LIVE Authentic Image Quality Challenge Database and show that our approach of combining robust statistical features and the deep belief network dramatically outperforms the state-of-the-art.

  9. Impact of image acquisition timing on image quality for dual energy contrast-enhanced breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Hill, Melissa L.; Mainprize, James G.; Puong, Sylvie; Carton, Ann-Katherine; Iordache, Razvan; Muller, Serge; Yaffe, Martin J.

    2012-03-01

    Dual-energy contrast-enhanced digital breast tomosynthesis (DE CE-DBT) image quality is affected by a large parameter space including the tomosynthesis acquisition geometry, imaging technique factors, the choice of reconstruction algorithm, and the subject breast characteristics. The influence of most of these factors on reconstructed image quality is well understood for DBT. However, due to the contrast agent uptake kinetics in CE imaging, the subject breast characteristics change over time, presenting a challenge for optimization. In this work we experimentally evaluate the sensitivity of the reconstructed image quality to timing of the low-energy and high-energy images and changes in iodine concentration during image acquisition. For four contrast uptake patterns, a variety of acquisition protocols were tested with different timing and geometry. The influence of the choice of reconstruction algorithm (SART or FBP) was also assessed. Image quality was evaluated in terms of the lesion signal-difference-to-noise ratio (LSDNR) in the central slice of DE CE-DBT reconstructions. Results suggest that for maximum image quality, the low- and high-energy image acquisitions should be made within one x-ray tube sweep, as separate low- and high-energy tube sweeps can degrade LSDNR. In terms of LSDNR per square-root dose, the image quality is nearly equal between SART reconstructions with 9 and 15 angular views, but using fewer angular views can result in a significant improvement in the quantitative accuracy of the reconstructions due to the shorter imaging time interval.

  10. Interplay between JPEG-2000 image coding and quality estimation

    NASA Astrophysics Data System (ADS)

    Pinto, Guilherme O.; Hemami, Sheila S.

    2013-03-01

    Image quality and utility estimators aspire to quantify the perceptual resemblance and the usefulness of a distorted image when compared to a reference natural image, respectively. Image-coders, such as JPEG-2000, traditionally aspire to allocate the available bits to maximize the perceptual resemblance of the compressed image when compared to a reference uncompressed natural image. Specifically, this can be accomplished by allocating the available bits to minimize the overall distortion, as computed by a given quality estimator. This paper applies five image quality and utility estimators, SSIM, VIF, MSE, NICE and GMSE, within a JPEG-2000 encoder for rate-distortion optimization to obtain new insights on how to improve JPEG-2000 image coding for quality and utility applications, as well as to improve the understanding about the quality and utility estimators used in this work. This work develops a rate-allocation algorithm for arbitrary quality and utility estimators within the Post-Compression Rate-Distortion Optimization (PCRD-opt) framework in JPEG-2000 image coding. Performance of the JPEG-2000 image coder when used with a variety of utility and quality estimators is then assessed. The estimators fall into two broad classes, magnitude-dependent (MSE, GMSE and NICE) and magnitude-independent (SSIM and VIF). They further differ on their use of the low-frequency image content in computing their estimates. The impact of these computational differences is analyzed across a range of images and bit rates. In general, performance of the JPEG-2000 coder below 1.6 bits/pixel with any of these estimators is highly content dependent, with the most relevant content being the amount of texture in an image and whether the strongest gradients in an image correspond to the main contours of the scene. Above 1.6 bits/pixel, all estimators produce visually equivalent images.
As a result, the MSE estimator provides the most consistent performance across all images, while specific

  11. Quantity and Quality of Computer Use and Academic Achievement: Evidence from a Large-Scale International Test Program

    ERIC Educational Resources Information Center

    Cheema, Jehanzeb R.; Zhang, Bo

    2013-01-01

    This study looked at the effect of both quantity and quality of computer use on achievement. The Program for International Student Assessment (PISA) 2003 student survey, comprising 4,356 students (boys, n = 2,129; girls, n = 2,227), was used to predict academic achievement from quantity and quality of computer use while controlling for…

  12. Image quality requirements for the digitization of photographic collections

    NASA Astrophysics Data System (ADS)

    Frey, Franziska S.; Suesstrunk, Sabine E.

    1996-02-01

    Managers of photographic collections in libraries and archives are exploring digital image database systems, but they usually have few sources of technical guidance and analysis available. Correctly digitizing photographs puts high demands on the imaging system and the human operators involved in the task. Pictures are very dense with information, requiring high-quality scanning procedures. In order to provide advice to libraries and archives seeking to digitize photographic collections, it is necessary to thoroughly understand the nature of the various originals and the purposes for digitization. Only with this understanding is it possible to choose adequate image quality for the digitization process. The higher the quality, the more expertise, time, and cost is likely to be involved in generating and delivering the image. Despite all the possibilities for endless copying, distributing, and manipulating of digital images, image quality choices made when the files are first created have the same 'finality' that they have in conventional photography. They will have a profound effect on project cost, the value of the final project to researchers, and the usefulness of the images as preservation surrogates. Image quality requirements therefore have to be established carefully before a digitization project starts.

  13. Blind noisy image quality evaluation using a deformable ant colony algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Li; Huang, Xiaotong; Tian, Jing; Fu, Xiaowei

    2014-04-01

    The objective of blind noisy image quality assessment is to evaluate the quality of the degraded noisy image without the knowledge of the ground truth image. Its performance relies on the accuracy of the noise statistics estimated from homogenous blocks. The major challenge of block-based approaches lies in the block size selection, as it affects the local noise derivation. To tackle this challenge, a deformable ant colony optimization (DACO) approach is proposed in this paper to adaptively adjust the ant size for image block selection. The proposed DACO approach considers that the size of the ant is adjustable during foraging. For the smooth image blocks, more pheromone is deposited, and then the size of ant is increased. Therefore, this strategy enables the ants to have dynamic food-search capability, leading to more accurate selection of homogeneous blocks. Furthermore, the regression analysis is used to obtain image quality score by exploiting the above-estimated noise statistics. Experimental results are provided to justify that the proposed approach outperforms conventional approaches to provide more accurate noise statistics estimation and achieve a consistent image quality evaluation performance for both the artificially generated and real-world noisy images.
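    The block-based noise-statistics idea underlying this approach can be sketched with fixed-size blocks, as below. This is only an illustration of the principle (select the most homogeneous blocks and average their standard deviations); the paper's deformable ant colony instead grows the block size adaptively during foraging, which this sketch does not reproduce.

```python
import numpy as np

def estimate_noise_std(img, block=8, frac=0.1):
    """Estimate the noise sigma of an image from its most homogeneous blocks.

    Tiles the image into block x block patches, computes each patch's
    standard deviation, and averages the lowest `frac` fraction, on the
    assumption that the flattest patches contain mostly noise.
    """
    H = (img.shape[0] // block) * block
    W = (img.shape[1] // block) * block
    tiles = img[:H, :W].reshape(H // block, block, W // block, block).swapaxes(1, 2)
    stds = tiles.std(axis=(2, 3)).ravel()
    k = max(1, int(frac * stds.size))
    return float(np.sort(stds)[:k].mean())
```

    On textured images this fixed-block estimator is biased low, which is exactly the weakness the adaptive block selection is meant to address.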

  14. Image Quality of the Evryscope: Method for On-Site Optical Alignment

    NASA Astrophysics Data System (ADS)

    Wulfken, Philip J.; Law, Nicholas M.; Ratzloff, Jeffrey; Fors, Octavi

    2015-01-01

    Previous wide field surveys have been conducted by taking many images each night to cover thousands of square degrees. The Evryscope is a new type of system designed to search for transiting exoplanets around nearby bright stars, M-dwarfs, and white dwarfs, and for other transients. The Evryscope is an array of 70 mm telescopes that will continuously image 10,200 square degrees of the night sky at once. One of the image quality requirements is for the PSFs to be well-sampled at two pixels across, and it was found that tilt caused by slight misalignment between the optics and the CCD increased the FWHM towards the edges and corners of the image. Here we describe the image quality of the Evryscope cameras and the alignment procedure to achieve the required 2-pixel FWHM.

  15. MEO based secured, robust, high capacity and perceptual quality image watermarking in DWT-SVD domain.

    PubMed

    Gunjal, Baisa L; Mali, Suresh N

    2015-01-01

    The aim of this paper is to present a multiobjective evolutionary optimizer (MEO) based, highly secured and strongly robust image watermarking technique using the discrete wavelet transform (DWT) and singular value decomposition (SVD). Many researchers have failed to achieve optimization of perceptual quality and robustness with high-capacity watermark embedding. Here, we achieved optimized peak signal-to-noise ratio (PSNR) and normalized correlation (NC) using the MEO. Strong security is implemented through eight different security levels, including watermark scrambling by the Fibonacci-Lucas transformation (FLT). The Haar wavelet is selected for DWT decomposition to compare the practical performance of wavelets from different wavelet families. The technique is non-blind and tested with cover images of size 512×512 and a grey scale watermark of size 256×256. The achieved perceptual quality in terms of PSNR is 79.8611 dB for the Lena image, 87.8446 dB for peppers, and 93.2853 dB for lake, obtained by varying the scale factor K1 from 1 to 5. All candidate images used for testing, namely Lena, peppers, and lake, show exact recovery of the watermark, giving NC equal to 1. Robustness is tested against a variety of attacks on the watermarked image. The experimental demonstration proved that the proposed method gives NC greater than 0.96 for the majority of attacks under consideration. The performance of this technique is found to be superior to all existing hybrid image watermarking techniques under consideration. PMID:25830081
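    The two objectives optimized above are standard metrics and can be computed as follows. This is a generic sketch of PSNR (imperceptibility, between cover and watermarked image) and NC (robustness, between embedded and extracted watermark), not the paper's MEO or embedding code.

```python
import numpy as np

def psnr(orig, recon, peak=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    mse = np.mean((orig.astype(float) - recon.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def normalized_correlation(w, w_extracted):
    """Cosine-style normalized correlation between two watermarks (1 = exact recovery)."""
    a = w.astype(float).ravel()
    b = w_extracted.astype(float).ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```

    In a watermarking evaluation, PSNR is reported for cover vs. watermarked images and NC for the original vs. attacked-then-extracted watermark.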

  16. Digital Receptor Image Quality Evaluation: Effect of Different Filtration Schemes

    NASA Astrophysics Data System (ADS)

    Murphy, Simon; Christianson, Olav; Amurao, Maxwell; Samei, Ehsan

    2010-04-01

    The International Electrotechnical Commission provides a standard measurement methodology to enable performance intercomparison between imaging systems. Its formalism specifies beam quality based on the half value layer attained by a target kVp and additional Al filtration. Similar beam quality may be attained more conveniently using a filtration combination of Cu and Al. This study aimed to compare the two filtration schemes by their effects on image quality in terms of signal-difference-to-noise ratio, spatial resolution, exposure index, noise power spectrum, modulation transfer function, and detective quantum efficiency. A comparative assessment of the images was performed by analyzing a commercially available image quality assessment phantom and by following the IEC 62220-3 formalism.

  17. The use of modern electronic flat panel devices for image guided radiation therapy: Image quality comparison, intra-fraction motion monitoring and quality assurance applications

    NASA Astrophysics Data System (ADS)

    Nill, S.; Stützel, J.; Häring, P.; Oelfke, U.

    2008-06-01

    With modern radiotherapy delivery techniques like intensity modulated radiotherapy (IMRT) it is possible to deliver a more conformal dose distribution to the tumor while better sparing the organs at risk (OAR) compared to 3D conventional radiation therapy. Due to the theoretically achievable high dose conformity, it is very important to know the exact position of the target volume during treatment. With more and more modern linear accelerators equipped with imaging devices, this is now becoming possible. These imaging devices use energies between 120 kV and 6 MV and therefore different detector systems are used, but the vast majority use amorphous silicon flat panel devices with different scintillator screens and build-up materials. The technical details and the image quality of these systems are discussed and first results of the comparison are presented. In addition, new methods to deal with motion management and quality assurance procedures are briefly discussed.

  18. Figure of Image Quality and Information Capacity in Digital Mammography

    PubMed Central

    Michail, Christos M.; Kalyvas, Nektarios E.; Valais, Ioannis G.; Fudos, Ioannis P.; Fountos, George P.; Dimitropoulos, Nikos; Kandarakis, Ioannis S.

    2014-01-01

    Objectives. In this work, a simple technique to assess the image quality characteristics of the postprocessed image is developed and an easy to use figure of image quality (FIQ) is introduced. This FIQ characterizes images in terms of resolution and noise. In addition, information capacity, defined within the context of Shannon's information theory, was used as an overall image quality index. Materials and Methods. A digital mammographic image was postprocessed with three digital filters. Resolution and noise were calculated via the Modulation Transfer Function (MTF), the coefficient of variation, and the figure of image quality. In addition, frequency-dependent parameters such as the noise power spectrum (NPS) and noise equivalent quanta (NEQ) were estimated and used to assess information capacity. Results. FIQs for the “raw image” data and the image processed with the “sharpen edges” filter were found to be 907.3 and 1906.1, respectively. The information capacity values were 60.86 × 10^3 and 78.96 × 10^3 bits/mm^2. Conclusion. It was found that, after the application of the postprocessing techniques (even commercial nondedicated software) on the raw digital mammograms, MTF, NPS, and NEQ are improved for medium to high spatial frequencies, leading to resolving smaller structures in the final image. PMID:24895593
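    The noise power spectrum mentioned above is typically estimated from the 2-D DFT of a detrended uniform-exposure region. A minimal sketch follows; the pixel pitch value is illustrative, and practice would average many ROIs and apply windowing, which this sketch omits.

```python
import numpy as np

def noise_power_spectrum(noise_roi, pixel_pitch=0.1):
    """2-D NPS estimate (units of signal^2 * mm^2) from one detrended noise ROI.

    noise_roi: 2-D array of a uniform-exposure region; pixel_pitch in mm.
    By Parseval's theorem, the mean of this NPS equals var(roi) * pitch^2.
    """
    n = noise_roi - noise_roi.mean()          # remove the DC (mean) level
    N, M = n.shape
    dft = np.fft.fftshift(np.fft.fft2(n))     # zero frequency at the centre
    return (np.abs(dft) ** 2) * (pixel_pitch ** 2) / (N * M)
```

    Combined with the MTF, such an NPS estimate yields the NEQ used in the abstract's information-capacity calculation.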

  19. Pléiades Project: Assessment of Georeferencing Accuracy, Image Quality, Pansharpening Performance and DSM/DTM Quality

    NASA Astrophysics Data System (ADS)

    Topan, Hüseyin; Cam, Ali; Özendi, Mustafa; Oruç, Murat; Jacobsen, Karsten; Taşkanat, Talha

    2016-06-01

    Pléiades 1A and 1B are twin optical satellites of the Optical and Radar Federated Earth Observation (ORFEO) program jointly run by France and Italy. They are the first satellites of Europe with sub-meter resolution. Airbus DS (formerly Astrium Geo) runs a MyGIC (formerly Pléiades Users Group) program to validate Pléiades images worldwide for various application purposes. The authors conduct three projects: one within this program, the second supported by the BEU Scientific Research Project Program, and the third supported by TÜBİTAK. Georeferencing accuracy, image quality, pansharpening performance and Digital Surface Model/Digital Terrain Model (DSM/DTM) quality are investigated in these projects. For these purposes, triplet panchromatic (50 cm Ground Sampling Distance (GSD)) and VNIR (2 m GSD) Pléiades 1A images were investigated over the Zonguldak test site (Turkey), which is urbanised, mountainous and covered by dense forest. The georeferencing accuracy was estimated with a standard deviation in X and Y (SX, SY) in the range of 0.45 m by bias-corrected Rational Polynomial Coefficient (RPC) orientation, using ~170 Ground Control Points (GCPs). 3D standard deviations of ±0.44 m in X, ±0.51 m in Y, and ±1.82 m in Z have been reached in spite of the very narrow angle of convergence by bias-corrected RPC orientation. The image quality was also investigated with respect to effective resolution, Signal to Noise Ratio (SNR) and blur coefficient. The effective resolution was estimated with a factor slightly below 1.0, meaning that the image quality corresponds to the nominal resolution of 50 cm. The blur coefficients were between 0.39 and 0.46 for the triplet panchromatic images, indicating satisfying image quality. SNR is in the range of other comparable spaceborne images, which may be caused by de-noising of the Pléiades images. The pansharpened images were generated by various methods, and are validated by most common statistical

  20. Image quality evaluation and control of computer-generated holograms

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Hiroshi; Yamaguchi, Takeshi; Uetake, Hiroki

    2016-03-01

    Image quality of computer-generated holograms is usually evaluated subjectively. For example, the reconstructed image from the hologram is compared with other holograms, or evaluated by the double-stimulus impairment scale method for comparison with the original image. This paper proposes an objective image quality evaluation of a computer-generated hologram by evaluating both diffraction efficiency and peak signal-to-noise ratio. Theory and numerical experimental results are shown for Fourier transform transmission holograms of both amplitude and phase modulation. Results without the optimized random phase show that the amplitude transmission hologram gives a better peak signal-to-noise ratio, but the phase transmission hologram provides about 10 times higher diffraction efficiency than the amplitude type. As an optimized phase hologram, a kinoform is evaluated. In addition, we investigate controlling image quality by non-linear operation.

  1. Dosimetry and image quality assessment in a direct radiography system

    PubMed Central

    Oliveira, Bruno Beraldo; de Oliveira, Marcio Alves; Paixão, Lucas; Teixeira, Maria Helena Araújo; Nogueira, Maria do Socorro

    2014-01-01

    Objective To evaluate the mean glandular dose with a solid state detector and the image quality in a direct radiography system, utilizing phantoms. Materials and Methods Irradiations were performed with automatic exposure control and polymethyl methacrylate slabs with different thicknesses to calculate glandular dose values. The image quality was evaluated by means of the structures visualized on the images of the phantoms. Results Considering the uncertainty of the measurements, the mean glandular dose results are in agreement with the values provided by the equipment and with internationally adopted reference levels. Results obtained from images of the phantoms were in agreement with the reference values. Conclusion The present study contributes to verify the equipment conformity as regards dose values and image quality. PMID:25741119

  2. Impact of Computed Tomography Image Quality on Image-Guided Radiation Therapy Based on Soft Tissue Registration

    SciTech Connect

    Morrow, Natalya V.; Lawton, Colleen A.; Qi, X. Sharon; Li, X. Allen

    2012-04-01

    Purpose: In image-guided radiation therapy (IGRT), different computed tomography (CT) modalities with varying image quality are being used to correct for interfractional variations in patient set-up and anatomy changes, thereby reducing clinical target volume to planning target volume (CTV-to-PTV) margins. We explore how CT image quality affects patient repositioning and CTV-to-PTV margins in soft tissue registration-based IGRT for prostate cancer patients. Methods and Materials: Four CT-based IGRT modalities used for prostate RT were considered in this study: MV fan beam CT (MVFBCT) (Tomotherapy), MV cone beam CT (MVCBCT) (MVision; Siemens), kV fan beam CT (kVFBCT) (CTVision, Siemens), and kV cone beam CT (kVCBCT) (Synergy; Elekta). Daily shifts were determined by manual registration to achieve the best soft tissue agreement. The effect of image quality on patient repositioning was determined by statistical analysis of daily shifts for 136 patients (34 per modality). Inter- and intraobserver variability of soft tissue registration was evaluated based on the registration of a representative scan for each CT modality with its corresponding planning scan. Results: Superior image quality with the kVFBCT resulted in reduced uncertainty in soft tissue registration during IGRT compared with the other IGRT modalities. The largest interobserver variations of soft tissue registration were 1.1 mm, 2.5 mm, 2.6 mm, and 3.2 mm for kVFBCT, kVCBCT, MVFBCT, and MVCBCT, respectively. Conclusions: Image quality adversely affects the reproducibility of soft tissue-based registration for IGRT and necessitates a careful consideration of residual uncertainties in determining different CTV-to-PTV margins for IGRT using different image modalities.

  3. Contrast vs noise effects on image quality

    NASA Astrophysics Data System (ADS)

    Hadar, Ofer; Corse, N.; Rotman, Stanley R.; Kopeika, Norman S.

    1996-11-01

    Low noise images are contrast-limited, and image restoration techniques can improve resolution significantly. However, as noise level increases, resolution improvements via image processing become more limited because image restoration increases noise. This research attempts to construct a reliable quantitative means of characterizing the perceptual difference between target and background. A method is suggested for evaluating the extent to which it is possible to discriminate an object that has merged with its surroundings, in noise-limited and contrast-limited images, i.e., how hard it would be for an observer to recognize the object against various backgrounds as a function of noise level. The suggested model is initially a first-order model, using a regular bar-chart with additive uncorrelated Gaussian noise degraded by standard atmospheric blurring filters. The second phase will comprise a model dealing with higher-order images. This computational model relates the detectability or distinctness of the object to measurable parameters. It must also characterize human perceptual response, i.e. the model must develop metrics that are highly correlated with the ease or difficulty the human observer experiences in discerning the target from its background. This requirement can be fulfilled only by conducting psychophysical experiments quantitatively comparing the perceptual evaluations of the observers with the results of the mathematical model.

  4. Dose reduction and image quality optimizations in CT of pediatric and adult patients: phantom studies

    NASA Astrophysics Data System (ADS)

    Jeon, P.-H.; Lee, C.-L.; Kim, D.-H.; Lee, Y.-J.; Jeon, S.-S.; Kim, H.-J.

    2014-03-01

    Multi-detector computed tomography (MDCT) can be used to easily and rapidly perform numerous acquisitions, possibly leading to a marked increase in the radiation dose to individual patients. Technical options dedicated to automatically adjusting the acquisition parameters according to the patient's size are of specific interest in pediatric radiology. A constant tube potential reduction can be achieved for adults and children, while maintaining a constant detector energy fluence. To evaluate radiation dose, the weighted CT dose index (CTDIw) was calculated based on the CT dose index (CTDI) measured using an ion chamber, and image noise and image contrast were measured from a scanned image to evaluate image quality. The dose-weighted contrast-to-noise ratio (CNRD) was calculated from the radiation dose, image noise, and image contrast measured from a scanned image. The noise derivative (ND) is a quality index for dose efficiency. X-ray spectra with tube voltages ranging from 80 to 140 kVp were used to compute the average photon energy. Image contrast and the corresponding contrast-to-noise ratio (CNR) were determined for lesions of soft tissue, muscle, bone, and iodine relative to a uniform water background, as the iodine contrast increases at lower energy (i.e., the k-edge of iodine at 33 keV is closer to the beam energy), using mixed water-iodine contrast normalization (water 0, iodine 25, 100, 200, and 1000 HU, respectively). The proposed values correspond to high quality images and can be reduced if only high-contrast organs are assessed. The potential benefit of lowering the tube voltage is an improved CNRD, resulting in a lower radiation dose and optimization of image quality. Adjusting the tube potential in abdominal CT would be useful in current pediatric radiography, where the choice of X-ray techniques generally takes into account the size of the patient as well as the need to balance the conflicting requirements of diagnostic image quality and radiation dose.
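    The CNR and dose-weighted CNR used above have straightforward definitions; a minimal sketch with illustrative numbers (the variable names and example values are mine, not the paper's):

```python
import math

def cnr(hu_lesion: float, hu_background: float, noise_sd: float) -> float:
    """Contrast-to-noise ratio: HU contrast between lesion and
    background, divided by the image noise (standard deviation)."""
    return abs(hu_lesion - hu_background) / noise_sd

def cnrd(cnr_value: float, ctdi_w_mgy: float) -> float:
    """Dose-weighted CNR: CNR normalized by the square root of CTDIw,
    so protocols at different dose levels can be compared."""
    return cnr_value / math.sqrt(ctdi_w_mgy)

# Example: a 25 HU iodine insert on water (0 HU), 5 HU noise, CTDIw 16 mGy
c = cnr(25.0, 0.0, 5.0)   # -> 5.0
d = cnrd(c, 16.0)         # -> 1.25
```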

  5. HgCdTe Detectors for Space and Science Imaging: General Issues and Latest Achievements

    NASA Astrophysics Data System (ADS)

    Gravrand, O.; Rothman, J.; Cervera, C.; Baier, N.; Lobre, C.; Zanatta, J. P.; Boulade, O.; Moreau, V.; Fieque, B.

    2016-05-01

    HgCdTe (MCT) is a very versatile material system for infrared (IR) detection, suitable for high performance detection in a wide range of applications and spectral ranges. Indeed, the ability to tailor the cutoff frequency as close as possible to the needs makes it a perfect candidate for high performance detection. Moreover, the high quality material available today, grown either by molecular beam epitaxy or liquid phase epitaxy, allows for very low dark currents at low temperatures, suitable for low flux detection applications such as science imaging. MCT has also demonstrated robustness to the aggressive environment of space and therefore faces a large demand for space applications. A satellite may stare at the earth, in which case detection usually involves a lot of photons, called a high flux scenario. Alternatively, a satellite may stare at outer space for science purposes, in which case the detected photon number is very low, leading to low flux scenarios. This latter case induces very strong constraints on the detector: low dark current, low noise, and (very) large focal plane arrays. The classical structures used to fulfill those requirements are usually p/n MCT photodiodes. This type of structure has been deeply investigated in our laboratory for different spectral bands, in collaboration with the CEA Astrophysics lab. However, another alternative with low excess noise may also be investigated: MCT n/p avalanche photodiodes (APDs). This paper reviews the latest achievements obtained on this matter at DEFIR (LETI and Sofradir common laboratory), from short wave infrared (SWIR) band detection for classical astronomical needs, to the long wave infrared (LWIR) band for exoplanet transit spectroscopy, up to very long wave infrared (VLWIR) bands. The different available diode architectures (n/p VHg or p/n, or even APDs) are reviewed, including different available ROIC architectures for low flux detection.

  7. Latest achievements on MCT IR detectors for space and science imaging

    NASA Astrophysics Data System (ADS)

    Gravrand, O.; Rothman, J.; Castelein, P.; Cervera, C.; Baier, N.; Lobre, C.; De Borniol, E.; Zanatta, J. P.; Boulade, O.; Moreau, V.; Fieque, B.; Chorier, P.

    2016-05-01

    HgCdTe (MCT) is a very versatile material for IR detection. Indeed, the ability to tailor the cutoff frequency as close as possible to the detection needs makes it a perfect candidate for high performance detection in a wide range of applications and spectral ranges. Moreover, the high quality material available today, grown either by liquid phase epitaxy (LPE) or molecular beam epitaxy (MBE), allows for very low dark currents at low temperatures and makes it suitable for very low flux detection applications such as science imaging. MCT has also demonstrated its robustness to the aggressive space environment and therefore faces a large demand for space applications such as staring at outer space for science purposes, in which case the detected photon number is very low. This induces very strong constraints on the detector: low dark current, low noise, low persistence, and (very) large focal plane arrays. The MCT diode structure adapted to fulfill those requirements is naturally the p/n photodiode. Following the developments of this technology made at DEFIR and transferred to Sofradir in the MWIR and LWIR ranges for tactical applications, our laboratory has consequently investigated its adaptation for ultra-low flux in different spectral bands, in collaboration with the CEA Astrophysics lab. Another alternative for ultra-low flux applications in the SWIR range has also been investigated: low excess noise MCT n/p avalanche photodiodes (APDs). Those APDs may in some cases open the gate to sub-electron noise IR detection. This paper will review the latest achievements obtained on this matter at DEFIR (CEA-LETI and Sofradir common laboratory), from short wave (SWIR) band detection for classical astronomical needs, to the long wave (LWIR) band for exoplanet transit spectroscopy, up to the very long wave (VLWIR) bands.

  8. Objective image quality assessment based on support vector regression.

    PubMed

    Narwaria, Manish; Lin, Weisi

    2010-03-01

    Objective image quality estimation is useful in many visual processing systems, and is difficult to perform in line with human perception. The challenge lies in formulating effective features and fusing them into a single number to predict the quality score. In this brief, we propose a new approach to address the problem, using singular vectors out of singular value decomposition (SVD) as features for quantifying major structural information in images, and then support vector regression (SVR) for automatic prediction of image quality. The feature selection with singular vectors is novel and general for gauging structural changes in images as a good representative of visual quality variations. The use of SVR exploits the advantages of machine learning, with the ability to learn complex data patterns for an effective and generalized mapping of features into a desired score, in contrast to the feature pooling process often used in existing image quality estimators; this overcomes the difficulty of determining model parameters for such a system to emulate the related, complex human visual system (HVS) characteristics. Experiments conducted with three independent databases confirm the effectiveness of the proposed system in predicting image quality with better alignment with the HVS's perception than the relevant existing work. The tests with untrained distortions and databases further demonstrate the robustness of the system and the importance of the feature selection. PMID:20100674
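    The SVD-feature idea can be illustrated simply: the singular vectors of a distorted image drift away from those of the reference, and that drift can feed a learned regressor. The sketch below computes only the raw structural-change feature (the SVR training stage is omitted, and the sign-alignment detail is my assumption, not necessarily the paper's exact feature):

```python
import numpy as np

def svd_feature(reference: np.ndarray, distorted: np.ndarray) -> float:
    """Mean absolute deviation between corresponding singular vectors of
    the reference and distorted images; larger = more structural change."""
    u_r, _, vt_r = np.linalg.svd(reference, full_matrices=False)
    u_d, _, vt_d = np.linalg.svd(distorted, full_matrices=False)
    # Singular vectors are sign-ambiguous; align signs before comparing.
    sign = np.sign(np.sum(u_r * u_d, axis=0, keepdims=True))
    sign[sign == 0] = 1.0
    return float(np.mean(np.abs(u_r - u_d * sign))
                 + np.mean(np.abs(vt_r - vt_d * sign.T)))
```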

  9. Imaging quality full chip verification for yield improvement

    NASA Astrophysics Data System (ADS)

    Yang, Qing; Zhou, CongShu; Quek, ShyueFong; Lu, Mark; Foong, YeeMei; Qiu, JianHong; Pandey, Taksh; Dover, Russell

    2013-04-01

    Basic image intensity parameters, like maximum and minimum intensity values (Imin and Imax), image logarithm slope (ILS), normalized image logarithm slope (NILS) and mask error enhancement factor (MEEF), are well known indices of photolithography imaging quality. For full chip verification, hotspot detection is typically based on threshold values for line pinching or bridging. For image intensity parameters it is generally harder to quantify an absolute value defining where the process limit will occur, and at which process stage: lithography, etch, or post-CMP. However, it is easy to conclude that hot spots captured by image intensity parameters are more susceptible to process variation and very likely to impact yield. In addition, these image intensity hot spots can be missed by resist model verification, because the resist model is normally calibrated with wafer data on a single resist plane and is an empirical model that fits the resist critical dimension with a mathematical algorithm combined with optical calculation. At the resolution enhancement technology (RET) development stage, a full chip imaging quality check is also a method to qualify the RET solution, such as Optical Proximity Correction (OPC) performance. Adding full chip verification using image intensity parameters is also less costly than adding one more resist model simulation. From a foundry yield improvement and cost saving perspective, it is valuable to quantify the imaging quality to find design hot spots and correctly define the inline process control margin. This paper studies the correlation between image intensity parameters and process weakness or catastrophic hard failures at different process stages. It also demonstrates how an OPC solution can improve full chip image intensity parameters. Rigorous 3D resist profile simulation across the full height of the resist stack was also performed to identify a correlation to the image intensity parameters.
A methodology of post-OPC full
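    Two of the intensity metrics named above have simple one-dimensional definitions: the image log slope is the spatial derivative of the logarithm of the aerial-image intensity at a feature edge, and NILS normalizes it by the feature's critical dimension (CD). A minimal sketch (the function name, sample profile, and units are illustrative assumptions):

```python
import numpy as np

def ils_nils(intensity: np.ndarray, dx_um: float, cd_um: float, edge_idx: int):
    """ILS (1/um) at a feature edge of a 1-D aerial-image intensity
    profile, and the normalized form NILS = CD * ILS."""
    log_i = np.log(intensity)
    ils = abs(np.gradient(log_i, dx_um)[edge_idx])
    return ils, cd_um * ils
```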

  10. Applying image quality in cell phone cameras: lens distortion

    NASA Astrophysics Data System (ADS)

    Baxter, Donald; Goma, Sergio R.; Aleksic, Milivoje

    2009-01-01

    This paper describes the framework used in one of the pilot studies run under the I3A CPIQ initiative to quantify overall image quality in cell-phone cameras. The framework is based on a multivariate formalism that tries to predict overall image quality from individual image quality attributes, and was validated in a CPIQ pilot program. The pilot study focuses on image quality distortions introduced in the optical path of a cell-phone camera, which may or may not be corrected in the image processing path. The assumption is that the captured image is JPEG compressed and the cell-phone camera is set to 'auto' mode. As the framework requires the individual attributes to be relatively perceptually orthogonal, the attributes used in the pilot study are lens geometric distortion (LGD) and lateral chromatic aberration (LCA). The goal of this paper is to present the framework of this pilot project, from the definition of the individual attributes up to their quantification in JNDs of quality, a requirement of the multivariate formalism; therefore both objective and subjective evaluations were used. A major distinction of the objective part from the 'DSC imaging world' is that the LCA/LGD distortions found in cell-phone cameras rarely exhibit radial behavior, so a radial mapping/model cannot be used in this case.

  11. What Is Quality Education? How Can It Be Achieved? The Perspectives of School Middle Leaders in Singapore

    ERIC Educational Resources Information Center

    Ng, Pak Tee

    2015-01-01

    This paper presents the findings of a research project that examines how middle leaders in Singapore schools understand "quality education" and how they think quality education can be achieved. From the perspective of these middle leaders, quality education emphasises holistic development, equips students with the knowledge and skills…

  12. Full-reference quality assessment of stereoscopic images by learning binocular receptive field properties.

    PubMed

    Shao, Feng; Li, Kemeng; Lin, Weisi; Jiang, Gangyi; Yu, Mei; Dai, Qionghai

    2015-10-01

    Quality assessment of 3D images encounters more challenges than its 2D counterparts. Directly applying 2D image quality metrics is not the solution. In this paper, we propose a new full-reference quality assessment for stereoscopic images by learning binocular receptive field properties to be more in line with human visual perception. To be more specific, in the training phase, we learn a multiscale dictionary from the training database, so that the latent structure of images can be represented as a set of basis vectors. In the quality estimation phase, we compute sparse feature similarity index based on the estimated sparse coefficient vectors by considering their phase difference and amplitude difference, and compute global luminance similarity index by considering luminance changes. The final quality score is obtained by incorporating binocular combination based on sparse energy and sparse complexity. Experimental results on five public 3D image quality assessment databases demonstrate that in comparison with the most related existing methods, the devised algorithm achieves high consistency with subjective assessment. PMID:26011880

  13. Importance of the grayscale in early assessment of image quality gains with iterative CT reconstruction

    NASA Astrophysics Data System (ADS)

    Noo, F.; Hahn, K.; Guo, Z.

    2016-03-01

    Iterative reconstruction methods have become an important research topic in X-ray computed tomography (CT), due to their ability to yield improvements in image quality in comparison with the classical filtered backprojection method. There are many ways to design an effective iterative reconstruction method. Moreover, for each design, there may be a large number of parameters that can be adjusted. Thus, early assessment of image quality, before clinical deployment, plays a large role in identifying and refining solutions. Currently, there are few publications reporting on early, task-based assessment of image quality achieved with iterative reconstruction methods. We report here on such an assessment, and we illustrate at the same time the importance of the grayscale used for image display when conducting this type of assessment. Our results further support observations made by others that the edge-preserving penalty term used in iterative reconstruction is a key ingredient in improving image quality in terms of a detection task. Our results also provide a clear demonstration of an implication made in one of our previous publications, namely that the grayscale window plays an important role in image quality comparisons involving iterative CT reconstruction methods.

  14. Sci—Fri AM: Mountain — 02: A comparison of dose reduction methods on image quality for cone beam CT

    SciTech Connect

    Webb, R; Buckley, LA

    2014-08-15

    Modern radiotherapy uses highly conformal dose distributions and therefore relies on daily image guidance for accurate patient positioning. Kilovoltage cone beam CT is one technique that is routinely used for patient set-up and results in a high dose to the patient relative to planar imaging techniques. This study uses an Elekta Synergy linac equipped with XVI cone beam CT to investigate the impact of various imaging parameters on dose and image quality. Dose and image quality are assessed as functions of x-ray tube voltage, tube current and the number of projections in the scan. In each case, the dose measurements confirm that as each parameter increases the dose increases. The assessment of high contrast resolution shows little dependence on changes to the imaging technique. However, low contrast visibility suggests a trade-off between dose and image quality. Particularly for changes in tube potential, the dose increases much faster as a function of voltage than the corresponding increase in low contrast image quality. This suggests using moderate values of the peak tube voltage (100-120 kVp), since higher values result in significant dose increases with little gain in image quality. Measurements also indicate that increasing tube current achieves the greatest degree of improvement in low contrast visibility. The results of this study highlight the need to establish careful imaging protocols to limit dose to the patient and to limit changes to the imaging parameters to those cases where there is a clear clinical requirement for improved image quality.

  15. Raman chemical imaging technology for food safety and quality evaluation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Raman chemical imaging combines Raman spectroscopy and digital imaging to visualize composition and morphology of a target. This technique offers great potential for food safety and quality research. Most commercial Raman instruments perform measurement at microscopic level, and the spatial range ca...

  16. A feature-enriched completely blind image quality evaluator.

    PubMed

    Lin Zhang; Lei Zhang; Bovik, Alan C

    2015-08-01

    Existing blind image quality assessment (BIQA) methods are mostly opinion-aware. They learn regression models from training images with associated human subjective scores to predict the perceptual quality of test images. Such opinion-aware methods, however, require a large amount of training samples with associated human subjective scores and a variety of distortion types. The BIQA models learned by opinion-aware methods often have weak generalization capability, thereby limiting their usability in practice. By comparison, opinion-unaware methods do not need human subjective scores for training, and thus have greater potential for good generalization capability. Unfortunately, thus far no opinion-unaware BIQA method has shown consistently better quality prediction accuracy than the opinion-aware methods. Here, we aim to develop an opinion-unaware BIQA method that can compete with, and perhaps outperform, the existing opinion-aware methods. By integrating features of natural image statistics derived from multiple cues, we learn a multivariate Gaussian model of image patches from a collection of pristine natural images. Using the learned multivariate Gaussian model, a Bhattacharyya-like distance is used to measure the quality of each image patch, and then an overall quality score is obtained by average pooling. The proposed BIQA method does not need any distorted sample images or subjective quality scores for training, yet extensive experiments demonstrate its superior quality-prediction performance to the state-of-the-art opinion-aware BIQA methods. The MATLAB source code of our algorithm is publicly available at www.comp.polyu.edu.hk/~cslzhang/IQA/ILNIQE/ILNIQE.htm. PMID:25915960
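    The Bhattacharyya distance between two multivariate Gaussians, which distances of this kind adapt, has a closed form: one eighth of the Mahalanobis term under the averaged covariance plus a log-determinant term. A direct sketch of that standard formula (not the paper's modified variant):

```python
import numpy as np

def bhattacharyya_gaussian(mu1, cov1, mu2, cov2) -> float:
    """Closed-form Bhattacharyya distance between two multivariate Gaussians."""
    mu1, mu2 = np.asarray(mu1, float), np.asarray(mu2, float)
    cov1, cov2 = np.asarray(cov1, float), np.asarray(cov2, float)
    cov = (cov1 + cov2) / 2.0                      # averaged covariance
    diff = mu1 - mu2
    term1 = diff @ np.linalg.solve(cov, diff) / 8.0
    _, logdet = np.linalg.slogdet(cov)
    _, logdet1 = np.linalg.slogdet(cov1)
    _, logdet2 = np.linalg.slogdet(cov2)
    term2 = 0.5 * (logdet - 0.5 * (logdet1 + logdet2))
    return float(term1 + term2)
```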

  17. Prospective optimization of CT under tube current modulation: II. image quality

    NASA Astrophysics Data System (ADS)

    Tian, Xiaoyu; Wilson, Josh; Frush, Donald; Samei, Ehsan

    2014-03-01

    Despite the significant clinical benefits of computed tomography (CT) in providing diagnostic information for a broad range of diseases, concerns have been raised regarding the potential cancer risk induced by CT radiation exposure. In that regard, optimizing CT protocols and minimizing radiation dose have become core problems for the CT community. To develop strategies to optimize radiation dose, it is crucial to effectively characterize CT image quality. Such image quality estimates need to be prospective to ensure that optimization can be performed before the scan is initiated. The purpose of this study was to establish a phantom-based methodology to predict quantum noise in CT images as a first step in our image quality prediction. Quantum noise was measured using a variable-sized phantom under clinical protocols. The mathematical relationship between noise and water-equivalent diameter (Dw) was further established. The prediction was achieved by ascribing a noise value to a patient according to the patient's water-equivalent diameter. The prediction accuracy was evaluated in anthropomorphic phantoms across a broad range of sizes, anatomy, and reconstruction algorithms. The differences between the measured and predicted noise were within 10% for anthropomorphic phantoms across all sizes and anatomy. This study proposed a practically applicable technique to predict noise in CT images. With a prospective estimate of the image quality level, the scanning parameters can then be adjusted to ensure optimized imaging performance.
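    The prediction step amounts to fitting measured phantom noise against water-equivalent diameter and evaluating the fit at a patient's Dw. The exponential model and the calibration numbers below are illustrative assumptions, not the paper's actual fit:

```python
import math

def fit_exponential(points):
    """Least-squares fit of noise = a * exp(b * Dw) via log-linear regression.
    `points` is a list of (Dw_cm, noise_HU) phantom measurements."""
    n = len(points)
    xs = [d for d, _ in points]
    ys = [math.log(s) for _, s in points]
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

def predict_noise(dw_cm: float, a: float, b: float) -> float:
    """Ascribe a noise value to a patient from their water-equivalent diameter."""
    return a * math.exp(b * dw_cm)

# Hypothetical calibration: (water-equivalent diameter in cm, noise in HU)
a, b = fit_exponential([(16.0, 5.0), (24.0, 8.2), (32.0, 13.4), (40.0, 22.0)])
```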

  18. A patient image-based technique to assess the image quality of clinical chest radiographs

    NASA Astrophysics Data System (ADS)

    Lin, Yuan; Samei, Ehsan; Luo, Hui; Dobbins, James T., III; McAdams, H. Page; Wang, Xiaohui; Sehnert, William J.; Barski, Lori; Foos, David H.

    2011-03-01

    Current clinical image quality assessment techniques mainly analyze image quality for the imaging system in terms of factors such as the capture system DQE and MTF, the exposure technique, and the particular image processing method and processing parameters. However, when assessing a clinical image, radiologists seldom refer to these factors, but rather examine several specific regions of the image to see whether the image is suitable for diagnosis. In this work, we developed a new strategy to learn and simulate radiologists' evaluation process on actual clinical chest images. Based on this strategy, a preliminary study was conducted on 254 digital chest radiographs (38 AP without grids, 35 AP with 6:1 ratio grids and 151 PA with 10:1 ratio grids). First, ten region-based perceptual qualities were summarized through an observer study. Each quality was characterized in terms of a physical quantity measured from the image, and as a first step, the three physical quantities in the lung region were then implemented algorithmically. A pilot observer study was performed to verify the correlation between image perceptual qualities and physical quantitative qualities. The results demonstrated that our region-based metrics have promising performance for grading perceptual properties of chest radiographs.

  19. Impact of contact lens zone geometry and ocular optics on bifocal retinal image quality

    PubMed Central

    Bradley, Arthur; Nam, Jayoung; Xu, Renfeng; Harman, Leslie; Thibos, Larry

    2014-01-01

    Purpose To examine the separate and combined influences of zone geometry, pupil size, diffraction, apodisation and spherical aberration on the optical performance of concentric zonal bifocals. Methods Zonal bifocal pupil functions representing eye + ophthalmic correction were defined by interleaving wavefronts from separate optical zones of the bifocal. A two-zone design (a central circular inner zone surrounded by an annular outer zone which is bounded by the pupil) and a five-zone design (a central small circular zone surrounded by four concentric annuli) were configured with programmable zone geometry, wavefront phase and pupil transmission characteristics. Using computational methods, we examined the effects of diffraction, Stiles Crawford apodisation, pupil size and spherical aberration on optical transfer functions for different target distances. Results Apodisation alters the relative weighting of each zone, and thus the balance of near and distance optical quality. When spherical aberration is included, the effective distance correction, add power and image quality depend on zone geometry and Stiles Crawford Effect apodisation. When the outer zone width is narrow, diffraction limits the available image contrast when focused, but as the pupil dilates and outer zone width increases, aberrations will limit the best achievable image quality. With two-zone designs, balancing near and distance image quality is not achieved with equal-area inner and outer zones. With significant levels of spherical aberration, multi-zone designs effectively become multifocals. Conclusion Wave optics and pupil-varying ocular optics significantly affect the imaging capabilities of different optical zones of concentric bifocals. With two-zone bifocal designs, diffraction, pupil apodisation, spherical aberration, and zone size influence both the effective add power and the pupil size required to balance near and distance image quality. Five-zone bifocal designs achieve a high degree of

  20. Analysis of the Effects of Image Quality on Digital Map Generation from Satellite Images

    NASA Astrophysics Data System (ADS)

    Kim, H.; Kim, D.; Kim, S.; Kim, T.

    2012-07-01

    High resolution satellite images have been widely used to produce and update digital maps since they became widely available. It is well known that the accuracy of a digital map produced from satellite images is determined largely by the accuracy of geometric modelling. However, digital maps are made through a series of photogrammetric processes, so their accuracy is also affected by the quality of the satellite images, such as image interpretability. For satellite images, parameters such as the Modulation Transfer Function (MTF), Signal to Noise Ratio (SNR) and Ground Sampling Distance (GSD) are used to represent image quality. Our previous research stressed that such quality parameters may not represent the quality of image products such as digital maps, and that parameters for image interpretability such as Ground Resolved Distance (GRD) and the National Imagery Interpretability Rating Scale (NIIRS) need to be considered. In this study, we analyzed the effects of image quality on the accuracy of digital maps produced from satellite images. QuickBird, IKONOS and KOMPSAT-2 imagery were used for the analysis as they have similar GSDs. We measured the various image quality parameters mentioned above from these images. Then we produced digital maps from the images using a digital photogrammetric workstation. We analyzed the accuracy of the digital maps in terms of their location accuracy and their level of detail. Then we compared the correlation between the various image quality parameters and the accuracy of the digital maps. The results of this study showed that GRD and NIIRS were more critical for map production than GSD, MTF or SNR.

  1. Perceived quality of wood images influenced by the skewness of image histogram

    NASA Astrophysics Data System (ADS)

    Katsura, Shigehito; Mizokami, Yoko; Yaguchi, Hirohisa

    2015-08-01

    The shape of image luminance histograms is related to material perception. We investigated how the luminance histogram contributed to improvements in the perceived quality of wood images by examining various natural wood and adhesive vinyl sheets with printed wood grain. In the first experiment, we visually evaluated the perceived quality of wood samples. In addition, we measured the colorimetric parameters of the wood samples and calculated statistics of image luminance. The relationship between visual evaluation scores and image statistics suggested that skewness and kurtosis affected the perceived quality of wood. In the second experiment, we evaluated the perceived quality of wood images with altered luminance skewness and kurtosis using a paired comparison method. Our result suggests that wood images are more realistic if the skewness of the luminance histogram is slightly negative.
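    The image statistics central to this record, skewness and kurtosis of the luminance histogram, are standardised moments and are straightforward to compute. A short sketch (standard-moment definitions, not the authors' exact code):

    ```python
    import numpy as np

    def luminance_stats(img):
        """Sample skewness and excess kurtosis of an image's luminance values."""
        x = np.asarray(img, dtype=float).ravel()
        z = (x - x.mean()) / x.std()
        skew = float(np.mean(z**3))
        kurt = float(np.mean(z**4) - 3.0)  # excess kurtosis (0 for a Gaussian)
        return skew, kurt
    ```

    Negative skewness means the histogram has a long dark tail with most pixels toward the bright end, which is the condition the second experiment found makes wood images look more realistic.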

  2. Image Quality Evaluation on ALOS/PRISM and AVNIR-2

    NASA Astrophysics Data System (ADS)

    Mukaida, Akira; Imoto, Naritoshi; Tadono, Takeo; Murakami, Hiroshi; Kawamoto, Sachi

    2008-11-01

    Image quality evaluation of ALOS (Advanced Land Observing Satellite) / PRISM (Panchromatic Remote-sensing Instrument for Stereo Mapping) and AVNIR-2 (Advanced Visible and Near Infrared Radiometer 2) has been carried out during the operational phase. This paper reports the results of the image quality evaluation in terms of MTF (Modulation Transfer Function) and SNR (Signal to Noise Ratio) for both PRISM and AVNIR-2. The SNR of PRISM images has improved following the updating of the radiometric correction and the implementation of a JPEG noise reduction filter. The results were within specification for both sensors.

  3. Digital image quality measurements by objective and subjective methods from series of parametrically degraded images

    NASA Astrophysics Data System (ADS)

    Tachó, Aura; Mitjà, Carles; Martínez, Bea; Escofet, Jaume; Ralló, Miquel

    2013-11-01

    Many digital image applications, such as the digitization of cultural heritage for preservation purposes, operate with compressed files in one or more image observing steps. For this kind of application, JPEG compression is one of the most widely used. Compression level, final file size and quality loss are parameters that must be managed optimally. Although this loss can be monitored by means of objective image quality measurements, the real challenge is to relate it to the image quality perceived by observers. A pictorial image was degraded by two different procedures: first, by applying different levels of low-pass filtering, convolving the image with progressively broader Gaussian kernels; second, by saving the original file at a series of JPEG compression levels. In both cases, objective image quality was measured by analysis of the image power spectrum. To obtain a measure of perceived image quality, both series of degraded images were displayed on a computer screen organized in random pairs, and the observers were asked to choose the better image of each pair. Finally, a ranking was established by applying Thurstone's scaling method. Results obtained by the two measurements are compared with each other and with another objective measurement method, the slanted-edge test.
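    The Thurstone scaling step this abstract mentions converts paired-comparison choices into interval scale values. A compact sketch of the common Case V variant (my own implementation, with a clip to avoid infinite z-scores for unanimous pairs):

    ```python
    import numpy as np
    from statistics import NormalDist

    def thurstone_case_v(wins):
        """Thurstone Case V scale values from a win-count matrix,
        wins[i, j] = number of times stimulus i was preferred over j."""
        wins = np.asarray(wins, dtype=float)
        n = wins + wins.T  # trials per pair
        with np.errstate(invalid="ignore", divide="ignore"):
            p = np.where(n > 0, wins / n, 0.5)
        p = np.clip(p, 0.01, 0.99)       # unanimous pairs would give z = +/-inf
        z = np.vectorize(NormalDist().inv_cdf)(p)
        np.fill_diagonal(z, 0.0)
        scale = z.mean(axis=1)           # Case V: column means of z-matrix
        return scale - scale.min()       # shift so the worst stimulus scores 0
    ```

    Feeding the observers' choices for the Gaussian-blur and JPEG series through this function yields the perceived-quality ranking that is then compared against the power-spectrum measure.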

  4. Total Quality Can Help Your District's Image.

    ERIC Educational Resources Information Center

    Cokeley, Sandra

    1996-01-01

    Describes how educators in the Pearl River School District, Pearl River, New York, have implemented Total Quality Management (TQM) principles to evaluate and improve their effectiveness. Includes two charts that depict key indicators of financial and academic performance and a seven-year profile of the district's budget, enrollment, diploma rate,…

  5. Optimization and image quality assessment of the alpha-image reconstruction algorithm: iterative reconstruction with well-defined image quality metrics

    NASA Astrophysics Data System (ADS)

    Lebedev, Sergej; Sawall, Stefan; Kuchenbecker, Stefan; Faby, Sebastian; Knaup, Michael; Kachelrieß, Marc

    2015-03-01

    The reconstruction of CT images with low noise and highest spatial resolution is a challenging task. Usually, a trade-off between at least these two demands has to be found, or several reconstructions with mutually exclusive properties, i.e. either low noise or high spatial resolution, have to be performed. Iterative reconstruction methods might be suitable tools to overcome these limitations and provide images of highest diagnostic quality with formerly mutually exclusive image properties. While image quality metrics like the modulation transfer function (MTF) or the point spread function (PSF) are well-defined in the case of standard reconstructions, e.g. filtered backprojection, the iterative algorithms lack these metrics. To overcome this issue, alternate methodologies like model observers have been proposed recently to allow a quantification of a usually task-dependent image quality metric [1]. As an alternative, we recently proposed an iterative reconstruction method, the alpha-image reconstruction (AIR), providing well-defined image quality metrics on a per-voxel basis [2]. In particular, the AIR algorithm seeks to find weighting images, the alpha-images, that are used to blend between basis images with mutually exclusive image properties. The result is an image of highest diagnostic quality that provides high spatial resolution and a low noise level. As the estimation of the alpha-images is computationally demanding, we herein aim at optimizing this process and highlight the favorable properties of AIR using patient measurements.
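    The blending step at the heart of AIR, once the alpha-images exist, is a per-voxel convex combination of the two basis reconstructions. A sketch of that final step only; the paper's actual contribution, the iterative *estimation* of the alpha-images, is not shown here:

    ```python
    import numpy as np

    def alpha_blend(low_noise_img, high_res_img, alpha):
        """Per-voxel blend of two basis reconstructions with mutually
        exclusive properties. alpha == 1 keeps the high-resolution basis
        image at a voxel; alpha == 0 keeps the low-noise basis image."""
        alpha = np.clip(alpha, 0.0, 1.0)
        return alpha * high_res_img + (1.0 - alpha) * low_noise_img
    ```

    In practice `alpha` would itself be an image, large near edges (where resolution matters) and small in flat regions (where noise dominates).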

  6. Study of a water quality imager for coastal zone missions

    NASA Technical Reports Server (NTRS)

    Staylor, W. F.; Harrison, E. F.; Wessel, V. W.

    1975-01-01

    The present work surveys water quality user requirements and then determines the general characteristics of an orbiting imager (the Applications Explorer, or AE) dedicated to the measurement of water quality, which could be used as a low-cost means of testing advanced imager concepts and assessing the ability of imager techniques to meet the goals of a comprehensive water quality monitoring program. The proposed imager has four spectral bands, a spatial resolution of 25 meters, and swath width of 36 km with a pointing capability of 330 km. Silicon photodetector arrays, pointing systems, and several optical features are included. A nominal orbit of 500 km altitude at an inclination of 50 deg is recommended.

  7. Quality evaluation of extra high quality images based on key assessment word

    NASA Astrophysics Data System (ADS)

    Kameda, Masashi; Hayashi, Hidehiko; Akamatsu, Shigeru; Miyahara, Makoto M.

    2001-06-01

    An all-encompassing goal of our research is to develop an extra high quality imaging system which is able to convey a high-level artistic impression faithfully. We have defined a high order sensation as such a high-level artistic impression, and it is supposed that the high order sensation is expressed by the combination of psychological factors which can be described by plural assessment words. In order to pursue the quality factors that are important for the reproduction of the high order sensation, we have focused on the image quality evaluation of extra high quality images using assessment words considering the high order sensation. In this paper, we have obtained the hierarchical structure between the collected assessment words and the principles of European painting based on the conveyance model of the high order sensation, and we have determined a key assessment word, 'plasticity', which is able to evaluate the reproduction of the high order sensation more accurately. The results of the subjective assessment experiments using the prototype of the developed extra high quality imaging system have shown that the obtained key assessment word 'plasticity' is the most appropriate assessment word for evaluating the image quality of extra high quality images quasi-quantitatively.

  8. Imaging quality assessment of multi-modal miniature microscope.

    PubMed

    Lee, Junwon; Rogers, Jeremy; Descour, Michael; Hsu, Elizabeth; Aaron, Jesse; Sokolov, Konstantin; Richards-Kortum, Rebecca

    2003-06-16

    We are developing a multi-modal miniature microscope (4M device) to image morphology and cytochemistry in vivo and provide better delineation of tumors. The 4M device is designed to be a complete microscope on a chip, including optical, micro-mechanical, and electronic components. It has advantages such as compact size and capability for microscopic-scale imaging. This paper presents an optics-only prototype 4M device, the very first imaging system made of sol-gel material. The microoptics used in the 4M device have a diameter of 1.3 mm. Metrology for the imaging quality assessment of the prototype device is presented, and causes of imaging performance degradation are described in order to improve the fabrication process. We built a multi-modal imaging test-bed to measure first-order properties and to assess the imaging quality of the 4M device. The 4M prototype has a field of view of 290 microm in diameter, a magnification of -3.9, a working distance of 250 microm and a depth of field of 29.6+/-6 microm. We report the modulation transfer function (MTF) of the 4M device as a quantitative metric of imaging quality. Based on the MTF data, we calculated a Strehl ratio of 0.59. To investigate the cause of imaging quality degradation, the surface characterization of the lenses in 4M devices was performed and is reported. We also imaged both polystyrene microspheres similar in size to epithelial cell nuclei and cervical cancer cells. Imaging results indicate that the 4M prototype can resolve the cellular detail necessary for detection of precancer. PMID:19466016
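    One common way to derive a Strehl ratio from MTF data, as this record does, is to take the ratio of the area (volume, in 2-D) under the measured MTF to that under the diffraction-limited MTF. A hedged sketch of that estimate for a 1-D radial MTF profile; whether this is exactly how the paper's 0.59 figure was computed is an assumption:

    ```python
    import numpy as np

    def _trapezoid(y, x):
        """Trapezoidal integration (kept explicit for portability)."""
        y, x = np.asarray(y, float), np.asarray(x, float)
        return float(np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(x)))

    def strehl_from_mtf(mtf_measured, mtf_ideal, freqs):
        """Approximate Strehl ratio as area under the measured MTF divided
        by area under the diffraction-limited MTF over the same frequencies."""
        return _trapezoid(mtf_measured, freqs) / _trapezoid(mtf_ideal, freqs)
    ```

    A measured MTF uniformly at 59% of the diffraction limit would thus report a Strehl ratio of 0.59.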

  9. The effect of image quality and forensic expertise in facial image comparisons.

    PubMed

    Norell, Kristin; Läthén, Klas Brorsson; Bergström, Peter; Rice, Allyson; Natu, Vaidehi; O'Toole, Alice

    2015-03-01

    Images of perpetrators in surveillance video footage are often used as evidence in court. In this study, identification accuracy in facial image comparisons was compared between forensic experts and untrained persons, along with the impact of image quality. Participants viewed thirty image pairs and were asked to rate the level of support garnered from their observations for concluding whether or not the two images showed the same person. Forensic experts reached their conclusions with significantly fewer errors than did untrained participants. They were also better than novices at determining when two high-quality images depicted the same person. Notably, lower image quality led to more careful conclusions by experts, but not by untrained participants. In summary, the untrained participants produced more false negatives and false positives than the experts; the false positives in particular imply a higher risk of an innocent person being convicted when identification relies on an untrained witness. PMID:25537273

  10. Improving high resolution retinal image quality using speckle illumination HiLo imaging

    PubMed Central

    Zhou, Xiaolin; Bedggood, Phillip; Metha, Andrew

    2014-01-01

    Retinal image quality from flood illumination adaptive optics (AO) ophthalmoscopes is adversely affected by out-of-focus light scatter due to the lack of confocality. This effect is more pronounced in small eyes, such as that of rodents, because the requisite high optical power confers a large dioptric thickness to the retina. A recently-developed structured illumination microscopy (SIM) technique called HiLo imaging has been shown to reduce the effect of out-of-focus light scatter in flood illumination microscopes and produce pseudo-confocal images with significantly improved image quality. In this work, we adopted the HiLo technique to a flood AO ophthalmoscope and performed AO imaging in both (physical) model and live rat eyes. The improvement in image quality from HiLo imaging is shown both qualitatively and quantitatively by using spatial spectral analysis. PMID:25136486
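    The HiLo idea this record applies, keeping low spatial frequencies from the optically sectioned (speckle-derived) image and high frequencies from the uniform-illumination image, can be sketched as a Fourier-domain merge. This is a simplified illustration of the frequency-splitting step only (the full HiLo algorithm also demodulates the speckle image to obtain its sectioned low-pass component); the cut-off `sigma` is an arbitrary assumption:

    ```python
    import numpy as np

    def hilo_merge(uniform_img, sectioned_img, sigma=8.0):
        """Merge low frequencies of a sectioned image with high frequencies
        of a uniform-illumination image, split by a Gaussian filter."""
        def gaussian_lowpass(img):
            f = np.fft.fft2(img)
            h, w = img.shape
            fy = np.fft.fftfreq(h)[:, None]
            fx = np.fft.fftfreq(w)[None, :]
            g = np.exp(-(fx**2 + fy**2) * (2 * np.pi**2 * sigma**2))
            return np.real(np.fft.ifft2(f * g))

        lo = gaussian_lowpass(sectioned_img)            # sectioned low frequencies
        hi = uniform_img - gaussian_lowpass(uniform_img)  # sharp high frequencies
        return lo + hi
    ```

    The out-of-focus background haze lives mostly in the low frequencies, so replacing them with the sectioned estimate is what produces the pseudo-confocal result.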

  11. Image science and image-quality research in the Optical Sciences Center

    NASA Astrophysics Data System (ADS)

    Barrett, Harrison H.; Myers, Kyle J.

    2014-09-01

    This paper reviews the history of research into imaging and image quality at the Optical Sciences Center (OSC), with emphasis on the period 1970-1990. The work of various students in the areas of psychophysical studies of human observers of images; mathematical model observers; image simulation and analysis, and the application of these methods to radiology and nuclear medicine is summarized. The rapid progress in computational power, at OSC and elsewhere, which enabled the steady advances in imaging and the emergence of a science of imaging, is also traced. The implications of these advances to ongoing research and the current Image Science curriculum at the College of Optical Sciences are discussed.

  12. APQ-102 imaging radar digital image quality study

    NASA Astrophysics Data System (ADS)

    Griffin, C. R.; Estes, J. M.

    1982-11-01

    A modified APQ-102 sidelooking radar collected synthetic aperture radar (SAR) data which was digitized and recorded on wideband magnetic tape. These tapes were then ground processed into computer compatible tapes (CCT's). The CCT's may then be processed into high resolution radar images by software on the CYBER computer.

  13. APQ-102 imaging radar digital image quality study

    NASA Technical Reports Server (NTRS)

    Griffin, C. R.; Estes, J. M.

    1982-01-01

    A modified APQ-102 sidelooking radar collected synthetic aperture radar (SAR) data which was digitized and recorded on wideband magnetic tape. These tapes were then ground processed into computer compatible tapes (CCT's). The CCT's may then be processed into high resolution radar images by software on the CYBER computer.

  14. Illumination strategies to achieve effective indoor millimeter wave imaging for personnel screening applications

    NASA Astrophysics Data System (ADS)

    Doyle, Rory; Lyons, Brendan; Lettington, Alan; McEnroe, Tony; Walshe, John; McNaboe, John; Curtin, Peter

    2005-05-01

    The ability of millimetre waves (mm-waves) to penetrate obscurants such as clothing and fog enables unique imaging applications in areas such as security screening of personnel and landing aids for aircraft. In outdoor applications, the natural thermal contrast provided by cold-sky reflections off objects allows direct imaging of a scene. Imaging at mm-wave frequencies indoors requires that a thermal contrast be generated in order to illuminate and detect objects of interest. In a portal screening application, the illumination needs to be provided over the imaged area in a uniform, omni-directional manner and at a sufficient level of contrast to achieve the desired signal-to-noise ratio at the sensor. The primary options are to generate this contrast using active noise sources or to develop a passive, thermally induced source of mm-wave energy. This paper describes the approaches taken to developing and implementing an indoor imaging configuration for a mm-wave camera to be used in people-screening applications. The camera uses a patented mechanical scanning method to directly generate a raster frame image of portal dimensions. Imaging has been conducted at a range of frequencies, with the main focus on 94 GHz operation. Experiences with both active and passive illumination schemes are described, with conclusions on the merits or otherwise of each. The results of imaging trials demonstrate the potential for using mm-wave imaging indoors, and example images illustrate the capability of the camera and the illumination methods when used for personnel screening.

  15. Imaging without lenses: achievements and remaining challenges of wide-field on-chip microscopy

    PubMed Central

    Greenbaum, Alon; Luo, Wei; Su, Ting-Wei; Göröcs, Zoltán; Xue, Liang; Isikman, Serhan O; Coskun, Ahmet F; Mudanyali, Onur; Ozcan, Aydogan

    2012-01-01

    We discuss unique features of lens-free computational imaging tools and report some of their emerging results for wide-field on-chip microscopy, such as the achievement of a numerical aperture (NA) of ~0.8–0.9 across a field of view (FOV) of more than 20 mm2 or an NA of ~0.1 across a FOV of ~18 cm2, which corresponds to an image with more than 1.5 gigapixels. We also discuss the current challenges that these computational on-chip microscopes face, shedding light on their future directions and applications. PMID:22936170

  16. A clinical comparison of image quality and patient exposure reduction in panoramic radiography with heavy metal filtration

    SciTech Connect

    Kapa, S.F.; Tyndall, D.A.

    1989-06-01

    Laboratory and clinical studies with the use of rare earth intensifying screens and four different forms of heavy metal elements serving as additional beam filtration were performed for panoramic radiography to identify the most efficacious system. Balanced density images were evaluated for contrast indices, resolution, relative dose reduction, and subjective image quality. Clinical studies were performed with a standard calcium tungstate imaging system and the four most promising experimental imaging systems that showed improvement over the standard system. Dosimetric studies were performed with the use of ionization chambers and thermoluminescent dosimetry (TLD) dosimeters. Exposure reductions of 34% to 79%, depending on the anatomic site and the imaging system used, were achieved. Subjective image quality was evaluated and analyzed statistically. This study concluded that the use of a Kodak Lanex regular screen/T-Mat G film with either Lanex screen or yttrium added beam filtration results in reduced patient exposure in panoramic radiography while image quality is maintained or improved.

  17. A clinical comparison of image quality and patient exposure reduction in panoramic radiography with heavy metal filtration.

    PubMed

    Kapa, S F; Tyndall, D A

    1989-06-01

    Laboratory and clinical studies with the use of rare earth intensifying screens and four different forms of heavy metal elements serving as additional beam filtration were performed for panoramic radiography to identify the most efficacious system. Balanced density images were evaluated for contrast indices, resolution, relative dose reduction, and subjective image quality. Clinical studies were performed with a standard calcium tungstate imaging system and the four most promising experimental imaging systems that showed improvement over the standard system. Dosimetric studies were performed with the use of ionization chambers and thermoluminescent dosimetry (TLD) dosimeters. Exposure reductions of 34% to 79%, depending on the anatomic site and the imaging system used, were achieved. Subjective image quality was evaluated and analyzed statistically. This study concluded that the use of a Kodak Lanex regular screen/T-Mat G film with either Lanex screen or yttrium added beam filtration results in reduced patient exposure in panoramic radiography while image quality is maintained or improved. PMID:2740096

  18. Determination of pork quality attributes using hyperspectral imaging technique

    NASA Astrophysics Data System (ADS)

    Qiao, Jun; Wang, Ning; Ngadi, M. O.; Gunenc, Aynur

    2005-11-01

    Meat grading has long been a research topic because of the large variations among meat products. Many subjective assessment methods with poor repeatability and tedious procedures are still widely used in the meat industry. In this study, a hyperspectral-imaging-based technique was developed to achieve fast, accurate, and objective determination of pork quality attributes. The system was able to extract spectral and spatial characteristics for the simultaneous determination of drip loss and pH in pork meat. Two sets of six significant feature wavelengths were selected for predicting drip loss (590, 645, 721, 752, 803 and 850 nm) and pH (430, 448, 470, 890, 980 and 999 nm). Two feed-forward neural network models were developed. The results showed that the correlation coefficients (r) between predicted and actual values were 0.71 for drip loss and 0.58 for pH with Model 1, and 0.80 for drip loss and 0.67 for pH with Model 2. The color levels of the meat samples were also mapped successfully based on a digitized Meat Color Standard.

  19. Validation of no-reference image quality index for the assessment of digital mammographic images

    NASA Astrophysics Data System (ADS)

    de Oliveira, Helder C. R.; Barufaldi, Bruno; Borges, Lucas R.; Gabarda, Salvador; Bakic, Predrag R.; Maidment, Andrew D. A.; Schiabel, Homero; Vieira, Marcelo A. C.

    2016-03-01

    To ensure optimal clinical performance of digital mammography, it is necessary to obtain images with high spatial resolution and low noise while keeping radiation exposure as low as possible. These requirements directly affect the interpretation of radiologists. The quality of a digital image should be assessed using objective measurements. In general, these methods measure the similarity between a degraded image and an ideal image without degradation (ground truth), used as a reference. These methods are called Full-Reference Image Quality Assessment (FR-IQA). However, for digital mammography an image without degradation is not available in clinical practice; thus, an objective method to assess the quality of mammograms must work without a reference. The purpose of this study is to present a Normalized Anisotropic Quality Index (NAQI), based on the Rényi entropy in the pseudo-Wigner domain, to assess mammography images in terms of spatial resolution and noise without any reference. The method was validated using synthetic images acquired through an anthropomorphic breast software phantom, as well as clinical exposures of anthropomorphic breast physical phantoms and patients' mammograms. The results reported by this no-reference index follow the same behavior as other well-established full-reference metrics, e.g., the peak signal-to-noise ratio (PSNR) and the structural similarity index (SSIM). Reductions of 50% in the radiation dose in phantom images translated into a decrease of 4 dB in PSNR, 25% in SSIM and 33% in NAQI, evidencing that the proposed metric is sensitive to the noise resulting from dose reduction. The clinical results showed that images reduced to 53% and 30% of the standard radiation dose showed reductions of 15% and 25% in NAQI, respectively. Thus, this index may be used in clinical practice as an image quality indicator to improve quality assurance programs in mammography; hence, the proposed method reduces the subjectivity
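    The PSNR figure the validation leans on has a standard definition that is worth making concrete:

    ```python
    import numpy as np

    def psnr(reference, test, data_range=255.0):
        """Peak signal-to-noise ratio in dB between a reference image and a
        degraded test image (standard full-reference definition)."""
        ref = np.asarray(reference, dtype=float)
        tst = np.asarray(test, dtype=float)
        mse = np.mean((ref - tst) ** 2)
        return float(10.0 * np.log10(data_range**2 / mse))
    ```

    As a rough sanity check on the reported numbers: halving the dose approximately doubles the quantum-noise variance, and a doubled MSE lowers PSNR by 10·log10(2) ≈ 3 dB, the same order as the 4 dB drop observed for the 50% dose reduction.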

  20. Sentinel-2 radiometric image quality commissioning: first results

    NASA Astrophysics Data System (ADS)

    Lachérade, S.; Lonjou, V.; Farges, M.; Gamet, P.; Marcq, S.; Raynaud, J.-L.; Trémas, T.

    2015-10-01

    In partnership with the European Commission and in the frame of the Copernicus program, the European Space Agency (ESA) is developing the Sentinel-2 optical imaging mission devoted to the operational monitoring of land and coastal areas. The Sentinel-2 mission is based on a constellation of satellites deployed in polar sun-synchronous orbit. Sentinel-2 offers a unique combination of global coverage with a wide field of view (290 km), a high revisit (5 days with two satellites), high spatial resolution (10 m, 20 m and 60 m) and multi-spectral imagery (13 spectral bands in the visible and shortwave infrared domains). The first satellite, Sentinel-2A, was launched in June 2015. The Sentinel-2A commissioning phase started immediately after the Launch and Early Orbit Phase and continues until the In-Orbit Commissioning Review, which is planned three months after launch. The Centre National d'Etudes Spatiales (CNES) supports ESA/ESTEC in carrying out the calibration/validation commissioning phase during the first three months in flight. This paper first provides an overview of the Sentinel-2 system and a description of the products delivered by the ground segment, together with the main radiometric specifications to be achieved. The paper then focuses on the preliminary radiometric results obtained during the in-flight commissioning phase. The radiometric methods and calibration sites used by the CNES image quality centre to reach the specifications of the sensor are described, and a status of the Sentinel-2A radiometric performance at the end of the first three months after launch is presented. We particularly address the results in terms of absolute calibration, pixel-to-pixel relative sensitivity and MTF estimation.

  1. Influence of acquisition parameters on MV-CBCT image quality.

    PubMed

    Gayou, Olivier

    2012-01-01

    The production of high-quality pretreatment images plays an increasing role in image-guided radiotherapy (IGRT) and adaptive radiation therapy (ART). Megavoltage cone-beam computed tomography (MV-CBCT) is the simplest of all the commercially available volumetric imaging systems for localization. It also suffers the most from relatively poor contrast, due to the energy range of the imaging photons. Several avenues can be investigated to improve MV-CBCT image quality while maintaining an acceptable patient exposure: beam generation, detector technology, reconstruction parameters, and acquisition parameters. This article presents a study of the effects of the acquisition scan length and number of projections of a Siemens Artiste MV-CBCT system on image quality, within the range provided by the manufacturer. It also discusses aspects not related to image quality that one should consider when selecting an acquisition protocol. Noise and uniformity were measured on the image of a cylindrical water phantom. Spatial resolution was measured using the same phantom half filled with water, providing a sharp water/air interface from which to derive the modulation transfer function (MTF). Contrast-to-noise ratio (CNR) was measured on a pelvis-shaped phantom with four inserts of different electron densities relative to water (1.043, 1.117, 1.513, and 0.459). Uniformity was independent of acquisition protocol. Noise decreased from 1.96% to 1.64% when the total number of projections was increased from 100 to 600 for a total exposure of 13.5 MU. The CNR showed a ±5% dependence on the number of projections and a 10% dependence on the scan length; however, these variations were not statistically significant. The spatial resolution was unaffected by the arc length or the sampling rate. Acquisition parameters have little to no effect on the image quality of the MV-CBCT system within the range of parameters available on the system. Considerations other than image quality, such as memory
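    The CNR measurement reported for the insert phantom follows one common definition: the absolute mean-signal difference between an insert region of interest and the background, divided by the background noise. A minimal sketch (one standard CNR formulation; the paper does not state which exact variant it used):

    ```python
    import numpy as np

    def cnr(roi_insert, roi_background):
        """Contrast-to-noise ratio: |mean(insert) - mean(background)|
        divided by the standard deviation of the background ROI."""
        roi_insert = np.asarray(roi_insert, dtype=float)
        roi_background = np.asarray(roi_background, dtype=float)
        return float(abs(roi_insert.mean() - roi_background.mean())
                     / roi_background.std())
    ```

    Applied to each of the four electron-density inserts against a water-equivalent background, this yields the per-insert CNR values whose ±5% and 10% dependences are quoted above.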

  2. Can we ID from CCTV? Image quality in digital CCTV and face identification performance

    NASA Astrophysics Data System (ADS)

    Keval, Hina U.; Sasse, M. Angela

    2008-04-01

    CCTV is used for an increasing number of purposes, and the new generation of digital systems can be tailored to serve a wide range of security requirements. However, configuration decisions are often made without considering specific task requirements, e.g. the video quality needed for reliable person identification. Our study investigated the relationship between video quality and the ability of untrained viewers to identify faces from digital CCTV images. The task required 80 participants to identify 64 faces belonging to 4 different ethnicities. Participants compared face images taken from high-quality photographs and low-quality CCTV stills, which were recorded at 4 different video quality bit rates (32, 52, 72 and 92 Kbps). We found that the number of correct identifications decreased by 12 (~18%) as MPEG-4 quality decreased from 92 to 32 Kbps, and by 4 (~6%) as Wavelet video quality decreased from 92 to 32 Kbps. To achieve reliable and effective face identification, we recommend that MPEG-4 CCTV systems be used in preference to Wavelet, and that video quality not be lowered below 52 Kbps during video compression. We discuss the practical implications of these results for security, and contribute a contextual methodology for assessing CCTV video quality.

  3. Body image and quality of life in a Spanish population

    PubMed Central

    Lobera, Ignacio Jáuregui; Ríos, Patricia Bolaños

    2011-01-01

    Purpose The aim of the current study was to analyze the psychometric properties, factor structure, and internal consistency of the Spanish version of the Body Image Quality of Life Inventory (BIQLI-SP) as well as its test–retest reliability. Further objectives were to analyze different relationships with key dimensions of psychosocial functioning (ie, self-esteem, presence of psychopathological symptoms, eating and body image-related problems, and perceived stress) and to evaluate differences in body image quality of life due to gender. Patients and methods The sample comprised 417 students without any psychiatric history, recruited from the Pablo de Olavide University and the University of Seville. There were 140 men (33.57%) and 277 women (66.43%), and the mean age was 21.62 years (standard deviation = 5.12). After obtaining informed consent from all participants, the following questionnaires were administered: BIQLI, Eating Disorder Inventory-2 (EDI-2), Perceived Stress Questionnaire (PSQ), Self-Esteem Scale (SES), and Symptom Checklist-90-Revised (SCL-90-R). Results The BIQLI-SP shows adequate psychometric properties, and it may be useful to determine the body image quality of life in different physical conditions. A more positive body image quality of life is associated with better self-esteem, better psychological wellbeing, and fewer eating-related dysfunctional attitudes, this being more evident among women. Conclusion The BIQLI-SP may be useful to determine the body image quality of life in different contexts with regard to dermatology, cosmetic and reconstructive surgery, and endocrinology, among others. In these fields of study, a new trend has emerged to assess body image-related quality of life. PMID:21403794

  4. Analysis of image quality for laser display scanner test

    NASA Astrophysics Data System (ADS)

    Specht, H.; Kurth, S.; Billep, D.; Gessner, T.

    2009-02-01

    The scanning laser display technology is one of the most promising technologies for highly integrated projection display applications (e.g. in PDAs, mobile phones or head-mounted displays) due to its advantages regarding image quality, miniaturization level and low-cost potential. As several research teams have found during their investigations of laser scanning projection systems, the image quality of such systems is, apart from the laser source and video signal processing, crucially determined by the scan engine, including the MEMS scanner, driving electronics, scanning regime and synchronization. Even though a number of technical parameters can be measured with high accuracy, the test procedure is challenging because the influence of these parameters on image quality is often insufficiently understood. Thus, in many cases it is not clear how to define limiting values for characteristic parameters. In this paper the relationship between parameters characterizing the scan engine and their influence on image quality is discussed. These include scanner topography, the geometry of the path of light, and trajectory parameters. Understanding this enables a new methodology for testing and characterizing the scan engine, based on the evaluation of one or a series of projected test images. Because the evaluation process can easily be automated by digital image processing, this methodology has the potential to become integrated into the production process of laser displays.

  5. Teacher student relationship quality type in elementary grades: Effects on trajectories for achievement and engagement

    PubMed Central

    Wu, Jiun-yu; Hughes, Jan N.; Kwok, Oi-man

    2010-01-01

    Teacher, peer, and student reports of the quality of the teacher-student relationship were obtained for an ethnically diverse and academically at-risk sample of 706 second- and third-grade students. Cluster analysis identified four types of relationships based on the consistency of child reports of support and conflict in the relationship with the reports of others: Congruent Positive, Congruent Negative, Incongruent Child Negative, and Incongruent Child Positive. The cluster solution evidenced good internal consistency and construct validity. Cluster membership predicted growth trajectories for teacher-rated engagement and standardized achievement scores over the following three years, above and beyond prior performance. The predictive associations between child reports of teacher support and conflict and outcomes depended on whether child reports were consistent or inconsistent with the reports of others. Study findings have implications for theory development, assessment of teacher-student relationships, and teacher professional development. PMID:20728688

  6. Taking advantage of ground data systems attributes to achieve quality results in testing software

    NASA Technical Reports Server (NTRS)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. In practice, the goal of zero defects is rarely met, though it is approached to varying degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  7. Conditions required for high quality high magnification images in secondary electron-I scanning electron microscopy.

    PubMed

    Peters, K R

    1982-01-01

    High-quality secondary electron (SE) images, taken at useful magnifications of 100,000 to 200,000, require new signal generation and collection methods and new metal coating procedures. High quality is defined as the condition under which image contrast describes accurately the topographic features of the specimen in a size range that approximates the beam diameter. Such high resolution contrasts are produced by the SE (SE-I) generated by a small electron probe on the specimen surface. Tobacco mosaic virus and ferritin molecules deposited on bulk substrates were introduced as test specimens to check the image quality obtained. The SE-I signal contrast could be imaged when SE (SE-III), produced by backscattered electrons (BSE) at the pole piece of the final lens, were eliminated with an electron absorption device attached to the pole piece. This signal collection procedure will be referred to as "Secondary Electron-I Image" (SE-I image) mode. In addition to the SE-III, BSE generate SE-II in the specimen itself. On specimens deposited on bulk gold or platinum, and coated with the same metals, SE-II produced a microroughness contrast that limited particle resolution in the SE-I image mode to approximately 10 nm. Reduction of SE-II and enrichment of the signal in SE-I was achieved by using continuous fine crystalline coatings of tantalum, niobium and chromium. By applying these metals in films of approximately 2.0 nm thickness, the SE-I contrast generation was found to be independent of the atomic number of the metal. Edge sharpness was improved when the specimens were coated with low atomic number metals. Under these conditions, the quality of images obtained in SE-I image mode equals that of images obtained in TEM from identically coated specimens and was limited only by the size of the topographic details, beam diameter and beam current. PMID:7184136

  8. Image quality assessment with manifold and machine learning

    NASA Astrophysics Data System (ADS)

    Charrier, Christophe; Lebrun, Gilles; Lezoray, Olivier

    2009-01-01

    A crucial step in image compression is the evaluation of its performance, and more precisely the available ways to measure the final quality of the compressed image. In this paper, a machine learning expert providing a final class number is designed. The quality measure is based on a learned classification process designed to match that of human observers. Instead of computing a final score, our method classifies the quality using the quality scale recommended by the ITU. This quality scale contains 5 ranks ordered from 1 (the worst quality) to 5 (the best quality). This was done by constructing a vector containing many visual attributes; the final feature vector contains more than 40 attributes. Unfortunately, no study of the interactions between the visual attributes used has been done. A feature selection algorithm could be interesting, but the selection is highly dependent on the classifier used afterwards. Therefore, we prefer to perform dimensionality reduction instead of feature selection. Manifold learning methods are used to provide a new low-dimensional representation from the initial high-dimensional feature space. The classification process is performed on this new low-dimensional representation of the images. The results obtained are compared to those obtained without applying the dimension reduction process, to judge the efficiency of the method.
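
    The reduce-then-classify pipeline described above can be sketched as follows. Everything here is an illustrative assumption, not the authors' setup: the data are synthetic, Isomap stands in for the unspecified manifold learning method, and an SVM stands in for the learned classifier.

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in: 500 images, each described by 40 visual attributes,
# labelled on the 5-rank ITU quality scale (labels derived from a
# hypothetical hidden "quality" driver).
X = rng.normal(size=(500, 40))
score = X[:, :5].sum(axis=1)
y = np.digitize(score, [-2.0, -0.7, 0.7, 2.0]) + 1  # ranks 1 (worst) .. 5 (best)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Reduce the 40-D attribute space to a low-dimensional manifold, then
# classify quality ranks on that representation.
model = make_pipeline(Isomap(n_neighbors=10, n_components=5), SVC())
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
```

    Any sklearn manifold learner with a `transform` method could replace Isomap in the pipeline without changing the rest of the code.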

  9. Faster, higher quality volume visualization for 3D medical imaging

    NASA Astrophysics Data System (ADS)

    Kalvin, Alan D.; Laine, Andrew F.; Song, Ting

    2008-03-01

    The two major volume visualization methods used in biomedical applications are Maximum Intensity Projection (MIP) and Volume Rendering (VR), both of which involve the process of creating sets of 2D projections from 3D images. We have developed a new method for very fast, high-quality volume visualization of 3D biomedical images, based on the fact that the inverse of this process (transforming 2D projections into a 3D image) is essentially equivalent to tomographic image reconstruction. This new method uses the 2D projections acquired by the scanner, thereby obviating the need for the two computationally expensive steps currently required in the complete process of biomedical visualization, that is, (i) reconstructing the 3D image from 2D projection data, and (ii) computing the set of 2D projections from the reconstructed 3D image. As well as improvements in computation speed, this method also results in improvements in visualization quality, and in the case of x-ray CT we can exploit this quality improvement to reduce radiation dosage. In this paper, we demonstrate the benefits of developing biomedical visualization techniques by directly processing the sensor data acquired by body scanners, rather than by processing the image data reconstructed from the sensor data. We show results of using this approach for volume visualization for tomographic modalities such as x-ray CT, as well as for MRI.

  10. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates components with sparse loadings, used in conjunction with Hotelling's T2 statistic to compare, qualify, and detect faults in the tested systems.
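
    The SPCA-plus-Hotelling-T2 idea can be sketched in a few lines. This is a generic illustration, not the authors' modified SPCA: the data are random, the component count and alpha are arbitrary, and the fault threshold (99th percentile of T2) is a hypothetical choice.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(1)
# Synthetic stand-in: 200 scans x 88 image-quality metrics (the metric
# count mirrors the abstract; the values are random, not measurements).
X = rng.normal(size=(200, 88))
X -= X.mean(axis=0)

# Sparse loadings make each component depend on only a few metrics.
spca = SparsePCA(n_components=3, alpha=0.1, random_state=0)
scores = spca.fit_transform(X)

# Hotelling's T^2 in the reduced space flags anomalous scans.
var = scores.var(axis=0, ddof=1) + 1e-12   # guard against zero variance
t2 = ((scores ** 2) / var).sum(axis=1)
suspect = np.flatnonzero(t2 > np.percentile(t2, 99))
```

    In a real deployment the threshold would come from the T2 distribution of systems known to be in control, not from the tested batch itself.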

  11. TH-C-18A-06: Combined CT Image Quality and Radiation Dose Monitoring Program Based On Patient Data to Assess Consistency of Clinical Imaging Across Scanner Models

    SciTech Connect

    Christianson, O; Winslow, J; Samei, E

    2014-06-15

    Purpose: One of the principal challenges of clinical imaging is to achieve an ideal balance between image quality and radiation dose across multiple CT models. The number of scanners and protocols at large medical centers necessitates an automated quality assurance program to facilitate this objective. Therefore, the goal of this work was to implement an automated CT image quality and radiation dose monitoring program based on actual patient data and to use this program to assess consistency of protocols across CT scanner models. Methods: Patient CT scans are routed to a HIPAA compliant quality assurance server. CTDI, extracted using optical character recognition, and patient size, measured from the localizers, are used to calculate SSDE. A previously validated noise measurement algorithm determines the noise in uniform areas of the image across the scanned anatomy to generate a global noise level (GNL). Using this program, 2358 abdominopelvic scans acquired on three commercial CT scanners were analyzed. Median SSDE and GNL were compared across scanner models, and trends in SSDE and GNL with patient size were used to determine the impact of differing automatic exposure control (AEC) algorithms. Results: There was a significant difference in both SSDE and GNL across scanner models (9–33% and 15–35% for SSDE and GNL, respectively). Adjusting all protocols to achieve the same image noise would reduce patient dose by 27–45% depending on scanner model. Additionally, differences in AEC methodologies across vendors resulted in disparate relationships of SSDE and GNL with patient size. Conclusion: The difference in noise across scanner models indicates that protocols are not optimally matched to achieve consistent image quality. Our results indicated substantial possibility for dose reduction while achieving more consistent image appearance. Finally, the difference in AEC methodologies suggests the need for size-specific CT protocols to minimize variability in image
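
    A global noise level of the kind described can be sketched as the mode of the local-standard-deviation histogram over the image. This is only a sketch of the idea under stated assumptions (a flat synthetic phantom, a 7x7 window); the authors' previously validated algorithm may differ in detail, e.g. in how uniform anatomy is masked.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def global_noise_level(img, win=7, nbins=200):
    """Mode of the local-standard-deviation histogram: uniform regions
    dominate the histogram peak, so the mode estimates the noise level."""
    img = img.astype(float)
    m = uniform_filter(img, win)          # local mean
    m2 = uniform_filter(img * img, win)   # local mean of squares
    local_std = np.sqrt(np.clip(m2 - m * m, 0.0, None))
    hist, edges = np.histogram(local_std, bins=nbins)
    k = hist.argmax()
    return 0.5 * (edges[k] + edges[k + 1])

rng = np.random.default_rng(2)
# A uniform phantom slice with sigma=10 noise stands in for patient data.
phantom = 40.0 + rng.normal(0.0, 10.0, (256, 256))
gnl = global_noise_level(phantom)
```

    On this flat phantom the estimate lands near the injected sigma of 10; on real anatomy the histogram peak would still be driven by the uniform soft-tissue regions.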

  12. Quality assurance of ultrasound imaging instruments by monitoring the monitor.

    PubMed

    Walker, J B; Thorne, G C; Halliwell, M

    1993-11-01

    Ultrasound quality assurance (QA) is a means of assuring the constant performance of an ultrasound instrument. A novel 'ultrasound image analyser' has been developed to allow objective, accurate and repeatable measurement of the image displayed on the ultrasound screen, i.e. as seen by the operator. The analyser uses a television camera/framestore combination to digitize and analyse this image. A QA scheme is described along with the procedures necessary to obtain a repeatable measurement of the image so that comparisons with earlier good images can be made. These include repositioning the camera and resetting the video display characteristics. The advantages of using the analyser over other methods are discussed. It is concluded that the analyser has distinct advantages over subjective image assessment methods and will be a valuable addition to current ultrasound QA programmes. PMID:8272435

  13. Investigation of perceptual attributes for mobile display image quality

    NASA Astrophysics Data System (ADS)

    Gong, Rui; Xu, Haisong; Wang, Qing; Wang, Zhehong; Li, Haifeng

    2013-08-01

    Large-scale psychophysical experiments are carried out on two types of mobile displays to evaluate the perceived image quality (IQ). Eight perceptual attributes, i.e., naturalness, colorfulness, brightness, contrast, sharpness, clearness, preference, and overall IQ, are visually assessed via the categorical judgment method for various application types of test images, which were manipulated by different methods. Their correlations are discussed in depth, and factor analysis reveals two essential components describing the overall IQ: an image-detail component and a color-information component. Clearness and naturalness are regarded as the two principal factors for natural scene images, whereas clearness and colorfulness were selected as the key attributes affecting the overall IQ for other application types of images. Accordingly, based on these selected attributes, two kinds of empirical models are built to predict the overall IQ of mobile displays for different application types of images.

  14. The influence of noise on image quality in phase-diverse coherent diffraction imaging

    NASA Astrophysics Data System (ADS)

    Wittler, H. P. A.; van Riessen, G. A.; Jones, M. W. M.

    2016-02-01

    Phase-diverse coherent diffraction imaging provides a route to high sensitivity and resolution with low radiation dose. To take full advantage of this, the characteristics and tolerable limits of measurement noise for high quality images must be understood. In this work we show the artefacts that manifest in images recovered from simulated data with noise of various characteristics in the illumination and diffraction pattern. We explore the limits at which images of acceptable quality can be obtained and suggest qualitative guidelines that would allow for faster data acquisition and minimize radiation dose.

  15. Use of a speckle reduction technique to improve the reconstruction image quality of CCD-based optical computed tomography scanner

    NASA Astrophysics Data System (ADS)

    Chang, Yuan-Jen

    2015-06-01

    This study proposed a speckle reduction technique (SRT) that employs a rotating diffuser in parallel beam optical computed tomography (CT). Results showed that the mean and standard deviation of the gray level are 89.79±4.53 and 89.16±2.88 for reconstruction images without SRT and with SRT, respectively. The proposed SRT effectively removed ring artifacts. In addition, two image processing techniques, namely, the mean and Wiener filters, were also used to improve the reconstructed images. The image processing techniques alone effectively reduced ring artifacts, but some fluctuations were still observed in the line profiles of the reconstructed images. The results show that the proposed SRT is a simple method that is easily implemented to improve image quality for parallel beam optical CT. The combination of SRT and image filters was suggested to achieve the best image reconstruction quality through the full removal of ring artifacts.
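
    The two post-processing filters named above are standard operations and can be applied in a couple of lines. The slice below is a hypothetical flat region at the abstract's grey level of about 90, not the paper's data, and the 3x3 window size is an assumption.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from scipy.signal import wiener

rng = np.random.default_rng(3)
# Illustrative stand-in for a reconstructed slice: a flat region near
# grey level 90 with additive noise (noise level is hypothetical).
recon = 90.0 + rng.normal(0.0, 5.0, (128, 128))

mean_filtered = uniform_filter(recon, size=3)   # 3x3 mean filter
wiener_filtered = wiener(recon, mysize=3)       # adaptive Wiener filter
```

    Both filters lower the grey-level standard deviation of the flat region; the Wiener filter adapts its smoothing to the local variance, which is why it tends to preserve edges better than the plain mean filter.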

  16. Real-time computer treatment of THz passive device images with the high image quality

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2012-06-01

    We demonstrate real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not limited to a single passive THz device: it can be applied to any such device, and to active THz imaging systems as well. We applied the code to images captured by four passive THz imaging devices manufactured by different companies; it should be stressed that images produced by different devices usually require different spatial filters. The current version of the code processes more than one image per second for a THz image with more than 5000 pixels and 24-bit number representation. Processing a single THz image produces about 20 output images simultaneously, corresponding to the various spatial filters. The code allows the number of pixels in processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times by using parallel image-processing algorithms. We developed original spatial filters that allow one to see objects smaller than 2 cm in imagery produced by passive THz devices capturing objects hidden under opaque clothes. For images with high noise we developed an approach that suppresses the noise during computer processing and yields a good-quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of a liquid explosive, an ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and are a very promising solution for the security problem.

  17. Exploratory survey of image quality on CR digital mammography imaging systems in Mexico.

    PubMed

    Gaona, E; Rivera, T; Arreola, M; Franco, J; Molina, N; Alvarez, B; Azorín, C G; Casian, G

    2014-01-01

    The purpose of this study was to assess the current status of image quality and dose in computed radiographic digital mammography (CRDM) systems. The study included CRDM systems of various models and manufacturers, on which dose and image quality comparisons were performed. Due to the recent rise in the use of digital radiographic systems in Mexico, CRDM systems are rapidly replacing conventional film-screen systems without any regard to quality control or image quality standards. The study was conducted in 65 mammography facilities that use CRDM systems in Mexico City and the surrounding States. The systems were tested as used clinically. This means that the dose and beam qualities were selected using the automatic beam selection and photo-timed features. All systems surveyed generate laser film hardcopies for the radiologist to read on a scope or a mammographic high-luminance light box. It was found that 51 of the CRDM systems presented a variety of image artefacts and non-uniformities arising from inadequate acquisition and processing, as well as from the laser printer itself. Undisciplined alteration of image processing settings by the technologist was found to be a serious prevalent problem in 42 facilities. Only four of them had an image QC program that is periodically monitored by a medical physicist. The Average Glandular Dose (AGD) in the surveyed systems was estimated to have a mean value of 2.4 mGy. New legislation is required to improve image quality in mammography and to make screening mammography more effective for the early detection of breast cancer. PMID:23938078

  18. Effects of sparse sampling schemes on image quality in low-dose CT

    SciTech Connect

    Abbas, Sajid; Lee, Taewon; Cho, Seungryong; Shin, Sukyoung; Lee, Rena

    2013-11-15

    Purpose: Various scanning methods and image reconstruction algorithms are actively investigated for low-dose computed tomography (CT) that can potentially reduce a health-risk related to radiation dose. Particularly, compressive-sensing (CS) based algorithms have been successfully developed for reconstructing images from sparsely sampled data. Although these algorithms have shown promises in low-dose CT, it has not been studied how sparse sampling schemes affect image quality in CS-based image reconstruction. In this work, the authors present several sparse-sampling schemes for low-dose CT, quantitatively analyze their data property, and compare effects of the sampling schemes on the image quality.Methods: Data properties of several sampling schemes are analyzed with respect to the CS-based image reconstruction using two measures: sampling density and data incoherence. The authors present five different sparse sampling schemes, and simulated those schemes to achieve a targeted dose reduction. Dose reduction factors of about 75% and 87.5%, compared to a conventional scan, were tested. A fully sampled circular cone-beam CT data set was used as a reference, and sparse sampling has been realized numerically based on the CBCT data.Results: It is found that both sampling density and data incoherence affect the image quality in the CS-based reconstruction. Among the sampling schemes the authors investigated, the sparse-view, many-view undersampling (MVUS)-fine, and MVUS-moving cases have shown promising results. These sampling schemes produced images with similar image quality compared to the reference image and their structure similarity index values were higher than 0.92 in the mouse head scan with 75% dose reduction.Conclusions: The authors found that in CS-based image reconstructions both sampling density and data incoherence affect the image quality, and suggest that a sampling scheme should be devised and optimized by use of these indicators. With this strategic
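
    The structure similarity index used above to compare reconstructions against the reference can be sketched in simplified, single-window form. The full SSIM averages this quantity over local windows; the images below are synthetic stand-ins, and the constants follow the common choice c1 = (0.01 L)^2, c2 = (0.03 L)^2.

```python
import numpy as np

def ssim_global(x, y, L=255.0):
    """Single-window SSIM computed from global image statistics:
    luminance, contrast and structure terms combined in one ratio."""
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(4)
ref = rng.uniform(0.0, 255.0, (64, 64))     # stand-in reference image
noisy = ref + rng.normal(0.0, 5.0, ref.shape)  # stand-in reconstruction
```

    Identical images score exactly 1.0, and values above roughly 0.92, as quoted in the abstract, indicate reconstructions that are structurally very close to the reference.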

  19. Determination of pasture quality using airborne hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Pullanagari, R. R.; Kereszturi, G.; Yule, Ian J.; Irwin, M. E.

    2015-10-01

    Pasture quality is a critical determinant of animal performance (live weight gain, milk and meat production) and animal health. Assessment of pasture quality is therefore required to assist farmers with grazing planning and management, and for benchmarking between seasons and years. Traditionally, pasture quality is determined by field sampling, which is laborious, expensive and time consuming, and the information is not available in real time. Hyperspectral remote sensing has the potential to accurately quantify the biochemical composition of pasture over wide areas in great spatial detail. In this study an airborne imaging spectrometer (AisaFENIX, Specim) was used with a spectral range of 380-2500 nm and 448 spectral bands. A case study of a 600 ha hill country farm in New Zealand is used to illustrate the use of the system. Radiometric and atmospheric corrections, along with automated georectification of the imagery using a Digital Elevation Model (DEM), were applied to the raw images to convert them into geocoded reflectance images. Then a multivariate statistical method, partial least squares (PLS), was applied to estimate pasture quality measures such as crude protein (CP) and metabolisable energy (ME) from canopy reflectance. The results from this study revealed that estimates of CP and ME had a R2 of 0.77 and 0.79, and RMSECV of 2.97 and 0.81 respectively. By utilizing these regression models, spatial maps were created over the imaged area. These pasture quality maps can be used for adopting precision agriculture practices that improve farm profitability and environmental sustainability.

  20. Achieving High Contrast for Exoplanet Imaging with a Kalman Filter and Stroke Minimization

    NASA Astrophysics Data System (ADS)

    Eldorado Riggs, A. J.; Groff, T. D.; Kasdin, N. J.; Carlotti, A.; Vanderbei, R. J.

    2014-01-01

    High contrast imaging requires focal plane wavefront control and estimation to correct aberrations in an optical system; non-common path errors prevent the use of conventional estimation with a separate wavefront sensor. The High Contrast Imaging Laboratory (HCIL) at Princeton has led the development of several techniques for focal plane wavefront control and estimation. In recent years, we developed a Kalman filter for optimal wavefront estimation. Our Kalman filter algorithm is an improvement upon DM Diversity, which requires at least two image pairs each iteration and does not utilize any prior knowledge of the system. The Kalman filter is a recursive estimator, meaning that it uses the data from prior estimates along with as few as one new image pair per iteration to update the electric field estimate. Stroke minimization has proven to be a feasible controller for achieving high contrast. While similar to a variation of Electric Field Conjugation (EFC), stroke minimization achieves the same contrast with less stroke on the DMs. We recently utilized these algorithms to achieve high contrast for the first time in our experiment at the High Contrast Imaging Testbed (HCIT) at the Jet Propulsion Laboratory (JPL). Our HCIT experiment was also the first demonstration of symmetric dark hole correction in the image plane using two DMs; this is a major milestone for future space missions. Our ongoing work includes upgrading our optimal estimator to include an estimate of the incoherent light in the system, which allows for simultaneous estimation of the light from a planet along with starlight. The two-DM experiment at the HCIT utilized a shaped pupil coronagraph. Those tests utilized ripple style, free-standing masks etched out of silicon, but our current work is in designing 2-D optimized reflective shaped pupils. In particular, we have created several designs for the AFTA telescope, whose pupil presents major hurdles because of its atypical pupil obstructions.
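
    The recursive idea behind the Kalman estimator, blending a prior estimate with each new measurement instead of starting over, can be illustrated with a minimal scalar example. This is a generic textbook update, not the HCIL focal-plane estimator (which works on complex electric fields); the noise parameters q and r are arbitrary.

```python
import numpy as np

def kalman_update(x_prev, p_prev, z, q=1e-4, r=0.04):
    """One predict+update step for a static scalar state with process
    noise variance q and measurement noise variance r."""
    p_pred = p_prev + q               # predict: covariance grows by q
    k = p_pred / (p_pred + r)         # Kalman gain
    x = x_prev + k * (z - x_prev)     # blend prior with new measurement
    p = (1 - k) * p_pred              # updated covariance shrinks
    return x, p

rng = np.random.default_rng(6)
truth = 0.7                # the unknown state being estimated
x, p = 0.0, 1.0            # uninformative initial estimate
for _ in range(50):
    z = truth + rng.normal(0.0, 0.2)   # one noisy measurement per step
    x, p = kalman_update(x, p, z)
```

    Each iteration needs only one new measurement, mirroring the abstract's point that the recursive estimator can work from as few as one new image pair per iteration.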

  1. No-reference image quality assessment for horizontal-path imaging scenarios

    NASA Astrophysics Data System (ADS)

    Rios, Carlos; Gladysz, Szymon

    2013-05-01

    There exist several image-enhancement algorithms and tasks associated with imaging through turbulence that depend on defining the quality of an image. Examples include: "lucky imaging", choosing the width of the inverse filter for image reconstruction, or stopping iterative deconvolution. We collected a number of image quality metrics found in the literature. Particularly interesting are the blind, "no-reference" metrics. We discuss ways of evaluating the usefulness of these metrics, even when a fully objective comparison is impossible because of the lack of a reference image. Metrics are tested on simulated and real data. Field data comes from experiments performed by the NATO SET 165 research group over a 7 km distance in Dayton, Ohio.
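
    One simple blind metric of the kind surveyed is mean gradient magnitude, a plausible "lucky imaging" frame-selection criterion. This is an illustrative example, not a specific metric from the paper; the images are synthetic.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gradient_sharpness(img):
    """No-reference sharpness score: mean gradient magnitude.
    Higher values indicate sharper (less turbulence-blurred) frames."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy).mean()

rng = np.random.default_rng(7)
sharp = rng.uniform(0.0, 1.0, (128, 128))      # stand-in sharp frame
blurred = gaussian_filter(sharp, sigma=2.0)    # turbulence-like blur
```

    Ranking frames by such a score, rather than by comparison to an unavailable reference, is exactly the setting the no-reference metrics above are designed for.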

  2. Perceived assessment metrics for visible and infrared color fused image quality without reference image

    NASA Astrophysics Data System (ADS)

    Yu, Xuelian; Chen, Qian; Gu, Guohua; Ren, Jianle; Sui, Xiubao

    2015-02-01

    Designing an objective quality assessment for color-fused images is a very demanding and challenging task. We propose four no-reference metrics based on human visual system characteristics for objectively evaluating the quality of false color fusion images. The perceived edge metric (PEM) is defined based on a visual perception model and the color image gradient similarity between the fused image and the source images. The perceptual contrast metric (PCM) is established by associating multi-scale contrast and a varying contrast sensitivity filter (CSF) with the color components. A linear combination of the standard deviation and mean value over the fused image constructs the image colorfulness metric (ICM). The color comfort metric (CCM) is designed from the average saturation and the ratio of pixels with high and low saturation. The qualitative and quantitative experimental results demonstrate that the proposed metrics agree well with subjective perception.
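
    A colorfulness metric built from a linear combination of channel statistics, as the ICM is described, can be sketched in the style of the well-known Hasler-Süsstrunk measure. The opponent-channel formulation and the 0.3 coefficient come from that measure, not from the paper, whose exact coefficients are not given here.

```python
import numpy as np

def colorfulness(rgb):
    """Hasler-Susstrunk-style colorfulness: combine the std and mean of
    two opponent-colour channels into a single score."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    rg = r - g                    # red-green opponent channel
    yb = 0.5 * (r + g) - b        # yellow-blue opponent channel
    sigma = np.hypot(rg.std(), yb.std())
    mu = np.hypot(rg.mean(), yb.mean())
    return sigma + 0.3 * mu

rng = np.random.default_rng(8)
colorful = rng.integers(0, 256, (64, 64, 3))              # saturated noise
grey = np.repeat(rng.integers(0, 256, (64, 64, 1)), 3, 2)  # achromatic image
```

    A purely achromatic image scores exactly zero, since both opponent channels vanish when R = G = B.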

  3. Teacher Quality and Educational Equality: Do Teachers with Higher Standards-Based Evaluation Ratings Close Student Achievement Gaps?

    ERIC Educational Resources Information Center

    Borman, Geoffrey D.; Kimball, Steven M.

    2005-01-01

    Using standards-based evaluation ratings for nearly 400 teachers, and achievement results for over 7,000 students from grades 4-6, this study investigated the distribution and achievement effects of teacher quality in Washoe County, a mid-sized school district serving Reno and Sparks, Nevada. Classrooms with higher concentrations of minority,…

  4. Radiation dose and image quality for paediatric interventional cardiology

    NASA Astrophysics Data System (ADS)

    Vano, E.; Ubeda, C.; Leyton, F.; Miranda, P.

    2008-08-01

    Radiation dose and image quality for paediatric protocols in a biplane x-ray system used for interventional cardiology have been evaluated. Entrance surface air kerma (ESAK) and image quality using a test object and polymethyl methacrylate (PMMA) phantoms have been measured for the typical paediatric patient thicknesses (4-20 cm of PMMA). Images from fluoroscopy (low, medium and high) and cine modes have been archived in digital imaging and communications in medicine (DICOM) format. Signal-to-noise ratio (SNR), figure of merit (FOM), contrast (CO), contrast-to-noise ratio (CNR) and high contrast spatial resolution (HCSR) have been computed from the images. Data on dose transferred to the DICOM header have been used to test the values of the dosimetric display at the interventional reference point. ESAK for fluoroscopy modes ranges from 0.15 to 36.60 µGy/frame when moving from 4 to 20 cm PMMA. For cine, these values range from 2.80 to 161.10 µGy/frame. SNR, FOM, CO, CNR and HCSR are improved for high fluoroscopy and cine modes and remain roughly constant across the different thicknesses. Cumulative dose at the interventional reference point was 25-45% higher than the skin dose for the vertical C-arm (depending on the phantom thickness). ESAK and numerical image quality parameters allow verification of the proper setting of the x-ray system. Knowing the increase in dose per frame with increasing phantom thickness, together with the image quality parameters, will help cardiologists manage patient dose and allow them to select the best imaging acquisition mode during clinical procedures.
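
    The ROI-based quantities named above (SNR, CO, CNR, FOM) can be computed in a few lines. The grey levels, noise level and ROI sizes below are hypothetical, and the FOM definition used, CNR squared per unit ESAK, is one common convention rather than necessarily the paper's.

```python
import numpy as np

rng = np.random.default_rng(9)
# Hypothetical test-object frame: background ROI near grey level 100,
# a contrast-detail ROI near 130, noise sigma 5 (illustrative values).
bg = rng.normal(100.0, 5.0, (40, 40))
detail = rng.normal(130.0, 5.0, (40, 40))

snr = detail.mean() / bg.std()                 # signal-to-noise ratio
co = (detail.mean() - bg.mean()) / bg.mean()   # contrast
cnr = (detail.mean() - bg.mean()) / bg.std()   # contrast-to-noise ratio

# One common dose-efficiency figure of merit: CNR^2 per unit ESAK.
# 2.8 uGy/frame is the low end of the cine range quoted in the abstract.
fom = cnr ** 2 / 2.8
```

    Tracking these numbers per acquisition mode and phantom thickness is what lets the proper setting of the x-ray system be verified objectively.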

  5. Radiometric quality evaluation of ZY-02C satellite panchromatic image

    NASA Astrophysics Data System (ADS)

    Zhao, Fengfan; Sun, Ke; Yang, Lei

    2014-11-01

    As the second Chinese civilian high-spatial-resolution satellite, ZY-02C was successfully launched on December 22, 2011. In this paper, we used two different methods, subjective evaluation and external evaluation, to evaluate the radiometric quality of ZY-02C panchromatic imagery and compared it with that of the CBERS-02B and SPOT-5 satellites. The external evaluation provides quantitative measures of image quality: the EIFOV of ZY-02C, one of these parameters, is smaller than that of SPOT-5, demonstrating that the effective spatial resolution of ZY-02C is greater than that of SPOT-5. The subjective results show that the quality of SPOT-5 is slightly preferable to that of ZY-02C and CBERS-02B, and that the quality of ZY-02C is better than that of CBERS-02B for most land-cover types. The subjective and external evaluations show excellent agreement. A comprehensive assessment of image quality can therefore be obtained by combining the parameters introduced in this paper.

  6. Compressed image quality metric based on perceptually weighted distortion.

    PubMed

    Hu, Sudeng; Jin, Lina; Wang, Hanli; Zhang, Yun; Kwong, Sam; Kuo, C-C Jay

    2015-12-01

    Objective quality assessment for compressed images is critical to various image compression systems that are essential in image delivery and storage. Although the mean squared error (MSE) is computationally simple, it may not accurately reflect the perceptual quality of compressed images, which is also affected dramatically by characteristics of the human visual system (HVS) such as the masking effect. In this paper, an image quality metric (IQM) is proposed based on perceptually weighted distortion in terms of the MSE. To capture the characteristics of the HVS, a randomness map is proposed to measure the masking effect, and a preprocessing scheme is proposed to simulate the processing that occurs in the initial part of the HVS. Since the masking effect depends highly on structural randomness, the prediction error from the neighborhood, computed with a statistical model, is used to measure the significance of masking. Meanwhile, imperceptible high-frequency signal content can be removed by preprocessing with low-pass filters. The relation between the distortions before and after the masking effect is investigated, and a masking modulation model is proposed to simulate the masking effect after preprocessing. The performance of the proposed IQM is validated on six image databases with various compression distortions. The experimental results show that the proposed algorithm outperforms other benchmark IQMs. PMID:26415170
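
    The core idea of a perceptually weighted MSE can be sketched in a few lines. The weighting map below is a deliberately crude stand-in for the paper's randomness/masking model, using the left-neighbour prediction error as a texture proxy; all names and the weighting formula are illustrative assumptions, not the published method.

```python
import numpy as np

def perceptual_weights(img):
    """Toy masking map (an assumption, not the paper's model): pixels that
    are poorly predicted by their left neighbour lie in 'random' texture,
    which masks distortion, so their errors are down-weighted."""
    pred_err = np.abs(np.diff(img, axis=1))
    r = np.pad(pred_err, ((0, 0), (1, 0)), mode='edge')  # restore width
    return 1.0 / (1.0 + r)

def weighted_mse(ref, dist, weights):
    """MSE with per-pixel perceptual weights (normalised to mean 1 so the
    score stays on the usual MSE scale)."""
    w = weights / weights.mean()
    return float((w * (ref - dist) ** 2).mean())
```

    With a flat weight map this reduces exactly to the ordinary MSE; the gain comes from down-weighting errors in highly textured regions where the HVS is less sensitive.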

  7. Assessing the quality of rainfall data when aiming to achieve flood resilience

    NASA Astrophysics Data System (ADS)

    Hoang, C. T.; Tchiguirinskaia, I.; Schertzer, D.; Lovejoy, S.

    2012-04-01

    A new EU Floods Directive entered into force five years ago. This Directive requires Member States to coordinate adequate measures to reduce flood risk. European flood management systems require reliable rainfall statistics, e.g. intensity-duration-frequency curves for shorter and shorter durations and for a larger and larger range of return periods. Preliminary studies showed that the number of floods was lower when estimated from low-time-resolution data of high-intensity rainfall events than from higher-time-resolution data. These facts suggest that particular attention should be paid to rainfall data quality in order to adequately investigate flood risk when aiming to achieve flood resilience. The potential consequences of changes in measuring and recording techniques have been discussed in the literature with respect to the possible introduction of artificial inhomogeneities in time series. In this paper, we discuss how to detect another artificiality: most rainfall time series have a lower recording frequency than is assumed, and the effective high-frequency limit often depends on the recording year due to algorithm changes. This question is particularly important for operational hydrology, because an error in the effective recording frequency introduces biases in the corresponding statistics. In this direction, we developed a first version of the SERQUAL procedure to automatically detect the effective time resolution of highly mixed data. Applied to 166 rainfall time series in France, the SERQUAL procedure detected that most of them have an effective hourly resolution rather than a 5-minute resolution. Furthermore, series having an overall 5-minute resolution do not have it for all years. These results raise serious concerns about how to benchmark stochastic rainfall models at a sub-hourly resolution, which are particularly desirable for operational hydrology. 
Therefore, database
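
    The mismatch between nominal and effective recording resolution can be illustrated with a toy check (not the authors' SERQUAL algorithm): if a nominally 5-minute series only ever changes value on hour boundaries, its effective resolution is hourly.

```python
import numpy as np

def effective_resolution(times_min, values):
    """Crude effective-resolution estimate for a rain record (an
    illustrative stand-in for SERQUAL, not its published procedure): the
    greatest common divisor of the timestamps (in minutes) at which the
    series actually changes value."""
    change_times = [t for t, cur, prev in zip(times_min[1:], values[1:], values[:-1])
                    if cur != prev]
    if not change_times:
        return None  # constant series: resolution cannot be inferred
    res = 0
    for t in change_times:
        res = int(np.gcd(res, int(t)))
    return res
```

    A nominal 5-minute gauge series whose values only change hourly returns 60 here, flagging exactly the kind of hidden coarseness that biases sub-hourly statistics.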

  8. A qualitative and quantitative analysis of radiation dose and image quality of computed tomography images using adaptive statistical iterative reconstruction.

    PubMed

    Hussain, Fahad Ahmed; Mail, Noor; Shamy, Abdulrahman M; Suliman, Alghamdi; Saoudi, Abdelhamid

    2016-01-01

    Image quality is a key issue in radiology, particularly in a clinical setting where it is important to achieve accurate diagnoses while minimizing radiation dose. Some computed tomography (CT) manufacturers have introduced algorithms that claim significant dose reduction. In this study, we assessed CT image quality produced by two reconstruction algorithms provided with GE Healthcare's Discovery 690 Elite positron emission tomography (PET) CT scanner. Image quality was measured for images obtained at various doses with both conventional filtered back-projection (FBP) and adaptive statistical iterative reconstruction (ASIR) algorithms. A standard CT dose index (CTDI) phantom and a pencil ionization chamber were used to measure the CT dose at 120 kVp and an exposure of 260 mAs. Image quality was assessed using two phantoms. CT images of both phantoms were acquired at tube voltage (kV) of 120 with exposures ranging from 25 mAs to 400 mAs. Images were reconstructed using FBP and ASIR ranging from 10% to 100%, then analyzed for noise, low-contrast detectability, contrast-to-noise ratio (CNR), and modulation transfer function (MTF). Noise was 4.6 HU in water phantom images acquired at 260 mAs/FBP 120 kV and 130 mAs/50% ASIR 120 kV. The large objects (frequency < 7 lp/cm) retained fairly acceptable image quality at 130 mAs/50% ASIR, compared to 260 mAs/FBP. The application of ASIR for small objects (frequency > 7 lp/cm) showed poor visibility compared to FBP at 260 mAs and even worse for images acquired at less than 130 mAs. ASIR blending more than 50% at low dose tends to reduce contrast of small objects (frequency > 7 lp/cm). We concluded that dose reduction and ASIR should be applied with close attention if the objects to be detected or diagnosed are small (frequency > 7 lp/cm). Further investigations are required to correlate the small objects (frequency > 7 lp/cm) to patient anatomy and clinical diagnosis. PMID:27167261

  9. Why Is Quality in Higher Education Not Achieved? The View of Academics

    ERIC Educational Resources Information Center

    Cardoso, Sónia; Rosa, Maria J.; Stensaker, Bjørn

    2016-01-01

    Quality assurance is currently an established activity in Europe, driven either by national quality assurance agencies or by institutions themselves. However, whether quality assurance is perceived as actually being capable of promoting quality is still a question open to discussion. Based on three different views on quality derived from the…

  10. Dose and diagnostic image quality in digital tomosynthesis imaging of facial bones in pediatrics

    NASA Astrophysics Data System (ADS)

    King, J. M.; Hickling, S.; Elbakri, I. A.; Reed, M.; Wrogemann, J.

    2011-03-01

    The purpose of this study was to evaluate the use of digital tomosynthesis (DT) for pediatric facial bone imaging. We compared the eye lens dose and diagnostic image quality of DT facial bone exams relative to digital radiography (DR) and computed tomography (CT), and investigated whether we could modify our current DT imaging protocol to reduce patient dose while maintaining sufficient diagnostic image quality. We measured the dose to the eye lens for all three modalities using high-sensitivity thermoluminescent dosimeters (TLDs) and an anthropomorphic skull phantom. To assess the diagnostic image quality of DT compared to the corresponding DR and CT images, we performed an observer study in which the visibility of anatomical structures in the DT phantom images was rated on a four-point scale. We then acquired DT images at lower doses and had radiologists indicate whether the visibility of each structure was adequate for diagnostic purposes. For typical facial bone exams, we measured eye lens doses of 0.1-0.4 mGy for DR, 0.3-3.7 mGy for DT, and 26 mGy for CT. In general, facial bone structures were visualized better with DT than with DR, and the majority of structures were visualized well enough to avoid the need for CT. DT imaging provides high quality diagnostic images of the facial bones while delivering significantly lower doses to the lens of the eye compared to CT. In addition, we found that by adjusting the imaging parameters, the DT effective dose can be reduced by up to 50% while maintaining sufficient image quality.

  11. Flattening filter removal for improved image quality of megavoltage fluoroscopy

    SciTech Connect

    Christensen, James D.; Kirichenko, Alexander; Gayou, Olivier

    2013-08-15

    Purpose: Removal of the linear accelerator (linac) flattening filter enables a high rate of dose deposition with reduced treatment time. When used for megavoltage imaging, an unflat beam has reduced primary beam scatter, resulting in sharper images. In fluoroscopic imaging mode, the unflat beam has a higher photon count per image frame, yielding a higher contrast-to-noise ratio. The authors’ goal was to quantify the effects of an unflat beam on the image quality of megavoltage portal and fluoroscopic images. Methods: 6 MV projection images were acquired in fluoroscopic and portal modes using an electronic flat-panel imager. The effects of the flattening filter on the relative modulation transfer function (MTF) and contrast-to-noise ratio were quantified using the QC3 phantom. The impact of flattening filter removal on the contrast-to-noise ratio of gold fiducial markers was also studied under various scatter conditions. Results: The unflat beam had improved contrast resolution, with up to a 40% increase in MTF contrast at the highest frequency measured (0.75 line pairs/mm). The contrast-to-noise ratio was increased, as expected from the increased photon flux. The visualization of fiducial markers was markedly better using the unflat beam under all scatter conditions, enabling visualization of thin gold fiducial markers, the thinnest of which was not visible using the flat beam. Conclusions: The removal of the flattening filter from a clinical linac leads to quantifiable improvements in the image quality of megavoltage projection images. These gains enable observers to more easily visualize thin fiducial markers and track their motion on fluoroscopic images.
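
    A relative MTF point of the kind measured with the QC3 phantom can be obtained from the Michelson contrast of a profile across one bar group. This is a simplified sketch (noise-free profile, bars aligned with the pixel grid); function names are illustrative, not from the paper.

```python
import numpy as np

def bar_pattern_contrast(profile):
    """Michelson contrast of a 1-D profile across a bar pattern."""
    vmax, vmin = profile.max(), profile.min()
    return (vmax - vmin) / (vmax + vmin)

def relative_mtf(profile_a, profile_b):
    """Ratio of bar-group contrasts between two images (e.g. unflat vs flat
    beam) gives one point of the relative MTF at that bar frequency."""
    return bar_pattern_contrast(profile_a) / bar_pattern_contrast(profile_b)
```

    Repeating the ratio for each bar group of the phantom traces out the relative MTF over spatial frequency, which is how a percentage contrast gain at 0.75 lp/mm can be reported.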

  12. Body image quality of life in eating disorders

    PubMed Central

    Jáuregui Lobera, Ignacio; Bolaños Ríos, Patricia

    2011-01-01

    Purpose: The objective was to examine how body image affects quality of life in an eating-disorder (ED) clinical sample, a non-ED clinical sample, and a nonclinical sample. We hypothesized that ED patients would show the worst body image quality of life. We also hypothesized that body image quality of life would have a stronger negative association with specific ED-related variables than with other psychological and psychopathological variables, mainly among ED patients. On the basis of previous studies, the influence of gender on the results was explored, too. Patients and methods: The final sample comprised 70 ED patients (mean age 22.65 ± 7.76 years; 59 women and 11 men); 106 were patients with other psychiatric disorders (mean age 28.20 ± 6.52; 67 women and 39 men), and 135 were university students (mean age 21.57 ± 2.58; 81 women and 54 men), with no psychiatric history. After having obtained informed consent, the following questionnaires were administered: Body Image Quality of Life Inventory-Spanish version (BIQLI-SP), Eating Disorders Inventory-2 (EDI-2), Perceived Stress Questionnaire (PSQ), Self-Esteem Scale (SES), and Symptom Checklist-90-Revised (SCL-90-R). Results: The ED patients’ ratings on the BIQLI-SP were the lowest and negatively scored (BIQLI-SP means: +20.18, +5.14, and −6.18, in the student group, the non-ED patient group, and the ED group, respectively). The effect of body image on quality of life was more negative in the ED group in all items of the BIQLI-SP. Body image quality of life was negatively associated with specific ED-related variables, more than with other psychological and psychopathological variables, but not especially among ED patients. Conclusion: Body image quality of life was affected not only by specific pathologies related to body image disturbances, but also by other psychopathological syndromes. Nevertheless, the greatest effect was related to ED, and seemed to be more negative among men. This finding is the

  13. Real-time imaging systems' combination of methods to achieve automatic target recognition

    NASA Astrophysics Data System (ADS)

    Maraviglia, Carlos G.; Williams, Elmer F.; Pezzulich, Alan Z.

    1998-03-01

    Using a combination of strategies, real-time imaging weapons systems are achieving their goals of detecting their intended targets. The demands of acquiring a target in a cluttered environment, in a timely manner and with a high degree of confidence, require compromises regarding a truly automatic system. Techniques such as dedicated image-processing hardware, real-time operating systems, mixes of algorithmic methods, and multi-sensor detectors foreshadow the potential of future weapons systems and their incorporation of truly autonomous target acquisition. Elements such as position information, sensor gain controls, way marks for mid-course correction, and augmentation with different imaging spectra, as well as future capabilities such as neural-net expert systems and decision processors overseeing a fusion-matrix architecture, may be considered tools for a weapon system's achievement of its ultimate goal. At present it is necessary to include a human in the track-decision loop, a system feature that may be long lived. Automatic target recognition will still be the desired goal in future systems owing to the variability of military missions and the desirability of an expendable asset. Furthermore, with the increasing incorporation of multi-sensor information into the track decision, the human element's real-time contribution must be carefully engineered.

  14. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are increasingly becoming a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements for the imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like the minimum resolvable temperature difference (MRTD), which gives only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF), broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows refocusing of the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality and chief ray angle can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, not least, its suitability for fully automated measurements in mass production.

  15. Evaluation of Diffraction Efficiency and Image Quality in Optical Reconstruction of Digital Fresnel Holograms

    NASA Astrophysics Data System (ADS)

    Evtikhiev, N. N.; Starikov, S. N.; Cheremkhin, P. A.; Kurbatova, E. A.

    2015-01-01

    We evaluate diffraction efficiency and image quality in the process of optical reconstruction of the digital holograms, which are displayed on spatial light modulators (SLM) with 2 and 256 brightness levels. The dependences of the above-mentioned parameters on the ratio between the intensities of the object and reference waves during recording of digital holograms are found. Numerically synthesized digital Fresnel holograms were used for modeling of the optical image retrieval. The results of the analysis were used to determine the ratios of intensities of the object and reference waves, at which the best ratios of diffraction efficiency and quality of the optically reconstructed images are achieved in the cases of using the amplitude and phase SLMs.

  16. Impact of atmospheric aerosols on long range image quality

    NASA Astrophysics Data System (ADS)

    LeMaster, Daniel A.; Eismann, Michael T.

    2012-06-01

    Image quality in high altitude long range imaging systems can be severely limited by atmospheric absorption, scattering, and turbulence. Atmospheric aerosols contribute to this problem by scattering target signal out of the optical path and by scattering in unwanted light from the surroundings. Target signal scattering may also lead to image blurring though, in conventional modeling, this effect is ignored. The validity of this choice is tested in this paper by developing an aerosol modulation transfer function (MTF) model for an inhomogeneous atmosphere and then applying it to real-world scenarios using MODTRAN derived scattering parameters. The resulting calculations show that aerosol blurring can be effectively ignored.

  17. Automating PACS quality control with the Vanderbilt image processing enterprise resource

    NASA Astrophysics Data System (ADS)

    Esparza, Michael L.; Welch, E. Brian; Landman, Bennett A.

    2012-02-01

    Precise image acquisition is an integral part of modern patient care and medical imaging research. Periodic quality control using standardized protocols and phantoms ensures that scanners are operating according to specifications, yet such procedures do not ensure that individual datasets are free from corruption, for example due to patient motion, transient interference, or physiological variability. If unacceptable artifacts are noticed during scanning, a technologist can repeat a procedure. Yet substantial delays may be incurred if a problematic scan is not noticed until a radiologist reads the scans or an automated algorithm fails. Given scores of slices in typical three-dimensional scans and a wide variety of potential use cases, a technologist cannot practically be expected to inspect all images. In large-scale research, automated pipeline systems have had great success in achieving high throughput. However, clinical and institutional workflows are largely based on DICOM and PACS technologies; these systems are not readily compatible with research systems due to security and privacy restrictions. Hence, quantitative quality control has been relegated to individual investigators and too often neglected. Herein, we propose a scalable system, the Vanderbilt Image Processing Enterprise Resource (VIPER), to integrate modular quality control and image analysis routines with a standard PACS configuration. This server unifies image processing routines across an institutional level and provides a simple interface so that investigators can collaborate to deploy new analysis technologies. VIPER integrates with high performance computing environments and has successfully analyzed all standard scans from our institutional research center over the course of the last 18 months.

  18. Pyramid wavefront sensor for image quality evaluation of optical system

    NASA Astrophysics Data System (ADS)

    Chen, Zhendong

    2015-08-01

    When used to evaluate imaging quality, the pyramid wavefront sensor is placed at the focal plane of the aberrated optical system, e.g. a telescope, and splits the light into four beams. Four images of the pupil are created on the detector, and the detection signals of the pyramid wavefront sensor are calculated from these four intensity patterns, providing information on the derivatives of the aberrated wavefront. Based on the theory of the pyramid wavefront sensor, we are developing simulation software and a wavefront detector that can be used to test the imaging quality of a telescope. In our system, the subpupil image intensity through the pyramid sensor is calculated to obtain the wavefront aberration, where piston, tilt, defocus, spherical, coma, astigmatism and other higher-order aberrations are represented separately by Zernike polynomials. The imaging quality of the optical system is then evaluated by the subsequent wavefront reconstruction. The performance of our system is to be checked against measurements carried out with the Puntino wavefront instrument (a Shack-Hartmann wavefront sensor). Within this framework, the measurement precision of the pyramid sensor will also be discussed through detailed experiments. In general, this project should be helpful both for understanding the principle of wavefront reconstruction and for its future technical applications. So far, we have produced the pyramid and established the laboratory setup of the image quality detection system based on this wavefront sensor, and preliminary results have been obtained: we have acquired the intensity images of the four pupils. Additional work is needed to analyze the characteristics of the pyramid wavefront sensor.
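
    The Zernike representation mentioned above amounts to a linear least-squares fit of the measured wavefront onto a polynomial basis. The sketch below uses a handful of low-order terms on a unit-disk grid; it fits a wavefront map directly rather than pyramid-sensor slope signals, so it illustrates only the decomposition step, and all names are assumptions.

```python
import numpy as np

def zernike_basis(n_pts=64):
    """A few low-order Zernike-style terms (piston, x/y tilt, defocus, two
    astigmatisms) sampled on a unit-disk grid."""
    y, x = np.mgrid[-1:1:1j * n_pts, -1:1:1j * n_pts]
    r2 = x ** 2 + y ** 2
    mask = r2 <= 1.0
    terms = [np.ones_like(x), x, y, 2 * r2 - 1, x ** 2 - y ** 2, 2 * x * y]
    return np.array([t[mask] for t in terms]), mask  # shape (6, n_disk_pixels)

def fit_zernikes(wavefront, basis, mask):
    """Least-squares Zernike coefficients of a measured wavefront map."""
    coeffs, *_ = np.linalg.lstsq(basis.T, wavefront[mask], rcond=None)
    return coeffs
```

    The fitted coefficients separate the aberration into named modes (tilt, defocus, astigmatism, ...), which is what allows image quality to be judged mode by mode after reconstruction.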

  19. Image quality-based adaptive illumination normalisation for face recognition

    NASA Astrophysics Data System (ADS)

    Sellahewa, Harin; Jassim, Sabah A.

    2009-05-01

    Automatic face recognition is a challenging task due to intra-class variations. Changes in lighting conditions during the enrolment and identification stages contribute significantly to these intra-class variations. A common approach to addressing the effects of such varying conditions is to pre-process the biometric samples in order to normalise intra-class variations. Histogram equalisation is a widely used illumination normalisation technique in face recognition. However, a recent study has shown that applying histogram equalisation to well-lit face images can decrease recognition accuracy. This paper presents a dynamic approach to illumination normalisation based on face image quality. The quality of a given face image is measured in terms of its luminance distortion by comparing the image against a known reference face image. Histogram equalisation is applied to a probe image only if its luminance distortion is higher than a predefined threshold. We tested the proposed adaptive illumination normalisation method on the widely used Extended Yale Face Database B. Identification results demonstrate that our adaptive normalisation produces better identification accuracy than the conventional approach, in which every image is normalised irrespective of the lighting conditions under which it was acquired.
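
    The adaptive strategy described above (equalise only when needed) is easy to sketch. The quality measure here is a simple mean-luminance deviation from the reference, a stand-in for the paper's luminance-distortion measure; the function names and the threshold value are illustrative assumptions.

```python
import numpy as np

def luminance_distortion(probe, reference):
    """Simple proxy for luminance distortion of an 8-bit probe face image
    relative to a well-lit reference (not the paper's exact measure)."""
    return abs(probe.mean() - reference.mean()) / 255.0

def hist_equalise(img):
    """Plain histogram equalisation for an 8-bit greyscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]

def adaptive_normalise(probe, reference, threshold=0.1):
    """Equalise only when the probe deviates from the reference
    illumination; well-lit probes pass through untouched."""
    if luminance_distortion(probe, reference) > threshold:
        return hist_equalise(probe)
    return probe
```

    Leaving well-lit probes untouched is exactly what avoids the accuracy loss the cited study observed when equalisation is applied unconditionally.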

  20. Criterion to Evaluate the Quality of Infrared Small Target Images

    NASA Astrophysics Data System (ADS)

    Mao, Xia; Diao, Wei-He

    2009-01-01

    In this paper, we propose a new criterion to estimate the quality of infrared small-target images. To describe the criterion quantitatively, two indicators are defined. One is the “degree of target being confused”, which represents the extent to which an infrared small-target image presents false targets. The other is the “degree of target being shielded”, which reflects the degree to which the image shields the target. Experimental results reveal that this criterion is more robust than the traditional measure (signal-to-noise ratio). It is valid not only for infrared small-target images that the signal-to-noise ratio describes correctly, but also for images that the traditional criterion cannot accurately assess. In addition, the results of this criterion can provide information about the causes of background interference with target detection.

  1. DIANE stationary neutron radiography system image quality and industrial applications

    NASA Astrophysics Data System (ADS)

    Cluzeau, S.; Huet, J.; Le Tourneur, P.

    1994-05-01

    The SODERN neutron radiography laboratory has operated since February 1993 using a sealed-tube generator (GENIE 46). An experimental programme of characterization (dosimetry, spectroscopy) has confirmed the expected performance with respect to neutron flux intensity, neutron energy range and residual gamma flux; results are given in a specific report [2]. This paper is devoted to reporting image performance. ASTM and purpose-built indicators have been used to test image quality with various converters and films, and the corresponding modulation transfer functions are to be determined by image processing. Several industrial applications have demonstrated the capabilities of the system: corrosion detection in aircraft parts, testing of ammunition filling, detection of missing polymer in sandwich steel sheets, detection of moisture in a geophysics probe, and imaging of residual ceramic cores in turbine blades. Various computerized electronic imaging systems will be tested to improve the industrial capabilities.

  2. The critical elements within a journey towards the achievement of quality use of medicines.

    PubMed

    Bellchambers, Helen; McMillan, Margaret

    2007-01-01

    In Australia, as in much of the rest of the Western world, the changing demographics of the population have resulted in a need for the aged care industry and the nursing profession to modify their traditional skill mix in order to better respond to changes in the needs of older Australians. In particular, the role of the Enrolled Nurse (EN) in New South Wales (NSW), Australia has been under considerable scrutiny for the last decade, resulting in a number of changes to the EN's scope of practice, notably the 2004 amendment to legislation and education that enabled ENs to administer medication. This changed scope of EN practice has resulted in a need to reconsider the role of all members of the aged care medication team. Quality Use of Medicines (QUM) within a Residential Aged Care (RAC) facility is dependent upon systematic and evidence-based approaches underpinning the relevant structures and processes involved in the management of medications. Implementation of effective QUM systems and processes is particularly dependent upon the nature of relationships among members of the aged care medication team. Although there are numerous mechanisms to support QUM by nurses, the RAC sector in Australia lacks a tool that is both contemporaneous and context-specific and that supports an expanded scope of practice for ENs who work in the sector. This paper reports on an action research project that aimed to enhance the health of older Australians in one RAC facility by developing an implementation framework that supports ENs as practitioners involved in QUM. The research identified the enabling factors and barriers to the achievement of QUM within the aged care medication team. The project encompassed a re-conceptualization and reconfiguration of local nursing medication-related practices and culminated in a conceptual framework to support quality medicine outcomes for older Australians. 
The outcomes of the research included a set of principles to guide the expanded role of

  3. Iosmos (ionian Sea Water Quality Monitoring by Satellite Data) Project: Strategy and First Achievements

    NASA Astrophysics Data System (ADS)

    Lacava, T.; Bernini, G.; Ciancia, E.; Coviello, I.; Di Polito, C.; Madonia, A.; Marcelli, M.; Pascucci, S.; Paciello, R.; Palombo, A.; Pergola, N.; Piermattei, V.; Pignatti, S.; Santini, F.; Satriano, V.; Vallianatos, F.; Tramutoli, V.

    2013-12-01

    IOSMOS (IOnian Sea water quality MOnitoring by Satellite data) is a project for European Transnational Cooperation co-funded by the Basilicata Region (Southern Italy) in the framework of its European Regional Development Fund (ERDF) Operational Program 2007-2013. The main objective of IOSMOS is the development of advanced satellite products and techniques for the study and monitoring of Ionian Sea water quality along the coasts of Basilicata and Crete. In particular, the RST (Robust Satellite Technique) approach has been applied to more than 15 years of MODIS-OC products in order to identify the areas at the highest level of degradation and/or at the greatest potential risk. Following the RST approach, anomalous space-time variations of optical variables (e.g. upwelling normalized water-leaving radiances) and bio-optical parameters, such as chlorophyll-a concentration, Chromophoric Dissolved Organic Matter (CDOM) and the diffuse attenuation coefficient at 490 nm (Kd490), have been identified taking into account the site history (in terms of expected values and normal variability of each selected parameter) as obtained from the analysis of a long-term series (2003-2012) of MODIS-OC satellite products. This approach made it possible to generate similar products for both shallow and deep water. Specific measurement campaigns have been planned, with the collection of in-situ and airborne data, in order to define and calibrate new algorithms for the quantitative estimation of the above-mentioned parameters even in the most critical situations (e.g. shallow waters). In this paper, the first achievements of the IOSMOS project are presented and discussed.
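
    The RST approach boils down to pixel-wise standardised anomalies against a long archive. A minimal sketch of that idea (not the project's implementation; the function name is illustrative):

```python
import numpy as np

def rst_anomaly_index(current, history):
    """RST-style local variation index: deviation of the current value of a
    parameter (chlorophyll-a, CDOM, Kd490, ...) from its pixel-wise
    multi-year mean, in units of its historical standard deviation.

    current: 2-D map for the scene under test; history: 3-D stack
    (time, rows, cols) of co-located archive scenes."""
    mu = history.mean(axis=0)     # expected value at each pixel, from the archive
    sigma = history.std(axis=0)   # natural variability at each pixel
    return (current - mu) / sigma
```

    Pixels whose index sits well above the historical variability (say, beyond 2-3 standard deviations) would be flagged as anomalous relative to the 2003-2012 baseline.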

  4. Image quality, space-qualified UV interference filters

    NASA Technical Reports Server (NTRS)

    Mooney, Thomas A.

    1992-01-01

    The progress during the contract period is described. The project involved fabrication of image quality, space-qualified bandpass filters in the 200-350 nm spectral region. Ion-assisted deposition (IAD) was applied to produce stable, reasonably durable filter coatings on space compatible UV substrates. Thin film materials and UV transmitting substrates were tested for resistance to simulated space effects.

  5. Perceived interest versus overt visual attention in image quality assessment

    NASA Astrophysics Data System (ADS)

    Engelke, Ulrich; Zhang, Wei; Le Callet, Patrick; Liu, Hantao

    2015-03-01

    We investigate the impact of overt visual attention and perceived interest on the prediction performance of image quality metrics. Towards this end we performed two respective experiments to capture these mechanisms: an eye gaze tracking experiment and a region-of-interest selection experiment. Perceptual relevance maps were created from both experiments and integrated into the design of the image quality metrics. Correlation analysis shows that indeed there is an added value of integrating these perceptual relevance maps. We reveal that the improvement in prediction accuracy is not statistically different between fixation density maps from eye gaze tracking data and region-of-interest maps, thus, indicating the robustness of different perceptual relevance maps for the performance gain of image quality metrics. Interestingly, however, we found that thresholding of region-of-interest maps into binary maps significantly deteriorates prediction performance gain for image quality metrics. We provide a detailed analysis and discussion of the results as well as the conceptual and methodological differences between capturing overt visual attention and perceived interest.

  6. SCID: full reference spatial color image quality metric

    NASA Astrophysics Data System (ADS)

    Ouni, S.; Chambah, M.; Herbin, M.; Zagrouba, E.

    2009-01-01

    The most widely used full reference image quality assessments are error-based methods, performed with pixel-based difference metrics such as Delta E (ΔE), MSE, PSNR, etc. All of these metrics compute differences pixel by pixel, so only a local fidelity of the color is defined. However, such metrics do not correlate well with perceived image quality: they omit the properties of the human visual system (HVS), which is rather sensitive to global quality, and thus cannot be reliable predictors of perceived visual quality. In this paper, we present a novel full reference color metric that is based on characteristics of the human visual system by considering the notion of adjacency. This metric, called SCID for Spatial Color Image Difference, is more perceptually correlated than other color differences such as Delta E. The suggested full reference metric is generic and independent of image distortion type. It can be used in different applications such as compression, restoration, etc.
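    The pixel-wise, error-based metrics criticised in this abstract take only a few lines to implement, which is part of their appeal. A minimal sketch (test images are random, for illustration only):

```python
import numpy as np

def mse(ref, test):
    """Mean squared error: a purely local, pixel-by-pixel difference."""
    return float(np.mean((ref.astype(float) - test.astype(float)) ** 2))

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB, derived directly from MSE."""
    m = mse(ref, test)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, size=(64, 64))
noisy = np.clip(ref + rng.integers(-5, 6, size=ref.shape), 0, 255)
print(psnr(ref, noisy))
```

Two distorted images with identical MSE can look very different to a human observer, which is precisely the weakness that spatially aware metrics such as SCID aim to address.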

  7. Simultaneous Analysis and Quality Assurance for Diffusion Tensor Imaging

    PubMed Central

    Lauzon, Carolyn B.; Asman, Andrew J.; Esparza, Michael L.; Burns, Scott S.; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W.; Davis, Nicole; Cutting, Laurie E.; Landman, Bennett A.

    2013-01-01

    Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. 
Projection of QA metrics to a low

  8. Simultaneous analysis and quality assurance for diffusion tensor imaging.

    PubMed

    Lauzon, Carolyn B; Asman, Andrew J; Esparza, Michael L; Burns, Scott S; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W; Davis, Nicole; Cutting, Laurie E; Landman, Bennett A

    2013-01-01

    Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. 
Projection of QA metrics to a low

  9. A Novel Image Quality Assessment With Globally and Locally Consilient Visual Quality Perception.

    PubMed

    Bae, Sung-Ho; Kim, Munchurl

    2016-05-01

    Computational models for image quality assessment (IQA) have been developed by exploring effective features that are consistent with the characteristics of a human visual system (HVS) for visual quality perception. In this paper, we first reveal that many existing features used in computational IQA methods can hardly characterize visual quality perception for local image characteristics and various distortion types. To solve this problem, we propose a new IQA method, called the structural contrast-quality index (SC-QI), by adopting a structural contrast index (SCI), which can well characterize local and global visual quality perceptions for various image characteristics with structural-distortion types. In addition to SCI, we devise some other perceptually important features for our SC-QI that can effectively reflect the characteristics of HVS for contrast sensitivity and chrominance component variation. Furthermore, we develop a modified SC-QI, called structural contrast distortion metric (SC-DM), which inherits desirable mathematical properties of valid distance metricability and quasi-convexity. So, it can effectively be used as a distance metric for image quality optimization problems. Extensive experimental results show that both SC-QI and SC-DM can very well characterize the HVS's properties of visual quality perception for local image characteristics and various distortion types, which is a distinctive merit of our methods compared with other IQA methods. As a result, both SC-QI and SC-DM have better performances with a strong consilience of global and local visual quality perception as well as with much lower computation complexity, compared with the state-of-the-art IQA methods. The MATLAB source codes of the proposed SC-QI and SC-DM are publicly available online at https://sites.google.com/site/sunghobaecv/iqa. PMID:27046873

  10. Optoelectronic complex inner product for evaluating quality of image segmentation

    NASA Astrophysics Data System (ADS)

    Power, Gregory J.; Awwal, Abdul Ahad S.

    2000-11-01

    In automatic target recognition and machine vision applications, segmentation of the images is a key step. Poor segmentation reduces recognition performance. For some imaging systems, such as MRI and Synthetic Aperture Radar (SAR), it is difficult even for humans to agree on the location of the edge that allows for segmentation. A real-time dynamic approach to determining the quality of segmentation can enable vision systems to refocus or apply appropriate algorithms to ensure high-quality segmentation for recognition. A recent approach to evaluating the quality of image segmentation uses percent-pixels-different (PPD). For some cases, PPD provides a reasonable quality evaluation, but it has a weakness in providing a measure of how well the shape of the segmentation matches the true shape. This paper introduces the complex inner product approach for providing a goodness measure for evaluating segmentation quality based on shape. The complex inner product approach is demonstrated on SAR target chips obtained from the Moving and Stationary Target Acquisition and Recognition (MSTAR) program sponsored by the Defense Advanced Research Projects Agency (DARPA) and the Air Force Research Laboratory (AFRL). The results are compared to the PPD approach. A design for an optoelectronic implementation of the complex inner product for dynamic segmentation evaluation is introduced.

  11. Nanoscopy—imaging life at the nanoscale: a Nobel Prize achievement with a bright future

    NASA Astrophysics Data System (ADS)

    Blom, Hans; Bates, Mark

    2015-10-01

    A grand scientific prize was awarded last year to three pioneering scientists, for their discovery and development of molecular ‘ON-OFF’ switching which, when combined with optical imaging, can be used to see the previously invisible with light microscopy. The Royal Swedish Academy of Science announced on October 8th their decision and explained that this achievement—rooted in physics and applied in biology and medicine—was awarded with the Nobel Prize in Chemistry for controlling fluorescent molecules to create images of specimens smaller than anything previously observed with light. The story of how this noble switch in optical microscopy was achieved and how it was engineered to visualize life at the nanoscale is highlighted in this invited comment.

  12. An electron beam imaging system for quality assurance in IORT

    NASA Astrophysics Data System (ADS)

    Casali, F.; Rossi, M.; Morigi, M. P.; Brancaccio, R.; Paltrinieri, E.; Bettuzzi, M.; Romani, D.; Ciocca, M.; Tosi, G.; Ronsivalle, C.; Vignati, M.

    2004-01-01

    Intraoperative radiation therapy is a special radiotherapy technique, which enables a high dose of radiation to be given in a single fraction during oncological surgery. The major stumbling block to the large-scale application of the technique is the transfer of the patient, with an open wound, from the operating room to the radiation therapy bunker, with the consequent organisational problems and the increased risk of infection. To overcome these limitations, in the last few years a new kind of linear accelerator, the Novac 7, conceived for direct use in the surgical room, has become available. Novac 7 can deliver electron beams of different energies (3, 5, 7 and 9 MeV), with a high dose rate (up to 20 Gy/min). The aim of this work, funded by ENEA in the framework of a research contract, is the development of an innovative system for on-line measurements of 2D dose distributions and electron beam characterisation, before radiotherapy treatment with Novac 7. The system is made up of the following components: (a) an electron-light converter; (b) a 14 bit cooled CCD camera; (c) a personal computer with ad hoc software for image acquisition and processing. The performance of the prototype has been characterised experimentally with different electron-light converters. Several tests have concerned the assessment of the detector response as a function of impulse number and electron beam energy. Finally, the experimental results concerning beam profiles have been compared with data acquired with other dosimetric techniques. The achieved results show that the developed system is suitable for fast quality assurance measurements and verification of 2D dose distributions.

  13. Classroom quality as a predictor of first graders' time in non-instructional activities and literacy achievement.

    PubMed

    McLean, Leigh; Sparapani, Nicole; Toste, Jessica R; Connor, Carol McDonald

    2016-06-01

    This study investigated how quality of the classroom learning environment influenced first grade students' (n=533) time spent in two non-instructional classroom activities (off-task and in transition) and their subsequent literacy outcomes. Hierarchical linear modeling revealed that higher classroom quality was related to higher student performance in reading comprehension and expressive vocabulary. Further, classroom quality predicted the amount of time students spent off-task and in transitions in the classroom, with slopes of change across the year particularly impacted. Mediation effects were detected in the case of expressive vocabulary such that the influence of classroom quality on students' achievement operated through students' time spent in these non-instructional activities. Results highlight the importance of overall classroom quality to how students navigate the classroom environment during learning opportunities, with subsequent literacy achievement impacted. Implications for policy and educational practices are discussed. PMID:27268569

  14. Achieving thermography with a thermal security camera using uncooled amorphous silicon microbolometer image sensors

    NASA Astrophysics Data System (ADS)

    Wang, Yu-Wei; Tesdahl, Curtis; Owens, Jim; Dorn, David

    2012-06-01

    Advancements in uncooled microbolometer technology over the last several years have opened up many commercial applications which had previously been cost prohibitive. Thermal technology is no longer limited to the military and government market segments. One type of thermal sensor with low NETD which is available in the commercial market segment is the uncooled amorphous silicon (α-Si) microbolometer image sensor. Typical thermal security cameras focus on providing the best image quality by auto tonemapping (contrast enhancing) the image, which provides the best contrast depending on the temperature range of the scene. While this may provide enough information to detect objects and activities, there are further benefits to being able to estimate the actual object temperatures in a scene. This thermographic ability can provide functionality beyond typical security cameras by making it possible to monitor processes. Example applications of thermography[2] with a thermal camera include monitoring electrical circuits, industrial machinery, building thermal leaks, oil/gas pipelines, power substations, etc.[3][5] This paper discusses the methodology of estimating object temperatures by characterizing/calibrating different components inside a thermal camera utilizing an uncooled amorphous silicon microbolometer image sensor. Plots of system performance across camera operating temperatures will be shown.
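    The count-to-temperature characterisation described above amounts to fitting a radiometric response for the sensor. A minimal two-point calibration sketch, in which all counts and blackbody temperatures are made up for illustration (real cameras also correct for FPA temperature drift and per-pixel gain/offset non-uniformity):

```python
import numpy as np

# Hypothetical raw counts observed against two blackbody references
counts = np.array([7200.0, 9800.0])   # microbolometer counts
temps_c = np.array([20.0, 45.0])      # blackbody temperatures, deg C

# A linear response is assumed over this narrow temperature range
gain = (temps_c[1] - temps_c[0]) / (counts[1] - counts[0])
offset = temps_c[0] - gain * counts[0]

def counts_to_temp(c):
    """Estimate scene temperature (deg C) from raw sensor counts."""
    return gain * c + offset

print(counts_to_temp(8500.0))
```

More calibration points, or a per-operating-temperature lookup table, would refine the fit across the camera's operating range, which is what the plots mentioned in the abstract characterise.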

  15. Effects of display rendering on HDR image quality assessment

    NASA Astrophysics Data System (ADS)

    Zerman, Emin; Valenzise, Giuseppe; De Simone, Francesca; Banterle, Francesco; Dufaux, Frederic

    2015-09-01

    High dynamic range (HDR) displays use local backlight modulation to produce both high brightness levels and large contrast ratios. Thus, the display rendering algorithm and its parameters may greatly affect HDR visual experience. In this paper, we analyze the impact of display rendering on perceived quality for a specific display (SIM2 HDR47) and for a popular application scenario, i.e., HDR image compression. To this end, we assess whether significant differences exist between subjective quality of compressed images, when these are displayed using either the built-in rendering of the display, or a rendering algorithm developed by ourselves. As a second contribution of this paper, we investigate whether the possibility to estimate the true pixel-wise luminance emitted by the display, offered by our rendering approach, can improve the performance of HDR objective quality metrics that require true pixel-wise luminance as input.

  16. Full-Reference Image Quality Assessment with Linear Combination of Genetically Selected Quality Measures

    PubMed Central

    2016-01-01

    Information carried by an image can be distorted by the different image processing steps introduced by different electronic means of storage and communication. Therefore, the development of algorithms which can automatically assess the quality of an image in a way that is consistent with human evaluation is important. In this paper, an approach to image quality assessment (IQA) is proposed in which the quality of a given image is evaluated jointly by several IQA approaches. At first, in order to obtain such joint models, an optimisation problem of IQA measure aggregation is defined, where a weighted sum of their outputs, i.e., objective scores, is used as the aggregation operator. Then, the weight of each measure is considered as a decision variable in a problem of minimisation of the root mean square error between obtained objective scores and subjective scores. Subjective scores reflect ground truth and involve the evaluation of images by human observers. The optimisation problem is solved using a genetic algorithm, which also selects the suitable measures used in aggregation. The obtained multimeasures are evaluated on the four largest widely used image benchmarks and compared against state-of-the-art full-reference IQA approaches. The results of the comparison reveal that the proposed approach outperforms the other competing measures. PMID:27341493

  17. Full-Reference Image Quality Assessment with Linear Combination of Genetically Selected Quality Measures.

    PubMed

    Oszust, Mariusz

    2016-01-01

    Information carried by an image can be distorted by the different image processing steps introduced by different electronic means of storage and communication. Therefore, the development of algorithms which can automatically assess the quality of an image in a way that is consistent with human evaluation is important. In this paper, an approach to image quality assessment (IQA) is proposed in which the quality of a given image is evaluated jointly by several IQA approaches. At first, in order to obtain such joint models, an optimisation problem of IQA measure aggregation is defined, where a weighted sum of their outputs, i.e., objective scores, is used as the aggregation operator. Then, the weight of each measure is considered as a decision variable in a problem of minimisation of the root mean square error between obtained objective scores and subjective scores. Subjective scores reflect ground truth and involve the evaluation of images by human observers. The optimisation problem is solved using a genetic algorithm, which also selects the suitable measures used in aggregation. The obtained multimeasures are evaluated on the four largest widely used image benchmarks and compared against state-of-the-art full-reference IQA approaches. The results of the comparison reveal that the proposed approach outperforms the other competing measures. PMID:27341493
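    The aggregation operator at the heart of this approach is a weighted sum of measure outputs fitted against subjective scores. A minimal sketch, with an ordinary least-squares fit standing in for the paper's genetic search (which additionally selects which measures to keep) and all scores invented for illustration:

```python
import numpy as np

# Rows: images; columns: objective scores from three hypothetical IQA measures
objective = np.array([
    [0.91, 0.88, 0.95],
    [0.40, 0.52, 0.47],
    [0.75, 0.70, 0.80],
    [0.20, 0.25, 0.22],
    [0.60, 0.66, 0.58],
])
subjective = np.array([4.6, 2.1, 3.8, 1.0, 3.0])  # mean opinion scores

def rmse(weights, scores, mos):
    """Root mean square error of the weighted-sum aggregation."""
    return float(np.sqrt(np.mean((scores @ weights - mos) ** 2)))

# Fit the weights that minimise the RMSE objective
weights, *_ = np.linalg.lstsq(objective, subjective, rcond=None)

uniform = np.full(3, 1.0 / 3.0)
print(rmse(weights, objective, subjective)
      <= rmse(uniform, objective, subjective))
```

A genetic algorithm explores the same objective but with measure selection built in, which matters when some candidate measures contribute nothing but noise.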

  18. Structural similarity analysis for brain MR image quality assessment

    NASA Astrophysics Data System (ADS)

    Punga, Mirela Visan; Moldovanu, Simona; Moraru, Luminita

    2014-11-01

    Brain MR images are affected and distorted by various artifacts such as noise, blur, blotching, downsampling or compression, as well as by inhomogeneity. Usually, the performance of a pre-processing operation is quantified using quality metrics such as the mean squared error and its related metrics, such as peak signal-to-noise ratio, root mean squared error and signal-to-noise ratio. The main drawback of these metrics is that they fail to take the structural fidelity of the image into account. For this reason, we investigated the structural changes related to luminance and contrast variation (as non-structural distortions) and to the denoising process (as structural distortion) through an alternative metric based on structural changes, in order to obtain the best image quality.
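    A structural metric of the kind investigated here compares luminance, contrast and structure rather than raw pixel errors. A global (single-window) SSIM-style sketch, assuming 8-bit images and the usual stabilising constants:

```python
import numpy as np

def global_ssim(x, y, peak=255.0):
    """Single-window SSIM-style index: compares means (luminance),
    variances (contrast) and covariance (structure) of two images."""
    x, y = x.astype(float), y.astype(float)
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cxy + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(3)
img = rng.integers(0, 256, size=(64, 64))
noisy = np.clip(img + rng.normal(0, 25, size=img.shape), 0, 255)
print(global_ssim(img, img), global_ssim(img, noisy))
```

The standard SSIM computes this index over small sliding windows and averages the results; the global version above only illustrates the three structural terms that MSE-family metrics ignore.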

  19. Improving Image Quality of Bronchial Arteries with Virtual Monochromatic Spectral CT Images

    PubMed Central

    Ma, Guangming; He, Taiping; Yu, Yong; Duan, Haifeng; Yang, Chuangbo

    2016-01-01

    Objective To evaluate the clinical value of using monochromatic images in spectral CT pulmonary angiography to improve the image quality of bronchial arteries. Methods We retrospectively analyzed the chest CT images of 38 patients who underwent contrast-enhanced spectral CT. These images included a set of 140kVp polychromatic images and the default 70keV monochromatic images. Using the standard Gemstone Spectral Imaging (GSI) viewer on an advanced workstation (AW4.6, GE Healthcare), an optimal energy level (in keV) for obtaining the best contrast-to-noise ratio (CNR) for the artery could be automatically obtained. The signal-to-noise ratio (SNR), CNR and subjective image quality score (1–5) for these 3 image sets (140kVp, 70keV and optimal energy level) were obtained and statistically compared. The image quality score consistency between the two observers was also evaluated using the Kappa test. Results The optimal energy level for obtaining the best CNR was 62.58±2.74 keV. SNR and CNR from the 140kVp polychromatic, 70keV and optimal keV monochromatic images were (16.44±5.85, 13.24±5.52), (20.79±7.45, 16.69±6.27) and (24.9±9.91, 20.53±8.46), respectively. The corresponding subjective image quality scores were 1.97±0.82, 3.24±0.75, and 4.47±0.60. SNR, CNR and subjective scores differed significantly among groups (all p<0.001). The optimal keV monochromatic images were superior to the 70keV monochromatic and 140kVp polychromatic images, and there was high agreement between the two observers on image quality score (kappa>0.80). Conclusions Virtual monochromatic images at approximately 63keV in dual-energy spectral CT pulmonary angiography yielded the best CNR and highest diagnostic confidence for imaging bronchial arteries. PMID:26967737
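    The SNR and CNR figures reported here follow common ROI-based definitions; a minimal sketch with synthetic regions (the paper's exact definitions, e.g. the choice of noise region, may differ):

```python
import numpy as np

def snr(roi):
    """Signal-to-noise ratio of a region of interest."""
    return float(roi.mean() / roi.std())

def cnr(roi, background):
    """Contrast-to-noise ratio: ROI/background contrast over background noise."""
    return float(abs(roi.mean() - background.mean()) / background.std())

rng = np.random.default_rng(4)
artery = rng.normal(300.0, 10.0, size=1000)   # enhanced vessel, HU (synthetic)
muscle = rng.normal(60.0, 10.0, size=1000)    # adjacent background, HU (synthetic)
print(snr(artery), cnr(artery, muscle))
```

Lowering the monochromatic energy towards the iodine K-edge raises vessel attenuation (the ROI mean) faster than the noise, which is why an optimum near 63 keV maximises CNR in this study.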

  20. Process Dimensions of Child Care Quality and Academic Achievement: An Instrumental Variables Analysis

    ERIC Educational Resources Information Center

    Auger, Anamarie; Farkas, George; Duncan, Greg; Burchinal, Peg; Vandell, Deborah Lowe

    2012-01-01

    Child care quality is usually measured along two dimensions--structural and process. In this paper the authors focus on process quality--the quality of child care center instructional practices and teacher interactions with students. They use an instrumental variables technique to estimate the effect of child care center process quality on…

  1. Improving Service Quality: Achieving High Performance in the Public and Private Sectors.

    ERIC Educational Resources Information Center

    Milakovich, Michael E.

    Quality-improvement principles are a sound means to respond to customer needs. However, when various quality and productivity theories and methods are applied, it is very difficult to consistently deliver quality results, especially in quasi-monopolistic, non-competitive, and regulated environments. This book focuses on quality-improvement methods…

  2. Image quality specification and maintenance for airborne SAR

    NASA Astrophysics Data System (ADS)

    Clinard, Mark S.

    2004-08-01

    Specification, verification, and maintenance of image quality over the lifecycle of an operational airborne SAR begin with the specification for the system itself. Verification of image quality-oriented specification compliance can be enhanced by including a specification requirement that a vendor provide appropriate imagery at the various phases of the system life cycle. The nature and content of the imagery appropriate for each stage of the process depends on the nature of the test, the economics of collection, and the availability of techniques to extract the desired information from the data. At the earliest lifecycle stages, Concept and Technology Development (CTD) and System Development and Demonstration (SDD), the test set could include simulated imagery to demonstrate the mathematical and engineering concepts being implemented thus allowing demonstration of compliance, in part, through simulation. For Initial Operational Test and Evaluation (IOT&E), imagery collected from precisely instrumented test ranges and targets of opportunity consisting of a priori or a posteriori ground-truthed cultural and natural features are of value to the analysis of product quality compliance. Regular monitoring of image quality is possible using operational imagery and automated metrics; more precise measurements can be performed with imagery of instrumented scenes, when available. A survey of image quality measurement techniques is presented along with a discussion of the challenges of managing an airborne SAR program with the scarce resources of time, money, and ground-truthed data. Recommendations are provided that should allow an improvement in the product quality specification and maintenance process with a minimal increase in resource demands on the customer, the vendor, the operational personnel, and the asset itself.

  3. Imaging quality automated measurement of image intensifier based on orthometric phase-shifting gratings.

    PubMed

    Sun, Song; Cao, Yiping

    2016-06-01

    A method for automatically measuring the imaging quality parameters of an image intensifier based on orthometric phase-shifting gratings (OPSG) is proposed. Two sets of phase-shifting gratings, one with a fringe direction at 45° and the other at 135°, are successively projected onto the input port of the image intensifier, and the corresponding deformed patterns modulated by the measured image intensifier on its output port are captured with a CCD camera. Two phases are retrieved from these two sets of deformed patterns by a phase-measuring algorithm. By building the relationship between these retrieved phases, the referential fringe period can be determined accurately. Meanwhile, the distorted phase distribution introduced by the image intensifier can also be efficiently separated wherein the subtle imaging quality information can be further decomposed. Subsequently, the magnification of the image intensifier is successfully measured by fringe period self-calibration. The experimental results have shown the feasibility of the proposed method, which can automatically measure the multiple imaging quality parameters of an image intensifier without human intervention. PMID:27411191

  4. Imaging through turbid media via sparse representation: imaging quality comparison of three projection matrices

    NASA Astrophysics Data System (ADS)

    Shao, Xiaopeng; Li, Huijuan; Wu, Tengfei; Dai, Weijia; Bi, Xiangli

    2015-05-01

    The incident light is scattered away due to the inhomogeneity of the refractive index in many materials, which greatly reduces the imaging depth and degrades the imaging quality. Many exciting methods have been presented in recent years for solving this problem and realizing imaging through a highly scattering medium, such as wavefront modulation and reconstruction techniques. An imaging method based on compressed sensing (CS) theory can decrease the computational complexity because it does not require the whole speckle pattern to realize reconstruction. One of the key premises of this method is that the object is sparse or can be sparsely represented. However, choosing a proper projection matrix is very important to the imaging quality. In this paper, we show that the transmission matrix (TM) of a scattering medium obeys a circular Gaussian distribution, which makes it possible to use a scattering medium as the measurement matrix in CS theory. In order to verify the performance of this method, a whole optical system is simulated. Various projection matrices are introduced to make the object sparse, including the fast Fourier transform (FFT) basis, the discrete cosine transform (DCT) basis and the discrete wavelet transform (DWT) basis, and their imaging performances are compared comprehensively. Simulation results show that for most targets, applying the discrete wavelet transform basis obtains an image of good quality. This work can be applied to biomedical imaging and used to develop real-time imaging through highly scattering media.
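    The measurement model behind this method is y = Ax, with the scattering medium's transmission matrix playing the role of A and a sparse object x. A minimal sketch of measurement and recovery, using a real-valued Gaussian matrix as a stand-in for the circular-Gaussian TM and orthogonal matching pursuit as a generic CS solver (not necessarily the reconstruction algorithm of the paper):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x from y = A @ x."""
    residual, support, coef = y.copy(), [], np.zeros(0)
    for _ in range(k):
        # pick the column most correlated with the current residual
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(5)
n, m, k = 128, 64, 3                         # object size, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)     # Gaussian stand-in for the TM
x = np.zeros(n)
x[[5, 40, 77]] = [1.0, -0.7, 0.5]            # sparse object
x_hat = omp(A, A @ x, k)
print(np.allclose(x, x_hat, atol=1e-8))
```

For an object that is not sparse in the pixel domain, x would first be expressed in a sparsifying basis (FFT, DCT or DWT, as compared in the paper) before recovery.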

  5. Comprehensive quality assurance phantom for cardiovascular imaging systems

    NASA Astrophysics Data System (ADS)

    Lin, Pei-Jan P.

    1998-07-01

    With the advent of high heat loading capacity x-ray tubes, high frequency inverter type generators, and the use of spectral shaping filters, the automatic brightness/exposure control (ABC) circuit logic employed in the new generation of angiographic imaging equipment has been significantly reprogrammed. These new angiographic imaging systems are designed to take advantage of the power train capabilities to yield higher contrast images while maintaining, or lowering, the patient exposure. Since the emphasis of the imaging system design has been significantly altered, the system performance parameters of interest and the phantoms employed for quality assurance must also change in order to properly evaluate the imaging capability of cardiovascular imaging systems. A quality assurance (QA) phantom has been under development at this institution and was submitted to various interested organizations, such as the American Association of Physicists in Medicine (AAPM), the Society for Cardiac Angiography & Interventions (SCA&I), and the National Electrical Manufacturers Association (NEMA), for their review and input. At the same time, in an effort to establish a unified standard phantom design for cardiac catheterization laboratories (CCL), SCA&I and NEMA formed a joint work group in early 1997 to develop a suitable phantom. The initial QA phantom design has since been accepted by the SCA&I-NEMA Joint Work Group (JWG) to serve as the base phantom from which a comprehensive QA phantom is being developed.

  6. Investigation of grid performance using simple image quality tests

    PubMed Central

    Bor, Dogan; Birgul, Ozlem; Onal, Umran; Olgar, Turan

    2016-01-01

    Antiscatter grids improve X-ray image contrast at the cost of increased patient radiation dose. The choice of an appropriate grid, or its removal, requires a good knowledge of grid characteristics, especially for pediatric digital imaging. The aim of this work is to understand the relation between grid performance parameters and some numerical image quality metrics for digital radiological examinations. The grid parameters, such as the Bucky factor (BF), selectivity (Σ), contrast improvement factor (CIF), and signal-to-noise improvement factor (SIF), were determined following measurements of primary, scatter, and total radiation with a digital fluoroscopic system for thicknesses of 5, 10, 15, 20, and 25 cm of polymethyl methacrylate blocks at tube voltages of 70, 90, and 120 kVp. Image contrast for low- and high-contrast objects and high-contrast spatial resolution were measured with simple phantoms using the same scatter thicknesses and tube voltages. BF and SIF values were also calculated from the images obtained with and without grids. The BF values obtained using the two approaches (grid parameters and image quality metrics) were in good agreement, with high correlation coefficients. The proposed approach provides a quick and practical way of estimating grid performance for different digital fluoroscopic examinations. PMID:27051166
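    The grid figures of merit named above follow standard definitions in terms of primary and scatter transmission; a minimal sketch (the transmission values are illustrative, not the paper's measurements):

```python
def grid_parameters(t_primary, t_scatter, scatter_fraction):
    """Antiscatter-grid figures of merit from transmission measurements.
    t_primary, t_scatter: grid transmission of primary / scattered radiation.
    scatter_fraction: scatter-to-total ratio of the beam incident on the grid."""
    t_total = t_primary * (1 - scatter_fraction) + t_scatter * scatter_fraction
    bucky = 1.0 / t_total                   # BF: dose penalty of using the grid
    selectivity = t_primary / t_scatter     # Sigma
    cif = t_primary / t_total               # contrast improvement factor
    sif = t_primary / t_total ** 0.5        # signal-to-noise improvement factor
    return bucky, selectivity, cif, sif

bf, sigma, cif, sif = grid_parameters(0.7, 0.1, 0.5)
print(bf, sigma, cif, sif)
```

Thicker phantoms raise the scatter fraction, which is why the measured BF and SIF in the study vary with both block thickness and tube voltage.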

  7. A study of image quality for radar image processing. [synthetic aperture radar imagery

    NASA Technical Reports Server (NTRS)

    King, R. W.; Kaupp, V. H.; Waite, W. P.; Macdonald, H. C.

    1982-01-01

    Methods developed for image quality metrics are reviewed, with focus on basic interpretation or recognition elements including tone or color, shape, pattern, size, shadow, texture, site, association or context, and resolution. Seven metrics are believed to show promise as ways of characterizing the quality of an image: (1) the dynamic range of intensities in the displayed image; (2) the system signal-to-noise ratio; (3) the system spatial bandwidth or bandpass; (4) the system resolution or acutance; (5) the normalized mean-square error as a measure of geometric fidelity; (6) the perceptual mean-square error; and (7) the radar threshold quality factor. Selected levels of degradation are being applied to simulated synthetic aperture radar images to test the validity of these metrics.
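    Of the seven metrics, the normalized mean-square error (metric 5) is the simplest to state concretely. A sketch of one common definition (the paper's exact normalization is not given here, so this is an assumption): the squared error between reference and degraded images, normalized by the reference energy.

```python
import numpy as np

def nmse(reference, degraded):
    """Normalized mean-square error between a reference image and a
    degraded version: 0 means identical, larger means less faithful."""
    ref = np.asarray(reference, dtype=float)
    deg = np.asarray(degraded, dtype=float)
    return float(np.sum((ref - deg) ** 2) / np.sum(ref ** 2))
```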

  8. New opportunities for quality enhancing of images captured by passive THz camera

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2014-10-01

    As is well known, a passive THz camera can reveal concealed objects without physical contact and poses no radiation hazard to the person being screened. The camera's usefulness depends on its temperature resolution, which determines the minimum detectable object size, the maximum detection distance, and the image quality. Computer processing of THz images can improve image quality many times over without additional engineering effort, so the development of modern computer codes for THz imaging is a pressing problem. With appropriate new methods, one may achieve a temperature resolution sufficient to detect a banknote in a person's pocket without any physical contact. Modern algorithms for computer processing of THz images also make it possible to see objects inside the human body from their temperature trace on the skin, which substantially broadens the applicability of passive THz cameras to counterterrorism problems. We demonstrate the detection capabilities achieved to date for both concealed objects and clothing components through computer processing of images captured by passive THz cameras from several manufacturers. A further result discussed in the paper is the observation of both THz radiation emitted by an incandescent lamp and an image reflected from a ceramic floor plate. We consider images produced by passive THz cameras manufactured by Microsemi Corp., ThruVision Corp., and Capital Normal University (Beijing, China). All algorithms for computer processing of the THz images considered in this paper were developed by the Russian authors. Keywords: THz wave, passive imaging camera, computer processing, security screening, concealed and forbidden objects, reflected image, hand seeing, banknote seeing, ceramic floorplate, incandescent lamp.

  9. [An improved medical image fusion algorithm and quality evaluation].

    PubMed

    Chen, Meiling; Tao, Ling; Qian, Zhiyu

    2009-08-01

    Medical image fusion is of great value in medical image analysis and diagnosis. In this paper, the conventional method of wavelet fusion is improved and a new algorithm for medical image fusion is presented, in which the high-frequency and low-frequency coefficients are treated separately. When high-frequency coefficients are chosen, the regional edge intensities of each sub-image are calculated to realize adaptive fusion. The choice of low-frequency coefficients is based on the edges of the images, so that the fused image preserves all useful information and appears more distinct. We apply the conventional and the improved fusion algorithms based on the wavelet transform to fuse two images of the human body, and we evaluate the fusion results with a quality evaluation method. Experimental results show that this algorithm effectively retains the detail information of the original images and enhances their edge and texture features. The new algorithm outperforms the conventional fusion algorithm based on the wavelet transform. PMID:19813594
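    The overall structure of wavelet-domain fusion can be illustrated with a one-level Haar transform. This is a simplification: the paper's algorithm selects high-frequency coefficients by regional edge intensity and applies an edge-based rule to the low frequencies, whereas the sketch below uses the conventional average (approximation) and max-absolute (detail) rules, and Haar in place of whatever wavelet the authors used.

```python
import numpy as np

def haar2(x):
    """One-level 2D Haar transform of an even-sized float array."""
    a = (x[0::2] + x[1::2]) / 2.0          # row-wise lowpass
    d = (x[0::2] - x[1::2]) / 2.0          # row-wise highpass
    def cols(y):
        return (y[:, 0::2] + y[:, 1::2]) / 2.0, (y[:, 0::2] - y[:, 1::2]) / 2.0
    LL, LH = cols(a)
    HL, HH = cols(d)
    return LL, LH, HL, HH

def ihaar2(LL, LH, HL, HH):
    """Exact inverse of haar2."""
    def icols(lo, hi):
        y = np.empty((lo.shape[0], lo.shape[1] * 2))
        y[:, 0::2] = lo + hi
        y[:, 1::2] = lo - hi
        return y
    a = icols(LL, LH)
    d = icols(HL, HH)
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2] = a + d
    x[1::2] = a - d
    return x

def fuse(img1, img2):
    """Fuse two images: average the approximations, keep the detail
    coefficient with the larger magnitude at each position."""
    c1, c2 = haar2(img1), haar2(img2)
    LL = (c1[0] + c2[0]) / 2.0
    details = [np.where(np.abs(a) >= np.abs(b), a, b)
               for a, b in zip(c1[1:], c2[1:])]
    return ihaar2(LL, *details)
```

    Because the Haar pair here is perfectly invertible, fusing an image with itself reproduces the original exactly, which is a useful sanity check for any fusion rule.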

  10. Evaluation of image quality of a new CCD-based system for chest imaging

    NASA Astrophysics Data System (ADS)

    Sund, Patrik; Kheddache, Susanne; Mansson, Lars G.; Bath, Magnus; Tylen, Ulf

    2000-04-01

    The Imix radiography system (Oy Imix Ab, Finland) consists of an intensifying screen, optics, and a CCD camera. An upgrade of this system (Imix 2000) with a red-emitting screen and new optics has recently been released. The image quality of Imix (original version), Imix 2000, and two storage-phosphor systems, Fuji FCR 9501 and Agfa ADC70, was evaluated in physical terms (DQE) and with visual grading of the visibility of anatomical structures in clinical images (141 kV). PA chest images of 50 healthy volunteers were evaluated by experienced radiologists. All images were evaluated on Siemens Simomed monitors, using the European Quality Criteria. The maximum DQE values for Imix, Imix 2000, Agfa, and Fuji were 11%, 14%, 17%, and 19%, respectively (141 kV, 5 μGy). In the visual grading, the observers rated the systems in the following descending order: Fuji, Imix 2000, Agfa, and Imix. Thus, the upgrade to Imix 2000 resulted in higher DQE values and a significant improvement in clinical image quality. The visual grading agrees reasonably well with the DQE results; however, Imix 2000 received a better score than could be expected from the DQE measurements. Keywords: CCD Technique, Chest Imaging, Digital Radiography, DQE, Image Quality, Visual Grading Analysis

  11. Telemedicine + OCT: toward design of optimized algorithms for high-quality compressed images

    NASA Astrophysics Data System (ADS)

    Mousavi, Mahta; Lurie, Kristen; Land, Julian; Javidi, Tara; Ellerbee, Audrey K.

    2014-03-01

    Telemedicine is an emerging technology that aims to provide clinical healthcare at a distance. Among its goals, the transfer of diagnostic images over telecommunication channels has been quite appealing to the medical community. When viewed as an adjunct to biomedical device hardware, one highly important consideration aside from the transfer rate and speed is the accuracy of the reconstructed image at the receiver end. Although optical coherence tomography (OCT) is an established imaging technique that is ripe for telemedicine, the effects of OCT data compression, which may be necessary on certain telemedicine platforms, have not received much attention in the literature. We investigate the performance and efficiency of several lossless and lossy compression techniques for OCT data and characterize their effectiveness with respect to achievable compression ratio, compression rate and preservation of image quality. We examine the effects of compression in the interferogram vs. A-scan domain as assessed with various objective and subjective metrics.
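    The lossless-versus-lossy trade-off studied here can be illustrated with a toy pipeline. This is an assumption-laden sketch, not the authors' codecs: zlib stands in for a generic lossless coder, and bit-depth truncation for a crude lossy step, with PSNR as the fidelity metric.

```python
import zlib
import numpy as np

def lossless_ratio(data):
    """Compression ratio achieved by a generic lossless coder (zlib)."""
    raw = data.tobytes()
    return len(raw) / len(zlib.compress(raw, 9))

def truncate_bits(data, keep_bits):
    """Crude lossy step: keep only the `keep_bits` most significant bits."""
    shift = data.dtype.itemsize * 8 - keep_bits
    return ((data >> shift) << shift).astype(data.dtype)

def psnr(ref, test, peak):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```

    Truncating low-order (noise-dominated) bits typically raises the achievable lossless ratio, at a PSNR cost that can be measured per dataset, which is the kind of curve such a study characterizes.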

  12. Comparison of clinical and physical measures of image quality in chest and pelvis computed radiography at different tube voltages

    SciTech Connect

    Sandborg, Michael; Tingberg, Anders; Ullman, Gustaf; Dance, David R.; Alm Carlsson, Gudrun

    2006-11-15

    The aim of this work was to study the dependence of image quality in digital chest and pelvis radiography on tube voltage, and to explore correlations between clinical and physical measures of image quality. The effect on image quality of tube voltage in these two examinations was assessed using two methods. The first method relies on radiologists' observations of images of an anthropomorphic phantom, and the second method was based on computer modeling of the imaging system using an anthropomorphic voxel phantom. The tube voltage was varied within a broad range (50-150 kV), including those values typically used with screen-film radiography. The tube charge was altered so that the same effective dose was achieved for each projection. Two x-ray units were employed using a computed radiography (CR) image detector with standard tube filtration and antiscatter device. Clinical image quality was assessed by a group of radiologists using a visual grading analysis (VGA) technique based on the revised CEC image criteria. Physical image quality was derived from a Monte Carlo computer model in terms of the signal-to-noise ratio, SNR, of anatomical structures corresponding to the image criteria. Both the VGAS (visual grading analysis score) and SNR decrease with increasing tube voltage in both chest PA and pelvis AP examinations, indicating superior performance if lower tube voltages are employed. Hence, a positive correlation between clinical and physical measures of image quality was found. The pros and cons of using lower tube voltages with CR digital radiography than typically used in analog screen-film radiography are discussed, as well as the relevance of using VGAS and quantum-noise SNR as measures of image quality in pelvis and chest radiography.

  13. Magnetic Resonance Imaging (MRI) Analysis of Fibroid Location in Women Achieving Pregnancy After Uterine Artery Embolization

    SciTech Connect

    Walker, Woodruff J.; Bratby, Mark John

    2007-09-15

    The purpose of this study was to evaluate the fibroid morphology in a cohort of women achieving pregnancy following treatment with uterine artery embolization (UAE) for symptomatic uterine fibroids. A retrospective review of magnetic resonance imaging (MRI) of the uterus was performed to assess pre-embolization fibroid morphology. Data were collected on fibroid size, type, and number and included analysis of follow-up imaging to assess response. There have been 67 pregnancies in 51 women, with 40 live births. Intramural fibroids were seen in 62.7% of the women (32/48). Of these the fibroids were multiple in 16. A further 12 women had submucosal fibroids, with equal numbers of types 1 and 2. Two of these women had coexistent intramural fibroids. In six women the fibroids could not be individually delineated and formed a complex mass. All subtypes of fibroid were represented in those subgroups of women achieving a live birth versus those who did not. These results demonstrate that the location of uterine fibroids did not adversely affect subsequent pregnancy in the patient population investigated. Although this is only a small qualitative study, it does suggest that all types of fibroids treated with UAE have the potential for future fertility.

  14. Scanner-based image quality measurement system for automated analysis of EP output

    NASA Astrophysics Data System (ADS)

    Kipman, Yair; Mehta, Prashant; Johnson, Kate

    2003-12-01

    Inspection of electrophotographic print cartridge quality and compatibility requires analysis of hundreds of pages on a wide population of printers and copiers. Although print quality inspection is often achieved through the use of anchor prints and densitometry, more comprehensive analysis and quantitative data is desired for performance tracking, benchmarking and failure mode analysis. Image quality measurement systems range in price and performance, image capture paths and levels of automation. In order to address the requirements of a specific application, careful consideration was made to print volume, budgetary limits, and the scope of the desired image quality measurements. A flatbed scanner-based image quality measurement system was selected to support high throughput, maximal automation, and sufficient flexibility for both measurement methods and image sampling rates. Using an automatic document feeder (ADF) for sample management, a half ream of prints can be measured automatically without operator intervention. The system includes optical character recognition (OCR) for automatic determination of target type for measurement suite selection. This capability also enables measurement of mixed stacks of targets since each sample is identified prior to measurement. In addition, OCR is used to read toner ID, machine ID, print count, and other pertinent information regarding the printing conditions and environment. This data is saved to a data file along with the measurement results for complete test documentation. Measurement methods were developed to replace current methods of visual inspection and densitometry. The features that were being analyzed visually could be addressed via standard measurement algorithms. Measurement of density proved to be less simple since the scanner is not a densitometer and anything short of an excellent estimation would be meaningless. In order to address the measurement of density, a transfer curve was built to translate the

  15. A three-dimensional statistical approach to improved image quality for multislice helical CT.

    PubMed

    Thibault, Jean-Baptiste; Sauer, Ken D; Bouman, Charles A; Hsieh, Jiang

    2007-11-01

    Multislice helical computed tomography scanning offers the advantages of faster acquisition and wide organ coverage for routine clinical diagnostic purposes. However, image reconstruction is faced with the challenges of three-dimensional cone-beam geometry, data completeness issues, and low dosage. Of all available reconstruction methods, statistical iterative reconstruction (IR) techniques appear particularly promising since they provide the flexibility of accurate physical noise modeling and geometric system description. In this paper, we present the application of Bayesian iterative algorithms to real 3D multislice helical data to demonstrate significant image quality improvement over conventional techniques. We also introduce a novel prior distribution designed to provide flexibility in its parameters to fine-tune image quality. Specifically, enhanced image resolution and lower noise have been achieved, concurrently with the reduction of helical cone-beam artifacts, as demonstrated by phantom studies. Clinical results also illustrate the capabilities of the algorithm on real patient data. Although computational load remains a significant challenge for practical development, superior image quality combined with advancements in computing technology make IR techniques a legitimate candidate for future clinical applications. PMID:18072519

  16. Dosimetry and image quality in digital mammography facilities in the State of Minas Gerais, Brazil

    NASA Astrophysics Data System (ADS)

    da Silva, Sabrina Donato; Joana, Geórgia Santos; Oliveira, Bruno Beraldo; de Oliveira, Marcio Alves; Leyton, Fernando; Nogueira, Maria do Socorro

    2015-11-01

    According to the National Register of Health Care Facilities (CNES), there are approximately 477 mammography systems operating in the state of Minas Gerais, Brazil, of which an estimated 200 are digital units using mainly computed radiography (CR) or direct radiography (DR) systems. Mammography is irreplaceable in the diagnosis and early detection of breast cancer, the leading cause of cancer death among women worldwide. A high standard of image quality, alongside smaller doses and optimization of procedures, is essential for early detection. This study aimed to determine dosimetry and image quality in 68 mammography services in Minas Gerais using CR or DR systems. The data for this study were collected between 2011 and 2013. The contrast-to-noise ratio proved to be a critical point in the image production chain of digital systems, since 90% of services were non-compliant in this regard, mainly for larger PMMA thicknesses (60 and 70 mm). Regarding image noise, only 31% were compliant. The average glandular dose found is of concern, since more than half of the services presented doses above acceptable limits. Therefore, despite the potential benefits of CR and DR systems, the employment of this technology has to be revised and optimized to achieve better image quality and reduce radiation dose as much as possible.

  17. An improved Gabor enhancement method for low-quality fingerprint images

    NASA Astrophysics Data System (ADS)

    Geng, Hao; Li, Jicheng; Zhou, Jinwei; Chen, Dong

    2015-10-01

    Latent fingerprints recovered from crime scenes play an important role in police investigations and in solving cases, but they typically suffer from blur, incompleteness, and low ridge contrast. Traditional fingerprint enhancement and identification methods have limitations, and the current automated fingerprint identification system (AFIS) has not been applied extensively in police investigations. Because the Gabor filter has drawbacks such as poor efficiency and imprecise extraction of ridge orientation parameters, enhancement of low-contrast fingerprint images cannot achieve the desired effect. Therefore, an improved Gabor enhancement for low-quality fingerprints is proposed in this paper. Firstly, orientation image templates with different scales were used to distinguish the orientation images in the fingerprint area, and the orientation parameters of the ridges were calculated. Secondly, mean ridge frequencies were extracted based on local windows along the ridge orientation, and mean frequency parameters of the ridges were calculated. Thirdly, the size and orientation of the Gabor filter were self-adjusted according to the local ridge orientation and mean frequency. Finally, the poor-quality fingerprint images were enhanced. In the experiments, the improved Gabor filter showed better performance on low-quality fingerprint images than traditional filtering methods.
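    The core of any Gabor-based fingerprint enhancement, including the improved variant described above, is a kernel tuned to the local ridge orientation and frequency. A minimal numpy sketch (the parameter values and the naive convolution loop are illustrative, not the paper's implementation): a filter matched to the ridge orientation passes the ridge pattern strongly, while a mismatched one suppresses it.

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, freq):
    """Real-valued Gabor kernel oriented at angle theta with spatial
    frequency freq (cycles per pixel) along the oriented axis."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return (np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
            * np.cos(2 * np.pi * freq * xr))

def enhance_block(block, theta, freq, sigma=4.0, ksize=11):
    """Filter one image block with a Gabor kernel matched to the locally
    estimated ridge orientation and frequency (naive 'same' correlation)."""
    k = gabor_kernel(ksize, sigma, theta, freq)
    half = ksize // 2
    padded = np.pad(block, half, mode="edge")
    out = np.empty(block.shape, dtype=float)
    for i in range(block.shape[0]):
        for j in range(block.shape[1]):
            out[i, j] = np.sum(padded[i:i + ksize, j:j + ksize] * k)
    return out
```

    In a full pipeline, theta and freq vary per block, which is exactly the self-adjustment step the paper describes.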

  18. Image quality and dose assessment in digital breast tomosynthesis: A Monte Carlo study

    NASA Astrophysics Data System (ADS)

    Baptista, M.; Di Maria, S.; Oliveira, N.; Matela, N.; Janeiro, L.; Almeida, P.; Vaz, P.

    2014-11-01

    Mammography is considered a standard technique for the early detection of breast cancer. However, its sensitivity is limited essentially due to the issue of the overlapping breast tissue. This limitation can be partially overcome, with a relatively new technique, called digital breast tomosynthesis (DBT). For this technique, optimization of acquisition parameters which maximize image quality, whilst complying with the ALARA principle, continues to be an area of considerable research. The aim of this work was to study the best quantum energies that optimize the image quality with the lowest achievable dose in DBT and compare these results with the digital mammography (DM) ones. Monte Carlo simulations were performed using the state-of-the-art computer program MCNPX 2.7.0 in order to generate several 2D cranio-caudal (CC) projections obtained during an acquisition of a standard DBT examination. Moreover, glandular absorbed doses and photon flux calculations, for each projection image, were performed. A homogeneous breast computational phantom with 50%/50% glandular/adipose tissue composition was used and two compressed breast thicknesses were evaluated: 4 cm and 8 cm. The simulated projection images were afterwards reconstructed with an algebraic reconstruction tool and the signal difference to noise ratio (SDNR) was calculated in order to evaluate the image quality in DBT and DM. Finally, a thorough comparison between the results obtained in terms of SDNR and dose assessment in DBT and DM was performed.
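    The SDNR figure of merit used to compare DBT and DM image quality is straightforward to compute from a reconstructed image once signal and background regions are defined. A sketch using boolean ROI masks (the study's exact ROI placement is not specified here):

```python
import numpy as np

def sdnr(image, signal_mask, background_mask):
    """Signal-difference-to-noise ratio: contrast between a signal ROI
    and the background, scaled by the background noise."""
    signal = image[signal_mask].mean()
    background = image[background_mask].mean()
    noise = image[background_mask].std()
    return abs(signal - background) / noise
```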

  19. A trial production of the image slicer unit for next generation infrared instruments and the assembly of the evaluation system of the pseudo slit image quality

    NASA Astrophysics Data System (ADS)

    Sakon, Itsuki; Onaka, Takashi; Kataza, Hirokazu; Okamoto, Yoshiko K.; Honda, Mitsuhiko; Tokoro, Hitoshi; Fujishiro, Naofumi; Ikeda, Yuji; Nakagawa, Hiroyuki; Kirino, Okiharu; Mitsui, Kenji; Okada, Norio

    2014-08-01

    We have carried out the trial production of a small-format (n=5) image slicer, aiming at technical verification of an Integral Field Unit (IFU) that can be installed in next-generation infrared instruments such as TMT/MICHI and SPICA/SMI. Our goal is to achieve a stable pseudo-slit image with high efficiency. Here we report the results of the assembly of the image slicer unit and of the non-cryogenic evaluation system for pseudo-slit image quality in the infrared.

  20. Human Visual System-Based Fundus Image Quality Assessment of Portable Fundus Camera Photographs.

    PubMed

    Wang, Shaoze; Jin, Kai; Lu, Haitong; Cheng, Chuming; Ye, Juan; Qian, Dahong

    2016-04-01

    Telemedicine and the medical "big data" era in ophthalmology highlight the use of non-mydriatic ocular fundus photography, which has given rise to indispensable applications of portable fundus cameras. However, in the case of portable fundus photography, non-mydriatic image quality is more vulnerable to distortions such as uneven illumination, color distortion, blur, and low contrast. Such distortions are called generic quality distortions. This paper proposes an algorithm capable of selecting images of fair generic quality that would be especially useful in assisting inexperienced individuals to collect meaningful and interpretable data with consistency. The algorithm is based on three characteristics of the human visual system: multi-channel sensation, just-noticeable blur, and the contrast sensitivity function, used to detect illumination and color distortion, blur, and low-contrast distortion, respectively. A total of 536 retinal images, 280 from proprietary databases and 256 from public databases, were graded independently by one senior and two junior ophthalmologists, such that three partial measures of quality and generic overall quality were classified into two categories. Binary classification was implemented by the support vector machine and the decision tree, and receiver operating characteristic (ROC) curves were obtained and plotted to analyze the performance of the proposed algorithm. The experimental results revealed that the generic overall quality classification achieved a sensitivity of 87.45% at a specificity of 91.66%, with an area under the ROC curve of 0.9452, indicating the value of applying the algorithm, which is based on the human visual system, to assess the image quality of non-mydriatic photography, especially for low-cost ophthalmological telemedicine applications. PMID:26672033

  1. Effects of task and image properties on visual-attention deployment in image-quality assessment

    NASA Astrophysics Data System (ADS)

    Alers, Hani; Redi, Judith; Liu, Hantao; Heynderickx, Ingrid

    2015-03-01

    It is important to understand how humans view images and how their behavior is affected by changes in the properties of the viewed images and the task they are given, particularly the task of scoring image quality (IQ). This is a complex behavior that holds great importance for the field of image-quality research. This work builds upon four years of research spanning three databases studying image-viewing behavior. Using eye-tracking equipment, it was possible to collect information on human viewing behavior for different kinds of stimuli and under different experimental settings. This work performs a cross-analysis of the results from all these databases using state-of-the-art similarity measures. The results strongly indicate that asking viewers to score the IQ significantly changes their viewing behavior. Muting the color saturation also appears to affect the saliency of the images. However, a change in IQ was not consistently found to modify visual attention deployment, neither under free looking nor during scoring. These results are helpful in gaining a better understanding of image-viewing behavior under different conditions. They also have important implications for work that collects subjective image-quality scores from human observers.

  2. No-reference image quality assessment in the spatial domain.

    PubMed

    Mittal, Anish; Moorthy, Anush Krishna; Bovik, Alan Conrad

    2012-12-01

    We propose a natural scene statistic-based distortion-generic blind/no-reference (NR) image quality assessment (IQA) model that operates in the spatial domain. The new model, dubbed blind/referenceless image spatial quality evaluator (BRISQUE) does not compute distortion-specific features, such as ringing, blur, or blocking, but instead uses scene statistics of locally normalized luminance coefficients to quantify possible losses of "naturalness" in the image due to the presence of distortions, thereby leading to a holistic measure of quality. The underlying features used derive from the empirical distribution of locally normalized luminances and products of locally normalized luminances under a spatial natural scene statistic model. No transformation to another coordinate frame (DCT, wavelet, etc.) is required, distinguishing it from prior NR IQA approaches. Despite its simplicity, we are able to show that BRISQUE is statistically better than the full-reference peak signal-to-noise ratio and the structural similarity index, and is highly competitive with respect to all present-day distortion-generic NR IQA algorithms. BRISQUE has very low computational complexity, making it well suited for real time applications. BRISQUE features may be used for distortion-identification as well. To illustrate a new practical application of BRISQUE, we describe how a nonblind image denoising algorithm can be augmented with BRISQUE in order to perform blind image denoising. Results show that BRISQUE augmentation leads to performance improvements over state-of-the-art methods. A software release of BRISQUE is available online: http://live.ece.utexas.edu/research/quality/BRISQUE_release.zip for public use and evaluation. PMID:22910118
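    The locally normalized luminance coefficients at the heart of BRISQUE (mean-subtracted contrast-normalized, MSCN) can be computed directly in the spatial domain. A sketch with a Gaussian-weighted local mean and variance (the 7x7 window and the stabilizing constant follow common practice; this is not the released BRISQUE code, which additionally fits statistical models to these coefficients):

```python
import numpy as np

def _gaussian_kernel(size=7, sigma=7.0 / 6.0):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    g = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def _local_filter(img, kernel):
    # naive 'same' correlation with reflective padding
    half = kernel.shape[0] // 2
    n = kernel.shape[0]
    padded = np.pad(img, half, mode="reflect")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + n, j:j + n] * kernel)
    return out

def mscn(image, c=1.0):
    """Mean-subtracted contrast-normalized coefficients of a gray image:
    each pixel minus its local mean, divided by the local deviation."""
    img = image.astype(float)
    w = _gaussian_kernel()
    mu = _local_filter(img, w)
    sigma = np.sqrt(np.abs(_local_filter(img ** 2, w) - mu ** 2))
    return (img - mu) / (sigma + c)
```

    For natural, undistorted images these coefficients have a characteristic near-Gaussian, zero-mean distribution; BRISQUE quantifies quality by how far an image's coefficient statistics deviate from that model.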

  3. TL dosimetry for quality control of CR mammography imaging systems

    NASA Astrophysics Data System (ADS)

    Gaona, E.; Nieto, J. A.; Góngora, J. A. I. D.; Arreola, M.; Enríquez, J. G. F.

    The aim of this work is to estimate the average glandular dose with thermoluminescent (TL) dosimetry and to compare it with image quality in computed radiography (CR) mammography. For dose measurement, the Food and Drug Administration (FDA) and the American College of Radiology (ACR) use a phantom, so that dose and image quality are assessed with the same test object. Mammography is a radiological imaging technique used to visualize early biological manifestations of breast cancer. Digital systems have two types of image-capturing devices: full-field digital mammography (FFDM) and CR mammography. In Mexico, there are several CR mammography systems in clinical use, but only one system has been approved for use by the FDA. CR mammography uses a photostimulable phosphor (PSP) detector system. Most CR plates are made of 85% BaFBr and 15% BaFI doped with europium (Eu), commonly called barium fluorohalide. We carried out an exploratory survey of six CR mammography units from three different manufacturers and six dedicated X-ray mammography units with fully automatic exposure. The results show that three CR mammography units (50%) deliver a dose greater than 3.0 mGy without demonstrating improved image quality. The differences between average doses from the TLD system and an ionization-chamber dosimeter are less than 10%. The TLD system is a good option for average glandular dose measurement for X-rays with the HVL (0.35-0.38 mm Al) and kVp (24-26) used in quality control procedures with the ACR Mammography Accreditation Phantom.

  4. TRUTHS (Traceable Radiometry Underpinning Terrestrial- and Helio- Studies): A Mission to Achieve "Climate Quality" Data

    NASA Astrophysics Data System (ADS)

    Fox, N. P.

    2007-12-01

    Over recent years the debate as to whether climate change is real has largely subsided; however, there is still significant controversy over its cause and, most importantly, the scale of its impact and the means of mitigation. Much of the latter relies upon the predictive capabilities of sophisticated but highly complex models. Such models need globally sampled measurements of a variety of key indicative physical parameters, and in some cases proxies of others, as input data; and whilst generally predicting similar things, the detail of their outputs on decadal time scales can be highly variable. Clearly the quality of the input data is crucial to such models. However, since the key indicators of climate change may vary by only a few percent per decade, the measurement uncertainty of such data also needs to be very small to allow detection and to provide some means of validating, discriminating between, and improving the models. At the present time, the accuracy of data measured from space is rarely, if ever, adequate to meet this requirement. Instead, high-risk strategies are adopted which rely upon overlapping and renormalizing data sets from consecutive flights of similar instruments to establish a long-term trend. Such a strategy is doomed to failure. The only means of achieving robust data sets of sufficient quality and accuracy, with a guarantee of long-term reproducibility sufficient to detect the subtle indicators of climate change and to attribute its cause (anthropogenic versus natural), is through traceability to SI units. Such traceability needs to be regularly re-established and guaranteed throughout the lifetime of a mission. However, given that most sensors degrade in performance during launch and, most importantly, while in orbit, this is difficult to achieve with sufficient accuracy, since such sensors cannot easily be retrieved and taken back to a national standards laboratory for recalibration.
TRUTHS (Traceable Radiometry Underpinning Terrestrial- and Helio- Studies) is a

  5. A STUDY OF THE IMAGE QUALITY OF COMPUTED TOMOGRAPHY ADAPTIVE STATISTICAL ITERATIVE RECONSTRUCTED BRAIN IMAGES USING SUBJECTIVE AND OBJECTIVE METHODS.

    PubMed

    Mangat, J; Morgan, J; Benson, E; Båth, M; Lewis, M; Reilly, A

    2016-06-01

    The recent reintroduction of iterative reconstruction in computed tomography has facilitated the realisation of major dose saving. The aim of this article was to investigate the possibility of achieving further savings at a site with well-established Adaptive Statistical iterative Reconstruction (ASiR™) (GE Healthcare) brain protocols. An adult patient study was conducted with observers making visual grading assessments using image quality criteria, which were compared with the frequency domain metrics, noise power spectrum and modulation transfer function. Subjective image quality equivalency was found in the 40-70% ASiR™ range, leading to the proposal of ranges for the objective metrics defining acceptable image quality. Based on the findings of both the patient-based and objective studies of the ASiR™/tube-current combinations tested, 60%/305 mA was found to fall within all, but one, of these ranges. Therefore, it is recommended that an ASiR™ level of 60%, with a noise index of 12.20, is a viable alternative to the currently used protocol featuring a 40% ASiR™ level and a noise index of 11.20, potentially representing a 16% dose saving. PMID:27103646

  6. Optimizing 3D image quality and performance for stereoscopic gaming

    NASA Astrophysics Data System (ADS)

    Flack, Julien; Sanderson, Hugh; Pegg, Steven; Kwok, Simon; Paterson, Daniel

    2009-02-01

    The successful introduction of stereoscopic TV systems, such as Samsung's 3D Ready Plasma, requires high-quality 3D content to be commercially available to the consumer. Console and PC games provide the most readily accessible source of high-quality 3D content. This paper describes innovative developments in a generic, PC-based game driver architecture that addresses the two key issues affecting 3D gaming: quality and speed. At the heart of the quality issue are the same considerations that studios face when producing stereoscopic renders of CG movies: how best to perform the mapping from a geometric CG environment into the stereoscopic display volume. The major difference is that for game drivers this mapping cannot be choreographed by hand but must be calculated automatically in real time without significant impact on performance. Performance is a critical issue in gaming. Stereoscopic gaming has traditionally meant rendering the scene twice, with the associated performance overhead. An alternative approach is to render the scene from one virtual camera position and use information from the z-buffer to generate a stereo pair using Depth-Image-Based Rendering (DIBR). We analyze this trade-off in more detail and provide results relating to both 3D image quality and render performance.
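    The DIBR alternative mentioned above warps one rendered view into a second eye using per-pixel depth. A deliberately naive grayscale sketch (forward mapping with left-neighbor hole filling; production drivers use far more careful warping and inpainting, and this is not the paper's implementation):

```python
import numpy as np

def dibr_right_view(image, depth, max_disparity=8):
    """Synthesize a right-eye view by shifting each pixel horizontally
    in proportion to its depth (depth in [0, 1]), then filling the
    disocclusion holes from the left neighbor."""
    h, w = image.shape
    right = np.zeros_like(image)
    filled = np.zeros((h, w), bool)
    disp = (depth * max_disparity).astype(int)
    for i in range(h):
        for j in range(w):
            jj = j - disp[i, j]          # forward-map pixel to its new column
            if 0 <= jj < w:
                right[i, jj] = image[i, j]
                filled[i, jj] = True
    for i in range(h):                    # crude hole filling
        for j in range(1, w):
            if not filled[i, j]:
                right[i, j] = right[i, j - 1]
    return right
```

    The attraction, as the abstract notes, is that this replaces a second full render pass with an image-space warp whose cost is independent of scene complexity.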

  7. How much image noise can be added in cardiac x-ray imaging without loss in perceived image quality?

    NASA Astrophysics Data System (ADS)

    Gislason-Lee, Amber J.; Kumcu, Asli; Kengyelics, Stephen M.; Rhodes, Laura A.; Davies, Andrew G.

    2015-03-01

    Dynamic X-ray imaging systems are used for interventional cardiac procedures to treat coronary heart disease. X-ray settings are controlled automatically by specially-designed X-ray dose control mechanisms whose role is to ensure an adequate level of image quality is maintained with an acceptable radiation dose to the patient. Current commonplace dose control designs quantify image quality by performing a simple technical measurement directly from the image. However, the utility of cardiac X-ray images is in their interpretation by a cardiologist during an interventional procedure, rather than in a technical measurement. With the long term goal of devising a clinically-relevant image quality metric for an intelligent dose control system, we aim to investigate the relationship of image noise with clinical professionals' perception of dynamic image sequences. Computer-generated noise was added, in incremental amounts, to angiograms of five different patients selected to represent the range of adult cardiac patient sizes. A two alternative forced choice staircase experiment was used to determine the amount of noise which can be added to patient image sequences without changing image quality as perceived by clinical professionals. Twenty-five viewing sessions (five for each patient) were completed by thirteen observers. Results demonstrated scope to increase the noise of cardiac X-ray images by up to 21% +/- 8% before the increase is noticeable to clinical professionals. This indicates a potential for 21% radiation dose reduction since X-ray image noise and radiation dose are directly related; this would be beneficial to both patients and personnel.

  8. Study on classification of pork quality using hyperspectral imaging technique

    NASA Astrophysics Data System (ADS)

    Zeng, Shan; Bai, Jun; Wang, Haibin

    2015-12-01

    The discrimination of chilled, thawed, and spoiled pork by hyperspectral imaging, including the selection of feature wavelengths, was investigated. First, based on hyperspectral image data of testing pork samples in the 400-1000 nm range, 30 important wavelengths were selected from 753 wavelengths by a K-medoids clustering algorithm based on manifold distance, and 8 feature wavelengths (454.4, 477.5, 529.3, 546.8, 568.4, 580.3, 589.9 and 781.2 nm) were then chosen from these based on the discrimination value. Then 8 texture features of each image at the 8 feature wavelengths were extracted by a two-dimensional Gabor wavelet transform as pork quality features. Finally, a pork quality classification model was built using the fuzzy C-means clustering algorithm. The feature-wavelength extraction experiment showed that although the hyperspectral images of adjacent bands have a strong linear correlation, the entire band set exhibits a significant non-linear manifold relationship. The K-medoids clustering algorithm based on manifold distance used in this paper for selecting the characteristic wavelengths is therefore more reasonable than traditional principal component analysis (PCA). From the classification results, we conclude that hyperspectral imaging technology can distinguish among chilled meat, thawed meat, and spoiled meat accurately.
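    The fuzzy C-means step used for the final classification can be sketched compactly. The cluster count, fuzzifier m = 2, and synthetic test data are illustrative assumptions; the paper's Gabor-texture features are not reproduced:

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
    """Minimal fuzzy C-means: returns cluster centers and the fuzzy
    membership matrix U (each row of U sums to 1)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=c, replace=False)]
    for _ in range(iters):
        # distances of every sample to every center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        U = 1.0 / d ** (2.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)      # normalise memberships
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
    return centers, U
```

Each sample belongs to every cluster with a graded membership, which suits borderline samples (e.g. partially thawed meat) better than a hard assignment.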

  9. Automated quality assurance for image-guided radiation therapy.

    PubMed

    Schreibmann, Eduard; Elder, Eric; Fox, Tim

    2009-01-01

    The use of image-guided patient positioning requires fast and reliable quality assurance (QA) methods to ensure the megavoltage (MV) treatment beam coincides with the integrated kilovoltage (kV) or volumetric cone-beam CT (CBCT) imaging and guidance systems. The current QA protocol is based on visually observing deviations of certain features, such as markers, distances, or HU values, in acquired in-room kV treatment images from phantom specifications. This is a time-consuming and subjective task because these features are identified by human operators. The method implemented in this study automated an IGRT QA protocol by using specific image processing algorithms that rigorously detected phantom features and performed all measurements involved in a classical QA protocol. The algorithm was tested on four different IGRT QA phantoms. The image analysis algorithms were able to detect QA features with the same accuracy as the manual approach but significantly faster. All described tests were performed in a single procedure, with acquisition of the images taking approximately 5 minutes and the automated software analysis taking less than 1 minute. The study showed that the automated image-analysis-based procedure may be used as a daily QA procedure because it is completely automated and uses a single phantom setup. PMID:19223842

  10. Quality assurance of multiport image-guided minimally invasive surgery at the lateral skull base.

    PubMed

    Nau-Hermes, Maria; Schmitt, Robert; Becker, Meike; El-Hakimi, Wissam; Hansen, Stefan; Klenzner, Thomas; Schipper, Jörg

    2014-01-01

    For multiport image-guided minimally invasive surgery at the lateral skull base, quality management is necessary to avoid damage to closely spaced critical neurovascular structures. So far there is no standardized method applicable independently of the surgery. We therefore adapt a quality management method, quality gates (QG), which is well established in, for example, the automotive industry, and apply it to multiport image-guided minimally invasive surgery. QG divide a process into different sections. Passage between sections is only permitted if previously defined requirements are fulfilled, which secures the process chain. An interdisciplinary team of otosurgeons, computer scientists, and engineers has worked together to define the quality gates and the corresponding criteria that need to be fulfilled before passing each quality gate. In order to evaluate the defined QG and their criteria, the new surgical method was applied with a first prototype to a human cadaver skull model. We show that the QG method can ensure a safe multiport minimally invasive surgical process at the lateral skull base. We thereby present an approach toward the standardization of quality assurance of surgical processes. PMID:25105146

  12. Head Start Program Quality: Examination of Classroom Quality and Parent Involvement in Predicting Children's Vocabulary, Literacy, and Mathematics Achievement Trajectories

    ERIC Educational Resources Information Center

    Wen, Xiaoli; Bulotsky-Shearer, Rebecca J.; Hahs-Vaughn, Debbie L.; Korfmacher, Jon

    2012-01-01

    Guided by a developmental-ecological framework and Head Start's two-generational approach, this study examined two dimensions of Head Start program quality, classroom quality and parent involvement, and their unique and interactive contributions to children's vocabulary, literacy, and mathematics skills growth from the beginning of Head Start…

  13. Evaluation of radiation dose and image quality following changes to tube potential (kVp) in conventional paediatric chest radiography

    PubMed Central

    Ramanaidu, S; Sta Maria, RB; Ng, KH; George, J; Kumar, G

    2006-01-01

    Purpose A study of radiation dose and image quality following changes to the tube potential (kVp) in paediatric chest radiography. Materials and Method A total of 109 patients ranging from 1 month to 15 years of age were included in two phases of the study. Phase 1 investigated the range of entrance surface air kerma (ESAK) values received by patients exposed with the existing exposure factors. In the second phase, new exposure factors using recommended values of tube potential (kVp) with reduced mAs were used. ESAK values were measured using thermoluminescent dosemeters (TLDs). Image quality in both phases was evaluated using image quality criteria proposed by the Council of the European Communities (CEC). Results of both techniques were analysed for any differences. Results The overall mean ESAK before the changes was 0.22 mGy (range: 0.05-0.43 mGy). Following the changes in tube potential, the overall mean fell to 0.15 mGy (range: 0.03-0.38 mGy), a significant reduction of 34%. The interquartile range was reduced from 45% to 40%. However, doses to patients under one year of age remained high. Assessment of image quality revealed no significant differences between the two techniques, although higher image scores were achieved at higher kVp settings. Conclusion Significant dose reduction was achieved through appropriate changes in tube potential and reduction of mAs without any loss in image quality. PMID:21614244

  14. DES exposure checker: Dark Energy Survey image quality control crowdsourcer

    NASA Astrophysics Data System (ADS)

    Melchior, Peter; Sheldon, Erin; Drlica-Wagner, Alex; Rykoff, Eli S.

    2015-11-01

    DES exposure checker renders science-grade images directly to a web browser and allows users to mark problematic features from a set of predefined classes, thus allowing image quality control for the Dark Energy Survey to be crowdsourced through its web application. Users can also generate custom labels to help identify previously unknown problem classes; generated reports are fed back to hardware and software experts to help mitigate and eliminate recognized issues. These problem reports allow rapid correction of artifacts that otherwise may be too subtle or infrequent to be recognized.

  15. Metal artifact reduction and image quality evaluation of lumbar spine CT images using metal sinogram segmentation.

    PubMed

    Kaewlek, Titipong; Koolpiruck, Diew; Thongvigitmanee, Saowapak; Mongkolsuk, Manus; Thammakittiphan, Sastrawut; Tritrakarn, Siri-on; Chiewvit, Pipat

    2015-01-01

    Metal artifacts often appear in computed tomography (CT) images. In lumbar spine CT images, such artifacts obscure critical organs and can affect the diagnosis, treatment, and follow-up care of the patient. One approach to metal artifact reduction is sinogram completion. A mixed-variable thresholding (MixVT) technique to identify the metal sinogram is proposed. This technique consists of four steps: 1) identify the metal objects in the image by using k-means clustering with soft cluster assignment; 2) transform the image into two sinograms, one containing the metal object and the other the surrounding tissue, with the boundary of the metal sinogram found by the MixVT technique; 3) estimate new values for the missing data in the metal sinogram by linear interpolation from the surrounding tissue sinogram; 4) reconstruct the modified sinogram by filtered back-projection and complete the image by adding back the metal object into the reconstructed image. Quantitative and clinical image quality evaluation of the proposed technique demonstrated a significant improvement in image clarity and detail, which enhances the effectiveness of diagnosis and treatment. PMID:26756404
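    The interpolation step of such sinogram-completion methods, filling the metal trace with values interpolated from the surrounding tissue sinogram, can be sketched with plain numpy. The MixVT boundary detection itself is not reproduced; the metal mask is assumed to be given:

```python
import numpy as np

def inpaint_metal_trace(sinogram, metal_mask):
    """Replace metal-corrupted sinogram samples by linear interpolation.

    sinogram: (angles, detectors); metal_mask: boolean, True where the
    metal trace lies.  Each projection row is interpolated independently
    across the masked detector bins.
    """
    out = sinogram.astype(float).copy()
    bins = np.arange(sinogram.shape[1])
    for i, row in enumerate(out):
        bad = metal_mask[i]
        if bad.any() and (~bad).any():
            row[bad] = np.interp(bins[bad], bins[~bad], row[~bad])
    return out
```

After inpainting, the row is reconstructed as usual (filtered back-projection) and the metal object is pasted back, as in step 4 above.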

  16. Comparing hardcopy and softcopy results in the study of the impact of workflow on perceived reproduction quality of fine art images

    NASA Astrophysics Data System (ADS)

    Farnand, Susan; Jiang, Jun; Frey, Franziska

    2011-01-01

    A project, supported by the Andrew W. Mellon Foundation, is currently underway to evaluate current practices in fine art image reproduction, determine the image quality generally achievable, and establish a suggested framework for art image interchange. To determine the image quality currently being achieved, experimentation has been conducted in which a set of objective targets and pieces of artwork in various media were imaged by participating museums and other cultural heritage institutions. Prints and images for display made from the delivered image files at the Rochester Institute of Technology were used as stimuli in psychometric testing in which observers were asked to evaluate the prints as reproductions of the original artwork and as stand-alone images. The results indicated that there were limited differences between assessments made using displayed images relative to printed reproductions. Further, the differences between rankings made with and without the original artwork present were much smaller than expected.

  17. Exploring V1 by modeling the perceptual quality of images.

    PubMed

    Zhang, Fan; Jiang, Wenfei; Autrusseau, Florent; Lin, Weisi

    2014-01-01

    We propose an image quality model based on phase and amplitude differences between a reference and a distorted image. The proposed model is motivated by the fact that polar representations can separate visual information in a more independent and efficient manner than Cartesian representations in the primary visual cortex (V1). We subsequently estimate the model parameters from a large subjective data set using maximum likelihood methods. By comparing the various model hypotheses on the functional form about the phase and amplitude, we find that: (a) discrimination of visual orientation is important for quality assessment and yet a coarse level of such discrimination seems sufficient; and (b) a product-based amplitude-phase combination before pooling is effective, suggesting an interesting viewpoint about the functional structure of the simple cells and complex cells in V1. PMID:24464165

  18. Image-inpainting and quality-guided phase unwrapping algorithm.

    PubMed

    Meng, Lei; Fang, Suping; Yang, Pengcheng; Wang, Leijie; Komori, Masaharu; Kubo, Aizoh

    2012-05-01

    For a wrapped phase map with regional abnormal fringes, a new phase unwrapping algorithm that combines image-inpainting theory and the quality-guided phase unwrapping algorithm is proposed. First, by applying a threshold to the modulation map, the valid region (i.e., the interference region) is divided into a doubtful region (called the target region during the inpainting period) and a reasonable one (the source region). The wrapped phase of the doubtful region is considered unreliable, and its data are abandoned temporarily. Using the region-filling image-inpainting method, the blank target region is filled with new data, while nothing is changed in the source region. A new wrapped phase map is generated, and it is then unwrapped with the quality-guided phase unwrapping algorithm. Finally, a postprocessing operation is proposed for the final result. Experimental results show that the proposed algorithm is effective. PMID:22614426
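    The quality-guided unwrapping stage can be sketched as a greedy flood fill driven by a max-heap. The quality map, seed choice, and 4-connectivity are generic assumptions; the inpainting stage of the proposed algorithm is not reproduced:

```python
import heapq
import numpy as np

def quality_guided_unwrap(wrapped, quality):
    """Quality-guided 2D phase unwrapping (greedy flood fill).

    Pixels are unwrapped in order of decreasing quality, each relative
    to an already-unwrapped neighbour, so low-quality regions are
    visited last.  wrapped: phase in radians; quality: same shape.
    """
    H, W = wrapped.shape
    unwrapped = np.array(wrapped, dtype=float)
    done = np.zeros((H, W), dtype=bool)
    seed = np.unravel_index(np.argmax(quality), quality.shape)
    done[seed] = True
    heap = []

    def push_neighbours(y, x):
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < H and 0 <= nx < W and not done[ny, nx]:
                heapq.heappush(heap, (-quality[ny, nx], ny, nx, y, x))

    push_neighbours(*seed)
    while heap:
        _, y, x, py, px = heapq.heappop(heap)
        if done[y, x]:
            continue
        # add the multiple of 2*pi that best matches the neighbour
        diff = wrapped[y, x] - unwrapped[py, px]
        unwrapped[y, x] = wrapped[y, x] - 2 * np.pi * np.round(diff / (2 * np.pi))
        done[y, x] = True
        push_neighbours(y, x)
    return unwrapped
```

Because the path always grows from the most reliable pixels, errors in low-modulation regions cannot propagate into high-quality ones, which is the core idea of the quality-guided family of algorithms.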

  19. Assessing image quality and dose reduction of a new x-ray computed tomography iterative reconstruction algorithm using model observers

    SciTech Connect

    Tseng, Hsin-Wu Kupinski, Matthew A.; Fan, Jiahua; Sainath, Paavana; Hsieh, Jiang

    2014-07-15

    Purpose: A number of different techniques have been developed to reduce radiation dose in x-ray computed tomography (CT) imaging. In this paper, the authors compare task-based measures of image quality of CT images reconstructed by two algorithms: conventional filtered back projection (FBP) and a new iterative reconstruction algorithm (IR). Methods: To assess image quality, the authors used the performance of a channelized Hotelling observer acting on reconstructed image slices. The selected channels are dense difference-of-Gaussian (DDOG) channels. A body phantom and a head phantom were imaged 50 times at different dose levels to obtain the data needed to assess image quality. The phantoms consisted of uniform backgrounds with low-contrast signals embedded at various locations. The tasks the observer model performed included (1) detection of a signal of known location and shape, and (2) detection and localization of a signal of known shape. The employed DDOG channels are based on the response of the human visual system. Performance was assessed using the areas under ROC curves and areas under localization ROC curves. Results: For signal-known-exactly (SKE) and location-unknown/signal-shape-known tasks with circular signals of different sizes and contrasts, the authors' task-based measures showed that image quality equivalent to FBP can be achieved at lower dose levels using the IR algorithm. For the SKE case, the range of dose reduction is 50%–67% (head phantom) and 68%–82% (body phantom). For the location-unknown/signal-shape-known study, the dose reduction range is 67%–75% for the head phantom and 67%–77% for the body phantom. These results suggest that IR images at lower dose settings can reach the same image quality as full-dose conventional FBP images. Conclusions: The work presented provides an objective way to quantitatively assess the image quality of a newly introduced CT IR algorithm. The performance of the
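    The channelized Hotelling observer methodology can be sketched in a few lines of numpy. The difference-of-Gaussian channel widths, image size, and simulated signal/noise ensembles below are illustrative assumptions, not the DDOG parameters or phantom data used by the authors:

```python
import numpy as np

def dog_channels(N, n_channels=3, sigma0=2.0, alpha=1.67):
    """Radial difference-of-Gaussian passbands; a rough stand-in for
    DDOG channels (the widths here are illustrative only)."""
    y, x = np.mgrid[:N, :N] - N // 2
    r2 = x ** 2 + y ** 2
    bands = []
    for j in range(n_channels):
        s1, s2 = sigma0 * alpha ** j, sigma0 * alpha ** (j + 1)
        bands.append(np.exp(-r2 / (2 * s2 ** 2)) - np.exp(-r2 / (2 * s1 ** 2)))
    return np.stack([b.ravel() for b in bands])     # (channels, N*N)

def cho_snr(signal_imgs, noise_imgs, channels):
    """Hotelling observer detectability (SNR) in channel space."""
    vs = signal_imgs.reshape(len(signal_imgs), -1) @ channels.T
    vn = noise_imgs.reshape(len(noise_imgs), -1) @ channels.T
    K = 0.5 * (np.cov(vs.T) + np.cov(vn.T))         # pooled channel covariance
    dbar = vs.mean(axis=0) - vn.mean(axis=0)        # mean channel difference
    w = np.linalg.solve(K, dbar)                    # Hotelling template
    return float(np.sqrt(dbar @ w))
```

Projecting onto a handful of channels makes the covariance matrix estimable from a modest number of images (here 3x3 instead of N²xN²), which is what makes the observer practical for the 50-scan ensembles described above.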

  20. High image quality sub 100 picosecond gated framing camera development

    SciTech Connect

    Price, R.H.; Wiedwald, J.D.

    1983-11-17

    A major challenge for laser fusion is the study of the symmetry and hydrodynamic stability of imploding fuel capsules. Framed x-radiographs of 10-100 ps duration, excellent image quality, minimum geometrical distortion (< 1%), dynamic range greater than 1000, and more than 200 x 200 pixels are required for this application. Recent progress on a gated proximity focused intensifier which meets these requirements is presented.

  1. Incorporating detection tasks into the assessment of CT image quality

    NASA Astrophysics Data System (ADS)

    Scalzetti, E. M.; Huda, W.; Ogden, K. M.; Khan, M.; Roskopf, M. L.; Ogden, D.

    2006-03-01

    The purpose of this study was to compare traditional and task-dependent assessments of CT image quality. Chest CT examinations were obtained with a standard protocol for subjects participating in a lung cancer-screening project. Images were selected for patients whose weight ranged from 45 kg to 159 kg. Six ABR-certified radiologists subjectively ranked these images using a traditional six-point ranking scheme that ranged from 1 (inadequate) to 6 (excellent). Three subtle diagnostic tasks were identified: (1) a lung section containing a sub-centimeter nodule of ground-glass opacity in an upper lung; (2) a mediastinal section with a lymph node of soft tissue density in the mediastinum; (3) a liver section with a rounded low-attenuation lesion in the liver periphery. Each observer was asked to estimate the probability of detecting each type of lesion in the appropriate CT section using a six-point scale ranging from 1 (< 10%) to 6 (> 90%). Traditional and task-dependent measures of image quality were plotted as a function of patient weight. For the lung section, task-dependent evaluations were very similar to those obtained using the traditional scoring scheme, but with larger inter-observer differences. Task-dependent evaluations for the mediastinal section showed no obvious trend with subject weight, whereas the traditional score decreased from ~4.9 for smaller subjects to ~3.3 for the larger subjects. Task-dependent evaluations for the liver section showed a decreasing trend from ~4.1 for the smaller subjects to ~1.9 for the larger subjects, whereas the traditional evaluation had a markedly narrower range of scores. A task-dependent method of assessing CT image quality can be implemented with relative ease, and is likely to be more meaningful in the clinical setting.

  2. Radiometric Quality Evaluation of INSAT-3D Imager Data

    NASA Astrophysics Data System (ADS)

    Prakash, S.; Jindal, D.; Badal, N.; Kartikeyan, B.; Gopala Krishna, B.

    2014-11-01

    INSAT-3D is an advanced meteorological satellite of ISRO which acquires imagery in optical and infra-red (IR) channels for the study of weather dynamics in the Indian sub-continent region. In this paper, the methodology of radiometric quality evaluation for Level-1 products of the Imager, one of the payloads onboard INSAT-3D, is described. Firstly, the overall visual quality of the scene, in terms of dynamic range, edge sharpness or modulation transfer function (MTF), and the presence of striping and other image artefacts, is assessed. Uniform targets in desert and sea regions are identified, for which detailed radiometric performance evaluation of the IR channels is carried out. The mean brightness temperature (BT) of each target is computed and validated against independently generated radiometric references. Further, diurnal/seasonal trends in target BT values and radiometric uncertainty or sensor noise are studied. Results of radiometric quality evaluation over a duration of eight months (January to August 2014) and a comparison of radiometric consistency pre/post yaw flip of the satellite are presented. Radiometric analysis indicates that INSAT-3D images have high contrast (MTF > 0.2) and low striping effects. A bias of <4 K is observed in the brightness temperature values of the TIR-1 channel measured during January-August 2014, indicating consistent radiometric calibration. Diurnal and seasonal analysis shows that the noise-equivalent differential temperature (NEdT) of the IR channels is consistent and well within specifications.

  3. Beef quality parameters estimation using ultrasound and color images

    PubMed Central

    2015-01-01

    Background Beef quality measurement is a complex task with high economic impact. There is high interest in obtaining an automatic estimation of quality parameters in live cattle or post mortem. In this paper we set out to obtain beef quality estimates from the analysis of ultrasound (in vivo) and color images (post mortem), with the measurement of various parameters related to tenderness and amount of meat: rib eye area, percentage of intramuscular fat, and backfat thickness or subcutaneous fat. Proposal An algorithm based on curve evolution is implemented to calculate the rib eye area. The backfat thickness is estimated from the profile of distances between two curves that delimit the steak and the rib eye, previously detected. A model based on Support Vector Regression (SVR) is trained to estimate the intramuscular fat percentage. A series of features extracted from a region of interest, previously detected in both ultrasound and color images, was proposed. In all cases, a complete evaluation was performed with different databases including: color and ultrasound images acquired by a beef industry expert, intramuscular fat estimates obtained by an expert using commercial software, and chemical analysis. Conclusions The proposed algorithms show good results in calculating the rib eye area and the backfat thickness measure and profile. They are also promising in predicting the percentage of intramuscular fat. PMID:25734452

  4. How much image noise can be added in cardiac x-ray imaging without loss in perceived image quality?

    NASA Astrophysics Data System (ADS)

    Gislason-Lee, Amber J.; Kumcu, Asli; Kengyelics, Stephen M.; Brettle, David S.; Treadgold, Laura A.; Sivananthan, Mohan; Davies, Andrew G.

    2015-09-01

    Cardiologists use x-ray image sequences of the moving heart acquired in real-time to diagnose and treat cardiac patients. The amount of radiation used is proportional to image quality; however, exposure to radiation is damaging to patients and personnel. The amount by which radiation dose can be reduced without compromising patient care was determined. For five patient image sequences, increments of computer-generated quantum noise (white + colored) were added to the images, frame by frame using pixel-to-pixel addition, to simulate corresponding increments of dose reduction. The noise adding software was calibrated for settings used in cardiac procedures, and validated using standard objective and subjective image quality measurements. The degraded images were viewed next to corresponding original (not degraded) images in a two-alternative-forced-choice staircase psychophysics experiment. Seven cardiologists and five radiographers selected their preferred image based on visualization of the coronary arteries. The point of subjective equality, i.e., level of degradation where the observer could not perceive a difference between the original and degraded images, was calculated; for all patients the median was 33%±15% dose reduction. This demonstrates that a 33%±15% increase in image noise is feasible without being perceived, indicating potential for 33%±15% dose reduction without compromising patient care.
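    The simulation of dose reduction by noise addition rests on a quantum-limited model, an assumption under which noise variance scales inversely with dose, so the amount of noise to add for a target dose fraction follows directly:

```python
import numpy as np

def added_noise_sigma(sigma_full, dose_fraction):
    """Std of zero-mean noise to add to a full-dose image so the total
    matches an acquisition at `dose_fraction` of the dose, assuming
    quantum-limited noise (variance proportional to 1/dose)."""
    return sigma_full * np.sqrt(1.0 / dose_fraction - 1.0)

# e.g. simulating a reduction to 67% of the original dose
sigma_full = 10.0
extra = added_noise_sigma(sigma_full, 0.67)
total = np.hypot(sigma_full, extra)   # independent noise adds in quadrature
# total equals sigma_full / sqrt(0.67), the noise of a 0.67-dose image
```

The spectral shaping (white plus colored components) used in the study is a separate calibration step not captured by this scalar relation.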

  5. Perceptual quality measurement of 3D images based on binocular vision.

    PubMed

    Zhou, Wujie; Yu, Lu

    2015-07-20

    Three-dimensional (3D) technology has become immensely popular in recent years and is widely adopted in various applications. Hence, perceptual quality measurement of symmetrically and asymmetrically distorted 3D images has become an important, fundamental, and challenging issue in 3D imaging research. In this paper, we propose a binocular-vision-based 3D image-quality measurement (IQM) metric. Consideration of the 3D perceptual properties of the primary visual cortex (V1) and the higher visual areas (V2) for 3D-IQM is the major technical contribution of this research. To be more specific, first, the metric simulates the receptive fields of complex cells (V1) using the binocular energy response and binocular rivalry response, and the higher visual areas (V2) using local binary pattern features. Then, three similarity scores of 3D perceptual properties between the reference and distorted 3D images are measured. Finally, by using support vector regression, the three similarity scores are integrated into an overall 3D quality score. Experimental results for two public benchmark databases demonstrate that, in comparison with most current 2D and 3D metrics, the proposed metric achieves significantly higher consistency in alignment with subjective fidelity ratings. PMID:26367842
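    The local binary pattern features used to model the higher visual areas can be sketched with the basic 8-neighbour LBP operator; the metric's exact LBP configuration is not specified here, so this is a generic illustration:

```python
import numpy as np

def lbp_8neighbour(img):
    """Basic 8-neighbour local binary pattern codes (no rotation
    invariance); border pixels are skipped."""
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    H, W = img.shape
    codes = np.zeros((H - 2, W - 2), dtype=np.uint8)
    center = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offs):
        neigh = img[1 + dy:H - 1 + dy, 1 + dx:W - 1 + dx]
        codes |= (neigh >= center).astype(np.uint8) << bit
    return codes

def lbp_histogram(img):
    """Normalised 256-bin histogram of LBP codes, usable as a feature."""
    h, _ = np.histogram(lbp_8neighbour(img), bins=256, range=(0, 256))
    return h / h.sum()
```

The normalised code histogram is what typically feeds the similarity comparison between reference and distorted views.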

  6. A virtual image chain for perceived image quality of medical display

    NASA Astrophysics Data System (ADS)

    Marchessoux, Cédric; Jung, Jürgen

    2006-03-01

    This paper describes a virtual image chain for medical display (project VICTOR, granted in the 5th framework program by the European Commission). The chain starts from raw data of an image digitizer (CR, DR) or synthetic patterns and covers image enhancement (MUSICA by Agfa) and both display possibilities, hardcopy (film on a viewing box) and softcopy (monitor). A key feature of the chain is a complete image-wise approach. A first prototype is implemented in an object-oriented software platform. The display chain consists of several modules. Raw images are either taken from scanners (CR-DR) or from a pattern generator, in which the characteristics of DR-CR systems are introduced by their MTF and their dose-dependent Poisson noise. The image undergoes image enhancement and is then displayed. For softcopy display, color and monochrome monitors are used in the simulation. The image is down-sampled. The non-linear response of a color monitor is taken into account by the GOG or S-curve model, whereas the DICOM Grayscale Standard Display Function is used for monochrome display. The MTF of the monitor is applied to the image in intensity levels. For hardcopy display, the combination of film, printer, lightbox, and viewing condition is modeled. The image is up-sampled and the DICOM GSDF or a Kanamori look-up table is applied. An anisotropic model for the MTF of the printer is applied to the image in intensity levels. The density-dependent color (XYZ) of the hardcopy film is introduced by look-up tables. Finally, a human visual system model is applied to the intensity images (XYZ in terms of cd/m2) in order to eliminate non-visible differences. Comparison leads to visible differences, which are quantified by higher-order image quality metrics. A specific image viewer is used for visualization of the intensity image and the visual difference maps.

  7. Image gathering and digital restoration for fidelity and visual quality

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Alter-Gartenberg, Rachel; Rahman, Zia-Ur

    1991-01-01

    The fidelity and resolution of the traditional Wiener restorations given in the prevalent digital processing literature can be significantly improved when the transformations between the continuous and discrete representations in image gathering and display are accounted for. However, the visual quality of these improved restorations is also more sensitive to the defects caused by aliasing artifacts, colored noise, and ringing near sharp edges. In this paper, these visual defects are characterized, and methods for suppressing them are presented. It is demonstrated how the visual quality of fidelity-maximized images can be improved when (1) the image-gathering system is specifically designed to enhance the performance of the image-restoration algorithm, and (2) the Wiener filter is combined with interactive Gaussian smoothing, synthetic high edge enhancement, and nonlinear tone-scale transformation. The nonlinear transformation is used primarily to enhance the spatial details that are often obscured when the normally wide dynamic range of natural radiance fields is compressed into the relatively narrow dynamic range of film and other displays.
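    The Wiener restoration at the core of this discussion has a compact frequency-domain form. The Gaussian PSF, scalar noise-to-signal ratio, and test image below are illustrative assumptions, not the paper's image-gathering model:

```python
import numpy as np

def wiener_restore(degraded, otf, nsr):
    """Frequency-domain Wiener restoration.

    degraded: blurred (optionally noisy) image; otf: FFT of the PSF,
    same shape; nsr: noise-to-signal power ratio (scalar or array).
    """
    W = np.conj(otf) / (np.abs(otf) ** 2 + nsr)   # Wiener filter
    return np.real(np.fft.ifft2(np.fft.fft2(degraded) * W))

# sanity check: restore a synthetically blurred square
N = 32
img = np.zeros((N, N))
img[12:20, 12:20] = 1.0
d = np.minimum(np.arange(N), N - np.arange(N))    # wrap-around distances
r2 = d[:, None] ** 2 + d[None, :] ** 2
psf = np.exp(-r2 / (2 * 1.5 ** 2))
psf /= psf.sum()
otf = np.fft.fft2(psf)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * otf))
restored = wiener_restore(blurred, otf, 1e-3)
```

The nsr term regularises the inverse where the OTF approaches zero; choosing it frequency-dependently is what connects this basic form to the colored-noise concerns raised above.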

  8. ECG-synchronized DSA exposure control: improved cervicothoracic image quality

    SciTech Connect

    Kelly, W.M.; Gould, R.; Norman, D.; Brant-Zawadzki, M.; Cox, L.

    1984-10-01

    An electrocardiogram (ECG)-synchronized x-ray exposure sequence was used to acquire digital subtraction angiographic (DSA) images during 13 arterial injection studies of the aortic arch or carotid bifurcations. These gated images were compared with matched ungated DSA images acquired using the same technical factors, contrast material volume, and patient positioning. Subjective assessments by five experienced observers of edge definition, vessel conspicuousness, and overall diagnostic quality showed overall preference for one of the two acquisition methods in 69% of cases studied. Of these, the ECG-synchronized exposure series were rated superior in 76%. These results, as well as the relatively simple and inexpensive modifications required, suggest that routine use of ECG exposure control can facilitate improved arterial DSA evaluations of suspected cervicothoracic vascular disease.

  9. Effects of characteristics of image quality in an immersive environment

    NASA Technical Reports Server (NTRS)

    Duh, Henry Been-Lirn; Lin, James J W.; Kenyon, Robert V.; Parker, Donald E.; Furness, Thomas A.

    2002-01-01

    Image quality issues such as field of view (FOV) and resolution are important for evaluating "presence" and simulator sickness (SS) in virtual environments (VEs). This research examined effects on postural stability of varying FOV, image resolution, and scene content in an immersive visual display. Two different scenes (a photograph of a fountain and a simple radial pattern) at two different resolutions were tested using six FOVs (30, 60, 90, 120, 150, and 180 deg.). Both postural stability, recorded by force plates, and subjective difficulty ratings varied as a function of FOV, scene content, and image resolution. Subjects exhibited more balance disturbance and reported more difficulty in maintaining posture in the wide-FOV, high-resolution, and natural scene conditions.

  10. Objective assessment of image quality. IV. Application to adaptive optics

    PubMed Central

    Barrett, Harrison H.; Myers, Kyle J.; Devaney, Nicholas; Dainty, Christopher

    2008-01-01

    The methodology of objective assessment, which defines image quality in terms of the performance of specific observers on specific tasks of interest, is extended to temporal sequences of images with random point spread functions and applied to adaptive imaging in astronomy. The tasks considered include both detection and estimation, and the observers are the optimal linear discriminant (Hotelling observer) and the optimal linear estimator (Wiener). A general theory of first- and second-order spatiotemporal statistics in adaptive optics is developed. It is shown that the covariance matrix can be rigorously decomposed into three terms representing the effect of measurement noise, random point spread function, and random nature of the astronomical scene. Figures of merit are developed, and computational methods are discussed. PMID:17106464
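For a linear detection task, the Hotelling observer's figure of merit is SNR² = Δsᵀ K⁻¹ Δs, where Δs is the mean signal difference and K the data covariance. A minimal numpy sketch with hypothetical two-pixel data, not code from the paper:

```python
import numpy as np

def hotelling_snr(delta_s, K):
    """Hotelling observer detectability: SNR = sqrt(ds^T K^-1 ds)."""
    w = np.linalg.solve(K, delta_s)   # Hotelling template K^-1 ds
    return float(np.sqrt(delta_s @ w))

# Two uncorrelated pixels: SNR^2 reduces to the sum of (ds_i / sigma_i)^2.
delta_s = np.array([1.0, 2.0])        # mean signal difference
K = np.diag([1.0, 4.0])               # noise covariance (variances 1 and 4)
snr = hotelling_snr(delta_s, K)       # sqrt(1/1 + 4/4) = sqrt(2)
```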

  11. Image quality criteria for wide-field x-ray imaging applications

    NASA Astrophysics Data System (ADS)

    Thompson, Patrick L.; Harvey, James E.

    1999-10-01

    For staring, wide-field applications, such as a solar x-ray imager, the severe off-axis aberrations of the classical Wolter Type-I grazing incidence x-ray telescope design drastically limit the 'resolution' near the solar limb. A specification on the on-axis fractional encircled energy is thus not an appropriate image quality criterion for such wide-angle applications. A more meaningful image quality criterion would be a field-weighted-average measure of 'resolution.' Since surface scattering effects from residual optical fabrication errors are always substantial at these very short wavelengths, the field-weighted-average half-power radius is a far more appropriate measure of aerial resolution. If an ideal mosaic detector array is being used in the focal plane, the finite pixel size provides a practical limit to this system performance. Thus, the total number of aerial resolution elements enclosed by the operational field-of-view, expressed as a percentage of the number of ideal detector pixels, is a further improved image quality criterion. In this paper we describe the development of an image quality criterion for wide-field applications of grazing incidence x-ray telescopes which leads to a new class of grazing incidence designs described in a following companion paper.

  12. Homework Works If Homework Quality Is High: Using Multilevel Modeling to Predict the Development of Achievement in Mathematics

    ERIC Educational Resources Information Center

    Dettmers, Swantje; Trautwein, Ulrich; Ludtke, Oliver; Kunter, Mareike; Baumert, Jurgen

    2010-01-01

    The present study examined the associations of 2 indicators of homework quality (homework selection and homework challenge) with homework motivation, homework behavior, and mathematics achievement. Multilevel modeling was used to analyze longitudinal data from a representative national sample of 3,483 students in Grades 9 and 10; homework effects…

  13. Training Needs for Faculty Members: Towards Achieving Quality of University Education in the Light of Technological Innovations

    ERIC Educational Resources Information Center

    Abouelenein, Yousri Attia Mohamed

    2016-01-01

    The aim of this study was to identify training needs of university faculty members, in order to achieve the desired quality in the light of technological innovations. A list of training needs of faculty members was developed in terms of technological innovations in general, developing skills of faculty members in the use of technological…

  14. A Multilevel Analysis of the Role of School Quality and Family Background on Students' Mathematics Achievement in the Middle East

    ERIC Educational Resources Information Center

    Kareshki, Hossein; Hajinezhad, Zahra

    2014-01-01

    The purpose of the present study is investigating the correlation between school quality and family socioeconomic background and students' mathematics achievement in the Middle East. The countries in comparison are UAE, Syria, Qatar, Iran, Saudi Arabia, Oman, Lebanon, Jordan, and Bahrain. The study utilized data from IEA's Trends in International…

  15. Bhutanese Stakeholders' Perceptions about Multi-Grade Teaching as a Strategy for Achieving Quality Universal Primary Education

    ERIC Educational Resources Information Center

    Kucita, Pawan; Kivunja, Charles; Maxwell, T. W.; Kuyini, Bawa

    2013-01-01

    This study employed document analysis and qualitative interviews to explore the perceptions of different Bhutanese stakeholders about multi-grade teaching, which the Bhutanese Government identified as a strategy for achieving quality Universal Primary Education. The data from Ministry officials, teachers and student teachers were analyzed using…

  16. The Perception of Preservice Mathematics Teachers on the Role of Scaffolding in Achieving Quality Mathematics Classroom Instruction

    ERIC Educational Resources Information Center

    Bature, Iliya Joseph; Jibrin, Adamu Gagdi

    2015-01-01

    This paper was designed to investigate the perceptions of four preservice mathematics teachers on the role of scaffolding in supporting and assisting them to achieve quality classroom teaching. A collaborative approach to teaching through a community of practice was used to obtain data for the three research objectives that were postulated. Two…

  17. Mathematics Achievement among Secondary Students in Relation to Enrollment/Nonenrollment in Music Programs of Differing Content or Quality

    ERIC Educational Resources Information Center

    Van der Vossen, Maria R.

    2012-01-01

    This causal-comparative study examined the relationship between enrollment/non-enrollment in music programs of differing content or quality and mathematical achievement among 739 secondary (grades 8-12) students from four different Maryland counties. The students, both female and male, were divided into sample groups by their participation in a…

  18. Evaluation of scatter effects on image quality for breast tomosynthesis

    SciTech Connect

    Wu Gang; Mainprize, James G.; Boone, John M.; Yaffe, Martin J.

    2009-10-15

    Digital breast tomosynthesis uses a limited number (typically 10-20) of low-dose x-ray projections to produce a pseudo-three-dimensional volume tomographic reconstruction of the breast. The purpose of this investigation was to characterize and evaluate the effect of scattered radiation on the image quality for breast tomosynthesis. In a simulation, scatter point spread functions generated by a Monte Carlo simulation method were convolved over the breast projection to estimate the distribution of scatter for each angle of tomosynthesis projection. The results demonstrate that in the absence of scatter reduction techniques, images will be affected by cupping artifacts, and there will be reduced accuracy of attenuation values inferred from the reconstructed images. The effect of x-ray scatter on the contrast, noise, and lesion signal-difference-to-noise ratio (SDNR) in tomosynthesis reconstruction was measured as a function of the tumor size. When a with-scatter reconstruction was compared to one without scatter for a 5 cm compressed breast, the following results were observed. The contrast in the reconstructed central slice image of a tumorlike mass (14 mm in diameter) was reduced by 30%, the voxel value (inferred attenuation coefficient) was reduced by 28%, and the SDNR fell by 60%. The authors have quantified the degree to which scatter degrades the image quality over a wide range of parameters relevant to breast tomosynthesis, including x-ray beam energy, breast thickness, breast diameter, and breast composition. They also demonstrate, though, that even without a scatter rejection device, the contrast and SDNR in the reconstructed tomosynthesis slice are higher than those of conventional mammographic projection images acquired with a grid at an equivalent total exposure.
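The contrast loss described above follows from scatter adding a roughly uniform background: with a scatter-to-primary ratio SPR, subject contrast scales by 1/(1 + SPR). A small illustrative calculation with hypothetical numbers, not the paper's data:

```python
def degraded_contrast(primary_contrast, spr):
    """A uniform scatter floor scales subject contrast by 1/(1 + SPR)."""
    return primary_contrast / (1.0 + spr)

# A hypothetical SPR of 0.43 cuts a 10% primary contrast by about 30%,
# comparable in magnitude to the reduction reported for the 5 cm breast.
c = degraded_contrast(0.10, 0.43)
```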

  19. A new algorithm for integrated image quality measurement based on wavelet transform and human visual system

    NASA Astrophysics Data System (ADS)

    Wang, Haihui

    2006-01-01

    An essential determinant of the value of digital images is their quality. Over the past years, there have been many attempts to develop models or metrics for image quality that incorporate elements of human visual sensitivity. However, there is no current standard and objective definition of spectral image quality. This paper proposes a reliable automatic method for objective image quality measurement based on the wavelet transform and the human visual system. In this way, the proposed measure differentiates between random and signal-dependent distortions, which have different effects on a human observer. Performance of the proposed quality measure is illustrated by examples involving images with different types of degradation. The technique provides a means to relate the quality of an image to its interpretation and quantification throughout the frequency range, in which the noise level is estimated for quality evaluation. The experimental results of using this method for image quality measurement exhibit good correlation with subjective visual quality assessments.

  20. Cross-layer Energy Optimization Under Image Quality Constraints for Wireless Image Transmissions.

    PubMed

    Yang, Na; Demirkol, Ilker; Heinzelman, Wendi

    2012-01-01

    Wireless image transmission is critical in many applications, such as surveillance and environment monitoring. In order to make the best use of the limited energy of the battery-operated cameras, while satisfying the application-level image quality constraints, cross-layer design is critical. In this paper, we develop an image transmission model that allows the application layer (e.g., the user) to specify an image quality constraint, and optimizes the lower layer parameters of transmit power and packet length, to minimize the energy dissipation in image transmission over a given distance. The effectiveness of this approach is evaluated by applying the proposed energy optimization to a reference ZigBee system and a WiFi system, and also by comparing to an energy optimization study that does not consider any image quality constraint. Evaluations show that our scheme outperforms the default settings of the investigated commercial devices and saves a significant amount of energy at middle-to-large transmission distances. PMID:23508852

  1. Characterization of image quality for 3D scatter-corrected breast CT images

    NASA Astrophysics Data System (ADS)

    Pachon, Jan H.; Shah, Jainil; Tornai, Martin P.

    2011-03-01

    The goal of this study was to characterize the image quality of our dedicated, quasi-monochromatic spectrum, cone beam breast imaging system under scatter corrected and non-scatter corrected conditions for a variety of breast compositions. CT projections were acquired of a breast phantom containing two concentric sets of acrylic spheres that varied in size (1-8 mm) based on their polar position. The breast phantom was filled with 3 different concentrations of methanol and water, simulating a range of breast densities (0.79-1.0 g/cc); acrylic yarn was sometimes included to simulate connective tissue of a breast. For each phantom condition, 2D scatter was measured for all projection angles. Scatter-corrected and uncorrected projections were then reconstructed with an iterative ordered subsets convex algorithm. Reconstructed image quality was characterized using SNR and contrast analysis, and followed by a human observer detection task for the spheres in the different concentric rings. Results show that scatter correction effectively reduces the cupping artifact and improves image contrast and SNR. Results from the observer study indicate that there was no statistical difference in the number or sizes of lesions observed in the scatter versus non-scatter corrected images for all densities. Nonetheless, applying scatter correction for differing breast conditions improves overall image quality.

  2. Assessing and improving cobalt-60 digital tomosynthesis image quality

    NASA Astrophysics Data System (ADS)

    Marsh, Matthew B.; Schreiner, L. John; Kerr, Andrew T.

    2014-03-01

    Image guidance capability is an important feature of modern radiotherapy linacs, and future cobalt-60 units will be expected to have similar capabilities. Imaging with the treatment beam is an appealing option, for reasons of simplicity and cost, but the dose needed to produce cone beam CT (CBCT) images in a Co-60 treatment beam is too high for this modality to be clinically useful. Digital tomosynthesis (DT) offers a quasi-3D image, of sufficient quality to identify bony anatomy or fiducial markers, while delivering a much lower dose than CBCT. A series of experiments was conducted on a prototype Co-60 cone beam imaging system to quantify the resolution, selectivity, geometric accuracy and contrast sensitivity of Co-60 DT. Although the resolution is severely limited by the penumbra cast by the ~2 cm diameter source, it is possible to identify high contrast objects on the order of 1 mm in width, and bony anatomy in anthropomorphic phantoms is clearly recognizable. Low contrast sensitivity down to electron density differences of 3% is obtained, for uniform features of similar thickness. The conventional shift-and-add reconstruction algorithm was compared to several variants of the Feldkamp-Davis-Kress filtered backprojection algorithm. The Co-60 DT images were obtained with a total dose of 5 to 15 cGy each. We conclude that Co-60 radiotherapy units upgraded for modern conformal therapy could also incorporate imaging using filtered backprojection DT in the treatment beam. DT is a versatile and promising modality that would be well suited to image guidance requirements.

  3. Measuring Teacher Quality: Continuing the Search for Policy-Relevant Predictors of Student Achievement

    ERIC Educational Resources Information Center

    Knoeppel, Robert C.; Logan, Joyce P.; Keiser, Clare M.

    2005-01-01

    The purpose of this study was to investigate the potential viability of certification by the National Board for Professional Teaching Standards (NBPTS) as a policy-relevant predictor of student achievement. Because research has identified the teacher as the most important school-related predictor of student achievement, more research…

  4. Does High Quality Childcare Narrow the Achievement Gap at Two Years of Age?

    ERIC Educational Resources Information Center

    Ruzek, Erik; Burchinal, Margaret; Farkas, George; Duncan, Greg; Dang, Tran; Lee, Weilin

    2011-01-01

    The authors use the ECLS-B, a nationally-representative study of children born in 2001 to report the child care arrangements and quality characteristics for 2-year olds in the United States and to estimate the effects of differing levels of child care quality on two-year old children's cognitive development. Their goal is to test whether high…

  5. Quality improvement initiatives in neonatal intensive care unit networks: achievements and challenges.

    PubMed

    Shah, Vibhuti; Warre, Ruth; Lee, Shoo K

    2013-01-01

    Neonatal intensive care unit networks that encompass regions, states, and even entire countries offer the perfect platform for implementing continuous quality improvement initiatives to advance the health care provided to vulnerable neonates. Through cycles of identification and implementation of best available evidence, benchmarking, and feedback of outcomes, combined with mutual collaborative learning through a network of providers, the performance of health care systems and neonatal outcomes can be improved. We use examples of successful neonatal networks from across North America to explore continuous quality improvement in the neonatal intensive care unit, including the rationale for the formation of neonatal networks, the role of networks in continuous quality improvement, quality improvement methods and outcomes, and barriers to and facilitators of quality improvement. PMID:24268090

  6. SU-E-J-36: Comparison of CBCT Image Quality for Manufacturer Default Imaging Modes

    SciTech Connect

    Nelson, G

    2015-06-15

    Purpose: CBCT is being increasingly used in patient setup for radiotherapy. Often the manufacturer default scan modes are used for performing these CBCT scans with the assumption that they are the best options. To quantitatively assess the image quality of these scan modes, all of the scan modes were tested as well as options within the reconstruction algorithm. Methods: A CatPhan 504 phantom was scanned on a TrueBeam Linear Accelerator using the manufacturer scan modes (FSRT Head, Head, Image Gently, Pelvis, Pelvis Obese, Spotlight, & Thorax). The Head mode scan was then reconstructed multiple times with all filter options (Smooth, Standard, Sharp, & Ultra Sharp) and all Ring Suppression options (Disabled, Weak, Medium, & Strong). An open source ImageJ tool was created for analyzing the CatPhan 504 images. Results: The MTF curve was primarily dictated by the voxel size and the filter used in the reconstruction algorithm. The filters also impact the image noise. The CNR was worst for the Image Gently mode, followed by FSRT Head and Head. The sharper the filter, the worse the CNR. HU varied significantly between scan modes: Pelvis Obese had lower than expected HU values, while the Image Gently mode had higher than expected HU values. If a therapist tried to use preset window and level settings, they would not show the desired tissue for some scan modes. Conclusion: Knowing the image quality of the preset scan modes will enable users to better optimize their setup CBCT. Evaluation of the scan mode image quality could improve setup efficiency and lead to better treatment outcomes.
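The CNR figure used in CatPhan-style analyses like this one is typically the ROI contrast divided by the background noise. A minimal stdlib sketch with hypothetical HU samples (illustrative values, not the study's data):

```python
import statistics

def cnr(roi, background):
    """Contrast-to-noise ratio: |mean(ROI) - mean(bg)| / stdev(bg)."""
    contrast = abs(statistics.fmean(roi) - statistics.fmean(background))
    return contrast / statistics.pstdev(background)

# Hypothetical HU samples from a contrast insert and nearby background.
insert_hu = [110, 112, 108, 111]
background_hu = [10, 12, 8, 10]
```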

  7. TU-B-19A-01: Image Registration II: TG132-Quality Assurance for Image Registration

    SciTech Connect

    Brock, K; Mutic, S

    2014-06-15

    AAPM Task Group 132 was charged with reviewing the current approaches and solutions for image registration in radiotherapy and with providing recommendations for quality assurance and quality control of these clinical processes. As the results of image registration are always used as the input of another process for planning or delivery, it is important for the user to understand and document the uncertainty associated with the algorithm in general and the result of a specific registration. The recommendations of this task group, which at the time of abstract submission are being reviewed by the AAPM, include the following components. The user should understand the basic image registration techniques and methods of visualizing image fusion. The disclosure of the basic components of the image registration by commercial vendors is critical in this respect. The physicist should perform end-to-end tests of imaging, registration, and planning/treatment systems if image registration is performed on a stand-alone system. A comprehensive commissioning process should be performed and documented by the physicist prior to clinical use of the system. As documentation is important to the safe implementation of this process, a request and report system should be integrated into the clinical workflow. Finally, a patient-specific QA practice should be established for efficient evaluation of image registration results. The implementation of these recommendations will be described and illustrated during this educational session. Learning Objectives: Highlight the importance of understanding the image registration techniques used in the clinic. Describe the end-to-end tests needed for stand-alone registration systems. Illustrate a comprehensive commissioning program using both phantom data and clinical images. Describe a request and report system to ensure communication and documentation. Demonstrate a clinically efficient patient QA practice for evaluation of image registration results.

  8. Effect of nonlinear three-dimensional optimized reconstruction algorithm filter on image quality and radiation dose: Validation on phantoms

    SciTech Connect

    Bai Mei; Chen Jiuhong; Raupach, Rainer; Suess, Christoph; Tao Ying; Peng Mingchen

    2009-01-15

    A new technique called the nonlinear three-dimensional optimized reconstruction algorithm filter (3D ORA filter) is currently used to improve CT image quality and reduce radiation dose. This technical note describes the comparison of image noise, slice sensitivity profile (SSP), contrast-to-noise ratio, and modulation transfer function (MTF) on phantom images processed with and without the 3D ORA filter, and the effect of the 3D ORA filter on CT images at a reduced dose. For CT head scans the noise reduction was up to 54% with typical bone reconstruction algorithms (H70) and a 0.6 mm slice thickness; for liver CT scans the noise reduction was up to 30% with typical high-resolution reconstruction algorithms (B70) and a 0.6 mm slice thickness. MTF and SSP did not change significantly with the application of 3D ORA filtering (P>0.05), whereas noise was reduced (P<0.05). The low contrast detectability and MTF of images obtained at a reduced dose and filtered by the 3D ORA were equivalent to those of standard dose CT images; there was no significant difference in image noise of scans taken at a reduced dose, filtered using 3D ORA and standard dose CT (P>0.05). The 3D ORA filter shows good potential for reducing image noise without affecting image quality attributes such as sharpness. By applying this approach, the same image quality can be achieved whilst gaining a marked dose reduction.

  9. Color image quality in projection displays: a case study

    NASA Astrophysics Data System (ADS)

    Strand, Monica; Hardeberg, Jon Y.; Nussbaum, Peter

    2005-01-01

    Recently the use of projection displays has increased dramatically in different applications such as digital cinema, home theatre, and business and educational presentations. Even if the color image quality of these devices has improved significantly over the years, it is still a common situation for users of projection displays that the projected colors differ significantly from the intended ones. The study presented in this paper attempts to analyze the color image quality of a large set of projection display devices, particularly investigating the variations in color reproduction. As a case study, a set of 14 projectors (LCD and DLP technology) at Gjøvik University College was tested under four different conditions: dark and light room, with and without using an ICC profile. To find out more about the importance of the illumination conditions in a room, and the degree of improvement when using an ICC profile, the results from the measurements were processed and analyzed. Eye-One Beamer from GretagMacbeth was used to make the profiles. The color image quality was evaluated both visually and by color difference calculations. The results from the analysis indicated large visual and colorimetric differences between the projectors. The DLP projectors generally have smaller color gamuts than the LCD projectors, and the color gamuts of older projectors are significantly smaller than those of newer ones. The amount of ambient light reaching the screen is of great importance for the visual impression: if too much reflected or other ambient light reaches the screen, the projected image becomes pale and has low contrast. When using a profile, the differences in colors between the projectors get smaller and the colors appear more correct. For one device, the average ΔE*ab color difference when compared to a relative white reference was reduced from 22 to 11, for another from 13 to 6. Blue colors have the largest variations among the projection displays…
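The ΔE*ab values quoted above are CIE76 color differences, i.e. Euclidean distances in CIELAB space. A minimal sketch with hypothetical L*a*b* values:

```python
def delta_e_ab(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB space."""
    return sum((x - y) ** 2 for x, y in zip(lab1, lab2)) ** 0.5

measured = (48.0, 5.0, -8.0)   # hypothetical projected color (L*, a*, b*)
intended = (50.0, 0.0, 0.0)    # hypothetical intended color
de = delta_e_ab(measured, intended)
```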

  10. Color image quality in projection displays: a case study

    NASA Astrophysics Data System (ADS)

    Strand, Monica; Hardeberg, Jon Y.; Nussbaum, Peter

    2004-10-01

    Recently the use of projection displays has increased dramatically in different applications such as digital cinema, home theatre, and business and educational presentations. Even if the color image quality of these devices has improved significantly over the years, it is still a common situation for users of projection displays that the projected colors differ significantly from the intended ones. The study presented in this paper attempts to analyze the color image quality of a large set of projection display devices, particularly investigating the variations in color reproduction. As a case study, a set of 14 projectors (LCD and DLP technology) at Gjøvik University College was tested under four different conditions: dark and light room, with and without using an ICC profile. To find out more about the importance of the illumination conditions in a room, and the degree of improvement when using an ICC profile, the results from the measurements were processed and analyzed. Eye-One Beamer from GretagMacbeth was used to make the profiles. The color image quality was evaluated both visually and by color difference calculations. The results from the analysis indicated large visual and colorimetric differences between the projectors. The DLP projectors generally have smaller color gamuts than the LCD projectors, and the color gamuts of older projectors are significantly smaller than those of newer ones. The amount of ambient light reaching the screen is of great importance for the visual impression: if too much reflected or other ambient light reaches the screen, the projected image becomes pale and has low contrast. When using a profile, the differences in colors between the projectors get smaller and the colors appear more correct. For one device, the average ΔE*ab color difference when compared to a relative white reference was reduced from 22 to 11, for another from 13 to 6. Blue colors have the largest variations among the projection displays…

  11. Image Quality of the Helioseismic and Magnetic Imager (HMI) Onboard the Solar Dynamics Observatory (SDO)

    NASA Technical Reports Server (NTRS)

    Wachter, R.; Schou, Jesper; Rabello-Soares, M. C.; Miles, J. W.; Duvall, T. L., Jr.; Bush, R. I.

    2011-01-01

    We describe the imaging quality of the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO) as measured during the ground calibration of the instrument. We describe the calibration techniques and report our results for the final configuration of HMI. We present the distortion, modulation transfer function, stray light, image shifts introduced by moving parts of the instrument, best focus, field curvature, and the relative alignment of the two cameras. We investigate the gain and linearity of the cameras, and present the measured flat field.

  12. Image quality and localization accuracy in C-arm tomosynthesis-guided head and neck surgery

    SciTech Connect

    Bachar, G.; Siewerdsen, J. H.; Daly, M. J.; Jaffray, D. A.; Irish, J. C.

    2007-12-15

    … An overall 3D localization accuracy of ≈2.5 mm was achieved with θ_tot ≈ 90° for most tasks. The high in-plane spatial resolution, short scanning time, and low radiation dose characteristic of tomosynthesis may enable the surgeon to collect near real-time images throughout the procedure with minimal interference to surgical workflow. Therefore, tomosynthesis could provide a useful addition to the image-guided surgery arsenal, providing on-demand, high-quality image updates, complemented by CBCT at critical milestones in the surgical procedure.

  13. Pleiades image quality: from users' needs to products definition

    NASA Astrophysics Data System (ADS)

    Kubik, Philippe; Pascal, Véronique; Latry, Christophe; Baillarin, Simon

    2005-10-01

    Pleiades is the highest resolution civilian Earth observing system ever developed in Europe. This imagery programme is conducted by the French National Space Agency, CNES. In 2008-2009 it will begin operating two agile satellites designed to provide optical images to civilian and defence users. Images will be acquired simultaneously in panchromatic (PA) and multispectral (XS) mode, which allows, under nadir acquisition conditions, delivery of 20 km wide, false- or natural-color scenes with a 70 cm ground sampling distance after PA+XS fusion. Imaging capabilities have been highly optimized in order to acquire along-track mosaics, stereo pairs and triplets, and multiple targets. To fulfill the operational requirements and ensure quick access to information, ground processing has to perform the radiometric and geometric corrections automatically. Since ground processing capabilities were taken into account very early in the programme development, it has been possible to relax some costly on-board component requirements in order to achieve a cost-effective on-board/ground compromise. Starting from an overview of the system characteristics, this paper deals with the image product definitions (raw level, perfect sensor, orthoimage and along-track orthomosaics) and the main processing steps. It shows how each system performance is a result of the satellite performance followed by appropriate ground processing. Finally, it focuses on the radiometric performance of the final products, which is intimately linked to the following processing steps: radiometric corrections, PA restoration, image resampling and pan-sharpening.

  14. Comparison of image quality in computed laminography and tomography.

    PubMed

    Xu, Feng; Helfen, Lukas; Baumbach, Tilo; Suhonen, Heikki

    2012-01-16

    In computed tomography (CT), projection images of the sample are acquired over an angular range between 180 and 360 degrees around a rotation axis. A special case of CT is that of limited-angle CT, where some of the rotation angles are inaccessible, leading to artefacts in the reconstruction because of missing information. The case of flat samples is considered, where the projection angles that are close to the sample surface are either (i) completely unavailable or (ii) very noisy due to the limited transmission at these angles. Computed laminography (CL) is an imaging technique especially suited for flat samples. CL is a generalization of CT that uses a rotation axis tilted by less than 90 degrees with respect to the incident beam; thus CL avoids using projections from the angles closest to the sample surface. We make a quantitative comparison of the imaging artefacts between CL and limited-angle CT for the case of a parallel-beam geometry. Both experimental and simulated images are used to characterize the effect of the artefacts on the resolution and visible image features. The results indicate that CL has an advantage over CT in cases where the missing angular range is a significant portion of the total angular range. In the case when the quality of the projections is limited by noise, CT allows a better tradeoff between the noise level and the missing angular range. PMID:22274425

  15. Derivation of the scan time requirement for maintaining a consistent PET image quality

    NASA Astrophysics Data System (ADS)

    Kim, Jin Su; Lee, Jae Sung; Kim, Seok-Ki

    2015-05-01

    Objectives: the image quality of PET for larger patients is relatively poor, even though the injection dose is optimized considering the noise-equivalent count rate (NECR) characteristics of the PET scanner. This poor image quality is due to the lower maximum NECR that can be achieved in large patients. The aim of this study was to optimize the PET scan time to obtain a consistent PET image quality regardless of body size, based on the relationship between the patient-specific NECR (pNECR) and body weight. Methods: eighty patients (M/F = 53/27, body weight: 59 ± 1 kg) underwent whole-body FDG PET scans using a Philips GEMINI GS PET/CT scanner after an injection of 0.14 mCi/kg FDG. The relationship between the scatter fraction (SF) and body weight was determined by repeated Monte Carlo simulations using a NEMA scatter phantom, the size of which was varied according to the relationship between abdominal circumference and body weight. Using this information, the pNECR was calculated from the prompt and delayed PET sinograms to obtain a prediction equation for NECR versus body weight. The time scaling factor (FTS) for the scan duration was then derived to produce PET images with equivalent SNR levels. Results: the SF and NECR had the following nonlinear relationships with body weight: SF = 0.15 · (body weight)^0.3 and NECR = 421.36 · (body weight)^-0.84. The equation derived for FTS was 0.01 · (body weight) + 0.2, which means, for example, that a 120-kg person should be scanned 1.8 times longer than a 70-kg person, or that the scan time for a 40-kg person can be reduced by 30%. Conclusion: the equation for the relative scan-time demand derived in this study will be useful for maintaining consistent PET image quality in clinics.
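
    The scaling relations quoted in the abstract can be turned into a small calculator. This is a sketch based only on the equations reported above (SF = 0.15·w^0.3, NECR = 421.36·w^-0.84, FTS = 0.01·w + 0.2, with w the body weight in kg); the function names are hypothetical.

```python
def scatter_fraction(weight_kg):
    # SF = 0.15 * weight^0.3, as reported in the abstract
    return 0.15 * weight_kg ** 0.3

def necr(weight_kg):
    # NECR = 421.36 * weight^-0.84 (decreases with patient size)
    return 421.36 * weight_kg ** -0.84

def time_scaling_factor(weight_kg):
    # FTS = 0.01 * weight + 0.2: relative scan duration needed
    # to keep the image SNR consistent across patient sizes
    return 0.01 * weight_kg + 0.2

# Relative scan time of a 120-kg patient versus a 70-kg patient,
# taking the FTS values as relative duration factors
ratio = time_scaling_factor(120) / time_scaling_factor(70)
```

    Since FTS is defined relative to a reference acquisition, the ratio between two patients' FTS values gives the relative scan-time adjustment directly.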

  16. Using collective expert judgements to evaluate quality measures of mass spectrometry images

    PubMed Central

    Palmer, Andrew; Ovchinnikova, Ekaterina; Thuné, Mikael; Lavigne, Régis; Guével, Blandine; Dyatlov, Andrey; Vitek, Olga; Pineau, Charles; Borén, Mats; Alexandrov, Theodore

    2015-01-01

    Motivation: Imaging mass spectrometry (IMS) is a maturing technique of molecular imaging. Confidence in the reproducible quality of IMS data is essential for its integration into routine use. However, the predominant method for assessing quality is visual examination, a time-consuming, unstandardized and non-scalable approach. So far, the problem of assessing quality has been addressed only marginally, and existing measures do not account for the spatial information in IMS data. Importantly, no approach exists for unbiased evaluation of potential quality measures. Results: We propose a novel approach for evaluating potential measures by creating a gold-standard set using collective expert judgements, upon which we evaluated image-based measures. To produce a gold standard, we engaged 80 IMS experts, each rating the relative quality of 52 pairs of ion images from MALDI-TOF IMS datasets of rat brain coronal sections. Experts' optional feedback on their expertise, the task and the survey showed that (i) they had diverse backgrounds and sufficient expertise, (ii) the task was properly understood, and (iii) the survey was comprehensible. A moderate inter-rater agreement was achieved, with a Krippendorff's alpha of 0.5. A gold-standard set of 634 pairs of images with accompanying ratings was constructed and showed a high agreement of 0.85. Eight families of potential measures with a range of parameters and statistical descriptors, 143 in total, were evaluated. Both signal-to-noise and spatial-chaos-based measures performed well, with correlations of 0.7 to 0.9 with the gold-standard ratings. Moreover, we showed that a composite measure with linear coefficients (trained on the gold standard with regularized least-squares optimization and the lasso) showed a strong linear correlation of 0.94 and an accuracy of 0.98 in predicting which image in a pair was of higher quality. Availability and implementation: The anonymized data collected from the survey
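
    The composite measure described above is a linear combination of individual quality measures whose scores are then correlated with the gold-standard ratings. A minimal stdlib sketch of that evaluation step, with hypothetical per-image measures and ratings (the fitted weights in the paper come from regularized least squares; the weights here are illustrative only):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def composite(measures, weights):
    """Linear combination of per-image quality measures."""
    return [sum(w * m for w, m in zip(weights, row)) for row in measures]

# Hypothetical data: 4 images, 2 measures (signal-to-noise, spatial chaos)
measures = [[0.9, 0.1], [0.7, 0.3], [0.5, 0.6], [0.2, 0.9]]
gold = [4.0, 3.0, 2.0, 1.0]         # gold-standard quality ratings
scores = composite(measures, [1.0, -0.5])
r = pearson(scores, gold)            # correlation with the gold standard
```

    In the paper this correlation reaches 0.94 for the fitted composite; the point of the sketch is only the shape of the computation.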

  17. High-volume image quality assessment systems: tuning performance with an interactive data visualization tool

    NASA Astrophysics Data System (ADS)

    Bresnahan, Patricia A.; Pukinskis, Madeleine; Wiggins, Michael

    1999-03-01

    Image quality assessment systems differ greatly with respect to the number and types of images they need to evaluate, and to their overall architectures. Managers of these systems, however, all need to be able to tune and evaluate system performance, requirements that are often overlooked or under-designed during project planning. Performance tuning tools allow users to define acceptable quality standards for image features and attributes by adjusting parameter settings. Performance analysis tools allow users to evaluate and/or predict how well a system performs in a given parameter state. While image assessment algorithms are becoming quite sophisticated, duplicating or surpassing the human decision-making process in speed and reliability, they often require a greater investment in 'training', or fine tuning of parameters, to achieve optimum performance. This process may involve the analysis of hundreds or thousands of images, generating a large database of files and statistics that can be difficult to sort through and interpret. Compounding the difficulty is the fact that the personnel charged with tuning and maintaining the production system may not have the statistical or analytical background required for the task. Meanwhile, hardware innovations have greatly increased the volume of images that can be handled in a given time frame, magnifying the consequences of running a production site with an inadequately tuned system. In this paper, some general requirements for a performance evaluation and tuning data visualization system are discussed. A custom-engineered solution to the tuning and evaluation problem is then presented, developed within the context of a high-volume image quality assessment, data entry, OCR, and image archival system. A key factor influencing the design of the system was the context-dependent definition of image quality as perceived by a human interpreter. This led to the development of a five-level, hierarchical approach to image quality

  18. Patient dose and image quality from mega-voltage cone beam computed tomography imaging

    SciTech Connect

    Gayou, Olivier; Parda, David S.; Johnson, Mark; Miften, Moyed

    2007-02-15

    The evolution of ever more conformal radiation delivery techniques makes accurate localization of increasing importance in radiotherapy. Several systems can be utilized, including kilo-voltage and mega-voltage cone-beam computed tomography (MV-CBCT), CT-on-rails, and helical tomotherapy. One attractive aspect of mega-voltage cone-beam CT is that it uses the therapy beam, along with an electronic portal imaging device, to image the patient prior to the delivery of treatment. However, the use of a photon beam energy in the mega-voltage range for volumetric imaging degrades the image quality and increases the patient radiation dose. To optimize image quality and patient dose in MV-CBCT imaging procedures, a series of dose measurements in cylindrical and anthropomorphic phantoms was performed using an ionization chamber, radiographic films, and thermoluminescent dosimeters. Furthermore, the dependence of the contrast-to-noise ratio and spatial resolution of the image on the delivered dose was evaluated for a 20-cm-diam cylindrical phantom. Depending on the anatomical site and patient thickness, we found that the minimum dose deposited in the irradiated volume was 5-9 cGy and the maximum dose was between 9 and 17 cGy for our clinical MV-CBCT imaging protocols. Results also demonstrated that for high-contrast areas such as bony anatomy, low doses are sufficient for image registration and visualization of the three-dimensional boundaries between soft tissue and bony structures. However, as the difference in tissue density decreased, the dose required to identify soft-tissue boundaries increased. Finally, the dose delivered by MV-CBCT was simulated using a treatment planning system (TPS), thereby allowing the incorporation of the MV-CBCT dose in the treatment planning process. The TPS-calculated doses agreed well with measurements for a wide range of imaging protocols.

  19. The quality transformation: A catalyst for achieving Energy's strategic vision

    SciTech Connect

    1995-01-01

    This plan describes the initial six corporate quality goals for DOE. It also includes accompanying performance measures which will help DOE determine progress towards meeting these goals. The six goals are: (1) There is effective use of performance measurement based on regular assessment of Energy operations using the Presidential Award for Quality, the Malcolm Baldrige National Quality Award, or equivalent criteria. (2) All managers champion continuous quality improvement training for all employees through planning, attendance, and active application. (3) The Department leadership has provided the environment in which employees are enabled to satisfy customer requirements and realize their full potential. (4) The Department management practices foster employee involvement, development and recognition. (5) The Department continuously improves customer service and satisfaction, and internal and external customers recognize Energy as an excellent service provider. (6) The Department has a system which aligns strategic and operational planning with strategic intent, ensures this planning drives resource allocation, provides for regular evaluation of results, and provides feedback.

  20. Decision theory applied to image quality control in radiology

    PubMed Central

    Lessa, Patrícia S; Caous, Cristofer A; Arantes, Paula R; Amaro, Edson; de Souza, Fernando M Campello

    2008-01-01

    Background The present work applies decision theory to radiological image quality control (QC) in the diagnostic routine. The main problem addressed in the framework of decision theory is whether to accept or reject a film lot of a radiology service. The probability of each decision for a given set of variables was obtained from the selected films. Methods Based on a radiology service routine, a decision probability function was determined for each considered group of combined characteristics. These characteristics were related to film quality control. The parameters were framed in a set of 8 possibilities, resulting in 256 possible decision rules. In order to determine a general utility function for assessing the decision risk, we used a single parameter called r. The payoffs chosen were: diagnostic result (correct/incorrect), cost (high/low), and patient satisfaction (yes/no), resulting in eight possible combinations. Results Depending on the value of r, more or less risk is associated with the decision-making. The utility function was evaluated in order to determine the probability of a decision. The decision was made with patients' or administrators' opinions from a radiology service center. Conclusion The model is a formal quantitative approach to decision-making about medical image quality, providing an instrument to discriminate what is really necessary to accept or reject a film or a film lot. The method presented herein can help to assess the risk level of an incorrect radiological diagnosis decision. PMID:19014545
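
    The combinatorics in the abstract can be made concrete: three binary payoff attributes give 2^3 = 8 outcome combinations, and a decision rule assigns accept/reject to each of those outcomes, giving 2^8 = 256 possible rules. A stdlib sketch (the attribute names follow the abstract):

```python
from itertools import product

# Three binary payoff attributes from the abstract
attributes = ["diagnosis_correct", "cost_low", "patient_satisfied"]

# All 2^3 = 8 combinations of payoff outcomes
outcomes = list(product([True, False], repeat=len(attributes)))

# A decision rule maps each outcome to accept (True) or reject (False):
# one accept/reject choice per outcome gives 2^8 = 256 possible rules.
rules = list(product([True, False], repeat=len(outcomes)))

print(len(outcomes), len(rules))  # 8 256
```

    Evaluating each rule's expected utility under the fitted probabilities (and the risk parameter r) is then a straightforward loop over the 256 rules.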

  1. Image quality evaluation of breast tomosynthesis with synchrotron radiation

    SciTech Connect

    Malliori, A.; Bliznakova, K.; Speller, R. D.; Horrocks, J. A.; Rigon, L.; Tromba, G.; Pallikarakis, N.

    2012-09-15

    Purpose: This study investigates the image quality of tomosynthesis slices obtained from several acquisition sets with synchrotron radiation, using a breast phantom incorporating details that mimic various breast lesions in a heterogeneous background. Methods: A complex breast phantom (MAMMAX) with a heterogeneous background and a thickness corresponding to a 4.5 cm compressed breast with an average composition of 50% adipose and 50% glandular tissue was assembled from two commercial phantoms. Projection images over acquisition arcs of 24°, 32°, 40°, 48°, and 56° at an incident energy of 17 keV were obtained from the phantom at the synchrotron radiation for medical physics beamline of the ELETTRA Synchrotron Light Laboratory. The total mean glandular dose was set to 2.5 mGy. Tomograms were reconstructed with the simple multiple projection algorithm (MPA) and with filtered MPA. In the latter case, a median filter, a sinc filter, and a combination of the two were applied to the experimental data prior to MPA reconstruction. Visual inspection, contrast-to-noise ratio, contrast, and artifact spread function were the figures of merit used to evaluate the visualization and detection of low- and high-contrast breast features as a function of the reconstruction algorithm and acquisition arc. To study the benefits of using monochromatic beams, single projection images at incident energies ranging from 14 to 27 keV were acquired with the same phantom and weighted to synthesize polychromatic images for a typical incident x-ray spectrum with a W target. Results: Filters were optimised to reconstruct features with different attenuation characteristics and dimensions. In the case of 6 mm low-contrast details, improved visual appearance as well as higher contrast-to-noise ratio and contrast values were observed for the two filtered MPA algorithms that exploit the sinc filter. These features are better visualized

  2. Perceptual quality metric of color quantization errors on still images

    NASA Astrophysics Data System (ADS)

    Pefferkorn, Stephane; Blin, Jean-Louis

    1998-07-01

    A new metric for the assessment of color image coding quality is presented in this paper. Two models of chromatic and achromatic error visibility have been investigated, incorporating many aspects of human vision and color perception. The achromatic model accounts for both retinal and cortical phenomena such as visual sensitivity to spatial contrast and orientation. The chromatic metric is based on a multi-channel model of human color vision that is parameterized for video coding applications using psychophysical experiments, assuming that perception of color quantization errors can be assimilated to perception of supra-threshold local color differences. The final metric merges the chromatic and achromatic models and accounts for phenomena such as masking. The metric was tested on 6 real images at 5 quality levels using subjective assessments. The high correlation between objective and subjective scores shows that the described metric accurately rates the rendition of important image features such as color contours and textures.

  3. A hyperspectral imaging prototype for online quality evaluation of pickling cucumbers

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A hyperspectral imaging prototype was developed for online evaluation of external and internal quality of pickling cucumbers. The prototype had several new, unique features including simultaneous reflectance and transmittance imaging and inline, real time calibration of hyperspectral images of each ...

  4. Automating PACS Quality Control with the Vanderbilt Image Processing Enterprise Resource.

    PubMed

    Esparza, Michael L; Welch, E Brian; Landman, Bennett A

    2012-02-12

    Precise image acquisition is an integral part of modern patient care and medical imaging research. Periodic quality control using standardized protocols and phantoms ensures that scanners are operating according to specifications, yet such procedures do not ensure that individual datasets are free from corruption, for example due to patient motion, transient interference, or physiological variability. If unacceptable artifacts are noticed during scanning, a technologist can repeat a procedure. Yet substantial delays may be incurred if a problematic scan is not noticed until a radiologist reads the scans or an automated algorithm fails. Given the scores of slices in typical three-dimensional scans and the wide variety of potential use cases, a technologist cannot practically be expected to inspect all images. In large-scale research, automated pipeline systems have had great success in achieving high throughput. However, clinical and institutional workflows are largely based on DICOM and PACS technologies; these systems are not readily compatible with research systems due to security and privacy restrictions. Hence, quantitative quality control has been relegated to individual investigators and too often neglected. Herein, we propose a scalable system, the Vanderbilt Image Processing Enterprise Resource (VIPER), to integrate modular quality control and image analysis routines with a standard PACS configuration. This server unifies image processing routines across an institutional level and provides a simple interface so that investigators can collaborate to deploy new analysis technologies. VIPER integrates with high-performance computing environments and has successfully analyzed all standard scans from our institutional research center over the course of the last 18 months. PMID:24357910

  5. Image quality: An overview; Proceedings of the Meeting, Arlington, VA, April 9, 10, 1985

    NASA Astrophysics Data System (ADS)

    Granger, E. M.; Baker, L. R.

    1985-12-01

    Various papers on image quality are presented. The subjects discussed include: image quality considerations in transform coding, psychophysical approach to image quality, a decision theory approach to tone reproduction, Fourier analysis of image raggedness, lens performance assessment by image quality criteria, results of preliminary work on objective MRTD measurement, resolution requirements for binarization of line art, and problems of the visual display in flight simulation. Also addressed are: emittance in thermal imaging applications, optical performance requirements for thermal imaging lenses, dynamic motion measurement using digital TV speckle interferometry, quality assurance for borescopes, versatile projector test device, operational MTF for Landsat Thematic Mapper, operational use of color perception to enhance satellite image quality, theoretical bases and measurement of the MTF of integrated image sensors, measurement of the MTF of thermal and other video systems, and underflight calibration of the Landsat Thematic Mapper.

  6. Functional magnetic resonance imaging of awake monkeys: some approaches for improving imaging quality

    PubMed Central

    Chen, Gang; Wang, Feng; Dillenburger, Barbara C.; Friedman, Robert M.; Chen, Li M.; Gore, John C.; Avison, Malcolm J.; Roe, Anna W.

    2011-01-01

    Functional magnetic resonance imaging (fMRI) at high magnetic field strength can suffer from serious degradation of image quality because of motion and physiological noise, as well as spatial distortions and signal losses due to susceptibility effects. Overcoming such limitations is essential for sensitive detection and reliable interpretation of fMRI data. These issues are particularly problematic in studies of awake animals. As part of our initial efforts to study functional brain activations in awake, behaving monkeys using fMRI at 4.7 T, we have developed acquisition and analysis procedures to improve image quality, with encouraging results. We evaluated the influence of two main variables on image quality. First, we show how important the level of behavioral training is for obtaining good data stability and high temporal signal-to-noise ratios. In initial sessions, our typical scan session lasted 1.5 hours, partitioned into short (<10 minute) runs. During reward periods and breaks between runs, the monkey exhibited movements resulting in considerable image misregistrations. After a few months of extensive behavioral training, we were able to increase the length of individual runs and the total length of each session. The monkey learned to wait until the end of a block for fluid reward, resulting in longer periods of continuous acquisition. Each additional 60 training sessions extended the duration of each session by 60 minutes, culminating, after about 140 training sessions, in sessions that lasted about four hours. As a result, the average translational movement decreased from over 500 μm to less than 80 μm, a displacement close to that observed in anesthetized monkeys scanned in a 7 T horizontal scanner. Another major source of distortion at high fields arises from susceptibility variations. To reduce such artifacts, we used segmented gradient-echo echo-planar imaging (EPI) sequences. Increasing the number of segments significantly decreased susceptibility

  7. Image quality in CT: From physical measurements to model observers.

    PubMed

    Verdun, F R; Racine, D; Ott, J G; Tapiovaara, M J; Toroi, P; Bochud, F O; Veldkamp, W J H; Schegerer, A; Bouwman, R W; Giron, I Hernandez; Marshall, N W; Edyvean, S

    2015-12-01

    Evaluation of image quality (IQ) in computed tomography (CT) is important to ensure that diagnostic questions are correctly answered, whilst keeping radiation dose to the patient as low as is reasonably possible. The assessment of individual aspects of IQ is already a key component of routine quality control of medical x-ray devices. These values, together with standard dose indicators, can be combined into 'figures of merit' (FOM) that characterise the dose efficiency of CT scanners operating in certain modes. The demand for clinically relevant IQ characterisation has naturally increased with the development of CT technology (detector efficiency, image reconstruction and processing), resulting in the adaptation and evolution of assessment methods. The purpose of this review is to present the spectrum of methods that have been used to characterise image quality in CT: from objective measurements of physical parameters to clinically task-based approaches (i.e. the model observer (MO) approach), including the pure human observer approach. When combined with a dose indicator, a generalised dose efficiency index can be explored in a framework of system and patient dose optimisation. We focus on the IQ methodologies required for standard reconstruction, but also for iterative reconstruction algorithms. With this concept, the previously used FOM are presented along with a proposal to update them in order to keep them relevant to technological progress. The MO, which objectively assesses IQ for clinically relevant tasks, represents the most promising method in terms of matching radiologist performance and is therefore of most relevance in the clinical environment. PMID:26459319
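
    One common form of the dose-efficiency figure of merit mentioned above combines a task-based detectability index with a dose indicator, e.g. FOM = d'^2 / dose. The exact definition varies between studies, so the sketch below is a generic illustration with hypothetical numbers, not the specific FOM proposed in this review.

```python
def dose_efficiency_fom(detectability, dose_mgy):
    """Generic dose-efficiency figure of merit: FOM = d'^2 / dose.

    detectability -- task-based detectability index d' (dimensionless)
    dose_mgy      -- dose indicator, e.g. CTDIvol in mGy
    A protocol reaching the same detectability at lower dose scores higher.
    """
    return detectability ** 2 / dose_mgy

# Hypothetical comparison: an iterative reconstruction reaches the same
# d' = 3.0 at 6 mGy that filtered back-projection needs 10 mGy for.
fom_fbp = dose_efficiency_fom(3.0, 10.0)
fom_ir = dose_efficiency_fom(3.0, 6.0)
```

    Squaring d' makes the FOM dose-invariant for quantum-noise-limited systems, which is why this form is widely used when ranking protocols.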

  8. Quality of Education Predicts Performance on the Wide Range Achievement Test-4th Edition Word Reading Subtest

    PubMed Central

    Sayegh, Philip; Arentoft, Alyssa; Thaler, Nicholas S.; Dean, Andy C.; Thames, April D.

    2014-01-01

    The current study examined whether self-rated education quality predicts Wide Range Achievement Test-4th Edition (WRAT-4) Word Reading subtest and neurocognitive performance, and aimed to establish this subtest's construct validity as an educational quality measure. In a community-based adult sample (N = 106), we tested whether education quality both increased the prediction of Word Reading scores beyond demographic variables and predicted global neurocognitive functioning after adjusting for WRAT-4. As expected, race/ethnicity and education predicted WRAT-4 reading performance. Hierarchical regression revealed that when including education quality, the amount of WRAT-4's explained variance increased significantly, with race/ethnicity and both education quality and years as significant predictors. Finally, WRAT-4 scores, but not education quality, predicted neurocognitive performance. Results support WRAT-4 Word Reading as a valid proxy measure for education quality and a key predictor of neurocognitive performance. Future research should examine these findings in larger, more diverse samples to determine their robust nature. PMID:25404004

  9. Quality of education predicts performance on the Wide Range Achievement Test-4th Edition Word Reading subtest.

    PubMed

    Sayegh, Philip; Arentoft, Alyssa; Thaler, Nicholas S; Dean, Andy C; Thames, April D

    2014-12-01

    The current study examined whether self-rated education quality predicts Wide Range Achievement Test-4th Edition (WRAT-4) Word Reading subtest and neurocognitive performance, and aimed to establish this subtest's construct validity as an educational quality measure. In a community-based adult sample (N = 106), we tested whether education quality both increased the prediction of Word Reading scores beyond demographic variables and predicted global neurocognitive functioning after adjusting for WRAT-4. As expected, race/ethnicity and education predicted WRAT-4 reading performance. Hierarchical regression revealed that when including education quality, the amount of WRAT-4's explained variance increased significantly, with race/ethnicity and both education quality and years as significant predictors. Finally, WRAT-4 scores, but not education quality, predicted neurocognitive performance. Results support WRAT-4 Word Reading as a valid proxy measure for education quality and a key predictor of neurocognitive performance. Future research should examine these findings in larger, more diverse samples to determine their robust nature. PMID:25404004

  10. Degraded visual environment image/video quality metrics

    NASA Astrophysics Data System (ADS)

    Baumgartner, Dustin D.; Brown, Jeremy B.; Jacobs, Eddie L.; Schachter, Bruce J.

    2014-06-01

    A number of image quality metrics (IQMs) and video quality metrics (VQMs) have been proposed in the literature for evaluating techniques and systems for mitigating degraded visual environments. Some require both pristine and corrupted imagery. Others require patterned target boards in the scene. None of these metrics relates well to the task of landing a helicopter in conditions such as a brownout dust cloud. We have developed and used a variety of IQMs and VQMs related to the pilot's ability to detect hazards in the scene and to maintain situational awareness. Some of these metrics can be made agnostic to sensor type. Not only are the metrics suitable for evaluating algorithm and sensor variation, they are also suitable for choosing the most cost effective solution to improve operating conditions in degraded visual environments.

  11. Perceptual image quality assessment: recent progress and trends

    NASA Astrophysics Data System (ADS)

    Lin, Weisi; Narwaria, Manish

    2010-07-01

    Image quality assessment (IQA) is useful in many visual processing systems but challenging to perform in line with human perception. A great deal of recent research effort has been directed towards IQA. In order to overcome the difficulty and infeasibility of subjective tests in many situations, the aim of such effort is to assess visual quality objectively, in better alignment with the perception of the human visual system (HVS). In this work, we review and analyze the recent progress in areas related to IQA, giving our own views wherever possible. Following the recent trends, we discuss the engineering approach in more detail, explore the related aspects of feature pooling, and present a case study with machine learning.

  12. Live births achieved via IVF are increased by improvements in air quality and laboratory environment

    PubMed Central

    Heitmann, Ryan J; Hill, Micah J; James, Aidita N; Schimmel, Tim; Segars, James H; Csokmay, John M; Cohen, Jacques; Payson, Mark D

    2016-01-01

    Infertility is a common disease, which causes many couples to seek treatment with assisted reproduction techniques. Many factors contribute to successful assisted reproduction technique outcomes. One important factor is the laboratory environment and air quality. Our facility had the unique opportunity to compare two consecutively used but separate assisted reproduction technique laboratories, as a result of a required move. Environmental conditions were improved by strategic engineering designs. All other aspects of the IVF laboratory, including equipment, physicians, embryologists, nursing staff and protocols, were kept constant between facilities. Air quality testing showed improved air quality at the new IVF site. Embryo implantation (32.4% versus 24.3%; P < 0.01) and live birth (39.3% versus 31.8%; P < 0.05) rates were significantly increased in the new facility compared with the old facility. More patients met clinical criteria and underwent mandatory single embryo transfer on day 5, leading both to a reduction in multiple-gestation pregnancies and to increased numbers of vitrified embryos per patient with supernumerary embryos available. Improvements in IVF laboratory conditions and air quality had profound positive effects on laboratory measures and patient outcomes. This study further strengthens the importance of the laboratory environment and air quality in the success of an IVF programme. PMID:26194882
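
    The implantation-rate comparison above (32.4% versus 24.3%, P < 0.01) is the kind of result a two-proportion z-test produces. The abstract does not give the sample sizes, so the counts below are hypothetical; the test function itself is a standard stdlib sketch.

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: returns (z, two-sided p-value)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts with roughly the rates reported in the abstract
z, p = two_proportion_z(324, 1000, 243, 1000)
```

    With these assumed sample sizes the difference is comfortably significant at the 1% level; the real significance depends on the study's actual cycle counts.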

  13. Image quality of a cone beam O-arm 3D imaging system

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Weir, Victor; Lin, Jingying; Hsiung, Hsiang; Ritenour, E. Russell

    2009-02-01

    The O-arm is a cone beam imaging system designed primarily to support orthopedic surgery; it is also used for image-guided and vascular surgery. Using a gantry that can be opened or closed, the O-arm can function as a 2-dimensional (2D) fluoroscopy device or collect 3-dimensional (3D) volumetric imaging data like a CT system. Clinical applications of the O-arm in spine surgical procedures, assessment of pedicle screw position, and kyphoplasty procedures show that the O-arm 3D mode provides enhanced imaging information compared to radiographs or fluoroscopy alone. In this study, the image quality of an O-arm system was quantitatively evaluated. A 20 cm diameter CATPHAN 424 phantom was scanned using the pre-programmed head protocols: small/medium (120 kVp, 100 mAs), large (120 kVp, 128 mAs), and extra-large (120 kVp, 160 mAs) in 3D mode. High-resolution reconstruction mode (512×512×0.83 mm) was used to reconstruct images for the analysis of low- and high-contrast resolution and of the noise power spectrum. The MTF was measured using the point spread function. The results show that the O-arm image is uniform, but with a noise pattern that cannot be removed simply by increasing the mAs. The high-contrast resolution of the O-arm system was approximately 9 lp/cm, and the system has 10% MTF at 0.45 mm. The low-contrast resolution could not be determined because of the noise pattern. For surgery in which the location of a structure is emphasized over a survey of all image details, the image quality of the O-arm is well accepted clinically.
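
    The two resolution figures quoted above can be related: a limiting spatial frequency f in line pairs per cm corresponds to a smallest resolvable detail of d = 1/(2f), one bar of a line pair. A small sketch of the conversion (the O-arm numbers are used only as example inputs; the function names are hypothetical):

```python
def lp_per_cm_to_detail_mm(freq_lp_per_cm):
    """Smallest resolvable detail (bar width, mm) at a given spatial
    frequency: d = 1 / (2 f), with f first converted to lp/mm."""
    freq_lp_per_mm = freq_lp_per_cm / 10.0
    return 1.0 / (2.0 * freq_lp_per_mm)

def detail_mm_to_lp_per_cm(detail_mm):
    """Inverse conversion: f = 1 / (2 d), reported in lp/cm."""
    return 10.0 / (2.0 * detail_mm)

# 9 lp/cm high-contrast resolution -> roughly 0.56 mm detail
d = lp_per_cm_to_detail_mm(9.0)
# 10% MTF at a 0.45 mm detail -> roughly 11 lp/cm
f = detail_mm_to_lp_per_cm(0.45)
```

    The two figures are thus consistent with each other to within the precision of the bar-pattern and point-spread measurements.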

  14. Automated techniques for quality assurance of radiological image modalities

    NASA Astrophysics Data System (ADS)

    Goodenough, David J.; Atkins, Frank B.; Dyer, Stephen M.

    1991-05-01

    This paper will attempt to identify many of the important issues for quality assurance (QA) of radiological modalities. It is of course to be realized that QA can span many aspects of the diagnostic decision making process. These issues range from physical image performance levels to and through the diagnostic decision of the radiologist. We will use as a model for automated approaches a program we have developed to work with computed tomography (CT) images. In an attempt to unburden the user, and in an effort to facilitate the performance of QA, we have been studying automated approaches. The ultimate utility of the system is its ability to render in a safe and efficacious manner, decisions that are accurate, sensitive, specific and which are possible within the economic constraints of modern health care delivery.

  15. Sentinel-2 geometric image quality commissioning: first results

    NASA Astrophysics Data System (ADS)

    Languille, F.; Déchoz, C.; Gaudel, A.; Greslou, D.; de Lussy, F.; Trémas, T.; Poulain, V.

    2015-10-01

    In the frame of the Copernicus program of the European Commission, Sentinel-2 will offer multispectral high-spatial-resolution optical images over global terrestrial surfaces. In cooperation with ESA, the Centre National d'Etudes Spatiales (CNES) is in charge of the image quality of the project and will ensure the CAL/VAL commissioning phase during the months following the launch. Sentinel-2 is a constellation of 2 satellites on a polar sun-synchronous orbit with a revisit time of 5 days (with both satellites), a wide field of view (290 km), 13 spectral bands in the visible and shortwave infrared, and high spatial resolution (10 m, 20 m and 60 m). The Sentinel-2 mission offers global coverage of terrestrial surfaces. The satellites systematically acquire terrestrial surfaces under the same viewing conditions in order to build temporal image stacks. The first satellite was launched in June 2015. Following the launch, the CAL/VAL commissioning phase will last 6 months for geometrical calibration. This paper first explains the Sentinel-2 products delivered with geometric corrections. It then details the calibration sites and the methods used for calibrating the geometrical parameters, and presents the first associated results. The following topics are presented: viewing-frame orientation assessment, focal-plane mapping for all spectral bands, first results on geolocation assessment, and multispectral registration. Images are systematically recalibrated against a common reference, a set of S2 images produced during the 6 months of CAL/VAL. As it takes time to gather all the needed images, the geolocation performance with ground control points and the multitemporal performance are only first results and will be improved during the last phase of the CAL/VAL. This paper therefore mainly shows the system performances, the preliminary product performances, and the way they are obtained.

  16. SAR image quality effects of damped phase and amplitude errors

    NASA Astrophysics Data System (ADS)

    Zelenka, Jerry S.; Falk, Thomas

    The effects of damped multiplicative, amplitude, or phase errors on the image quality of synthetic-aperture radar systems are considered. These types of errors can result from aircraft maneuvers or the mechanical steering of an antenna. The proper treatment of damped multiplicative errors can lead to related design specifications and possibly an enhanced collection capability. Only small, high-frequency errors are considered. Expressions for the average intensity and energy associated with a damped multiplicative error are presented and used to derive graphic results. A typical example is used to show how to apply the results of this effort.

  17. An automated system for numerically rating document image quality

    SciTech Connect

    Cannon, M.; Kelly, P.; Iyengar, S.S.; Brener, N.

    1997-04-01

    As part of the Department of Energy document declassification program, the authors have developed a numerical rating system to predict the OCR error rate that they expect to encounter when processing a particular document. The rating algorithm produces a vector containing scores for different document image attributes such as speckle and touching characters. The OCR error rate for a document is computed from a weighted sum of the elements of the corresponding quality vector. The predicted OCR error rate will be used to screen documents that would not be handled properly with existing document processing products.
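    The rating rule described above (predicted OCR error rate as a weighted sum of the elements of a document-quality vector) can be sketched as follows. The feature names and the clamping to a valid rate range are illustrative assumptions; the source only specifies the weighted-sum form.

```python
# Hypothetical attribute names: the abstract mentions speckle and touching
# characters; the others are illustrative placeholders.
QUALITY_FEATURES = ("speckle", "touching_chars", "broken_chars", "skew")

def predict_ocr_error_rate(quality_vector, weights, bias=0.0):
    """Predicted OCR error rate = bias + sum_i (quality_i * weight_i)."""
    if len(quality_vector) != len(weights):
        raise ValueError("feature/weight length mismatch")
    rate = bias + sum(q * w for q, w in zip(quality_vector, weights))
    return min(max(rate, 0.0), 1.0)  # clamp to a plausible error-rate range
```

Documents whose predicted rate exceeds a threshold would then be screened out before OCR, as the abstract describes.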

  18. New strategy for image and video quality assessment

    NASA Astrophysics Data System (ADS)

    Ma, Qi; Zhang, Liming; Wang, Bin

    2010-01-01

    Image and video quality assessment (QA) is a critical issue in image and video processing applications. General full-reference (FR) QA criteria such as peak signal-to-noise ratio (PSNR) and mean squared error (MSE) do not accord well with human subjective assessment. Some QA indices that consider human visual sensitivity, such as mean structural similarity (MSSIM) with structural sensitivity and visual information fidelity (VIF) with statistical sensitivity, were proposed in view of the differences between reference and distorted frames at a pixel or local level. However, they ignore the role of human visual attention (HVA). Recently, some new strategies incorporating HVA have been proposed, but their methods for extracting visual attention are too complex for real-time realization. We take advantage of the phase spectrum of the quaternion Fourier transform (PQFT), a very fast algorithm we previously proposed, to extract saliency maps of color images or videos. We then propose saliency-based methods for both image QA (IQA) and video QA (VQA) by adding saliency-derived weights to the original IQA or VQA criteria. Experimental results show that our saliency-based strategy approaches human subjective assessment more closely than the original IQA or VQA methods, and takes no more time thanks to the fast PQFT algorithm.
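    The weighting step described above (adding saliency-related weights to an existing local quality map) can be sketched as a saliency-weighted pooling. The PQFT saliency computation itself is omitted; both map names are illustrative, and this is a generic sketch of the pooling idea rather than the paper's code.

```python
import numpy as np

def saliency_weighted_score(local_quality, saliency, eps=1e-12):
    """Pool a local quality map into one score, weighting by visual saliency.

    local_quality: 2-D map of per-location quality (e.g. a local SSIM map).
    saliency: non-negative 2-D map of the same shape (e.g. from PQFT).
    Salient regions contribute more to the final index than quiet ones.
    """
    w = np.asarray(saliency, dtype=float)
    q = np.asarray(local_quality, dtype=float)
    return float((w * q).sum() / (w.sum() + eps))
```

With a uniform saliency map this reduces to the plain mean, i.e. the original unweighted criterion.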

  19. Influence of slice overlap on positron emission tomography image quality

    NASA Astrophysics Data System (ADS)

    McKeown, Clare; Gillen, Gerry; Dempsey, Mary Frances; Findlay, Caroline

    2016-02-01

    PET scans use overlapping acquisition beds to correct for reduced sensitivity at bed edges. The optimum overlap size for the General Electric (GE) Discovery 690 has not been established. This study assesses how image quality is affected by slice overlap. Efficacy of 23% overlaps (recommended by GE) and 49% overlaps (the maximum possible overlap) was specifically assessed. European Association of Nuclear Medicine (EANM) guidelines for calculating minimum injected activities based on overlap size were also reviewed. A uniform flood phantom was used to assess noise (coefficient of variation, COV) and voxel accuracy (activity concentrations, Bq ml-1). A NEMA (National Electrical Manufacturers Association) body phantom with hot/cold spheres in a background activity was used to assess contrast recovery coefficients (CRCs) and signal-to-noise ratios (SNRs). Different overlap sizes and sphere-to-background ratios were assessed. COVs for 49% and 23% overlaps were 9% and 13%, respectively. This increased noise was difficult to visualise on the 23% overlap images. Mean voxel activity concentrations were not affected by overlap size. No clinically significant differences in CRCs were observed. However, visibility and SNR of small, low-contrast spheres (⩽13 mm diameter, 2:1 sphere-to-background ratio) may be affected by overlap size in low count studies if they are located in the overlap area. There was minimal detectable influence on image quality in terms of noise, mean activity concentrations or mean CRCs when comparing 23% overlap with 49% overlap. Detectability of small, low-contrast lesions may be affected in low count studies; however, this is a worst-case scenario. The marginal benefits of increasing overlap from 23% to 49% are likely to be offset by increased patient scan times. A 23% overlap is therefore appropriate for clinical use. An amendment to EANM guidelines for calculating injected activities is also proposed which better reflects the effect overlap size has
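    The noise metric used in the flood-phantom measurements, the coefficient of variation, is simply the ROI standard deviation over its mean. A minimal sketch (the sample-variance convention is an assumption; the study does not state it):

```python
import numpy as np

def roi_cov_percent(roi):
    """Coefficient of variation (%) of a uniform-phantom ROI: 100 * sd / mean."""
    roi = np.asarray(roi, dtype=float)
    return 100.0 * roi.std(ddof=1) / roi.mean()
```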

  20. Improve the image quality of orbital 3 T diffusion-weighted magnetic resonance imaging with readout-segmented echo-planar imaging.

    PubMed

    Xu, Xiao-Quan; Liu, Jun; Hu, Hao; Su, Guo-Yi; Zhang, Yu-Dong; Shi, Hai-Bin; Wu, Fei-Yun

    2016-01-01

    The aim of our study was to compare the image quality of readout-segmented echo-planar imaging (rs-EPI) with that of standard single-shot EPI (ss-EPI) in orbital 3 T diffusion-weighted (DW) magnetic resonance (MR) imaging in healthy subjects. Forty-two volunteers underwent two sets of orbital DW imaging scans at a 3 T MR unit, and image quality was assessed qualitatively and quantitatively. As a result, we found that rs-EPI provided better image quality than standard ss-EPI, while no significant difference was found in the apparent diffusion coefficient between the two sets of DW images. PMID:27317226

  1. Retinal Image Quality Assessment for Spaceflight-Induced Vision Impairment Study

    NASA Technical Reports Server (NTRS)

    Vu, Amanda Cadao; Raghunandan, Sneha; Vyas, Ruchi; Radhakrishnan, Krishnan; Taibbi, Giovanni; Vizzeri, Gianmarco; Grant, Maria; Chalam, Kakarla; Parsons-Wingerter, Patricia

    2015-01-01

    Long-term exposure to space microgravity poses significant risks for visual impairment. Evidence suggests such vision changes are linked to cephalad fluid shifts, prompting a need to directly quantify microgravity-induced retinal vascular changes. The quality of retinal images used for such vascular remodeling analysis, however, is dependent on imaging methodology. For our exploratory study, we hypothesized that retinal images captured using fluorescein imaging methodologies would be of higher quality in comparison to images captured without fluorescein. A semi-automated image quality assessment was developed using Vessel Generation Analysis (VESGEN) software and MATLAB® image analysis toolboxes. An analysis of ten images found that the fluorescein imaging modality provided a 36% increase in overall image quality (two-tailed p=0.089) in comparison to nonfluorescein imaging techniques.

  2. A color image quality assessment using a reduced-reference image machine learning expert

    NASA Astrophysics Data System (ADS)

    Charrier, Christophe; Lebrun, Gilles; Lezoray, Olivier

    2008-01-01

    A quality metric based on a classification process is introduced. The main idea of the proposed method is to avoid the error-pooling step over many factors (in the frequency and spatial domains) commonly applied to obtain a final quality score. The classification process is based on the final quality class with respect to the standard quality scale provided by the UIT. Thus, for each degraded color image, a feature vector is computed that includes several Human Visual System characteristics, such as the contrast-masking effect, color correlation, and so on. The selected features are of two kinds: 1) full-reference features and 2) no-reference characteristics. In that way, a machine-learning expert providing a final class number is designed.

  3. A novel technique of image quality objective measurement by wavelet analysis throughout the spatial frequency range

    NASA Astrophysics Data System (ADS)

    Luo, Gaoyong

    2005-01-01

    An essential determinant of the value of surrogate digital images is their quality. Image quality measurement has become crucial for most image processing applications. Over the past years, there have been many attempts to develop models or metrics for image quality that incorporate elements of human visual sensitivity. However, there is no current standard and objective definition of spectral image quality. This paper proposes a reliable automatic method for objective image quality measurement by wavelet analysis throughout the spatial frequency range. This is done by a detailed analysis of an image over a wide range of spatial frequency content, using a combination of modulation transfer function (MTF), brightness, contrast, saturation, sharpness and noise, as a more revealing metric for quality evaluation. A fast lifting wavelet algorithm is developed for computationally efficient spatial frequency analysis, in which fine image detail corresponding to high spatial frequencies and image sharpness in the lower and mid-range spatial frequencies can be examined and compared accordingly. The wavelet frequency deconstruction in effect extracts edge features in the sub-band images. The technique provides a means to relate the quality of an image to its interpretation and quantification throughout the frequency range, and the estimated noise level assists the quality analysis. Experimental results of using this method for image quality measurement exhibit good correlation with subjective visual quality assessments.

  4. A novel technique of image quality objective measurement by wavelet analysis throughout the spatial frequency range

    NASA Astrophysics Data System (ADS)

    Luo, Gaoyong

    2004-10-01

    An essential determinant of the value of surrogate digital images is their quality. Image quality measurement has become crucial for most image processing applications. Over the past years, there have been many attempts to develop models or metrics for image quality that incorporate elements of human visual sensitivity. However, there is no current standard and objective definition of spectral image quality. This paper proposes a reliable automatic method for objective image quality measurement by wavelet analysis throughout the spatial frequency range. This is done by a detailed analysis of an image over a wide range of spatial frequency content, using a combination of modulation transfer function (MTF), brightness, contrast, saturation, sharpness and noise, as a more revealing metric for quality evaluation. A fast lifting wavelet algorithm is developed for computationally efficient spatial frequency analysis, in which fine image detail corresponding to high spatial frequencies and image sharpness in the lower and mid-range spatial frequencies can be examined and compared accordingly. The wavelet frequency deconstruction in effect extracts edge features in the sub-band images. The technique provides a means to relate the quality of an image to its interpretation and quantification throughout the frequency range, and the estimated noise level assists the quality analysis. Experimental results of using this method for image quality measurement exhibit good correlation with subjective visual quality assessments.
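    The lifting scheme named in the two records above splits a signal into low- and high-frequency halves with a few in-place additions. A minimal one-level Haar lifting step (the simplest lifting wavelet; the paper's actual filter is not specified, so Haar is an illustrative assumption) looks like this:

```python
def haar_lifting_step(signal):
    """One level of a lifting-scheme Haar transform (split / predict / update).

    Returns (approx, detail): low-pass averages and high-pass differences,
    i.e. the cheap spatial-frequency split a fast lifting algorithm relies on.
    Assumes an even-length sequence.
    """
    even, odd = signal[0::2], signal[1::2]
    detail = [o - e for e, o in zip(even, odd)]           # predict: high frequencies
    approx = [e + d / 2.0 for e, d in zip(even, detail)]  # update: low frequencies
    return approx, detail
```

Applying the step recursively to `approx` yields the multi-level sub-band decomposition in which edge features and noise are examined per frequency band.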

  5. Measuring saliency in images: which experimental parameters for the assessment of image quality?

    NASA Astrophysics Data System (ADS)

    Fredembach, Clement; Woolfe, Geoff; Wang, Jue

    2012-01-01

    Predicting which areas of an image are perceptually salient or attended to has become an essential pre-requisite of many computer vision applications. Because observers are notoriously unreliable in remembering where they look a posteriori, and because asking where they look while observing the image necessarily influences the results, ground truth about saliency and visual attention has to be obtained by gaze tracking methods. From the early work of Buswell and Yarbus to the most recent forays in computer vision there has been, perhaps unfortunately, little agreement on standardisation of eye tracking protocols for measuring visual attention. As the number of parameters involved in experimental methodology can be large, their individual influence on the final results is not well understood. Consequently, the performance of saliency algorithms, when assessed by correlation techniques, varies greatly across the literature. In this paper, we concern ourselves with the problem of image quality. Specifically: where people look when judging images. We show that in this case, the performance gap between existing saliency prediction algorithms and experimental results is significantly larger than otherwise reported. To understand this discrepancy, we first devise an experimental protocol that is adapted to the task of measuring image quality. In a second step, we compare our experimental parameters with the ones of existing methods and show that a lot of the variability can directly be ascribed to these differences in experimental methodology and choice of variables. In particular, the choice of a task, e.g., judging image quality vs. free viewing, has a great impact on measured saliency maps, suggesting that even for a mildly cognitive task, ground truth obtained by free viewing does not adapt well. Careful analysis of the prior art also reveals that systematic bias can occur depending on instrumental calibration and the choice of test images. We conclude this work by proposing a

  6. Image quality degradation and retrieval errors introduced by registration and interpolation of multispectral digital images

    SciTech Connect

    Henderson, B.G.; Borel, C.C.; Theiler, J.P.; Smith, B.W.

    1996-04-01

    Full utilization of multispectral data acquired by whiskbroom and pushbroom imagers requires that the individual channels be registered accurately. Poor registration introduces errors which can be significant, especially in high contrast areas such as boundaries between regions. We simulate the acquisition of multispectral imagery in order to estimate the errors that are introduced by co-registration of different channels and interpolation within the images. We compute the Modulation Transfer Function (MTF) and image quality degradation brought about by fractional pixel shifting and calculate errors in retrieved quantities (surface temperature and water vapor) that occur as a result of interpolation. We also present a method which might be used to estimate sensor platform motion for accurate registration of images acquired by a pushbroom scanner.
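    The MTF degradation from fractional-pixel shifting mentioned above can be illustrated for the common case of linear interpolation: shifting by a fraction of a pixel multiplies the system MTF by the magnitude response of the interpolation kernel. This is a generic sketch, not the authors' simulation code.

```python
import numpy as np

def linear_interp_mtf(frac_shift, freqs_cyc_per_px):
    """|Frequency response| of linear interpolation for a fractional pixel shift.

    Shifting by d in [0, 1) with y[n] = (1 - d) * x[n] + d * x[n + 1] has
    transfer function H(f) = (1 - d) + d * exp(-2j*pi*f); its magnitude is
    the extra MTF factor co-registration multiplies into the imaging chain.
    """
    d = float(frac_shift)
    f = np.asarray(freqs_cyc_per_px, dtype=float)
    h = (1.0 - d) + d * np.exp(-2j * np.pi * f)
    return np.abs(h)
```

At a half-pixel shift the factor falls to zero at the Nyquist frequency (0.5 cycles/pixel), which is why interpolation losses are worst in high-contrast, high-frequency regions such as region boundaries.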

  7. THE USE OF EXISTING AND MODIFIED LAND USE INSTRUMENTS TO ACHIEVE ENVIRONMENTAL QUALITY

    EPA Science Inventory

    The report reviews the application and potential of the police power at the local level of government as it is used to achieve environmental planning objectives. The first section presents an overview of the interactions of various municipal regulations and ordinances as they aff...

  8. School Improvement Plans and Student Achievement: Preliminary Evidence from the Quality and Merit Project in Italy

    ERIC Educational Resources Information Center

    Caputo, Andrea; Rastelli, Valentina

    2014-01-01

    This study provides preliminary evidence from an Italian in-service training program addressed to lower secondary school teachers which supports school improvement plans (SIPs). It aims at exploring the association between characteristics/contents of SIPs and student improvement in math achievement. Pre-post standardized tests and text analysis of…

  9. Leveraging Quality Improvement to Achieve Student Learning Assessment Success in Higher Education

    ERIC Educational Resources Information Center

    Glenn, Nancy Gentry

    2009-01-01

    Mounting pressure for transformational change in higher education driven by technology, globalization, competition, funding shortages, and increased emphasis on accountability necessitates that universities implement reforms to demonstrate responsiveness to all stakeholders and to provide evidence of student achievement. In the face of the demand…

  10. The Quality of Content Analyses of State Student Achievement Tests and Content Standards

    ERIC Educational Resources Information Center

    Porter, Andrew C.; Polikoff, Morgan S.; Zeidner, Tim; Smithson, John

    2008-01-01

    This article examines the reliability of content analyses of state student achievement tests and state content standards. We use data from two states in three grades in mathematics and English language arts and reading to explore differences by state, content area, grade level, and document type. Using a generalizability framework, we find that…

  11. Student Achievement Conditioned upon School Selection: Religious and Secular Secondary School Quality in Bangladesh

    ERIC Educational Resources Information Center

    Niaz Asadullah, Mohammad; Chaudhury, Nazmul; Dar, Amit

    2007-01-01

    In this paper we present new evidence on the impact of school characteristics on secondary student achievement using a rich dataset from rural Bangladesh. We deal with a potentially important selectivity issue in the South Asian context: the non-random sorting of children into madrasas (Islamic faith schools). We do so by employing a combination…

  12. Goal Setting in Principal Evaluation: Goal Quality and Predictors of Achievement

    ERIC Educational Resources Information Center

    Sinnema, Claire E. L.; Robinson, Viviane M. J.

    2012-01-01

    This article draws on goal-setting theory to investigate the goals set by experienced principals during their performance evaluations. While most goals were about teaching and learning, they tended to be vaguely expressed and only partially achieved. Five predictors (commitment, challenge, learning, effort, and support) explained a significant…

  13. Achieving Quality Education in Ghana: The Spotlight on Primary Education within the Kumasi Metropolis

    ERIC Educational Resources Information Center

    Boakye-Amponsah, Abraham; Enninful, Ebenezer Kofi; Anin, Emmanuel Kwabena; Vanderpuye, Patience

    2015-01-01

    Background: Ghana being a member of the United Nations, committed to the Universal Primary Education initiative in 2000 and has since implemented series of educational reforms to meet the target for the Millennium Development Goal (MDG) 2. Despite the numerous government interventions to achieve the MDG 2, many children in Ghana have been denied…

  14. Image quality and stability of image-guided radiotherapy (IGRT) devices: A comparative study

    PubMed Central

    Stock, Markus; Pasler, Marlies; Birkfellner, Wolfgang; Homolka, Peter; Poetter, Richard; Georg, Dietmar

    2010-01-01

    Introduction Our aim was to implement standards for quality assurance of IGRT devices used in our department and to compare their performances with that of a CT simulator. Materials and methods We investigated image quality parameters for three devices over a period of 16 months. A multislice CT was used as a benchmark and results related to noise, spatial resolution, low contrast visibility (LCV) and uniformity were compared with a cone beam CT (CBCT) at a linac and simulator. Results All devices performed well in terms of LCV and, in fact, exceeded vendor specifications. MTF was comparable between CT and linac CBCT. Integral nonuniformity was, on average, 0.002 for the CT and 0.006 for the linac CBCT. Uniformity, LCV and MTF varied depending on the protocols used for the linac CBCT. Contrast-to-noise ratio was an average of 51% higher for the CT than for the linac and simulator CBCT. No significant time trend was observed and tolerance limits were implemented. Discussion Reasonable differences in image quality between CT and CBCT were observed. Further research and development are necessary to increase image quality of commercially available CBCT devices in order for them to serve the needs for adaptive and/or online planning. PMID:19695725

  15. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography

    SciTech Connect

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A.

    2014-03-15

    Purpose: Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. Methods: The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. Results: The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. 
Quantitative evaluation of system performance over time by comparison to previous examinations was also
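    The edge-profile MTF computation described in this record (differentiate the edge-spread function of a high-contrast edge, then Fourier-transform the resulting line-spread function) can be sketched as follows. The Hann window and the normalisation at zero frequency are standard practice but assumptions here, not details from the paper.

```python
import numpy as np

def mtf_from_edge_profile(esf, spacing_mm):
    """MTF from an oversampled edge-spread function (ESF).

    Differentiates the ESF into a line-spread function (LSF), windows it to
    suppress tail noise, and normalises the FFT magnitude at zero frequency.
    Returns (frequencies in cycles/mm, normalised MTF).
    """
    esf = np.asarray(esf, dtype=float)
    lsf = np.gradient(esf, spacing_mm)   # ESF derivative = LSF
    lsf *= np.hanning(lsf.size)          # taper the noisy tails
    spectrum = np.abs(np.fft.rfft(lsf))
    mtf = spectrum / spectrum[0]
    freqs = np.fft.rfftfreq(lsf.size, d=spacing_mm)
    return freqs, mtf
```

The same routine applies whether the edge comes from an aluminum sphere (as here) or a slanted-edge insert; only the profile extraction differs.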

  16. Understanding and Achieving Quality in Sure Start Children's Centres: Practitioners' Perspectives

    ERIC Educational Resources Information Center

    Cottle, Michelle

    2011-01-01

    This article focuses on some of the issues that shape understandings of professional practice in the rapidly expanding context of children's centres in England. Drawing on data from an ESRC-funded project exploring practitioners' understandings of quality and success, the perspectives of 115 practitioners working in 11 Sure Start Children's…

  17. Forward-Oriented Designing for Learning as a Means to Achieve Educational Quality

    ERIC Educational Resources Information Center

    Ghislandi, Patrizia M. M.; Raffaghelli, Juliana E.

    2015-01-01

    In this paper, we reflect on how Design for Learning can create the basis for a culture of educational quality. We explore the process of Design for Learning within a blended, undergraduate university course through a teacher-led inquiry approach, aiming at showing the connections between the process of Design for Learning and academic…

  18. Quality, peer review, and the achievement of consensus in probabilistic risk analysis

    SciTech Connect

    Apostolakis, G.; Garrick, B.J.; Okrent, D.

    1983-01-01

    This article addresses some of the issues that arise in connection with the problems associated with probabilistic risk assessment (PRA). Some opinions are given on quality assurance, PRA scope, and peer review. Then the issue of consensus and some of the reasons that lead to disagreement are discussed.

  19. Teacher-Student Relationship Quality Type in Elementary Grades: Effects on Trajectories for Achievement and Engagement

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Hughes, Jan N.; Kwok, Oi-Man

    2010-01-01

    Teacher, peer, and student reports of the quality of the teacher-student relationship were obtained for an ethnically diverse and academically at-risk sample of 706 second- and third-grade students. Cluster analysis identified four types of relationships based on the consistency of child reports of support and conflict in the relationship with…

  20. Achieving Quality Assurance and Moving to a World Class University in the 21st Century

    ERIC Educational Resources Information Center

    Lee, Lung-Sheng Steven

    2013-01-01

    Globalization in the 21st century has brought innumerable challenges and opportunities to universities and countries. Universities are primarily concerned with how to ensure the quality of their education and how to boost their local and global competitiveness. The pressure from both international competition and public accountability on…

  1. Preschool Center Care Quality Effects on Academic Achievement: An Instrumental Variables Analysis

    ERIC Educational Resources Information Center

    Auger, Anamarie; Farkas, George; Burchinal, Margaret R.; Duncan, Greg J.; Vandell, Deborah Lowe

    2014-01-01

    Much of child care research has focused on the effects of the quality of care in early childhood settings on children's school readiness skills. Although researchers increased the statistical rigor of their approaches over the past 15 years, researchers' ability to draw causal inferences has been limited because the studies are based on…

  2. A Guide to the Librarian's Responsibility in Achieving Quality in Lighting and Ventilation.

    ERIC Educational Resources Information Center

    Mason, Ellsworth

    1967-01-01

    Quality, not intensity, is the keystone to good library lighting. The single most important problem in lighting is glare caused by extremely intense centers of light. Multiple interfiling of light rays is a factor required in library lighting. A fixture that diffuses light well is basic when light emerges from the fixture. It scatters widely,…

  3. Family Background, School Quality and Rural-Urban Disparities in Student Learning Achievement in Latvia

    ERIC Educational Resources Information Center

    Geske, Andrejs; Grinfelds, Andris; Dedze, Indra; Zhang, Yanhong

    2006-01-01

    Over the course of the fifteen years since 1991, Latvia has been undergoing rapid political changes from a party controlled state to a market economy. These changes have affected the system of education. The issue of quality and equity of educational outcomes is gaining increasing importance as schools are expected to adjust to the new economic…

  4. Relations between local and global perceptual image quality and visual masking

    NASA Astrophysics Data System (ADS)

    Alam, Md Mushfiqul; Patil, Pranita; Hagan, Martin T.; Chandler, Damon M.

    2015-03-01

    Perceptual quality assessment of digital images and videos is important for various image-processing applications. For assessing image quality, researchers have often used the idea of visual masking (or distortion visibility) to design image-quality predictors, specifically for near-threshold distortions. However, it is still unknown how, when assessing the quality of natural images, the local distortion visibilities relate to the local quality scores. Furthermore, the mechanism for summing local quality scores into a predicted global quality score is also crucial for better prediction of perceptual image quality. In this paper, the local and global qualities of six images at six distortion levels were measured using subjective experiments. A Gabor-noise target was used as the distortion in the quality-assessment experiments, for consistency with our previous study [Alam, Vilankar, Field, and Chandler, Journal of Vision, 2014], in which the local root-mean-square contrast detection thresholds for the Gabor-noise target were measured at each spatial location of the undistorted images. Comparison of the results of this quality-assessment experiment with the previous detection experiment shows that masking predicted the local quality scores more than 95% correctly above a 15 dB threshold, within 5% of subject scores. Furthermore, it was found that an approximate squared summation of local quality scores predicted the global quality scores well (Spearman rank-order correlation 0.97).
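    The squared-summation pooling reported above is the p = 2 case of Minkowski (p-norm) pooling, a standard way to collapse a local quality map into one global score. A minimal sketch (generic pooling, not the authors' exact formula):

```python
import numpy as np

def minkowski_pool(local_scores, p=2.0):
    """Minkowski (p-norm) pooling of local quality scores.

    p = 1 is the plain mean; p = 2 is the squared-summation rule that the
    study found to fit global quality judgements; large p approaches
    worst-case pooling.
    """
    s = np.asarray(local_scores, dtype=float)
    return float(np.mean(np.abs(s) ** p) ** (1.0 / p))
```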

  5. Correlation of the clinical and physical image quality in chest radiography for average adults with a computed radiography imaging system

    PubMed Central

    Wood, T J; Beavis, A W; Saunderson, J R

    2013-01-01

    Objective: The purpose of this study was to examine the correlation between the quality of visually graded patient (clinical) chest images and a quantitative assessment of chest phantom (physical) images acquired with a computed radiography (CR) imaging system. Methods: The results of a previously published study, in which four experienced image evaluators graded computer-simulated postero-anterior chest images using a visual grading analysis scoring (VGAS) scheme, were used for the clinical image quality measurement. Contrast-to-noise ratio (CNR) and effective dose efficiency (eDE) were used as physical image quality metrics measured in a uniform chest phantom. Although optimal values of these physical metrics for chest radiography were not derived in this work, their correlation with VGAS in images acquired without an antiscatter grid across the diagnostic range of X-ray tube voltages was determined using Pearson’s correlation coefficient. Results: Clinical and physical image quality metrics increased with decreasing tube voltage. Statistically significant correlations between VGAS and CNR (R=0.87, p<0.033) and eDE (R=0.77, p<0.008) were observed. Conclusion: Medical physics experts may use the physical image quality metrics described here in quality assurance programmes and optimisation studies with a degree of confidence that they reflect the clinical image quality in chest CR images acquired without an antiscatter grid. Advances in knowledge: A statistically significant correlation has been found between the clinical and physical image quality in CR chest imaging. The results support the value of using CNR and eDE in the evaluation of quality in clinical thorax radiography. PMID:23568362
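    The contrast-to-noise ratio used as a physical image-quality metric above has a standard phantom definition: the ROI mean difference divided by the background noise. A minimal sketch (the sample-standard-deviation convention is an assumption):

```python
import numpy as np

def contrast_to_noise_ratio(signal_roi, background_roi):
    """CNR = |mean(signal ROI) - mean(background ROI)| / sd(background ROI)."""
    s = np.asarray(signal_roi, dtype=float)
    b = np.asarray(background_roi, dtype=float)
    return float(abs(s.mean() - b.mean()) / b.std(ddof=1))
```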

  6. Cone beam computed tomography radiation dose and image quality assessments.

    PubMed

    Lofthag-Hansen, Sara

    2010-01-01

    Diagnostic radiology has undergone profound changes in the last 30 years. New technologies are available to the dental field, cone beam computed tomography (CBCT) being one of the most important. CBCT is a catch-all term for a technology comprising a variety of machines differing in many respects: patient positioning, volume size (FOV), radiation quality, image capturing and reconstruction, image resolution and radiation dose. When new technology is introduced, one must make sure that its diagnostic accuracy is better than, or at least as good as, that of the technique it can be expected to replace. The CBCT brand tested comprised two versions of the Accuitomo (Morita, Japan): the 3D Accuitomo with an image intensifier as detector, FOV 3 cm x 4 cm, and the 3D Accuitomo FPD with a flat panel detector, FOVs 4 cm x 4 cm and 6 cm x 6 cm. The 3D Accuitomo was compared with intra-oral radiography for endodontic diagnosis in 35 patients with 46 teeth analyzed, of which 41 were endodontically treated. Three observers assessed the images by consensus. The results showed that CBCT imaging was superior, with a higher number of teeth diagnosed with periapical lesions (42 vs 32 teeth). When evaluating 3D Accuitomo examinations in the posterior mandible in 30 patients, visibility of the marginal bone crest and the mandibular canal, important anatomic structures for implant planning, was high, with good observer agreement among seven observers. Radiographic techniques have to be evaluated with respect to radiation dose, which requires well-defined and easy-to-use methods. Two methods, the CT dose index (CTDI), the prevailing method for CT units, and the dose-area product (DAP), were evaluated for calculating effective dose (E) for both units. An asymmetric dose distribution was revealed when a clinical situation was simulated. Hence, the CTDI method was not applicable for these units with small FOVs. Based on DAP values from 90 patient examinations, effective dose was estimated for three diagnostic tasks: implant planning in posterior mandible and

  7. Prediction of water quality parameters from SAR images by using multivariate and texture analysis models

    NASA Astrophysics Data System (ADS)

    Shareef, Muntadher A.; Toumi, Abdelmalek; Khenchaf, Ali

    2014-10-01

    Remote sensing is one of the most important tools for monitoring and helping to estimate and predict water quality parameters (WQPs). The traditional methods used for monitoring pollutants generally rely on optical images. In this paper, we present a new approach based on Synthetic Aperture Radar (SAR) images, which we use to map the region of interest and to estimate the WQPs. To achieve this estimation quality, texture analysis is exploited to improve the regression models. These models are established and developed to estimate six commonly considered water quality parameters from texture parameters extracted from TerraSAR-X data. For this purpose, the Gray Level Co-occurrence Matrix (GLCM) is used to estimate several regression models using six texture parameters: contrast, correlation, energy, homogeneity, entropy and variance. For each predicted model, an accuracy value is computed from the probability value given by the regression analysis model of each parameter. In order to validate our approach, we used two datasets of water regions for the training and test processes. To evaluate and validate the proposed model, we applied it to the training set. In the last stage, we used fuzzy K-means clustering to generalize the water quality estimation to the whole water region extracted from the segmented TerraSAR-X image. The obtained results showed that there is a good statistical correlation between the in situ water quality and the TerraSAR-X data, and demonstrated that the characteristics obtained by texture analysis are able to monitor and predict the distribution of WQPs in large rivers with high accuracy.
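    The GLCM texture features feeding such regression models can be sketched in a few lines; the horizontal (0, 1) offset, eight gray levels, and the three statistics below are illustrative choices, not the paper's exact configuration.

```python
import numpy as np

def glcm_features(img, levels=8):
    """Gray-level co-occurrence matrix for the horizontal (0, 1) offset,
    plus three Haralick-style statistics (contrast, energy, homogeneity)
    of the kind used as regression inputs."""
    # Quantize to a small number of gray levels.
    q = np.rint(img.astype(float) / max(img.max(), 1) * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1            # count horizontal neighbor pairs
    glcm /= glcm.sum()             # normalize to joint probabilities
    ii, jj = np.indices((levels, levels))
    return {
        "contrast": np.sum(glcm * (ii - jj) ** 2),
        "energy": np.sum(glcm ** 2),
        "homogeneity": np.sum(glcm / (1.0 + np.abs(ii - jj))),
    }

# Alternating dark/bright stripes give maximal horizontal contrast:
feats = glcm_features(np.tile([0, 255], (8, 4)))
```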

  8. Comprehensive model for predicting perceptual image quality of smart mobile devices.

    PubMed

    Gong, Rui; Xu, Haisong; Luo, M R; Li, Haifeng

    2015-01-01

    An image quality model for smart mobile devices was proposed based on visual assessments of several image quality attributes. A series of psychophysical experiments was carried out on two kinds of smart mobile devices, i.e., smart phones and tablet computers, in which naturalness, colorfulness, brightness, contrast, sharpness, clearness, and overall image quality were visually evaluated under three lighting environments via the categorical judgment method for various application types of test images. On the basis of Pearson correlation coefficients and factor analysis, the overall image quality could first be predicted from its two constituent attributes with multiple linear regression functions for each type of image, and then mathematical expressions were built to link the constituent image quality attributes with the physical parameters of smart mobile devices and image appearance factors. The procedure and algorithms are applicable to various smart mobile devices, different lighting conditions, and multiple types of images, and their performance was verified by the visual data. PMID:25967010
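    The two-attribute multiple linear regression described above can be illustrated with ordinary least squares; the attribute names and numbers below are synthetic stand-ins, not the paper's visual data.

```python
import numpy as np

# Predict overall image quality from two constituent attributes
# (e.g. clearness and colorfulness) by ordinary least squares.
clearness    = np.array([2.1, 3.4, 4.0, 1.5, 3.0, 4.5])
colorfulness = np.array([3.0, 3.8, 4.2, 2.0, 2.5, 4.8])
overall      = 0.6 * clearness + 0.3 * colorfulness + 0.5  # fabricated ground truth

# Design matrix: the two attributes plus an intercept column.
X = np.column_stack([clearness, colorfulness, np.ones_like(clearness)])
coef, *_ = np.linalg.lstsq(X, overall, rcond=None)
predicted = X @ coef
```

Because the synthetic "overall" scores are an exact linear combination of the attributes, least squares recovers the coefficients exactly; real visual data would leave residuals.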

  9. No-reference remote sensing image quality assessment using a comprehensive evaluation factor

    NASA Astrophysics Data System (ADS)

    Wang, Lin; Wang, Xu; Li, Xiao; Shao, Xiaopeng

    2014-05-01

    Conventional image quality assessment algorithms, such as Peak Signal to Noise Ratio (PSNR), Mean Square Error (MSE) and structural similarity (SSIM), need the original image as a reference. They are not applicable to remote sensing images, for which the original image cannot be assumed to be available. In this paper, a No-reference Image Quality Assessment (NRIQA) algorithm is presented to evaluate the quality of remote sensing images. Since blur and noise (including stripe noise) are the common distortion factors affecting remote sensing image quality, a comprehensive evaluation factor is modeled to assess blur and noise by analyzing the image's visual properties for different incentives, combined with SSIM based on the human visual system (HVS), and to assess stripe noise by using Phase Congruency (PC). The experimental results show that this algorithm is an accurate and reliable method for remote sensing image quality assessment.
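    The full-reference metrics named above are simple to state; a minimal sketch (SSIM is omitted for brevity):

```python
import numpy as np

def mse(ref, img):
    """Mean Square Error between a reference image and a test image."""
    ref, img = np.asarray(ref, float), np.asarray(img, float)
    return np.mean((ref - img) ** 2)

def psnr(ref, img, peak=255.0):
    """Peak Signal to Noise Ratio in dB; higher means closer to the reference."""
    m = mse(ref, img)
    return np.inf if m == 0 else 10.0 * np.log10(peak ** 2 / m)

ref = np.zeros((4, 4))
noisy = ref + 5.0          # constant offset: MSE = 25
```

Both functions require `ref`, which is exactly what a no-reference method such as the one proposed must do without.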

  10. Performance of electronic portal imaging devices (EPIDs) used in radiotherapy: image quality and dose measurements.

    PubMed

    Cremers, F; Frenzel, Th; Kausch, C; Albers, D; Schönborn, T; Schmidt, R

    2004-05-01

    The aim of our study was to compare the image and dosimetric quality of two different imaging systems. The first is a fluoroscopic electronic portal imaging device (first generation), while the second is based on an amorphous silicon flat-panel array (second generation). The parameters describing image quality include spatial resolution [modulation transfer function (MTF)], noise [noise power spectrum (NPS)], and signal-to-noise transfer [detective quantum efficiency (DQE)]. The dosimetric measurements were compared with ionization chamber and film measurements. The response of the flat-panel imager and the fluoroscopic-optical device was determined by performing a two-step Monte Carlo simulation. All measurements were performed in a 6 MV linear accelerator photon beam. The resolution (MTF) of the fluoroscopic device (f 1/2 = 0.3 mm(-1)) is larger than that of the amorphous silicon based system (f 1/2 = 0.21 mm(-1)), which is due to the missing backscattered photons and the smaller pixel size. The noise measurements (NPS) show the correlation of neighboring pixels of the amorphous silicon electronic portal imaging device, whereas the NPS of the fluoroscopic system is frequency independent. At zero spatial frequency the DQE of the flat-panel imager has a value of 0.008 (0.8%). Due to the minor frequency dependence, this device may be almost x-ray quantum limited. Monte Carlo simulations verified these characteristics. For the fluoroscopic imaging system the DQE at low frequencies is about 0.0008 (0.08%) and degrades at higher frequencies. Dose measurements with the flat-panel imager revealed that images can only be directly converted to portal dose images if scatter can be neglected. Thus objects distant from the detector (e.g., an inhomogeneous dose distribution generated by a beam modifier) can be verified dosimetrically, while objects close to the detector (e.g., a patient) cannot be verified directly and must be scatter corrected prior to verification.
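    The zero-frequency DQE figures quoted above follow the standard definition DQE(0) = SNR_out^2 / SNR_in^2 with a Poisson-limited input. The sketch below illustrates only that definition, not the paper's frequency-dependent MTF/NPS measurement, and the simulated counts are invented.

```python
import numpy as np

def dqe_zero(pixel_values, photons_per_pixel):
    """DQE(0) = SNR_out^2 / SNR_in^2, where SNR_in^2 equals the mean
    incident photon count per pixel (Poisson statistics)."""
    pixel_values = np.asarray(pixel_values, dtype=float)
    snr_out_sq = (pixel_values.mean() / pixel_values.std()) ** 2
    return snr_out_sq / photons_per_pixel

# An ideal photon counter should approach DQE(0) = 1:
rng = np.random.default_rng(0)
counts = rng.poisson(10000, size=100000)
dqe = dqe_zero(counts, 10000)
```

A real EPID adds conversion noise and detects only a fraction of quanta, which is why the measured values above sit far below 1.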

  11. Image quality, tissue heating, and frame rate trade-offs in acoustic radiation force impulse imaging.

    PubMed

    Bouchard, Richard R; Dahl, Jeremy J; Hsu, Stephen J; Palmeri, Mark L; Trahey, Gregg E

    2009-01-01

    The real-time application of acoustic radiation force impulse (ARFI) imaging requires both short acquisition times for a single ARFI image and repeated acquisition of these frames. Due to the high energy of pulses required to generate appreciable radiation force, however, repeated acquisitions could result in substantial transducer face and tissue heating. We describe and evaluate several novel beam sequencing schemes which, along with parallel-receive acquisition, are designed to reduce acquisition time and heating. These techniques reduce the total number of radiation force impulses needed to generate an image and minimize the time between successive impulses. We present qualitative and quantitative analyses of the trade-offs in image quality resulting from the acquisition schemes. Results indicate that these techniques yield a significant improvement in frame rate with only moderate decreases in image quality. Tissue and transducer face heating resulting from these schemes is assessed through finite element method modeling and thermocouple measurements. Results indicate that heating issues can be mitigated by employing ARFI acquisition sequences that utilize the highest track-to-excitation ratio possible. PMID:19213633

  12. Open source database of images DEIMOS: extension for large-scale subjective image quality assessment

    NASA Astrophysics Data System (ADS)

    Vítek, Stanislav

    2014-09-01

    DEIMOS (Database of Images: Open Source) is an open-source database of images and video sequences for testing, verification and comparison of various image and/or video processing techniques such as compression, reconstruction and enhancement. This paper deals with an extension of the database that allows large-scale web-based subjective image quality assessment to be performed. The extension implements both an administrative and a client interface. The proposed system is aimed mainly at mobile communication devices, taking advantage of HTML5 technology; this means that participants do not need to install any application, and assessment can be performed using a web browser. The assessment campaign administrator can select images from the large database and then apply rules defined by various test procedure recommendations. The standard test procedures may be fully customized and saved as templates. Alternatively, the administrator can define a custom test using images from the pool and other components, such as evaluation forms and ongoing questionnaires. The image sequence is delivered to the online client, e.g. a smartphone or tablet, as a fully automated assessment sequence, or the viewer can control the timing of the assessment if required. Environmental data and viewing conditions (e.g. illumination, vibrations, GPS coordinates, etc.) may be collected and subsequently analyzed.

  13. Quality Enhancement and Nerve Fibre Layer Artefacts Removal in Retina Fundus Images by Off Axis Imaging

    SciTech Connect

    Giancardo, Luca; Meriaudeau, Fabrice; Karnowski, Thomas Paul; Li, Yaquin; Tobin Jr, Kenneth William; Chaum, Edward

    2011-01-01

    Retinal fundus images acquired with non-mydriatic digital fundus cameras are a versatile tool for the diagnosis of various retinal diseases. Because of the ease of use of newer camera models and their relatively low cost, these cameras are employed worldwide by retina specialists to diagnose diabetic retinopathy and other degenerative diseases. Even with relative ease of use, the images produced by these systems sometimes suffer from reflectance artefacts, mainly due to the nerve fibre layer (NFL) or other camera-lens-related reflections. We propose a technique that employs multiple fundus images acquired from the same patient to obtain a single higher-quality image without these reflectance artefacts. The removal of bright artefacts, and particularly of NFL reflectance, can greatly reduce false positives in the detection of retinal lesions such as exudates, drusen and cotton wool spots by automatic systems or manual inspection. If enough redundant information is provided by the multiple images, this technique also compensates for suboptimal illumination. The fundus images are acquired in a straightforward but unorthodox manner, i.e. the stare point of the patient is changed between each shot but the camera is kept fixed. Between each shot, the apparent shape and position of all the retinal structures that do not exhibit isotropic reflectance (e.g. bright artefacts) change. This physical effect is exploited by our algorithm in order to extract the pixels belonging to the inner layers of the retina, hence obtaining a single artefact-free image.
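    The core idea, that only anisotropically reflecting structures move between stare points, can be caricatured by a per-pixel minimum over the registered acquisitions; the paper's actual selection of inner-retina pixels is more sophisticated than this sketch.

```python
import numpy as np

def remove_bright_artefacts(aligned_stack):
    """Per-pixel minimum across registered acquisitions. Structures with
    isotropic reflectance stay put across stare points, while bright NFL
    and lens reflections move, so the minimum suppresses them."""
    return np.min(np.asarray(aligned_stack, dtype=float), axis=0)

# Two toy acquisitions with a bright artefact that moves between shots:
a = np.full((4, 4), 10.0); a[0, 0] = 200.0
b = np.full((4, 4), 10.0); b[1, 1] = 200.0
clean = remove_bright_artefacts([a, b])
```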

  14. Human vision model for the objective evaluation of perceived image quality applied to MRI and image restoration

    NASA Astrophysics Data System (ADS)

    Salem, Kyle A.; Wilson, David L.

    2002-12-01

    We are developing a method to objectively quantify image quality and applying it to the optimization of interventional magnetic resonance imaging (iMRI). In iMRI, images are used for live-time guidance of interventional procedures such as the minimally invasive treatment of cancer. Hence, not only does one desire high quality images, but they must also be acquired quickly. In iMRI, images are acquired in the Fourier domain, or k-space, and this allows many creative ways to image quickly such as keyhole imaging where k-space is preferentially subsampled, yielding suboptimal images at very high frame rates. Other techniques include spiral, radial, and the combined acquisition technique. We have built a perceptual difference model (PDM) that incorporates various components of the human visual system. The PDM was validated using subjective image quality ratings by naive observers and task-based measures defined by interventional radiologists. Using the PDM, we investigated the effects of various imaging parameters on image quality and quantified the degradation due to novel imaging techniques. Results have provided significant information about imaging time versus quality tradeoffs aiding the MR sequence engineer. The PDM has also been used to evaluate other applications such as Dixon fat suppressed MRI and image restoration. In image restoration, the PDM has been used to evaluate the Generalized Minimal Residual (GMRES) image restoration method and to examine the ability to appropriately determine a stopping condition for such iterative methods. The PDM has been shown to be an objective tool for measuring image quality and can be used to determine the optimal methodology for various imaging applications.

  15. Diffusion imaging quality control via entropy of principal direction distribution.

    PubMed

    Farzinfar, Mahshid; Oguz, Ipek; Smith, Rachel G; Verde, Audrey R; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C; Paterson, Sarah; Evans, Alan C; Styner, Martin A

    2013-11-15

    Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, "venetian blind" artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, often only visible when investigating groups of DWI's or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. High entropy value indicates that the PDs are distributed relatively uniformly, while low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. 
Such residual artifacts cause a directional bias in the measured PD.

  16. Diffusion imaging quality control via entropy of principal direction distribution

    PubMed Central

    Oguz, Ipek; Smith, Rachel G.; Verde, Audrey R.; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L.; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C.; Paterson, Sarah; Evans, Alan C.; Styner, Martin A.

    2013-01-01

    Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, “venetian blind” artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, often only visible when investigating groups of DWI's or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. High entropy value indicates that the PDs are distributed relatively uniformly, while low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and here
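    The entropy measure itself is compact: histogram the (antipodally symmetrized) principal directions and take the Shannon entropy. The binning below is a simple illustrative choice, not the paper's regional scheme.

```python
import numpy as np

def pd_entropy(directions, n_bins=64):
    """Shannon entropy of a histogram of principal diffusion directions.

    Directions are unit vectors of shape (n, 3); antipodal symmetry is
    handled by flipping every vector into the z >= 0 hemisphere, and the
    histogram is taken over (azimuth, inclination)."""
    d = np.asarray(directions, float)
    d = np.where(d[:, 2:3] < 0, -d, d)            # v and -v are the same PD
    az = np.arctan2(d[:, 1], d[:, 0])             # azimuth in [-pi, pi]
    inc = np.arccos(np.clip(d[:, 2], -1, 1))      # inclination in [0, pi/2]
    hist, _, _ = np.histogram2d(az, inc, bins=n_bins)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))
```

Uniformly scattered PDs give a high entropy, while a single coherent bundle collapses into one bin and gives an entropy of zero, matching the interpretation in the abstract.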

  17. Integrating empowerment evaluation and quality improvement to achieve healthcare improvement outcomes

    PubMed Central

    Wandersman, Abraham; Alia, Kassandra Ann; Cook, Brittany; Ramaswamy, Rohit

    2015-01-01

    While the body of evidence-based healthcare interventions grows, the ability of health systems to deliver these interventions effectively and efficiently lags behind. Quality improvement approaches, such as the model for improvement, have demonstrated some success in healthcare but their impact has been lessened by implementation challenges. To help address these challenges, we describe the empowerment evaluation approach that has been developed by programme evaluators and a method for its application (Getting To Outcomes (GTO)). We then describe how GTO can be used to implement healthcare interventions. An illustrative healthcare quality improvement example that compares the model for improvement and the GTO method for reducing hospital admissions through improved diabetes care is described. We conclude with suggestions for integrating GTO and the model for improvement. PMID:26178332

  18. A widefield fluorescence microscope with a linear image sensor for image cytometry of biospecimens: Considerations for image quality optimization

    NASA Astrophysics Data System (ADS)

    Hutcheson, Joshua A.; Majid, Aneeka A.; Powless, Amy J.; Muldoon, Timothy J.

    2015-09-01

    Linear image sensors have been widely used in numerous research and industry applications to provide continuous imaging of moving objects. Here, we present a widefield fluorescence microscope with a linear image sensor used to image translating objects for image cytometry. First, a calibration curve was characterized for a custom microfluidic chamber over a span of volumetric pump rates. Image data were also acquired using 15 μm fluorescent polystyrene spheres on a slide with a motorized translation stage in order to match linear translation speed with line exposure periods to preserve the image aspect ratio. Aspect ratios were then calculated after imaging to ensure quality control of image data. Fluorescent beads were imaged in suspension flowing through the microfluidic chamber pumped by a mechanical syringe pump at 16 μl min-1 with a line exposure period of 150 μs. The line period was selected to acquire images of fluorescent beads with a 40 dB signal-to-background ratio. A motorized translation stage was then used to transport conventional glass slides of stained cellular biospecimens. Whole blood collected from healthy volunteers was stained with 0.02% (w/v) proflavine hemisulfate and imaged to highlight leukocyte morphology with a 1.56 mm × 1.28 mm field of view (1540 ms total acquisition time). Oral squamous cells were also collected from healthy volunteers and stained with 0.01% (w/v) proflavine hemisulfate to demonstrate quantifiable subcellular features and an average nuclear to cytoplasmic ratio of 0.03 (n = 75), with a resolution of 0.31 μm pixels-1.

  19. A widefield fluorescence microscope with a linear image sensor for image cytometry of biospecimens: Considerations for image quality optimization.

    PubMed

    Hutcheson, Joshua A; Majid, Aneeka A; Powless, Amy J; Muldoon, Timothy J

    2015-09-01

    Linear image sensors have been widely used in numerous research and industry applications to provide continuous imaging of moving objects. Here, we present a widefield fluorescence microscope with a linear image sensor used to image translating objects for image cytometry. First, a calibration curve was characterized for a custom microfluidic chamber over a span of volumetric pump rates. Image data were also acquired using 15 μm fluorescent polystyrene spheres on a slide with a motorized translation stage in order to match linear translation speed with line exposure periods to preserve the image aspect ratio. Aspect ratios were then calculated after imaging to ensure quality control of image data. Fluorescent beads were imaged in suspension flowing through the microfluidic chamber pumped by a mechanical syringe pump at 16 μl min(-1) with a line exposure period of 150 μs. The line period was selected to acquire images of fluorescent beads with a 40 dB signal-to-background ratio. A motorized translation stage was then used to transport conventional glass slides of stained cellular biospecimens. Whole blood collected from healthy volunteers was stained with 0.02% (w/v) proflavine hemisulfate and imaged to highlight leukocyte morphology with a 1.56 mm × 1.28 mm field of view (1540 ms total acquisition time). Oral squamous cells were also collected from healthy volunteers and stained with 0.01% (w/v) proflavine hemisulfate to demonstrate quantifiable subcellular features and an average nuclear to cytoplasmic ratio of 0.03 (n = 75), with a resolution of 0.31 μm pixels(-1). PMID:26429450

  20. A widefield fluorescence microscope with a linear image sensor for image cytometry of biospecimens: Considerations for image quality optimization

    SciTech Connect

    Hutcheson, Joshua A.; Majid, Aneeka A.; Powless, Amy J.; Muldoon, Timothy J.

    2015-09-15

    Linear image sensors have been widely used in numerous research and industry applications to provide continuous imaging of moving objects. Here, we present a widefield fluorescence microscope with a linear image sensor used to image translating objects for image cytometry. First, a calibration curve was characterized for a custom microfluidic chamber over a span of volumetric pump rates. Image data were also acquired using 15 μm fluorescent polystyrene spheres on a slide with a motorized translation stage in order to match linear translation speed with line exposure periods to preserve the image aspect ratio. Aspect ratios were then calculated after imaging to ensure quality control of image data. Fluorescent beads were imaged in suspension flowing through the microfluidic chamber pumped by a mechanical syringe pump at 16 μl min{sup −1} with a line exposure period of 150 μs. The line period was selected to acquire images of fluorescent beads with a 40 dB signal-to-background ratio. A motorized translation stage was then used to transport conventional glass slides of stained cellular biospecimens. Whole blood collected from healthy volunteers was stained with 0.02% (w/v) proflavine hemisulfate and imaged to highlight leukocyte morphology with a 1.56 mm × 1.28 mm field of view (1540 ms total acquisition time). Oral squamous cells were also collected from healthy volunteers and stained with 0.01% (w/v) proflavine hemisulfate to demonstrate quantifiable subcellular features and an average nuclear to cytoplasmic ratio of 0.03 (n = 75), with a resolution of 0.31 μm pixels{sup −1}.
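    The speed-matching these records describe reduces to one constraint: the object advance per line period (speed × period) must equal the object-plane pixel footprint (pixel pitch / magnification). The numbers below are illustrative, not the hardware values above.

```python
def line_period_for_square_pixels(pixel_pitch_um, magnification, speed_um_per_s):
    """Line exposure period (s) that preserves a 1:1 aspect ratio:
    the object must advance exactly one object-plane pixel per line."""
    object_plane_pixel_um = pixel_pitch_um / magnification
    return object_plane_pixel_um / speed_um_per_s

def aspect_ratio(pixel_pitch_um, magnification, speed_um_per_s, line_period_s):
    """Ratio of along-scan to cross-scan sampling; 1.0 means square pixels."""
    return (speed_um_per_s * line_period_s) / (pixel_pitch_um / magnification)

# 4 um pixels at 8x magnification give a 0.5 um object-plane footprint;
# at 1000 um/s the matching line period is 0.5 ms.
period = line_period_for_square_pixels(4.0, 8.0, 1000.0)
```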

  1. Optimization of exposure in panoramic radiography while maintaining image quality using adaptive filtering.

    PubMed

    Svenson, Björn; Larsson, Lars; Båth, Magnus

    2016-01-01

    Objective: The purpose of the present study was to investigate the potential of using advanced external adaptive image processing for maintaining image quality while reducing exposure in dental panoramic storage phosphor plate (SPP) radiography. Materials and methods: Thirty-seven SPP radiographs of a skull phantom were acquired using a Scanora panoramic X-ray machine with various tube load, tube voltage, SPP sensitivity and filtration settings. The radiographs were processed using General Operator Processor (GOP) technology. Fifteen dentists, all within the dental radiology field, compared the structural image quality of each radiograph with a reference image on a 5-point rating scale in a visual grading characteristics (VGC) study. The reference image was acquired with the acquisition parameters commonly used in daily operation (70 kVp, 150 mAs and sensitivity class 200) and processed using the standard process parameters supplied by the modality vendor. Results: All GOP-processed images with a dose similar to (or higher than) that of the reference image resulted in higher image quality than the reference. All GOP-processed images with image quality similar to that of the reference image were acquired at a lower dose than the reference. This indicates that the external image processing improved the image quality compared with the standard processing. Regarding acquisition parameters, no strong dependency of the image quality on the radiation quality was seen; the image quality was mainly affected by the dose. Conclusions: The present study indicates that advanced external adaptive image processing may be beneficial in panoramic radiography for increasing the image quality of SPP radiographs or for reducing the exposure while maintaining image quality. PMID:26478956

  2. Beyond image quality: designing engaging interactions with digital products

    NASA Astrophysics Data System (ADS)

    de Ridder, Huib; Rozendaal, Marco C.

    2008-02-01

    Ubiquitous computing (or Ambient Intelligence) promises a world in which information is available anytime, anywhere, and with which humans can interact in a natural, multimodal way. In such a world, perceptual image quality remains an important criterion, since most information will be displayed visually, but other criteria such as enjoyment, fun, engagement and hedonic quality are emerging. This paper deals with engagement: the intrinsically enjoyable readiness to put more effort into exploring and/or using a product than strictly required, thus attracting and keeping the user's attention for a longer period of time. The impact of the experienced richness of an interface, both in its visual appearance and in the range of possible manipulations, was investigated in a series of experiments employing game-like user interfaces. This resulted in the extension of an existing conceptual framework relating engagement to richness by means of two intermediating variables, namely experienced challenge and sense of control. Predictions from this revised framework are evaluated against the results of an earlier experiment assessing the ergonomic and hedonic qualities of interactive media. The test material consisted of interactive CD-ROMs containing presentations of three companies for future customers.

  3. Measuring the image quality of digital-camera sensors by a ping-pong ball

    NASA Astrophysics Data System (ADS)

    Pozo, Antonio M.; Rubiño, Manuel; Castro, José J.; Salas, Carlos; Pérez-Ocón, Francisco

    2014-07-01

    In this work, we present a low-cost experimental setup to evaluate the image quality of digital-camera sensors, which can be implemented in undergraduate and postgraduate teaching. The method consists of evaluating the modulation transfer function (MTF) of digital-camera sensors by speckle patterns using a ping-pong ball as a diffuser, with two handmade circular apertures acting as input and output ports, respectively. To specify the spatial-frequency content of the speckle pattern, it is necessary to use an aperture; for this, we made a slit in a piece of black cardboard. First, the MTF of a digital-camera sensor was calculated using the ping-pong ball and the handmade slit, and then the MTF was calculated using an integrating sphere and a high-quality steel slit. Finally, the results achieved with both experimental setups were compared, showing a similar MTF in both cases.
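    The speckle principle behind both setups can be sketched numerically: the measured power spectrum is the input spectrum filtered by MTF^2, so their ratio recovers the MTF. The sketch below assumes the input PSD is known (the role the slit plays in the setup), and the simulated sensor blur is invented for illustration.

```python
import numpy as np

def mtf_from_speckle(speckle_img, input_psd):
    """Relative MTF from a speckle pattern: PSD_out(f) = MTF(f)^2 * PSD_in(f),
    so MTF(f) ~ sqrt(PSD_out / PSD_in), averaged over rows and normalized
    to MTF(0) = 1."""
    img = np.asarray(speckle_img, float) - np.mean(speckle_img)
    psd_out = np.mean(np.abs(np.fft.rfft(img, axis=1)) ** 2, axis=0)
    mtf = np.sqrt(psd_out / input_psd)
    return mtf / mtf[0]

# Simulate: white "speckle" blurred by a circular 2-tap average, whose
# true MTF is |cos(pi * f)| with f in cycles/pixel.
rng = np.random.default_rng(0)
field = rng.normal(size=(4000, 256))
input_psd = np.mean(np.abs(np.fft.rfft(field - field.mean(), axis=1)) ** 2, axis=0)
blurred = 0.5 * (field + np.roll(field, 1, axis=1))
mtf = mtf_from_speckle(blurred, input_psd)
```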

  4. Concepts for evaluation of image quality in digital radiology

    NASA Astrophysics Data System (ADS)

    Zscherpel, U.; Ewert, U.; Jechow, M.

    2012-05-01

    Concepts for digital image evaluation are presented for Computed Radiography (CR) and Digital Detector Arrays (DDAs) used for weld inspection. Precise DDA calibration yields an extraordinary increase in contrast sensitivity, up to 10 times that of film radiography. Restrictions in spatial resolution caused by the pixel size of the DDA are compensated by the increased contrast sensitivity. The first CR standards were published in 2005 to support the application of phosphor imaging plates in lieu of X-ray film, but they already need revision based on experiences reported by many users. One of the key concepts is the use of signal-to-noise ratio (SNR) measurements as an equivalent to the optical density of film and the film system class. The contrast sensitivity, measured by IQI visibility, depends on three essential parameters: the basic spatial resolution (SRb) of the radiographic image, the achieved signal-to-noise ratio (SNR) and the specific contrast (μeff - effective attenuation coefficient). Knowing these three parameters for a given exposure condition, inspected material and monitor viewing condition permits calculation of the just-visible IQI element. Furthermore, this enables the optimization of exposure conditions. The new ISO/FDIS 17636-2 describes the practice for digital radiography with CR and DDAs. It considers for the first time compensation principles derived from the three essential parameters. The consequences are described.
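    The SNR-as-density concept can be illustrated with the normalized SNR used in the CR/DDA standards, where the measured SNR is scaled to a fixed 88.6 μm reference resolution so that systems with different SRb can be compared. Treat this sketch as a paraphrase of that idea; ISO/FDIS 17636-2 is the authority for the exact procedure.

```python
import numpy as np

def measured_snr(roi):
    """SNR of a uniformly exposed region: mean signal over noise std."""
    roi = np.asarray(roi, float)
    return roi.mean() / roi.std()

def normalized_snr(snr_measured, srb_um):
    """Scale measured SNR to the 88.6 um reference basic spatial
    resolution, so detectors with different SRb become comparable."""
    return snr_measured * 88.6 / srb_um

# A detector with coarser SRb gets its measured SNR scaled down:
roi = np.array([[998.0, 1002.0], [998.0, 1002.0]])  # toy uniform exposure
snr = measured_snr(roi)                              # 1000 / 2 = 500
```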

  5. Damage and quality assessment in wheat by NIR hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Delwiche, Stephen R.; Kim, Moon S.; Dong, Yanhong

    2010-04-01

    Fusarium head blight is a fungal disease that affects the world's small grains, such as wheat and barley. Attacking the spikelets during development, the fungus reduces yield and leaves grain of poorer processing quality. It is also a health concern because of the secondary metabolite deoxynivalenol, which often accompanies the fungus. While chemical methods exist to measure the concentration of the mycotoxin, and manual visual inspection is used to ascertain the level of Fusarium damage, research has been active in developing fast, optically based techniques that can assess this form of damage. In the current study a near-infrared (1000-1700 nm) hyperspectral image system was assembled and applied to Fusarium-damaged kernel recognition. With an eventual multispectral imaging system design in mind, five wavelengths were manually selected from a pool of 146 images as the most promising, such that when combined in pairs or triplets, Fusarium damage could be identified. We present the results of two pairs of wavelengths [(1199, 1474 nm) and (1315, 1474 nm)] whose reflectance values produced adequate separation of kernels of healthy appearance (i.e., asymptomatic condition) from kernels possessing Fusarium damage.

  6. Multiscale bilateral filtering for improving image quality in digital breast tomosynthesis

    PubMed Central

    Lu, Yao; Chan, Heang-Ping; Wei, Jun; Hadjiiski, Lubomir M.; Samala, Ravi K.

    2015-01-01

    and enhanced the CNR of microcalcifications compared to the TpV method, thus preserving the image quality of the structured background. The MSBF method achieved the highest CNR of microcalcifications among the three methods. The FWHM of the microcalcifications and mass spiculations resulting from the MSBF method was comparable to that without regularization, and superior to that of the TpV method. Conclusions: The SART regularized by the multiscale bilateral filtering method enhanced the CNR of microcalcifications and preserved the sharpness of microcalcifications and spiculated masses. The MSBF method provided better image quality of the structured background and was superior to TpV and NR for enhancing microcalcifications while preserving the appearance of mass margins. PMID:25563259
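    As a point of reference for the comparisons above, a common (though not the only) definition of the contrast-to-noise ratio of a feature, such as a microcalcification, against its local background is sketched below; the specific form, feature mean minus background mean over background standard deviation, is an assumption of this illustration:

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio of a feature against its local background.

    Uses the common definition (mean_signal - mean_background) / std_background;
    other variants pool the noise of both regions.
    """
    s = np.asarray(signal_roi, dtype=float)
    b = np.asarray(background_roi, dtype=float)
    return (s.mean() - b.mean()) / b.std(ddof=1)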

  7. Comparison of no-reference image quality assessment machine learning-based algorithms on compressed images

    NASA Astrophysics Data System (ADS)

    Charrier, Christophe; Saadane, AbdelHakim; Fernandez-Maloigne, Christine

    2015-01-01

    No-reference image quality metrics are of fundamental interest as they can be embedded in practical applications. The main goal of this paper is to perform a comparative study of seven well-known no-reference learning-based image quality algorithms. To test the performance of these algorithms, three public databases are used. As a first step, the trial algorithms are compared when no new learning is performed. The second step investigates how the training set influences the results. The Spearman Rank Ordered Correlation Coefficient (SROCC) is utilized to measure and compare the performance. In addition, a hypothesis test is conducted to evaluate the statistical significance of each tested algorithm's performance.
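    The SROCC used as the performance measure here correlates the ranks of objective scores with the ranks of subjective opinion scores; a minimal sketch (assuming no tied scores, which a full implementation would need to handle):

```python
import numpy as np

def srocc(predicted, subjective):
    """Spearman rank-ordered correlation between objective quality scores
    and subjective opinion scores (no ties assumed, for simplicity)."""
    def ranks(x):
        r = np.empty(len(x))
        r[np.argsort(x)] = np.arange(1, len(x) + 1)  # rank 1 = smallest value
        return r
    rp, rs = ranks(np.asarray(predicted)), ranks(np.asarray(subjective))
    d = rp - rs
    n = len(d)
    return 1.0 - 6.0 * np.sum(d**2) / (n * (n**2 - 1))
```

A value near 1 means the metric orders images the way human observers do; near -1 means the ordering is inverted.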

  8. Broadcast quality 3840 × 2160 color imager operating at 30 frames/s

    NASA Astrophysics Data System (ADS)

    Iodice, Robert M.; Joyner, Michael; Hong, Canaan S.; Parker, David P.

    2003-05-01

    Both the active column sensor (ACS) pixel sensing technology and the PVS-Bus multiplexer technology have been applied to a color imaging array to produce an extraordinarily high-resolution color imager of greater than 8 million pixels, with image quality and speed suitable for a broad range of applications including digital cinema, broadcast video and security/surveillance. The imager has been realized in a standard 0.5 μm CMOS technology using double-poly, triple-metal (DP3M) construction and features a pixel size of 7.5 μm by 7.5 μm. Mask-level stitching enables the construction of a high-quality, low-dark-current imager with an array size of 16.2 mm by 28.8 mm. The image array aspect ratio is 16:9 with a diagonal of 33 mm, making it suitable for HDTV applications using optics designed for 35 mm still photography. A high modulation transfer function (MTF) is maintained by utilizing microlenses along with an RGB Bayer-pattern color filter array. The frame rate of 30 frames/s in progressive mode is achieved using the PVS-Bus technology with eight output ports, which corresponds to an overall pixel rate of 248 Mpixels per second. High dynamic range and low fixed-pattern noise are achieved by combining photodiode pixels with the ACS pixel sensing technology and a modified correlated double-sampling (CDS) technique. Exposure time can be programmed by the user from a full frame of integration down to a single line of integration, in steps of 14.8 μs. The output gain is programmable from 0 dB to +12 dB in 256 steps; the output offset is also programmable over a range of 765 mV in 256 steps. This QuadHDTV imager has been delivered to customers and has been demonstrated in a prototype camera that provides full-resolution video with all image processing on board. The prototype camera operates at 2160p24, 2160p30 and 2160i60.

  9. Content-weighted video quality assessment using a three-component image model

    NASA Astrophysics Data System (ADS)

    Li, Chaofeng; Bovik, Alan Conrad

    2010-01-01

    Objective image and video quality measures play important roles in numerous image and video processing applications. In this work, we propose a new content-weighted method for full-reference (FR) video quality assessment using a three-component image model. Using the idea that different image regions have different perceptual significance relative to quality, we deploy a model that classifies image local regions according to their image gradient properties, then apply variable weights to structural similarity image index (SSIM) [and peak signal-to-noise ratio (PSNR)] scores according to region. A frame-based video quality assessment algorithm is thereby derived. Experimental results on the Video Quality Experts Group (VQEG) FR-TV Phase 1 test dataset show that the proposed algorithm outperforms existing video quality assessment methods.
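    The region-weighting idea can be illustrated with a toy example. This sketch weights a squared-error (PSNR-style) distortion map rather than an SSIM map, and the gradient percentile thresholds and the three region weights are invented for illustration, not taken from the paper:

```python
import numpy as np

def content_weighted_psnr(ref, dist, w_edge=1.0, w_texture=0.5, w_smooth=0.25):
    """Toy three-component weighting: classify pixels by gradient magnitude
    into edge / texture / smooth regions, then weight the squared-error map
    accordingly before forming a PSNR-like score (8-bit images assumed)."""
    ref = np.asarray(ref, dtype=float)
    gy, gx = np.gradient(ref)
    gmag = np.hypot(gx, gy)
    # illustrative thresholds: top 10% of gradients = edges, next 40% = texture
    t_hi, t_lo = np.percentile(gmag, 90), np.percentile(gmag, 50)
    w = np.where(gmag >= t_hi, w_edge,
                 np.where(gmag >= t_lo, w_texture, w_smooth))
    mse = np.sum(w * (ref - np.asarray(dist, dtype=float))**2) / np.sum(w)
    return 10 * np.log10(255.0**2 / mse) if mse > 0 else np.inf
```

In the paper the same classification drives per-region weights on SSIM (and PSNR) scores, which are then pooled into a per-frame video quality score.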

  10. Potential of organic filter materials for treating greywater to achieve irrigation quality: a review.

    PubMed

    Dalahmeh, Sahar S; Hylander, Lars D; Vinnerås, Björn; Pell, Mikael; Oborn, Ingrid; Jönsson, Håkan

    2011-01-01

    The objectives of this literature review were to: (i) evaluate the impact of greywater generated in rural communities, with emphasis on Jordanian conditions, on soil, plants and public health, and assess the need to treat this greywater before it is used for irrigation; and (ii) assess the potential of different types of organic by-products as carrier material in different filter units for removal of pollutants from greywater. Greywater with high BOD5 and COD, high concentrations of SS, fat, oil and grease, and high levels of surfactants is commonly found in rural areas of Jordan. Oxygen depletion, odour emission, hydrophobic soil phenomena, plant toxicity, blockage of piping systems and microbiological health risks are common problems associated with untreated greywater. Organic by-products such as wood chips, bark, peat, wheat straw and corncob may be used as carrier material in so-called mulch filters for treating wastewater and greywater from different sources. A vertical filter operated in down-flow mode is a common setup for mulch filters. Wastewaters with a wide range of SS, cBOD5 and COD fed into different mulch filters have been studied. The different mulch materials achieved SS removal ranging between 51 and 91%, BOD5 reduction of 55-99.9%, and COD removal of 51-98%. Most types of mulch achieved higher organic matter removal than an ordinary septic tank. Bark, peat and wood-chip filters removed organic matter better than sand and trickling filters under similar conditions. Release of filter material and an increase in effluent COD were reported for some mulch materials. In conclusion, some mulch materials, such as bark, peat and wood chips, seem to have great potential for the treatment of greywater in robust, low-tech systems. They can be expected to be resilient in dealing with variable low and high organic loads and shock loads. PMID:21902020

  11. Characterizing image quality in a scanning laser ophthalmoscope with differing pinholes and induced scattered light

    NASA Astrophysics Data System (ADS)

    Hunter, Jennifer J.; Cookson, Christopher J.; Kisilak, Marsha L.; Bueno, Juan M.; Campbell, Melanie C. W.

    2007-05-01

    We quantify the effects on scanning laser ophthalmoscope image quality of controlled amounts of scattered light, confocal pinhole diameter, and age. Optical volumes through the optic nerve head were recorded for a range of pinhole sizes in 12 subjects (19-64 years). The usefulness of various overall metrics in quantifying the changes in fundus image quality is assessed. For registered and averaged images, we calculated signal-to-noise ratio, entropy, and acutance. Entropy was best able to distinguish differing image quality. The optimum confocal pinhole diameter was found to be 50 μm (on the retina), providing improved axial resolution and image quality under all conditions.
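    The entropy metric that best discriminated image quality in this study is the Shannon entropy of the image's gray-level histogram; a minimal sketch (8-bit gray levels assumed):

```python
import numpy as np

def image_entropy(image, bins=256):
    """Shannon entropy of an image's gray-level histogram, in bits/pixel.
    Higher values indicate a richer gray-level distribution; uniform
    (featureless) images score 0."""
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]                        # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))
```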

  12. Image Quality Performance Measurement of the microPET Focus 120

    NASA Astrophysics Data System (ADS)

    Ballado, Fernando Trejo; López, Nayelli Ortega; Flores, Rafael Ojeda; Ávila-Rodríguez, Miguel A.

    2010-12-01

    The aim of this work is to evaluate the characteristics involved in image reconstruction on the microPET Focus 120. Two different phantoms were used for this evaluation: a miniature hot-rod Derenzo phantom and a National Electrical Manufacturers Association (NEMA) NU4-2008 image quality (IQ) phantom. The best image quality was obtained using OSEM3D as the reconstruction method, reaching a spatial resolution of 1.5 mm with the Derenzo phantom filled with 18F. Image quality test results indicate superior image quality for the Focus 120 compared to previous microPET models.

  13. Task-based measures of image quality and their relation to radiation dose and patient risk

    PubMed Central

    Barrett, Harrison H.; Myers, Kyle J.; Hoeschen, Christoph; Kupinski, Matthew A.; Little, Mark P.

    2015-01-01

    The theory of task-based assessment of image quality is reviewed in the context of imaging with ionizing radiation, and objective figures of merit (FOMs) for image quality are summarized. The variation of the FOMs with the task, the observer and especially with the mean number of photons recorded in the image is discussed. Then various standard methods for specifying radiation dose are reviewed and related to the mean number of photons in the image and hence to image quality. Current knowledge of the relation between local radiation dose and the risk of various adverse effects is summarized, and some graphical depictions of the tradeoffs between image quality and risk are introduced. Then various dose-reduction strategies are discussed in terms of their effect on task-based measures of image quality. PMID:25564960
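    One standard task-based figure of merit of the kind this review covers is the detectability index of the ideal linear (Hotelling) observer for a known signal in Gaussian noise, d'^2 = s^T K^{-1} s; a minimal sketch (in practice the covariance K is estimated from an ensemble of images rather than given):

```python
import numpy as np

def hotelling_dprime(signal, cov):
    """Detectability index d' of the ideal linear (Hotelling) observer for
    a known signal `signal` in noise with covariance matrix `cov`:
        d'^2 = s^T K^{-1} s
    Larger d' means the detection task is easier at this dose/quality."""
    s = np.asarray(signal, dtype=float).ravel()
    K = np.asarray(cov, dtype=float)
    return float(np.sqrt(s @ np.linalg.solve(K, s)))
```

Because the mean number of recorded photons scales the signal and the noise covariance differently, d' ties image quality directly to dose, which is the tradeoff the review's figures depict.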

  14. Task-based measures of image quality and their relation to radiation dose and patient risk

    NASA Astrophysics Data System (ADS)

    Barrett, Harrison H.; Myers, Kyle J.; Hoeschen, Christoph; Kupinski, Matthew A.; Little, Mark P.

    2015-01-01

    The theory of task-based assessment of image quality is reviewed in the context of imaging with ionizing radiation, and objective figures of merit (FOMs) for image quality are summarized. The variation of the FOMs with the task, the observer and especially with the mean number of photons recorded in the image is discussed. Then various standard methods for specifying radiation dose are reviewed and related to the mean number of photons in the image and hence to image quality. Current knowledge of the relation between local radiation dose and the risk of various adverse effects is summarized, and some graphical depictions of the tradeoffs between image quality and risk are introduced. Then various dose-reduction strategies are discussed in terms of their effect on task-based measures of image quality.

  15. Task-based measures of image quality and their relation to radiation dose and patient risk.

    PubMed

    Barrett, Harrison H; Myers, Kyle J; Hoeschen, Christoph; Kupinski, Matthew A; Little, Mark P

    2015-01-21

    The theory of task-based assessment of image quality is reviewed in the context of imaging with ionizing radiation, and objective figures of merit (FOMs) for image quality are summarized. The variation of the FOMs with the task, the observer and especially with the mean number of photons recorded in the image is discussed. Then various standard methods for specifying radiation dose are reviewed and related to the mean number of photons in the image and hence to image quality. Current knowledge of the relation between local radiation dose and the risk of various adverse effects is summarized, and some graphical depictions of the tradeoffs between image quality and risk are introduced. Then various dose-reduction strategies are discussed in terms of their effect on task-based measures of image quality. PMID:25564960

  16. SENTINEL-2 image quality and level 1 processing

    NASA Astrophysics Data System (ADS)

    Meygret, Aimé; Baillarin, Simon; Gascon, Ferran; Hillairet, Emmanuel; Dechoz, Cécile; Lacherade, Sophie; Martimort, Philippe; Spoto, François; Henry, Patrice; Duca, Riccardo

    2009-08-01

    In the framework of the Global Monitoring for Environment and Security (GMES) programme, the European Space Agency (ESA), in partnership with the European Commission (EC), is developing the SENTINEL-2 optical imaging mission devoted to the operational monitoring of land and coastal areas. The Sentinel-2 mission is based on a twin-satellite configuration deployed in a polar sun-synchronous orbit and is designed to offer a unique combination of systematic global coverage with a wide field of view (290 km), high revisit (5 days at the equator with two satellites), high spatial resolution (10 m, 20 m and 60 m) and multi-spectral imagery (13 bands in the visible and shortwave infrared spectrum). SENTINEL-2 will ensure data continuity with the SPOT and LANDSAT multispectral sensors while accounting for future service evolution. This paper presents the main geometric and radiometric image quality requirements for the mission. The strong multi-spectral and multi-temporal registration requirements constrain the stability of the platform and the ground processing, which will automatically refine the geometric physical model through correlation techniques. The geolocation of the images will benefit from a worldwide reference data set made of SENTINEL-2 data strips geolocated through a global space triangulation. This processing is detailed through the description of the level 1C production, which will provide users with ortho-images of top-of-atmosphere reflectances. The huge amount of data (1.4 Tbits per orbit) is also a challenge for the ground processing, which will process all acquired data to level 1C. Finally, we discuss the different geometric (line of sight, focal plane cartography, ...) and radiometric (relative and absolute camera sensitivity) in-flight calibration methods that will take advantage of the on-board sun diffuser and ground targets to meet the stringent mission requirements.

  17. Image reconstruction for PET/CT scanners: past achievements and future challenges

    PubMed Central

    Tong, Shan; Alessio, Adam M; Kinahan, Paul E

    2011-01-01

    PET is a medical imaging modality with proven clinical value for disease diagnosis and treatment monitoring. The integration of PET and CT on modern scanners provides a synergy of the two imaging modalities. Through different mathematical algorithms, PET data can be reconstructed into the spatial distribution of the injected radiotracer. With dynamic imaging, kinetic parameters of specific biological processes can also be determined. Numerous efforts have been devoted to the development of PET image reconstruction methods over the last four decades, encompassing analytic and iterative reconstruction methods. This article provides an overview of the commonly used methods. Current challenges in PET image reconstruction include more accurate quantitation, TOF imaging, system modeling, motion correction and dynamic reconstruction. Advances in these aspects could enhance the use of PET/CT imaging in patient care and in clinical research studies of pathophysiology and therapeutic interventions. PMID:21339831
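    The prototypical iterative method among those the review surveys is maximum-likelihood expectation maximization (MLEM); a bare-bones sketch with a dense system matrix, ignoring normalization, attenuation, scatter and randoms (all of which real PET reconstruction must model):

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Minimal MLEM iteration for emission tomography:
        x <- x / (A^T 1) * A^T ( y / (A x) )
    A: system matrix (detector bins x image voxels), y: measured counts.
    Each update preserves non-negativity and increases the Poisson
    log-likelihood of the estimate x."""
    A = np.asarray(A, dtype=float)
    y = np.asarray(y, dtype=float)
    x = np.ones(A.shape[1])                    # flat initial image
    sens = A.sum(axis=0)                       # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x                           # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)
        x = x / np.where(sens > 0, sens, 1.0) * (A.T @ ratio)
    return x
```

OSEM, the clinical workhorse, accelerates this by applying the same update over ordered subsets of the projection data.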

  18. Sparse Representation-Based Image Quality Index With Adaptive Sub-Dictionaries.

    PubMed

    Li, Leida; Cai, Hao; Zhang, Yabin; Lin, Weisi; Kot, Alex C; Sun, Xingming

    2016-08-01

    Distortions cause structural changes in digital images, leading to degraded visual quality. Dictionary-based sparse representation has been widely studied recently due to its ability to extract inherent image structures. Meanwhile, it can extract image features with slightly higher-level semantics. Intuitively, sparse representation can be used for image quality assessment, because visible distortions can cause significant changes to the sparse features. In this paper, a new sparse representation-based image quality assessment model is proposed based on the construction of adaptive sub-dictionaries. An overcomplete dictionary trained from natural images is employed to capture the structure changes between the reference and distorted images by sparse feature extraction via adaptive sub-dictionary selection. Based on the observation that image sparse features are invariant to weak degradations and the perceived image quality is generally influenced by diverse issues, three auxiliary quality features are added, including gradient, color, and luminance information. The proposed method is not sensitive to training images, so a universal dictionary can be adopted for quality evaluation. Extensive experiments on five public image quality databases demonstrate that the proposed method produces state-of-the-art results and performs consistently well across different image quality databases. PMID:27295675
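    The sparse feature extraction such models rely on can be illustrated with a bare-bones orthogonal matching pursuit (OMP); this is a generic sketch of sparse coding over a dictionary, not the paper's adaptive sub-dictionary algorithm, and real systems use large learned overcomplete dictionaries:

```python
import numpy as np

def omp(D, x, n_nonzero):
    """Greedy orthogonal matching pursuit: approximate signal x as a sparse
    combination of dictionary atoms (columns of D).  At each step, pick the
    atom most correlated with the residual, then re-fit all selected atoms
    by least squares."""
    residual, support = x.astype(float), []
    coef = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        sub = D[:, support]
        sol, *_ = np.linalg.lstsq(sub, x, rcond=None)
        residual = x - sub @ sol
    coef[support] = sol
    return coef
```

Comparing the sparse coefficients of reference and distorted image patches then yields the structural part of the quality score.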

  19. Functional imaging using the retinal function imager: direct imaging of blood velocity, achieving fluorescein angiography-like images without any contrast agent, qualitative oximetry, and functional metabolic signals.

    PubMed

    Izhaky, David; Nelson, Darin A; Burgansky-Eliash, Zvia; Grinvald, Amiram

    2009-07-01

    The Retinal Function Imager (RFI; Optical Imaging, Rehovot, Israel) is a unique, noninvasive multiparameter functional imaging instrument that directly measures hemodynamic parameters such as retinal blood-flow velocity, oximetric state, and metabolic responses to photic activation. In addition, it allows capillary perfusion mapping without any contrast agent. These parameters of retinal function are degraded by retinal abnormalities. This review delineates the development of these parameters and demonstrates their clinical applicability for noninvasive detection of retinal function in several modalities. The results suggest multiple clinical applications for early diagnosis of retinal diseases and possible critical guidance of their treatment. PMID:19763751

  20. Douglas Battery Mfg. Co. achieves outstanding air quality, energy savings - with dust collection/recirculating system

    SciTech Connect

    Not Available

    1988-08-01

    Douglas Battery Manufacturing Company of Winston-Salem, NC has engineered a filtration system that not only delivers excellent air quality but also reduces heating costs in the plant, since the filtered air is recirculated through the work area after it leaves the dust-collection unit. Douglas engineers reviewed several alternatives, including pulse-jet baghouses, before selecting a Tenkay aspirated-cartridge dust collector from Farr Company, El Segundo, CA. At present, Douglas operates four Tenkay collectors. The average air-to-filter-surface ratio of a Farr cartridge is 1.5:1. Two of the units handle 21,500 cfm each; the others handle 25,000 cfm each. Testing by U.S. EPA Reference Method 12 confirmed that the units' emissions are significantly lower than those established by the federal New Source Performance Standards.

  1. Effect of labeling density and time post labeling on quality of antibody-based super resolution microscopy images

    NASA Astrophysics Data System (ADS)

    Bittel, Amy M.; Saldivar, Isaac; Dolman, Nicholas; Nickerson, Andrew K.; Lin, Li-Jung; Nan, Xiaolin; Gibbs, Summer L.

    2015-03-01

    Super resolution microscopy (SRM) has overcome the historic spatial resolution limit of light microscopy, enabling fluorescence visualization of intracellular structures and multi-protein complexes at the nanometer scale. Using single-molecule localization microscopy, the precise location of a stochastically activated population of photoswitchable fluorophores is determined over the collection of many images to form a single image with a resolution of ~10-20 nm, an order-of-magnitude improvement over conventional microscopy. One of the key factors in achieving such resolution with single-molecule SRM is the ability to accurately locate each fluorophore while it emits photons. Image quality is also related to an appropriate labeling density of the entity of interest within the sample. While ease of detection improves as entities are labeled with more fluorophores and have increased fluorescence signal, localization precision, and hence resolution, can degrade when more fluorophores are on at the same time in the same vicinity. In the current work, fixed microtubules were antibody labeled using secondary antibodies prepared with a range of Alexa Fluor 647 conjugation ratios to compare microtubule image quality against fluorophore labeling density. Image quality was found to change with both the fluorophore labeling density and the time between completion of labeling and imaging, with certain fluorophore-to-protein ratios giving optimal imaging results.

  2. Investigation of the effect of subcutaneous fat on image quality performance of 2D conventional imaging and tissue harmonic imaging.

    PubMed

    Browne, Jacinta E; Watson, Amanda J; Hoskins, Peter R; Elliott, Alex T

    2005-07-01

    Tissue harmonic imaging (THI) has been reported to improve contrast resolution, tissue differentiation and overall image quality in clinical examinations. However, a study carried out previously by the authors (Browne et al. 2004) found improvements only in spatial resolution, not in contrast resolution or anechoic target detection. This result may have been due to the homogeneity of the phantom: biologic tissues are generally inhomogeneous, and THI has been reported to improve image quality in the presence of large amounts of subcutaneous fat. The aims of the study were to simulate the image-quality distortion caused by subcutaneous fat and thus further investigate the improvements reported in anechoic target detection and contrast resolution performance with THI compared with 2D conventional imaging. In addition, the effect of three different types of fat-mimicking layer on image quality was examined. The abdominal transducers of two ultrasound scanners with 2D conventional imaging and THI were tested: the 4C1 (Aspen-Acuson, Siemens Co., CA, USA) and the C5-2 (ATL HDI 5000, ATL/Philips, Amsterdam, The Netherlands). An ex vivo subcutaneous pig fat layer was used to replicate the beam distortion and phase aberration seen clinically in the presence of subcutaneous fat. Three different types of fat-mimicking layers (olive oil, lard, and lard with fish oil capsules) were evaluated. The subcutaneous pig fat layer demonstrated an improvement in anechoic target detection with THI compared with 2D conventional imaging, but no improvement was demonstrated in contrast resolution performance; a similar result was found in a previous study conducted by this research group (Browne et al. 2004) using this tissue-mimicking phantom without a fat layer.
Similarly, while using the layers of olive oil, lard and lard with fish oil capsules, improvements due to THI were found in anechoic target detection but, again, no improvements were found for contrast resolution for any of the

  3. A comparative study based on image quality and clinical task performance for CT reconstruction algorithms in radiotherapy.

    PubMed

    Li, Hua; Dolly, Steven; Chen, Hsin-Chen; Anastasio, Mark A; Low, Daniel A; Li, Harold H; Michalski, Jeff M; Thorstad, Wade L; Gay, Hiram; Mutic, Sasa

    2016-01-01

    CT image reconstruction is typically evaluated on its ability to reduce the radiation dose to as low as reasonably achievable (ALARA) while maintaining acceptable image quality. However, common image quality metrics, such as noise, contrast, and contrast-to-noise ratio, are often insufficient to describe clinical radiotherapy task performance. In this study we designed and implemented a new comparative analysis method associating image quality, radiation dose, and patient size with radiotherapy task performance, with the purpose of guiding the clinical radiotherapy usage of CT reconstruction algorithms. The iDose4 iterative reconstruction algorithm was selected as the target for comparison, with filtered back-projection (FBP) reconstruction as the baseline. Both phantom and patient images were analyzed. A layer-adjustable anthropomorphic pelvis phantom, capable of mimicking patients 38-58 cm in lateral diameter, was imaged and reconstructed by the FBP and iDose4 algorithms with varying noise-reduction levels. The resulting image sets were quantitatively assessed by two image quality indices, noise and contrast-to-noise ratio, and two clinical task-based indices, t